If there’s one thing we’ve learned after millions of surveys, it’s that some things do affect response rates.
And some things do not.
Regardless of the type of survey we are doing for a client – from staff surveys to community surveys, and everything in between – our ultimate goal is to get the highest response rate possible. While we can statistically review community response, in our experience there are also more anecdotal factors that can influence your rates (or have no influence at all).
“When it comes to spending money, people have an opinion and they want to share it,” says Chelsea Davis, data analyst at School Perceptions.
In her role, Chelsea focuses on survey management and analysis of results, and she’s worked with communities and school districts of all sizes. She’s observed that there are a handful of factors that influence response rates, and some of them may be a bit interconnected:
Timing: As a company, our heaviest survey times are in the spring and fall. Some of this is tied to consistent factors, such as the timing of elections and the start and end of each school year. But is this really a huge factor?
“We actually conduct surveys throughout the entire year,” she says. “You would think that summer wouldn’t be the best time of the year to survey, but people are home and they check their mail, and we’ve actually gotten some pretty high response rates during the summer months.”
We actually have a running joke in our office about it: People in Minnesota love to take surveys in the middle of summer. We don’t know why, but they do, and that’s fantastic.
Population: We’ve discovered that there’s a bit of an inverse correlation between the size of a community and its response rate.
“Generally speaking, higher population density negatively affects response rates,” she says. “As a very general rule, we’ve noticed that smaller districts and less densely populated areas tend to have surveys with higher response rates. That being said, higher populations will receive more responses overall, which makes the data very statistically reliable.”
Visible Leadership: Chelsea has noticed that school districts with engaged leadership tend to have higher response rates.
“I think this might be really simple: if you have leadership that is out in the community, talking about the survey and encouraging people to take it, that probably leads to higher response rates,” she says. “It’s really about getting the word out.”
That may be one way to counter a potentially lower response rate if you live in a densely populated community. The more proactive leadership is, the more people will know about the opportunity to share their opinions. Chelsea said that districts with administrators who do a mid-survey check-in of some sort – an email, making an effort to ask people in public if they’ve taken it, even a reminder in a newsletter – tend to see a bounce in responses.
“We really do appreciate the extra efforts that school districts take in promoting our surveys,” she says, noting that School Perceptions staff also provides materials and scripts that districts can use as a jumping-off point to increase survey awareness.
Survey size: In nearly 20 years of surveying people, we know that when it comes to community surveys, there’s a sweet spot.
“Eight pages is ideal,” Chelsea says of what is now our standard survey size, though depending on the needs, we do surveys of other sizes. “We’ve found that anything longer than that tends to create survey fatigue, and people either stop in mid-survey or don’t start it at all.”
The majority of our surveys are eight pages, and every household is sent a paper copy, which they can complete and return directly to us or they can respond online. We’ve found that this format is also something that people visually look for in their mailboxes.
“We find that this survey, in this format, really outperforms any other means we’ve used,” says Chelsea. “Unfortunately, people tend to ignore postcards. We’ve also found that not sending out an actual survey, and just promoting that you can take it online, tends to really negatively impact response rates.”
A commitment to using the results: Finally, especially for districts that repeat staff, student and parent surveys, response rates will rise over time if respondents believe, or can see, that the results are being used to drive changes and improvements.
“When people can see that their opinions matter, they’re much more willing to put in the time and effort to respond to a survey,” she says.
If you’re considering a first-time survey or want to talk to us about suggestions to improve response rates, give us a call at 262.644.4300 or email email@example.com.