
There's a lot in our reports. Let's talk about them.

Maybe you’ve recently sent out a survey. Or, maybe you’re still thinking about it. Either way, you’re in search of something – data. You’re hoping to learn about your stakeholders’ experiences, concerns, ideas for improvement, etc.


That’s where School Perceptions comes in. Yes, we administer surveys and gather data, but then what? Well, that’s when the real fun begins. (I realize that last sentence is a bit biased [survey pun, anybody?! Anybody?!] since I work on our post-survey data team…)


Making sense of your data and providing formal yet easy-to-understand post-survey reports is my middle name. A long middle name, I know … imagine trying to sign a credit card application or mortgage loan.


Anyway, there is a method to the “data madness” at School Perceptions. We have standardized our results reporting process over the years so that we can offer our customers several types of analyses that provide actionable insights. Those include:


1. Summary reports (A district-level look at your overall survey results)

2. Comparison reports (A color-coded spreadsheet showing how you “stack up” to similarly sized schools)

3. Longitudinal reports (A color-coded spreadsheet showing how your results have changed year-over-year)

4. Comment reports (A document of comment themes that are generated after our team has analyzed each comment)


Nearly every school district requests a summary report when their survey closes, and for good reason. It’s an overall report that simplifies the survey data and can be easily shared with your leadership team. Below are a couple of sample slides from a parent survey report. Let’s make sense of them, shall we?

Above, you see a chart with several questions (“items”) on the left. In the column immediately to the right, you see a “Strongly agree/Agree” percentage. That percentage includes the respondents who, in this case, answered either “Strongly agree” or “Agree.”


The next column to the right is the overall average, with the number of respondents who answered each question shown in parentheses. As seen at the top of the slide, the average is calculated using a 5-point scale where Strongly agree = 5 and Strongly disagree = 1. Note that those who answered “don’t know/doesn’t apply” are not factored into the overall average.
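If you like seeing the math spelled out, here’s a rough sketch (in Python) of how those two columns could be computed for a single item. The sample responses, the middle “Neutral” label, and the function name are illustrative assumptions on my part, not our production code:

```python
# A rough sketch of the two summary-report columns described above.
# The response data and the middle "Neutral" label are made up.

SCALE = {
    "Strongly agree": 5,
    "Agree": 4,
    "Neutral": 3,
    "Disagree": 2,
    "Strongly disagree": 1,
}

def summarize_item(responses):
    """Return (% Strongly agree/Agree, average, n) for one survey item."""
    # "Don't know/doesn't apply" answers are left out of the average
    # (and, in this sketch, out of the percentage as well).
    scored = [SCALE[r] for r in responses if r in SCALE]
    if not scored:
        return 0, None, 0
    agree = sum(1 for r in responses if r in ("Strongly agree", "Agree"))
    pct_agree = round(100 * agree / len(scored))
    average = round(sum(scored) / len(scored), 2)
    return pct_agree, average, len(scored)

responses = ["Strongly agree", "Agree", "Neutral", "Disagree",
             "Don't know/doesn't apply", "Agree"]
print(summarize_item(responses))  # (60, 3.6, 5)
```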


The column farthest to the right shows the percentile ranking for each item. It compares each item’s average score to those of similarly sized schools and assigns a percentile accordingly.
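Here’s the idea behind a percentile ranking in sketch form. The comparison averages below are made up, and our actual percentile calculation may differ slightly:

```python
# A rough sketch of a percentile ranking: where does your item's average
# fall among similarly sized schools? The comparison values are made up.

def percentile_rank(your_average, comparison_averages):
    below = sum(1 for a in comparison_averages if a < your_average)
    return round(100 * below / len(comparison_averages))

comparison = [3.2, 3.5, 3.8, 4.0, 4.1, 4.3, 4.4]
print(percentile_rank(3.9, comparison))  # 43 -> roughly the 43rd percentile
```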


Together, the three columns of data to the right of each item help you fully understand what you’re doing well and where there is room for improvement.


Personally, my favorite column is the one that includes the “Strongly agree/Agree” percentages. It’s easy to look at an average (for example, 3.14) and think, “we must be doing well because we’re closer to a ‘5’ than we are to a ‘1.’” While that may be true, move your eyes to the neighboring column on the left… do you see that “52%”? That means nearly half of the respondents who answered that question did not agree that the school board is doing what it takes to make your district successful. I like to call this column the “double check” column because it helps bring meaning to the averages.


Next, let’s look at another example from a summary report.

The graph above displays data for a question that’s commonly asked in all types of surveys we administer (parent, staff, student, and community). Respondents select a score from 0 to 10, where 0 means they are “Extremely Unlikely” to recommend the school(s) to a friend or family member and 10 means they are “Extremely Likely” to do so. Your overall average, as well as the average of similarly sized schools (the “Comparison” average), can be found in the bottom right-hand corner of the slide.


The color coding is purposeful, too. It helps you understand not only the level of support but also whether there are passive or negative attitudes toward the district. This is arguably the most telling question on the entire survey and is an important indicator of whether your stakeholders approve of the school district as a whole.
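For the data-curious, here’s a rough sketch of how that 0 to 10 question could be summarized. The cutoffs used for the groupings (0-6, 7-8, 9-10) are illustrative assumptions, not the exact rules behind our color coding:

```python
# A rough sketch for the 0-10 "how likely are you to recommend" question.
# The cutoffs (0-6 negative, 7-8 passive, 9-10 supportive) are assumed
# for illustration only; the scores below are made up.

from statistics import mean

def recommend_summary(scores):
    return {
        "average": round(mean(scores), 2),
        "negative (0-6)": sum(s <= 6 for s in scores),
        "passive (7-8)": sum(7 <= s <= 8 for s in scores),
        "supportive (9-10)": sum(s >= 9 for s in scores),
    }

scores = [10, 9, 8, 8, 7, 6, 5, 10, 9, 3]
print(recommend_summary(scores))
# {'average': 7.5, 'negative (0-6)': 3, 'passive (7-8)': 3, 'supportive (9-10)': 4}
```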


In addition to the summary report, comparison and longitudinal spreadsheets are available. They provide a “deep dive” into the data by breaking it out into subgroups like schools, staff positions, and student grade levels. We can also display multiple years of data at once to measure changes over time.
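If you’re a spreadsheet person, here’s a tiny sketch of the kind of subgroup and year-over-year breakout behind those reports (using pandas, with made-up column names and data):

```python
# A tiny sketch of the subgroup and year-over-year breakout behind the
# comparison and longitudinal spreadsheets; columns and values are made up.

import pandas as pd

df = pd.DataFrame({
    "year":   [2023, 2023, 2024, 2024, 2024],
    "school": ["Elementary", "High School", "Elementary", "High School", "High School"],
    "score":  [4.1, 3.6, 4.3, 3.8, 4.0],
})

# Average score per school per year, laid out like a longitudinal spreadsheet.
print(df.pivot_table(index="school", columns="year", values="score", aggfunc="mean"))
```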


These reports make identifying strengths and weaknesses a breeze and are sure to satisfy even our nerdiest of customers (I mean that lovingly…I’m a bit of a data nerd myself).


Finally, there are often hundreds (or even thousands!) of comments provided by survey respondents, and it can be a headache to make sense of them all. Good news – our team can analyze every comment from your survey and generate a report that organizes them into simple, succinct themes.


Comment reports are an excellent complement to your quantitative data, provide additional nuance to your stakeholders’ thoughts, and are highly effective at driving communications with staff, students, families, and your community.


So, here we are at the end of the blog. I could probably keep writing about post-survey reports for days, but I’ll press the pause button here since not everyone shares my level of enthusiasm.


We’re looking forward to working with you on your next survey project and can’t wait to provide you with valuable post-survey analyses to help you engage and understand your stakeholders.

 

The School Perceptions Blog and Resource Center features the voices of our team members. This post was written by Chelsea Davis, Data Analyst.
