
What we learned from being wrong

It doesn’t happen very often.


But when it does, it really eats at us.


In November's midterm, three referendum results (of the 45 questions we tested) produced an outcome different than we predicted.


Why?

To improve our process, we conduct what the military calls an “after-action review” after every cycle. We want to share what we learned.


But before we move any further, let’s review a definition. In our world, “success” isn’t defined as passing a referendum. It can’t be. That would ruin our ability to provide objective, independent research. Our job is to represent the taxpayers in the planning process. Therefore, “success” for us is correctly predicting the outcome – in other words, when voters reject a referendum and we predicted failure, or when voters approve a referendum and we predicted passage.


Our “success” – in fact, our mission – is to help educational leaders gather, organize, and use data to make strategic decisions.


With that out of the way, let’s talk about a few reasons why predictions may have differed from outcomes.


1. Enthusiasm and organizing

Occasionally, a yes-group or no-group forms after the survey and before the election date. These groups can be powerful and create a “pinwheel” effect: the vote-yes or vote-no folks produce a compelling message, which motivates people to vote and even join the organizing effort. A broader organizing effort persuades more people, and the cycle continues.


Sometimes, these groups are easy to see coming. On the no side, there’s talk about closing schools or consolidation, and then the “Save our school!” signs pop up. On the yes side, the vision of a new school, fieldhouse, or auditorium becomes compelling.


Groups that successfully organize these efforts explain three things very clearly.

  1. Where potential referendum money is going.

  2. Why a project is needed or, in the case of no folks, why it’s not.

  3. Why that project is needed now or, again for the no folks, why it’s something that can wait or isn’t needed at all.

The strength of these yes or no groups can be challenging to predict many months ahead.

2. The “the hell we are!” effect


Surveys are taken at a moment in time, and predictions don’t exist in a vacuum. In fact, outcomes can change simply because we make a prediction.


You sometimes see this effect while watching basketball. A team that has a lead down the stretch plays on its heels. They lose the edge they had early in the game, their defense isn’t as tough, and their passes aren’t as crisp and clean. The team that’s behind has something to play for – the win. Before you know it, the team losing by 12 with two minutes left has suddenly cut the lead to two. They change how hard they play because they’re losing, not in spite of it.


“You think we’re gonna lose? The hell we are!”


Something similar can happen in referendums. Consider a project we researched several years ago. The district was facing complete closure. At first, many people were fine with consolidating with a neighboring district, especially folks who grew up elsewhere or did not have kids or grandkids in school.


When the threat of losing a school became real, and the data pointed in that direction, the folks who didn’t want to lose this school looked at the impending vote and said, “The hell we are!” They changed their behavior, they changed their methods of organizing, and they brought tremendous enthusiasm.


Now, that district very much continues to exist. But it was the threat of closure that motivated people.


3. Extraneous factors

The third reason our predictions may stray from the outcome is factors neither we nor the district saw coming. Perhaps inflation spikes, gas hits $5 per gallon, unemployment rises, a local manufacturer unexpectedly closes, interest rates climb, a pandemic takes hold, and so on.


Another example we’ve seen is a district being prodded to change its curriculum, even though the update may have nothing to do with the referendum. Or a well-written opinion piece lands in the local paper two weeks before the vote. Soon, a previously unforeseen issue becomes a lightning rod.


4. Timing of the survey


We’ve learned our data has a shelf life. Let it sit too long, and it spoils.


We recommend implementing a community survey about six to eight months before the vote. This gives the community enough time to understand what’s being asked and provide feedback, while also giving school leaders time to analyze results and produce a ballot question before the deadline.


However, if you implement a survey too early, other factors get in the way: leadership fluctuates, board members change, and the economy booms and busts. Let too much time pass, and your once-accurate data is no longer fresh.

 

We're proud of our record. Over time, we are more than 94% accurate. We are confident we can get you the data you need. We’re not perfect, but we darn well try to be.

 

The School Perceptions Blog and Resource Center features the voices of our team members. This post was written by Bill Foster, President & Founder, and Rob DeMeuse, Research Director.

