Predicting 2020: Research on the Research
We missed it because we weren’t looking
By David Burrell | CEO & Co-Founder of Wick
Wick is not a polling company for either political party. We exist to create technology and thought leadership that accelerate the market research industry’s progress toward greater speed, affordability, and accuracy. We withheld this article until the day before the election to limit the politicization of its data and insights for the media interests of either party.
For media inquiries please email us at info@wick.io
From the author:
We are predicting that Donald Trump is going to win re-election. In our most recent battleground polls in the 6 states of Florida, Pennsylvania, Michigan, Georgia, North Carolina, and Ohio, he is up by over 2% in all but Michigan (for those results, scroll to the end of this article).
But what’s more interesting than our prediction is that, until last week, our polls showed Trump losing by margins similar to those you have probably seen in the news.
What caused this change in results? It had little to do with either team’s campaigning or voters changing their opinions. We can still easily conduct a poll that has Biden up by a large margin. The change in our results was due to a change in methodology.
So, what made us change our methodology? There were a number of things that just didn’t feel right, but the final nudge to act on this feeling came a week or so ago as I was watching a Biden speech on TV and couldn’t hear him over the sound of Trump supporters honking their horns. I joked that we needed to tally the honks, because out of the hundreds of polls I’ve run this year, this is the first time I have heard from this group of voters… Maybe this is 2020’s “hard to reach segment” voicing its opinion.
It was a joke, but it made me start to wonder how much truth there was to it, so we decided to dig deeper and find out.
How did we know the polling was wrong?…
We will get into these symptoms further down, but prior to that, we think it’s important to establish why this isn’t just another non-response problem that will be easily cured.
Accurate public opinion polling is only possible in democracies where people trust the democratic process and feel free to express their beliefs and opinions. If it seems like sorcery when 700 respondents in a survey accurately predict the election-day behavior of millions, the source of that magic is a healthy democracy.
Imagine the difficulty in achieving an accurate political poll — one that’s supposed to be representative of the honest beliefs of an entire population — in Communist China or North Korea. Would you trust it?
China and North Korea may seem to be extreme examples, but they are the easiest modern-day illustration that undemocratic societies have characteristics, such as limited freedom of expression and the use of propaganda, that make it difficult or impossible to obtain a set of survey respondents representative of the whole population.
In western democracies like America, having your beliefs and opinions represented through polling has been a long-standing component of participating in the democratic process. And thus, like the debate commission and the media, pollsters have been fixtures in the democratic process. But in 2020, we have started to demonstrate some pretty undemocratic characteristics that could be putting stress on the magic behind the ability of public opinion research to be truly representative. To put it plainly:
1 | If one belief group is championed for its beliefs and another is continually shamed, attacked, or threatened, which group do you think is more likely to share its beliefs in a poll?
2 | If the media intentionally censors information and promotes misinformation, how does that affect people’s perceived worth of polls they see in the news? Could that affect their likelihood to associate polls with a democratic process that they trust? If so, then what is the incentive to take polls in the first place?…
Questions like these helped inform our theory that our environment has created an underrepresentation issue due to an ideological non-response bias. Up until a week ago, it was just a theory, but once our team had fully articulated the hypothesis, we did the following:
- We designed a polling study to test our hypothesis (our 2020 battleground polls)
- We identified symptoms that would suggest our sample isn’t truly representative
- We treated the symptoms we could with agile sampling and back-end weighting
- We analyzed the results
- And we scraped together this article as best we could to present the findings
We chose 6 battleground states and collected 1,000 completes in each from a random sample of likely and newly registered voters on 10/27 and 10/28.* IVR and Text-to-Web survey methods were used to collect the responses.
*Corrected from the original which incorrectly listed 10/24–10/25 as the field dates.
Wick’s 2020 battleground poll results and key findings
What were the symptoms suggesting that people with certain beliefs might be avoiding pollster calls, texts, and emails?
Symptom 1: Too many respondents with post-graduate degrees.
In 2016, the breakdown of education levels amongst survey respondents was a root cause of inaccurate polls. Most pollsters tried to adjust for this in 2020, but if a pollster is grouping people with graduate degrees together with those who have post-graduate degrees in a single “college grad or higher” value (like we were!), then there is still a big problem.
We made graduate and post-graduate degrees separate response values in our surveys and kept a close eye on “education”. Midway through our first day of fielding, 23%–31% of day 1 survey respondents reported having post-graduate degrees (they represent 11% of the turnout). These voters answered 71% for Biden, whereas those with graduate degrees answered 53% for Biden. Grouping these voters together, like we (until now) and many other pollsters have been doing, includes too many post-graduate-degree voters in the sample and skews the results.
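The back-end weighting adjustment this implies can be sketched in a few lines. This is a minimal illustration, not our production pipeline: the post-graduate figures (roughly 27% of the raw sample on average, 11% of turnout, 71% for Biden) come from the paragraph above, while the shares for the remaining cell are hypothetical placeholders.

```python
# Post-stratification by education: weight each cell so its share of the
# weighted sample matches its share of expected turnout.
# "post_grad" figures are from the article; the "other" cell is hypothetical.

sample_share = {"post_grad": 0.27, "other": 0.73}   # share of raw respondents
turnout_share = {"post_grad": 0.11, "other": 0.89}  # share of expected voters
biden_support = {"post_grad": 0.71, "other": 0.48}  # Biden share within each cell

# Weight = population share / sample share for each cell.
weights = {cell: turnout_share[cell] / sample_share[cell] for cell in sample_share}

raw = sum(sample_share[c] * biden_support[c] for c in sample_share)
weighted = sum(sample_share[c] * weights[c] * biden_support[c] for c in sample_share)

print(f"raw Biden share:      {raw:.3f}")
print(f"weighted Biden share: {weighted:.3f}")  # lower once post-grads are down-weighted
```

Because post-graduates get a weight below 1 and everyone else a weight above 1, the weighted Biden share lands below the raw one, which is the direction of the correction described above.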
Symptom 2: Too many early voters taking our polls
Admittedly, there is no historical model with which to compare the 2020 voter turnout. Still, early voters (and absentee-ballot voters in particular) were extremely overrepresented after our first day of fielding the study. The share of respondents who answered that they had “already voted” was on average 16 percentage points higher than the actual percentage of people who had voted early in their respective states. Across the 6 battleground states, 36.5% of this group reported they voted for Trump. Early voting carried a strong correlation with voting for Biden, and in turn, it was a variable that, left unaccounted for, could heavily push projections toward Biden.
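A small calculation shows how this overrepresentation moves a topline. In this sketch, only the 16-point overrepresentation and the 36.5% Trump share among early voters come from the numbers above; the true early-vote share and the support level among everyone else are hypothetical assumptions.

```python
# How overrepresenting early voters skews the topline.
# Early-voter Trump support (36.5%) and the ~16-point overrepresentation are
# from the article; the other two inputs are hypothetical.

actual_early_share = 0.30                        # assumed true early-vote share
sample_early_share = actual_early_share + 0.16   # what the raw sample showed

trump_early = 0.365   # Trump share among early voters (from the poll)
trump_rest = 0.52     # assumed Trump share among everyone else

biased = sample_early_share * trump_early + (1 - sample_early_share) * trump_rest
corrected = actual_early_share * trump_early + (1 - actual_early_share) * trump_rest

print(f"unadjusted Trump share: {biased:.3f}")
print(f"adjusted Trump share:   {corrected:.3f}")
```

Because early voters lean heavily toward Biden, restoring them to their true share of the electorate raises the Trump topline; with these placeholder inputs the swing is over two points, which is the mechanism the paragraph above describes.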
Symptom 3: A socioeconomic anomaly
This is a symptom that we couldn’t treat, but it was impossible to ignore. The raw responses of the battleground polls show nearly a 3x increase in African American support and a 2x increase in Hispanic support for Trump in 2020. A jump in support like this demanded attention. Looking past just the race of these respondents, we sought patterns among their other responses in the survey. What most traditional analysis identified as a gain in minority support, we chose to identify as gains with the working class broadly. It should follow that there is a counterpart of white voters with the same socioeconomic status as the African Americans and Hispanics with whom Trump is doing well, but they are not showing up in the polling.
So what’s going to happen?
- We have Trump winning FL (+2.9), NC (+2.2), OH (+2.9), & GA (+2.5). Michigan is very close, with Trump narrowly behind (-0.3).
- Trump is going to win a historic percentage of African American and Hispanic votes (which will probably be the go-to explanation for why the polls were wrong).
- Pollsters acknowledge this shift among minority voters and believe that the ground is covered by Biden’s gains with white voters across all demographics, but it is not. Those gains are concentrated among the white voters who are willing to take polls in 2020 (which should be the go-to explanation for why the polls were wrong).
- If Biden does win in a close race, after polls showed a blowout for months, we all (not just pollsters) need to ask why public opinion polling isn’t working in our democracy. The Public Opinion Research industry needs to improve, and it will. But it’s not going to be easy — its players are going to need to look hard for the symptoms of where our polls might not be telling the whole story.