If you followed the presidential polls at all closely, chances are that you expected Hillary Clinton to win last week. So did all of the major prediction models that use polls to game out election outcome probabilities. So perhaps everyone should have expected that in a year when all political norms were broken, the polls that the political world fixates upon would also prove to be flawed.
Pollsters will be digging for months (at least) to figure out exactly how their results went astray. The American Association for Public Opinion Research is convening a committee to study this year’s polling, but answers will be a long time coming: the committee won’t wrap up its work until May 2017. Until then, here are a few ways to think about what was wrong (and right) with the polling in the 2016 election.
1. The national polls weren’t that far off. They did predict more people would vote for Clinton, and that’s what happened.
Donald Trump did win the most electoral votes. However, at latest count, Clinton leads Trump in the popular vote by a little over half a percentage point (or about 725,000 votes).
That’s around 2.7 points off of Real Clear Politics’ final polling-average estimate of Clinton’s lead over Trump.
Is that big? Not compared with 2012. That year, Obama beat Romney by around 3.9 points, and RCP’s final estimate was 3.2 points below that margin (this was in two-way polling; polls including third-party candidates were rare that year). In 2008, by contrast, the result landed remarkably close to the final polling average, just a few tenths of a point away.
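For readers who want to see the arithmetic spelled out, here is a minimal sketch in Python. The pre-election averages are inferred from the figures above (a roughly 3.2-point Clinton lead in 2016, a roughly 0.7-point Obama lead in 2012), not quoted directly from RCP.

```python
# Polling error, as used above: the final polling-average margin minus the
# actual popular-vote margin, in percentage points. Positive means the polls
# overstated the first candidate's margin; negative means they understated it.
# The averages below are implied by the figures in the text, not exact RCP numbers.

def polling_error(poll_average_margin, actual_margin):
    return poll_average_margin - actual_margin

# 2016: final average had Clinton up roughly 3.2 points; she won the popular vote by ~0.5.
print(polling_error(3.2, 0.5))   # ~2.7 points (Clinton's margin overstated)

# 2012: final average had Obama up roughly 0.7 points; he won by ~3.9.
print(polling_error(0.7, 3.9))   # ~-3.2 points (Obama's margin understated)
```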
So this year’s national polls were off a bit, but not outlandishly so (and, again, they did predict the popular vote winner). However, we have an Electoral College, and so it’s state polls that matter in predicting who will win the presidency.
Many swing-state polls weren’t terribly far off either, as the noted Republican polling firm Public Opinion Strategies pointed out in a memo last week. However, something was still clearly off in those polls.
After all, the polls in the nine swing states Public Opinion Strategies cited were all wrong in the same direction: every one of them predicted a better performance for Clinton than she ended up having. What exactly caused that systematic tilt is the question many people are now asking.
2. Some people just don’t answer the phone.
Many pollsters that do phone polling conduct it via random digit dialing. That means they should theoretically get a pretty representative sample — after all, they’re reaching out to people randomly.
It’s possible that some pollsters managed to miss Trump supporters in a big way, explains Claudia Deane, vice president of research at the Pew Research Center.
“The problem is if you get what pollsters call nonresponse bias, people are less likely to take your call or stay on the phone with you,” she explained.
She told NPR this is one of three big ways in which the polling may have gone wrong. Some populations, such as people with less education, are less likely to answer when pollsters call, Deane said. Whites with less education heavily supported Trump, far more than they supported Romney in 2012 or McCain in 2008. And given that Trump constantly beat the drum against the media (many of whose organizations conduct polling) and against polling itself (at least when the polls showed him losing), this nonresponse factor among his supporters perhaps isn’t so surprising.
That’s just one example. The broader point is that if the groups least likely to answer pollsters’ calls also happened to be the demographic groups most likely to support Trump, that alone could have thrown the polls off.
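To see how that mechanism plays out, here is a rough simulation sketch in Python. Every number in it (group sizes, candidate splits, response rates) is invented purely to illustrate the point; none of it is an estimate of what actually happened in 2016.

```python
import random

random.seed(0)

# A hypothetical electorate split into two groups. The first leans heavily
# toward Candidate A but answers the phone half as often; the second leans
# toward Candidate B. Every number here is made up to show the mechanism.
groups = [
    {"share": 0.4, "support_a": 0.65, "response_rate": 0.05},
    {"share": 0.6, "support_a": 0.40, "response_rate": 0.10},
]

true_support_a = sum(g["share"] * g["support_a"] for g in groups)  # exactly 50%

# Simulate random-digit dialing: every adult is equally likely to be called,
# but whether anyone picks up depends on which group they belong to.
answers = []
for _ in range(200_000):
    group = groups[0] if random.random() < groups[0]["share"] else groups[1]
    if random.random() < group["response_rate"]:
        answers.append(1 if random.random() < group["support_a"] else 0)

polled_support_a = sum(answers) / len(answers)

print(f"True support for Candidate A:   {true_support_a:.1%}")    # 50.0%
print(f"Polled support for Candidate A: {polled_support_a:.1%}")  # roughly 46%
```

Even though every phone number has an equal chance of being dialed, the finished sample over-represents the group that answers more often, and the topline tilts accordingly.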
3. Did people lie to pollsters?
The idea of the “secret Trump vote” popped up throughout the election: the theory that voters didn’t want to tell a stranger on the phone they were voting for Trump, given all the controversial things he said and did on the campaign trail. Pollsters call this “social desirability bias,” the tendency of respondents to give answers they think will, for whatever reason, reflect well on them. It’s the second reason Deane listed that the polls could have been off.
A week ahead of the election, a panel of GOP insiders told Politico that they believed this was happening.
“I personally know many Republicans that won’t admit that they are voting for Trump,” one Virginia Republican told Politico. “I don’t like admitting it myself. It won’t matter if Hillary is up more than 5 points, but we might be in for a surprise if Hillary’s lead is less than 5 points on Election Day.”
Her lead on Election Day was indeed smaller than 5 points, and the nation most definitely was in for a surprise (but presumably not that Politico insider).
Still, there’s reason to be skeptical that this really threw off polls much.
“If that was true, that really should apply in phone versus online polls, and for the most part we didn’t see that,” Deane said. Her point: if social desirability bias were a significant factor, Trump should have performed better in online polls, which don’t involve talking to a live person, than in phone polls.
4. It’s hard to capture enthusiasm (or lack thereof).
Pollsters talk to a lot of people, and they try to predict which ones will, in fact, turn up at the polling place. That’s harder than it sounds.
“Way too many people tell you they’re going to vote,” Deane said.
What may have happened, she explained, is that the usual models for predicting who will actually vote simply didn’t work this year. After all, lots of other things about the election were unusual: high levels of anger and two candidates with high unfavorability ratings, for example. That may have made it uniquely hard to figure out which respondents were motivated to vote and which were ambivalent enough to stay home.
“We know polls do a poor job with emotion/enthusiasm/commitment,” Evans Witt, head of Princeton Survey Research and president of the National Council on Public Polls, emailed NPR. “And that appears key to Trump support.”
To the extent that pollsters overestimated Clinton supporters’ willingness to vote — or underestimated Trump supporters’ willingness — that could have thrown things off.
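As a back-of-the-envelope illustration of how much those turnout assumptions matter, here is a small sketch in Python. Again, the preference splits and turnout rates are invented for illustration only; they are not estimates of the 2016 electorate.

```python
# Two hypothetical blocs of poll respondents with different preferences.
# The poll's topline depends on how likely each bloc is assumed to be to
# actually vote. All figures below are illustrative only, and a two-candidate
# race is assumed (so Candidate B's support is simply 1 minus A's).

def topline_margin(turnout_a_bloc, turnout_b_bloc):
    """Candidate A's margin, in points, under assumed turnout rates for the
    bloc leaning toward A and the bloc leaning toward B."""
    a_bloc = {"share": 0.5, "support_a": 0.58}   # half of respondents, 58% for A
    b_bloc = {"share": 0.5, "support_a": 0.44}   # half of respondents, 44% for A

    weight_a = a_bloc["share"] * turnout_a_bloc
    weight_b = b_bloc["share"] * turnout_b_bloc
    support_a = (weight_a * a_bloc["support_a"] +
                 weight_b * b_bloc["support_a"]) / (weight_a + weight_b)
    return 100 * (2 * support_a - 1)

# If the likely-voter model assumes both blocs turn out at the same rate,
# Candidate A leads comfortably:
print(f"{topline_margin(0.60, 0.60):+.1f}")   # +2.0 points

# If A's bloc actually turns out a bit less and B's bloc a bit more,
# the very same respondents produce roughly a dead heat:
print(f"{topline_margin(0.50, 0.65):+.1f}")   # about +0.2 points
```

The respondents and their stated preferences never change in this sketch; only the turnout assumptions do, and that alone moves the topline from a clear lead to a near tie.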
To be clear, these are all possibilities — pollsters will have plenty of work to do to figure out what they didn’t capture this year.
“Lots of polls this year and a lot of cutting corners to save money — a reality, but at a cost,” said Witt, who is also on the American Association for Public Opinion Research’s committee that is doing a post-hoc analysis of the accuracy of polling in the presidential election.
“It will be months before this can be sorted out,” Witt added. “Lots of data to be examined, lots of hard questions to ask and answer.”
Come 2018 and 2020, many Americans will likely find themselves taking poll numbers with an extra grain of salt or three.