Ahead of the United States presidential elections on Tuesday, public opinion polls had predicted a neck-and-neck race between Donald Trump and Vice President Kamala Harris.
Yet eventually, Trump cruised to a comfortable victory, defying most polls. He has already won five of the seven swing states – Pennsylvania, Georgia, North Carolina, Michigan and Wisconsin – and appears poised to win the remaining two, Arizona and Nevada. Most of these wins are by margins larger than the polls had forecast.
And, while most pollsters had predicted a narrowing margin between Harris and Trump in the popular vote, almost all showed Harris ahead. In the end, Trump is on course not just to win the popular vote, but to do so by a margin of close to 5 million votes – a feat no Republican has managed since George HW Bush in 1988.
Overall, Trump has already won 295 Electoral College votes, comfortably more than the 270 needed to win, while Harris won 226. If he wins Arizona and Nevada as is predicted, Trump will end up with 312 Electoral College votes.
So how did the opinion polls get it so wrong?
What did the polls predict about swing states?
In the weeks leading up to the vote, most national polls showed the two candidates deadlocked, deeming the race too close to call.
A few days before the elections, some forecasters, such as poll aggregator FiveThirtyEight, shifted slightly and predicted that Harris was more likely to win, although by a small margin of less than 2 percentage points.
In the seven battleground states, Harris was predicted – based on FiveThirtyEight's polling averages – to win the traditionally Democratic "Blue Wall" states of Michigan, Pennsylvania and Wisconsin.
Trump was leading in the polls in North Carolina, Georgia and Arizona, while there was almost nothing separating the two candidates in Nevada, according to the polls.
On election night, Trump won all three of Michigan, Pennsylvania and Wisconsin. He is expected to win Arizona handsomely. And he is ahead in Nevada by three percentage points – well beyond what the polls had predicted.
What about other states Trump won?
In Iowa, the Midwestern state that has long been solidly Republican, Selzer and Co, a trusted polling company owned by analyst J Ann Selzer, surprised observers in the closing days of the campaign by predicting that Harris would beat Trump by three percentage points.
To be sure, it was an outlier poll: an Emerson College poll that came out at almost the same time showed Trump winning the state by nine percentage points.
But Selzer is widely respected in the polling industry and has repeatedly called Iowa correctly in presidential and Senate races over the decades.
She cited widespread anger among white women over the 2022 overturning of hard-won abortion rights by a Supreme Court that includes Trump-appointed justices, and said previously undecided women voters were breaking late for Harris, giving her the edge.
Trump, on his social media channel, Truth Social, condemned Selzer’s poll, calling her an “enemy” and saying that the poll was wrong “by a lot”.
Eventually, Trump won the state by 13 percentage points – more than even many Republican-funded polls had predicted.
When polls get it so wrong, it “exacerbates a key challenge in this race: the perceived lack of legitimacy of polling”, Tina Fordham of risk advisory company Fordham Global Foresight told Al Jazeera.
What about states that Trump lost?
Pollsters got it wrong even in several states that Harris won – undercounting Trump's support and thereby predicting a far greater margin of victory for the vice president in solidly Blue states than she actually achieved:
- New York: The polling average going into November 5 had Harris winning by 16 percentage points. She won by 11 points.
- New Jersey: Harris, per FiveThirtyEight, was forecast to win by 17 percentage points. She beat Trump – but only by 5 points.
- New Hampshire: The polls suggested Harris would win by 5 percentage points. She barely beat Trump, by 2 points.
Did pollsters warn of possible errors?
Yes, pollsters always point out that their surveys operate within a margin of error – about 4 percentage points in many cases. That means their headline numbers could be off by 4 points in either direction: if Harris is shown leading Trump 48 percent to 44 percent, for instance, the two could actually be level, or Harris could end up winning by 8 points.
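To make that arithmetic concrete, here is a minimal sketch of how a roughly 4-point margin of error arises. It assumes a simple random sample of about 600 respondents and the illustrative 48-44 split above, not the numbers of any particular poll.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95 percent margin of error for a share p estimated from a simple random sample of n."""
    return z * math.sqrt(p * (1 - p) / n)

# Illustrative poll of ~600 respondents showing Harris 48%, Trump 44%
n = 600
harris, trump = 0.48, 0.44
moe = margin_of_error(0.5, n)  # p = 0.5 gives the worst-case (widest) margin

print(f"Margin of error: +/-{moe:.1%}")                          # roughly +/-4%
print(f"Harris could be anywhere from {harris - moe:.0%} to {harris + moe:.0%}")
print(f"Trump could be anywhere from {trump - moe:.0%} to {trump + moe:.0%}")
# The gap between the two is even less certain: it can swing by roughly twice
# the single-candidate margin, so a tie or a wider Harris lead are both plausible.
```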
Nate Silver, who founded the poll aggregation site FiveThirtyEight and now writes the newsletter Silver Bulletin, wrote in The New York Times ahead of the vote that his "gut" went with Trump. Silver had earlier predicted a deadlock, but it was possible, he noted, that the polls were underestimating the number of Trump supporters because they could not reach them for surveys.
But in the final days before November 5, Silver was one of several forecasters who said their models had shifted slightly towards Harris, giving her a 48 percent chance of victory to Trump's 47 percent.
Have polls got it wrong before?
Yes. Polling in the US began with newspapers collecting local opinions in the 1880s. Historically, predictions have often been right.
But of late, they have often also been horribly wrong.
In 2016, opinion polls correctly predicted that Hillary Clinton would win the popular vote, but also had her winning comfortably in states such as Pennsylvania, Michigan and Wisconsin that Trump eventually won. Their forecast of Clinton winning the Electoral College was proved wrong.
Polls were off again in 2020, when COVID-19 restrictions greatly limited surveys. Most polls correctly predicted that Joe Biden would win the Electoral College and the national vote. But they significantly overestimated support for Democrats by an "unusual magnitude", according to the American Association for Public Opinion Research (AAPOR), while undercounting voters backing Trump. Researchers called it the least accurate polling in 40 years.
Then, in 2022, polls got it wrong the other way – for the midterm elections.
Some polls predicted that Republicans would sweep the House and Senate that year. In the end, the race was much closer, at least in the Senate, where neither party won an outright majority but Democrats ended up in control 51-49, with the support of independents who caucus with them. Republicans, as predicted, won the House 222-213.
Why do polls get it wrong?
It all comes down to who participates in their surveys, how representative they are of the electorate, and how truthfully they respond, say researchers. Without accurate data, polls mean nothing.
As Silver acknowledged in his New York Times column, one key challenge pollsters face is getting enough likely voters to respond to their surveys. Opinions are usually collected over the phone, but that has become harder because of caller ID applications that help people screen out calls they suspect are spam.
Republicans, in particular, may be less likely than Democrats to speak to the media or respond to surveys, and have been underrepresented in previous polls, according to findings by AAPOR. It does not help that Trump has also publicly attacked opinion polls as "fake", likely further discouraging his supporters from participating. Trump has often attacked the mainstream media, calling the press the "enemy of the state" in 2019.
By contrast, Democrats, especially college-educated voters, are more likely to engage and are therefore likely to be overrepresented, analysts say.
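Pollsters typically try to correct for such skews by weighting their samples back towards the known make-up of the electorate. The sketch below, with entirely made-up numbers, illustrates the basic idea of that kind of demographic weighting; it is not drawn from any actual poll.

```python
# Minimal sketch of demographic weighting with made-up numbers: if college graduates
# answer the survey at a higher rate than their share of the electorate, their
# responses are weighted down (and non-graduates' responses weighted up).

sample_share = {"college": 0.55, "non_college": 0.45}          # who actually responded
electorate   = {"college": 0.40, "non_college": 0.60}          # known population shares
candidate_support = {"college": 0.60, "non_college": 0.45}     # raw support in each group

unweighted = sum(sample_share[g] * candidate_support[g] for g in sample_share)
weighted   = sum(electorate[g] * candidate_support[g] for g in electorate)

print(f"Unweighted estimate: {unweighted:.1%}")  # skewed by over-sampled graduates
print(f"Weighted estimate:   {weighted:.1%}")    # rebalanced to match the electorate
```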
Although pollsters are trying to close the participation gap by using emails and online surveys, some online surveys tend to attract only certain types of participants because they offer compensation, academic Jerome Viala-Guadefroy writes in the research publication The Conversation.
“(That compensation) leads to issues of accuracy and representation,” he wrote.
In 2020, the COVID-19 pandemic restrictions appeared to make surveys more difficult: AAPOR found that the states with the highest polling errors were also those with more cases of the virus.
Did online betting sites do better than pollsters?
American University professor and election forecaster Allan Lichtman, who had correctly predicted the 2016 election in favour of Trump, admitted that his prediction this time – he had forecast a Harris win – was wrong. In a post on X on Thursday, Lichtman said he wanted to "assess why the keys were wrong and what we can learn from this error".
Meanwhile, online, a new crop of prediction betting companies, where people can put money on everything from crypto prices to election candidates, is gloating and lapping up praise for correctly predicting that a Trump win was more likely. Thousands who gambled on Trump are looking at potential payouts of about $450m collectively.
In the days just before the November 5 vote, the odds of Trump winning increased on at least five online betting websites, providing, some say, a much more realistic picture than the polls did.
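These sites express their forecasts as contract prices rather than vote shares. The sketch below, with purely illustrative prices rather than actual Polymarket or Kalshi quotes, shows how a contract that pays $1 if Trump wins translates into an implied probability.

```python
# Illustrative prices only - not real market data.
def implied_probability(price: float, payout: float = 1.0) -> float:
    """Convert a prediction-market contract price into its implied probability."""
    return price / payout

for price in (0.52, 0.60, 0.67):
    print(f"Contract at ${price:.2f} -> implied win probability {implied_probability(price):.0%}")
```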
Polymarket, which counts Nate Silver among its advisers, was one of several betting platforms that gave Trump better odds. In a post on X on Wednesday, Polymarket said it had proved the wisdom of "markets over the polls, the media and the pundits".
“Polymarket consistently and accurately forecasted outcomes well ahead of all three, demonstrating the power of high volume, deeply liquid prediction markets like those pioneered by Polymarket,” the statement read.
Kalshi, another popular betting site, disclosed to the US publication Fast Company that 28,000 people on its platform had bet on Harris, while 40,000 had bet on Trump. Those who backed Trump got it right.