We here at Apester knew Bernie Sanders was going to win Michigan. We also predicted Sanders had a real chance to win Missouri (he barely lost), that he would fall short in Illinois, that he'd lose in North Carolina and Ohio, and that he'd get crushed in Florida.
With the help of an AOL poll created with Apester, we called Super Tuesday 2 before the results started coming in. Here's the proof. Notice the post time and that it has not been edited. We posted on Facebook for that reason.
Full disclosure: we’re not a serious polling company. We ply our trade in sentiment and engagement, and our publisher partners primarily use our editorial tools, such as polls, quizzes, personality tests and video quizzes, to boost time on site and social shares, reduce bounce rates, and gain insight into topics and their audiences.
That makes these predictions even more amazing.
Here's the actual poll that was embedded on the articles. Feel free to vote for yourself.
Election polls that mean something, that get “serious” media coverage, are typically “serious.” They're conducted by serious-sounding organizations citing pages of serious research methodology that nobody will ever read, led by serious people with seriously high IQs holding Ph.D.s in serious topics.
This poll was not one of those. Quite the opposite. This is more of a rather large middle finger to traditional polling methods.
AOL didn't create their poll with the intention of predicting primary outcomes. They created it with the intention of engaging their audience on the topic in the article. It was not targeted. It did not send emails to liberal women aged 18-34 with black wire-framed glasses or to conservative men aged 45-65 with camo baseball caps and hunting licenses. It didn't ask for their home state, their religious leanings, the color of their skin, or whether they like red or blue, monster trucks or hybrids. It was a one-question poll.
Last week the Internet and the country went apeshit over Bernie Sanders' upset win in Michigan. The Bern was felt. Hillary was licking her Benghazied, private-email-server wounds, and Nate Silver was declaring that “if Sanders winds up winning in Michigan, in fact, it will count as among the greatest polling errors in primary history.”
We pay attention to Nate Silver; he's a smart man who bleeds 0s and 1s. Basically, it was a damn big deal.
I mean, look at these polls for Michigan before the primary on March 8. This is utterly ridiculous.
Wrong, every single one of them.
We were curious. We started lurking in the data after the March 8 primary.
The AOL poll had 80,000 votes, which is not insignificant. But just as importantly, it asked a fantastic question that bypassed bias: “Who do you think will become the Democratic nominee for President?” Not who do you want to win, not who will you vote for in a particular state's primary, but who you think will win. Who you think will win can be very different from who you want to win. Somebody answering the AOL poll might weigh the larger view of what other people think rather than their own preference.
There was no political agenda behind the poll. Apester has no political agenda. We’re on the fence. Voyeurs.
What could we discover if we dug around in the AOL poll data? If we did more than just lurk? What if we took a shot at actually predicting?
We dug. But not until after lunch. Lunch is important. And coffee. Once we get caffeine we go full beast mode.
What we discovered — after lunch — when we broke down the voting by state, was that Michigan voted for Bernie by a wide margin. And huh, look at that… pretty much every other state voted for Bernie, except Florida, because Florida. Florida is also the state that Bugs Bunny cut loose in a cartoon in 1949. Just saying.
We noticed a trend that synced up: the share of poll respondents who picked Bernie in each state correlated strongly with his actual primary result percentage in the states that had already held their primaries.
Then we predicted the percentage range of the vote Mr. Bern would get in the March 15 primaries in Florida, Illinois, Ohio, Missouri and North Carolina. If those predictions came out close, we'd have more confidence that the Michigan upset identified in the AOL poll data was also predictive, while every other serious poll was caught with its pants down.
As a matter of rule, our pants are always down at Apester, so there was nothing to lose.
Pants or no pants, we are now confident enough to claim the Michigan upset prediction was accurate.
Between closing out 346 Chrome browser windows before heading home for the night, Apester CEO Moti Cohen said, “These quite frankly stunning results are a pure product of the Apester unit created by the publisher. Even if AOL never in a million years intended this outcome, there it is.” Click. Browser window 142 closes. “Our goal isn't to interfere, but to elevate, to analyze, and to help a publisher discover more about their own content.”
There's power in understanding the sentiment of an audience. “This time,” said Moti as browser window 26 was eliminated, “it led us to predict election results; what it will be next time is unknown. Our product evolves right along with the content created by publishers in unique ways because of the wide variety of stories created across all verticals. That's thrilling.”
Thrilling indeed. We were right about Sanders winning Michigan, even if we didn't fully grasp just how right we were until we saw our predictions play out on Super Tuesday 2. Even if our methods of arriving at our numbers are nowhere near the same league as a Politico or FiveThirtyEight (yet), we still got there in the end.
We absolutely cannot wait to see what else lurks in the data, just waiting to be discovered. Maybe next time we'll predict that Kylie Jenner will be humble by the age of 26. Now that would be a long shot!
Words by Chris C. Anderson, Apester VP of Editorial and Publisher Relations, US