Super Bowl 48 - the day the Seahawks found a Black Swan

By Den Howlett, February 3, 2014
Summary:
The Super Bowl result confounded many who predicted a Broncos win. What happened, and does it knock down predictive analytics? No - but you have to choose your model carefully.

http://www.thewaiverwire.co/2014/01/31/the-waiver-wire-predicts-super-bowl-xlviii/
I have no clue about American football. I'm told it's a sport designed to fill time in between TV adverts. Even so, one thing I DO know - Americans are obsessed with game statistics. The amount of data generated per game is mind-boggling, and on the rare occasions I've watched bits of a game, the streams of data spinning across the screen are truly amazing.

So when the Seattle Seahawks thrashed the Denver Broncos 43-8, I was interested to discover what happened to the data behind the result. A quick search turned up some other, rather worrying conversations.

At the end of last month, Forbes ran an impressive piece entitled The Denver Broncos' Super Bowl Advantage? Data!, claiming that:

Before [Peyton] Manning [quarterback] ever runs out onto the MetLife Stadium field in East Rutherford, NJ, the Broncos’ dedicated IT staff will have collected, stored, assimilated, managed, protected and delivered terabytes of data to Manning, his teammates, and the coaching staff.

The data covers each game the Seahawks have played, each play, both offensive and defensive, and each player.

In the modern era, no team can hope to get to the Super Bowl without peak performance from its IT department. Teams are mid-size businesses with enterprise-level IT needs.

The Broncos IT staff manages hundreds of terabytes of data. Not only do they collect and manage all of the games, plays and players for the Broncos themselves, but with the Super Bowl, there’s double the amount of data: The team has a single opponent, which they need to know as well as they know themselves.

Someone didn't pass that script on to the Seahawks...or to SAP, which used its predictive analytics software to predict a result of Denver 26-Seattle 23. Yikes!

Then I saw Vijay Vijayasankar's discussion about the perils of predictive analytics. He makes the crucial points:

Predictive Analytics in general cannot be used to make absolute predictions when there are so many variables involved. In fact – I think there is no place for absolute predictions at all. And when the results are explained to the non-statistical expert user – it should not be dumbed down to the extent that it appears to be an absolute prediction.

Predictive models make assumptions – and these should be explained to the user to provide the context. And when the model spits out a result – it also comes with some boundaries (the probability of the prediction coming true, margin of error, confidence etc). When those things are not explained – predictive Analytics start to look like reading palms or tarot cards. That is a disservice to Predictive Analytics.

That's one reason why companies need people who truly understand the limitations of applying statistical algorithms AND can communicate those limitations appropriately. Plus - we should always be mindful of asking the right questions in the right context.
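Vijay's point can be made concrete with a toy sketch: the same model that produces a "Denver 26, Seattle 23" headline also carries a win probability and a margin-of-error band, and those numbers read very differently from a bare score. Everything below is invented for illustration - the normal score distributions, the 26/23 means and the plus-or-minus 10 spreads are assumptions of mine, not SAP's model.

```python
import random
import statistics

random.seed(1)

def predict(mean_a, sd_a, mean_b, sd_b, n=100_000):
    """Toy simulation: model each team's score as a normal draw and
    report the point prediction together with its uncertainty."""
    margins = sorted(random.gauss(mean_a, sd_a) - random.gauss(mean_b, sd_b)
                     for _ in range(n))
    win_prob = sum(m > 0 for m in margins) / n
    # 95% interval from the empirical 2.5th and 97.5th percentiles
    lo, hi = margins[int(0.025 * n)], margins[int(0.975 * n)]
    return win_prob, statistics.mean(margins), (lo, hi)

# Hypothetical inputs: Denver expected ~26 points, Seattle ~23,
# each with a standard deviation of 10 points
p, mean_margin, (lo, hi) = predict(26, 10, 23, 10)
print(f"Denver win probability: {p:.0%}")
print(f"Expected margin: {mean_margin:+.1f} points "
      f"(95% interval {lo:+.1f} to {hi:+.1f})")
```

With these made-up spreads, the "Denver by 3" headline comes with a win probability of only around 60% and an interval dozens of points wide - which is exactly the context Vijay argues should never be stripped out when results are presented.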

So for example, under normal conditions, the application of Monte Carlo simulation and game mechanics to algorithmic stock trading is fine. Algo-trading was a very successful strategy in the 2000s, in large measure because so many machine-based trades were operating. It was only when bearish human sentiment was added into the mix that the market falls and bank collapses of the late 2000s really took hold.
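For context on what such a simulation looks like, here is a minimal Monte Carlo sketch of a stock price under geometric Brownian motion, a textbook model often used in algo-trading research. It is illustrative only: the drift and volatility figures are invented, and the model's calm-market assumptions are precisely what broke down once bearish human sentiment entered the market.

```python
import math
import random
import statistics

random.seed(7)

def gbm_final_prices(s0, mu, sigma, days, n_paths=5_000):
    """Simulate terminal prices under geometric Brownian motion:
    each day the price is multiplied by a lognormal random step."""
    dt = 1 / 252  # one trading day as a fraction of a year
    prices = []
    for _ in range(n_paths):
        s = s0
        for _ in range(days):
            s *= math.exp((mu - 0.5 * sigma ** 2) * dt
                          + sigma * math.sqrt(dt) * random.gauss(0, 1))
        prices.append(s)
    return prices

# Invented parameters: $100 stock, 5% annual drift, 20% annual volatility
prices = gbm_final_prices(s0=100.0, mu=0.05, sigma=0.2, days=252)
print(f"mean terminal price over one simulated year: "
      f"{statistics.mean(prices):.2f}")
```

The model behaves well as long as its assumptions hold; it has no way to represent a sudden, correlated shift in human sentiment, which is the Black Swan territory the rest of this piece is about.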

In the case of sports, it seems that no amount of statistical analysis can outwit the human condition. Vijay notes:

I decided to skip watching the India vs New Zealand cricket series thinking India will win this 5-0 and it will be boring. I was close on my gut prediction – the score was 4-0, just that India was on the losing side of that equation. On the bright side, I am happy that I didn't have to watch the massacre and live with the nightmares.

For my part, I sat through the mauling that Australia handed out to England during the winter Test cricket series - a 5-0 whitewash. That too was very much against all the pundits' predictions. I don't know about the Super Bowl or the India/NZ matches, but I saw for myself how a determined team (Australia) could not only spring a surprise on another team but, through relentless and aggressively inventive play, utterly destroy the mental fitness of a side accustomed to winning.

I'm firmly of the view that no amount of predictive modeling/analysis can take those factors into account. That's why we have Black Swans.

Has predictive analysis been dealt a death blow? Not at all. We have already discussed some of the findings of studies that suggest algorithmic analysis outperforms intuition. We should however take great care which topics we choose to predict and the context in which those predictions are made.

Disclosure: at the time of writing, SAP is a premier partner.

Images via The Waiver Wire - which also muffed the predictions by some margin.