The Pitfalls of Prediction

2011 December 4

Prognostication is a multi-billion-dollar industry.  We have weathermen, Wall Street analysts, political pundits and futurologists.  They all claim some expertise.

These people exist because there is strong demand for their services.  Businesses need to create budgets.  People have to decide what to wear.  Politicians are expected to anticipate issues that will matter to society.  Without predictions, there can be no plans.

Yet as Philip Tetlock discovered in his 20-year study, experts are little better at predicting the future than a coin flip.  Moreover, the more specialized the expertise, the worse the predictive performance tends to be.  In other words, the people who get paid to know the most do the worst.  How can this be?

Messy Data

The problem starts when smart people in nice suits and lab coats proclaim that “the data says…”  In truth, the data never says anything.  We interpret it one way or another, and there are lots of ways to interpret it incorrectly.

Data is, after all, messy.  It doesn’t spring forth whole, but must be collected in some way. We count, measure, survey, aggregate, slice and dice, picking up errors all the time.  We need to make choices about which data we want to focus on and which fades into the background.

Moreover, as I explained in an earlier post, the mathematics we have long used to build statistical models has been found wanting.  Cross-interactions in massive data sets create feedback and, rather than following relatively tame Gaussian patterns, we get chaotic ones with emergent properties.  Mere extrapolation of current trends is wholly inadequate.
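A quick, stdlib-only sketch makes the point concrete.  The two distributions below are my own illustrative choices (not from the post): one tame Gaussian world, one fat-tailed world where extrapolating from the sample you happen to have badly understates the extremes.

```python
import random

random.seed(42)
N = 100_000

# Gaussian world: even the most extreme of 100,000 draws
# sits only a few standard deviations from the mean.
gauss = [random.gauss(0, 1) for _ in range(N)]

# Fat-tailed world (Pareto, shape = 1.5): a single draw can dwarf
# everything that came before it, so past samples tell you very
# little about the worst case still to come.
pareto = [random.paretovariate(1.5) for _ in range(N)]

print(round(max(gauss), 2))   # a single-digit number
print(round(max(pareto), 2))  # typically in the hundreds or thousands
```

The exact numbers don't matter, only the shape: in the second world, no amount of historical data pins down the extremes.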

Data is anything but objective.  We bring our own biases to how we collect, process, interpret and report it, often without realizing the extent to which we’re putting our own stamp on it.

What You See is All There Is

Nobel laureate Daniel Kahneman explains another reason why our predictions often fail in this article.  He recounts that, based on how soldiers performed on a test, he was supposed to predict their future success as leaders.  When he followed up later, he found that his supposedly informed judgments were little better than random guesses.

What he found interesting is that, even armed with the knowledge that the tests didn’t work, he was no less sure of his assessments while administering them.  If a soldier performed poorly on the course, he was sure that he would make a poor leader.  No amount of academic knowledge could overcome the power of his firsthand observation.

He calls this principle WYSIATI (What You See Is All There Is) and it is a fundamental aspect of prospect theory.  Our brains are wired for survival, not for analysis.  Therefore, we tend to make judgments based on information close at hand and then apply them to the world at large.  That’s a good skill for emergency room doctors, but not so good for economic forecasters.

Another point Kahneman brings up is that we tend to give more weight to the first information that we see.  For instance, if we hear that “Jane is intelligent and strong”, we will tend to make excuses for her when we later hear that she is corrupt and cruel, but if we hear the bad attributes first, we’ll tend to judge that being intelligent and strong only makes her more villainous.

Black Swans

We take it on faith that past performance will point the way to future results and that’s generally how predictions are made.  We analyze data, search our experience and extrapolate out to form a picture of what we expect to come.

However, as Nassim Taleb points out in his book The Black Swan, that blinds us to low-probability, high-impact events that can have enormous ramifications.  Turkeys, after all, live quite nicely until Thanksgiving Day.  Statisticians tell us that playing the lottery is a waste of money, but people do win.
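The statisticians’ complaint is just expected-value arithmetic.  Here is the calculation with invented numbers (the ticket price, odds and jackpot below are hypothetical, not from the post):

```python
# Hypothetical lottery: $2 ticket, 1-in-300-million odds of
# winning a $100 million jackpot (all figures invented for
# illustration).
ticket_price = 2.0
p_win = 1 / 300_000_000
jackpot = 100_000_000

# Average profit per ticket: win probability times the prize,
# minus what the ticket cost you.
expected_value = p_win * jackpot - ticket_price
print(f"${expected_value:.2f}")  # roughly -$1.67 per ticket, on average
```

A negative expected value is exactly what “waste of money” means, and yet, as Taleb’s point goes, somebody still wins.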

History has, in large part, been driven by the improbable and the future will be too.

What makes black swans so nefarious is Kahneman’s WYSIATI phenomenon.  It is, after all, our past which is close and familiar and we are secure in our knowledge of it.  We then discount the possibility of facts not in evidence.  The more we review and analyze, the more certain we become.

Unfortunately, that certainty is a mirage: an artifact of our primitive psychology.

Reflexivity and Feedback

Prognostication can also create self-fulfilling prophecies: predictions shape expectations about the future, which in turn alter human behavior.  After all, the reason we make predictions in the first place is so that we can align our actions with future events.  Billionaire investor George Soros calls this effect reflexivity.

Sometimes, reflexivity can be positive, as in the case of Moore’s Law.  In 1965, Gordon Moore noticed that the number of transistors on a microchip was doubling roughly every two years.  He then pushed Intel to keep that pace, which pushed competitors, created expectations for customers and so on.  Half a century later, Moore’s Law still holds.
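The arithmetic behind that half-century is worth spelling out.  Assuming a doubling every two years (the commonly cited period), fifty years compounds like this:

```python
# Doubling every two years for fifty years
# (the two-year period is the commonly cited figure,
# used here purely for illustration).
years = 50
doubling_period = 2

doublings = years // doubling_period   # 25 doublings
growth = 2 ** doublings
print(f"{growth:,}x")  # 33,554,432x - a roughly 33-million-fold increase
```

Exponential processes like this are also why straight-line extrapolation of “current trends” misleads: each doubling adds more than all the previous ones combined.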

However, Soros himself is most concerned about the way that reflexivity affects financial markets.  People get the idea in their heads that you can make a lot of money in real estate and that idea affects others.  Next thing you know, housing prices are going up just because people think they are supposed to and a bubble emerges.

An analyst looking at data trends would reasonably conclude that real estate prices will continue to go up.  However, as we have seen, sentiment can be fickle and eventually prices come crashing down to reality (in truth, they usually shoot past that, which is how Soros became so rich).

The Best Way to Predict the Future is to Create It

Steve Jobs is often described as a visionary.  His biographer, Walter Isaacson, notes that he revolutionized 7 industries (personal computing, animated movies, music, phones, tablets, publishing and retail).  Yet he eschewed all forms of market analysis and declined to predict trends.

What he did do, however, was create.  He would see things that he didn’t like in the world and set out to change them, make them better.  There’s no way to download music legally?  Hogwash! Complicated, hard to use phones?  There’s got to be a better way. Tablet computers with a stylus?  Outrageous!

And that’s what most analysts miss.  The future is hard to predict not just because of our cognitive biases or inexplicable natural events, but because we have the power to make our own future.

– Greg

6 Responses
  1. December 5, 2011

    Greg, great summary of the problem. You may enjoy this excellent piece from Richard Danzig (former US Secy. of the Navy), “Driving in the Dark.” Google it. I think you will find that his thinking comports with yours, but given his perspective on acquisition of military equipment, which is a VERY long term proposition, he has some interesting ideas about what to do in the face of uncertainty. -Bill

  2. December 5, 2011

    Thanks. I’ll check it out.


  3. December 9, 2011

    About ten years ago, I attended a briefing on forecasting by the University of Warwick. One of their nuggets of wisdom was that only two groups of people regularly make forecasts that are any better than random guesses: weathermen and poker players.

    Why these people? Because they make predictions based on firm data and have their predictions validated (or not) almost immediately. This enables them to calibrate their methods (or in layman’s terms, get good at it).

  4. December 9, 2011

    Interesting. Thanks.

    Have a great weekend.

    – Greg

  5. December 10, 2011

    Interesting post Greg. Glad I discovered your blog.

  6. December 10, 2011

    I’m glad you did too!

    Come back again.

    – Greg
