
The Problem With Facts

2013 February 3

In the classic TV show Dragnet, Sergeant Joe Friday famously admonished witnesses to give him “just the facts.”  Generations of business executives have adopted the same approach, demanding substantiation rather than conjecture.

The problem is that the world is a confusing place and there are plenty of facts to go around.  A quick Google search is all that is required to find the facts to support any argument.  Studies conflict with other studies, contexts shift and the game goes on.

Yet even that understates the problem.  Even truths borne out by rigorous analysis are often undone by a rapidly changing world.  Last year’s truths are often today’s red herrings.  As rapid technological change transforms politics, culture and economics, we need a new approach that is based less on false certainty and more on simulation.

Cargo Cults

During World War II, the fighting in the Pacific theater was fierce.  Small islands became improvised bases and large amounts of supplies were airdropped to feed the war machine. Food, medicine, weapons and even vehicles appeared from the sky, as if by magic.  Once the conflict ended, the flow of manufactured goods mysteriously disappeared.

Alarmed by this sudden turn of events, some of the indigenous island peoples sought to replicate the conditions that led to the benevolence the visiting troops enjoyed.  They built makeshift airfields and offices, fashioned radios and headsets from wood and coconuts and even marched with makeshift rifles in imitation of soldiers’ drills.


In some remote areas, Pacific islanders still perform military rituals, hoping that valuable cargo will come from the sky.

Alas, no cargo ever came.  Anthropologists have named these groups cargo cults and it’s fun to laugh at their naiveté.  They confuse correlation with causality.  Clearly, mimicking superficial behaviors achieves nothing, and those who think it does are simply fooling themselves.

However, similar rituals are alarmingly common in the corporate world.  Executives worship their own gods (like Steve Jobs, or whoever else is the darling of the business press at any given moment), hoping that by emulating these idols’ superficial behaviors, fortune will smile upon them as well.  I’ve come to call these people cargo cult marketers.

Looking For Support Rather Than Illumination

We like to think we’re rational, but we’re really not.  In fact, the basis of our beliefs is often strictly irrational.

We’re very susceptible to previous suggestions (a phenomenon that psychologists call priming) and will pay much more attention to easily available information (the availability heuristic) when making decisions.  For instance, a recent pile-up on the news will affect our driving behavior more than comprehensive statistics will.

Once we’ve hit on a belief, we will tend to focus on facts that seem to confirm it (i.e. confirmation bias).  The physicist Richard Feynman had this to say about the problem with facts in a famous speech about cargo cults:
 

The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that. After you’ve not fooled yourself, it’s easy not to fool other scientists. You just have to be honest in a conventional way after that.

 
So how do you avoid fooling yourself?  You put the data first.  You search for illumination rather than support.  You begin with doubt, rather than certainty.  You act, in other words, scientifically, which unfortunately is a term so often misused that it bears some explaining.

Faith and Science

“Science” is a word that gets thrown around a lot.  We’re told that “scientists” have “proven” global warming and then that other “scientists” have expressed doubt.  We’re told that evolution is real science, but intelligent design is not.  Why is that?

The fundamental value of science is that it is falsifiable.  It makes predictions that can be refuted.  The predictions of climate scientists can be compared against observable data and, as temperatures rise, we gain confidence in the hypothesis.  Darwin gave us more than explanations of the past; he described a testable mechanism that we can verify.

Creationists and advocates of the paranormal give us none of those things.  They merely state assertions that we may or may not find plausible and which may or may not be true. We can choose to believe them, gain strength from our faith in them and they may even offer lessons that make us better people.  Faith can be a very positive thing.

What we can’t do is disprove them, and that’s the difference between science and pseudoscience.  Science can always be disproven; matters of faith cannot be.

Perpetual Beta

So if we can never be sure that we’re right, only that we’re wrong, what do we do?

Tim O’Reilly, long a fixture in Silicon Valley, likes to talk about perpetual beta.  The idea is that products should be constantly updated.  Google’s Gmail, for example, was in “beta” until 2009, five years after it had launched.  By then it had become one of the most popular email services on the planet and still wasn’t considered finished!

Anybody who has been involved with developing technology products knows what a painstaking process this can be.  Seemingly endless development meetings and usability testing along with exasperating feature launches and redesigns can certainly inflame passions.  However, there is simply no other way to build a great product.

“Fail cheap and fail fast” has become a mantra among start-ups, but larger organizations have problems adopting the same approach.  When thousands of jobs are at stake, you have to be careful about what you experiment with. Fortunately, there is another way.

Bayesian Strategy

Way back in the 1740s, the Presbyterian minister Thomas Bayes suggested that we should not be shy about guessing.  No matter how wrong our initial guess, subsequent evidence will correct it and we will become less wrong over time.  The method, called Bayesian inference, was long sidelined in favor of more controlled frequentist methods, but it’s coming back.

In far-reaching and varied fields, we are learning to simulate failure in order to succeed in the real world.  Digital marketers reserve a small part of their campaigns for A/B testing so that they can improve results in real time.  Logistics operations run millions of simulated routes on computers before choosing the best one.
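
To make this concrete, here is a minimal sketch of a Bayesian A/B test in Python.  The conversion numbers and traffic figures are made up for illustration, and the flat Beta(1, 1) prior is just one reasonable starting guess:

```python
import random

# Hypothetical A/B test results (invented numbers for illustration):
# variant A converted 38 of 1,000 visitors, variant B converted 54 of 1,000.
a_conversions, a_visitors = 38, 1000
b_conversions, b_visitors = 54, 1000

def posterior_sample(conversions, visitors):
    # Start from a flat Beta(1, 1) prior and let the data update it.
    # For yes/no outcomes the posterior is simply
    # Beta(1 + successes, 1 + failures).
    return random.betavariate(1 + conversions, 1 + visitors - conversions)

# Simulate: given the evidence so far, how often does B beat A?
trials = 100_000
b_wins = sum(
    posterior_sample(b_conversions, b_visitors)
    > posterior_sample(a_conversions, a_visitors)
    for _ in range(trials)
)
print(f"P(B beats A) is roughly {b_wins / trials:.1%}")
```

The point is not the arithmetic but the posture: you commit to a guess, let each new batch of visitors correct it, and shift traffic to the better variant as the evidence accumulates.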

Part of the reason Bayes’ method is becoming popular again is probably that the power of modern computers makes testing easy.  We have a variety of tools, such as Markov chains, agent-based models and other forms of sequential analysis, which allow us to simulate extremely cheaply.  Nate Silver has successfully predicted three elections this way.
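
An election forecast of the kind Silver runs boils down to the same idea: simulate the race many times and count the outcomes.  The sketch below is a toy Monte Carlo version; the states, electoral votes and win probabilities are invented for illustration, not taken from any real forecast:

```python
import random

# Toy poll-driven election model. Every number here is made up.
swing_states = {
    "Ohio":     (18, 0.55),  # (electoral votes, chance of winning the state)
    "Florida":  (29, 0.50),
    "Virginia": (13, 0.58),
    "Colorado": (9,  0.60),
}
SAFE_VOTES = 237  # electoral votes assumed already locked in

def simulate_election():
    votes = SAFE_VOTES
    for electoral_votes, win_prob in swing_states.values():
        if random.random() < win_prob:
            votes += electoral_votes
    return votes

runs = 100_000
wins = sum(simulate_election() >= 270 for _ in range(runs))
print(f"Chance of reaching 270 electoral votes: about {wins / runs:.0%}")
```

None of the individual state calls has to be certain; the simulation turns a pile of uncertain guesses into a single probability that can be updated as new polls arrive.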

However, beyond the technology, we need a change in mindset.  Good strategy is always becoming, never being.  The mindless quest for absolute substantiation leads to false certainty, not greater rigor.

In the end, all you can really do is try to improve your odds as best you can, manage your risk through good portfolio strategies and adapt to changes in the marketplace when they come about.  If you can survive, you can thrive.

– Greg

16 Responses
  1. Carlos permalink
    February 3, 2013

    If you are not busy being born, you are busy dying. (Bob Dylan)

    If you are not busy growing, you are busy liquidating.

    If you are just busy, you may be flying fast but have not planned a destination…eventually you will run out of gas.

    If you are growing but always asking what then, and going out there to see from the outside in, you are a true visionary; for you are not busy creating a cult, only serving your purpose.

  2. February 3, 2013

    Very true, Carlos, but sometimes I just grow tired:-)

  3. Carlos permalink
    February 3, 2013

    We all do. It takes a lot of manure to grow flowers. But, what if there were no flowers? – we must endure the manure…

  4. February 3, 2013

    Now there’s a thought!

    – Greg

  5. February 3, 2013

    Hi Greg – another awesome blog post.
    The confusion of correlation with causality by the pseudoscientific manager has been on my mind for a while and you have written another blog post for me 🙂

    The obsession with data and evidence is in fact exactly that – an obsession and a compulsion – a comforting OCD ritual.

    A while ago I wrote about this in “High Anxiety” – Anxiety as a dimension in organisational culture
    http://martinking.wordpress.com/2009/02/01/anxiety-organisationalculture/

  6. February 3, 2013

    Thanks Martin. I like your post too!

    – Greg

  7. February 4, 2013

    Delightful as always mate. Working in the creative arena my exasperation comes from this oft-fielded request.

    “We need something unique and novel. Could you also provide three case studies of brands who’ve done it previously?”

    Data and substantiation are a thinly-veiled attempt to avoid blame for failure – “I bought IBM so no way I’m responsible” – and most corporate cultures don’t give much latitude for failure…despite all mission statements that suggest otherwise.

    As always, I’ll beat the drum of “scenario planning” because that gives you options, requires both fact-finding and hypothesis-testing, and satisfies both the rational and faith-based pundits amongst us.

  8. February 4, 2013

    Thanks Hilton. Scenario planning is important (essential actually), but I fear even that is becoming too slow. How do you plan for an Instagram or Pinterest? Once technology cycles become shorter than corporate decision cycles (and we’re almost there), planning itself will become obsolete.

    We’re eventually going to have to stop planning and start simulating.

    – Greg

  9. February 4, 2013

    Spectacular. I’m going to be replaced by a simulation. Scary, but not irrational, thought. Perhaps you could explore the linkage between creativity/imagination and simulation in a future post. If your hypothesis comes to pass, both will become vital skills in the years ahead.

  10. February 4, 2013

    I already have:-) https://digitaltonto.com/2013/the-infinite-monkey-theorem/

    You might also want to check out my post on “The Simulation Economy”: https://digitaltonto.com/2013/the-simulation-economy/

    – Greg

  11. February 4, 2013

    Sneaky devil – that was quick. Thanks for your weekly kicks in the cranium. Always welcomed.

  12. February 4, 2013

    Thx for coming!

  13. February 4, 2013

    Greg, I am totally in line with your post, but don’t see AGW as a reinforcing example.

    Global temps in the last 15 years have been flatter than the models anticipated. There is still quite a bit of uncertainty in the theory.

    Add to that your qualification of science as being falsifiable — any shift in temperatures in either direction is hailed after the fact as “proof” of global warming. Those holding to AGW as doctrine haven’t had any luck at all with their predictive models, and the positive reinforcement feedback just isn’t in the math at all. Unusually mild winter? Climate change. Unusual snow in April? Climate change. Glaciers expanding in some places, retreating in others? Climate change, even though they have no idea in advance.

    I want to clarify — CO2 does cause some warming. Just not the multiple degrees that are being forecast for the next 100 years. Those depend on positive feedback mechanisms that have not been observed. And if the people promoting the theory continue adding every anecdotal observation as proof, they are contributing to a theory that is non-falsifiable, and therefore not science.

  14. February 4, 2013

    Thanks for sharing.

    – Greg

  15. mani permalink
    February 6, 2013

    I’m surprised the name of Popper hasn’t come up in any of the comments or the text. (Did I miss it?)

    thanks for the post.

  16. February 7, 2013

    Well, he’s there in spirit in any discussion of falsifiability:-)
