
We Need To Stop Fooling Ourselves And Get Our Facts Straight. That Takes Work.

2023 April 23
by Greg Satell

Mehdi Hasan’s brutal takedown of Matt Taibbi was almost painful to watch. Taibbi, a longtime muckraking journalist of some renown, was invited by Elon Musk to review internal communications that came to be known as the Twitter Files and made big headlines with accusations regarding government censorship of social media.

Yet as Hasan quickly revealed, Taibbi got basic facts wrong, either not understanding what he was looking at, doing sloppy work, or simply being disingenuous. What Taibbi was reporting as censorship was, in fact, a normal, deliberative process for flagging problematic content, most of which was not taken down.

He looked foolish, but I could feel his pain. In both of my books, I made similarly foolish errors. The difference was that I sent out sections to be fact-checked by experts and people with first-hand knowledge of events before I published. The truth is that it’s not easy to get facts straight. It takes hard work and humility to get things right. We need to be careful.

A Stupid Mistake

Some of the most famous business stories we hear are simply not accurate. Gurus and pundits love to tell you that after inventing digital photography Kodak ignored the market. Nothing could be further from the truth. In fact, its EasyShare line of cameras was a top seller. It also made big investments in quality printing for digital photos. The problem was that it made most of its money on developing film, a business that completely disappeared.

Another popular fable is that Xerox failed to commercialize the technology developed at its Palo Alto Research Center (PARC), when in fact the laser printer developed there saved the company. What also conveniently gets left out is that Steve Jobs was able to get access to the company’s technology to build the Macintosh because Xerox had invested in Apple and then profited handsomely from that investment.

But my favorite mistold myth is that of Blockbuster, which supposedly ignored Netflix until it was too late. As Gina Keating, who covered the story for years at Reuters, explains in her book Netflixed, the video giant moved relatively quickly and came up with a successful strategy, but the CEO, John Antioco, left after a fight with investor Carl Icahn and the strategy was reversed.

Yet that’s not exactly how I told the story. For years I reported that Antioco was fired. I even wrote it up that way in my book Cascades until I contacted the former CEO to fact-check it. He was incredibly generous with his time, corrected me and then gave me additional insights that improved the book.

To this day, I don’t know exactly why I made the mistake. In fact, as soon as he pointed it out I knew I was wrong. Somehow the notion that he was fired got stuck in my head and, with no one to correct me, it just stayed there. We like to think that we remember things as they happened, but unfortunately our brains don’t work that way.

Why We Get Fooled

We tend to imagine that our minds are some sort of machines, recording what we see and hear, then storing those experiences away to be retrieved at a later time, but that’s not how our brains work at all. Humans have a need to build narratives. We like things to fit into neat patterns and fill in the gaps in our knowledge so that everything makes sense.

Psychologists often point to a halo effect, the tendency for an impression created in one area to influence opinion in another. For example, when someone is physically attractive, we tend to infer other good qualities and when a company is successful, we tend to think other good things about it.

The truth is that our thinking is riddled with subtle yet predictable biases. We are apt to be influenced not by the most rigorous information, but what we can most readily access. We make confounding errors that confuse correlation with causality and then look for information that confirms our judgments while discounting evidence to the contrary.

I’m sure that both Matt Taibbi and I fell into a number of these pitfalls. We observed a set of facts, perceived a pattern, built a narrative and then began filling in gaps with things that we thought we knew. As we looked for more evidence, we seized on what bolstered the stories we were telling ourselves, while ignoring contrary facts.

The difference, of course, is that I went and checked with a primary source, who immediately pointed out my error and, as soon as he did, it broke the spell. I immediately remembered reading in Keating’s book that he resigned and agreed to stay on for six months while a new CEO was being hired. Our brains do weird things.

How Our Errors Perpetuate

In addition to our own cognitive biases, there are a number of external factors that conspire to perpetuate our beliefs. The first is that we tend to embed ourselves in networks of people whose experiences and perspectives are similar to our own. Scientific evidence shows that we conform to the views around us and that the effect extends out to three degrees of relationships.

Once we find our tribe, we tend to view outsiders suspiciously and are less likely to scrutinize allies. In a study of adults who were randomly assigned to “leopards” and “tigers,” fMRI scans revealed hostility to out-group members. Research from MIT suggests that when we are around people we expect to agree with us, we don’t check facts closely and are more likely to share false information.

In his book How Minds Change, David McRaney points out that people who are able to leave cults or reject long-held conspiracy theories first build alternative social networks. Our associations form an important part of our identity, so we are loath to abandon opinions that signal inclusion in our tribe. There are deep evolutionary forces that drive us to be stalwart citizens of the communities we join.

Taibbi was, for years, a respected investigative journalist at Rolling Stone magazine. There, he had editors and fact checkers to answer to. Now, as an independent journalist, he has only the networks that he chooses to give him feedback and, being human like all of us, he subtly conforms to a set of dispositions and perspectives.

I probably fell prey to similar influences. As someone who researches innovation, I spend a lot of time with people who regard Netflix as a hero and Blockbuster as something of a bumbler. That probably affected how I perceived Antioco’s departure from the company. We all have blind spots and fall prey to the operational glitches in our brains. No one is immune.

Learning How To Not Fool Ourselves

In one of my favorite essays, the physicist Richard Feynman wrote, “The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that.” He goes on to say that simply being honest isn’t enough; you also need to “bend over backwards” to provide information so that others may prove you wrong.

So the first step is to be hyper-vigilant and aware that your brain has a tendency to fool you. It will quickly latch onto the most readily available data and detect patterns that may or may not be there. Then it will seek out other evidence that confirms those initial hunches while disregarding contrary evidence.

This is especially true of smart, accomplished people. Those who have been right in the past, who have proved the doubters wrong, are going to be less likely to see the warning signs. In many cases, they will even see opposition to their views as evidence they are on the right track. There’s a sucker born every minute and they’re usually the ones who think that they’re playing it smart.

Checking ourselves isn’t nearly enough; we need to actively seek out other views and perspectives. Some of this can be done with formal processes such as pre-mortems and red teams, but a lot of it is just acknowledging that we have blind spots, building the habit of reaching out to others and improving our listening skills.

Perhaps most of all, we need to have a sense of humility. It’s far too easy to be impressed with ourselves and far too difficult to see how we’re being led astray. There is often an inverse relationship between our level of certainty and the likelihood that we are right. We all need to make an effort to believe less of what we think.


Greg Satell is Co-Founder of ChangeOS, a transformation & change advisory, an international keynote speaker, and bestselling author of Cascades: How to Create a Movement that Drives Transformational Change. His previous effort, Mapping Innovation, was selected as one of the best business books of 2017. You can learn more about Greg on his website, and follow him on Twitter @DigitalTonto



