Why Smart People Are So Easily Fooled
When I lived in Moscow, my gym was just a five-minute walk from my flat. So rather than use a locker, I would just run over in my shorts and a jacket no matter what the weather was. The locals thought I was crazy. Elderly Russians would sometimes scream at me to go home and get dressed properly.
I had always heard that Russians were impervious to the effects of weather, but the truth is that they get cold just like the rest of us. We tend to mythologize the unknown. Our brains soak up patterns from what we see, but those impressions are often unreliable, such as the Hollywood images that shaped my views about Russians and their supposed imperviousness.
The problem is that myths often feel more real than facts. We have a tendency to seize on the information that is most accessible, not the most accurate, and then to interpret new evidence through that prior perception. We need to accept that we can’t avoid our own cognitive biases. The unavoidable truth is that we’re easiest to fool when we think we’re being clever.
Inventing Myths
When Jessica Pressler first published her story about Anna Sorokin in New York Magazine, it could scarcely be believed. A Russian émigrée with no assets to speak of had somehow managed to convince the cream of New York society that she was, in fact, a wealthy German heiress, and swindled them out of hundreds of thousands of dollars.
Her crimes pale in comparison to those of Elizabeth Holmes of Theranos, who made fools of the elites on the opposite coast. Attracting a powerful board that included Henry Kissinger (but no one with expertise in the life sciences), the twenty-something entrepreneur convinced investors that she had invented a revolutionary blood-testing technology, raising some $700 million.
In both cases, there was no shortage of opportunities to unmask the fraud. Anna Sorokin left unpaid bills all over town. Despite Holmes’s claims, she was never able to produce a single peer-reviewed study showing that her technology worked, even after 10 years in business. Whistleblowers emerged from both inside and outside the company.
Still, many bought the ruses and interpreted the facts to support them. Sorokin’s unpaid bills were seen as proof of her wealth. After all, who but the fabulously rich could be so nonchalant about money? In Holmes’s case, her eccentricities were taken as evidence that she truly was a genius, in the mold of Steve Jobs or Mark Zuckerberg.
The Halo Effect
People like Sorokin and Holmes intentionally prey on our weaknesses. Whenever anybody tried to uncover the facts, they threw up elaborate defenses, making counter-accusations against anyone who dared to question them. Often, they used relationships with powerful people to protect themselves. At Theranos, there was very strict corporate security and an army of lawyers.
Still, it doesn’t have to be so diabolical. As Phil Rosenzweig explains in The Halo Effect, when a company is doing well, we tend to see every aspect of the organization in a positive light. We assume a profitable company has wise leadership, motivated employees and a sound strategy. At the same time, we see the traits of poorly performing firms in a negative light.
But what if it’s the same company? Rosenzweig points out that, when Cisco was at its peak before the dot-com bust, it was said to have an “extreme customer focus.” But a year later, when things went south, Cisco was criticized for “a cavalier attitude toward potential customers” and “irksome” sales policies. Did its culture really change so much in a year?
Business pundits, in ways very similar to swindlers, prey on how our minds work. When they claim that companies employing risky strategies outperform those that don’t, they are leveraging survivorship bias: firms that took big risks and failed are never counted in the analysis. When consulting companies survey industry executives, they are relying more on social proof than on uncovering expert opinion.
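To make the survivorship effect concrete, here is a toy simulation. It is purely my own illustration, not anything from the pundits’ studies, and every number in it is invented. It shows how a risky strategy can look brilliant once the failures have quietly dropped out of the sample:

```python
import random

random.seed(42)  # reproducible illustration

def run_firm(risky):
    """Return a firm's value after 10 years, or None if it went bust."""
    value = 1.0
    for _ in range(10):
        if risky:
            # Hypothetical risky strategy: triple, or lose 60%, each year.
            value *= 3.0 if random.random() < 0.2 else 0.4
        else:
            # Hypothetical steady strategy: modest, reliable growth.
            value *= random.uniform(1.0, 1.07)
        if value < 0.3:    # below this threshold the firm fails...
            return None    # ...and vanishes from any later "study of winners"
    return value

risky = [run_firm(True) for _ in range(10_000)]
safe = [run_firm(False) for _ in range(10_000)]
survivors = [v for v in risky if v is not None]

print(f"risky firms still alive after 10 years: {len(survivors) / len(risky):.1%}")
print(f"average value of surviving risky firms: {sum(survivors) / len(survivors):.2f}")
print(f"average value of ALL risky firms:       {sum(v or 0.0 for v in risky) / len(risky):.2f}")
print(f"average value of steady firms:          {sum(safe) / len(safe):.2f}")
```

In runs of this toy model, the handful of surviving risky firms look like geniuses, while the full population of risky firms does far worse than the steady ones. The failures are exactly the slice of data the pundits leave out.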
The Principle Of Reflexivity
In the early 1970s, a young MBA student named Michael Milken noticed that debt rated below investment grade could provide higher risk-adjusted returns than other investments. He decided to create a market for these so-called junk bonds and, by the 1980s, was making a ton of money.
Then everybody else piled in, and prices rose so high that the bonds no longer compensated for their risk. Nevertheless, investors continued to rush in. Inevitably, the bubble popped and the market crashed as the crowds rushed for the exit. Many who were considered “smart money” lost billions.
That’s what George Soros calls reflexivity. Expectations aren’t formed in a vacuum, but in the context of others’ expectations. If many believe that the stock market will go up, we’re more likely to believe it too. That makes the stock market actually go up, which only adds fuel to the fire. Nobody wants to get left out of a good thing.
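The feedback loop is easy to see in a toy model. The sketch below is my own illustration, not anything of Soros’s, and every parameter in it is invented:

```python
# Toy model of reflexivity (illustrative parameters only): nothing about
# the asset's fundamentals changes, yet expectations alone inflate a
# bubble until confidence snaps and the crowd rushes for the exit.
price, sentiment, popped = 100.0, 0.0, False
path = []
for day in range(60):
    price += sentiment        # expectations, not fundamentals, move the price
    if not popped and price > 160:
        # The gap from fundamental value (100) grows too wide; confidence snaps.
        popped, sentiment = True, -15.0
    elif popped:
        sentiment *= 0.8      # panic fades as the price falls back
    else:
        # Rising prices strengthen the belief that prices will keep rising.
        sentiment = 1.2 * sentiment + 0.1
    path.append(round(price, 1))
print(path[::6])  # the bubble inflates slowly, then the crash comes fast
```

The model is deterministic: the price drifts up slowly at first, accelerates as gains feed expectations of more gains, peaks near 170, then collapses below where it started once sentiment reverses, which is roughly the arc the junk bond market traced.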
Very few ever seem to learn this lesson and that’s why people like Anna Sorokin and Elizabeth Holmes are able to play us for suckers. We are wired to conform and the effect extends widely throughout our social networks. The best indication of what we believe is not any discernible fact pattern, but what those around us happen to believe.
Don’t Believe Everything You Think
One of the things that I’ve learned over the years is that it’s best to assume people are smart, hardworking and well intentioned. Of course, that’s not always true, but we don’t learn much from dismissing people as stupid, lazy and crooked. And if we don’t learn from others’ mistakes, then how can we avoid the same failures?
Often, smart people get taken in precisely because they’re smart. They have a track record of seeing things others don’t, making good bets and winning big. People defer to them, come to them for advice and laugh at their jokes. So for them, a lack of discernible evidence isn’t always a warning sign. It can be an opportunity.
We all need to check ourselves so that we don’t believe everything that we think. There are formal processes that can help, such as pre-mortems and red teams, but most of all we need to own up to the flaws in our own brains. We have a tendency to see patterns that aren’t really there and to double down on bad ideas once we’ve committed to them.
As Richard Feynman famously put it, “The first principle is that you must not fool yourself—and you are the easiest person to fool.” Smart people get taken in so easily because they forget that basic principle. They mythologize themselves and become the heroes of their own stories. That’s why there will always be more stories like “Inventing Anna” and Theranos.
Suckers are born every minute and, invariably, they think they’re playing it smart.
Greg Satell is a transformation & change expert, international keynote speaker, and bestselling author of Cascades: How to Create a Movement that Drives Transformational Change. His previous effort, Mapping Innovation, was selected as one of the best business books of 2017. You can learn more about Greg on his website, GregSatell.com, and follow him on Twitter @DigitalTonto.
Photo by charlesdeluvio on Unsplash