Why The Suckers Always Think They’re Playing It Smart
A 2021 Pew survey found that roughly half of US adults get their news often or sometimes from social media. Those sources are subject to influence not only by run-of-the-mill trolls and hucksters, but also by nation states deliberately looking to shape and distort what we think. Clearly, we live in an era of misinformation and disinformation.
The effect goes far beyond those directly exposed. Much like an epidemic, those influenced by misinformation and disinformation tend to pass it on and, since we tend to be heavily influenced by our local environments, fiction can begin to seem more real than fact. In social theory, this is called the principle of reflexivity.
We tend to assume that getting taken in is due to a lack of education and intelligence, but that’s rarely the case. Smart people get fooled all the time. In fact, those who intend to deceive us often start by flattering our intelligence, making us feel that we’re privy to information that others fail to grasp. It is by boosting our confidence that they take us in.
The Inverse Relationship Between Confidence And Competence
In 1995, McArthur Wheeler and Clifton Earl Johnson robbed two banks in Pittsburgh in broad daylight. Wheeler was completely visible to security cameras, so his picture was broadcast on the 11:00 news and he was quickly apprehended. When confronted with the surveillance tapes, Wheeler stared in disbelief. "But I wore the juice," he said.
Apparently, Wheeler and Johnson had somehow got it into their heads that covering their faces with lemon juice would make them invisible to the cameras. After reading the story in the newspaper, David Dunning, then a psychology professor at Cornell University, suspected that if Wheeler was too stupid to be a bank robber, maybe he was too stupid to know that he was too stupid to be a bank robber.
So Dunning set out with his graduate student, Justin Kruger, to investigate further. They tested subjects across a number of domains, including humor (what others would find funny), logical reasoning and English grammar. After testing was complete, they asked participants to judge their own performance. What they found was fascinating.
In each of the studies, the subjects in the bottom quartile vastly overestimated their performance (which they judged to be slightly above average), while those in the top quartile actually underestimated their performance. After receiving training, however, the poorest performers began to have a more accurate perception of their abilities, while the best performers began to rate themselves higher.
This has become known as the Dunning–Kruger effect and it is largely due to a lack of metacognition in incompetent people. In effect, because they don’t understand what they are doing, they don’t think much about how they are doing it, while highly competent people are hyper-aware of any flaws they might have.
That’s why it’s usually better to think of yourself as careful than smart. Confidence is often inversely related to competence.
Why We Get Fooled
We tend to imagine that our minds are some sort of machines, recording what we see and hear, then storing those experiences away to be retrieved at a later time, but that’s not how our brains work at all. Humans have a need to build narratives. We like things to fit into neat patterns and fill in the gaps in our knowledge so that everything makes sense.
Psychologists often point to a halo effect, the tendency for an impression created in one area to influence opinion in another. For example, when someone is physically attractive, we tend to infer other good qualities and when a company is successful, we tend to look favorably on its practices, while a less successful firm doing the same things may get judged more harshly.
The truth is that our thinking is riddled with subtle yet predictable biases. We are apt to be influenced not by the most rigorous information, but by what we can most readily access. We make confounding errors that mistake correlation for causality and then look for information that confirms our judgments while discounting evidence to the contrary.
It’s incredibly easy to fall into these traps. We observe a set of facts, perceive a pattern, build a narrative and then begin filling in gaps with things that we think we know. As we look for more evidence, we seize on what bolsters the stories we’re telling ourselves, while ignoring contrary facts and perspectives.
How Our Errors Perpetuate
In addition to our own cognitive biases, there are a number of external factors that conspire to perpetuate our beliefs. The first is that we tend to embed ourselves in networks of people whose experiences and perspectives are similar to our own. Scientific evidence shows that we conform to the views around us and that the effect extends out to three degrees of relationships.
Once we find our tribe, we tend to view outsiders with suspicion and are less likely to scrutinize allies. In one fMRI study, adults who were randomly assigned to "leopards" and "tigers" showed hostility toward out-group members. Research from MIT suggests that when we are around people we expect to agree with us, we don't check facts closely and are more likely to share false information.
In his new book, How Minds Change, David McRaney points out that people who are able to leave cults or reject long-held conspiracy theories first build alternative social networks. Our associations form an important part of our identity, so we are loath to change opinions that signal inclusion in our tribe. There are deep evolutionary forces that drive us to be stalwart citizens of the communities we join.
That’s why it’s important to seek out alternative viewpoints and to surround yourself with people who challenge your point of view. The truth is that we’re all vulnerable. We can’t possibly check every fact that comes our way and question every instinct that we have. We can, however, set ourselves up to be less prone to failure.
Learning How To Be Careful
At any given time, there are literally thousands of people looking to fool us. Grifters, politicians and nation states looking to sow discord are constantly bombarding us with falsehoods mixed with just enough truth to seem plausible. Often, these messages reach us through trusted friends and family members and, when we go to verify them online, we're likely to find confirmation (often from the same sources that duped our peers).
Anybody can get fooled.
If we are to avoid getting taken in, we need to be hyper-vigilant and aware that our brains have a tendency to fool us. Our minds will quickly latch onto the most readily available data and detect patterns that may or may not be there. Then they will seek out other evidence that confirms those initial hunches while disregarding contrary evidence.
This is especially true of smart, accomplished people. Those who have been right in the past and have proved the doubters wrong are going to be less likely to see the warning signs. In many cases, they will even see opposition to their views as evidence that they are on the right track, given that they've seen their hunches pay off before.
Merely checking ourselves isn't nearly enough; we need to actively seek out other views and perspectives. Some of this can be done with formal processes such as pre-mortems and red teams, but a lot of it is just acknowledging that we have blind spots, building the habit of reaching out to others and improving our listening skills.
Perhaps most of all, we need to have a sense of humility. It's far too easy to be impressed with ourselves and far too difficult to see how we're being led astray. There is often an inverse relationship between our level of certainty and the likelihood that we're right. We all need to make an effort to believe less of what we think.
Greg Satell is Co-Founder of ChangeOS, a transformation & change advisory, an international keynote speaker, and bestselling author of Cascades: How to Create a Movement that Drives Transformational Change. His previous effort, Mapping Innovation, was selected as one of the best business books of 2017. You can learn more about Greg on his website, GregSatell.com, follow him on Twitter @DigitalTonto, his YouTube Channel and connect on LinkedIn.