3 Reasons Why Business Thinking Is So Consistently Shoddy
“The single most important message in this book is very simple,” reads the first line of John Kotter’s highly regarded The Heart of Change. “People change what they do less because they are given analysis that shifts their thinking than because they are shown a truth that influences their feelings.”
Really? That’s the important message? That emotive arguments are more powerful than factual arguments? What about other reasons why people change their behavior, such as social proof, conformity, incentives or coercion? By setting up a binary and artificial choice between two communication alternatives, he eliminates important strategic and tactical options.
And it’s not just Kotter, who is, after all, a well-respected professor at Harvard Business School. The truth is that a lot of management thinking is surprisingly shoddy, with arbitrary notions and cognitive biases dressed up as scholarly work. We need to be more skeptical about “research” that comes out of business schools and consultancies. Here are three things to look for:
1. WYSIATI And Confirmation Bias
Kotter’s point about emotive vs. analytic arguments is, of course, completely valid. The fundamental error he makes is that he focuses on that particular aspect to the exclusion of everything else. Daniel Kahneman calls this WYSIATI, or “what you see is all there is.” Once you get tunnel vision on a particular fact or idea, it’s hard to see anything else.
Consider this thought experiment: You go to a conference featuring a powerful, emotive presentation on the need to combat climate change. You see glaciers melting, polar bears losing their habitat and young children starving from drought. Then you go back to the office, fired up and ready to do something about it, but everyone else has a strong argument against acting on climate change.
What is likely to happen next? Do you convince your co-workers—including your bosses—about the urgency of the crisis? Or, surrounded by skeptics, does your conviction begin to wane? When all we see is the poor polar bears and starving children, in an echo chamber of like-minded people, we forget about other considerations. But that doesn’t mean that’s all there is.
An issue related to WYSIATI is confirmation bias. Kotter proudly points out that he worked with Deloitte to conduct extensive research for his book. Amazingly, after analyzing over 200 interviews, he ended up with the same 8-step process he cited in his earlier work. So what was the purpose of the research: to gain actual insights, or to confirm what he thought he already knew?
Perhaps not surprisingly, after decades of organizations applying Kotter’s ideas about change, McKinsey still finds that more than two-thirds of transformational efforts fail. Maybe there is actually more to change than communication strategy.
2. Halo Effects And Confounding Variables
One of the most popular modes of analysis that business thinkers use is to examine successful companies and see what they do differently. A number of bestselling management books, such as In Search of Excellence, have used this method. Unfortunately, when doing so they often fall prey to a cognitive bias known as the halo effect.
For example, in 2000, before the dotcom crash, Cisco was flying high. A profile in Fortune reported it to have an unparalleled culture with highly motivated employees. But just one year later, when the market tanked, the very same publication described it as “cocksure” and “naive.” Did the “culture,” under the very same leadership, really change that much in a year? Or did the perceptions of its performance change?
Cisco had a highly motivated and, some would say, aggressive sales force. When the company was doing well, analysts assumed it was their aggressiveness that produced good results, and when its fortunes changed, that same aggressive behavior was blamed for its failures. This is what’s known as a confounding error: the fact that an aggressive sales force correlated with specific results doesn’t mean that the aggressive sales force caused the results.
Every organization has things it does differently that are idiosyncratic to its management and culture. In some market contexts those traits will be advantageous; in others they may not be. It takes work—and some humility—to separate what’s truly a success factor, what’s merely fit for a narrow purpose and what’s not really relevant.
3. Survivorship Bias
Business school professors and consultants gain fame—not to mention large fees—when they are able to define a novel concept or success factor. If you are able to isolate one thing that organizations should do differently, you have a powerful product to sell. A single powerful insight can make an entire career, which is probably why so many cut corners.
For example, in their study of 108 companies, distinguished INSEAD professors W. Chan Kim and Renée Mauborgne found that “blue ocean” products, those in new categories without competition, far outperform those in the more competitive “red ocean” markets. Their book, Blue Ocean Strategy, was an immediate hit, selling over 3.5 million copies.
Bain consultants Chris Zook and James Allen’s book, Profit from the Core, boasted even more extensive research, encompassing 200 case studies, a database of 1,854 companies, 100 interviews with senior executives and an “extensive review” of existing literature. They found that firms that focused on their “core” far outperformed those that strayed.
It doesn’t take much thinking to start seeing problems. How can you both “focus on your core” and seek out “blue oceans”? It defies logic that both strategies could outperform one another. And how do you define “core”? Core markets? Core capabilities? Core customers? While it’s true that “blue ocean” markets lack competitors, they don’t have any customers either. Who do you sell to?
Yet there is an even bigger, more insidious problem called survivorship bias. Notice how “research” doesn’t include firms that went out of business because there were no customers in those “blue oceans” or because they failed to diversify outside of their “core.” The data only pertains to those that survived.
It’s hard to think of any other field where researchers could get away with such obviously careless work. Can you imagine medical research that didn’t include patients who died, or airplane research that didn’t include the flights that crashed? Suffice it to say that in the two decades since the two books were published, they’ve shown no capacity to predict whether a business will succeed or fail.
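To make the statistical point concrete, here is a toy simulation (my own illustrative sketch, with made-up numbers, not data from either book): imagine firms that make a risky “blue ocean” bet, which usually wipes them out but pays off big for the few that survive. If you average returns across all firms, the risky strategy is a money-loser. But if, like the researchers above, you study only the survivors, it looks spectacularly successful.

```python
import random

random.seed(0)

N = 100_000  # hypothetical firms per strategy
results = {"risky": [], "safe": []}

for _ in range(N):
    # Risky "blue ocean" bet: 10% chance of a 5x return, 90% chance of total loss.
    results["risky"].append(5.0 if random.random() < 0.10 else -1.0)
    # Safe "core" business: always a modest 10% return.
    results["safe"].append(0.10)

for name, returns in results.items():
    # Survivors are the firms that didn't lose everything.
    survivors = [r for r in returns if r > -1.0]
    avg_all = sum(returns) / len(returns)
    avg_survivors = sum(survivors) / len(survivors)
    print(f"{name:>5}: all firms = {avg_all:+.2f}, survivors only = {avg_survivors:+.2f}")
```

Counting every firm, the risky strategy has a negative expected return (roughly 0.10 × 5 − 0.90 × 1 = −0.40), while the safe strategy earns its steady +0.10. Counting survivors only, the risky strategy appears to return +5.00 per firm, because the 90% of firms that died have vanished from the sample. That is survivorship bias in miniature.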
Don’t Believe Everything You Think
When I’m finishing up a book, I send out sections to be fact-checked by experts and those who have first-person knowledge of events. I’m always amazed at how much I get wrong. In some cases, I make truly egregious errors about facts I should have known (or did know, but failed to take into account). It can be an incredibly humbling process.
That’s why it’s so important not to believe everything you think; there are simply too many ways to get things wrong. As Richard Feynman put it, “The first principle is that you must not fool yourself—and you are the easiest person to fool.” I would add a second principle: just because you’ve managed to fool others doesn’t mean you’ve gotten it right.
Unfortunately, so many of the popular management ideas today come from people who never actually operated a business, such as business school professors and consultants. These are often people who’ve never failed. They’ve been told that they’re smart all their lives and expect others to be impressed by their ideas, not to examine them thoroughly.
The problem with so much business thinking today is that there is an appalling lack of rigor. That’s the only way that obviously flawed ideas such as “blue oceans,” “profiting from the core” and John Kotter’s ideas about change management are able to gain traction. It’s hard to imagine any other field with such a complete lack of quality control.
That’s why I send out fact checks, because I know how likely I am to think foolish and inaccurate things. I’ve also noticed that I tend to be most wrong when I think I’ve come up with something brilliant. Much as Tolstoy wrote about families, there are infinitely more ways to get things wrong than to get things right.
Greg Satell is a transformation & change expert, international keynote speaker, and bestselling author of Cascades: How to Create a Movement that Drives Transformational Change. His previous effort, Mapping Innovation, was selected as one of the best business books of 2017. You can learn more about Greg on his website, GregSatell.com and follow him on Twitter @DigitalTonto