How Numbers Lie

It’s easy to laugh off an academic squabble. When overeducated combatants square off in an arena that most people don’t even know exists, few take notice. Yet some of these disputes reverberate outside the academic world, and I suspect that Paul Romer’s assault on mathiness, ably summarized by Justin Fox at Bloomberg View, will be one of them.
The issue at hand is the tendency of economists to cloak ideology in obscure equations to give their views a false appearance of rigor. Well, you might say, that’s what overeducated eggheads do, but seemingly practical-minded business people have their own version of “mathiness.”
When managers say they are data driven and ROI focused they are usually more intent on professing a belief than delivering results. They are, essentially, accidental theorists, putting their faith in an abstract idea rather than engaging in any true analysis of cause and effect. Despite what many will tell you, numbers can lie and only fools follow them blindly.
The Engineering Of Efficiency
Frederick Winslow Taylor must have been a strange sight at the Midvale Steel Works. Unlike most factory foremen, he didn’t bark at his men to work harder and faster, but stood by with a stopwatch, pen and ledger, observing and timing their movements. His aim was to find the one best way to perform every task.
We now know Taylor as the father of scientific management, which later spawned the best practices and Six Sigma movements. In the 1960s and ’70s, Wall Street got into the act, developing the efficient markets hypothesis, which led to the capital asset pricing model (CAPM) to guide capital investments and the Black-Scholes model to mitigate risk.
The goal of these models was to engineer optimized solutions to common business problems. If, as Taylor preached, there was “one best way” of doing things, then managers using math could hone their industrial machines by identifying best practices and deploying them throughout their organizations.
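To see what that math actually looked like, consider the CAPM itself. The formula below is the standard textbook statement of the model (included here for illustration, not quoted from the article’s sources): it prices an asset’s expected return off a single market risk factor, and it leans on simplifying assumptions such as rational investors and frictionless markets.

```latex
% Capital Asset Pricing Model (standard textbook form)
E[R_i] = R_f + \beta_i \,\bigl(E[R_m] - R_f\bigr)
% E[R_i] : expected return on asset i
% R_f    : risk-free rate
% E[R_m] : expected return on the market portfolio
% beta_i : the asset's sensitivity to market movements
```

Nothing in the equation is controversial on its own terms; the trouble comes when the simplifying assumptions behind it are forgotten.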
Alas, this was as much “mathiness” as it was math. Underlying all the numbers and complicated formulas were human assumptions. These were not, in fact, “scientific,” but mere guesses that made the math simpler and more manageable. Unfortunately, they were also very, very wrong.
The Mathematics of “Anything Can Happen”
The idea that results can be “engineered” is an attractive one. It suggests that by knowing inputs we can predict, with a remarkable degree of accuracy, what outputs will be. If true, then performance is mostly a matter of getting the data right. With better measurement and analysis, we should be able to get better results.
Yet Benoit Mandelbrot cast doubt on this neat little story. Although much of the time these models worked and past results did indicate future performance, sometimes they were far off the mark. He argued that economic analysis was too dependent on “Joseph effects,” which supported continuity and neat models, but ignored “Noah effects,” which created discontinuity and blew those same models to bits.
In a sense, he wasn’t telling anybody anything they didn’t know. Statisticians had long been aware that no model is perfect, but they treated stray data points as “outliers” that could be safely ignored. Yet Mandelbrot pointed out that the efficiency-engineering models were failing: outliers like market crashes happened far more often than the models predicted.
So while everyone else was preaching about the wonders of “scientific management,” Mandelbrot was arguing for the mathematics of anything can happen. In his mind, it was the outliers—market crashes, innovations like electric cars and iPhones, world wars and the like—that determined the course of history.
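To make the gap concrete, here is a minimal simulation sketch in Python comparing how often “crash-sized” daily moves show up under a normal (Gaussian) model versus a fat-tailed one. The specific choices (1% daily volatility, a Student-t distribution with 3 degrees of freedom, a 4-sigma threshold) are illustrative assumptions of mine, not parameters Mandelbrot used.

```python
import numpy as np

# Illustrative sketch: how often do "crash-sized" moves occur under a
# thin-tailed (normal) model versus a fat-tailed one? The parameters
# below are assumptions chosen for illustration only.

rng = np.random.default_rng(42)
n_days = 1_000_000           # simulated trading days
sigma = 0.01                 # assume 1% daily volatility
threshold = 4 * sigma        # call anything beyond 4 sigma an "outlier"

# Normal returns: the assumption baked into many risk models
normal_returns = rng.normal(0.0, sigma, n_days)

# Fat-tailed returns: Student-t with 3 degrees of freedom, rescaled so
# both series have the same standard deviation
df = 3
t_returns = rng.standard_t(df, n_days) * sigma / np.sqrt(df / (df - 2))

normal_rate = np.mean(np.abs(normal_returns) > threshold)
fat_rate = np.mean(np.abs(t_returns) > threshold)

print(f"Share of days beyond 4 sigma, normal model:     {normal_rate:.2e}")
print(f"Share of days beyond 4 sigma, fat-tailed model: {fat_rate:.2e}")
print(f"The fat-tailed world produces roughly {fat_rate / normal_rate:.0f}x more outliers")
```

Under the normal model a 4-sigma day is a once-in-decades event; with fat tails it shows up far more often, which is roughly the point Mandelbrot was making about “Noah effects.”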
The System Crashes
For most of his career, Mandelbrot was seen as an iconoclast to be listened to and then ignored. Paul Cootner, one of the pioneers of financial engineering, wrote that Mandelbrot forced his colleagues “to face up in a substantive way to those uncomfortable empirical observations that there is little doubt most of us have had to sweep under the carpet until now.”
Then he added, “but surely before consigning centuries of work to the ash pile, we should like to have some assurance that all of our work is truly useless.” The message was clear: the train had left the station. The dream of a clockwork universe was far too alluring, and there was far too much money to be made, to stop it.
The financial crisis of 2008, just a few years before his death, largely redeemed Mandelbrot in financial circles. The risk models of the financial engineers failed us all. Unforeseen defaults in mortgage markets cascaded through the system and reverberated throughout the entire economy. Risk wasn’t being managed. In fact, it was being amplified.
Unfortunately, the lessons remain surprisingly unlearned. Today’s managers, driven by data and focused on ROI, continually believe that, with just a little more effort and precision, they can keep Mandelbrot’s “Noah effects” at bay and consistently engineer results.
Robustness And Resilience
As he describes in his new book, Team of Teams, when General Stanley McChrystal first took over Special Forces in Iraq, he presided over a magnificently engineered military machine. No force in the world could match their efficiency, expertise and effectiveness. Yet, although they won every battle, they were losing the war.
The problem, he now explains, is that although his force had robust capabilities—they could perform any task they were given—they failed to be resilient in the face of unforeseen circumstances. It wasn’t that they weren’t doing their jobs right, but that they weren’t doing the right jobs—and that was the Achilles’ heel of his elite force.
As McChrystal puts it, “In complex environments, resilience often spells success, while even the most brilliantly engineered fixed solutions are often insufficient or counterproductive.” Or, in business terms, they were performing to plan, but the plan itself was flawed. It was based on assumptions that turned out not to be true.
And that’s the problem with mathiness. For almost any endeavor, there is a simple model that should reasonably lead to good results. However, every model includes assumptions and, unless we understand and account for those assumptions, the model will eventually blow up in our faces. Obscuring that reality with Greek letters and abstract symbols only compounds the problem.
So while mathiness conveys a certain authority, and the idea of “scientifically engineered” solutions sounds attractive, we should remember that science isn’t about certitude, but skepticism. There is never a magic formula that can solve all our problems. A leader’s job is to deal with uncertainty, not ignore it.
– Greg
Economics is the dismal science for a reason, and at times the science of statistics is tortured to make results fit a model even when they do not want to cooperate. Unlike atomic particles, which in large numbers may appear as a wave, humans are individuals who can act. Unlike particles, their actions may not arise from reason and may be influenced by many things, including culture, environment and more.
It is not that math does not inform; it is blindly following it off the cliff, or torturing it to make a point, that leads to trouble.
As always, the future comes in ways both predicted and unpredictable, and we must adjust. Those open-minded and flexible enough will be the first to ride the waves.
Great points. Interestingly, many of the models used derive from Einstein’s statistical analysis of particles in Brownian motion. They don’t always translate as well to real life.
Thanks Robert.
– Greg
What a great post.
It is germane to a conversation I am currently having with a correspondent about a recent article published in the MIT Technology Review:
Network Theory Reveals The Hidden Link Between Trade And Military Alliances That Leads to Conflict-Free Stability
To wit:
Here’s what Wikipedia has to say about the Santa Fe Institute in its entry on systems theory:
Not all systems theorists, however, are so sanguine that complex systems, and especially our current global capitalist system, can be or are “adaptive.”
For instance, in this interview with Benoit Mandelbrot, who was one of the pioneers of systems analysis, and who was also associated with the SFI, he asserts that:
Nicholas Taleb then elaborates on Mandelbrot’s point:
Another pioneer of systems analysis, Immanuel Wallerstein, says “our existing historical system is in the process of dying,” and that “there is a fierce struggle over what kind of new historical system will succeed it.”
This notion that our current global system is in crisis or is dying is of course a far cry from the sublime conclusions of the MIT researchers in the SFI link you sent me, who claimed that:
Jackson and Nei offer this graph to illustrate the fact that wars between states have diminished significantly in the post-WWII era:
“Graph from MIT Technology Review”
There is, however, an alternate explanation as to why there have been so few interstate wars in the post-WWII era. Here’s how Jonathan Schell puts it in The Unconquerable World:
What the MIT researchers Jackson and Nei give us is what William I. Robinson, another renowned global systems theorist, would call an “uncritical” global systems analysis. Here’s how Robinson explains the difference between their “uncritical” global systems analysis and a “critical,” or “subversive,” global systems analysis:
Insightful. Thanks Glenn.
– Greg
Just found your blog Greg – a lot of insightful stuff on here!
Coming from a scientific discipline (neuroscience), I always find myself looking for data to back up my decisions. But so often in business (as in economics), that data isn’t the result of changing a single variable and measuring its effect, and as such it is often difficult to establish any causality.
How do we combat this? Question our assumptions more or simply abandon data driven “scientific” management and all trust our guts?
I think the answer is somewhere in between. It’s important to take data seriously; the trouble comes when we try to over-engineer solutions. For example, the financial crisis largely happened not because the errors in the models were so great, but because we had far too much confidence in them and pushed them past their limits. Human judgment was largely taken out of the equation because traders didn’t understand the underlying models or their limitations.
– Greg