America’s Innovation Ecosystem Needs To Innovate Itself
The world today just seems to move faster and faster all the time. From artificial intelligence and self-driving cars to gene editing and blockchain, it seems like every time you turn around, there’s some newfangled thing that promises to transform our lives and disrupt our businesses.
Yet a paper that a team of researchers published in Harvard Business Review argues that things aren’t as they appear. They point out that total factor productivity growth has been depressed since 1970 and that recent innovations, despite all the hype surrounding them, haven’t had nearly the impact of those from earlier in the 20th century.
The truth is that the digital revolution has been a big disappointment and, more broadly, technology and globalization have failed us. However, the answer won’t be found in snazzier gadgets or in some fabulous “Golden Era” of innovation long past. Rather, we need to continually innovate how we innovate to solve problems that are relevant to our future.
The Productivity Paradox, Then And Now
In the 1970s and 80s, business investment in computer technology was increasing by more than 20% per year. Strangely, though, productivity growth slowed during the same period. Economists found this turn of events so bizarre that they called it the “productivity paradox” to underline their confusion.
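The total factor productivity the researchers refer to is conventionally estimated as the Solow residual: the growth in output left over after accounting for growth in capital and labor inputs. As a rough sketch, the standard growth-accounting identity (with α denoting capital’s share of income) is:

$$\Delta \ln A \;=\; \Delta \ln Y \;-\; \alpha\,\Delta \ln K \;-\; (1-\alpha)\,\Delta \ln L$$

where Y is output, K is capital, L is labor and A is the residual we label productivity. That framing is what makes the paradox so stark: heavy spending on computers shows up as growth in K, so unless output grows commensurately, measured productivity looks no better, or even worse.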
Yet by the late 1990s, increased computing power combined with the Internet to create a new productivity boom. Many economists hailed the digital age as a “new economy” of increasing returns, in which the old rules no longer applied and a small initial advantage, a first-mover advantage, would lead to market dominance. The mystery of the productivity paradox, it seemed, had been solved. We just needed to wait for technology to hit critical mass.
Yet by 2004, productivity growth had fallen once again, and it has not recovered since. Today, more than a decade later, we’re in the midst of a second productivity paradox, just as mysterious as the first one. New technologies like mobile computing and artificial intelligence are there for everyone to see, but they have done little, if anything, to boost productivity.
Considering the rhetoric of many techno-enthusiasts, this is fairly shocking. Compare the meager eight years of elevated productivity that digital technology produced with the 50-year boom that followed electricity and the internal combustion engine, and it’s clear that the digital economy, for all the hype, hasn’t achieved as much as many would like to think.
Are Corporations To Blame?
One explanation the researchers give for the low productivity growth is that large firms have been cutting back on investment in science. They explain that since the 1980s, a “combination of shareholder pressure, heightened competition, and public failures led firms to cut back investments in science,” and they point to the decline of Bell Labs and Xerox PARC as key examples.
Yet a broader analysis tells a different story. Yes, Bell Labs and Xerox PARC, while they still exist, are but a shadow of their former selves, but others, such as IBM Research, have expanded their efforts. Microsoft Research, established in 1991, does cutting-edge science. Google runs a highly innovative science program that partners with researchers in the academic world.
So, anecdotally speaking, the idea that corporations haven’t been investing in science seems off base. The numbers tell an even stronger story. Data from the National Science Foundation shows that corporate research has grown from roughly 40% of total investment in the 1950s and 60s to more than 60% today, while overall R&D spending has also risen over time.
Also, even where corporations have cut back, new initiatives often emerge. Consider the DuPont Experimental Station, which, in an earlier era, gave birth to innovations such as nylon, Teflon and neoprene. In recent years DuPont has cut back on its own research, but the facility, which still employs 2,000 researchers, is also home to the Delaware Innovation Space, which incubates new entrepreneurial businesses.
The Rise Of Physical Technologies
One theory about the productivity paradox is that investment in digital technology, while significant, is simply not big enough to move the needle. Even today, at the height of the digital revolution, information and communication technologies only make up about 6% of GDP in advanced economies.
The truth is that we still live in a world largely made up of atoms, not bits, and we continue to spend most of our money on what we live in, ride in, eat and wear. If we expect to improve productivity growth significantly, we will have to do it in the physical world. Fortunately, there are two technologies that have the potential to seriously move the needle.
The first is synthetic biology, driven largely by advances in gene editing such as CRISPR, which have dramatically lowered costs while improving accuracy. In fact, over the last decade efficiency in gene sequencing has far outpaced Moore’s Law. These advances have the potential to drive important productivity gains in healthcare, agriculture and, to a lesser extent, manufacturing.
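To put rough numbers on that comparison, here is a back-of-the-envelope sketch. The figures are approximate, order-of-magnitude estimates in the spirit of the NHGRI cost-per-genome series, not precise data:

```python
# Back-of-the-envelope comparison (approximate, order-of-magnitude figures,
# not precise data) of Moore's Law vs. the fall in genome sequencing costs.

years = 14  # roughly 2001 through 2015

# Moore's Law: cost per transistor halves roughly every two years.
moores_law_gain = 2 ** (years / 2)  # ~128x improvement over 14 years

# Sequencing a human genome: ~$100 million in 2001 vs. ~$1,000 by the mid-2010s.
sequencing_gain = 100_000_000 / 1_000  # ~100,000x improvement

print(f"Moore's Law over {years} years:  ~{moores_law_gain:,.0f}x")
print(f"Genome sequencing, same period: ~{sequencing_gain:,.0f}x")
```

Even if the endpoints are off by a factor of a few, a gap of several hundredfold is what “far outpaced Moore’s Law” means here.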
The second is a nascent revolution in materials science. Traditionally a slow-moving field, it has been transformed over the past decade by improved simulation techniques and machine learning, which have made materials discovery dramatically more efficient and may have a tremendous impact on manufacturing, construction and renewable energy.
Yet none of these gains are assured. To finally break free of the productivity paradox, we need to look to the future, not the past.
Collaboration Is The New Competitive Advantage
In 1900, General Electric established the first corporate research facility in Schenectady, New York. Later came similar facilities at leading firms such as Kodak, AT&T and IBM. At the time, these were some of the premier scientific institutions in the world, but they would not remain so.
In the 1920s and 30s, new academic institutions, such as the Institute for Advanced Study, along with the rising quality of American universities, became important drivers of innovation. Later, in the 1940s, 50s and 60s, federal government agencies, such as DARPA, NIH and the national labs, became hotbeds of research. More recently, the Silicon Valley model of venture-funded entrepreneurship has risen to prominence.
Each of these did not replace what came before, but added to it. As noted above, we still have excellent corporate research programs, academic labs and public scientific institutions, as well as an entrepreneurial investment ecosystem that is the envy of the world. Yet none of these will be sufficient for the challenges ahead.
The model that seems to be taking hold now is that of consortia that bring together diverse stakeholders to drive advancement in key areas, such as JCESR in energy storage, the Partnership on AI for cognitive technologies and the Manufacturing USA Institutes. Perhaps most conspicuously, the unprecedented collaboration sparked by the Covid-19 crisis has allowed us to develop therapies and vaccines faster than previously thought possible.
Most of all, we need to come to terms with the fact that the answers to the challenges of the future will not be found in the past. The truth is that we need to continually innovate how we innovate if we expect to ever return to an era of renewed productivity growth.
– Greg
Image: Unsplash
Regarding the productivity paradox, part of the problem is measurement. Because digital technologies are on an exponential improvement curve, the value delivered by, say, each new generation of mobile phone is not reflected in its price. Hence IT overall “only make up about 6% of GDP in advanced economies,” but that’s partly because the smartphone in your pocket today costs about the same as the one from 10 years ago while delivering far more value. Also, many digital products are “free,” so they aren’t counted in GDP at all.
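A toy illustration of that digital-deflation point, using invented numbers rather than actual statistics: if nominal spending on phones stays flat while each generation delivers twice the quality-adjusted service, an unadjusted price index records zero real growth, whereas a quality-adjusted (hedonic-style) index records the gain.

```python
# Toy example with made-up numbers: how quality adjustment changes measured
# real output when sticker prices stay flat but quality doubles.

nominal_spending = [500.0, 500.0]  # spending on phones in year 0 and year 1
quality_index = [1.0, 2.0]         # the year-1 phone delivers twice the services

# Unadjusted deflator: the price "didn't change", so real output looks flat.
real_unadjusted = [spend / 1.0 for spend in nominal_spending]

# Quality-adjusted deflator: price per unit of quality-adjusted service is halved.
deflator = [1.0 / q for q in quality_index]
real_adjusted = [spend / d for spend, d in zip(nominal_spending, deflator)]

print(real_unadjusted)  # [500.0, 500.0] -> zero measured growth
print(real_adjusted)    # [500.0, 1000.0] -> 100% growth once quality counts
```

Statistical agencies do make some hedonic adjustments in practice; the argument is that they plausibly understate the gains when quality improves exponentially, and that “free” goods are missed entirely.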
Researchers like Erik Brynjolfsson and Chad Syverson have been trying to measure the consumer surplus being generated by these technologies, and have suggested a new measure, GDP-B, to account for it (https://www.nber.org/papers/w25695). My colleagues and I have also been investigating the phenomenon.
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3042915
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3373830
Michael,
I take your point. However, as Robert Gordon and others have pointed out, previous technology shifts had similar measurement issues and still showed strong productivity growth.
More anecdotally: anybody today could go to a typical American home of 1970 and see much that he or she would recognize. There would be indoor plumbing, a kitchen with appliances, TV, radio, a car in the garage, etc. However, a person going from 1970 to a typical household of 1920, with no running water or electricity, would struggle to survive.
In most facets of life, we are objectively worse off than before the digital revolution. We work more, but average real wages have stagnated while the costs of education and health care have skyrocketed. We’re less healthy and more vulnerable. Anxiety and depression are at epidemic levels. Life expectancy is actually decreasing.
When Steve Jobs promised to “change the world,” was this what he really meant?
Greg
Hi Greg, thanks for engaging with me. I’m very familiar with Gordon (I interviewed him for our research, as I live nearby) and agree with his core thesis, which you nicely illustrate with your anecdote.
My point was that the measurement issues may be much more profound than they were earlier. Indeed, those prior measurement issues, such as unpaid household work and child and senior care, are still with us. On top of them, however, is a proliferation of “free” goods and digital deflation that we don’t have a good handle on. So the productivity paradox may be smaller than today’s measurements suggest. That doesn’t negate Gordon, but maybe it softens the hype critique? There may also be a delay factor, as with the first wave of IT, which took a couple of decades to show up in GDP growth.
I get you. My point isn’t so much that the digital revolution was all hype. Clearly, some of it was hype and some wasn’t. I also believe that as we learn to apply bits to atoms, for example with genomics and materials science, we may be able to unlock a real productivity boom that can rival what was unleashed in the 1920s.
However, and I think this is a crucial point, the digital revolution has been something of a disappointment, and there should be an accounting of what worked and what didn’t before we move forward and make the same mistakes all over again.
– Greg