The 2020s Will Be An Era Of Atoms, Not Bits
When Barack Obama appointed Aneesh Chopra as the first Chief Technology Officer of the United States, he was sending a clear message: No longer a quirky trend, digital technology would become central to managing the country. He was, in effect, laying down a marker for how important information technology had become.
More recently, Joe Biden sent a similarly important message when he announced that not only would he be nominating Eric Lander, a mathematician turned geneticist, as director of the Office of Science and Technology Policy, but that he would be elevating the position to a cabinet-level appointment.
The simple truth is that the digital era is ending and innovation is shifting elsewhere. Digital technology will remain, just as heavy industry persists long after the end of the industrial era, but it will no longer be primary. Over the next decade, we’ll see a major shift from bits to atoms that, hopefully, will help us emerge from our extended productivity slump.
The Other 94%
It should be clear by now that the digital revolution has been a big disappointment. Earlier general purpose technologies, such as electricity and the internal combustion engine, drove a 50-year productivity boom between 1920 and 1970; digital technology, by contrast, has produced only about eight years of elevated productivity since then. Given the level of hype, that’s incredibly paltry.
Even today, after seemingly pervasive innovations such as social media, the mobile web and artificial intelligence, we appear to be in the midst of a second productivity paradox. To paraphrase economist Robert Solow, we can see digital technology everywhere except in the productivity statistics.
Think about what life would look like if you were immediately transported back 50 years, to 1971. In a typical household, you would see many of the same things you would today—running water, appliances in the kitchen, a TV in the living room and a car in the garage. Clearly you would miss some things, but you’d get along just fine.
Go back 50 years before that and you would struggle to survive. Backbreaking hours would need to be spent hauling water and chopping wood just to cook a simple meal or clean the house. The lack of motorized transportation and refrigeration would severely limit your diet, and if you got sick there would be no antibiotics. Back then, few people would travel farther than 10 miles from where they were born in their entire lives.
The truth is that, even at this late date, we still spend most of our money on atoms—what we live in, ride in, wear and eat—and relatively little on bits. Information and communication technologies today only make up about 6% of GDP in advanced countries. If we are ever to get the engines of prosperity roaring again, we need to focus on the other 94%.
A New Era Of Innovation
While it is true that digital technology failed to make the impact many of us expected, there is still cause for hope. Sure, the value of bits and pixels dancing around on our screens is limited, but when we start applying computing power to the atoms around us, we can unleash amazing efficiencies that have the potential to make us materially better off.
Consider the Human Genome Project, which has been estimated to have generated over $1 trillion in economic impact. Given that our DNA contains more than 3 billion base pairs, decoding the genome manually would have been impractical, to say the least, but increased computing power made the enormous task manageable.
Since then, as computing power has increased and new discoveries such as CRISPR have multiplied our capabilities, we have unlocked a revolution in synthetic biology. A similar effort to apply bits to atoms is driving a transformation in materials science, which, in some cases, is improving the efficiency of materials discovery by 200x to 300x.
That’s just the start. New computing architectures, such as quantum and neuromorphic computing, will deepen our understanding of the physical world, and the greater bandwidth delivered by 5G (and eventually 6G) will transform our ability to interact with physical objects. The potential to improve how we grow, make and power things is enormous.
A Multi-Polar Technological Universe
In Regional Advantage, UC Berkeley’s AnnaLee Saxenian explains how the center of the computer industry shifted from Boston’s Route 128 to Silicon Valley in the 1970s and 80s. Later, in The New Argonauts, she explained how immigrants who worked in Silicon Valley eventually went back home and started new hubs in places like Tel Aviv, Taipei and Bangalore.
Ironically, the new hubs that sprang up only strengthened the primacy of Silicon Valley. They thrived in connection with, not in competition with, their former Bay Area colleagues. Digital technology, when you get right down to it, is fairly straightforward. A microchip is a microchip and a line of code is a line of code anywhere in the world. So it makes sense that one central hub would come to dominate the entire industry.
Today’s nascent technologies, however, are very different. To understand why, take a look at the Manufacturing USA Institutes that the government set up with industry. There is a robotics hub in Pittsburgh, PA, a composite materials hub in Knoxville, TN, an additive manufacturing hub in Youngstown, OH, and many others strewn across the country. Each requires very different expertise and applies to different industries.
Even within specific industries, expertise can be highly specialized. Biotechnology, for example, has a number of major hubs across the country. While Boston leads in funding and lab space, it is second in patents and third in biotech employment. If there is a primary ecosystem, it is not any one city but the corridor of hubs stretching from Boston to Washington, D.C., and even, one could argue, extending all the way south to North Carolina.
The regional nature of atom-based technologies, combined with the increasing possibilities for remote work, has the potential to revive regions and improve quality of life for millions across the US and the world.
How To Win In The New Era
It’s become common to glamorize the Silicon Valley venture-funded model. Corporations start their own venture capital funds. Ambitious executives dub themselves “intrapreneurs.” When former General Electric CEO Jeffrey Immelt wanted to impress the business press, he marketed his company as a 124-year-old start-up (it didn’t end well).
Yet the truth is that while Silicon Valley’s way of doing things works perfectly well for software and consumer gadgets, it is rarely a good fit for atom-based industries. If we are to compete in this new era, we need to develop new ways of doing business and create new platforms upon which ecosystems of technology, talent and information can flourish.
What will be most crucial in this new era is building more effective collaborations that transcend traditional domains and organizational types. We need corporations working with startups, government entities and research universities. We need new collaborative structures such as the Manufacturing USA Institutes mentioned above, but also places like mHUB in Chicago and the Delaware Innovation Space. We need a new breed of investors who specialize in technical risk instead of market risk.
Most of all, we need to be clear-eyed about the fact that the future will not look like the past. We need to learn from the mistakes of the digital age, not repeat them in some misguided effort to spin our failures into perceived successes. We need to march boldly forward, wiser and better equipped precisely because of our earlier failings.
– Greg
Image: Unsplash