Why The Future Is Not Digital

2018 July 22
by Greg Satell

In a famous scene in the 1967 movie The Graduate, a family friend takes aside Dustin Hoffman’s character, Benjamin Braddock, and whispers in a conspiratorial tone, “Plastics. There’s a great future in plastics.” It seems quaint today, but back then plastics really were new and exciting.

If the movie had been set in another age, the advice to young Braddock would have been different. He might have been counseled to go into railroads or electronics, or simply to “Go West, young man!” Every age has things that seem novel and wonderful at the time, but tepid and banal to future generations.

Today, digital technology is all the rage because after decades of development it has become incredibly useful. Still, if you look closely, you can already see the contours of its inevitable descent into the mundane. We need to start preparing for a new era of innovation in which different technologies, like genomics, materials science and robotics, rise to the fore.

Understanding The Technology Lifecycle

To understand what’s happening, it helps to look at earlier technologies. The rise of electricity, for example, began in the early 1830s, when Michael Faraday invented the electric dynamo and motor. Still, it wasn’t until fifty years later that Edison opened his first power plant and forty years after that, during the 1920s, that electricity began to have a measurable impact on productivity.

Every technology follows a similar path of discovery, engineering and transformation. In the case of electricity, Faraday uncovered new principles, but no one knew how to make them useful. The technology had to be understood better for people like Edison, Westinghouse and Tesla to figure out how to make things that people would be willing to buy.

However, to create a true transformation takes more than a single technology. First, people need to change their habits and then secondary innovations need to come into play. For electricity, factories needed to be redesigned and work itself had to be reimagined before it began to have a real economic impact. Then household appliances, radio communications and other things changed life as we knew it, but that took another few decades.

Today, digital technology has truly become transformative. It would be hard to explain to someone looking at an IBM mainframe back in the 1960s that someday similar machines would replace books and newspapers, give us recommendations on where to eat and directions on how to get there, and even talk to us, but today those things have become commonplace.

The Twilight Of The Digital Age

Today, there are several reasons to believe that the twilight of the digital age is upon us. First is the technology itself. What’s driven advancement has been our ability to cram more and more transistors onto a silicon wafer, a phenomenon we’ve come to know as Moore’s law. That enabled us to make technology exponentially more powerful year after year.

Yet now Moore’s law is ending and advancement isn’t so easy anymore. Companies like Microsoft and Google are now designing custom chips to run their algorithms, because they can no longer count on simply waiting for a new generation of chips. To maximize performance, you increasingly need to optimize technology for a specific task.

Second, the technical skill required to create digital technology has dramatically decreased, marked by the rising popularity of so-called no-code platforms. Much like with auto mechanics and electricians, the ability to work with digital technology is increasingly becoming a mid-level skill. With democratization comes commoditization.

Finally, digital applications are becoming fairly mature. Buy a new laptop or mobile phone today and it pretty much does the same things as one you bought five years ago. New technologies, such as smart speakers like the Amazon Echo and Google Home, add the convenience of voice interfaces, but little else.

Bits Driving Atoms

While there is limited new value to be gleaned from things like word processors and smartphone apps, there is tremendous value to be unlocked in applying digital technology to fields like genomics and materials science to power traditional industries like manufacturing, energy and medicine. Essentially, the challenge ahead is to learn how to use bits to drive atoms.

To understand how this will work, let’s look at The Cancer Genome Atlas. Introduced in 2005, its mission was simply to sequence tumor genomes and put them online. To date, it has catalogued over 10,000 genomes across more than 30 cancer types and unlocked a deluge of innovation in cancer science. It also helped inspire a similar program for materials called the Materials Genome Initiative.

These efforts are already greatly increasing our ability to innovate. Consider the effort to develop advanced battery chemistries to drive the clean energy economy, which requires the discovery of new materials that don’t yet exist. Historically, this would involve synthesizing and testing hundreds or thousands of molecules, but researchers can now use high-performance supercomputers to run simulations against materials genome data and greatly narrow down the possibilities.
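To make that screening idea concrete, here is a minimal, purely illustrative Python sketch. The candidate names, the voltage target and the estimated_voltage stub are all hypothetical stand-ins for a real simulation code; the point is simply that you predict a property cheaply in software and pass only the most promising candidates on to the lab.

    # Toy illustration of computational materials screening (not any real
    # Materials Genome Initiative pipeline): score each candidate with a
    # cheap simulated property estimate, then keep a short list worth
    # synthesizing and testing for real.

    def estimated_voltage(candidate: str) -> float:
        """Stand-in for a physics simulation that predicts a battery-relevant
        property for a candidate material. Values here are invented."""
        fake_results = {
            "cathode-candidate-A": 3.9,
            "cathode-candidate-B": 3.4,
            "cathode-candidate-C": 2.8,
            "cathode-candidate-D": 1.9,
        }
        return fake_results[candidate]

    def screen(candidates: list[str], min_voltage: float) -> list[str]:
        """Narrow many possibilities down to the ones predicted to meet the target."""
        return [c for c in candidates if estimated_voltage(c) >= min_voltage]

    if __name__ == "__main__":
        candidates = ["cathode-candidate-A", "cathode-candidate-B",
                      "cathode-candidate-C", "cathode-candidate-D"]
        shortlist = screen(candidates, min_voltage=3.0)
        print(shortlist)  # only the candidates worth taking into the lab

In a real workflow the stub would be replaced by an expensive simulation or a trained model, but the shape of the process, simulate first, experiment second, is the same.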

Over the next decade, these techniques will increasingly incorporate machine learning algorithms as well as new computing architectures, such as quantum computers and neuromorphic chips, which function very differently from conventional digital computers.

Adapting To A New Era Of Innovation

The possibilities of this new era of innovation are profoundly exciting. The digital revolution, for all of its charms, has had a fairly limited economic impact compared to earlier technologies, like electricity and the internal combustion engine. Even now, information technologies only make up about 6% of GDP in advanced economies.

Compare that to manufacturing, healthcare and energy, which make up 17%, 10% and 8% of global GDP, respectively, and you can see that there is vastly more potential to make an impact beyond the digital world. Yet to capture that value, we need to rethink innovation for the 21st century.

For digital technology, speed and agility are key competitive attributes. Techniques like rapid prototyping and iteration greatly accelerated development, and often improved quality, because we understood the underlying technologies extremely well. Yet with the nascent technologies that are emerging now, that is often not the case.

You can’t rapidly prototype a quantum computer, a cure for cancer or an undiscovered material. There are serious ethical issues surrounding technologies like genomics and artificial intelligence. We’ve spent the last few decades learning how to move fast. Over the next few decades, we’re going to have to learn how to go slow again.

So while the mantras for the digital age have been agility and disruption, for this new era of innovation, exploration and discovery will once again become prominent. It’s time to think less about hackathons and more about tackling grand challenges.

– Greg


An earlier version of this article first appeared in Harvard Business Review

Image: Pixabay
