
These 4 Major Paradigm Shifts Will Transform The Future Of Technology

2016 May 22
by Greg Satell

For the past fifty years or so, technology has followed a fairly predictable path. We squeeze more transistors onto silicon wafers, which makes chips more powerful and devices smaller. Manual processes become automated, productivity increases and life gets better. Rinse and repeat.

Today, we’re at an inflection point, and that predictable path to progress will soon be closed off. What lies ahead is a period of extreme disruption in which most of what we’ve come to expect from technology is coming undone. What replaces it will be truly new and different.

Over the next decade, Moore’s Law will end. Instead of replacing manual labor, technology will automate routine cognitive work. As information technology fades into the background, second-order technologies, such as genomics, nanotechnology and robotics, will take center stage. Here are the four major paradigm shifts that we need to watch and prepare for.

From The Chip to The System

In 1965, Intel cofounder Gordon Moore published a remarkably prescient paper observing that the number of components on an integrated circuit had been doubling every year, a pace he later revised to a doubling every two years. He also predicted that this pace would lead to computers becoming embedded in homes, cars and communication systems.

That simple idea, known today as Moore’s Law, has helped power the digital revolution. As computing has become exponentially cheaper and more powerful, we have been able to do far more with it. Even a basic smartphone today is more powerful than the supercomputers of past generations.
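
To make the arithmetic concrete, here is a minimal sketch, written for this piece rather than taken from Moore’s paper, that compounds the conventional two-year doubling from an assumed 1971 starting point of roughly 2,300 transistors (an early microprocessor). The starting figures and years are illustrative assumptions, not a model of any particular product line.

```python
# Illustrative sketch of Moore's Law as exponential doubling.
# The 1971 starting point (~2,300 transistors) and the conventional
# two-year doubling period are assumptions used only to show the
# shape of the curve.

START_YEAR = 1971
START_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2

def projected_transistors(year: int) -> float:
    """Transistor count if the doubling had continued unchanged."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return START_TRANSISTORS * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Run for a few decades, the doubling quickly reaches into the billions, which is why small improvements in process technology compounded into such a dramatic change in what devices can do.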

Yet Moore’s Law is now nearing its end. The problem is twofold. First, there are only so many transistors you can squeeze onto a chip before quantum effects cause them to malfunction. Second is the problem known as the von Neumann bottleneck. Simply put, it doesn’t matter how fast a processor can compute if it spends most of its time waiting for data to travel back and forth to memory.
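
A rough back-of-envelope sketch shows why that waiting dominates. The peak compute and memory-bandwidth figures below are assumed round numbers, not measurements of any real chip; the point is only that when every operation needs a fresh operand from memory, the memory bus, not the processor, sets the speed limit.

```python
# Back-of-envelope illustration of the von Neumann bottleneck.
# The throughput figures are assumed round numbers, not real hardware specs.

PEAK_OPS_PER_SEC = 1e12     # assumed: 1 trillion arithmetic operations per second
MEM_BANDWIDTH_BPS = 50e9    # assumed: 50 GB/s between memory and the processor
BYTES_PER_OPERAND = 8       # one double-precision value

# How many operands the memory system can deliver each second.
operands_per_sec = MEM_BANDWIDTH_BPS / BYTES_PER_OPERAND

# If each operation needs one fresh operand from memory, this is the
# fraction of the processor's peak that memory traffic allows.
utilization = operands_per_sec / PEAK_OPS_PER_SEC
print(f"Achievable utilization when memory-bound: {utilization:.1%}")
# Prints roughly 0.6% -- the processor mostly waits.
```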

So we have to shift our approach from the chip to the system. One approach, called 3D stacking, stacks integrated circuits vertically to form a single three-dimensional chip. This is harder than it sounds, because entirely new chip designs have to be devised, but it could shorten the distances data has to travel, increase speeds significantly and allow progress to continue.

From Applications To Architectures

Since the 1960s, when Moore wrote his article, the ever-expanding power of computers has made new applications possible. For example, after the relational database model was proposed in 1970, it became possible to store and retrieve massive amounts of information quickly and easily. That, in turn, dramatically changed how organizations could be managed.
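
As a small illustration of that store-and-retrieve capability (my own example, not code from the article), the snippet below uses Python’s built-in sqlite3 module; the table and rows are invented purely for demonstration.

```python
# Minimal illustration of relational storage and retrieval, using
# Python's built-in sqlite3 module. The table and rows are invented
# purely for this example.
import sqlite3

conn = sqlite3.connect(":memory:")  # throwaway in-memory database
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [("Acme Corp", 1200.00), ("Globex", 450.50), ("Acme Corp", 99.95)],
)

# Declarative query: describe the result you want, not how to find it.
for customer, spent in conn.execute(
    "SELECT customer, SUM(total) FROM orders GROUP BY customer ORDER BY customer"
):
    print(customer, spent)

conn.close()
```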

Later innovations, like graphical displays, word processors and spreadsheets, set the stage for personal computers to be widely deployed. The Internet led to email, e-commerce and, eventually, mobile computing. In essence, the modern world is built on the applications that make it possible.

Until now, all of these applications have run on von Neumann machines: devices in which a central processing unit fetches data and instructions from a separate memory. So far, that’s worked well enough, but for the things we’ve begun asking computers to do, like power self-driving cars, the von Neumann bottleneck is proving to be a major constraint.

So the emphasis is moving from developing new applications to developing new architectures that can handle them better. Neuromorphic chips, modeled on the architecture of the brain, promise to be orders of magnitude more energy efficient than conventional chips. Quantum computers, which IBM has recently made available in the cloud, are far better suited to certain classes of problems, such as those in cryptography and security. Field-programmable gate array (FPGA) chips can be reconfigured to suit still other workloads.

Soon, when we choose to use a specific application, our devices will automatically be switched to the architecture—often, but not always, made available through the cloud—that can run it best.

From Products To Platforms

It used to be that firms looked to launch hit products. If you look at the great companies of the last century, they often rode to prominence on the back of a single great product, like IBM’s System/360, the Apple II or Sony’s Walkman. Those first successes then led to follow-ups, like the PC and the Macintosh, and to further dominance.

Yet look at successful companies today and they make their money off of platforms. Amazon earns the bulk of its profits from third-party sellers, Amazon Prime and cloud computing, all of which are platforms. And what would Apple’s iPhone be without the App Store, the source of so much of its functionality?

Platforms are important because they allow us to access ecosystems. Amazon’s platform connects ecosystems of retailers to ecosystems of consumers. The App Store connects ecosystems of developers to ecosystems of end users. IBM has learned to embrace open technology platforms, because they give it access to capabilities far beyond those of its own engineers.

The rise of platforms makes it imperative that managers learn to think differently about their businesses. While in the 20th century, firms could achieve competitive advantage by optimizing their value chains, the future belongs to those who can widen and deepen connections.

From Bits To Atoms

In The Rise and Fall of American Growth, economist Robert Gordon argues that the rapid productivity growth the US experienced from 1920 to 1970 is largely a thing of the past. While there may be short spurts of growth, like the one in the late 1990s, we’re not likely to see a sustained period of progress anytime soon.

Among the reasons he gives is that, while earlier innovations such as electricity and the internal combustion engine had broad implications, the impact of digital technology has been fairly narrow. The evidence bears this out. We see, to paraphrase Robert Solow, digital technology just about everywhere except in the productivity statistics.

Still, there are indications that the future will look very different from the past. Digital technology is beginning to power new areas in the physical world, such as genomics, nanotechnology and robotics, that are already having a profound impact on such high-potential fields as renewable energy, medical research and logistics.

It is all too easy to get caught up in old paradigms. When progress is powered by chip performance and the increased capabilities of computer software, we tend to judge the future by those same standards. What we often miss is that paradigms shift and the challenges—and opportunities—of the future are likely to be vastly different.

In an age of disruption, the only viable strategy is to adapt.

– Greg

9 Responses
  1. Yaqoob tahir izhar
    May 24, 2016

    A very good and easy to understand article.

  2. Moti
    June 2, 2016

    At the least, the claim “From Applications To Architectures” is far-fetched. We are not able to solve even the simple security application of “password protection”, a manually and carefully handcrafted challenge, and now this will be “solved” automatically? This challenge is far more demanding than a simple AI task, but it is not bad dreaming. I hope the government will not waste funds (DHS?) on this one.

  3. June 3, 2016

    Thank you for sharing your thoughts.

    – Greg

  4. FOREST
    June 4, 2016

    Liked your article on the future. In my opinion, the present culture of technology in the USA, with its openness to new thinking and the contributions of the young and migratory, will be able to outreach, for now, the non-adaptive cultures of the old worlds in Europe and Asia. Our education system is still the best and most flexible in the world. We allow multiple chances to succeed and to fail. China sends hundreds of thousands of students to our schools.
    But did you know that “Tonto” in Spanish means “stupid”? Digital Stupid?
    Another NO VA?
    Forest in Costa Rica

  5. June 4, 2016

    Yes, I did realize that (although I think the more correct translation would be “Digital Fool”). I actually kind of like the idea. Apparently others do too. There is a marketing agency in Mexico with the same name.

    – Greg

  6. tommi chen
    June 21, 2016

    Productivity improvements will arrive. We are still at an early stage in the digital economy. When conventional firms adopt crowdsourcing following the digital startups (most employ crowdsourcing in one form or another, some totally, like Facebook), the ‘work’ done by the public will show up eventually in the computation. If anyone’s interested, here’s how firms could adopt crowdsourcing http://bit.ly/28Lr03d.

  7. June 21, 2016

    Hi Greg,

    Chip to The System, and Applications To Architectures – the common thread running through this is a shift “up the stack” … it’s common … as things mature they build on the legacy of what has gone before … computers shrink from room-sized things to things that you wear. Legacy becomes common and cheaper, so the value moves up the stack.

    Bits To Atoms … this, like Tim O’Reilly’s Web Squared concept, is the most intriguing to me … IT affects the world and Moore’s Law will impact our lives – everything goes exponential … who knows where this will lead.

    Martin (@timekord)

  8. June 23, 2016

    Good points. Thanks Tim!

  9. Moti
    June 23, 2016

    The trouble is that the legacy von Neumann architecture has a design fault: the architecture was never intended to fully support a personal communication device, just DATA PROCESSING. Building on this architecture for the future will lead to a cyber-security stranglehold on the Future Digital Everything.
    Solving the cyber-security challenge should come before the wide adoption of IoT and IoE.
