Innovation Is Combination
Much has been made about the difference between innovation and invention. One writer went so far as to argue that Steve Jobs's development of the iPod wasn't an innovation because it depended on so much that came before it. A real innovation, so the argument goes, must be truly transformational, like the IBM PC, which created an entire industry.
The problem with these kinds of word games is that they lead us into an infinite regress. The IBM PC can be seen as the logical extension of the microchip, which was the logical extension of the transistor. These, in turn, arose in part from earlier developments, such as Turing's universal computer and the completely irrational science of quantum mechanics.
The truth is that innovation is never a single event, but happens when fundamental concepts combine with important problems to create an impact. Traditionally, that’s been done within a particular organization or field, but to come up with breakthrough ideas in the 21st century, we increasingly need to transcend conventional boundaries of company and industry.
Transforming Alchemy Into Chemistry
Everybody knows the story of Benjamin Franklin and his famous kite, but few have ever heard of John Dalton and his law of multiple proportions. What Dalton noticed was that when two elements combine to form more than one compound, they always do so in simple whole-number ratios by weight. That may seem like a dry technicality, but it did more for electricity than Franklin ever did.
The reason that Dalton’s obscure law became so important is that it led him to invent the modern concept of atoms and, in doing so, transformed the strange art of alchemy into the hard science of chemistry. Once matter could be reduced down to a single, fundamental concept, it could be combined to make new and wondrous things.
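To make that whole-number pattern concrete, here is a minimal sketch in Python using the classic carbon and oxygen example from chemistry textbooks (an illustration of the law, not a calculation from this article):

```python
# Law of multiple proportions: when two elements form more than one
# compound, the masses of one element that combine with a fixed mass
# of the other stand in small whole-number ratios.
# Standard textbook values for carbon monoxide (CO) and carbon
# dioxide (CO2), holding 12 g of carbon fixed:
carbon = 12.0           # grams of carbon, held fixed
oxygen_in_co = 16.0     # grams of oxygen that combine with it in CO
oxygen_in_co2 = 32.0    # grams of oxygen that combine with it in CO2

print(oxygen_in_co2 / oxygen_in_co)   # 2.0 -- a simple 2:1 ratio
```

That tidy 2:1 ratio is exactly the kind of regularity that pointed Dalton toward discrete atoms rather than a continuous blend of matter.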
Dmitri Mendeleev transformed Dalton’s insight into the periodic table, transforming the lives of high school students and major chemical corporations alike. Michael Faraday’s chemical experiments led to his development of the dynamo and the electric motor, which in turn led to Edison’s electric light, modern home appliances and even IBM’s PC and Apple’s iPod.
Which of these are inventions and which are innovations? It's impossible to tell and silly to argue about. What's clear is that none are the product of a single idea; all are combinations of ideas built on the foundation that Dalton created.
Merging Man and Machine
In the early 1960s, IBM made what was perhaps the biggest gamble in corporate history. Although it was already the clear leader in the computer industry, it invested $5 billion (in 1960 dollars, worth more than $30 billion today) in a new line of computers, the 360 series, which would make all of its existing products obsolete.
The rest, as they say, is history. The 360 series was more than just a product; it was a whole new way of thinking about computers. Before, computers were highly specialized machines designed to do specific jobs. IBM's new product line, however, offered a wide range of capabilities, allowing customers to add to their initial purchase as their business grew. It would dominate the industry for decades.
When Fred Brooks, who led the project, looked back a half century later, he said that the most important decision he made was to switch from a 6-bit byte to an 8-bit byte, which enabled the use of lowercase letters. Considering the size of the investment and the business it created, that may seem like a minor detail.
But consider this: That single decision merged the language of machines with the language of humans into one fundamental unit. In effect, the 8-bit byte transformed computers from obscure calculating machines into collaboration tools.
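Some quick arithmetic shows why those two extra bits mattered. The sketch below is a minimal illustration in Python (using ASCII codes for familiarity; the 360 itself used EBCDIC), not anything from the 360's actual design documents:

```python
# A 6-bit byte yields 2**6 = 64 distinct codes; an 8-bit byte yields
# 2**8 = 256.
print(2 ** 6, 2 ** 8)        # 64 256

# Uppercase, lowercase and digits alone need 26 + 26 + 10 = 62 codes,
# leaving a 6-bit byte almost no room for punctuation or control codes:
print(26 + 26 + 10)          # 62

# With 8 bits there is room for both cases, so 'A' and 'a' can get
# distinct codes (shown here with their ASCII values):
for ch in "Aa":
    print(ch, ord(ch), format(ord(ch), "08b"))
```

With only 64 codes, early character sets simply dropped lowercase; quadrupling the code space is what let machines carry ordinary human text.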
Learning The Language of Life
Much as Dalton came up with the fundamental unit of chemistry, Wilhelm Johannsen developed the fundamental unit of biology a century later, in 1909: the gene. This too was a combination, and a refinement, of earlier ideas from men like Charles Darwin and Gregor Mendel.
However, for scientists of the early twentieth century, a gene was little more than a concept. No one knew what a gene was made of or even where it could be found. It remained an abstract idea until Watson and Crick discovered the structure and function of DNA. Even then, there was little we could do with genes except know that they were there.
That changed when the Human Genome Project was completed in 2003 and unleashed the new field of genomics. Today, genetic treatments for cancer have become common and, with prices for genetic sequencing falling faster than those for computer chips, we can expect gene therapies to be applied to a much wider array of ailments over the next decade.
Yet these new developments are not the product of biologists alone. The challenges of gene mapping required massive computing power, so gene researchers needed to collaborate closely with computer scientists to put supercomputers to work on the problem.
A New Era of Innovation
The confusion about innovation and invention reflects a fundamental misunderstanding about how innovation really works. The idea that certain ideas are flashes of divine inspiration while others are merely riffs off of earlier tunes sung long ago fails to recognize that all innovations are combinations.
Over the last century, most inventions have been combinations of fundamental units. Many important products, from household goods to miracle cures, have been developed through combining atoms in new and important ways. Learning how to combine bytes of information gave rise to the computer industry and we’re now learning how to combine genes.
The 21st century, however, will give rise to a new era of innovation in which we combine not just fundamental elements, but entire fields of endeavor. As Dr. Angel Diaz, IBM’s VP of Cloud Technology & Architecture told me, “We need computer scientists working with cancer scientists, with climate scientists and with experts in many other fields to tackle grand challenges and make large impacts on the world.”
Today, it takes more than just a big idea to innovate. Increasingly, collaboration is becoming a key competitive advantage because you need to combine ideas from widely disparate fields. So if you want to innovate, don’t sit around waiting for a great eureka moment — look for what you can combine to create something truly new and powerful.
– Greg
An earlier version of this article first appeared on Inc.com
Thanks Greg. I like to think that innovation makes a difference, while invention may or may not. I concur with you that every invention has built upon what was known before. I was asked about this when speaking with a company about product development. When asked about managing or making something completely new, I responded that I had not, and really did not think anyone had. All things come from observation of what is, and then of how it can be altered. They of course did not like that answer and went another way, and are now out of business as an ongoing entity. Managing innovation, on the other hand, can be done, since it involves observation and collaboration and, as you state, a network effect. We can make many new things and new ways of doing things, but all built on nature, physics and what has come before.
Nice article.
IBM is a company with an amazing history of innovation and invention.
I’ve been in IT since before the Tandon hard drive. One traumatic moment was the change from desktop to Web. Even with that background I did not recognize the revolution represented by the cloud. I started taking an online class in Amazon Web Services by Cloud Guru. It was an eye opener. Even with my past experience, I had no idea how much the cloud changed things. At the risk of advertising… I’ve seen the Cloud Guru Architect class for as low as $15. The guy is real easy to listen to. I would recommend paying the $15 just to see the first 2 videos in the series. They were an eye opener. The change the cloud represents is far greater than I ever guessed. … and I’m at 30% on the cert course 🙂
I think the problem with splitting hairs like that is that the definition is always in the eye of the beholder. Did Einstein's theory of relativity make a difference? Well, yes, but not until decades later. So does that mean that his ideas were not innovative in 1920, but they were in 1950? Or take Engelbart's "Mother of All Demos," which led to so many of the innovations that are central to computing today. Were Engelbart's ideas not innovative until the Macintosh came out in 1984?
– Greg
That’s very true. The cloud is extremely transformative.
– Greg
Don’t you think the new era of 21st Century innovation is well underway?
I'm more of an online immigrant than a digital native, but the speed of progress over these last few years adds up to way more than the sum of its parts.
As you say, cancer researchers should be (and are) working with analysts and together their progress (I guess) is advancing at a faster rate than ever before.
But I also suggest this is a generational thing. So while researchers into common conditions (such as cancer) have traditionally been content to move forward at a painfully slow rate, newer and younger professionals remain unburdened by the restrictions that existed, say, 20 years ago.
Hopefully, as a race we’re on the verge of many great advances…
I'm not sure it's a generational thing. Many of the most innovative people I know are not Millennials by a long shot. To take one example, James Allison, who developed cancer immunotherapy, a revolutionary new treatment, is 68.