
Are We Losing the Will to Innovate?

2012 September 23

Are we really achieving anything?  Or are we squandering the ingenuity of our predecessors on a trivial consumer culture of tweets and likes?  Where is our moon landing, our theory of relativity, our decisive breakthrough that future generations will remember?

That’s the challenge that Justin Fox raised in a provocative HBR post.  It’s a valid question, one that should be asked and answered.  However, after thinking about it for a while, I realized that much of the hand-wringing in this area has been misplaced.

Digital technology has become superficial because we’ve gotten so very good at it.  It’s a 60-year-old paradigm that achieved little in its first few decades, then sprinted forward, devouring everything in its path, and will soon come to an end.  That’s what innovation theorists call the S-Curve, and it’s how technology works.  The future is bright.

The Present Paradigm

The technology cycle that everybody talks about today really began in 1948, with two developments at Bell Labs.  The first and more famous was the invention of the transistor. The second, less well known but just as important, was Claude Shannon’s groundbreaking paper that launched information theory.

Since then, we’ve learned how to squeeze billions of transistors onto a single chip, creating devices that, although they fit comfortably in our pockets, have more computing power than the entire Apollo program.  We unconsciously invoke Shannon’s obscure paper every time we choose a data plan from our cable company or download a movie measured in megabytes.

The sublime has paved the way for the ridiculous and that’s exactly as it should be.  We’ve become so good at the present technology that we use it to enrich our everyday lives.  That might seem trivial, but it’s fun and we like it.

On the other hand, there are some truly new paradigms that seem more like science fiction than real research.  Nevertheless, they are advancing quickly and are either starting to manifest themselves in real products or probably will within the next decade.

Nanotechnology

In December 1959, a young scientist named Richard Feynman got up to address the American Physical Society.  His lecture was not the usual fare of decaying subatomic particles and obscure, Greek-letter-strewn formulas.  Nevertheless, it would prove to be one of the most consequential scientific events the world has ever known.

It was entitled There’s Plenty of Room at the Bottom, and it is an absolute delight.  In that room a half century ago, speaking at roughly a high school level, Feynman asked why we could not print the Encyclopedia Britannica on the head of a pin and introduced the world to nanotechnology.  Half a century later, it’s just getting started.

The next step is nanocomputing.  We will create devices out of microscopic components, and there will be computers as small as a grain of sand.  In the future, we will literally be able to spray on information technology (and possibly, in the case of cosmetics, rub it on).  This isn’t science fiction; as this article shows, the effort is already well underway.

But nanotech goes far beyond computing.  Today, it’s being deployed to build the next generation of solar panels.  New materials such as super-strong carbon structures called fullerenes are revolutionizing materials science and may also provide the key to unlocking superconductivity, while self-replicating nanorobots will change manufacturing forever.

Genomics

While the moon landing captured the world’s imagination, this generation’s great achievement – the mapping of the human genome, which took 13 years and $3 billion – is probably more significant.  Since then, scientists have cut the cost of sequencing a genome to less than $1,000, and the price will fall to under $100 in another decade.

That’s about the cost of a basic blood test today, so it is not surprising that genomics is becoming one of the hottest areas of investment around.  From using personal genomes to better diagnose illnesses to using gene therapies for diseases like cancer and Alzheimer’s, this new field will revolutionize medicine as we know it.

Yet, the impact will go far beyond health services.  This article about Craig Venter, one of the pioneers of the field, shows how genetic engineering can be used to solve a wide range of thorny problems, like energy.  He and his team are converting microorganisms such as algae and bacteria into organic factories that will produce 17% of our fuel by 2022.

Artificial Intelligence

In 1956, a group of luminaries including Claude Shannon and Marvin Minsky convened at a conference at Dartmouth College.   The purpose was to launch the new field of artificial intelligence.  The organizers boldly predicted that in 20 years the problem could be solved and a computer could do anything a human could do.

Alas, the prediction turned out to be wildly optimistic, and in the early 1970s DARPA pulled its funding from artificial intelligence, inaugurating a period now called the AI Winter.  The stigma lasted until Deep Blue’s defeat of reigning world chess champion Garry Kasparov in a highly publicized match in 1997.

Since then, artificial intelligence methods such as Markov chains, genetic algorithms and neural nets have become widely deployed for purposes such as facial recognition, natural language processing, logistics and, of course, video games.
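
To make one of those methods concrete, here is a minimal sketch of a first-order Markov chain over words – the same basic idea behind simple text prediction and generation. The toy corpus and function names are invented for illustration; this is a sketch, not any particular production system:

```python
import random
from collections import defaultdict

def build_chain(words):
    # Map each word to the list of words observed to follow it.
    chain = defaultdict(list)
    for current, nxt in zip(words, words[1:]):
        chain[current].append(nxt)
    return chain

def generate(chain, start, length=10):
    # Random walk: repeatedly sample the next word from the
    # successors of the current word.
    word, output = start, [start]
    for _ in range(length - 1):
        successors = chain.get(word)
        if not successors:
            break
        word = random.choice(successors)
        output.append(word)
    return " ".join(output)

corpus = "the cat sat on the mat and the dog sat on the rug".split()
print(generate(build_chain(corpus), "the"))
```

Trained on real text rather than a toy sentence, the same random walk produces surprisingly plausible output, which is why variants of it turn up throughout natural language processing.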

Probably the best indication of the impact of artificial intelligence is how many tasks we used to consider uniquely human for which we now routinely use computers.  We think nothing of having Expedia calculate a multi-flight itinerary or Amazon and Netflix recommend media choices.

In the not so distant future, self-driving cars and computerized medical diagnostics will become the norm.

Quantum Teleportation

In the late 1920s, Einstein and Bohr were engaged in a series of famous debates about the future of physics in which Einstein declared “God does not play dice with the universe.” Bohr retorted, “Einstein, stop telling God what to do.”  Einstein lost the argument and was so pissed off that he squandered the rest of his career trying to prove himself right.

At issue was the new field of quantum mechanics.  Einstein’s objection was that this probabilistic view gave rise to some seriously wacky ideas.  He proposed an experiment, dubbed the EPR paradox, which he felt would redeem him.  He pointed out that, if quantum mechanics were valid, the experiment would result in instantaneous teleportation, an apparent impossibility.

He was proved wrong.  In 1993, researchers at IBM showed that quantum teleportation was theoretically possible, and within a few years physicists had actually teleported photons a short distance.  More recently, scientists in Europe have teleported photons nearly 100 miles.

This same principle of quantum entanglement may provide the answer when the present computing paradigm reaches its physical limits, which is expected to happen around 2020.  The first commercial quantum computer was sold just last year.

The Life Cycle of a Paradigm and the End of the Computer Age

We are coming to the end of the computer age.  It’s not unusual for a family today to own a few laptops, a bunch of smartphones and maybe a tablet or two.  That’s an enormous amount of computing power, and far more than we really need.  In ten years, the power of our technology will multiply 100 times; in 15 years, 1,000 times.
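
Those multiples are simply Moore’s Law arithmetic. Assuming, as the industry roughly has for decades, that computing power doubles about every 18 months (the doubling period is an assumption; estimates run from 18 to 24 months), the compounding works out like this:

```latex
2^{10/1.5} \approx 2^{6.7} \approx 100   % roughly 100x in 10 years
2^{15/1.5} = 2^{10} = 1024 \approx 1000  % roughly 1,000x in 15 years
```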

With capabilities so cheap and plentiful, it shouldn’t be surprising that they are put to use in seemingly trivial ways.  There’s nothing wrong with that.  If people like to spend time on Facebook or toying with virtual reality on their Xbox Kinect, that’s a perfectly acceptable deployment of technology.

The next generation of technology paradigms is much earlier in its life cycle.  Just as no one knew what a transistor was until small transistor radios came out in the 1950s, many of the most exciting breakthroughs today don’t grab big headlines.  As they improve and become more productive, they will find more frivolous uses and enter the public consciousness.

As Arthur C. Clarke once said, “Any sufficiently advanced technology is indistinguishable from magic.”  After we learn how to use it, it becomes just “stuff.”

And that, after all, is why we do it in the first place.

– Greg

8 Responses
  1. Hilton – September 23, 2012

    Greg – I see innovation around me every day, but the issue seems to be more about incrementalism than a lack of will. Few investors (or inventors) have the patience for ground-breaking innovation that doesn’t turn a profit in short cycles. Edison’s famous 100 failures before inventing the light bulb wouldn’t have gotten any Series A investment from today’s angels. Likewise, BHAGs (Big Hairy Audacious Goals) of the type made famous by Jim Collins need a level of corporate or national temerity that is in woefully short supply. I might suggest the issue lies, as the HBR article alludes, in a lower appetite for long-term payouts and a smaller acceptance of failure. At a national level, read Thomas Friedman’s delightful “That Used to Be Us,” which details how the US has lost the education and science race. That’s partially why you and I have not seen a manned mission to Mars in our generation – but we do have 20 flavours of QR codes and 500+ designs of mobile phones 🙂

  2. Greg – September 23, 2012

    Hilton,

    I’m not sure that I agree. As I noted above, we have plenty of BHAGs in the US; they just don’t make news. Energy is a good example (I recently posted on this: https://digitaltonto.com/2012/energy-by-algorithm/): we are engineering synthetic life to make new fuels (and we are required by law to get 17% of our fuel that way by 2020). Gas mileage standards in cars will double by 2025, by which time we might very well have electric cars that can go 500 miles on a single charge.

    We are teleporting things, creating programmable matter, inventing computers that can think for themselves, curing cancer and making spectacular advances in a thousand different areas. Maybe the real problem is that we are making so many moonshots that it’s too overwhelming to take in?

    As much as I like and admire Tom Friedman and understand the issues with education, it is one of the super hot areas of investment now, so it’s very likely that will get sorted out as well.

    – Greg

  3. Hilton – September 23, 2012

    Greg – that’s fair, but I look at the missteps with solar and question whether innovation is hampered or accelerated by the need for a commercial imperative – or a fractious US political system unwilling to bind together to push common initiatives (that’s for another forum). Perhaps, as you mention, we’re seeing so many types of innovation in so many fields that our minds can’t comprehend the leaps forward that are being made. Potential cures for AIDS, an end to malaria, strides in neutralizing some cancers. Individually these are great medical advances that get lost in the news cycle of the iPhone 5.

  4. Greg – September 23, 2012

    I see what you’re saying Hilton, but aren’t missteps part of it all? Solar, for all the Solyndras, is advancing spectacularly fast and is on pace to be at parity with coal by 2015 and half the price by 2030.

    I guess my point is that it’s easy to look at Twitter and the iPhone and say, “isn’t all this stuff superficial and stupid,” but that’s not how technology works. The theory of relativity, the transistor, information theory, the microchip and just about everything that makes modern life possible went completely unnoticed when they first appeared.

    – Greg

  5. Martin – September 23, 2012

    Could be that innovation itself is changing and that it’s happening all around us; we are looking but not seeing it because we are looking in the wrong places.

    I think the long tail is starting to rise – we are starting to see less big innovation but lots and lots of small innovation – normal people innovating and combining through mediated communications.

    So while major corporations may lose the will to innovate, the baton has been taken up by the 99%.

    The next century may be very different – we may return to a “cottage industry” of innovation.

    While major scientific and technology breakthroughs may come from large corporations, innovation around the discoveries may come from citizen innovators.

    There are so many amazing technologies on the cusp at the moment that, when/if they go mainstream in the next 20 years, will really have a remarkable catalytic effect, e.g. personal fabs (3D printing); stretchable, bendable, malleable electronics; augmented and wearable IT; HCI developments; and maybe even nanotechnology.

    Innovation used to be easy to spot – it used to be industrial and on an industrial scale. Information-age innovation is more discreet but far more powerful – it is innovation based around information – even ourselves, our minds and how we think.

    The 21st century is just a century – more significantly, it’s the first in a new millennium.

  6. Greg – September 23, 2012

    Martin,

    I do think there’s some truth to that, especially in the US, and I have written about it before.

    When you look at the achievements in the 20th century, almost all of the serious innovation that happened was either directly funded by the government (the Internet, GPS) or by government-supported monopolies (i.e. Bell Labs). The military-industrial complex financed an awful lot of research.

    That does seem to have changed, at least to some degree. The Higgs boson was discovered in Europe because the US government defunded the supercollider in Texas. Much of the serious basic research is now privately funded by corporations like Intel, Microsoft and Google (strangely enough, Apple does not fund a lab).

    Another curious thing is that although the Obama administration has brought back serious research (e.g. ARPA-E) and made amazing progress in a very short amount of time, they have not touted it. I do think that a generation ago things like ARPA-E would have been politically popular; these days… not so much.

    I also think you made a good point when you pointed to the visibility of an achievement. We notice when there is a guy on the moon waving at us. Moving around subatomic particles and mapping the genome aren’t such great spectator sports.

    Thanks for a great comment!

    – Greg

  7. Paul – September 29, 2012

    No comment, Greg, apart from saying I really liked this, climbed into it and enjoyed where you went – terrific – thanks.

  8. Greg – September 29, 2012

    Thanks Paul. Much appreciated!

    – Greg
