
Where Did The iPhone Really Come From?

2013 January 20

It’s hard to imagine a more revolutionary product than the iPhone.  Launched just six years ago, in January 2007, it has significantly changed the way we live.  Go to any city street almost anywhere in the world and you’ll see people glued to their smartphones.

Beyond the sheer human enjoyment, the numbers are pretty startling as well.  By itself, the iPhone would be the most profitable business in the world and has the power to lift the entire US economy, to say nothing of the vast ecosystem that has built up around it.

Amazingly, Steve Jobs was, at best, a mediocre engineer and Apple continues to have one of the stingiest R&D programs in the business.  So where did the technology come from? How can we create more businesses like Apple?  Or, for that matter, more Silicon Valleys and Research Triangles?  The answers are as clear as they are surprising.

What Makes an iPhone?

When the iPhone first appeared, it was primarily an achievement in user experience, not technology.  There were already smartphones on the market and the basic technology had, in fact, been around for decades, including:

The Architecture:  At its heart, a smartphone is much like any other computer.  It has a processor for making calculations and a separate memory unit for holding data and storing programs.

This is known as the Von Neumann architecture, and it wasn’t originally intended for phones but for weapons.  The design was developed by way of a US government grant to John von Neumann in order to facilitate the large and complex computations needed to develop the hydrogen bomb.

The Chip: While the transistor was developed at Bell Labs in 1947, discrete transistors alone couldn’t deliver the processing power needed to make modern computers possible.  It wasn’t until a decade later, when the integrated circuit was developed by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor, that the computer age really got underway.

Unfortunately, the inventors initially couldn’t find buyers for commercial applications.  However, the integrated circuit’s ability to reduce weight made it perfect for the Minuteman missile program, which provided an early market and jumpstarted the technology.

The Internet & The Web:  One of the smartphone’s most useful features is the ability to connect us to both the Internet and the Web.  Both of these have unlikely origins as well.

The Internet is essentially hardware – a patchwork of fiber, frequencies and protocols that links together the world’s computers.  It began with the US military’s Advanced Research Projects Agency (ARPA) in the 1960s.  The resulting ARPANET was eventually opened to public and commercial traffic and evolved into the Internet.

The Web, on the other hand, provides a software layer.  It was developed by Tim Berners-Lee as an information management system for CERN, a physics lab funded by a consortium of European governments.  The Web is now governed by the World Wide Web Consortium, which Berners-Lee continues to lead.  Most major technology companies, including Apple, are members.

Geolocation:  Finally, a smartphone’s ability to know where we are in relation to the objects around us comes from the GPS satellite system that the U.S. military developed.  Simple navigation, however, could not justify the vast expense.  Rather, the funds for GPS were allocated because of the system’s ability to target missiles accurately.

By this point, a disturbing theme should be clear.  The miracles of modern technology often do not originate in the private sector (in fact, breakthroughs are often rejected by incumbent firms) but come instead from government programs.  Further, they come almost exclusively from military programs.

A Simple Question

Why is it that we only seem to be able to do useful things with our collective efforts when we are trying to blow things up and kill people?  In the 20th century, the US led the world in basic research and the vast majority of important discoveries came from our shores. That’s beginning to change.

For example, while Congress decided not to allocate a few billion dollars to build a supercollider in Texas, the Europeans did, and the center of physics, as well as the discovery of the Higgs boson, shifted across the Atlantic.  China, too, has shown an increasing commitment to basic science.

If we are to remain competitive in the 21st century, we will have to renew our passion for fundamental discovery without the benefit of the almost limitless funds that the Cold War engendered.  The data is clear: we are losing our lead and, unless we renew our commitment to basic research, the consequences will be dire.

If present trends persist, it is doubtful that we will be able to maintain our lead in cutting-edge technologies.  After all, our prowess is not a birthright, but something we had to earn.

Technology’s Last Mile

In truth, great entrepreneurs like Steve Jobs, Mark Zuckerberg and others mostly function as technology’s last mile.  They determine the final application and the user experience, but very rarely are responsible for the breakthroughs that make them possible.

I don’t mean to shortchange entrepreneurs or denigrate their accomplishments.  On the contrary, they perform an invaluable service by bringing technology to us in a form in which we can use it.

Moreover, they do so at great risk.  It’s one thing to diligently work on a government-funded program, quite another to build products for fickle consumers and uncertain markets.

Clearly, we need both:  Market driven entrepreneurs and a strong technological infrastructure that provides a substrate from which they can identify and cultivate opportunities.  However, that’s easier said than done.  Basic research and entrepreneurial endeavors require quite different skills, mindsets and incentives.

The Way Forward

Much like with physical infrastructure, the challenge of building out our technological infrastructure is primarily one of commitment rather than a lack of viable approaches.  In fact, we have a variety of options that have proved both effective and cost efficient.

Tax Incentives: One very simple way to encourage basic research is to provide tax relief for investment in it.  An R&D tax credit was originally implemented in the US in 1981, but it has been renewed only sporadically, most recently expiring at the end of 2011.  It should be made permanent.

Government Programs: Besides military programs, other notable efforts include the National Institutes of Health, which has made significant contributions to healthcare; the Human Genome Project, which has unleashed a flurry of innovation in genomics; and, most recently, ARPA-E, an initiative that provides seed funding for emerging energy technologies.  However, we can and should do more.

Innovation Prizes:  Another time-tested solution is awarding large monetary prizes for innovation, such as the X Prize and the US government’s Race to the Top initiative.  These tend to encourage investment many times greater than the value of the prize itself and have proved effective at jumpstarting emerging technologies.

A billion-dollar prize for something like curing cancer might seem excessive, but it would actually be incredibly cheap and would improve the diffusion of important discoveries.

Peer Networks: In his recent book, Future Perfect, author Steven Johnson puts forth another alternative.  Commons-based, peer-network-driven methods, such as the open source movement, have proven effective at creating viable technologies like Apache and Linux that run much of our Internet infrastructure.

New programs like the National Robotics Initiative and the Advanced Manufacturing Initiative help spur similar efforts by bringing together public and private institutions, as well as by funding pilot programs.

The 21st century will be fundamentally different than the 20th, when we were locked in a two-way race for dominance with an adversary that, in retrospect, never really had a shot. For today’s multipolar world, we need a far more comprehensive approach encompassing public, private and peer-networked efforts.

– Greg

Update:  Just in case you might think that government-funded innovation is a thing of the past, I was recently reminded that Siri also received seed funding from a DARPA grant.

9 Responses
  1. Carlos permalink
    January 21, 2013

    You are right on the money on this, with great insight. Today, a key area of development worldwide is nonvolatile memory beyond FLASH. Because Samsung had the initial patents over 8 years ago, the US simply decided that the game was lost, and it is almost impossible to get funding from government agencies. The last 10 years have been an “off-the-shelf technologies” application route. Also, fundamental research can be directed toward a final transformational goal, like nonvolatile memory. What “The Innovator’s Dilemma” gave as an example, that FLASH would end the dominance of the hard disk, cannot happen completely because FLASH is self-destructive with use, still expensive and very power hungry. The new memory efforts, however, are at hand. In Japan, e-cash was implemented with the SUICA card using a technology invented in the US. I started it 26 years ago – no such thing happened in the US. Literally 300 million transactions per day use these Japanese cards with FeRAM technology, which would be perfect for things like a health card. Not a single incident of hacking has been reported.
    So, I have lived your article under my skin. The stories I could tell would make a book. Some “innovators” only re-arrange the technology offerings into applications and do nothing to make the fundamental technology discoveries that made the US a technology leader. It is a fix that over time gets overcome – the iPhone is an example, and Samsung is right behind…

  2. January 21, 2013

    Thanks for your input Carlos.

    – Greg

  3. January 22, 2013

    Greg,

    Very good article. It is really very important to make some changes in the prevalent economic consciousness. It seems to me that today’s economic mainstream is widely known – to open up and develop new social entities: the information space, the new economy and the information society. It is similar to the period when different countries colonized the New World after Columbus, and when human society went through the Renaissance.

    Development of technologies within the conventional economy will increase unemployment and will result in the loss of leadership.

    So, the iPhone and Facebook are more social projects than technological ones.

    Old-fashioned social machines – the firm, the corporation, the government, the market – have lost their monopoly on knowledge and information and should be supplemented with new entities – e-companies, e-communities, e-markets and e-ecosystems. These new entities are based on creativity, informational (electronic) sovereignty, new management and new marketing (relationships), and can turn new disruptive challenges into social and economic development.

    Sergei

  4. January 22, 2013

    Interesting perspective Sergei. Thanks for sharing.

    – Greg

  5. Patty Coromina permalink
    January 28, 2013

    Hi Greg!

    Congratulations on your article. I really enjoyed reading it.
    From my point of view, Steve Jobs was able to create not only one of the most innovative products, the iPhone, but also an international desire and demand for it, which is not an easy task considering the price. Ultimately, what I mean is that people want an iPhone so much that they are willing to pay its $700 price.

    No one can dispute the fact that the iPhone has provoked a whole revolution in the smartphone market, and since its success, other brands have tried to compete by creating similar phones. The problem is that they do not have the marketing or status that Apple has in order to achieve what Steve Jobs did with his company.

    Notwithstanding, even though we are currently aware of the new iPhone 5, the most probable thing is that Apple has already designed a far more innovative iPhone which it is reserving for the future. By doing this, Apple is ensuring a higher, possibly the highest, market share over the long term.

    As far as I am concerned, Apple has reached a very high market share, which is difficult to achieve. However, Google is also pushing hard with its new smartphones, which are in increasing demand. So, I would like to know your opinion about how iPhone sales and launches will proceed in the next few years.

    On the other hand, knowing that the iPhone 5 had lower sales than expected, and that Google is gaining more power in the market, do you think Google can overtake Apple and feasibly take all of its market share?

  6. January 29, 2013

    It’s tough to say, because competition is ramping up and a lot of the new phones are really, really good. Also, iPhones lack some features that other manufacturers already have (notably, NFC). So they’re getting squeezed on both price and features. There’s talk that they might release a cheaper phone for emerging markets, but other than that, I have no special insight.

    – Greg

  7. February 11, 2013

    Greg,
    So according to these examples, Obama was right when he said ‘you didn’t build that’? In terms of basic research and new technology, the answer would be yes. Research universities are the cradle of so many breakthrough companies because the seed funding needed to develop and prove a new technology is government funded. I’m sure many investors appreciated piggybacking into an A round when the government had paid for the basic research needed to create an entirely new industry, or re-create one. Yes, coming out of the Cold War, the stage was set for funding a lot in the name of the common defense. There had to be some general rationalization for expenditures that had low hit rates.

    I think of Innovation (capital I) as innovation (re-making something differently), improvisation (putting together disparate elements for a new purpose), discovery (for example, finding an unclassified plant from the Amazon and discovering its healing properties) and invention (harder to do now without basic science). In every case, there will be elements in the life cycle of the new offering that are based in some measure on something that came before it. So, at this point, we all stand on the shoulders of past entrepreneurs, inventors, innovators and discoverers. Even if the entrepreneur “merely” innovates or improvises, that offering might make the inventor’s life easier. Just time saving or stress reduction may allow them to better focus on invention, and that extra focus makes a difference. So, it all has value. An app like Evernote may capture a fleeting idea that results in a breakthrough, but Evernote itself might not be thought of in the same category as nanotechnology, for example. In fact, even the failures help in that sense (the Apple Newton). Since every glass of water is made up of millions of droplets, with the JOBS Act pending, I would add crowdfunding to your list in “The Way Forward”.

    I’d also cross-comment on what might come after the Hacker Economy in your other post. It seems that the availability of knowledge and know-how that came before us is no longer the gating factor to invention. It’s all documented somewhere on the web. The gating factor is having timely awareness of the right knowledge or know-how to better focus on high value activities. So, I am sensing a growing cadre of “Enablers” for lack of a better term, that facilitate the location of the information needed to innovate. Put another way, who teaches the hackers the latest proven methods to hack? Is this part of the hacker economy, or a further evolution of it? Will this become a new service offering unto itself?

  8. February 12, 2013

    Thanks for sharing your thoughts, Ed.

    – Greg

