We Need To Stop Trying To Predict The Future And Start Exploring It
In a celebrated 1995 article in Newsweek, astronomer Clifford Stoll blasted Internet visionaries. “Do our computer pundits lack all common sense?” he asked. “The truth is no online database will replace your daily newspaper, no CD-ROM can take the place of a competent teacher and no computer network will change the way government works.”
Clearly, he was mistaken, but Stoll was no Luddite. In fact, he was an early adopter of the Internet who had spent years online. His article merely reflected what he saw: a complex technology meant for highly technical people like him, not an easy-to-use service for ordinary consumers.
Stoll’s mistake was to confuse his knowledge of the current state of technology with the power to know where it was going. As an astronomer, he was in a very poor position to predict how the Internet would be applied to other fields. The truth is that even really smart people get the future wrong, which is why it’s more important to explore than to predict.
Missing The Macintosh
In the late 60s, Xerox CEO Peter McColough saw that his company needed to go in a new direction. Although it had made a fortune in copiers, it became clear to him that the future lay in what he called “the architecture of information,” so he set up the Palo Alto Research Center (PARC) to become a leader in the computer technology still nascent at the time.
It proved to be a prescient idea. Over the next decade, many of the core technologies of modern computing, including the mouse, GUI, Ethernet and laser printer, were invented at PARC. It also probably saved the company. As Xerox’s core copier business ran into tough competition from Japanese firms, the laser printer became a crucial revenue source.
Yet today, PARC is best known for missing the Macintosh. By focusing on its vision of creating the “office of the future,” it failed to see that teenagers and hobbyists would want to buy cheap personal computers that were fun to use. Steve Jobs, quite famously, saw that immediately and used the technology developed at PARC to redefine the industry.
So Xerox was both visionary and wildly off the mark. The system that it did build, the Star, was indeed way ahead of its time. However, it wrongly assumed that the opportunity was close to its existing business, when it should have been exploring new markets.
Ballmer’s Blunder
By any standard, Steve Ballmer has had an extraordinarily successful career. As Bill Gates’s right-hand man through Microsoft’s meteoric rise, he helped to shape one of the most dominant businesses ever. As a CEO, he delivered steady growth and fat margins, while guiding the company through the fallout of a major antitrust case and the end of the PC era.
Additionally, he built a strong platform to propel the company into the future by funding a new division, called Servers and Tools, that led to Microsoft’s strong position in cloud computing. Today, that business is growing at an annual rate of more than 100%.
Yet he also completely misjudged the shift to mobile computing. When the iPhone came out in 2007, Ballmer dismissed it, saying, “There’s no chance that the iPhone is going to get any significant market share. No chance.” It was an astounding blunder and Microsoft, in large part, missed out on the rise of the mobile web while Apple soared to unprecedented heights.
Today, Ballmer is widely regarded as a fool, and not without reason. Microsoft had a thriving enterprise business, which is why it saw cloud technologies early, but it had little traction in mobile phones. Like the executives at Xerox, he could only see mobile within the context of his own business and was unable to explore other perspectives.
Welcoming Our Robot Overlords
In their 2014 book, The Second Machine Age, Erik Brynjolfsson and Andrew McAfee argued that as cognitive tasks become automated, professions ranging from law and medicine to journalism and truck driving are being upended. In effect, we are all, to paraphrase Jeopardy champion Ken Jennings, in danger of being replaced by robot overlords.
It’s a troubling vision, yet over the past few months I’ve spoken to a number of people implementing cognitive technologies and, so far, no one seems to think that they are replacing workers. Some even told me that their customers were facing a labor shortage and struggling to find the workers they need.
Of course, this is merely anecdotal evidence and needs corroboration, but when I checked the economic data, it certainly seemed to hold true. Consider this: since 1970 the US workforce has doubled, but even with the larger supply of workers, labor participation rates have risen by more than 10% during the same period.
So which is it? Will robots take our jobs or create new ones? The truth is that it’s impossible to tell, because the future has not been created yet. We’ll need to do that ourselves.
Innovation Needs Exploration
We often treat innovation as if it were an action movie. We expect heroes to be constant and true and villains to be fundamentally flawed. We want to see the world in terms of “the visionary geniuses” vs. “the hopeless buffoons” and often ignore the simple fact that a little bit of both exists in all of us.
Xerox was able to innovate for the office because it had spent decades serving corporate clients, but had no experience with consumer products. Microsoft had a strong business in server technology, but mobile phones were mostly a sideline. Academics like Brynjolfsson and McAfee can analyze trends, but have no special insight into data that doesn’t exist yet.
The truth is that patterns can only be validated backward, never forward, so there is no way to know for sure what comes next. Unfortunately, we have a tendency to jump to conclusions based on the information that is most readily available and discount things that lie outside our immediate field of vision.
That’s why we need to stop trying so hard to predict the future and focus more on exploring the unknown. Innovation seldom arises from what we already know, but is driven by discovering what we still need to learn.
– Greg
An earlier version of this article first appeared on Inc.com
Great article on a great blog. You brought back memories of debating possible business use of the internet with Clifford Stoll at Cody’s bookstore in Berkeley, shortly after my book, Cyberpower for Business, came out. Stoll called me “foolish” because Jeff Senne and I had some experience of the online world and we thought businesses would seize on it to increase revenues, cut costs, and make operations easier. I specifically remember him saying that no business would invest in a T1 phone line, and that without one, business on the net would be impossible.
Jeff and I weren’t predicting a specific future. We just figured that businesses would see the opportunity the internet offered and figure out ways to exploit it. Stoll was as you describe him, assuming that technology and costs would remain unchanged. It seems silly now, but it was a reasoned and informed opinion that many “experts” shared at the time.
Whenever I start feeling really good about how right I was about the net, I remember a panel that the Wall Street Journal convened a couple of years earlier. We were asked what we thought the “killer app” for the internet would be. No one so much as mentioned the web, HTML, or Mosaic.
I think you’ve got it right: predicting the future is a fool’s errand. In addition to the fact that humans just don’t do it well, predicting locks you into defending your prediction instead of exploring possibilities. The future belongs to the explorers, not the map makers.
Great story! Thanks for sharing it Wally!
– Greg