In the seven years since IBM’s Watson beat two human champions on the game show Jeopardy!, cognitive technologies have gone from a science-fiction pipe dream to a platform for essential business initiatives. Clearly, if you don’t have a plan for cognitive transformation, your chances of survival are dim.
Yet progress to this point has been uneven. While there have clearly been some successes, we’ve all had tortured moments, such as trying to reach a human on a customer service call. In some cases, things have gone seriously awry, such as when Amazon’s Echo ordered unwanted merchandise.
Progress is never smooth. The early industrial revolution certainly had more than its share of problems, as did the dotcom era (remember Webvan?). The key is to go in with your eyes open, understanding that every transformation has its growing pains. With that in mind, here are four things you should know about big data and artificial intelligence.
read more…
Every era is defined by the problems it tackles. At the beginning of the 20th century, harnessing the power of internal combustion and electricity shaped society. In the 1960s there was the space race. Since the turn of this century, we’ve learned how to decode the human genome and make machines intelligent.
None of these were achieved by one person or even one organization. In the case of electricity, Faraday and Maxwell established the key principles in the early and mid-1800s. Edison, Westinghouse and Tesla came up with the first applications later that century. Scores of people made contributions for decades after that.
The challenges we face today will be fundamentally different because they won’t be solved by humans alone, but through complex human-machine interactions. That will require a new division of labor in which the highest-level skills won’t be things like the ability to retain information or manipulate numbers, but the ability to connect and collaborate with other humans.
read more…
In recent decades, innovation has become almost synonymous with digital technology. That wasn’t always true, of course. Long before the microchip was invented, we built an industrial economy based on electricity and internal combustion, harnessed the power of the atom and developed life-saving cures.
Nevertheless, lately everything seems to be digital. Entrepreneurs who create software apps and consumer gadgets can find themselves zillionaires in a matter of a few years, or even a few months. Investors, not surprisingly, search far and wide to find the next Facebook or Google.
Today, however, we are entering a new era of innovation. The basic technology that powered the digital revolution is nearing its theoretical limits and the most exciting opportunities lie in new technologies. Many of these will be rooted in the physical world, as we learn to use bits to transform atoms. Some investors are already making the shift to a new model.
read more…
In 1961, the first minicomputer, called the PDP-1, arrived at the MIT Electrical Engineering Department. It was a revolutionary machine but, as with all things that are truly new and different, no one really knew what to do with it. Lacking any better ideas, a few of the proto-hackers in residence decided to build a game. That’s how Spacewar! was born.
Today, the creation of Spacewar! is considered a seminal event in computer history. Because it was a game, it encouraged experimentation. Hackers tried to figure out how to, say, simulate gravity or add accurate constellations of stars, and by doing so pushed the capabilities of both the machine and themselves.
Tech investor Chris Dixon has said that the next big thing always starts out being dismissed as a toy. Yet it’s because so many technologies start out as toys that we are able to experiment with and improve them. As virtual reality becomes increasingly viable, this human-machine co-evolution will only accelerate because, to create a new future, we first have to imagine it.
read more…
Expensive technology used to be a significant advantage for big companies. Large enterprises had the resources to hire consultants, invest in sophisticated systems and collect masses of data to analyze. That gave them better visibility into market trends, helped them automate processes and enabled them to make better decisions.
The cloud disrupted all that because it meant that world-class technology no longer needed a significant capital investment upfront. Today, anyone with an idea can sit at their kitchen table and access the world’s best technology with little more than a broadband connection. That’s been a real game changer.
It has also meant larger organizations have had to adapt. Cloud computing is not only much cheaper than legacy systems, it is also more flexible, adaptable and much easier to integrate with new capabilities like artificial intelligence. Yet moving your business to the cloud can also be a major challenge. Here’s what you need to know to get it right.
read more…
I’ve never really liked the phrase “innovate or die.” Why not “finance or die,” “sell or die” or even “manage or die”? Clearly every business function is essential and no organization can survive without building some competency in all of them. In an ultra-competitive business environment, you have to do more than just show up.
What makes great innovators different is that they succeed where most others fail. They not only come up with new ideas, they find ways to make them work and create value for the rest of us. Even more importantly, they are able to do it consistently, year after year, decade after decade.
Over the years, I’ve gotten to know many of these extraordinary people, and while each is impressive in their own way, what has struck me is not their differences but what they have in common. It seems that there are some things all great innovators share and, importantly, they are all things that we can do as well. So there is hope for the rest of us.
read more…
In 2011, IBM’s Watson system squared off on the game show Jeopardy! against two human champions, Brad Rutter and Ken Jennings. It beat them both so handily that for his last response Jennings simply wrote, “I, for one, welcome our new computer overlords.” It was an awesome display, unlike anything anyone had seen before.
The implications went far beyond the company or the game show. Watson’s triumph kicked off an arms race in artificial intelligence. Later that same year, Apple launched Siri, its personal assistant. In 2015, Google’s AlphaGo system beat a professional champion at the ancient board game Go, and Amazon launched its Echo smart speaker.
This summer, IBM raised the stakes again with its Project Debater, a system that can compete with skilled humans in debates on controversial topics. Much like Watson, Debater’s objective is not to launch a new product, but to expand horizons. While the full implications aren’t yet clear, we are surely embarking on a new era of possibility.
read more…
Every enterprise needs to innovate. It doesn’t matter whether you are a profit-seeking business, a nonprofit organization or a government entity, the simple truth is that every business model fails eventually, because things change over time. We have to manage not for stability but for disruption, or face irrelevance.
There is no shortage of advice for how to go about it. In fact, there is far too much advice. Design thinkers will tell you to focus on the end user, but Harvard’s Clayton Christensen says that listening too much to customers is how good businesses fail. Then there’s open innovation, lean startups and on and on it goes.
The truth is that there is no one path to innovation. Everybody has to find their own way. Just because someone had success with one strategy doesn’t mean that it’s right for the problem you need to solve. So the best advice is to gather as many tools for your toolbox as you can. Here are four things about innovation that you rarely hear but that are crucially important.
read more…
All too often, innovation is confused with agility. We’re told to “adapt or die” and encouraged to “move fast and break things.” But the most important innovations take time. Einstein spent ten years on special relativity and then another ten on general relativity. To solve tough, fundamental problems, we have to be able to commit for the long haul.
As John F. Kennedy put it in his moonshot speech, “We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard, because that goal will serve to organize and measure the best of our energies and skills.” Every organization should pursue grand challenges for the same reason.
Make no mistake: innovation needs exploration. If you don’t explore, you won’t discover; if you don’t discover, you won’t invent; and if you don’t invent, you will be disrupted. It’s just a matter of time. Unfortunately, exploration can’t be optimized or iterated. That’s why grand challenges don’t favor the quick and agile, but the patient and the determined.
read more…
In the 1960s and 70s, Route 128 outside of Boston was the center of technology, but by the 1990s Silicon Valley had taken over and never looked back. As AnnaLee Saxenian explained in Regional Advantage, the key difference was that while Route 128 was a collection of value chains, Silicon Valley built an ecosystem.
Clearly, ecosystems are even more important today than they were back then. In fact, a recent study by Accenture Strategy found that ecosystems are a “cornerstone” of future growth and that 60% of executives surveyed viewed ecosystems as a way to disrupt their industry. A similar number saw them as key to increasing revenue.
The problem is that competing in an ecosystem environment is vastly different from executing a traditional value chain strategy. While a value chain is driven by efficiencies, an ecosystem is driven by connections in a network. So we need to do more than adapt our strategy and tactics; we need to learn how to play a whole new game. The first step is to learn the rules.
read more…