These Are The Biggest Innovation Challenges We Must Solve Over The Next Decade
Every era is defined by the problems it tackles. At the beginning of the 20th century, harnessing the power of internal combustion and electricity shaped society. In the 1960s there was the space race. Since the turn of this century, we’ve learned how to decode the human genome and make machines intelligent.
None of these were achieved by one person or even one organization. In the case of electricity, Faraday and Maxwell established key principles in the early and mid 1800s. Edison, Westinghouse and Tesla came up with the first applications later in that century. Scores of people made contributions for decades after that.
The challenges we face today will be fundamentally different because they won’t be solved by humans alone, but through complex human-machine interactions. That will require a new division of labor in which the highest level skills won’t be things like the ability to retain information or manipulate numbers, but to connect and collaborate with other humans.
Making New Computing Architectures Useful
Technology over the past century has been driven by a long succession of digital devices. First vacuum tubes, then transistors and finally microchips transformed electrical power into something approaching an intelligent control system for machines. That has been the key to the electronic and digital eras.
Yet today that smooth procession is coming to an end. Microchips are hitting their theoretical limits and will need to be replaced by new computing paradigms such as quantum computing and neuromorphic chips. These new technologies will not be digital, but will work fundamentally differently from what we're used to.
They will also have fundamentally different capabilities and will be applied in very different ways. Quantum computing, for example, will be able to simulate physical systems, which may revolutionize sciences like chemistry, materials research and biology. Neuromorphic chips may be thousands of times more energy efficient than conventional chips, opening up new possibilities for edge computing and intelligent materials.
There is still a lot of work to be done to make these technologies useful. To make them commercially viable, important applications need to be identified and, much as with classical computers, an entire generation of professionals will need to learn how to use them. That, in truth, may be the most significant hurdle.
Ethics For AI And Genomics
Artificial intelligence, once the stuff of science fiction, has become an everyday technology. We speak into our devices as a matter of course and expect to get back coherent answers. In the near future, we will see autonomous cars and other vehicles regularly deliver products and eventually become an integral part of our transportation system.
This opens up a significant number of ethical dilemmas. If given a choice to protect a passenger or a pedestrian, which should be encoded into the software of an autonomous car? Who gets to decide which factors are encoded into systems that make decisions about our education, whether we get hired or if we go to jail? How will these systems be trained? We all worry about who's educating our kids; we should worry just as much about who's teaching our algorithms.
Powerful new genomics techniques like CRISPR open up further ethical dilemmas. What are the guidelines for editing human genes? What are the risks of a mutation inserted in one species jumping to another? Should we revive extinct species, Jurassic Park style? What are the potential consequences?
What’s striking about the moral and ethical issues of both artificial intelligence and genomics is that they have no precedent, save for science fiction. We are in totally uncharted territory. Nevertheless, it is imperative that we develop a consensus about what principles should be applied, in what contexts and for what purpose.
Closing A Perpetual Skills Gap
Education used to be something that you underwent in preparation for your “real life.” Afterwards, you put away the schoolbooks and got down to work, raised a family and never really looked back. Even today, Pew Research reports that nearly one in four adults in the US did not read a single book last year.
Today technology is making many things we learned obsolete. In fact, a study at Oxford estimated that nearly half of the jobs that exist today are susceptible to automation within the next 20 years. That doesn't mean that there won't be jobs for humans to do. In fact, we are in the midst of an acute labor shortage, especially in manufacturing, where automation is most pervasive.
Yet just as advanced technologies are eliminating the need for some skills, they are also increasingly able to help us learn new ones. A number of companies are using virtual reality to train workers and finding that it can boost learning efficiency by as much as 40%. IBM, with the Rensselaer Polytechnic Institute, has recently unveiled a system that helps you learn a new language like Mandarin. This video shows how it works.
Perhaps the most important challenge is a shift in mindset. We need to treat education as a lifelong need that extends long past childhood. If we only retrain workers once their industry has become obsolete and they’ve lost their jobs, then we are needlessly squandering human potential, not to mention courting an abundance of misery.
Shifting Value To Humans
The industrial revolution replaced the physical labor of humans with that of machines. The result was often mind-numbing labor in factories. Yet further automation opened up new opportunities for knowledge workers who could design ways to boost the productivity of both humans and machines.
Today, we’re seeing a similar shift from cognitive to social skills. Go into a highly automated Apple Store, to take just one example, and you don’t see a futuristic robot dystopia, but a small army of smiling attendants on hand to help you. The future of technology always seems to be more human.
In much the same way, when I talk to companies implementing advanced technologies like artificial intelligence or cloud computing, the one thing I constantly hear is that the human element is often the most important. Unless you can shift your employees to higher level tasks, you miss out on many of the most important benefits.
What’s important to consider is that when a task is automated, it is also democratized and value shifts to another place. So, for example, e-commerce devalues the processing of transactions, but increases the value of things like customer service, expertise and resolving problems with orders, which is why we see all those smiling faces when we walk into an Apple Store.
That's what we often forget about innovation. It's essentially a very human endeavor and, for it to count as true progress, humans always need to be at the center.
– Greg
An earlier version of this article first appeared in Inc.com
Image: Pixabay
Is progress the product of a group working together or one inspired person going where everyone else says not to go? In history, it has been both. I suspect that will be true in the future as well.
How odd… You say ” the one thing I constantly hear is that the human element is often the most important.” Yet at another time you also talk about the “digital transformation” that is all the rage in the corporate world and that is the process of automation, getting rid of people.
But that is exactly my point. There is no competitive advantage to using automation simply to cut costs. If those resources, especially human resources, are not deployed into creating value elsewhere then any benefits will be temporary and short lived.
– Greg
Well, that’s not what they are doing. They have companies like Uber in mind or maybe Google where a fairly small number of employees can dominate a market using superior software and generate indecent profits. What they want is “staffing on demand”. They sure as heck don’t want to pay employees. Come on, you read the same stuff I do. Now they are talking about cutting hours – specifically, it was at Amazon warehouses, but you know it’s what they want in other companies if they can cut costs. They are going to cut hours and pay. A lot of people are screaming that they’ve lost their jobs to offshoring and immigrants, but you’ve read that that isn’t really the case either. The main cause is automation and it’s barely kicked in yet. It’s a large part of why the middle class is shrinking. Employees are expensive, machines are efficient.
You quite reasonably look at the opportunities for businesses and improvements in service, but that isn't what happens. People wish there was someone to talk to when they call, but there is not. HR software drives people batty, but the HR department is 1/10 the size and far cheaper. That's how it's going: cheap, not quality. But I look at the social disruption. I think automation is going to lead to massive job losses.
I’m not sure what you mean by “what they are doing.” For any given context, there are a lot of companies doing lots of different things in lots of different ways. My point is that in talking directly with industry experts, the fairly unanimous consensus is that companies that are successful in implementing new technologies like cloud and AI repurpose talent to create new value. In other articles I have described how some companies are doing just that and increasing profits.
That doesn’t mean that there aren’t a lot of companies that see automation merely as a way to cut costs, but the fact that we are seeing unprecedented automation and a major labor shortage seems to indicate that automation isn’t killing many jobs.
– Greg
I hope you are right, but my view is different and there is no question but that the middle class is shrinking. I do think it will be amazing what will be possible with AI assist. I just played with an Azure Chatbot to see the possibilities. I’ve always wanted to make an expert system. It was text for an obvious reason, but you could ask it “what it knew”. It would reply which systems it knew, where the servers for the system were, it gave the SQL for tracking job data through the systems as well as querying the logs. While a bit trivial it seemed potentially very useful for maintaining systems. What I really want to see is AI applications to help medical diagnosis. I have been misdiagnosed repeatedly, including recently. … Oh, I’m waiting to see if I won a trivial prize for submitting a design for a medical data system – not for the medical staff at all, but to fulfill the needs of the patient to mitigate the well known dangers of hospital stays.
This is one of the best articles I have read recently on future innovation challenges. We live in perhaps the greatest age of technological innovation in human history. Yet many people are not experiencing the benefits of this progress, despite actively seeking to more fully participate in and benefit from new educational, financial, and work opportunities. While jobs that were once pathways to guaranteed prosperity have dramatically changed or disappeared, we believe that Inclusive Innovators, wielding technology as a tool, are creating solutions to this challenge today. Your statement that "The challenges we face today will be fundamentally different because they won't be solved by humans alone but through complex human-machine interactions" is a notable point worth highlighting.
Thanks so much! So glad you enjoyed it.
– Greg