In Mindset, psychologist Carol Dweck argues, based on decades of research, that how we see ourselves is a major factor in what we can achieve. Whether it is children in school or executives in a boardroom, the mindset people adopt has a significant influence on how they perform.
Yet what she doesn’t say is that we need different mindsets for different jobs. A successful mindset for one set of tasks may hinder our performance in another. For example, aggression and competitiveness may work great for a professional athlete on the field, but not so great for building a productive home life.
Most of the changes in mindset we need to make, however, are far more subtle. They lack the social and environmental cues that delineate work from home life. So we need to watch for when best practices in one area lead to poor performance in another and shift our mental models accordingly. In an age of disruption, we need to learn to adapt.
read more…
We all have a change we want to see happen. For some of us, it is something in our organization or industry. Other times, it is something in our community or throughout society as a whole. That’s why people start companies or join groups at church and school. Sometimes groups connect with other groups and the call for change becomes a movement.
Creating true change is never easy. Most startups fail. Most community groups never get beyond small local actions and, even when a spark catches fire, as in the case of the Occupy movement, it often seems to fizzle out almost as fast. The status quo is, almost by definition, well-entrenched and never gives up without a fight.
Yet the kids from Stoneman Douglas High School seem to be succeeding where so many others have failed. Their “March for Our Lives” protest was one of the biggest since the Vietnam War and they seem to be getting real results. Their home state of Florida has already passed new gun legislation and other states plan to do the same. Here’s why they’re winning:
read more…
We were told that “data is the new oil.” The Internet of Things combined with the ability to store massive amounts of data and powerful new analytical techniques like machine learning would help derive important new insights, automate processes and transform business models. It seemed like a massive opportunity.
Yet Gartner analyst Nick Heudecker estimates as many as 85% of big data projects fail, due to a lack of data skills, poor internal coordination between departments and lack of integration with line managers and staff. Implementing a big data project, it seems, is far more challenging than installing a new email system.
The news isn’t all bad. A survey by Deloitte of “aggressive adopters” of cognitive technologies found 76% believe that they will “substantially transform” their companies within the next three years. So while big data and cognitive technologies are no panacea, they can deliver value, if pursued wisely. Here’s how you can keep your data project from going off the rails.
read more…
Over the past few decades, Silicon Valley has been such a powerful engine for entrepreneurship in technology that, all too often, it is considered to be some kind of panacea. Corporate executives seek to inject “Silicon Valley DNA” into their cultures and policymakers point to venture-funded entrepreneurship as a solution for all manner of problems.
This is a dangerous mindset. The Silicon Valley model, for all of its charms, was developed for a specific industry, at a specific time, which was developing a specific set of technologies. While it can offer valuable lessons for other industries and other problems, it is not universally applicable.
The myth of Silicon Valley is that its model can be applied to every problem, when in actuality it is one that was built to commercialize mature technologies for specific markets. We’re now entering a new era of innovation and that model doesn’t fit as well as it once did. We need to develop a new innovation ecosystem to stay competitive in the 21st century.
read more…
We tend to think of innovation as being about ideas. A lone genius working in a secret lab somewhere screams “Eureka!” and the world is instantly changed. But that’s not how the real world works. In truth, innovation is about solving problems and it starts with identifying useful problems to solve.
It is with that in mind that IBM comes out with its annual list of five technologies that it expects to impact the world in five years. Clearly, each year’s list is somewhat speculative, but it also gives us a look at the problems that the company considers to be important and that its scientists are actively working on solving.
This year’s list focuses on two aspects of digital technology that are particularly important today. The first is how we can use digital technology to provide a greater impact on the physical world in which we all live and work. The second, which is becoming increasingly crucial, is to make those technologies more secure to protect both privacy and commerce.
read more…
When Peter Drucker first met IBM’s visionary CEO, Thomas J. Watson, he was somewhat taken aback. “He began talking about something called data processing,” Drucker recalled, “and it made absolutely no sense to me. I took it back and told my editor, and he said that Watson was a nut, and threw the interview away.”
That was back in the early 1930s, when “computers” were usually teams of women who performed rote calculations. The idea that data could be a valuable commodity just wasn’t on anyone’s radar yet and, in fact, wouldn’t be for decades. That would take not only advancement in technology, but also a transformation in business practices.
There were two major eras of innovation in the 20th century. The first hit its stride in the 1920s and the second had its biggest impact in the 1990s. We’re now on the brink of a new era of innovation and its impact will likely be profound. Much like Drucker back in the 1930s, though, we are still unable to fully grasp what is yet to come.
read more…
Quantum computing is the hot new thing. With Moore’s Law ending, there is a mad rush to find a new avenue for advancement. To optimists, quantum computing will fit the bill nicely and we’ll make the transition smoothly. To pessimists, the technology will kill encryption and bring down global commerce with it.
Neither of these things is even remotely true. Quantum computing is not the only way to improve computing performance. There are a variety of other approaches, including ASICs, FPGAs and neuromorphic chips, that will play a part. The apocalyptic visions of killing encryption aren’t worth taking seriously.
The truth is, in many ways, more exciting. Innovation is never a single event, but a process of discovery, engineering and transformation. Quantum computing is already deep into the engineering phase and the transformational impact will begin to take shape in the next 5-10 years. On a recent trip to IBM Research, I got a much better sense of what that will look like.
read more…
It’s no secret that General Electric has fallen on hard times. Its CEO, Jeffrey Immelt, was forced to step down last June. In December, it announced that it was laying off 12,000 employees in its massive power business. Its stock has lost half of its value and there’s talk that it may get kicked off the Dow after 110 years.
In an earlier age, such a dismal performance would likely have been attributed to corporate rot — an aging industrial firm gets fat and lazy and loses its competitive edge. Yet there’s no sign of that at GE. On the contrary, it seems to be a strong operational company that is highly competitive in the markets in which it competes.
The problem with GE, it appears, is that it has become a square-peg business in a round-hole world. It’s not that it’s gotten lazy, but that it invested heavily in getting better and better at things people care less and less about. That’s a problem we rarely talk about. We like to believe that success breeds more success, but the truth is that success often breeds failure.
read more…
IBM, to a large degree, invented the information technology industry. For the first half of the 20th century, it dominated the market for tabulating machines. Then digital computing posed new challenges and, by the 1950s, it had begun to cede ground to UNIVAC, which led to Thomas Watson Jr.’s $5 billion gamble to build the System/360.
That effort was transformative, but by the 1980s the company had fallen behind again and it was only the crash development of the PC that saved it from irrelevance. Yet this time, it did not return to dominance, but was consistently outmaneuvered by smaller and nimbler competitors, like Microsoft and Intel.
Lessons from one era often cannot be applied to the next. In the 50s and 60s, IBM’s singular focus proved decisive. A generation later, agility and speed to market became key attributes. Today, we’re entering a new era of innovation in which the basis of competition will shift from disruptive to fundamental technologies. Here’s what you need to do to win:
read more…
It’s become conventional wisdom that the last 30 years have been a hotbed of innovation, but evidence suggests otherwise. As Robert Gordon explains in The Rise and Fall of American Growth, productivity growth peaked between 1920 and 1970 and has declined ever since. Economist Tyler Cowen calls this the Great Stagnation.
Part of the reason for the dissonance is that information and communication technologies, which have been advancing quickly, make up a relatively small share of the economy — about 6% of GDP in advanced countries. Manufacturing, which makes up 17% of the global economy, gets relatively short shrift.
Not all of the news is bad: Deloitte recently ranked the US second to China in its Manufacturing Competitiveness Index. Still, the fact is that we’ve lost 5 million manufacturing jobs since 2000. Clearly, that’s a disturbing trend and one we need to reverse. More to the point, there are four things we need to do to ensure that we can compete in the 21st century.
read more…