Innovation has become like a religion in business today, with “innovate or die” as its mantra. When a company succeeds, people attribute its good fortune to superior innovation. When it fails, people say it lacked the ability to innovate, no matter how many new products it launched. The message is simple: you need to disrupt to survive.
So it shouldn’t be surprising that there is no shortage of people offering silver bullets. They promise a “secret sauce” that will unlock the creativity in your organization. They preach disruption, open innovation, lean launchpads or whatever else is the flavor of the day with the passion and surety of evangelical ministers.
The truth is that there is no one true path to innovation. Compare any two great innovators and they inevitably do things very differently. So if you choose to emulate one, you are in a sense rejecting the other, which may be equally or even more successful. The only real path forward is to define the problems you seek to solve and build your own innovation playbook.
When Eric Haller graduated with a degree in finance from San Diego State University in 1990, he didn’t have any clear idea of what he wanted to do. If he had been on the East Coast, he probably would have gone to work at an investment bank, but he ended up taking a job as a statistical analyst at Visa in Foster City, little more than a day’s drive away.
He would later say that it was a case of being in the right place at the right time. The credit card industry was at the beginning of a major transformation, moving from voice clearing and paper transactions to electronic commerce. New database solutions, like SAS and Oracle, made it possible to analyze transactions like never before.
Before long, Haller’s lowly position as a statistical analyst took on great importance and he quickly moved up the ranks. Still, he saw that there was something bigger going on—an emerging marketplace for data. It was around that time that he heard of a company that not only shared his vision, but was creating a business that was destined to be at the center of it.
Studies show that over 90% of startups fail. Even for those rare few that make it big, life doesn’t get much easier. In fact, only slightly more than 10% of the firms on the original Fortune 500 list are still in business today. Making an enterprise successful and keeping it that way is a staggeringly hard thing to do.
So it’s not hard to see why there has been so much effort devoted to narrowing down a company’s performance to a single factor. Some say that focusing on the customer is key. Others believe that building a great culture is the true path to success. Still others preach the gospel of developing capable leaders.
In The Halo Effect, IMD’s Phil Rosenzweig pours cold water on all of these. Firms that are successful, he points out, are perceived as being customer focused, having a great culture and building strong leadership, but when those firms hit hard times, critics claim that they are failing in those very same areas. To truly understand performance, we have to look deeper.
The truth is that innovation is never a single event, but rather the confluence of efforts undertaken by many different people and organizations. That’s why electricity only became really useful when factories adapted to it and complementary innovations, like home appliances and air conditioners, took root.
Today, we’re at a similar point of convergence to what happened a century ago. Digital technology, which has been around for decades, is beginning to power new complementary technologies, such as genomics, nanotechnology and robotics. By coincidence, these forces will all converge around the year 2020 and, after that, the world will be profoundly different.
Jules Verne, the 19th century science fiction writer, made a number of predictions, like submarines, space travel and even newscasts, that turned out to be accurate. His visions of the future were so vibrant that many inventors and scientists in the 20th century took inspiration from his work.
Yet many of our most difficult challenges couldn’t have been imagined even by a genius like Verne. The obesity epidemic, climate change and the economic and health issues that come with longer life spans wouldn’t have made any sense to a 19th century audience, when progress meant more to eat, larger industry and less mortality.
In much the same way, some of the thorniest problems we’ll have to face will come from the unintended consequences of advances we make today. What will make these so difficult to overcome is that they cannot be solved in a research lab or a think tank, but only in the public square. Unfortunately, we haven’t really even begun to think them through.
Since World War II, the United States has been an innovation superpower. In virtually every advanced field, whether it be information technology, biotechnology, agriculture or renewable energy, America holds a leading position. Other nations may challenge in one field or another, but no one can match its depth and breadth.
To account for its success, many point to America’s entrepreneurial culture, its tolerance for failure and its unique ecosystem of venture funding. Those factors do play important roles, but the most important thing driving America’s success has been its unparalleled scientific leadership.
Most of this research is publicly funded. Take a look at any significant innovation, such as an iPhone, and you’ll find that most, if not all, of the technology came from some government program. Google itself got its start from a federal grant. Yet that poses a problem for managers. How can a private company turn public research into a competitive advantage?
In her bestselling book Mindset, psychologist Carol Dweck argues that people who see their skills as a fixed set of strengths and weaknesses tend not to achieve much. On the other hand, those who see their skills as dynamic and changeable are able to continually grow their abilities and soar to great heights.
Businesses are the same way. Most see their business models as a permanent facet of their DNA, so when their environment changes, they fail to adapt. That’s why 87% of the companies on Fortune’s original list of 500 top firms are no longer there. Over time, most companies get better and better at things that people want less and less.
Of course, that’s not always true. Firms like Procter & Gamble, General Electric and IBM still thrive after a century or more. The reason they endure is that they don’t see their business as fixed, but have continually reinvented themselves and are vastly different enterprises than when they started. In an age of disruption, the only viable strategy is to adapt.
Many people take it as an article of faith that technology is changing the world, but in his new book, The Rise and Fall of American Growth, economist Robert Gordon says otherwise. He points out—accurately—that productivity, which surged between 1920 and 1970, has stalled since then and is likely to stay that way.
The reason, he argues, is that earlier technologies, such as electricity, the internal combustion engine and antibiotics, had far-ranging effects, while digital technology is fairly narrow by comparison. It’s a serious argument and he may very well be right, but it also fails to take into account important second-order effects.
In The Singularity Is Near, computer scientist and inventor Ray Kurzweil explained that the endpoint of digital technology isn’t better devices and apps, but new technologies such as genomics, nanotechnology and robotics. Today, these are just beginning to have an impact, but over the next decade they will determine whether Gordon is right or not.
Business today moves fast. So we like simple statements that speak to larger truths. It always seems that if we could find a simple rule of thumb—or maybe 3 to 5 bullet points for the really big picture stuff—managing a business would be much easier. Whenever a decision needed to be made, we could simply refer to the rule and go on with our day.
Unfortunately, that often leads to cartoonish slogans rather than genuine managerial wisdom. Catchy ideas like “the war for talent,” “focus on the core” and “maximizing shareholder value” end up taking the place of thorough analysis and good sense. When that happens, we’re in big trouble.
The problem is, as the philosopher Ludwig Wittgenstein pointed out, “no course of action can be determined by a rule, because any course of action can be made out to accord with the rule.” In other words, rules may seem to make sense, but when we try to apply them in the real world we find that things are far more complicated and simple rules aren’t much help.
Technology isn’t what it used to be. Forty years ago, computers were strange machines locked away in the bowels of large organizations, tended to by an exclusive priesthood who could speak their strange languages. They were mostly used for specialized work, like scientific computation and back-office functions in major corporations.
Yet by the 1980s, technology had advanced enough to develop relatively cheap computers for personal use. No longer were they relegated to back rooms, but began to appear on desktops in homes and offices to be used for writing letters, doing taxes and even playing games.
We’re now entering a third paradigm, in which computers have shrunk even further and assist us with everyday tasks, like driving to a friend’s house or planning a vacation. These jobs are very different because they require computers to recognize patterns. To power this new era, IBM has developed a revolutionary new chip modeled on the human brain.