Since World War II, the United States has been an innovation superpower. In virtually every advanced field, whether it be information technology, biotechnology, agriculture or renewable energy, America holds a leading position. Other nations may challenge in one field or another, but no one can match its depth and breadth.
To account for its success, many point to America’s entrepreneurial culture, its tolerance for failure and its unique ecosystem of venture funding. Those factors do play important roles, but the most important thing driving America’s success has been its unparalleled scientific leadership.
Most of this research is publicly funded. Take a look at any significant innovation, such as an iPhone, and you’ll find that most, if not all, of the technology came from some government program. Google itself got its start from a federal grant. Yet that poses a problem for managers. How can a private company turn public research into a competitive advantage?
In her bestselling book Mindset, psychologist Carol Dweck argues that people who see their skills as a fixed set of strengths and weaknesses tend not to achieve much. On the other hand, those who see their skills as dynamic and changeable are able to continually grow their abilities and soar to great heights.
Businesses are the same way. Most see their business models as a permanent facet of their DNA, so when their environment changes, they fail to adapt. That’s why 87% of the companies on Fortune’s original list of 500 top firms are no longer there. Over time, most companies get better and better at things that people want less and less.
Of course, that’s not always true. Firms like Procter & Gamble, General Electric and IBM still thrive after a century or more. The reason they endure is that they don’t see their business as fixed, but have continually reinvented themselves and are vastly different enterprises than when they started. In an age of disruption, the only viable strategy is to adapt.
Many people take it as an article of faith that technology is changing the world, but in his new book, The Rise and Fall of American Growth, economist Robert Gordon says otherwise. He points out—accurately—that productivity, which surged between 1920 and 1970, has stalled since then and is likely to stay that way.
The reason, he argues, is that earlier technologies, such as electricity, the internal combustion engine and antibiotics, had far-ranging effects while digital technology is fairly narrow by comparison. It's a serious argument and he may very well be right, but it also fails to take into account important second-order effects.
In The Singularity Is Near, computer scientist and inventor Ray Kurzweil explained that the endpoint of digital technology isn’t better devices and apps, but new technologies such as genomics, nanotechnology and robotics. Today, these are just beginning to have an impact, but over the next decade they will determine whether Gordon is right or not.
Business today moves fast. So we like simple statements that speak to larger truths. It always seems that if we can find a simple rule of thumb—or maybe 3 to 5 bullet points for the really big picture stuff—managing a business would be much easier. Whenever a decision needed to be made, we could simply refer to the rule and go on with our day.
Unfortunately, that often leads to cartoonish slogans rather than genuine managerial wisdom. Catchy ideas like “the war for talent,” “focus on the core” and “maximizing shareholder value” end up taking the place of thorough analysis and good sense. When that happens, we’re in big trouble.
The problem is, as the philosopher Ludwig Wittgenstein pointed out, “no course of action can be determined by a rule, because any course of action can be made out to accord with the rule.” In other words, rules may seem to make sense, but when we try to apply them in the real world we find that things are far more complicated and simple rules aren’t much help.
Technology isn’t what it used to be. Forty years ago, computers were strange machines locked away in the bowels of large organizations, tended to by an exclusive priesthood who could speak their strange languages. They were mostly used for mundane work, like scientific computation and back-office functions in major corporations.
Yet by the 1980s, technology had advanced enough to develop relatively cheap computers for personal use. No longer were they relegated to back rooms, but began to appear on desktops in homes and offices to be used for writing letters, doing taxes and even playing games.
We’re now entering a third paradigm, in which computers have shrunk even further and assist us with everyday tasks, like driving to a friend’s house or planning a vacation. These jobs are very different because they require computers to recognize patterns. To power this new era, IBM has developed a revolutionary new chip modeled on the human brain.
When Alexander Fleming, a brilliant but sometimes careless scientist, returned to his lab after a summer holiday in 1928, he found his work ruined. A bacterial culture he had been growing was contaminated by fungus and, as it grew, the fungus killed all the colonies it touched.
Most people would have simply started over, but Fleming switched his focus from the bacteria to the fungus itself. First, he identified the mold and the bacteria-killing substance, which he called “penicillin,” then he tested it on other bacteria cultures. Seemingly in a single stroke, Fleming had created the new field of antibiotics.
That’s how most people see innovation. A flash of brilliance and Eureka!, a new world is born. The truth is far messier. In fact, it wasn’t until 1943, some 15 years later, that penicillin came into widespread use, and only then because it was accelerated by the war effort. We need to discard old myths and deal with innovation as it really happens.
In 1882, Thomas Edison built the Pearl Street Station, his first steam-powered electrical distribution plant. In the years that followed, intense competition broke out between him and George Westinghouse, which became known as the War of the Currents, and the technology improved markedly in the coming decades.
As Robert Gordon pointed out in The Rise and Fall of American Growth, by 1940 life had been fully transformed. Even middle-class homes had most of the modern conveniences we enjoy today, including refrigerators, air conditioners, telephones and radios. Soon, they would have TVs as well.
Today, we’re going through a similar revolution. Just as electric light became competitive with gas light more than a century ago, renewable energy and electric cars are becoming competitive with technologies based on fossil fuels. Yet for the new technologies to become truly transformative, we need to develop a new generation of batteries to power them.
The underlying premise of any organization is to create value. Historically, firms have done so through engineering ever greater efficiency. By honing internal processes, optimizing the supply chain and reducing product inventories, managers could improve margins and create a sustainable competitive advantage.
That’s created a bias for simple, linear thinking. Adding extra variables to any process is bound to increase errors and diminish quality, so generations of managers have been trained to wring complexity out of the system in order to create streamlined operations. “Keep it simple, stupid” has become a common corporate mantra.
One of the biggest cop-outs in corporate life is to say, “we had a great strategy, but just couldn’t execute it.” Hogwash. Any strategy that doesn’t take into account the ability to execute is a lousy strategy to begin with. Strategy is not a game of chess, but depends on operational capacity.
The problem is particularly pervasive when it comes to content. For all of the talk about “brands becoming publishers,” most marketers are simply tacking on publishing functions to their existing operations without implementing any new processes or practices. That is a grave mistake.
As I previously wrote in Harvard Business Review, marketers need to think more like publishers, but they also need to act more like publishers if they are ever going to hold an audience’s attention rather than merely broadcast messages. If you can’t create a compelling experience, it doesn’t really matter what your content strategy is; it will fail.
When Steve Jobs was trying to lure John Sculley from his lofty position as CEO at Pepsi to Apple, he asked him, “Do you want to sell sugar water for the rest of your life, or do you want to come with me and change the world?” The ploy worked and Sculley became the first major CEO of a conventional company to join a hot Silicon Valley startup.
That same spirit pervades the tech world today. People go to Silicon Valley and other technology hubs not just to make money, but to make a positive impact on the world through innovation. By searching frantically for the “next big thing,” they hope to do well by doing good.
In The Rise and Fall of American Growth, economist Robert Gordon throws cold water on that notion. With a painstaking—and fascinating—historical analysis of U.S. productivity, he argues that the innovations of today pale in comparison to earlier in our history and that we might actually be entering a period of prolonged stagnation. He may very well be right.