In Sapiens, Israeli historian Yuval Noah Harari argues that it was the exploratory mindset that led to European dominance over the world. Other empires, such as the Chinese and the Ottomans, had far greater military and economic power in the 18th century. Yet it was the Europeans' quest for understanding that made the difference.
To explore, you first need to come to terms with your own ignorance. We find the accomplishments of men like Columbus and Magellan so impressive precisely because they didn't know what they were getting into. Yet they still had the courage to sail boldly into the unknown when no one else dared to venture forth.
Today, scientific exploration is what fuels the modern world. We look at an iPhone and see the genius of Steve Jobs, but forget about the work of men like Maxwell, Faraday, Einstein and Turing that led to it. So science budgets are cut and skeptical politicians grill researchers about the value of their work. Yet without exploration, there can be no advancement.
In 1980, an obscure professor at Harvard Business School named Michael Porter published Competitive Strategy, which called for managers to drive efficiency by optimizing their firm's value chain, maximizing bargaining power with buyers and suppliers, and minimizing threats from new market entrants and substitute goods.
These concepts launched Porter into the top rank of strategic thinkers and profoundly influenced how businesses were run. Much like chess grandmasters, CEOs worked to develop the right sequence of strategic moves that would position their firms to exert power and dominate their respective industries.
Yet much has changed since then. Rather than an orderly marketplace defined by clear boundaries of industry and geography, we operate in a semantic economy where everything is connected. The most important assets are no longer the ones we control, but those that reside in ecosystems that we access through platforms. That changes the game entirely.
“Build a better mousetrap, and the world will beat a path to your door” is a phrase that has inspired generations of innovators and entrepreneurs. Unfortunately, they often find that a better mousetrap is not enough. You also need to find a customer who wants to buy a mousetrap at a price at which you can profit.
That, essentially, is the idea behind the lean startup, a concept originally developed by Steve Blank and then popularized by Eric Ries in his bestselling book. Blank’s key insight is that you can’t run a new business the same way you would manage an existing enterprise.
Today, concepts that Blank pioneered, such as customer development, the minimum viable product and the pivot, have become de rigueur for Silicon Valley entrepreneurs, but are still largely unknown in the greater business world. Yet it's becoming clear that lean startup methods can be effective for anyone who is trying to bring a new product or service to market.
When the world's first digital computer was completed in 1946, it opened up vast new worlds of possibility. Still, early computers were only used for limited applications because they could only be programmed in machine code. It took so long to set up problems that they were only practical for massive calculations.
That all changed when John Backus created FORTRAN, the first widely used programming language, at IBM in 1957. For the first time, real-world problems could be quickly and efficiently transformed into machine language, which made computers far more practical and useful. In the 1960s, the market for them soared.
Blockbuster Video is a cautionary tale, but not for the reason most people think it is. As the story is usually told, the executives at the now defunct video rental giant ignored the threat coming from Netflix until it was too late. Their belated efforts to meet the challenge proved futile, and the company went bankrupt in 2010.
The real story, however, is much more ambiguous. It's easy to conjure up images of "fat cat executives" who are asleep at the wheel. It's far more unsettling to realize that we can come up with the right strategy, execute well and still fail. The truth is that the world is a messy place, and if we are to learn something useful from stories, we need to get them right.
The field of artificial intelligence got its start at a conference at Dartmouth in 1956. Optimism ran high, and it was believed that machines would be able to do the work of humans within 20 years. Alas, it was not to be. By the 1970s, funding dried up and the technology entered the period known as the AI winter.
Yet for all the dazzling technological wizardry, we've seen little effect on most businesses. Artificial intelligence, much like PCs in the early 1980s or the Internet in the early 1990s, remains little more than a curiosity for most managers. So I talked to Josh Sutton, who heads up the AI practice at Publicis.Sapient, to learn about what we can expect in the years to come.
The history of business has been defined by rivalries. Great companies, such as GM and Ford, Coke and Pepsi, General Electric and Westinghouse, were locked in mortal combat, and the choice for customers was binary. Strategy was a zero-sum game. You either won or lost. There was no in-between.
Businesses approached their value chains the same way. They pushed hard to maximize their bargaining power with customers and suppliers, while at the same time taking steps to defend against new market entrants and substitute goods. Business was seen as war, and managers looked to Sun Tzu and von Clausewitz for guidance.
Now, however, it's becoming increasingly clear that no one can go it alone and firms must choose where they will cooperate and where they will compete. This is especially true when it comes to innovation, where no single entity is likely to be able to develop more than a piece of the overall puzzle. Today, open systems are not a choice; they are an imperative.
Every age has its achievements. The fifties brought us the post-war boom and the rise in living standards that came with it. The sixties brought us moonshots and the seventies free love. We ended the Cold War in the eighties and embraced the Internet in the nineties. Every generation leaves something to build on.
The converse is also true, and the sins of each generation are visited on the next. Jim Crow and the Red Scare brought us race riots and Vietnam. Free love and experimentation helped lead to the AIDS and cocaine epidemics. Greater prosperity and industrialization have lifted entire populations, but are also warming the planet.
Our current age is unique in that many of our most profound problems arise from our past achievements. Rising longevity, prosperity, connectivity and automation are contributing to unsustainable healthcare costs, terrorism and income inequality. Every year, innovators from around the world gather at the BIF Summit to suss out solutions to problems like these.
Take a look at any successful enterprise and you'll find innovation at its core. That was just as true a hundred years ago, when Henry Ford perfected the assembly line, as it is today, when modern-day giants like Elon Musk bring cutting-edge technology to market. Innovation, as I've written before, is how people come up with novel solutions to important problems.
The tricky part is that every organization faces different types of challenges. Some, like Intel, focus on improving old technologies, while others, like MD Anderson Cancer Center, strive to make fundamental new discoveries. There are also those that innovate business models, marketing campaigns and many other things.
That's why there is no one "true path" to innovation. There are, in fact, as many ways to innovate as there are types of problems to solve. However, in researching my upcoming book, Mapping Innovation, I noticed universal traits in every organization I looked at. From corporate giants to startups to world-class labs, here are the six things they had in common.
In the PC era, the big rivalry was between Microsoft and Apple. Apple’s products were considered to be better, but Microsoft’s ability to leverage its operating system across a number of manufacturers proved to be the stronger model. By 1997, Apple was in such bad shape it needed an investment from Microsoft to keep the lights on.
Yet Apple came back with a vengeance in the post-PC era, in which its ability to seamlessly integrate across devices was decisive. Apple products became more than just productivity tools; they became fashion icons, and soon Apple took Microsoft's former place as the most valuable company in the world.
That rivalry is mostly over now, but a new one is brewing between Google and IBM. It’s an unusual business rivalry because the two rarely compete in the same markets or for the same customers. In truth, it is a rivalry for technical rather than market dominance. Yet much like Apple vs. Microsoft, it’s likely to determine much about how technology shapes our world.