When Steve Jobs launched the iPhone in 2007, many pundits were less than impressed. Some said that its unusual shape made it unwieldy. Others thought that it was too expensive. Still others remarked that all the extra software made it a poor choice for its primary function — making phone calls.
But part of Jobs’s genius was his ability to recognize patterns that others couldn’t. Executives at Xerox, for example, didn’t see much potential in the Alto, but he built the Macintosh based on it. When music players seemed like a dead end, he reimagined them with the iPod and transformed the industry.
The problem with patterns is that it’s so devilishly hard to tell the good ones from the bad. What may look like a promising pattern is often out of context or incomplete. Sometimes, we think we see a pattern that isn’t really there. That’s what makes innovation so difficult: we can never validate new patterns by looking backward; we can only test them going forward.
IBM recently announced that it broke the record for patents granted to a single company in a year, with 8,088 patents granted to its inventors, most of them in key emerging areas such as machine learning, cloud computing and cybersecurity. To put that number in perspective, it is more than were granted to Google, Microsoft, Amazon and Facebook combined.
Clearly, the number of patents IBM consistently puts out year after year is impressive, but some would also say that it’s excessive and irrelevant. Tesla has open sourced its patents while others, like Google, have open sourced key technologies. Apple, which receives relatively few patents, has dominated the industry for a decade.
Yet IBM’s commitment to patents is unwavering. Since 1993, the number of patents it has been awarded has increased at a compound annual rate of more than 9 percent. So to learn more about why IBM puts so much emphasis on patents, I talked to Bernie Meyerson, the company’s Chief Innovation Officer. What he had to say explains a lot about IBM’s strategy.
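That compound annual rate is easy to verify. As a minimal sketch, the 1993 starting count below is an illustrative assumption (only the 8,088 figure comes from the article), but the arithmetic itself is standard:

```python
def cagr(start_value, end_value, years):
    """Compound annual growth rate between two counts over a span of years."""
    return (end_value / start_value) ** (1.0 / years) - 1.0

# Illustrative: 8,088 is the record year's count from the article; the
# 1993 starting count of 1,087 is an assumption for the sake of the example.
growth = cagr(1087, 8088, 23)
print(f"{growth:.1%}")  # roughly 9 percent per year
```

With those inputs the rate works out to just over 9 percent, consistent with the claim in the text.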
In the 1930s, the great industrialist George Eastman engaged in a minor debate with his friend, the education reformer Abraham Flexner, about who contributed most to science. Eastman pointed to Guglielmo Marconi, who invented radio and transformed the world.
Yet Flexner argued that given the discoveries of scientists like James Clerk Maxwell and Heinrich Hertz, Marconi’s invention was inevitable. While neither man pursued any practical application of his work, it was their boundless curiosity that led them to the principles that created a revolution.
Chris Dixon has written that “the next big thing will start out looking like a toy,” but that is only half of the story. The truth is that the next big thing starts out looking like nothing much at all. Many great discoveries, such as that of penicillin, spent years lying in obscure journals before someone noticed that they could have a practical use. That’s where the future lies.
In Weapons of Math Destruction, mathematician and data scientist Cathy O’Neil paints a disturbing picture of how data can go awry. “Black box” algorithms that make decisions with little to no transparency or accountability can lead to bizarre situations in which judgments are handed down with no possibility of appeal.
For example, she tells the story of Sarah Wysocki, a teacher who, despite being widely respected by her students, their parents and her peers, was fired because she performed poorly according to an algorithm. She now works at another school district that uses humans to evaluate teachers.
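To make the "black box" idea concrete, here is a toy sketch, not the actual model used in the Wysocki case; the feature names, weights and cutoff are all invented for illustration. The point is that the person being judged sees only the verdict, never the weights or inputs behind it:

```python
# Invented weights: the evaluated teacher never sees these, or how her
# inputs are combined. Test-score movement dominates everything else.
HIDDEN_WEIGHTS = {"test_score_delta": 0.7, "attendance": 0.2, "peer_review": 0.1}

def black_box_score(features):
    """Weighted sum over hidden weights; opaque to the person scored."""
    return sum(HIDDEN_WEIGHTS[k] * features[k] for k in HIDDEN_WEIGHTS)

def verdict(features, cutoff=0.5):
    # The decision is binary and arrives with no explanation to appeal against.
    return "retain" if black_box_score(features) >= cutoff else "dismiss"

# A teacher rated highly by peers (0.95) is still dismissed, because the
# hidden weighting lets one noisy input outvote everything else.
print(verdict({"test_score_delta": 0.3, "attendance": 0.9, "peer_review": 0.95}))
# prints "dismiss"
```

Even this trivial model shows the failure mode O’Neil describes: the output carries the authority of data while the reasoning stays invisible.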
Yet Cava Grill, a restaurant chain similar to Chipotle but focused on healthy Mediterranean cuisine, shows that the problem really isn’t with data or algorithms, but with us. The firm has built a strong culture around data even among its front-line employees. The secret, as it turns out, has nothing to do with technology, but with what your culture is like to begin with.
Much has been made about the difference between innovation and invention. One writer went so far as to argue that Steve Jobs’s development of the iPod wasn’t an innovation because it was dependent on so much that came before it. A real innovation, so the argument goes, must be truly transformational, like the IBM PC, which created an entire industry.
The problem with these kinds of word games is that they lead us to an infinite regress. The IBM PC can be seen as the logical extension of the microchip, which was the logical extension of the transistor. These, in turn, arose in part from earlier developments, such as Turing’s universal computer and the completely irrational science of quantum mechanics.
The truth is that innovation is never a single event, but happens when fundamental concepts combine with important problems to create an impact. Traditionally, that’s been done within a particular organization or field, but to come up with breakthrough ideas in the 21st century, we increasingly need to transcend conventional boundaries of company and industry.
Is America out of ideas? Greg Ip of The Wall Street Journal seems to think so. In a recent article, he points out, correctly, that total factor productivity growth has “steadily fallen” since its peak in the 1950s and ’60s. According to Ip, rising research costs and greater regulatory burdens have reduced our ability to innovate.
“Outside of personal technology, improvements in everyday life have been incremental, not revolutionary,” he writes. “Houses, appliances and cars look much like they did a generation ago. Airplanes fly no faster than in the 1960s. None of the 20 most-prescribed drugs in the U.S. came to market in the past decade.”
This is not a new argument. In fact, economist Robert Gordon makes many of the same points in his book, The Rise and Fall of American Growth. Still, while the issues that both Ip and Gordon raise are very real, they are only part of the story. Innovation is not about the past but the future, and we may very well be entering a new era of accelerated innovation.
Throughout history, social movements — small groups, loosely connected but united by a shared purpose — have created transformational change. Women’s suffrage and civil rights in the U.S., Indian independence, the color revolutions in Eastern Europe, and the Arab Spring all hinged on the powerless banding together against the powerful.
In these movements, protests play an important role. Consider the recent marches in Poland concerning an unpopular abortion law. They inspired millions to take further actions, including a women’s strike, that convinced lawmakers to back down.
Still, protests like the massive global marches that took place last weekend, although crucially important for creating transformational change, are merely a first step. There are clear reasons why some movements succeed and others fail, and activists need to take history’s lessons to heart. To truly make an impact, a movement needs to follow five steps:
When René Descartes wrote “I think, therefore I am” in the mid-1600s, he was doing more than coining a clever phrase; he was making an argument for a rational world ruled by pure logic. He believed that you could find the answers to problems you needed to solve merely by thinking about them clearly.
Yet Descartes and his rational movement soon ran out of steam. Many of the great minds that followed, such as John Locke and David Hume, took a more empirical view and argued that we can only truly understand the world around us through our experiences, however flawed and limited they may be.
A similar tension has been brewing in the 21st century with big data being used to build predictive models that drive human decisions. However, in its 5 in 5 — five predictions for the next five years — IBM Research sees a new era emerging in which software and instrumentation will combine to give us unprecedented insights into the physical world.
Henry Mintzberg once observed, “The great myth is the manager as orchestra conductor. It’s this idea of standing on a pedestal and you wave your baton and accounting comes in, and you wave it somewhere else and marketing chimes in with accounting, and they all sound very glorious.”
“But,” he continues, “management is more like orchestra conducting during rehearsals, when everything is going wrong.” In other words, leading people never turns out like you think it will. People, events and other factors often surprise you. That’s why the most important thing you do as a manager is to learn.
I first began managing people two decades ago, when I was in my mid-20s, and I probably wasn’t quite ready for it. But then again, you’re never quite ready to lead until you actually do it. Management is not about building and executing plans but, as Mintzberg suggests, the art of guiding teams through plans going awry. Here’s what I’ve learned about how to do that.
In 1977, at the Xerox World Conference in Boca Raton, FL, the company’s senior executives got a glimpse of the future. On display was a new kind of computer, the Alto, that was designed for a single person to use with nothing more than a keyboard and a small device called a “mouse” that you operated with one hand.
They were not impressed. The tasks the machine performed were mainly writing and handling documents (secretarial work, in other words), which did not excite them. And executives who measured their performance by how many copies they generated didn’t see how the thing could make money.
Times have changed, of course, and today it’s hard to imagine any executive functioning without a computer. We’re now going through a transformation similar to that of the 1970s. Today, every manager needs to work with data effectively. The problem is that most are as ill-equipped as those Xerox executives were. Here’s what you most need to know: