In Mindset, psychologist Carol Dweck argues, based on decades of research, that how we see ourselves is a major factor in what we can achieve. If we see our abilities as fixed, we tend not to go very far. However, if we see our capabilities as dynamic and changeable, we will work to improve them and are more likely to attain excellence.
The same can be said about a field like marketing. If we see the rules as fixed, we’ll tend to be limited by conventional barriers of achievement. But if we see that paradigms are, to a large degree, self-imposed, then the possibilities are endless. We are only bound by what we believe.
That’s why it’s important that we learn how to shift our mental models. While the tried-and-true gives comfort and has a track record to fall back on, the new and different often feels like a reckless shot in the dark. Still, we are living in an era of extreme change and we have no alternative but to keep pace. In an age of disruption, the only viable strategy is to adapt.
When scientists decoded the human genome in 2001, they found something astounding. While our DNA provides the blueprint for everything about us—from how we develop in the womb to eye color and personality traits—it takes only about 20,000 genes to do so, less than one-fifth of what had previously been thought.
What was even more mind-blowing was the reason they had been so far off the mark. While our genome would seem to be the model of efficiency, squeezing all that information into a microscopic nucleus, 98% of our DNA is “junk” that doesn’t code for anything. How could our biology be so wasteful?
In The Selfish Gene, the eminent biologist Richard Dawkins explains that the confusion arises because we assume that DNA exists for our sakes rather than the other way around. We, he argues, are mere vehicles to propagate genes. Much the same can be said about ideas in an enterprise. All too often, we fail to recognize what our business’s DNA is telling us.
In 2014, a Muslim student at the University of Michigan was harassed for a satirical column he wrote about the oversensitivity of students at his school. As Jonathan Chait described in a post in New York Magazine, the student was viewed as a perpetrator rather than a victim because he mocked politically correct norms.
Yet just in case you think that political correctness is strictly the realm of the liberal left, consider the case of Larycia Hawkins, a professor at Wheaton College who was forced out after expressing sympathy for Muslims following the Charlie Hebdo attack, or Steven Salaita, a professor censured for criticizing Israel.
Political correctness, all too often, is in the eye of the beholder. One person’s empathy is another’s insensitivity, or so it would seem. But whatever your opinion of the merits and demerits of political correctness, it is, at least in part, a technological phenomenon. So perhaps instead of the usual vicious cycle of recriminations, we should look for deeper roots.
On December 9th, 1968, a research project funded by the US Department of Defense launched a revolution. Its focus was not a Cold War adversary or even a resource-rich banana republic, but rather an effort to “augment human intellect,” and the man driving it was not a general, but a mild-mannered engineer named Douglas Engelbart.
His presentation that day would be so consequential that it is now called The Mother of All Demos. Two of those in attendance, Bob Taylor and Alan Kay, would go on to develop Engelbart’s ideas into the Alto, the first truly personal computer. Later, Steve Jobs would take many elements of the Alto to create the Macintosh.
So who deserves credit? Engelbart for coming up with the idea? Taylor and Kay for engineering solutions around it? Jobs for creating a marketable product that made an impact on the world? Strong arguments can be made for each, as well as for many others not mentioned here. The truth is that there are many paths to innovation. Here are nine of them.
In 1965, Intel cofounder Gordon Moore published a remarkably prescient paper which observed that the number of transistors on an integrated circuit was doubling every two years and predicted that this pace would lead to computers becoming embedded in homes, cars and communication systems.
Yet Moore’s Law has been fraying for years and experts predict that it will soon reach its limits. Still, when I spoke to Bernie Meyerson, IBM’s Chief Innovation Officer, he argued strongly that the end of Moore’s Law doesn’t mean the end of progress. Not by a long shot. What we’ll see, though, is a shift in emphasis from the microchip to the system as a whole.
In David and Goliath, bestselling author Malcolm Gladwell explains how small upstarts often have surprising advantages over larger, more powerful opponents. “Giants are not what we think they are,” he writes, “and that often makes us fail to appreciate less conventional strategies that may be superior.”
That’s certainly true in business. Large enterprises must serve the present. Things are expected of them. They have to keep customers, employees and other stakeholders happy. These obligations often weigh them down and make them vulnerable to disruptive innovation.
Yet that shouldn’t blind us to the fact that startups make for such enticing stories precisely because their success is so unlikely. Most fail. And those that do succeed become Goliaths themselves and face the same challenges as incumbents do. So instead of glorifying startups, perhaps we should take a closer look at what it takes to stay on top once you get there.
After a terrorist attack, we demand increased vigilance because we perceive an increased threat. Yet The Washington Post reports that we’re not only more likely to die of more mundane causes, like a bee sting or even getting hit by lightning, but even those remote odds are on a downward trend.
Cognitive scientists call this availability bias. Terrorism makes for a compelling news story. There is agency, a backstory and political ramifications that get reported on heavily. That makes the danger seem more clear and present, so we feel more compelled to act on it. Squeaky wheels, in effect, get the grease.
Availability bias is more than a simple academic curiosity. It encourages us to react swiftly to tragic events, but ignore slow-moving trends that will have a far greater impact. Today, aging, decreased poverty and automation are, at first glance, positive trends—and they are—but they are also starting to create problems that we haven’t even begun to think seriously about.
I recently got a call from my mother asking me to help her watch House of Cards on Netflix. She was frustrated and complained, “I keep pressing the thing and nothing happens!” It was hard to get her to understand I had no idea what thing she was pressing or what was supposed to happen when she did.
I’m still not exactly sure what the problem was, but getting her to understand that the buttons on her remote had little to do with the TV in her bedroom and everything to do with giving instructions to servers in faraway places seemed to help. Before long, her frustration with technology turned to fascination with the political machinations of Frank Underwood.
Many businesses have the same problem as my mother. As technology advances, its function evolves, and those that are unable to shift their mental models find themselves unable to compete. This is especially true of digital technology, where every generation sees a new crop of players emerge while old titans falter. Only a rare few are able to cross the chasm.
The media business used to be fairly simple. It operated on a linear model, consisting of content, distribution and audience, with a small priesthood of publishers, producers and programming executives making editorial decisions for the rest of us. Stars were created by the choices they made.
The Internet blew that model apart. Today it is obvious—even trite—to say that anyone, anywhere can get their voice heard. Platforms like YouTube, Flipboard and Pandora offer a cacophony of voices—from major media companies to journeyman professionals and hopeful amateurs.
For the most part, the removal of the distribution choke-point has been a good thing, but it also comes with a built-in problem. Increase the supply of anything and prices will fall, which makes it harder for creators to finance their work. Now, a number of new business models are rising up to fill the void, and they’re changing what we read, watch and listen to.
I recently watched a documentary about Kobe Bryant, the NBA legend. Kobe grew up in my hometown, so I was aware of his superstar status earlier than most people. At his high school games, he looked like a man playing with boys. No one was surprised when he was drafted by the NBA in the first round, at the age of 17.
We expect genius to always look like that. When we see someone of extreme accomplishment, it is almost inconceivable that their special gifts weren’t always apparent, but often they weren’t. To take just one famous example, Albert Einstein wasn’t even made a full professor until 1911, six full years after his miracle year.
The problem is that true genius defies convention, and it is by conventional standards that we ordinarily define achievement. When someone comes along with a completely new paradigm, it usually looks like nonsense and tends to be ignored. Yet some people have the ability to recognize brilliance in an idiotic guise, and that is itself a special kind of genius.