J.P. Morgan believed in trusts. It seemed to him that excessive competition diminished profits and undermined capital formation, which he saw as essential to building a modern economy. Although that may seem like a strange point of view today, it was one widely held by 19th-century industrialists.
Today, however, efficiency will only get you so far. Three of the world’s five most valuable companies are not old-line industrial or financial firms, but fast-moving tech companies that prosper through agility rather than efficiency. Much like the industrialists of Morgan’s era, today’s managers need to adapt to new rules. Here are three trends you need to know.
Are communication technologies like Slack, Yammer and Skype actually helping us, or just getting in the way? Certainly, they have made it easier to communicate, share information and collaborate with colleagues, but what if all that extra communication is actually preventing us from getting important work done?
In a recent article in Harvard Business Review, Bain & Co. partner Michael Mankins estimates that while a typical executive in the 1970s might have received 1,000 messages a year, that number has skyrocketed to more than 30,000 today, and argues that we may “have reached the point of diminishing returns.”
I think just about everyone can see his point. Today, the number of meetings, emails and IMs we receive can seem overwhelming, and it’s increasingly hard to find uninterrupted quiet time to focus and concentrate. However, the nature of work has changed. The real reason we communicate more is that, today, we need to collaborate more to be effective.
In Mindset, psychologist Carol Dweck argues, based on decades of research, that how we see ourselves is a major factor in what we can achieve. If we see our abilities as fixed, we tend not to go very far. However, if we see our capabilities as dynamic and changeable, we will work to improve them and are more likely to attain excellence.
The same can be said about a field like marketing. If we see the rules as fixed, we’ll tend to be limited by conventional barriers of achievement. But if we see that paradigms are, to a large degree, self-imposed, then the possibilities are endless. We are only bound by what we believe.
That’s why it’s important that we learn how to shift our mental models. While the tried and true gives comfort and has a track record to fall back on, the new and different often feels like a reckless shot in the dark. Still, we are living in an era of extreme change and we have no alternative but to keep pace. In an age of disruption, the only viable strategy is to adapt.
When scientists decoded the human genome in 2001, they found something astounding. While our DNA provides the blueprint for everything about us—from how we develop in the womb to eye color and personality traits—it takes only 20,000 genes to do so, less than one fifth of what had previously been thought.
What was even more mind-blowing was the reason they had been so far off the mark. While our genome would seem to be a model of efficiency, squeezing all that information into a microscopic nucleus, 98% of our DNA is “junk” that doesn’t code for anything. How could our biology be so wasteful?
In The Selfish Gene, the eminent biologist Richard Dawkins explains that the confusion arises because we assume that DNA exists for our sakes rather than the other way around. We, he argues, are mere vehicles to propagate genes. Much the same can be said about ideas in an enterprise. All too often, we fail to recognize what our business’s DNA is telling us.
In 2014, a Muslim student at the University of Michigan was harassed for a satirical column he wrote about the oversensitivity of students at his school. As Jonathan Chait described in a post in New York Magazine, the student was viewed as a perpetrator rather than a victim because he mocked politically correct norms.
Yet just in case you think that political correctness is strictly in the realm of the liberal left, consider the case of Larycia Hawkins, a professor at Wheaton College who was forced out after expressing sympathy for Muslims after the Charlie Hebdo attack, or Steven Salaita, a professor censured for criticizing Israel.
Political correctness, all too often, is in the eye of the beholder. One person’s empathy is another’s insensitivity, or so it would seem. But whatever your opinion of the merits and demerits of political correctness, it is, at least in part, a technological phenomenon. So perhaps instead of the usual vicious cycle of recriminations, we should look for deeper roots.
On December 9, 1968, a research project funded by the US Department of Defense launched a revolution. Its focus was not a Cold War adversary or even a resource-rich banana republic, but the effort to “augment human intellect,” and the man driving it was not a general, but a mild-mannered engineer named Douglas Engelbart.
His presentation that day would be so consequential that it is now called The Mother of All Demos. Two of those in attendance, Bob Taylor and Alan Kay, would go on to develop Engelbart’s ideas into the Alto, the first truly personal computer. Later, Steve Jobs would take many elements of the Alto to create the Macintosh.
So who deserves credit? Engelbart for coming up with the idea? Taylor and Kay for engineering solutions around it? Jobs for creating a marketable product that made an impact on the world? Strong arguments can be made for each, as well as for many others not mentioned here. The truth is that there are many paths to innovation. Here are nine of them.
In 1965, Intel cofounder Gordon Moore published a remarkably prescient paper which observed that the number of transistors on an integrated circuit was doubling every two years and predicted that this pace would lead to computers becoming embedded in homes, cars and communication systems.
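Moore’s observation amounts to a simple exponential: if transistor counts double every two years, a count N₀ in year t₀ grows to N₀ · 2^((t − t₀)/2) by year t. Here’s a minimal sketch of that arithmetic; the starting figure (the Intel 4004’s roughly 2,300 transistors in 1971) and the fixed two-year doubling period are illustrative assumptions, not claims from the article.

```python
# Illustrative sketch of Moore's Law as a doubling formula.
# Assumes a constant doubling period, which history only roughly bore out.

def projected_transistors(start_count, start_year, year, doubling_period=2):
    """Project a transistor count forward, doubling every `doubling_period` years."""
    doublings = (year - start_year) / doubling_period
    return start_count * 2 ** doublings

# The Intel 4004 (1971) had ~2,300 transistors; 30 years is 15 doublings.
projection = projected_transistors(2300, 1971, 2001)
print(f"{projection:,.0f}")  # 2,300 * 2^15 = 75,366,400 — tens of millions
```

The point of the sketch is how quickly a fixed doubling period compounds: fifteen doublings turn thousands of transistors into tens of millions, which is why Moore’s modest 1965 extrapolation implied computers in homes and cars.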
Yet the law has been fraying for years, and experts predict that it will soon reach its limits. However, Bernie Meyerson, IBM’s Chief Innovation Officer, told me he feels strongly that the end of Moore’s Law doesn’t mean the end of progress. Not by a long shot. What we’ll see, though, is a shift in emphasis from the microchip to the system as a whole.
In David and Goliath, bestselling author Malcolm Gladwell explains how small upstarts often have surprising advantages over larger, more powerful opponents. “Giants are not what we think they are,” he writes, “and that often makes us fail to appreciate less conventional strategies that may be superior.”
That’s certainly true in business. Large enterprises must serve the present. Things are expected of them. They have to keep customers, employees and other stakeholders happy. These obligations often weigh them down and make them vulnerable to disruptive innovation.
Yet that shouldn’t blind us to the fact that startups make for such enticing stories precisely because they are so unlikely. Most fail. And when they succeed, they become Goliaths themselves and face the same challenges as incumbents do. So instead of glorifying startups, perhaps we should take a closer look at what it takes to stay on top once you get there.
After a terrorist attack, we demand increased vigilance because we perceive an increased threat. Yet The Washington Post reports that we’re not only more likely to die of more mundane causes, like a bee sting or even getting hit by lightning, but even those remote odds are on a downward trend.
Cognitive scientists call this availability bias. Terrorism makes for a compelling news story. There is agency, a backstory and political ramifications that get reported on heavily. That makes the danger seem more clear and present, so we feel more compelled to act on it. Squeaky wheels, in effect, get the grease.
Availability bias is more than a simple academic curiosity. It encourages us to react swiftly to tragic events, but ignore slow-moving trends that will have a far greater impact. Today, aging, decreased poverty and automation are, at first glance, positive trends—and they are—but they are also starting to create problems that we haven’t even begun to think seriously about.
I recently got a call from my mother asking me to help her watch House of Cards on Netflix. She was frustrated and complained, “I keep pressing the thing and nothing happens!” It was hard to get her to understand I had no idea what thing she was pressing or what was supposed to happen when she did.
I’m still not exactly sure what the problem was, but getting her to understand that the buttons on her remote had little to do with the TV in her bedroom and everything to do with giving instructions to servers in faraway places seemed to help. Before long, her frustration with technology turned to fascination with the political machinations of Frank Underwood.
Many businesses have the same problem as my mother. As technology advances, its function evolves, and those that are unable to shift their mental models find themselves unable to compete. This is especially true of digital technology, where every generation sees a new crop of players emerge while old titans falter. Only a rare few are able to cross the chasm.