In the late 17th century a middle-aged draper named Antonie van Leeuwenhoek became interested in the magnifying glasses he used to inspect fabric. He started experimenting with making his own and ended up creating one of the world’s first microscopes.
His work caught the attention of the Royal Society in London, which encouraged him to continue his research. Eventually he got around to examining a drop of water under his new device, which led to his discovery of an entirely new realm of microscopic organisms.
Technology and science have always been inextricably linked. Watson and Crick relied on Rosalind Franklin’s new X-ray diffraction images to discover the structure of DNA. Modern particle physics would not be possible without particle accelerators. Strangely, though, new data techniques have thus far had little effect on how science is done. That’s about to change.
Applying state-of-the-art tools and processes is widely seen as a mark of excellence. So, perhaps not surprisingly, “best practice” is one of those terms that you constantly hear in corporate circles. Managers often see implementing them as key to their performance.
Yet many experts point out that adopting so-called best practices can stifle your ability to innovate. After all, once you designate a particular way of doing things as “best,” who is going to question it? And if nobody questions it, it won’t be improved.
Still, even keeping those objections in mind, best practices can be immensely valuable, if approached with open eyes and good sense. The truth is that much like any business process, they’re only as good as the managers who implement them. While many do use best practices as a crutch, they can also be used as a platform from which to innovate.
The unicorn is perhaps unique among legendary creatures in that it doesn’t appear in the mythology of any culture. The ancient Greeks, for all of their centaurs, hydras and medusas, never told any stories of unicorns; they simply believed that some existed somewhere.
Of course, nobody had ever seen one, but they believed others had. Travelers would go to far away places, bring back stories of them and speak of the magical properties contained in their horns. Alas, no matter how hard anyone searched for unicorns, none were ever found.
Unfortunately, we don’t seem to have progressed much since then. Today, otherwise competent professionals are busy chasing their own unicorns. They hear stories of new practices, consumers and strategies and off they go, chasing what isn’t there. This is an enormous waste of time. We need to stop chasing unicorns and start killing them off.
We’re living in an age of networks. Facebook, Twitter and LinkedIn have hundreds of millions of users. New services like Instagram and Pinterest become billion dollar companies in months instead of years or decades. This year, marketers will spend over $4 billion on social media.
Of course, networks are not exactly new. We grew up watching TV networks and have invested time in going to networking events to meet people for generations. The idea that building connections is important is something we have always intuitively understood.
Yet today’s networks are decidedly different because digital technology allows connections to form much faster and become more pervasive. Also, over the last 15 years, a robust science of networks has been established, yielding important insights into how they function. It’s time we start putting the science to work in how we manage enterprises.
Some months ago, I downloaded the Kodable app for my 4-year-old, which boasts that it can teach toddlers how to code before they can read. My programming skills are pretty basic, but I like the idea of giving my child a head start.
Many of today’s business and political leaders stress that coding has become an essential skill for the digital age, and there’s been an avalanche of new services—from classes and training programs to free online resources like Codecademy and Code.org—to meet the demand.
Yet longtime tech columnist Kevin Maney disagrees. In a recent piece in Newsweek, he writes that by the time today’s pre-teens reach the job market, they “will find that coding skills are about as valuable as cursive handwriting.” To many tech denizens, that’s apostasy, but he has a point. Preparing for the future will take much more than writing command lines.
Charles Kuralt once remarked that New York isn’t really a big city at all, but a lot of small villages sitting right next to one another, each one fairly oblivious to the others. The same can be said for countries and industries. We tend to create our own microcosms.
That kind of thing becomes clearer when you move from one culture to another. Switch countries—or even industries—and it quickly strikes you how people can have vastly different perspectives about the world, which can lead to serious problems.
Companies have similar challenges. Managers use data to recognize emerging patterns in the marketplace and then coordinate the actions of all the disparate villages within their enterprise. Clearly, as the world becomes more volatile, there’s a need to change how our organizations manage information. Recently, I talked to a company that is doing just that.
Most managers take it for granted that the world has become much more volatile and complex and that we need to constantly adapt. The days when we could simply plan and execute a strategy and hope to effectively compete are long gone.
So it was notable, to say the least, when Roger Martin recently wrote in Harvard Business Review that he thinks that all the talk about adaptive strategy is a cop-out. In his mind, it is just a way for managers to get out of making hard, dangerous choices.
It’s tempting to dismiss his objections out of hand, but Martin, a former partner at the Monitor Group and Dean of the Rotman School of Management, has been one of the sharpest strategic thinkers for over two decades. While I don’t agree with much of his argument, he does make some important points that need to be addressed.
Tim Geithner is not America’s sweetheart. Over the past six years, the former Treasury Secretary and central banker has been accused of being just about everything, from a rapacious capitalist working in the service of bankers to a pure, unadulterated socialist.
Yet whatever you think of Geithner, his memoir, Stress Test, is a book that everybody in a position of responsibility—or who hopes to be in one someday—should read. It offers an excellent first-hand account of what it’s like to operate at the center of a crisis.
True crises are rare. By their nature, they are unusual and unexpected. Crises break out when seemingly manageable risks take on a life of their own and spin out of control. When that happens, there is no easy guide, no clear rules to follow. Still, you have to make hard choices and that has consequences. Geithner offers some rare insight into what that’s like.
In 1997, a little-known Harvard professor named Clayton Christensen published a surprise bestseller called The Innovator’s Dilemma, in which he popularized the term disruptive technology, which later evolved into disruptive innovation and became a mantra for the digital age.
Yet in a well-argued piece in The New Yorker, his colleague at Harvard, the celebrated historian Jill Lepore, cries foul. She calls disruptive innovation a “competitive strategy for an age seized by terror.” “Transfixed by change,” she writes, “it’s blind to continuity.”
It’s not just Christensen’s theories that Lepore opposes, but what she calls the “rhetoric of disruption” which leads us to seek change for change’s sake, undermining productive stability. She also points out that disruption is no panacea and leads to failure more often than it does to success. So, is it time to rethink our culture of disruption?
In the late 1990s, McKinsey declared a “war for talent” and argued that, in a knowledge economy, having the right people is even more important than having the right strategy or technology. Recruiting and retaining the “best and the brightest” became a corporate mantra.
Yet today, the firm is more concerned with the skills gap. In data science, for example, it estimates a shortfall of 140,000 to 190,000 data scientists, as well as 1.5 million managers with the skills needed to use analytical insights to drive decisions. But even that understates the problem.
With technology accelerating change in the marketplace and automation replacing highly skilled workers with robots, the decision to invest in any particular set of skills is far from obvious. Empty platitudes about “upgrading skills” and “investing in our people” will not suffice. We need to start thinking seriously about viable strategies to manage the skills gap.