In 2001, Microsoft CEO Steve Ballmer declared war on the open source community. He called Linux a cancer and argued, essentially, that anybody who used open source resources in their products was putting their business at risk. He even went so far as to urge the government not to support open source projects.
From Microsoft’s point of view, it was an understandable position. After all, its dominance of the industry depended on protecting its intellectual property. Yet a decade later, it open sourced Kinect, one of its most popular products ever. More recently, current CEO Satya Nadella declared that Microsoft loves Linux.
It was a startling shift, and one the entire industry has since embraced as well. The reason why is simple: In a connected world, open beats closed and nobody can truly go it alone anymore. That’s why every business today must transform itself into a platform that connects to ecosystems of talent, technology and information outside the boundaries of the firm.
During World War II, natives on Pacific islands saw something most unusual. Strange men appeared, cleared long strips of land and built structures decorated with flags. Some of these men wore large cups over their ears, while others waved sticks and, almost magically, machines appeared from the sky carrying valuable cargo.
After the war ended, the men left and the supplies stopped coming. Some of the natives formed cargo cults, which copied many of the rituals the soldiers had performed. They marched in formation, wore cups over their ears and waved sticks around. Alas, no airplanes ever came.
Clearly, the idea was patently absurd. Anybody who thinks that waving sticks will cause airplanes to appear is missing some basic principles about how air travel works. Yet many modern executives also believe that by mimicking the tactics of others they will somehow achieve the same results. These “cargo cult strategists” don’t do much better than the islanders.
Technology used to be pretty simple. If you had some technical know-how, a few transistors and a soldering iron, you could go far. William Hewlett and David Packard built a billion-dollar business in a garage and, later, Steve Jobs and Steve Wozniak did the same. It was a time of amazing opportunity.
Later, hardware got much more sophisticated and there was not much you could do without a multimillion dollar facility, so the hackers moved to software. Then came the Internet and coders moved from garages to their bedrooms. With a laptop and a little bit of coding expertise, you could do impressive things.
Today, both hardware and software have become incredibly advanced. Experts in artificial intelligence are so rare that they’re being paid like sports stars. Yet this time around, tech giants like Microsoft, Amazon, IBM and Google are themselves making resources available, allowing anyone who wants them to access some of the world’s most advanced technology.
In the Nicomachean Ethics, Aristotle observes that “all knowledge and every pursuit aims at some good,” but then asks, “what then do we mean by the good?” That, in essence, encapsulates the ethical dilemma. We all agree that we should be good and just, but it’s much harder to decide just what that entails.
Since Aristotle’s time, the issues he raised have been continually discussed and debated. From the works of great philosophers like Kant, Bentham and Rawls, to modern day cocktail parties and late night dorm room bull sessions, ethical questions are endlessly mulled over and argued about, but never come to a fully satisfying conclusion.
Today, as we enter a “cognitive era” of thinking machines, the problem of what should guide our actions is gaining new importance. If we find it so difficult to define the principles by which a person should act justly and wisely, then how are we to encode them within the artificial intelligences we are creating? This is no longer a purely theoretical question.
Since Donald Trump’s election, the media has been pilloried for its coverage. While salacious scandals received extensive attention, issues of governance, such as foreign policy, the federal budget and the environment, received little. Actual policies were rarely compared side by side.
This criticism is largely deserved. Cable news shows favor ratings over reporting. Online news outlets chase clicks over substance. Leaks and innuendo are routinely passed on without confirmation. As James Poniewozik put it in The New York Times, “only one candidate was treated like she might be elected, set policy and make appointments.”
Still, while the media has a responsibility to report news fairly and accurately, we citizens have a responsibility to interpret it, to separate fact from opinion and to evaluate sources. This goes far beyond simple partisanship. Even reputable and balanced reports can get it wrong, so we must think critically about what we see and hear. Our democracy depends on it.
I spent most of my adult life working in some of the world’s most challenging business environments. For 15 years, I managed and consulted for media businesses in places like Warsaw, Kyiv and Moscow. It was a difficult, but incredibly rewarding experience, both personally and professionally.
In time, I became adept at parachuting into a new market, learning the culture, learning the language and figuring out how to build a business. I was able to do so because I developed systems and processes for just about everything, from marketing and sales to operations and even crisis management.
Yet there was one thing I was never able to find a system for: innovation. It wasn’t for lack of effort. I studied many innovative people and organizations, but I found that everyone I looked at did things very differently. Follow one and you defy another. Still, in my research I found one thing in common: Great innovators don’t just solve problems, they actively seek them out.
Apple is no longer the darling of the tech world it once was. It used to be that if you wrote something that even mildly suggested problems at the company, you were subjected to howls of execration by a seemingly endless legion of Apple fanboys. Yet clearly, those days are now over.
Consider this. In just the last few weeks, veteran tech journalist Walt Mossberg called Siri stupid, Silicon Valley guru Steve Blank questioned the company’s vision in Harvard Business Review and Business Insider reported that people are now saying that Microsoft is more innovative than Apple. Ouch!
How did what would have been considered heresy a few years ago become conventional wisdom today? The easiest answer is that Apple was unduly deified before and is now simply coming back to earth, but there’s something more at work as well. Technology cycles come and go and the present one simply doesn’t play to Apple’s strengths. It was bound to happen.
It’s a political season, so we’re hearing a lot of the usual arguments about the economy. Should we raise taxes or lower them? Negotiate trade agreements or abandon them? These are important questions, but they are not the central economic issue that we face today. Productivity is.
As economist Robert Gordon explains in The Rise and Fall of American Growth, productivity growth soared between 1920 and 1970, but has sputtered since then. What’s more, he predicts that the productivity picture will get even worse in the decades to come, making it even harder to raise living standards.
To be clear, this is not a recent problem, nor can it be laid neatly at the feet of one politician or another. It is also not a distinctly American challenge, but a global trend. So rehashing old arguments will get us nowhere. The truth is that the productivity problem is unlike anything we’ve faced in the last century, and we’ll have to come up with new solutions for it.
Most technologies just seem to come and go. Try explaining to a teenager today how much fun it was to go to a record store and buy a new album on vinyl, cassette or — heaven forbid! — 8-track, and it immediately becomes clear how defunct and meaningless those technologies have become.
In The Innovator’s Dilemma, Harvard professor Clayton Christensen argued that disruption occurs when a technology’s performance surpasses customer needs. When that happens, the basis of competition changes and a new technology arrives that outperforms the incumbent on some other parameter.
Yet some technologies, like Leonardo DiCaprio’s character in The Revenant, simply refuse to die. When faced with a disruptive competitor, they find new purpose and continue to thrive for decades. A big part of the difference seems to be not just the characteristics of the technology, but how it is able to integrate itself with other innovations.
In 1960, Harvard professor Theodore Levitt published “Marketing Myopia,” a landmark paper in Harvard Business Review that urged executives to adapt by asking themselves, “What business are we really in?” He offered both the railroad companies and the Hollywood studios as examples of industries that failed to adapt because they defined their business incorrectly.
Yet today, the railroads don’t seem to be doing too badly. Union Pacific, the leading railroad company, has a market capitalization of over $80 billion, about 60% more than Ford or GM. Disney, the leading movie studio, has a market capitalization of about $150 billion. That doesn’t seem too shabby either.
While nimble startups chasing the next trend are exciting, companies rarely succeed by adapting to market events. Rather, successful firms prevail by shaping the future. That can’t be done through agility alone; it takes years of preparation to achieve. The truth is that once you find yourself in a position where you need to adapt, it’s usually too late.