Today, every brand needs to become a publisher. It used to be that marketers could come up with some compelling images, add a clever tagline and then push their message out through mass media. That might have been simplistic, but if you could reach enough people efficiently, it worked well enough.
But today, when you create an effective promotion, your competitors can track its effects and then retarget the consumers you worked so hard to persuade. Essentially, by building awareness and walking away, you are doing lead generation for your competition. It’s no longer enough to grab attention; you now need to hold it.
That’s why marketers need to learn how to tell stories. A great story can provide emotional transport for a brand and create the basis for a larger narrative. Make no mistake, brand publishing is vastly more than creating longer and more expensive versions of ads. Marketers need to shift their mindset from being promoters to becoming master narrators.
The declaration of surrender was touted as a triumph. “Microsoft Loves Linux,” the headline read, but just a decade earlier, the firm’s then CEO, Steve Ballmer, had called Linux a cancer. The all-powerful tech giant had lost and lost badly — to a ragtag band of revolutionaries, no less — but still seemed strangely upbeat.
Overthrows like these are becoming increasingly common and not just in business. As Moisés Naím observed in his book, The End of Power, institutions of all types, from corporations and governments to traditional churches, charities and militaries, are being disrupted. “Power has become easier to get, but harder to use or keep,” he writes.
The truth is that it’s no longer enough to capture the trappings of power, because movements made up of small groups can synchronize their actions through networks. So if you want to effect lasting change today, it’s no longer enough to merely command resources; you have to inspire opponents to join your cause. History shows these movements follow a clear pattern.
In 2001, Microsoft CEO Steve Ballmer declared war on the open source community. He called Linux a cancer and argued, essentially, that anybody who used open source resources in their products was putting their business at risk. He even went as far as to urge the government not to support open source projects.
From Microsoft’s point of view, it was an understandable position. After all, its dominance of the industry was dependent on protecting its intellectual property. Yet a decade later, it open sourced Kinect, one of its most popular products ever. More recently, current CEO Satya Nadella declared that Microsoft loves Linux.
It was a startling shift, and one the entire industry has since embraced as well. The reason why is simple: In a connected world, open beats closed and nobody can truly go it alone anymore. That’s why every business today must transform itself into a platform that connects to ecosystems of talent, technology and information outside the boundaries of the firm.
During World War II, natives on Pacific islands saw something most unusual. Strange men appeared, cleared long strips of land and built structures decorated with flags. Some of these men wore large cups over their ears, while others waved sticks and, almost magically, machines appeared from the sky carrying valuable cargo.
After the war ended, the men left and the supplies stopped coming. Some of the natives formed cargo cults which copied many of the rituals the soldiers performed. They marched in formation, wore cups over their ears and waved sticks around. Alas, no airplanes ever came.
Clearly, the idea was patently absurd. Anybody who thinks that waving sticks will cause airplanes to appear is missing some basic principles about how air travel works. Yet many modern executives also believe that by mimicking the tactics of others they will somehow achieve the same results. These “cargo cult strategists” don’t do much better than the islanders.
Technology used to be pretty simple. If you had some technical know-how, a few transistors and a soldering iron, you could go far. William Hewlett and David Packard built a billion dollar business in a garage and later, Steve Jobs and Steve Wozniak did the same. It was a time of amazing opportunity.
Later, hardware got much more sophisticated and there was not much you could do without a multimillion dollar facility, so the hackers moved to software. Then came the Internet and coders moved from garages to their bedrooms. With a laptop and a little bit of coding expertise, you could do impressive things.
Today, both hardware and software have become incredibly advanced. Experts in artificial intelligence are so rare that they’re being paid like sports stars. Yet this time around, tech giants like Microsoft, Amazon, IBM and Google are themselves making resources available, allowing anyone who wants to access some of the world’s most advanced technology.
In the Nicomachean Ethics, Aristotle states that “all knowledge and every pursuit aims at some good,” but then asks, “what then do we mean by the good?” That, in essence, encapsulates the ethical dilemma. We all agree that we should be good and just, but it’s much harder to decide just what that entails.
Since Aristotle’s time, the issues he raised have been continually discussed and debated. From the works of great philosophers like Kant, Bentham and Rawls, to modern day cocktail parties and late night dorm room bull sessions, ethical questions are endlessly mulled over and argued about, but never come to a fully satisfying conclusion.
Today, as we enter a “cognitive era” of thinking machines, the problem of what should guide our actions is gaining new importance. If we find it so difficult to define the principles by which a person should act justly and wisely, then how are we to encode them within the artificial intelligences we are creating? This is no longer a purely theoretical question.
Since Donald Trump’s election, the media has been pilloried for its coverage. While there was extensive coverage of salacious scandals, there was little coverage of issues of governance, such as foreign policy, the federal budget and the environment. Actual policies were rarely compared side by side.
This is largely deserved. Cable news shows favor ratings over reporting. Online news outlets chase clicks over substance. Leaks and innuendo are routinely passed on without confirmation. As James Poniewozik put it in the New York Times, “only one candidate was treated like she might be elected, set policy and make appointments.”
Still, while the media has a responsibility to report news fairly and accurately, we citizens have a responsibility to interpret it, separate fact from opinion and evaluate sources. This goes far beyond simple partisanship. Even reputable and balanced reports can get things wrong, so we must think critically about what we see and hear. Our democracy depends on it.
I spent most of my adult life working in some of the world’s most challenging business environments. For 15 years, I managed and consulted for media businesses in places like Warsaw, Kyiv and Moscow. It was a difficult, but incredibly rewarding experience, both personally and professionally.
In time, I became adept at parachuting into a new market, learning the culture, learning the language and figuring out how to build a business. I was able to do so because I developed systems and processes for just about everything, from marketing and sales to operations and even crisis management.
Yet there was one thing I was never able to find a system for: innovation. It wasn’t for lack of effort. I studied many innovative people and organizations, but I found everyone I looked at did things very differently. Follow one and you defy another. Still, in my research I found one thing in common: Great innovators don’t just solve problems, they actively seek them out.
Apple is no longer the darling of the tech world it once was. It used to be that if you wrote something that even mildly suggested problems at the company, you were subjected to howls of execration by a seemingly endless legion of Apple fanboys. Yet clearly, those days are now over.
Consider this. In just the last few weeks, veteran tech journalist Walt Mossberg called Siri stupid, Silicon Valley guru Steve Blank questioned the company’s vision in Harvard Business Review and Business Insider reported that people are now saying that Microsoft is more innovative than Apple. Ouch!
How did what would have been considered heresy a few years ago become conventional wisdom today? The easiest answer is that Apple was unduly deified before and is now simply coming back to earth, but there’s something more at work as well. Technology cycles come and go and the present one simply doesn’t play to Apple’s strengths. It was bound to happen.
It’s a political season, so we’re hearing a lot of the usual arguments about the economy. Should we raise taxes or lower them? Negotiate trade agreements or abandon them? These are important questions, but they are not the central economic issue that we face today. Productivity is.
As economist Robert Gordon explains in The Rise and Fall of American Growth, productivity growth soared between 1920 and 1970, but has sputtered since then. What’s more, he predicts that the productivity picture will get even worse in the decades to come, making it even harder to raise living standards.
To be clear, this is not a recent problem, nor can it be laid neatly at the feet of one politician or another. It is also not a distinctly American challenge, but a global trend. So rehashing old arguments will get us nowhere. The truth is that the productivity problem is unlike anything we’ve faced in the last century and we’ll have to come up with new solutions for it.