Everybody in business is looking for the “secret sauce.” Some receive wisdom passed down from a mentor, while others simply learn from experience.
Yet however we come by our ideas, we rarely revisit them. Accepted wisdoms have a way of becoming second nature. Before you know it they are “how we do things around here” and aren’t subjected to further scrutiny. That’s where things often go awry.
More than we would like to admit, we manage by myth. We tend to take conventional wisdom at face value and then blame ourselves when things don’t go well. Yet the truth is that many widely accepted business practices aren’t based on evidence, but conjecture. Here are some widely held ideas that deserve closer scrutiny.
Every great idea begins as a revelation. Yet when that flash of insight leads to action, it inevitably encounters the real world and that’s when hard lessons are learned. Adjustments have to be made and, with some luck, success can be achieved. But profitable models rarely come easy.
With growth come procedures, processes and a management team to support and strengthen the model. New employees are indoctrinated into it, and the model becomes an intrinsic part of the organization’s identity, almost like a corporate version of DNA.
Unfortunately, at some point the model will fail. That’s always been true, but now it happens with blazing speed. These days, startups like Instagram and Pinterest become billion dollar businesses in a matter of months—not years—and that pace will only accelerate. Clearly, we need to stop planning for stability and start managing for disruption.
Leo Burnett’s fledgling firm got off to an inauspicious start when it opened in 1935. With one client account, a staff of eight and a bowl of apples in reception, cynics said that he would soon be selling those apples on the street.
Yet, even in the midst of the Great Depression, the firm survived and Burnett, along with other pioneers such as David Ogilvy and Bill Bernbach helped create the consumer culture that defined the post-war economy. Those halcyon days are now long gone.
As longtime industry veteran John Winsor recently noted, advertising agencies are no longer the valued partners they once were. In fact, he argues, brands don’t really even need agencies anymore. He might actually be understating the case. It’s not just their work that’s losing relevance; the ad agency business model itself may now be defunct.
Richard Feynman was a legend in scientific circles. One of the preeminent physicists of the 20th century—even other top minds considered him a magician—he is almost as well known for his jokes and pranks as he is for his groundbreaking discoveries.
When Feynman was a young scientist, Eugene Wigner compared him to Paul Dirac, a giant of the era who was well known for his autistic qualities, saying that “he’s a second Dirac, only human this time.” The quote is telling, not least because Wigner was Dirac’s brother-in-law.
While Dirac was clearly a genius, Feynman was truly transcendent. Although Feynman won the Nobel prize in physics, he was also a pioneer in nanotechnology and computing, did important work in virology and became an accomplished painter. As robots replace human jobs, we must, much like Feynman, learn to do those jobs anew, only human this time.
In 2010 Pepsi pulled its Super Bowl ads and invested $20 million into its Refresh project, which employed crowdsourcing to support good causes. It was an astounding social media success, with more than 87 million votes cast.
Unfortunately, as this HBR case study points out, it was an abysmal business failure and Pepsi eventually fell to third place in the soda category, behind Diet Coke. For all of the hype and hoopla on social media, sales suffered dearly.
Research by the Content Marketing Institute estimates that 90% of consumer marketers are investing in content. Unfortunately, most of those efforts will fail. In order to succeed, marketers will have to learn to think like publishers. That will mean more than a change in tactics or even strategy; it demands a starkly different perspective.
When Alfred Sloan created the modern corporation at General Motors, he based it on the military. The company was split into divisions, each with its own leadership. Information flowed up, orders went down and your rank determined your responsibility.
The model was designed to implement strategy from the top and move men and materiel efficiently. It assumed that leaders had a better understanding than those on the lower rungs. Managers made plans and foot soldiers carried them out.
Strategy in the 21st century has become less directed and more emergent. Even the military relies less on plans and more on commander’s intent. Corporate chieftains are following suit, experimenting with management structures such as holacracy. Yet we need to do more than simply change policies and practices; leadership itself must be redefined.
In his FiveThirtyEight manifesto, Nate Silver argues that the plural of anecdote is data. It is through compiling and analyzing observations that we transform ordinary experiences into scientific conclusions.
Yet the concept of data is still relatively new. According to Google’s Ngram, use of the word only began to heat up in the 1960s. So, although data is a term that gets thrown around a lot, it’s not something we’re really accustomed to using in a meaningful way.
That’s probably why many feel that data is in conflict with personal experience. Yet in reality, data is most often used to codify experience. Historical observations are aggregated and then analyzed to identify salient trends. And that’s the real problem with how we use data. History can be a trap. It’s not the past that we need to worry about, but the future.
I’m not a talented writer. In fact, in many ways I’m pretty lousy. I’m a miserable typist—capable of little better than hunt and peck—I have only a vague idea of where to put punctuation, and no matter how much I proofread, I always end up with typos.
Yet I am a reasonably successful writer. My personal blog has gained a large following and I regularly contribute to top publications like Forbes and Harvard Business Review. Despite my lack of talent, hundreds of thousands read me every month.
That’s not to say I don’t admire talented writers; I do. I’ve known many and wish that I had what they have. The point is that my lack of innate prowess hasn’t held me back. The truth is that we need to fundamentally change the way we think about talent, putting less emphasis on predetermined sets of skills and more focus on the ability to acquire new ones.
Peter Drucker once famously said that a business has only two functions: marketing and innovation. What he meant was that successful businesses create great products and sell them effectively. Everything else is secondary.
Yet today, both marketing and innovation are driven by digital technology. It’s tough to think of a product without a digital component, and marketing has become so digitally focused that CMOs will soon be spending more on technology than CTOs.
In effect, every business today is a digital business and technology, no longer confined to the IT department, is everybody’s job. Sadly, few enterprises have adapted effectively and most corporate digital initiatives fail. The problem isn’t a lack of investment or even a lack of commitment, but an unwillingness to adapt core business practices to the digital world.
Has digital technology really made us better off? While there are lots of impressive gadgets, the impact on our actual well-being has been surprisingly mild. In fact, by many measures, we’ve become worse off since personal computing took hold.
Productivity growth since 1980 has been significantly lower than it was in the post-war period of 1947-1980. GDP growth has slowed and income inequality has risen over the same span. In many ways, it seems like we had it better in the old economy of unionized manufacturing.
Yet there is some hope. If you take a closer look, you’ll find that almost all of the gains have come from sectors that use IT extensively. So the real problem is not that digital technology doesn’t increase productivity, but that its impact hasn’t spread far enough. As the world of bits begins to invade the world of atoms, that will change in a big way.