When I was a student, a man came to speak about Winston Churchill. Mostly, it was the usual mix of historical events and anecdotes, which in Churchill’s case was a potent mixture of the poignant, the irreverent and the hilarious. But what I remember most was how the talk ended.
The speaker concluded by saying that if we were to remember one thing about Churchill, it should be that what made him so effective was his power to communicate. I didn’t understand that at the time. Growing up, I had always heard about the importance of hard work, honesty and other things, but never communication.
Yet now, thirty years later, I’ve begun to understand what he meant. As Walter Isaacson argues in his book The Innovators, even in technology—maybe especially in technology—the ability to collaborate effectively is decisive. To innovate, it’s not enough to come up with big ideas; you also need to work hard to communicate them clearly.
Peter Thiel loves secrets. In his book, Zero to One, he makes a fervent case that unless you firmly believe that there are still things to discover, you will never achieve much. The most you will be able to accomplish is a small tweak on conventional wisdom and that’s no way to create value.
He also points out that everything we now think of as foundational was once a secret. Schoolchildren today learn Euclidean geometry, but at one time it was new and different. When Google first discovered its search engine secret, Yahoo didn’t think it was worth buying.
The problem is that it’s hard to tell a secret—something true but still unknown—from a unicorn, something that people believe which doesn’t really exist. Intuition is a double-edged sword: it can alert us to facts that are not in evidence, but it can also lead us to chase things that aren’t really there. The trick is to be able to kill the unicorns and find real secrets.
When Henry Ford started his company in 1903, he did more than just create a car or an assembly line (neither of which he actually invented). What he did was establish an entirely new form of organization, culminating in the vertically integrated River Rouge complex that was completed in 1928.
By mid-century, nearly every facet of life was transformed. We moved out to the suburbs, built gas stations and shopping malls, mass produced and mass marketed. Enormous enterprises arose that built large bureaucracies to control it all and make it run efficiently.
Yet today’s digital economy is fundamentally different. Rather than assets managed by centralized organizations, we have ecosystems managed by platforms. Capabilities are no longer determined by what you own or control, but what you can access. Therefore, we need to think less about how we build efficiencies and more about how we build connections.
There’s no doubt Tim Cook has a very tough job. When he stepped in as CEO of Apple, he was following in the footsteps of one of the most—if not the most—iconic entrepreneurs in history. Every step would be scrutinized by a legion of die-hard fans and magnified a thousandfold.
Yet Cook has performed admirably. The stock has more than doubled since he took over in August 2011 and Apple just posted the best quarter for any company ever. Despite aggressive share buybacks, the company is still sitting on a mountain of cash—more than $170 billion!
Tim Cook is no Steve Jobs and that may actually be a good thing. Unlike the bombastic Apple founder, he’s been even-keeled, launching no vendettas and creating no controversies. Although there has been a lack of blockbuster launches, the company is an operational wunderkind. Here’s how Cook is doing things differently than his famous predecessor.
In the go-go nineties, “Chainsaw” Al Dunlap’s enthusiasm for aggressive cost cutting and massive layoffs made him a corporate superhero. The fraud charges he later settled with the SEC led business people to question his character, but not necessarily his methods.
Poor Al would never survive as a CEO today. Social media would eat him for breakfast. Today’s corporate executives need to mind their P’s and Q’s, because any stray word can instantly go viral, damage the stock price and diminish shareholder value.
These days, most corporate executives pay lip service to the idea that people come first, but beyond nice-sounding platitudes, relatively little has changed. Boardroom discussions mostly focus on financial data and the need to be “practical” about people decisions. Yet smart firms value their people not out of altruism or fear of a backlash, but because it’s good business.
Even today, it’s difficult to have a serious discussion about information technology without eventually hitting on Turing machines, Turing tests or some other concept that Alan Turing invented. It’d be hard to think of anyone who matches his impact on information technology.
So it’s curious, to say the least, that his country has fared so poorly in the industry that Turing helped create. There is no British Apple, Google or IBM. In fact, only one company on the FTSE 100 is a computer firm. The story of how that happened is more than a mere historical curiosity; it offers important lessons for how to foster technology and innovation.
In 1905, a young Albert Einstein shocked the world. In one miracle year, he overturned the prevailing assumptions of his day and changed how we see the universe, transforming forever how we think of time, space, mass, energy and light. He paved the way for our modern world.
Yet 22 years later it was Einstein’s turn to be caught in the mire of his own assumptions. In a famous round of debates with Niels Bohr, he was unable to accept the consequences of the quantum world that, in fact, he had made possible in 1905, insisting that “God does not play dice with the universe.”
Einstein’s problem wasn’t grasping the importance of a new idea, but accepting an entirely new platform for physics—one which led to things like lasers, microprocessors and iPhones—and it doomed the rest of his career. Today, we all face a similar dilemma. New platforms require us to alter not merely our behavior, but our assumptions about how the world works.
One Friday afternoon in 2002, long before his company became a household verb, Larry Page walked into the office kitchen and posted some printouts of results from Google’s AdWords engine. On top, in big bold letters, he wrote, “THESE ADS SUCK.”
At most places, this would be seen as cruel—an arrogant executive publicly humiliating his hapless employees for shoddy work—but not at Google. In fact, his unusual act was a show of confidence, defining a tough problem that he knew his talented engineers would want to solve.
In their new book, How Google Works, Eric Schmidt and Jonathan Rosenberg describe what happened next. By early Monday morning, a group of engineers sent out an email that not only resolved the problem, but helped transform Google into a profit machine. The episode exemplifies four principles that enable the company to attack problems so effectively.
It’s become fashionable for politicians to say that they aren’t scientists. While these are usually statements of fact, they are still curious. Certainly, when it comes to issues of finance or war, we don’t see elected officials lining up to say, “I’m not an economist” or “I’m not a soldier.”
Science is what separates the kook from the professional. People who talk about aliens in flying saucers are usually written off as lunatics. Yet serious scientists are able to attract public interest and funding for the search for extraterrestrial intelligence (SETI), which also looks for alien life.
On the surface, the term “scientific” seems to be a fairly arbitrary distinction. After all, alien hunters and SETI scientists are both engaged in a search for truth. The difference is that the work of scientists, when properly done, is reproducible and testable, and that makes all the difference. Science matters not because of its greater truth, but its lesser solipsism.
In the mid-20th century, Linus Pauling was one of the world’s most celebrated scientists. His work on the structure of hemoglobin and other biological molecules created an entirely new field of science and, when he focused his ample talents on deciphering the structure of DNA, no one doubted that he would be the one to do it.
But he wasn’t. In fact, it was two relatively unknown researchers, James Watson and Francis Crick, who would win that particular prize. Yet what was really astounding was the way they did it. While other scientists spent their time in the lab, Watson and Crick played with models and talked about them with each other.
What seemed like child’s play to most academics was actually the best way to imagine possibilities and see how their ideas reflected diverse—and often confusing—empirical clues. Today, a growing contingent of academics believes that games can have the same effect on how children learn, and a company called Kidaptive is determined to prove them right.