The presidential run of Bernie Sanders has often been referred to as a movement rather than a campaign and it certainly has all the trappings—a distinctive ideology, devoted followers and large crowds. Many believe that the Sanders movement will far outlive the current cycle and shape the political future.
To state the obvious, Sanders did not succeed as a candidate; by any objective measure, Hillary Clinton trounced him. Still, I can see how the idea of his movement living on would salve some open wounds among his followers. To them, Bernie Sanders was always more than a candidate; he was the living embodiment of a shared purpose.
Yet I would argue that a much more likely scenario is that we’ll soon be forgetting Bernie Sanders and not because he failed as a politician, but because of how he failed as a leader of his movement, all too often choosing to attack rather than engage. Hopefully, in the years to come, his failure will become a cautionary tale to those who seek to effect change in society.
Steve Jobs built—and then revived—Apple by fusing technology with design. IBM has remained a top player in its industry for roughly a century by investing in research that is often a decade ahead of its time. Facebook “moves fast and maintains a stable infrastructure” (but apparently doesn’t break things anymore).
Each of these companies, in its own way, is a superior innovator. But what makes Google (now officially known as Alphabet) different is that it doesn’t rely on any one strategy, but deploys a number of them to create an intricate—but powerful—innovation ecosystem that seems to roll out innovations by the dozens.
The company is, of course, a massive enterprise, with $75 billion in revenues, over 60,000 employees and a dizzying array of products, from the core search business and the Android operating system to nascent businesses like autonomous cars. So to better understand how Google innovates, I took a close look at what it’s doing in one area: deep learning.
When Ray Kurzweil published The Singularity Is Near in 2005, many scoffed at his outlandish predictions. Two years before Apple launched the iPhone, Kurzweil imagined a world in which humans and computers essentially fuse, unlocking capabilities we normally see only in science fiction movies.
His argument though, was amazingly simple. He pointed out that as technology accelerates at an exponential rate, progress would eventually become virtually instantaneous—a singularity. Further, he predicted that as computers advanced, they would merge with other technologies, namely genomics, nanotechnology and robotics.
Today, Kurzweil’s ideas don’t seem quite so outlandish. Google’s DeepMind recently beat legendary Go world champion Lee Sedol. IBM’s Watson is expanding horizons in medicine, financial planning and even cooking. Self-driving cars are expected to be on the road by 2020. Just as Kurzweil predicted, technology seems to be accelerating faster than ever before.
At the Code Conference last week, Elon Musk gave a wide-ranging interview on everything from who he thinks will compete with Tesla in self-driving cars to neural laces that will augment human intelligence and his plans for space travel. But the thing that caught my eye was his assertion that we might all be living in a computer simulation.
It’s a fantastical idea, to be sure. So much so that it makes you wonder whether to actually take him seriously. Could it be that he actually believes that we’re nothing more than a set of bits in someone else’s computer game? If so, then can he really be trusted to run billion dollar enterprises?
By all appearances, Musk is dead serious about the possibility that we’re living in a computer simulation. And while it is, of course, an utterly impractical and illogical idea, computer technology itself was born out of an illogical idea. It is, in fact, people like Elon Musk, who are able to take a rational approach to utterly improbable ideas, that end up creating the future.
In a blog post that recently went viral, Marco Arment argued that Apple may be the next BlackBerry. He pointed out that, while other tech companies like Amazon, Facebook and Google are advancing in key artificial intelligence technologies, Apple is falling behind.
Clearly, he’s right about the second point. Apple’s Siri service is noticeably weaker than its competitors’ offerings, and these types of AI services do seem to be key to the digital future. What I think he misses is that Apple has never competed in fundamental technologies. Rather, it has been especially adept at molding existing technologies into products that are “insanely great.”
The truth is that Apple is not really a technology company and never has been. Yet it remains perhaps the world’s best design and engineering company. It has also built a first-rate supply chain that gives it an additional advantage over competitors. In reality, Apple isn’t like BlackBerry at all. If anything, in the years to come Apple will be the next Toyota.
Like most people, I try to do things well and am frustrated when I don’t. That’s why I always find it hard to start something new. I tend to remember my past work in its best light, so the new stuff I’m working on feels inadequate by comparison. I also worry that a future failure will tarnish any past success that I may have had.
That’s why the toughest part of any job is to start. Every project begins with enormous potential, but once you start it becomes a messy reality. Choices need to be made and with those choices come the inevitable errors and mistakes. We’re not always at our best, but we tend to judge ourselves against the times when we were.
The result is that I often look at something I’m working on and say, “What a load of crap.” Yet over the years, I’ve learned not to let it bother me. In fact, I take pride in it. I dare to be crap, knowing that any flaws can be fixed later on, while a blank page will get me nowhere. The truth is that if you are ever to do anything that’s any good, you usually have to start with crap.
In an article I wrote two years ago about the future of money, I explained that despite the libertarian fantasies that were being bandied about at the time, Bitcoin was unlikely to ever be a mainstream currency. Now it seems that Bitcoin may be collapsing and key figures in the community are starting to back away from it.
The reasons for Bitcoin’s troubles are many, including poor governance, a lack of technological infrastructure and infighting within its community. Besides, as I noted in my previous article, the fact that sovereign governments have the power to tax in their own currencies always made a Bitcoin takeover unlikely.
Still, as Don and Alex Tapscott explain in their new book, Blockchain Revolution, the technology behind Bitcoin may still be revolutionary. In fact, despite Bitcoin’s problems, there are already thousands of entrepreneurs and investors betting on blockchain technology to reimagine transactions of all types. It’s still early days, but the potential is undeniable.
For the past fifty years or so, technology has followed a fairly predictable path. We squeeze more transistors onto silicon wafers, which makes chips more powerful and devices smaller. Manual processes become automated, productivity increases and life gets better. Rinse and repeat.
Today, we’re at an inflection point, and that predictable path to progress will soon be closed off. What lies ahead is a period of extreme disruption in which much of what we’ve come to expect from technology will come undone. What replaces it will be truly new and different.
Two years ago, I called the cloud the most disruptive technology ever, because it made the world’s most advanced technologies available to just about anyone with an internet connection. Previously, only large enterprises that could afford to maintain expensive IT staffs had access to cutting edge capabilities.
Still, while the cloud has proved to be highly disruptive, it offered few capabilities that didn’t exist before. Sure, it made those capabilities cheaper, more efficient and more accessible, but the truth is that, outside of big data platforms like Hadoop and Spark, it didn’t let us do much that we couldn’t do before.
IBM’s recent announcement that it will make quantum computing available on its IBM Cloud platform will help change that. For the first time, just about anyone will be able to benefit from a technology that virtually no one had access to before. That, in itself, is big news. But it also opens the door to something much bigger—a truly new era for cloud computing.
Innovation has become like a religion in business today, with “innovate or die” as its mantra. When a company succeeds, people attribute its good fortune to superior innovation. When it fails, people say it lacked the ability to innovate, no matter how many new products it launched. The message is simple: you need to disrupt to survive.
So it shouldn’t be surprising that there is no shortage of people offering silver bullets. They promise a “secret sauce” that will unlock the creativity in your organization. They preach disruption, open innovation, lean launchpads or whatever else is the flavor of the day with the passion and surety of evangelical ministers.
The truth is that there is no one true path to innovation. Compare any two great innovators and they inevitably do things very differently. So if you choose to emulate one, you are in a sense rejecting the other, which may be equally or even more successful. The only real path forward is to define the problems you seek to solve and build your own innovation playbook.