Like most people, I try to do things well and am frustrated when I don’t. That’s why I always find it hard to start something new. I tend to remember my past work in its best light, so the new stuff I’m working on feels inadequate by comparison. I also worry that a future failure will tarnish any past success that I may have had.
That’s why the toughest part of any job is to start. Every project begins with enormous potential, but once you start it becomes a messy reality. Choices need to be made, and with those choices come inevitable mistakes. We’re not always at our best, but we tend to judge ourselves against the times when we were.
The result is that I often look at something I’m working on and say, “what a load of crap.” Yet over the years, I’ve learned not to let it bother me. In fact, I take pride in it. I dare to be crap, knowing that any flaws can be fixed later on, while a blank page will get me nowhere. The truth is that if you are ever to do anything that’s any good, you usually have to start with crap.
In an article I wrote two years ago about the future of money, I explained that despite the libertarian fantasies that were being bandied about at the time, Bitcoin was unlikely to ever be a mainstream currency. Now it seems that Bitcoin may be collapsing and key figures in the community are starting to back away from it.
The reasons for Bitcoin’s troubles are many, including poor governance, a lack of technological infrastructure and infighting within its community. Besides, as I noted in my previous article, the fact that sovereign governments have the power to tax in their own currencies always made a Bitcoin takeover unlikely.
Still, as Don and Alex Tapscott explain in their new book, Blockchain Revolution, the technology behind Bitcoin may still be revolutionary. In fact, despite Bitcoin’s problems, there are already thousands of entrepreneurs and investors betting on blockchain technology to reimagine transactions of all types. It’s still early days, but the potential is undeniable.
For the past fifty years or so, technology has followed a fairly predictable path. We squeeze more transistors onto silicon wafers, which makes chips more powerful and devices smaller. Manual processes become automated, productivity increases and life gets better. Rinse and repeat.
Today, we’re at an inflection point, and that predictable path to progress will soon be closed off. What lies ahead is a period of extreme disruption in which much of what we’ve come to expect from technology will come undone. What replaces it will be truly new and different.
Two years ago, I called the cloud the most disruptive technology ever, because it made the world’s most advanced technologies available to just about anyone with an internet connection. Previously, only large enterprises that could afford to maintain expensive IT staffs had access to cutting edge capabilities.
Still, while the cloud has proved to be highly disruptive, it offered few capabilities that didn’t exist before. Sure, it made those capabilities cheaper, more efficient and more accessible, but the truth is that, outside of big data platforms like Hadoop and Spark, it didn’t allow us to do much that we couldn’t do before.
IBM’s recent announcement that it will make quantum computing available on its IBM Cloud platform will help change that. For the first time, just about anyone will be able to benefit from a technology that virtually no one had access to before. That, in itself, is big news. But it also opens the door to something much bigger—a truly new era for cloud computing.
Innovation has become like a religion in business today, with “innovate or die” as its mantra. When a company succeeds, people attribute its good fortune to superior innovation. When it fails, people say it lacked the ability to innovate, no matter how many new products it launched. The message is simple: you need to disrupt to survive.
So it shouldn’t be surprising that there is no shortage of people offering silver bullets. They promise a “secret sauce” that will unlock the creativity in your organization. They preach disruption, open innovation, lean launchpads or whatever else is the flavor of the day with the passion and surety of evangelical ministers.
The truth is that there is no one true path to innovation. Compare any two great innovators and they inevitably do things very differently. So if you choose to emulate one, you are in a sense rejecting the other, which may be equally or even more successful. The only real path forward is to define the problems you seek to solve and build your own innovation playbook.
When Eric Haller graduated with a degree in finance from San Diego State University in 1990, he didn’t have any clear idea of what he wanted to do. If he had been on the East Coast, he probably would have gone to work at an investment bank, but he ended up taking a job as a statistical analyst at Visa in Foster City, little more than a day’s drive away.
He would later say that it was a case of being in the right place at the right time. The credit card industry was at the beginning of a major transformation, moving from voice authorizations and paper transactions to electronic commerce. New database and analytics tools, like Oracle and SAS, made it possible to analyze transactions like never before.
Before long, Haller’s lowly position as a statistical analyst took on great importance and he quickly moved up the ranks. Still, he saw that there was something bigger going on—an emerging marketplace for data. It was around that time that he heard of a company that not only shared his vision, but was creating a business that was destined to be at the center of it.
Studies show that over 90% of startups fail. Even for those rare few that make it big, life doesn’t get much easier. In fact, only slightly more than 10% of the firms on the original Fortune 500 list are still in business today. Making an enterprise successful and keeping it that way is a staggeringly hard thing to do.
So it’s not hard to see why there has been so much effort devoted to narrowing down a company’s performance to a single factor. Some say that focusing on the customer is key. Others believe that building a great culture is the true path to success. Still others preach the gospel of developing capable leaders.
In The Halo Effect, IMD’s Phil Rosenzweig pours cold water on all of these. Firms that are successful, he points out, are perceived as being customer focused, having a great culture and building strong leadership, but when those firms hit hard times, critics claim that they are failing in those very same areas. To truly understand performance, we have to look deeper.
The truth is that innovation is never a single event, but rather the confluence of efforts undertaken by many different people and organizations. That’s why electricity only became really useful when factories adapted to it and complementary innovations, like home appliances and air conditioners, took root.
Today, we’re at a point of convergence similar to the one a century ago. Digital technology, which has been around for decades, is beginning to power new complementary technologies, such as genomics, nanotechnology and robotics. By coincidence, these forces will all converge around the year 2020 and, after that, the world will be profoundly different.
Jules Verne, the 19th century science fiction writer, made a number of predictions, like submarines, space travel and even newscasts, that turned out to be accurate. His visions of the future were so vibrant that many inventors and scientists in the 20th century took inspiration from his work.
Yet many of our most difficult challenges couldn’t have been imagined even by a genius like Verne. The obesity epidemic, climate change and the economic and health issues that come with longer life spans wouldn’t have made any sense to a 19th century audience, for whom progress meant more to eat, larger industry and lower mortality.
In much the same way, some of the thorniest problems we’ll have to face will come from the unintended consequences of advances we make today. What will make these so difficult to overcome is that they cannot be solved in a research lab or a think tank, but only in the public square. Unfortunately, we haven’t even begun to think them through.
Since World War II, the United States has been an innovation superpower. In virtually every advanced field, whether it be information technology, biotechnology, agriculture or renewable energy, America holds a leading position. Other nations may challenge in one field or another, but no one can match its depth and breadth.
To account for its success, many point to America’s entrepreneurial culture, its tolerance for failure and its unique ecosystem of venture funding. Those factors do play important roles, but the most important thing driving America’s success has been its unparalleled scientific leadership.
Most of this research is publicly funded. Take a look at any significant innovation, such as an iPhone, and you’ll find that most, if not all, of the technology came from some government program. Google itself got its start from a federal grant. Yet that poses a problem for managers. How can a private company turn public research into a competitive advantage?