Tribune Publishing, a storied icon of American journalism, recently renamed itself Tronc and released a video to show off a new “content optimization platform” that its Chief Technology Officer, Malcolm CasSelle, claims will be “the key to making our content really valuable to the broadest possible audience” through the use of machine learning.
As a marketing ploy the move clearly failed. Instead of debuting a new tech-savvy firm that would, in the words of Chief Digital Officer Anne Vasquez, be like “having a tech startup culture meet a legacy corporate culture,” it came off as buzzword-laden and naive. The Internet positively erupted with derision.
Yet what I find even more disturbing than the style is the substance. The notion that you can transform a failing media company—or any company in any industry, for that matter—by infusing it with data and algorithms is terribly misguided. While technology can certainly improve performance, the idea that it can replace a sound strategy is a dangerous delusion.
“Managing without soul has become an epidemic in society. Many managers these days seem to specialize in killing cultures, at the expense of human engagement.” That’s what management guru Henry Mintzberg recently wrote about the current state of corporate culture on his blog.
To make matters even worse, he points out that many executives are actually trained to operate that way at MBA programs. While business schools teach technocratic skills, such as finance, optimization and resource management, they do very little in the way of strengthening souls.
Sadly, corporate culture discussions usually devolve into buzzwords, like “authenticity.” And while Mintzberg says that after a half century of studying organizations he can get a sense of an organization’s soul “in an instant,” he offers little guidance on how to develop one. The truth is that you don’t find your soul inside yourself, but by finding your place in the world.
A data scientist, it’s been said, is a statistician who works in Silicon Valley, which is another way of saying that the term has attained true buzzword status. The potential to be unlocked is undeniable, but so far there has been no shortage of disappointment and frustration. Truth be told, the steak hasn’t always lived up to the sizzle.
The problem hasn’t been with big data itself, but with the practicalities of technology. Simply put, we design systems to perform particular tasks and only later realize that we want them to do more than we originally envisioned. That’s when it becomes clear that our systems are hopelessly incompatible.
In a nutshell, that’s the problem IBM is now trying to fix. By creating a universal platform, which it calls the Data Science Experience, it hopes to integrate data trapped in separate protocols and incompatible systems. This will not only enable more advanced analytics, it will help us to reimagine how we manage our organizations and compete in the marketplace.
In 1882, just three years after he had almost literally shocked the world with his revolutionary lighting system, Thomas Edison opened his Pearl Street Station, the first commercial electrical distribution plant in the United States. By 1884 it was already servicing over 500 homes.
Up till that point, electric light was mostly a curiosity. While a few of the mighty elite could afford to install generators in their homes—J.P. Morgan was one of the very first—it was out of the reach of most people. Electrical transmission changed all that, and in the ensuing years much of the country wired up.
Still, as Paul David explained in his paper, The Dynamo and the Computer, electricity didn’t have a measurable impact on the economy until the early 1920s—some 40 years later—when we finally knew enough about the new technology and had learned how to unleash its potential. The story of how that happened shows why it takes more than a single idea to change the world.
The presidential run of Bernie Sanders has often been referred to as a movement rather than a campaign and it certainly has all the trappings—a distinctive ideology, devoted followers and large crowds. Many believe that the Sanders movement will far outlive the current cycle and shape the political future.
To state the obvious, as a candidate Sanders did not succeed—by any objective measure, Hillary Clinton trounced him—but I can see how the idea of his movement living on would salve some open wounds among his followers. To them, Bernie Sanders was always more than a candidate; he was a living embodiment of a shared purpose.
Yet I would argue that a much more likely scenario is that we’ll soon be forgetting Bernie Sanders and not because he failed as a politician, but because of how he failed as a leader of his movement, all too often choosing to attack rather than engage. Hopefully, in the years to come, his failure will become a cautionary tale to those who seek to effect change in society.
Steve Jobs built—and then revived—Apple by fusing technology with design. IBM has remained a top player in its industry for roughly a century by investing in research that is often a decade ahead of its time. Facebook “moves fast and maintains a stable infrastructure” (but apparently doesn’t break things anymore).
Each of these companies, in its own way, is a superior innovator. But what makes Google (now officially known as Alphabet) different is that it doesn’t rely on any one strategy, but deploys a number of them to create an intricate—but powerful—innovation ecosystem that seems to roll out innovations by the dozens.
The company is, of course, a massive enterprise, with $75 billion in revenues, over 60,000 employees and a dizzying array of products, from the core search business and the Android operating system to nascent businesses like autonomous cars. So to better understand how Google innovates, I took a close look at what it’s doing in one area: deep learning.
When Ray Kurzweil published The Singularity Is Near in 2005, many scoffed at his outlandish predictions. Two years before Apple launched its iPhone, Kurzweil imagined a world in which humans and computers essentially fuse, unlocking capabilities we normally see in science fiction movies.
His argument, though, was amazingly simple. He pointed out that as technology accelerates at an exponential rate, progress would eventually become virtually instantaneous—a singularity. Further, he predicted that as computers advanced, they would merge with other technologies, namely genomics, nanotechnology and robotics.
Today, Kurzweil’s ideas don’t seem quite so outlandish. Google DeepMind’s AlphaGo recently beat legendary Go world champion Lee Sedol. IBM’s Watson is expanding horizons in medicine, financial planning and even cooking. Self-driving cars are expected to be on the road by 2020. Just as Kurzweil predicted, technology seems to be accelerating faster than ever before.
At the Code Conference last week, Elon Musk gave a wide-ranging interview about everything from who he thinks will compete with Tesla in self-driving cars to neural laces that will augment human intelligence and his plans for space travel. But the thing that caught my eye was his assertion that we might all be living in a computer simulation.
It’s a fantastical idea, to be sure. So much so that it makes you wonder whether to actually take him seriously. Could it be that he actually believes that we’re nothing more than a set of bits in someone else’s computer game? If so, then can he really be trusted to run billion dollar enterprises?
By all appearances, Musk is dead serious about the possibility that we’re living in a computer simulation. And while it is, of course, an utterly impractical and illogical idea, computer technology itself was born out of an illogical idea. It is, in fact, people like Elon Musk, who are able to take a rational approach to utterly improbable ideas, who end up creating the future.
In a blog post that recently went viral, Marco Arment argued that Apple may be the next BlackBerry. He pointed out that, while other tech companies like Amazon, Facebook and Google are advancing in key artificial intelligence technologies, Apple is falling behind.
Clearly, he’s right about the second point. Apple’s Siri service is noticeably weaker than its competitors’, and these types of AI services do seem to be key to the digital future. What I think he misses is that Apple has never competed in fundamental technologies. Rather, it has been especially adept at molding existing technologies into products that are “insanely great.”
The truth is that Apple is not really a technology company and never has been. Yet it remains perhaps the world’s best design and engineering company. It has also built a first-rate supply chain that gives it an additional advantage over competitors. Apple isn’t like BlackBerry at all. If anything, in the years to come Apple will be the next Toyota.
Like most people, I try to do things well and am frustrated when I don’t. That’s why I always find it hard to start something new. I tend to remember my past work in its best light, so the new stuff I’m working on feels inadequate by comparison. I also worry that a future failure will tarnish any past success that I may have had.
That’s why the toughest part of any job is to start. Every project begins with enormous potential, but once you start it becomes a messy reality. Choices need to be made, and with those choices come the inevitable mistakes. We’re not always at our best, but we tend to judge ourselves against the times when we were.
The result is that I often look at something I’m working on and say, “What a load of crap.” Yet over the years, I’ve learned not to let it bother me. In fact, I take pride in it. I dare to be crap, knowing that any flaws can be fixed later on, while a blank page will get me nowhere. The truth is that if you are ever to do anything that’s any good, you usually have to start with crap.