The Power of Information
What’s the big deal about the information economy anyway? Surely, information has played a role in commerce since ancient times. What’s changed?
One reason for the confusion is that we’re not used to making the distinction between information and knowledge. Go to Istanbul’s ancient Grand Bazaar and you’ll instantly grasp that the traders know a lot, but not much that they can easily share even if they want to. That, it turns out, makes all the difference.
The emergence of information as a storable, fungible entity is transforming our economy and our society in ways that we scarcely realize. It’s making us richer, smarter and even healthier. What’s more, its impact is accelerating, so we’ll see a lot more change in the coming decades than we have in the past. In fact, we’re just getting started.
A Theory of Information
If you had to put a date on it, the digital age truly began in 1948. It was in that year that two seismic events came out of Bell Labs. The first and more famous was the transistor – invented a few months earlier, in late 1947, and unveiled to the world in 1948 – which forms the basis for our present (albeit soon to be defunct) digital technology paradigm.
The lesser known, but in some ways more important, event was Claude Shannon’s groundbreaking paper, “A Mathematical Theory of Communication,” which launched the field of information theory seemingly out of thin air, nearly fully formed.
Although not widely publicized or understood at the time, it has become central to all the digital technology we use today.
The basic idea was that information isn’t a function of content, but of the resolution of ambiguity, and that it can be broken down into a single unit – a choice between two alternatives. Much like a coin toss, which carries no information while in the air but takes on certainty when it lands, information arises when ambiguity disappears.
He called this unit a “binary digit,” or bit, and much like the pound, quart, meter or liter, it has become such a basic unit of measurement that it’s hard to imagine our modern world without it.
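Shannon’s idea can be made concrete with a few lines of arithmetic. His entropy formula measures, in bits, how much ambiguity an outcome resolves on average – exactly the coin-toss intuition above. This is a minimal sketch, not anything from the original article:

```python
from math import log2

def entropy(probs):
    """Shannon entropy in bits: the average ambiguity resolved per outcome."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))  # fair coin toss: exactly 1 bit
print(entropy([0.9, 0.1]))  # loaded coin: ~0.469 bits, the landing is less of a surprise
print(entropy([1.0]))       # a certain outcome: 0 bits, no ambiguity to resolve
```

A fair coin toss is worth exactly one bit; the more predictable the outcome, the less information its resolution carries.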
Storage and Transfer
In the late 1940s, Shannon’s colleague at Bell Labs, Richard Hamming, became frustrated that computer errors continually ruined his work. Building on Shannon’s paper, he created the Hamming code: a small amount of extra information that allows a computer to detect and correct errors on its own.
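Hamming’s scheme is simple enough to sketch in full. The classic Hamming(7,4) code protects four data bits with three parity bits; if any single bit gets flipped in transit, the parity checks point directly at the culprit. This is an illustrative implementation, not Hamming’s original code listing:

```python
def hamming_encode(d):
    """Encode 4 data bits as a 7-bit Hamming(7,4) codeword.
    Parity bits occupy positions 1, 2 and 4 (1-indexed)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # checks positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # checks positions 3, 6, 7
    p4 = d2 ^ d3 ^ d4  # checks positions 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]

def hamming_decode(c):
    """Fix at most one flipped bit, then return the 4 data bits."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]   # re-check positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]   # re-check positions 2, 3, 6, 7
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]   # re-check positions 4, 5, 6, 7
    syndrome = s1 + 2 * s2 + 4 * s4  # 0 means clean; otherwise the error's position
    if syndrome:
        c[syndrome - 1] ^= 1         # flip the offending bit back
    return [c[2], c[4], c[5], c[6]]

codeword = hamming_encode([1, 0, 1, 1])
codeword[4] ^= 1                     # simulate a transmission error
print(hamming_decode(codeword))      # recovers [1, 0, 1, 1]
```

The three parity sums form a binary number that is zero when the codeword is intact and otherwise spells out exactly which of the seven positions was corrupted – detection and correction in one step.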
Of course, that increased the amount of information that needed to be processed. No problem: information theory also shows us how to compress information by eliminating redundancy. Common technologies that we use every day, like JPEG and MP3, are in fact compression techniques that have their roots in Shannon’s 1948 paper.
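The redundancy-elimination idea is easy to demonstrate. JPEG and MP3 are lossy formats, but the same principle drives lossless compressors like zlib (the DEFLATE algorithm behind ZIP and PNG), which is used here purely as a stand-in: repetitive data squeezes down dramatically, while data with no redundancy barely shrinks at all.

```python
import os
import zlib

# A highly repetitive message: 2,000 bytes, but very little actual information.
redundant = b"the quick brown fox " * 100
print(len(redundant), len(zlib.compress(redundant)))  # 2000 vs. a few dozen bytes

# Random bytes have no redundancy to exploit, so compression gains nothing.
noise = os.urandom(2000)
print(len(zlib.compress(noise)))  # stays close to (or slightly above) 2000
```

Shannon’s entropy sets the floor: you can strip out redundancy, but you can never compress data below its true information content.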
It is storage and transfer that make the information age so different from what we knew in the past. In contrast to the knowledge that a bazaar trader possesses, computers can duplicate and transfer information an infinite number of times, with as little error as we choose.
The ability to store and transfer information efficiently has an interesting side effect – we can improve at an exponential pace. Returns to our efforts not only increase, they accelerate. The most famous example is Moore’s Law, the observation that the number of transistors on a chip doubles roughly every two years – often quoted as computing performance doubling about every 18 months.
Look at the chart above and notice the logarithmic scale. From 2000 to 2010, the number of transistors on a single chip increased from 10 million to a billion – a hundred-fold jump, which in absolute terms is a hundred times the increase of the previous decade and ten thousand times that of the decade before. In the next ten years, we can expect it to increase 100-fold again.
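The hundred-fold-per-decade figure follows directly from the doubling rate. A quick back-of-the-envelope check, assuming the often-quoted 18-month doubling period:

```python
# A decade holds 120 months; at one doubling every 18 months, that's ~6.7 doublings.
doublings_per_decade = 120 / 18
factor = 2 ** doublings_per_decade
print(round(factor))  # ~100x growth per decade
```

Compounding is what makes the absolute increments explode: each decade’s jump dwarfs the last, which is why the chart only looks tame on a logarithmic scale.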
What’s even more amazing, and of paramount importance, is that the principle isn’t exclusive to processing; it applies to every facet of information technology. From storage to bandwidth to power consumption, everywhere you look, efficiency continues to improve exponentially.
The Information Invasion
Here’s where it gets really interesting. As information technology becomes more widely deployed, the information content of other products and services increases and they begin to follow the same exponential trends. Take a look at genome sequencing:
As sequencing genomes became less of a pure biological science and more of an information science, the pace of advancement changed drastically, enabling a whole new field of bioinformatics that will revolutionize medicine.
Similar trends are being played out across almost every industry you can think of. In manufacturing, new technologies like 3D printing and (eventually) programmable matter are creating what The Economist calls a third industrial revolution. Even when you go and buy a box of cereal at Wal-Mart, a good portion of the retail price is made up of informationally dense logistics.
Ray Kurzweil spoke volumes when he said that in the future “all technologies will essentially become information technologies, including energy.”
From Belief to Observation
As the informational content of products and services increases and returns accelerate, lots of good things happen. Science fiction will become engineered fact. The unthinkable will become commonplace. Incomes will rise while poverty falls. Seemingly intractable problems will be solved and dire needs will be met.
Yet there is also quite a bit that is unsettling. As technological cycles shorten, business models will have shorter life spans. The internalized experiences that we have come to regard as intuition will fail us more often. Our ability to plan will diminish and the need to experiment (and fail) will increase.
And that’s what’s disconcerting. This new information economy doesn’t run on beliefs, or even, to a certain extent, on ambition, but on algorithms – algorithms that, powered by ever more abundant processing power, test and then accept or discard a dizzying multitude of possibilities, whose results can be retrieved and recombined with other experiments.
The power of information means that we are no longer required to believe, only to imagine, test and observe.