Where Did The iPhone Really Come From?
It’s hard to imagine a more revolutionary product than the iPhone. Launched just six years ago, in January 2007, it has significantly changed the way we live. Go to any city street almost anywhere in the world and you’ll see people glued to their smartphones.
Beyond the sheer human enjoyment, the numbers are startling as well. By itself, the iPhone would be the most profitable business in the world and has the power to lift the entire US economy, to say nothing of the vast ecosystem that has built up around it.
Amazingly, Steve Jobs was, at best, a mediocre engineer and Apple continues to have one of the stingiest R&D programs in the business. So where did the technology come from? How can we create more businesses like Apple? Or, for that matter, more Silicon Valleys and Research Triangles? The answers are as clear as they are surprising.
What Makes an iPhone?
When the iPhone first appeared, it was primarily an achievement in user experience, not technology. There were already smartphones on the market and the basic technology had, in fact, been around for decades, including:
The Architecture: At its heart, a smartphone is much like any other computer. It has a processor for making calculations and a separate memory unit for holding data and storing programs.
This is known as the von Neumann architecture, and it wasn't originally intended for phones, but for weapons. The design was developed under a US government grant to John von Neumann to facilitate the large and complex computations needed to develop the hydrogen bomb.
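To make the stored-program idea concrete, here is a minimal Python sketch (my own illustration, not something from the original design) of a von Neumann-style machine: the program's instructions and its data sit side by side in a single memory, and the processor simply fetches, decodes and executes one instruction at a time.

```python
# Minimal illustration of a stored-program (von Neumann) machine:
# instructions and data live in the same memory, and the processor
# fetches, decodes and executes them one at a time.

memory = [
    ("LOAD", 7),     # address 0: load the value stored at address 7
    ("ADD", 8),      # address 1: add the value stored at address 8
    ("STORE", 9),    # address 2: store the result at address 9
    ("HALT", None),  # address 3: stop
    None, None, None,
    2,               # address 7: data
    3,               # address 8: data
    0,               # address 9: the result will be written here
]

accumulator = 0
program_counter = 0

while True:
    opcode, operand = memory[program_counter]  # fetch
    program_counter += 1
    if opcode == "LOAD":                       # decode and execute
        accumulator = memory[operand]
    elif opcode == "ADD":
        accumulator += memory[operand]
    elif opcode == "STORE":
        memory[operand] = accumulator
    elif opcode == "HALT":
        break

print(memory[9])  # prints 5
```

Because the program is just data in memory, it can be loaded, replaced or modified like any other data, which is exactly what makes a general-purpose, reprogrammable computer (and, eventually, a smartphone) possible.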
The Chip: While the transistor was developed at Bell Labs in 1947, individual transistors alone couldn't deliver the processing power needed to make modern computers possible. It wasn't until a decade later, when the integrated circuit was developed by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor (who later co-founded Intel), that the computer age really got underway.
Unfortunately, commercial buyers were initially hard to find. However, the integrated circuit's ability to reduce weight and size made it perfect for the Minuteman missile program, which provided a market and jumpstarted the technology.
The Internet & The Web: One of the smartphone’s most useful features is the ability to connect us to both the Internet and the Web. Both of these have unlikely origins as well.
The Internet is essentially hardware – a patchwork of fiber, frequencies and protocols that links together the world's computers. It began with the Advanced Research Projects Agency (ARPA) of the US Department of Defense in the 1960s. The resulting ARPANET was eventually opened to public and commercial traffic and evolved into the Internet.
The Web, on the other hand, provides a software layer. It was developed by Tim Berners-Lee as an information management system for CERN, a physics lab funded by a consortium of European governments. The Web is now governed by the World Wide Web Consortium, which Berners-Lee continues to lead. Most major technology companies, including Apple, are members.
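The split between those two layers is easy to see in code. Here is a rough Python sketch (my illustration, not anything described in the article; "example.com" is just a placeholder host) in which the Internet supplies the transport, a TCP connection across the network, and the Web supplies the software that rides on top of it, the HTTP protocol.

```python
# The Internet layer: open a raw TCP connection across the network.
# The Web layer: speak HTTP, a document protocol, over that connection.
import socket

HOST = "example.com"  # placeholder host for illustration

with socket.create_connection((HOST, 80), timeout=10) as conn:
    request = (
        "GET / HTTP/1.1\r\n"
        f"Host: {HOST}\r\n"
        "Connection: close\r\n"
        "\r\n"
    )
    conn.sendall(request.encode("ascii"))  # Web: an HTTP request over the Internet's transport

    response = b""
    while chunk := conn.recv(4096):        # read until the server closes the connection
        response += chunk

print(response.split(b"\r\n", 1)[0].decode())  # status line, e.g. "HTTP/1.1 200 OK"
```

Everything below the socket (routing, fiber, radio links) is the Internet; everything expressed in the request string (methods, hosts, documents) is the Web.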
Geolocation: Finally, a smartphone's ability to know where we are in relation to the objects around us comes from the GPS satellite system that the US military developed. Simple navigation, however, could not justify the vast expense. Rather, the funds for GPS were allocated because of the system's ability to target missiles accurately.
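The point here is where GPS came from rather than how it works, but the core idea, trilateration, is simple enough to sketch: if you know your distance to several reference points at known positions, you can solve for where you are. Below is a deliberately simplified, noise-free, two-dimensional Python illustration of that idea (real GPS works in three dimensions, infers distances from signal travel times, and also has to solve for the receiver's clock error).

```python
# Simplified 2-D trilateration: recover a position from exact distances
# to three beacons at known locations. Subtracting the circle equations
# pairwise yields two linear equations, which we solve with Cramer's rule.
import math

def trilaterate(beacons, distances):
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a1 * b2 - a2 * b1
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_position = (3.0, 4.0)
distances = [math.dist(b, true_position) for b in beacons]
print(trilaterate(beacons, distances))  # approximately (3.0, 4.0)
```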
By this point, a disturbing theme should be clear. The miracles of modern technology often do not originate in the private sector (in fact, breakthroughs are often rejected by incumbent firms) but come instead from government programs. Further, they come almost exclusively from military programs.
A Simple Question
Why is it that we only seem to be able to do useful things with our collective efforts when we are trying to blow things up and kill people? In the 20th century, the US led the world in basic research and the vast majority of important discoveries came from our shores. That’s beginning to change.
For example, while Congress decided not to allocate a few billion dollars to complete the Superconducting Super Collider in Texas, the Europeans built the Large Hadron Collider, and the center of physics, along with the discovery of the Higgs boson, shifted across the Atlantic. China, as well, has shown an increasing commitment to basic science.
If we are to remain competitive in the 21st century, we will have to renew our passion for fundamental discovery without the benefit of the almost limitless funds that the Cold War engendered. The data is clear: we are losing our lead, and unless we recommit to basic research, the consequences will be dire.
If present trends persist, it is doubtful that we will be able to maintain our lead in cutting-edge technologies. After all, our prowess is not a birthright, but something we had to earn.
Technology’s Last Mile
In truth, great entrepreneurs like Steve Jobs and Mark Zuckerberg mostly function as technology's last mile. They determine the final application and the user experience, but they are rarely responsible for the breakthroughs that make those products possible.
I don’t mean to shortchange entrepreneurs or denigrate their accomplishments. On the contrary, they perform an invaluable service by bringing technology to us in a form in which we can use it.
Moreover, they do so at great risk. It’s one thing to diligently work on a government funded program, quite another to build products for fickle consumers and uncertain markets.
Clearly, we need both: Market driven entrepreneurs and a strong technological infrastructure that provides a substrate from which they can identify and cultivate opportunities. However, that’s easier said than done. Basic research and entrepreneurial endeavors require quite different skills, mindsets and incentives.
The Way Forward
Much like with physical infrastructure, the challenge of building out our technological infrastructure is primarily one of commitment rather than a lack of viable approaches. In fact, we have a variety of options that have proved both effective and cost efficient.
Tax Incentives: One very simple way to encourage basic research is to provide tax relief for investment. An R&D tax credit was first implemented in the US in 1981, but it has been renewed only sporadically, most recently expiring at the end of 2011. It should be made permanent.
Government Programs: Besides military programs, other notable efforts include the National Institutes of Health, which has made significant contributions to healthcare, the Human Genome Project, which has unleashed a flurry of innovation in genomics, and, most recently, ARPA-E, an initiative that provides seed funding for energy technologies. However, we can and should do more.
Innovation Prizes: Another time-tested solution is the awarding of large monetary prizes for innovation, such as the X Prize and the US government's Race to the Top initiative. These tend to encourage investment many times greater than the value of the prize itself and have proved effective at jumpstarting emerging technologies.
A billion-dollar prize for something like curing cancer might seem excessive, but it would actually be incredibly cheap relative to the payoff and would speed the diffusion of important discoveries.
Peer Networks: In his recent book, Future Perfect, author Steven Johnson put forth another alternative. Commons-based, peer-network-driven methods, such as the open source movement, have proven effective at creating viable technologies like Apache and Linux that run much of our Internet infrastructure.
New programs like the National Robotics Initiative and the Advanced Manufacturing Initiative help spur similar efforts by bringing together public and private institutions, as well as through funding pilot programs.
The 21st century will be fundamentally different than the 20th, when we were locked in a two-way race for dominance with an adversary that, in retrospect, never really had a shot. For today’s multipolar world, we need a far more comprehensive approach encompassing public, private and peer-networked efforts.
Update: Just in case you might think that government-funded innovation is a thing of the past, I was recently reminded that Siri also received seed funding from a DARPA grant.