
We Need To Change The Way We Think About Technology

2016 February 21
Thinking Technology

I recently got a call from my mother asking me to help her watch House of Cards on Netflix. She was frustrated and complained, “I keep pressing the thing and nothing happens!” It was hard to get her to understand I had no idea what thing she was pressing or what was supposed to happen when she did.

I’m still not exactly sure what the problem was, but getting her to understand that the buttons on her remote had little to do with the TV in her bedroom and everything to do with giving instructions to servers in faraway places seemed to help. Before long, her frustration with technology turned to fascination with the political machinations of Frank Underwood.

Many businesses have the same problem as my mother. As technology advances, its function evolves, and those that are unable to shift their mental models find themselves unable to compete. This is especially true of digital technology, where every generation sees a new crop of players emerge while old titans falter. Only a rare few are able to cross the chasm.

The Rise Of Mathematical Machines

The term “computer” used to refer to a person, not a machine. Teams of people, usually women, would sit in a room carrying out arithmetic for complex calculations that scientists needed to do their work. So it shouldn’t be surprising that the first digital computers were used in the same way, as purely calculating machines.

Three machines can claim to be the first true digital computer. The first, called Colossus, was designed in secret at Bletchley Park outside London. The second, the ENIAC, was built at the University of Pennsylvania and the third, the IAS machine was built in Princeton, NJ. All were monstrous and built at great expense by governments for military applications.

They soon found commercial applications as well. The design of the IAS machine was open sourced and before long a computer industry made up of IBM and the BUNCH companies (Burroughs, UNIVAC, NCR, Control Data Corporation, and Honeywell) began to serve private corporations with heavy computational tasks, such as payroll and accounting.

By the 1960s, hard drives and databases had been developed and computers took on a storage function as well as pure calculation, but they were still largely confined to the back office. Very few people actually worked directly with computers, although executives would occasionally see a printout of their results.

From Calculation To Communication

In 1968, Douglas Engelbart presented a completely new conception of the computer at an event that was so consequential it is now referred to as The Mother of All Demos. Within a few years, Bob Taylor, who financed Engelbart’s work at DARPA, began working to make it a commercial reality at Xerox PARC’s Computer Science Lab.

While the giants of industry still considered computers to be mathematical machines, Taylor saw that interactive technology could transform them into communication machines. The product his team built, the Alto, had many of the features of the machines we know today, such as a graphical user interface set up as a desktop, a mouse and Ethernet connections.

Yet as Michael Hiltzik reported in his history of PARC, Dealers of Lightning, the Xerox brass was less than impressed. Used to dealing with top level executives, they didn’t see the value of personal computers. On the other hand, their wives, many of whom had previously worked as secretaries, were fascinated by its ability to automate basic office tasks.

The BUNCH companies didn’t see it either. Like the Xerox executives, they remained stuck in old mental models as a new crop of companies, including Apple, Microsoft, Compaq and Dell came to dominate the industry. IBM, along with Intel, whose chips powered the revolution, survived intact, but they were rare exceptions.

Automating Cognitive Tasks

Today, digital technology is being transformed once again. Computers, once large machines that took up entire rooms and later bulky boxes placed under desks, have shrunk dramatically. Now, we not only carry around smartphones more powerful than yesterday’s supercomputers, but microprocessors are embedded in everything from toasters to traffic lights.

These new advances are creating several trends that are converging into a completely new paradigm. First, small devices such as smartphones and embedded chips are collecting an unprecedented amount of data. Second, that data is being stored in a network of servers that make up the “cloud.” Third, that data can be accessed and analyzed in real time.

Yet what is truly revolutionary is the way that data is being analyzed. In the past, the logic of analysis was fairly rigid—a particular set of inputs would lead to a predetermined set of outputs. Now, however, an increasingly powerful cadre of learning algorithms takes all that information in and makes judgments based on context.
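The contrast can be made concrete with a minimal, purely illustrative sketch (the spam-filter framing and all names are hypothetical, not any real system): a rigid rule maps inputs to outputs the same way every time, while even the simplest learner derives its behavior from the data it has seen.

```python
def rule_based_spam_filter(message: str) -> bool:
    # Rigid logic: a fixed keyword, chosen in advance by a programmer.
    # The same input always produces the same predetermined output.
    return "winner" in message.lower()

def train_threshold(examples):
    # "examples" is a list of (feature_value, label) pairs.
    # Pick the cutoff that classifies the training data best --
    # here the behavior is learned from data, not hard-coded.
    candidates = sorted({x for x, _ in examples})
    best_cut, best_correct = candidates[0], -1
    for cut in candidates:
        correct = sum((x >= cut) == label for x, label in examples)
        if correct > best_correct:
            best_cut, best_correct = cut, correct
    return best_cut

# Toy data: feature = number of exclamation marks, label = True if spam.
data = [(0, False), (1, False), (4, True), (6, True)]
cut = train_threshold(data)
print(cut)       # prints 4 -- the cutoff inferred from the examples
print(7 >= cut)  # classify a new message's feature: prints True
```

A modern learning system does this at vastly greater scale and with far richer models, but the essential shift is the same: the decision boundary comes from the data rather than from a programmer’s fixed rule.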

Essentially, much like the steam engine automated physical tasks in the 19th century, today digital technology is automating cognitive tasks, from medical diagnoses and legal discovery to even creative work. And, just like in the previous paradigm shifts, most enterprises won’t survive the transformation. We’ll see old giants fall and new leaders emerge.

Learning To Collaborate With Machines—And Each Other

None of these shifts came easy. When the first commercial computer, the UNIVAC, debuted on CBS to predict the results of the 1952 election, the network executives found the results so out of line with human predictions that they refused to air them. Nevertheless, UNIVAC had it right and the “experts” had to contend with being outsmarted by a machine.

Later, when Bob Taylor was building the Alto, he had to devote two-thirds of its computing power to running the display. That seemed crazy for a machine devoted to back office calculations, but it was absolutely essential to create a truly interactive computer. Very few executives at the time knew how to type, so putting a machine with a keyboard on every desk was far from obvious.

In both cases, the paradigm shift was so profound that earlier attitudes seem silly today. Of course computers can make predictions that humans can’t! Of course we need computers to do our jobs! Who would wait for their secretary to get back from lunch so that an email could be typed up? Yet it took years—decades even—for these things to become clear.

Today, as machines begin to assist us in mental tasks, we’re starting to see social skills trump cognitive skills and the basic rules for success will change yet again. We will need to learn to collaborate with humans as well as machines in ways that aren’t obvious now, but in a generation will be as undeniable as the need for a personal computer.

Yet one thing should already be abundantly clear: our past mental models will hold us back. Our failure to adapt to the future is less likely to be due to a lack of intelligence than to a lack of imagination.

– Greg

 

5 Responses
  1. gregorylent permalink
    February 21, 2016

    technology is just the out-picturing in 3d what the developed mind can already do ..


    MKotyck Reply:

    Sure it is; computers emulate the only thinking mechanism we know – our brains. But what happens if that developed mind is constructed of materials that lay outside organics, becomes faster and can become almost unlimited in size?

    What does that mean for work? For humanity? Who controls the growth, writes the rules to limit the actions (ethics, compassion, etc.)?


  2. Randy Wing permalink
    February 23, 2016

    Correction: The real first digital computer was the Atanasoff-Berry Computer.

    Supporting Material:

    from: http://www.computerhope.com/issues/ch000984.htm

    The first digital computer

    Short for Atanasoff-Berry Computer, the ABC began development by Professor John Vincent Atanasoff and graduate student Cliff Berry in 1937. Its development continued until 1942 at the Iowa State College (now Iowa State University).

    The ABC was an electrical computer that used vacuum tubes for digital computation, including binary math and Boolean logic and had no CPU. On October 19, 1973, the US Federal Judge Earl R. Larson signed his decision that the ENIAC patent by J. Presper Eckert and John Mauchly was invalid and named Atanasoff the inventor of the electronic digital computer.


    Greg Reply:

I understand the dispute, especially since, if I remember correctly, Mauchly actually visited Atanasoff before completing the ENIAC. However, despite the claims of Atanasoff and others (e.g. Zuse, Aiken, etc.), the Colossus, the ENIAC and the IAS machine are most commonly recognized as the first digital computers, for various reasons.

    – Greg


  3. Kuldip Singh permalink
    March 27, 2016

    As Future Shock author Alvin Toffler says, “The illiterate of the 21st Century will not be those who cannot read or write, but those who cannot learn, unlearn and relearn.”

    [Reply]
