THE INNOVATORS
ALSO BY WALTER ISAACSON
Steve Jobs
American Sketches
Einstein: His Life and Universe
A Benjamin Franklin Reader
Benjamin Franklin: An American Life
Kissinger: A Biography
The Wise Men: Six Friends and the World They Made (with Evan Thomas)
Pro and Con
HOW A GROUP OF HACKERS, GENIUSES, AND GEEKS CREATED THE DIGITAL REVOLUTION
First published in Great Britain by Simon & Schuster UK Ltd, 2014
A CBS COMPANY
Copyright © 2014 by Walter Isaacson
This book is copyright under the Berne Convention. No reproduction without permission.
All rights reserved.
The right of Walter Isaacson to be identified as the author of this work has been asserted by him in accordance with sections 77 and 78 of the Copyright, Designs and Patents Act, 1988.
Simon & Schuster UK Ltd
1st Floor
222 Gray’s Inn Road
London WC1X 8HB
www.simonandschuster.co.uk
Simon & Schuster Australia, Sydney
Simon & Schuster India, New Delhi
A CIP catalogue record for this book is available from the British Library
Excerpts from “All Watched Over by Machines of Loving Grace” from The Pill Versus the Springhill Mine Disaster by Richard Brautigan. Copyright © 1968 by Richard Brautigan. Reproduced by permission of Houghton Mifflin Harcourt Publishing Company. All rights reserved.
Photo research and editing by Laura Wyss, Wyssphoto, Inc., with the assistance of Elizabeth Seramur, Amy Hikida, and Emily Vinson, and by Jonathan Cox.
Interior design by Ruth Lee-Mui
ISBN: 978-1-47113-879-9
Ebook ISBN: 978-1-47113-881-2
The author and publishers have made all reasonable efforts to contact copyright-holders for permission, and apologise for any omissions or errors in the form of credits given. Corrections may be made to future printings.
Printed and bound by CPI Group (UK) Ltd, Croydon, CR0 4YY
CONTENTS
Illustrated Timeline
Introduction
CHAPTER 1
Ada, Countess of Lovelace
CHAPTER 2
The Computer
CHAPTER 3
Programming
CHAPTER 4
The Transistor
CHAPTER 5
The Microchip
CHAPTER 6
Video Games
CHAPTER 7
The Internet
CHAPTER 8
The Personal Computer
CHAPTER 9
Software
CHAPTER 10
Online
CHAPTER 11
The Web
CHAPTER 12
Ada Forever
Acknowledgments
Notes
Photo Credits
Index
ILLUSTRATED TIMELINE
1843
Ada, Countess of Lovelace, publishes “Notes” on Babbage’s Analytical Engine.
1847
George Boole creates a system using algebra for logical reasoning.
1890
The census is tabulated with Herman Hollerith’s punch-card machines.
1931
Vannevar Bush devises the Differential Analyzer, an analog electromechanical computer.
1935
Tommy Flowers pioneers use of vacuum tubes as on-off switches in circuits.
1937
Alan Turing publishes “On Computable Numbers,” describing a universal computer.
Claude Shannon describes how circuits of switches can perform tasks of Boolean algebra.
Bell Labs’ George Stibitz proposes a calculator using an electric circuit.
Howard Aiken proposes construction of large digital computer and discovers parts of Babbage’s Difference Engine at Harvard.
John Vincent Atanasoff puts together concepts for an electronic computer during a long December night’s drive.
1938
William Hewlett and David Packard form company in Palo Alto garage.
1939
Atanasoff finishes model of electronic computer with mechanical storage drums.
Turing arrives at Bletchley Park to work on breaking German codes.
1941
Konrad Zuse completes Z3, a fully functional electromechanical programmable digital computer.
John Mauchly visits Atanasoff in Iowa, sees computer demonstrated.
1942
Atanasoff completes partly working computer with three hundred vacuum tubes, leaves for Navy.
1943
Colossus, a vacuum-tube computer to break German codes, is completed at Bletchley Park.
1944
Harvard Mark I goes into operation.
John von Neumann goes to Penn to work on ENIAC.
1945
Von Neumann writes “First Draft of a Report on the EDVAC” describing a stored-program computer.
Six women programmers of ENIAC are sent to Aberdeen for training.
Vannevar Bush publishes “As We May Think,” describing a personal computer.
Bush publishes “Science, the Endless Frontier,” proposing government funding of academic and industrial research.
ENIAC is fully operational.
1947
Transistor invented at Bell Labs.
1950
Turing publishes article describing a test for artificial intelligence.
1952
Grace Hopper develops first computer compiler.
Von Neumann completes modern computer at the Institute for Advanced Study.
UNIVAC predicts Eisenhower election victory.
1954
Turing commits suicide.
Texas Instruments introduces silicon transistor and helps launch Regency radio.
1956
Shockley Semiconductor founded.
First artificial intelligence conference.
1957
Robert Noyce, Gordon Moore, and others form Fairchild Semiconductor.
Russia launches Sputnik.
1958
Advanced Research Projects Agency (ARPA) announced.
Jack Kilby demonstrates integrated circuit, or microchip.
1959
Noyce and Fairchild colleagues independently invent microchip.
1960
J. C. R. Licklider publishes “Man-Computer Symbiosis.”
Paul Baran at RAND devises packet switching.
1961
President Kennedy proposes sending man to the moon.
1962
MIT hackers create Spacewar game.
Licklider becomes founding director of ARPA’s Information Processing Techniques Office.
Doug Engelbart publishes “Augmenting Human Intellect.”
1963
Licklider proposes an “Intergalactic Computer Network.”
Engelbart and Bill English invent the mouse.
1964
Ken Kesey and the Merry Pranksters take bus trip across America.
1965
Ted Nelson publishes first article about “hypertext.”
Moore’s Law predicts microchips will double in power each year or so.
1966
Stewart Brand hosts Trips Festival with Ken Kesey.
Bob Taylor convinces ARPA chief Charles Herzfeld to fund ARPANET.
Donald Davies coins the term packet switching.
1967
ARPANET design discussions in Ann Arbor and Gatlinburg.
1968
Larry Roberts sends out request for bids to build the ARPANET’s IMPs.
Noyce and Moore form Intel, hire Andy Grove.
Brand publishes first Whole Earth Catalog.
Engelbart stages the Mother of All Demos with Brand’s help.
1969
First nodes of ARPANET installed.
1971
Don Hoefler begins column for Electronic News called “Silicon Valley USA.”
Demise party for Whole Earth Catalog.
Intel 4004 microprocessor unveiled.
Ray Tomlinson invents email.
1972
Nolan Bushnell creates Pong at Atari with Al Alcorn.
1973
Alan Kay helps to create the Alto at Xerox PARC.
Ethernet developed by Bob Metcalfe at Xerox PARC.
Community Memory shared terminal set up at Leopold’s Records, Berkeley.
Vint Cerf and Bob Kahn complete TCP/IP protocols for the Internet.
1974
Intel 8080 comes out.
1975
Altair personal computer from MITS appears.
Paul Allen and Bill Gates write BASIC for Altair, form Microsoft.
First meeting of Homebrew Computer Club.
1976
Steve Jobs and Steve Wozniak launch the Apple I.
1977
The Apple II is released.
1978
First Internet Bulletin Board System.
1979
Usenet newsgroups invented.
Jobs visits Xerox PARC.
1980
IBM commissions Microsoft to develop an operating system for its PC.
1981
Hayes modem marketed to home users.
1983
Microsoft announces Windows.
Richard Stallman begins developing GNU, a free operating system.
1984
Apple introduces Macintosh.
1985
Stewart Brand and Larry Brilliant launch The WELL.
CVC launches Q-Link, which becomes AOL.
1991
Linus Torvalds releases first version of Linux kernel.
Tim Berners-Lee announces World Wide Web.
1993
Marc Andreessen announces Mosaic browser.
Steve Case’s AOL offers direct access to the Internet.
1994
Justin Hall launches Web log and directory.
HotWired and Time Inc.’s Pathfinder become first major magazine publishers on Web.
1995
Ward Cunningham’s Wiki Wiki Web goes online.
1997
IBM’s Deep Blue beats Garry Kasparov in chess.
1998
Larry Page and Sergey Brin launch Google.
1999
Ev Williams launches Blogger.
2001
Jimmy Wales, with Larry Sanger, launches Wikipedia.
2011
IBM’s computer Watson wins Jeopardy!
INTRODUCTION
HOW THIS BOOK CAME TO BE
The computer and the Internet are among the most important inventions of our era, but few people know who created them. They were not conjured up in a garret or garage by solo inventors suitable to be singled out on magazine covers or put into a pantheon with Edison, Bell, and Morse. Instead, most of the innovations of the digital age were done collaboratively. There were a lot of fascinating people involved, some ingenious and a few even geniuses. This is the story of these pioneers, hackers, inventors, and entrepreneurs—who they were, how their minds worked, and what made them so creative. It’s also a narrative of how they collaborated and why their ability to work as teams made them even more creative.
The tale of their teamwork is important because we don’t often focus on how central that skill is to innovation. There are thousands of books celebrating people we biographers portray, or mythologize, as lone inventors. I’ve produced a few myself. Search the phrase “the man who invented” on Amazon and you get 1,860 book results. But we have far fewer tales of collaborative creativity, which is actually more important in understanding how today’s technology revolution was fashioned. It can also be more interesting.
We talk so much about innovation these days that it has become a buzzword, drained of clear meaning. So in this book I set out to report on how innovation actually happens in the real world. How did the most imaginative innovators of our time turn disruptive ideas into realities? I focus on a dozen or so of the most significant breakthroughs of the digital age and the people who made them. What ingredients produced their creative leaps? What skills proved most useful? How did they lead and collaborate? Why did some succeed and others fail?
I also explore the social and cultural forces that provide the atmosphere for innovation. For the birth of the digital age, this included a research ecosystem that was nurtured by government spending and managed by a military-industrial-academic collaboration. Intersecting with that was a loose alliance of community organizers, communal-minded hippies, do-it-yourself hobbyists, and homebrew hackers, most of whom were suspicious of centralized authority.
Histories can be written with a different emphasis on any of these factors. An example is the invention of the Harvard/IBM Mark I, the first big electromechanical computer. One of its programmers, Grace Hopper, wrote a history that focused on its primary creator, Howard Aiken. IBM countered with a history that featured its teams of faceless engineers who contributed the incremental innovations, from counters to card feeders, that went into the machine.
Likewise, what emphasis should be put on great individuals versus on cultural currents has long been a matter of dispute; in the mid-nineteenth century, Thomas Carlyle declared that “the history of the world is but the biography of great men,” and Herbert Spencer responded with a theory that emphasized the role of societal forces. Academics and participants often view this balance differently. “As a professor, I tended to think of history as run by impersonal forces,” Henry Kissinger told reporters during one of his Middle East shuttle missions in the 1970s. “But when you see it in practice, you see the difference personalities make.”1 When it comes to digital-age innovation, as with Middle East peacemaking, a variety of personal and cultural forces all come into play, and in this book I sought to weave them together.
The Internet was originally built to facilitate collaboration. By contrast, personal computers, especially those meant to be used at home, were devised as tools for individual creativity. For more than a decade, beginning in the early 1970s, the development of networks and that of home computers proceeded separately from one another. They finally began coming together in the late 1980s with the advent of modems, online services, and the Web. Just as combining the steam engine with ingenious machinery drove the Industrial Revolution, the combination of the computer and distributed networks led to a digital revolution that allowed anyone to create, disseminate, and access any information anywhere.
Historians of science are sometimes wary about calling periods of great change revolutions, because they prefer to view progress as evolutionary. “There was no such thing as the Scientific Revolution, and this is a book about it,” is the wry opening sentence of the Harvard professor Steven Shapin’s book on that period. One method that Shapin used to escape his half-joking contradiction was to note how the key players of the period “vigorously expressed the view” that they were part of a revolution. “Our sense of radical change afoot comes substantially from them.”2
Likewise, most of us today share a sense that the digital advances of the past half century are transforming, perhaps even revolutionizing, the way we live. I can recall the excitement that each new breakthrough engendered. My father and uncles were electrical engineers, and like many of the characters in this book I grew up with a basement workshop that had circuit boards to be soldered, radios to be opened, tubes to be tested, and boxes of transistors and resistors to be sorted and deployed. As an electronics geek who loved Heathkits and ham radios (WA5JTP), I can remember when vacuum tubes gave way to transistors. At college I learned programming using punch cards and recall when the agony of batch processing was replaced by the ecstasy of hands-on interaction. In the 1980s I thrilled to the static and screech that modems made when they opened for you the weirdly magical realm of online services and bulletin boards, and in the early 1990s I helped to run a digital division at Time and Time Warner that launched new Web and broadband Internet services. As Wordsworth said of the enthusiasts who were present at the beginning of the French Revolution, “Bliss was it in that dawn to be alive.”
I began work on this book more than a decade ago. It grew out of my fascination with the digital-age advances I had witnessed and also from my biography of Benjamin Franklin, who was an innovator, inventor, publisher, postal service pioneer, and all-around information networker and entrepreneur. I wanted to step away from doing biographies, which tend to emphasize the role of singular individuals, and once again do a book like The Wise Men, which I had coauthored with a colleague about the creative teamwork of six friends who shaped America’s cold war policies. My initial plan was to focus on the teams that invented the Internet. But when I interviewed Bill Gates, he convinced me that the simultaneous emergence of the Internet and the personal computer made for a richer tale. I put this book on hold early in 2009, when I began working on a biography of Steve Jobs. But his story reinforced my interest in how the development of the Internet and computers intertwined, so as soon as I finished that book, I went back to work on this tale of digital-age innovators.
The protocols of the Internet were devised by peer collaboration, and the resulting system seemed to have embedded in its genetic code a propensity to facilitate such collaboration. The power to create and transmit information was fully distributed to each of the nodes, and any attempt to impose controls or a hierarchy could be routed around. Without falling into the teleological fallacy of ascribing intentions or a personality to technology, it’s fair to say that a system of open networks connected to individually controlled computers tended, as the printing press did, to wrest control over the distribution of information from gatekeepers, central authorities, and institutions that employed scriveners and scribes. It became easier for ordinary folks to create and share content.
The collaboration that created the digital age was not just among peers but also between generations. Ideas were handed off from one cohort of innovators to the next. Another theme that emerged from my research was that users repeatedly commandeered digital innovations to create communications and social networking tools. I also became interested in how the quest for artificial intelligence—machines that think on their own—has consistently proved less fruitful than creating ways to forge a partnership or symbiosis between people and machines. In other words, the collaborative creativity that marked the digital age included collaboration between humans and machines.
Finally, I was struck by how the truest creativity of the digital age came from those who were able to connect the arts and sciences. They believed that beauty mattered. “I always thought of myself as a humanities person as a kid, but I liked electronics,” Jobs told me when I embarked on his biography. “Then I read something that one of my heroes, Edwin Land of Polaroid, said about the importance of people who could stand at the intersection of humanities and sciences, and I decided that’s what I wanted to do.” The people who were comfortable at this humanities-technology intersection helped to create the human-machine symbiosis that is at the core of this story.
Like many aspects of the digital age, this idea that innovation resides where art and science connect is not new. Leonardo da Vinci was the exemplar of the creativity that flourishes when the humanities and sciences interact. When Einstein was stymied while working out General Relativity, he would pull out his violin and play Mozart until he could reconnect to what he called the harmony of the spheres.
When it comes to computers, there is one other historical figure, not as well known, who embodied the combination of the arts and sciences. Like her famous father, she understood the romance of poetry. Unlike him, she also saw the romance of math and machinery. And that is where our story begins.
Ada, Countess of Lovelace (1815–52), painted by Margaret Sarah Carpenter in 1836.
Lord Byron (1788–1824), Ada’s father, in Albanian dress, painted by Thomas Phillips in 1835.
Charles Babbage (1791–1871), photograph taken circa 1837.
CHAPTER ONE
ADA, COUNTESS OF LOVELACE
POETICAL SCIENCE
In May 1833, when she was seventeen, Ada Byron was among the young women presented at the British royal court. Family members had worried about how she would acquit herself, given her high-strung and independent nature, but she ended up behaving, her mother reported, “tolerably well.” Among those Ada met that evening were the Duke of Wellington, whose straightforward manner she admired, and the seventy-nine-year-old French ambassador Talleyrand, who struck her as “an old monkey.”1
The only legitimate child of the poet Lord Byron, Ada had inherited her father’s romantic spirit, a trait that her mother tried to temper by having her tutored in mathematics. The combination produced in Ada a love for what she took to calling “poetical science,” which linked her rebellious imagination to her enchantment with numbers. For many, including her father, the rarefied sensibilities of the Romantic era clashed with the techno-excitement of the Industrial Revolution. But Ada was comfortable at the intersection of both eras.
So it was not surprising that her debut at court, despite the glamour of the occasion, made less impression on her than her attendance a few weeks later at another majestic event of the London season, at which she met Charles Babbage, a forty-one-year-old widowed science and math eminence who had established himself as a luminary on London’s social circuit. “Ada was more pleased with a party she was at on Wednesday than with any of the assemblages in the grand monde,” her mother reported to a friend. “She met there a few scientific people—amongst them Babbage, with whom she was delighted.”2
Babbage’s galvanizing weekly salons, which included up to three hundred guests, brought together lords in swallow-tail coats and ladies in brocade gowns with writers, industrialists, poets, actors, statesmen, explorers, botanists, and other “scientists,” a word that Babbage’s friends had recently coined.3 By bringing scientific scholars into this exalted realm, said one noted geologist, Babbage “successfully asserted the rank in society due to science.”4
The evenings featured dancing, readings, games, and lectures accompanied by an assortment of seafood, meat, fowl, exotic drinks, and iced desserts. The ladies staged tableaux vivants, in which they dressed in costume to re-create famous paintings. Astronomers set up telescopes, researchers displayed their electrical and magnetic contrivances, and Babbage allowed guests to play with his mechanical dolls. The centerpiece of the evenings—and one of Babbage’s many motives for hosting them—was his demonstration of a model portion of his Difference Engine, a mammoth mechanical calculating contraption that he was building in a fireproof structure adjacent to his home. Babbage would display the model with great drama, cranking its arm as it calculated a sequence of numbers and, just as the audience began to get bored, showing how the pattern could suddenly change based on instructions that had been coded into the machine.5 Those who were especially intrigued would be invited through the yard to the former stables, where the complete machine was being constructed.
Babbage’s Difference Engine, which could solve polynomial equations, impressed people in different ways. The Duke of Wellington commented that it could be useful in analyzing the variables a general might face before going into battle.6 Ada’s mother, Lady Byron, marveled that it was a “thinking machine.” As for Ada, who would later famously note that machines could never truly think, a friend who went with them to the demonstration reported, “Miss Byron, young as she was, understood its working, and saw the great beauty of the invention.”7
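The trick behind that demonstration is worth a brief aside. The Difference Engine tabulated polynomials by the method of finite differences: once the starting value and its successive differences are set, every further value of the polynomial falls out of repeated addition alone, which is exactly the sort of work gears and a hand crank can do. A minimal Python sketch of the idea (a modern illustration of the mathematics, not anything from Babbage’s design or Isaacson’s text):

```python
# Tabulating a polynomial by finite differences, the principle the
# Difference Engine mechanized: only additions, no multiplications.

def difference_engine(initial_differences, steps):
    """Tabulate a polynomial from its value and finite differences at x = 0.

    For p(x) = x^2 + x + 1 the initial column is [1, 2, 2]:
    p(0) = 1, first difference p(1) - p(0) = 2, constant second difference 2.
    """
    diffs = list(initial_differences)
    values = []
    for _ in range(steps):
        values.append(diffs[0])
        # Each "crank" adds every difference into the register above it.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

print(difference_engine([1, 2, 2], 5))  # [1, 3, 7, 13, 21] = p(0)..p(4)
```

Changing the numbers loaded into the difference registers changes the sequence the machine produces, which is presumably the sort of coded instruction that let Babbage make the pattern “suddenly change” for his guests.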
Ada’s love of both poetry and math primed her to see beauty in a computing machine. She was an exemplar of the era of Romantic science, which was characterized by a lyrical enthusiasm for invention and discovery. It was a period that brought “imaginative intensity and excitement to scientific work,” Richard Holmes wrote in The Age of Wonder. “It was driven by a common ideal of intense, even reckless, personal commitment to discovery.”8
In short, it was a time not unlike our own. The advances of the Industrial Revolution, including the steam engine, mechanical loom, and telegraph, transformed the nineteenth century in much the same way that the advances of the Digital Revolution—the computer, microchip, and Internet—have transformed our own. At the heart of both eras were innovators who combined imagination and passion with wondrous technology, a mix that produced Ada’s poetical science and what the twentieth-century poet Richard Brautigan would call “machines of loving grace.”
LORD BYRON
Ada inherited her poetic and insubordinate temperament from her father, but he was not the source of her love for machinery. He was, in fact, a Luddite. In his maiden speech in the House of Lords, given in February 1812 when he was twenty-four, Byron defended the followers of Ned Ludd, who were rampaging against mechanical weaving machines. With sarcastic scorn Byron mocked the mill owners of Nottingham, who were pushing a bill that would make destroying automated looms a crime punishable by death. “These machines were to them an advantage, inasmuch as they superseded the necessity of employing a number of workmen, who were left in consequence to starve,” Byron declared. “The rejected workmen, in the blindness of their ignorance, instead of rejoicing at these improvements in arts so beneficial to mankind, conceived themselves to be sacrificed to improvements in mechanism.”
Two weeks later, Byron published the first two cantos of his epic poem Childe Harold’s Pilgrimage, a romanticized account of his wanderings through Portugal, Malta, and Greece, and, as he later remarked, “awoke one morning and found myself famous.” Beautiful, seductive, troubled, brooding, and sexually adventurous, he was living the life of a Byronic hero while creating the archetype in his poetry. He became the toast of literary London and was feted at three parties each day, most memorably a lavish morning dance hosted by Lady Caroline Lamb.
Lady Caroline, though married to a politically powerful aristocrat who was later prime minister, fell madly in love with Byron. He thought she was “too thin,” yet she had an unconventional sexual ambiguity (she liked to dress as a page boy) that he found enticing. They had a turbulent affair, and after it ended she stalked him obsessively. She famously declared him to be “mad, bad, and dangerous to know,” which he was. So was she.
At Lady Caroline’s party, Lord Byron had also noticed a reserved young woman who was, he recalled, “more simply dressed.” Annabella Milbanke, nineteen, was from a wealthy and multi-titled family. The night before the party, she had read Childe Harold and had mixed feelings. “He is rather too much of a mannerist,” she wrote. “He excels most in the delineation of deep feeling.” Upon seeing him across the room at the party, her feelings were conflicted, dangerously so. “I did not seek an introduction to him, for all the women were absurdly courting him, and trying to deserve the lash of his Satire,” she wrote her mother. “I am not desirous of a place in his lays. I made no offering at the shrine of Childe Harold,