
The paper should have a title and consist of at least two sections: 1) a brief narrative of how an IS/IT is realized, initiated, designed, and implemented, in terms of what/when/where/how this happened and the key players involved in the series of events; and 2) an "Aha!" moment: key observations, lessons, and implications you have drawn from the reading, from the perspective of business information systems.

THE INNOVATORS

ALSO BY WALTER ISAACSON

Steve Jobs

American Sketches

Einstein: His Life and Universe

A Benjamin Franklin Reader

Benjamin Franklin: An American Life

Kissinger: A Biography

The Wise Men: Six Friends and the World They Made (with Evan Thomas)

Pro and Con

HOW A GROUP OF HACKERS, GENIUSES, AND GEEKS CREATED THE DIGITAL REVOLUTION

First published in Great Britain by Simon & Schuster UK Ltd, 2014

A CBS COMPANY

Copyright © 2014 by Walter Isaacson

This book is copyright under the Berne Convention. No reproduction without permission.

All rights reserved.

The right of Walter Isaacson to be identified as the author of this work has been asserted by him in accordance with sections 77 and 78 of the Copyright, Designs and Patents Act, 1988.

Simon & Schuster UK Ltd
1st Floor
222 Gray’s Inn Road
London WC1X 8HB

www.simonandschuster.co.uk

Simon & Schuster Australia, Sydney
Simon & Schuster India, New Delhi

A CIP catalogue record for this book is available from the British Library

Excerpts from “All Watched Over by Machines of Loving Grace” from The Pill Versus the Springhill Mine Disaster by Richard Brautigan. Copyright © 1968 by Richard Brautigan. Reproduced by permission of Houghton Mifflin Harcourt Publishing Company. All rights reserved.

Photo research and editing by Laura Wyss, Wyssphoto, Inc., with the assistance of Elizabeth Seramur, Amy Hikida, and Emily Vinson, and by Jonathan Cox.

Interior design by Ruth Lee-Mui

ISBN: 978-1-47113-879-9
Ebook: 978-1-47113-881-2

The author and publishers have made all reasonable efforts to contact copyright-holders for permission, and apologise for any omissions or errors in the form of credits given. Corrections may be made to future printings.

Printed and bound by CPI Group (UK) Ltd, Croydon, CR0 4YY

CONTENTS

Illustrated Timeline

Introduction

CHAPTER 1 Ada, Countess of Lovelace

CHAPTER 2 The Computer

CHAPTER 3 Programming

CHAPTER 4 The Transistor

CHAPTER 5 The Microchip

CHAPTER 6 Video Games

CHAPTER 7 The Internet

CHAPTER 8 The Personal Computer

CHAPTER 9 Software

CHAPTER 10 Online

CHAPTER 11 The Web

CHAPTER 12 Ada Forever

Acknowledgments

Notes

Photo Credits

Index


THE INNOVATORS


1843

Ada, Countess of Lovelace, publishes “Notes” on Babbage’s Analytical Engine.

1847

George Boole creates a system using algebra for logical reasoning.

1890

The census is tabulated with Herman Hollerith’s punch-card machines.


1931

Vannevar Bush devises the Differential Analyzer, an analog electromechanical computer.

1935

Tommy Flowers pioneers use of vacuum tubes as on-off switches in circuits.

1937


Alan Turing publishes “On Computable Numbers,” describing a universal computer.

Claude Shannon describes how circuits of switches can perform tasks of Boolean algebra.

Bell Labs’ George Stibitz proposes a calculator using an electric circuit.


Howard Aiken proposes construction of large digital computer and discovers parts of Babbage’s Difference Engine at Harvard.

John Vincent Atanasoff puts together concepts for an electronic computer during a long December night’s drive.

1938

William Hewlett and David Packard form company in Palo Alto garage.

1939

Atanasoff finishes model of electronic computer with mechanical storage drums.


Turing arrives at Bletchley Park to work on breaking German codes.

1941

Konrad Zuse completes Z3, a fully functional electromechanical programmable digital computer.


John Mauchly visits Atanasoff in Iowa, sees computer demonstrated.


1942

Atanasoff completes partly working computer with three hundred vacuum tubes, leaves for Navy.

1943

Colossus, a vacuum-tube computer to break German codes, is completed at Bletchley Park.

1944


Harvard Mark I goes into operation.

John von Neumann goes to Penn to work on ENIAC.

1945

Von Neumann writes “First Draft of a Report on the EDVAC” describing a stored-program computer.


Six women programmers of ENIAC are sent to Aberdeen for training.

Vannevar Bush publishes “As We May Think,” describing personal computer.

Bush publishes “Science, the Endless Frontier,” proposing government funding of academic and industrial research.

ENIAC is fully operational.

1947

Transistor invented at Bell Labs.

1950

Turing publishes article describing a test for artificial intelligence.


1952

Grace Hopper develops first computer compiler.

Von Neumann completes modern computer at the Institute for Advanced Study.

UNIVAC predicts Eisenhower election victory.

1954


Turing commits suicide.


Texas Instruments introduces silicon transistor and helps launch Regency radio.

1956

Shockley Semiconductor founded.

First artificial intelligence conference.

1957


Robert Noyce, Gordon Moore, and others form Fairchild Semiconductor.

Russia launches Sputnik.

1958

Advanced Research Projects Agency (ARPA) announced.


Jack Kilby demonstrates integrated circuit, or microchip.

1959

Noyce and Fairchild colleagues independently invent microchip.

1960

J. C. R. Licklider publishes “Man-Computer Symbiosis.”


Paul Baran at RAND devises packet switching.

1961

President Kennedy proposes sending man to the moon.

1962

MIT hackers create Spacewar game.

Licklider becomes founding director of ARPA’s Information Processing Techniques Office.

Doug Engelbart publishes “Augmenting Human Intellect.”

1963


Licklider proposes an “Intergalactic Computer Network.”

Engelbart and Bill English invent the mouse.


1964

Ken Kesey and the Merry Pranksters take bus trip across America.

1965

Ted Nelson publishes first article about “hypertext.”


Moore’s Law predicts microchips will double in power each year or so.

1966

Stewart Brand hosts Trips Festival with Ken Kesey.


Bob Taylor convinces ARPA chief Charles Herzfeld to fund ARPANET.

Donald Davies coins the term packet switching.

1967

ARPANET design discussions in Ann Arbor and Gatlinburg.

1968

Larry Roberts sends out request for bids to build the ARPANET’s IMPs.


Noyce and Moore form Intel, hire Andy Grove.

Brand publishes first Whole Earth Catalog.


Engelbart stages the Mother of All Demos with Brand’s help.

1969

First nodes of ARPANET installed.

1971

Don Hoefler begins column for Electronic News called “Silicon Valley USA.”

Demise party for Whole Earth Catalog.

Intel 4004 microprocessor unveiled.


Ray Tomlinson invents email.

1972

Nolan Bushnell creates Pong at Atari with Al Alcorn.

1973



Alan Kay helps to create the Alto at Xerox PARC.

Ethernet developed by Bob Metcalfe at Xerox PARC.

Community Memory shared terminal set up at Leopold’s Records, Berkeley.


Vint Cerf and Bob Kahn complete TCP/IP protocols for the Internet.

1974

Intel 8080 comes out.

1975

Altair personal computer from MITS appears.


Paul Allen and Bill Gates write BASIC for Altair, form Microsoft.

First meeting of Homebrew Computer Club.

Steve Jobs and Steve Wozniak launch the Apple I.

1977

The Apple II is released.


1978

First Internet Bulletin Board System.

1979

Usenet newsgroups invented.

Jobs visits Xerox PARC.

1980

IBM commissions Microsoft to develop an operating system for PC.

1981

Hayes modem marketed to home users.

1983


Microsoft announces Windows.

Richard Stallman begins developing GNU, a free operating system.


1984


Apple introduces Macintosh.

1985

Stewart Brand and Larry Brilliant launch The WELL.

CVC launches Q-Link, which becomes AOL.

1991


Linus Torvalds releases first version of Linux kernel.

Tim Berners-Lee announces World Wide Web.

1993

Marc Andreessen announces Mosaic browser.


Steve Case’s AOL offers direct access to the Internet.

1994

Justin Hall launches Web log and directory.

HotWired and Time Inc.’s Pathfinder become first major magazine publishers on Web.

1995

Ward Cunningham’s Wiki Wiki Web goes online.

1997


IBM’s Deep Blue beats Garry Kasparov in chess.

1998

Larry Page and Sergey Brin launch Google.

1999


Ev Williams launches Blogger.

2001

Jimmy Wales, with Larry Sanger, launches Wikipedia.

2011


IBM’s computer Watson wins Jeopardy!


INTRODUCTION

HOW THIS BOOK CAME TO BE

The computer and the Internet are among the most important inventions of our era, but few people know who created them. They were not conjured up in a garret or garage by solo inventors suitable to be singled out on magazine covers or put into a pantheon with Edison, Bell, and Morse. Instead, most of the innovations of the digital age were done collaboratively. There were a lot of fascinating people involved, some ingenious and a few even geniuses. This is the story of these pioneers, hackers, inventors, and entrepreneurs—who they were, how their minds worked, and what made them so creative. It’s also a narrative of how they collaborated and why their ability to work as teams made them even more creative.

The tale of their teamwork is important because we don’t often focus on how central that skill is to innovation. There are thousands of books celebrating people we biographers portray, or mythologize, as lone inventors. I’ve produced a few myself. Search the phrase “the man who invented” on Amazon and you get 1,860 book results. But we have far fewer tales of collaborative creativity, which is actually more important in understanding how today’s technology revolution was fashioned. It can also be more interesting.

We talk so much about innovation these days that it has become a buzzword, drained of clear meaning. So in this book I set out to report on how innovation actually happens in the real world. How did the most imaginative innovators of our time turn disruptive ideas into realities? I focus on a dozen or so of the most significant breakthroughs of the digital age and the people who made them. What ingredients produced their creative leaps? What skills proved most useful? How did they lead and collaborate? Why did some succeed and others fail?

I also explore the social and cultural forces that provide the atmosphere for innovation. For the birth of the digital age, this included a research ecosystem that was nurtured by government spending and managed by a military-industrial-academic collaboration. Intersecting with that was a loose alliance of community organizers, communal-minded hippies, do-it-yourself hobbyists, and homebrew hackers, most of whom were suspicious of centralized authority.

Histories can be written with a different emphasis on any of these factors. An example is the invention of the Harvard/IBM Mark I, the first big electromechanical computer. One of its programmers, Grace Hopper, wrote a history that focused on its primary creator, Howard Aiken. IBM countered with a history that featured its teams of faceless engineers who contributed the incremental innovations, from counters to card feeders, that went into the machine.

Likewise, what emphasis should be put on great individuals versus on cultural currents has long been a matter of dispute; in the mid-nineteenth century, Thomas Carlyle declared that “the history of the world is but the biography of great men,” and Herbert Spencer responded with a theory that emphasized the role of societal forces. Academics and participants often view this balance differently. “As a professor, I tended to think of history as run by impersonal forces,” Henry Kissinger told reporters during one of his Middle East shuttle missions in the 1970s. “But when you see it in practice, you see the difference personalities make.”1 When it comes to digital-age innovation, as with Middle East peacemaking, a variety of personal and cultural forces all come into play, and in this book I sought to weave them together.

The Internet was originally built to facilitate collaboration. By contrast, personal computers, especially those meant to be used at home, were devised as tools for individual creativity. For more than a decade, beginning in the early 1970s, the development of networks and that of home computers proceeded separately from one another. They finally began coming together in the late 1980s with the advent of modems, online services, and the Web. Just as combining the steam engine with ingenious machinery drove the Industrial Revolution, the combination of the computer and distributed networks led to a digital revolution that allowed anyone to create, disseminate, and access any information anywhere.

Historians of science are sometimes wary about calling periods of great change revolutions, because they prefer to view progress as evolutionary. “There was no such thing as the Scientific Revolution, and this is a book about it,” is the wry opening sentence of the Harvard professor Steven Shapin’s book on that period. One method that Shapin used to escape his half-joking contradiction is to note how the key players of the period “vigorously expressed the view” that they were part of a revolution. “Our sense of radical change afoot comes substantially from them.”2

Likewise, most of us today share a sense that the digital advances of the past half century are transforming, perhaps even revolutionizing the way we live. I can recall the excitement that each new breakthrough engendered. My father and uncles were electrical engineers, and like many of the characters in this book I grew up with a basement workshop that had circuit boards to be soldered, radios to be opened, tubes to be tested, and boxes of transistors and resistors to be sorted and deployed. As an electronics geek who loved Heathkits and ham radios (WA5JTP), I can remember when vacuum tubes gave way to transistors. At college I learned programming using punch cards and recall when the agony of batch processing was replaced by the ecstasy of hands-on interaction. In the 1980s I thrilled to the static and screech that modems made when they opened for you the weirdly magical realm of online services and bulletin boards, and in the early 1990s I helped to run a digital division at Time and Time Warner that launched new Web and broadband Internet services. As Wordsworth said of the enthusiasts who were present at the beginning of the French Revolution, “Bliss was it in that dawn to be alive.”

I began work on this book more than a decade ago. It grew out of my fascination with the digital-age advances I had witnessed and also from my biography of Benjamin Franklin, who was an innovator, inventor, publisher, postal service pioneer, and all-around information networker and entrepreneur. I wanted to step away from doing biographies, which tend to emphasize the role of singular individuals, and once again do a book like The Wise Men, which I had coauthored with a colleague about the creative teamwork of six friends who shaped America’s cold war policies. My initial plan was to focus on the teams that invented the Internet. But when I interviewed Bill Gates, he convinced me that the simultaneous emergence of the Internet and the personal computer made for a richer tale. I put this book on hold early in 2009, when I began working on a biography of Steve Jobs. But his story reinforced my interest in how the development of the Internet and computers intertwined, so as soon as I finished that book, I went back to work on this tale of digital-age innovators.

The protocols of the Internet were devised by peer collaboration, and the resulting system seemed to have embedded in its genetic code a propensity to facilitate such collaboration. The power to create and transmit information was fully distributed to each of the nodes, and any attempt to impose controls or a hierarchy could be routed around. Without falling into the teleological fallacy of ascribing intentions or a personality to technology, it’s fair to say that a system of open networks connected to individually controlled computers tended, as the printing press did, to wrest control over the distribution of information from gatekeepers, central authorities, and institutions that employed scriveners and scribes. It became easier for ordinary folks to create and share content.

The collaboration that created the digital age was not just among peers but also between generations. Ideas were handed off from one cohort of innovators to the next. Another theme that emerged from my research was that users repeatedly commandeered digital innovations to create communications and social networking tools. I also became interested in how the quest for artificial intelligence—machines that think on their own—has consistently proved less fruitful than creating ways to forge a partnership or symbiosis between people and machines. In other words, the collaborative creativity that marked the digital age included collaboration between humans and machines.

Finally, I was struck by how the truest creativity of the digital age came from those who were able to connect the arts and sciences. They believed that beauty mattered. “I always thought of myself as a humanities person as a kid, but I liked electronics,” Jobs told me when I embarked on his biography. “Then I read something that one of my heroes, Edwin Land of Polaroid, said about the importance of people who could stand at the intersection of humanities and sciences, and I decided that’s what I wanted to do.” The people who were comfortable at this humanities-technology intersection helped to create the human-machine symbiosis that is at the core of this story.

Like many aspects of the digital age, this idea that innovation resides where art and science connect is not new. Leonardo da Vinci was the exemplar of the creativity that flourishes when the humanities and sciences interact. When Einstein was stymied while working out General Relativity, he would pull out his violin and play Mozart until he could reconnect to what he called the harmony of the spheres.

When it comes to computers, there is one other historical figure, not as well known, who embodied the combination of the arts and sciences. Like her famous father, she understood the romance of poetry. Unlike him, she also saw the romance of math and machinery. And that is where our story begins.


Ada, Countess of Lovelace (1815–52), painted by Margaret Sarah Carpenter in 1836.


Lord Byron (1788–1824), Ada’s father, in Albanian dress, painted by Thomas Phillips in 1835.


Charles Babbage (1791–1871), photograph taken circa 1837.


CHAPTER ONE

ADA, COUNTESS OF LOVELACE

POETICAL SCIENCE

In May 1833, when she was seventeen, Ada Byron was among the young women presented at the British royal court. Family members had worried about how she would acquit herself, given her high-strung and independent nature, but she ended up behaving, her mother reported, “tolerably well.” Among those Ada met that evening were the Duke of Wellington, whose straightforward manner she admired, and the seventy-nine-year-old French ambassador Talleyrand, who struck her as “an old monkey.”1

The only legitimate child of the poet Lord Byron, Ada had inherited her father’s romantic spirit, a trait that her mother tried to temper by having her tutored in mathematics. The combination produced in Ada a love for what she took to calling “poetical science,” which linked her rebellious imagination to her enchantment with numbers. For many, including her father, the rarefied sensibilities of the Romantic era clashed with the techno-excitement of the Industrial Revolution. But Ada was comfortable at the intersection of both eras.

So it was not surprising that her debut at court, despite the glamour of the occasion, made less impression on her than her attendance a few weeks later at another majestic event of the London season, at which she met Charles Babbage, a forty-one-year-old widowed science and math eminence who had established himself as a luminary on London’s social circuit. “Ada was more pleased with a party she was at on Wednesday than with any of the assemblages in the grand monde,” her mother reported to a friend. “She met there a few scientific people—amongst them Babbage, with whom she was delighted.”2

Babbage’s galvanizing weekly salons, which included up to three hundred guests, brought together lords in swallow-tail coats and ladies in brocade gowns with writers, industrialists, poets, actors, statesmen, explorers, botanists, and other “scientists,” a word that Babbage’s friends had recently coined.3 By bringing scientific scholars into this exalted realm, said one noted geologist, Babbage “successfully asserted the rank in society due to science.”4

The evenings featured dancing, readings, games, and lectures accompanied by an assortment of seafood, meat, fowl, exotic drinks, and iced desserts. The ladies staged tableaux vivants, in which they dressed in costume to re-create famous paintings. Astronomers set up telescopes, researchers displayed their electrical and magnetic contrivances, and Babbage allowed guests to play with his mechanical dolls. The centerpiece of the evenings—and one of Babbage’s many motives for hosting them—was his demonstration of a model portion of his Difference Engine, a mammoth mechanical calculating contraption that he was building in a fireproof structure adjacent to his home. Babbage would display the model with great drama, cranking its arm as it calculated a sequence of numbers and, just as the audience began to get bored, showed how the pattern could suddenly change based on instructions that had been coded into the machine.5 Those who were especially intrigued would be invited through the yard to the former stables, where the complete machine was being constructed.

Babbage’s Difference Engine, which could solve polynomial equations, impressed people in different ways. The Duke of Wellington commented that it could be useful in analyzing the variables a general might face before going into battle.6 Ada’s mother, Lady Byron, marveled that it was a “thinking machine.” As for Ada, who would later famously note that machines could never truly think, a friend who went with them to the demonstration reported, “Miss Byron, young as she was, understood its working, and saw the great beauty of the invention.”7

Ada’s love of both poetry and math primed her to see beauty in a computing machine. She was an exemplar of the era of Romantic science, which was characterized by a lyrical enthusiasm for invention and discovery. It was a period that brought “imaginative intensity and excitement to scientific work,” Richard Holmes wrote in The Age of Wonder. “It was driven by a common ideal of intense, even reckless, personal commitment to discovery.”8

In short, it was a time not unlike our own. The advances of the Industrial Revolution, including the steam engine, mechanical loom, and telegraph, transformed the nineteenth century in much the same way that the advances of the Digital Revolution—the computer, microchip, and Internet—have transformed our own. At the heart of both eras were innovators who combined imagination and passion with wondrous technology, a mix that produced Ada’s poetical science and what the twentieth-century poet Richard Brautigan would call “machines of loving grace.”

LORD BYRON

Ada inherited her poetic and insubordinate temperament from her father, but he was not the source of her love for machinery. He was, in fact, a Luddite. In his maiden speech in the House of Lords, given in February 1812 when he was twenty-four, Byron defended the followers of Ned Ludd, who were rampaging against mechanical weaving machines. With sarcastic scorn Byron mocked the mill owners of Nottingham, who were pushing a bill that would make destroying automated looms a crime punishable by death. “These machines were to them an advantage, inasmuch as they superseded the necessity of employing a number of workmen, who were left in consequence to starve,” Byron declared. “The rejected workmen, in the blindness of their ignorance, instead of rejoicing at these improvements in arts so beneficial to mankind, conceived themselves to be sacrificed to improvements in mechanism.”

Two weeks later, Byron published the first two cantos of his epic poem Childe Harold’s Pilgrimage, a romanticized account of his wanderings through Portugal, Malta, and Greece, and, as he later remarked, “awoke one morning and found myself famous.” Beautiful, seductive, troubled, brooding, and sexually adventurous, he was living the life of a Byronic hero while creating the archetype in his poetry. He became the toast of literary London and was feted at three parties each day, most memorably a lavish morning dance hosted by Lady Caroline Lamb.

Lady Caroline, though married to a politically powerful aristocrat who was later prime minister, fell madly in love with Byron. He thought she was “too thin,” yet she had an unconventional sexual ambiguity (she liked to dress as a page boy) that he found enticing. They had a turbulent affair, and after it ended she stalked him obsessively. She famously declared him to be “mad, bad, and dangerous to know,” which he was. So was she.

At Lady Caroline’s party, Lord Byron had also noticed a reserved young woman who was, he recalled, “more simply dressed.” Annabella Milbanke, nineteen, was from a wealthy and multi-titled family. The night before the party, she had read Childe Harold and had mixed feelings. “He is rather too much of a mannerist,” she wrote. “He excels most in the delineation of deep feeling.” Upon seeing him across the room at the party, her feelings were conflicted, dangerously so. “I did not seek an introduction to him, for all the women were absurdly courting him, and trying to deserve the lash of his Satire,” she wrote her mother. “I am not desirous of a place in his lays. I made no offering at the shrine of Childe Harold, though I shall not refuse the acquaintance if it comes my way.”9

That acquaintance, as it turned out, did come her way. After he was introduced to her formally, Byron decided that she might make a suitable wife. It was, for him, a rare display of reason over romanticism. Rather than arousing his passions, she seemed to be the sort of woman who might tame those passions and protect him from his excesses—as well as help pay off his burdensome debts. He proposed to her halfheartedly by letter. She sensibly declined. He wandered off to far less appropriate liaisons, including one with his half sister, Augusta Leigh. But after a year, Annabella rekindled the courtship. Byron, falling more deeply in debt while grasping for a way to curb his enthusiasms, saw the rationale if not the romance in the possible relationship. “Nothing but marriage and a speedy one can save me,” he admitted to Annabella’s aunt. “If your niece is obtainable, I should prefer her; if not, the very first woman who does not look as if she would spit in my face.”10 There were times when Lord Byron was not a romantic. He and Annabella were married in January 1815.

Byron initiated the marriage in his Byronic fashion. “Had Lady Byron on the sofa before dinner,” he wrote about his wedding day.11 Their relationship was still active when they visited his half sister two months later, because around then Annabella got pregnant. However, during the visit she began to suspect that her husband’s friendship with Augusta went beyond the fraternal, especially after he lay on a sofa and asked them both to take turns kissing him.12 The marriage started to unravel.

Annabella had been tutored in mathematics, which amused Lord Byron, and during their courtship he had joked about his own disdain for the exactitude of numbers. “I know that two and two make four—and should be glad to prove it too if I could,” he wrote, “though I must say if by any sort of process I could convert two and two into five it would give me much greater pleasure.” Early on, he affectionately dubbed her the “Princess of Parallelograms.” But when the marriage began to sour, he refined that mathematical image: “We are two parallel lines prolonged to infinity side by side but never to meet.” Later, in the first canto of his epic poem Don Juan, he would mock her: “Her favourite science was the mathematical. . . . She was a walking calculation.”

The marriage was not saved by the birth of their daughter on December 10, 1815. She was named Augusta Ada Byron, her first name that of Byron’s too-beloved half sister. When Lady Byron became convinced of her husband’s perfidy, she thereafter called her daughter by her middle name. Five weeks later she packed her belongings into a carriage and fled to her parents’ country home with the infant Ada.

Ada never saw her father again. Lord Byron left the country that April after Lady Byron, in letters so calculating that she earned his sobriquet of “Mathematical Medea,” threatened to expose his alleged incestuous and homosexual affairs as a way to secure a separation agreement that gave her custody of their child.13

The opening of canto 3 of Childe Harold, written a few weeks later, invokes Ada as his muse:

Is thy face like thy mother’s, my fair child!
Ada! sole daughter of my house and of my heart?
When last I saw thy young blue eyes they smiled,
And then we parted.

Byron wrote these lines in a villa by Lake Geneva, where he was staying with the poet Percy Bysshe Shelley and Shelley’s future wife, Mary. It rained relentlessly. Trapped inside for days, Byron suggested they write horror stories. He produced a fragment of a tale about a vampire, one of the first literary efforts on that subject, but Mary’s story was the one that became a classic: Frankenstein, or The Modern Prometheus. Playing on the ancient Greek myth of the hero who crafted a living man out of clay and snatched fire from the gods for human use, Frankenstein was the story of a scientist who galvanized a man-made assemblage into a thinking human. It was a cautionary tale about technology and science. It also raised the question that would become associated with Ada: Can man-made machines ever truly think?

The third canto of Childe Harold ends with Byron’s prediction that Annabella would try to keep Ada from knowing about her father, and so it happened. There was a portrait of Lord Byron at their house, but Lady Byron kept it securely veiled, and Ada never saw it until she was twenty.14

Lord Byron, by contrast, kept a picture of Ada on his desk wherever he wandered, and his letters often requested news or portraits of her. When she was seven, he wrote to Augusta, “I wish you would obtain from Lady B some accounts of Ada’s disposition. . . . Is the girl imaginative? . . . Is she passionate? I hope that the Gods have made her anything save poetical—it is enough to have one such fool in the family.” Lady Byron reported that Ada had an imagination that was “chiefly exercised in connection with her mechanical ingenuity.”15

Around that time, Byron, who had been wandering through Italy, writing and having an assortment of affairs, grew bored and decided to enlist in the Greek struggle for independence from the Ottoman Empire. He sailed for Missolonghi, where he took command of part of the rebel army and prepared to attack a Turkish fortress. But before he could engage in battle, he caught a violent cold that was made worse by his doctor’s decision to treat him by bloodletting. On April 19, 1824, he died. According to his valet, among his final words were “Oh, my poor dear child!—my dear Ada! My God, could I have seen her! Give her my blessing.”16

ADA

Lady Byron wanted to make sure that Ada did not turn out like her father, and part of her strategy was to have the girl rigorously study math, as if it were an antidote to poetic imagination. When Ada, at age five, showed a preference for geography, Lady Byron ordered that the subject be replaced by additional arithmetic lessons, and her governess soon proudly reported, “She adds up sums of five or six rows of figures with accuracy.” Despite these efforts, Ada developed some of her father’s propensities. She had an affair as a young teenager with one of her tutors, and when they were caught and the tutor banished, she tried to run away from home to be with him. In addition, she had mood swings that took her from feelings of grandiosity to despair, and she suffered various maladies both physical and psychological.

Ada accepted her mother’s conviction that an immersion in math could help tame her Byronic tendencies. After her dangerous liaison with her tutor, and inspired by Babbage’s Difference Engine, she decided on her own, at eighteen, to begin a new series of lessons. “I must cease to think of living for pleasure or self-gratification,” she wrote her new tutor. “I find that nothing but very close and intense application to subjects of a scientific nature now seems to keep my imagination from running wild. . . . It appears to me that the first thing is to go through a course of Mathematics.” He agreed with the prescription: “You are right in supposing that your chief resource and safeguard at the present is in a course of severe intellectual study. For this purpose there is no subject to be compared to Mathematics.”17 He prescribed Euclidean geometry, followed by a dose of trigonometry and algebra. That should cure anyone, they both thought, from having too many artistic or romantic passions.

Her interest in technology was stoked when her mother took her on a trip through the British industrial midlands to see the new factories and machinery. Ada was particularly impressed with an automated weaving loom that used punch cards to direct the creation of the desired fabric patterns, and she drew a sketch of how it worked. Her father’s famous speech in the House of Lords had defended the Luddites who had smashed such looms because of their fear of what technology might inflict on humanity. But Ada waxed poetical about them and saw the connection with what would someday be called computers. “This Machinery reminds me of Babbage and his gem of all mechanism,” she wrote.18

Ada’s interest in applied science was further stimulated when she met one of Britain’s few noted female mathematicians and scientists, Mary Somerville. Somerville had just finished writing one of her great works, On the Connexion of the Physical Sciences, in which she tied together developments in astronomy, optics, electricity, chemistry, physics, botany, and geology.1 Emblematic of the time, it provided a unified sense of the extraordinary endeavors of discovery that were under way. She proclaimed in her opening sentence, “The progress of modern science, especially within the last five years, has been remarkable for a tendency to simplify the laws of nature and to unite detached branches by general principles.”

Somerville became a friend, teacher, inspiration, and mentor to Ada. She met with Ada regularly, sent her math books, devised problems for her to solve, and patiently explained the correct answers. She was also a good friend of Babbage’s, and during the fall of 1834 she and Ada would often visit his Saturday-evening salons. Somerville’s son, Woronzow Greig, aided Ada’s efforts to settle down by suggesting to one of his former classmates at Cambridge that she would make a suitable—or at least interesting—wife.

William King was socially prominent, financially secure, quietly intelligent, and as taciturn as Ada was excitable. Like her, he was a student of science, but his focus was more practical and less poetic: his primary interests were crop rotation theories and advances in livestock breeding techniques. He proposed marriage within a few weeks of meeting Ada, and she accepted. Her mother, with motives that only a psychiatrist could fathom, decided it was imperative to tell William about Ada’s attempted elopement with her tutor. Despite this news, William was willing to proceed with the wedding, which was held in July 1835. “Gracious God, who has so mercifully given you an opportunity of turning aside from the dangerous paths, has given you a friend and guardian,” Lady Byron wrote her daughter, adding that she should use this opportunity to “bid adieu” to all of her “peculiarities, caprices, and self-seeking.”19

The marriage was a match made in rational calculus. For Ada, it offered the chance to adopt a more steady and grounded life. More important, it allowed her to escape dependence on her domineering mother. For William, it meant having a fascinating, eccentric wife from a wealthy and famous family.

Lady Byron’s first cousin Viscount Melbourne (who had the misfortune of having been married to Lady Caroline Lamb, by then deceased) was the prime minister, and he arranged that, in Queen Victoria’s coronation list of honors, William would become the Earl of Lovelace. His wife thus became Ada, Countess of Lovelace. She is therefore properly referred to as Ada or Lady Lovelace, though she is now commonly known as Ada Lovelace.

That Christmas of 1835, Ada received from her mother the family’s life-size portrait of her father. Painted by Thomas Phillips, it showed Lord Byron in romantic profile, gazing at the horizon, dressed in traditional Albanian costume featuring a red velvet jacket, ceremonial sword, and headdress. For years it had hung over Ada’s grandparents’ mantelpiece, but it had been veiled by a green cloth from the day her parents had separated. Now she was trusted not only to see it but to possess it, along with his inkstand and pen.

Her mother did something even more surprising when the Lovelaces’ first child, a son, was born a few months later. Despite her disdain for her late husband’s memory, she agreed that Ada should name the boy Byron, which she did. The following year Ada had a daughter, whom she dutifully named Annabella, after her mother. Ada then came down with yet another mysterious malady, which kept her bedridden for months. She recovered well enough to have a third child, a son named Ralph, but her health remained fragile. She had digestive and respiratory problems that were compounded by being treated with laudanum, morphine, and other forms of opium, which led to mood swings and occasional delusions.

Ada was further unsettled by the eruption of a personal drama that was bizarre even by the standards of the Byron family. It involved Medora Leigh, the daughter of Byron’s half sister and occasional lover. According to widely accepted rumors, Medora was Byron’s daughter. She seemed determined to show that darkness ran in the family. She had an affair with a sister’s husband, then ran off with him to France and had two illegitimate children. In a fit of self-righteousness, Lady Byron went to France to rescue Medora, then revealed to Ada the story of her father’s incest.

This “most strange and dreadful history” did not seem to surprise Ada. “I am not in the least astonished,” she wrote her mother. “You merely confirm what I have for years and years felt scarcely a doubt about.”20 Rather than being outraged, she seemed oddly energized by the news. She declared that she could relate to her father’s defiance of authority. Referring to his “misused genius,” she wrote to her mother, “If he has transmitted to me any portion of that genius, I would use it to bring out great truths and principles. I think he has bequeathed this task to me. I have this feeling strongly, and there is a pleasure attending it.”21

Once again Ada took up the study of math in order to settle herself, and she tried to convince Babbage to become her tutor. “I have a peculiar way of learning, and I think it must be a peculiar man to teach me successfully,” she wrote him. Whether due to her opiates or her breeding or both, she developed a somewhat outsize opinion of her own talents and began to describe herself as a genius. In her letter to Babbage, she wrote, “Do not reckon me conceited, . . . but I believe I have the power of going just as far as I like in such pursuits, and where there is so decided a taste, I should almost say a passion, as I have for them, I question if there is not always some portion of natural genius even.”22

Babbage deflected Ada’s request, which was probably wise. It preserved their friendship for an even more important collaboration, and she was able to secure a first-rate math tutor instead: Augustus De Morgan, a patient gentleman who was a pioneer in the field of symbolic logic. He had propounded a concept that Ada would one day employ with great significance, which was that an algebraic equation could apply to things other than numbers. The relations among symbols (for example, that a + b = b + a) could be part of a logic that applied to things that were not numerical.
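To make De Morgan’s point concrete, here is a minimal sketch (my own illustration, not anything from the book or from De Morgan’s notation): the same algebraic relation, a + b = b + a, can be checked mechanically over objects that are not numbers at all, which is the sense in which symbolic logic detaches algebra from arithmetic.

```python
# Illustrative sketch only: the commutative relation a + b = b + a, tested over
# numbers and over non-numerical objects, in the spirit of De Morgan's idea that
# algebraic relations can govern things other than quantities.

def commutes(a, b, op):
    """Return True if op(a, b) equals op(b, a)."""
    return op(a, b) == op(b, a)

print(commutes(2, 3, lambda x, y: x + y))                  # numbers: True
print(commutes({"ada"}, {"babbage"}, lambda x, y: x | y))   # set union: True
print(commutes(True, False, lambda x, y: x or y))           # Boolean "or": True
print(commutes("ada", "byron", lambda x, y: x + y))         # string concatenation: False
```

The last line is the useful contrast: whether the relation holds depends on the operation and the objects, not on their being numerical.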

Ada was never the great mathematician that her canonizers claim, but she was an eager pupil, able to grasp most of the basic concepts of calculus, and with her artistic sensibility she liked to visualize the changing curves and trajectories that the equations were describing. De Morgan encouraged her to focus on the rules for working through equations, but she was more eager to discuss the underlying concepts. Likewise with geometry, she often asked for visual ways to picture problems, such as how the intersections of circles in a sphere divide it into various shapes.

Ada’s ability to appreciate the beauty of mathematics is a gift that eludes many people, including some who think of themselves as intellectual. She realized that math was a lovely language, one that describes the harmonies of the universe and can be poetic at times. Despite her mother’s efforts, she remained her father’s daughter, with a poetic sensibility that allowed her to view an equation as a brushstroke that painted an aspect of nature’s physical splendor, just as she could visualize the “wine-dark sea” or a woman who “walks in beauty, like the night.” But math’s appeal went even deeper; it was spiritual. Math “constitutes the language through which alone we can adequately express the great facts of the natural world,” she said, and it allows us to portray the “changes of mutual relationship” that unfold in creation. It is “the instrument through which the weak mind of man can most effectually read his Creator’s works.”


This ability to apply imagination to science characterized the Industrial Revolution as well as the computer revolution, for which Ada was to become a patron saint. She was able, as she told Babbage, to understand the connection between poetry and analysis in ways that transcended her father’s talents. “I do not believe that my father was (or ever could have been) such a Poet as I shall be an Analyst; for with me the two go together indissolubly,” she wrote.23

Her reengagement with math, she told her mother, spurred her creativity and led to an “immense development of imagination, so much so that I feel no doubt if I continue my studies I shall in due time be a Poet.”24 The whole concept of imagination, especially as it was applied to technology, intrigued her. “What is imagination?” she asked in an 1841 essay. “It is the Combining faculty. It brings together things, facts, ideas, conceptions in new, original, endless, ever-varying combinations. . . . It is that which penetrates into the unseen worlds around us, the worlds of Science.”25

By then Ada believed she possessed special, even supernatural abilities, what she called “an intuitive perception of hidden things.” Her exalted view of her talents led her to pursue aspirations that were unusual for an aristocratic woman and mother in the early Victorian age. “I believe myself to possess a most singular combination of qualities exactly fitted to make me pre-eminently a discoverer of the hidden realities of nature,” she explained in a letter to her mother in 1841. “I can throw rays from every quarter of the universe into one vast focus.”26

It was while in this frame of mind that she decided to engage again with Charles Babbage, whose salons she had first attended eight years earlier.

CHARLES BABBAGE AND HIS ENGINES

From an early age, Charles Babbage had been interested in machines that could perform human tasks. When he was a child, his mother took him to many of the exhibition halls and museums of wonder that were springing up in London in the early 1800s. At one in Hanover Square, a proprietor aptly named Merlin invited him up to the attic workshop where there was a variety of mechanical dolls, known as “automata.” One was a silver female dancer, about a foot tall, whose arms moved with grace and who held in her hand a bird that could wag its tail, flap its wings, and open its beak. The Silver Lady’s ability to display feelings and personality captured the boy’s fancy. “Her eyes were full of imagination,” he recalled. Years later he discovered the Silver Lady at a bankruptcy auction and bought it. It served as an amusement at his evening salons where he celebrated the wonders of technology.

At Cambridge Babbage became friends with a group, including John Herschel and George Peacock, who were disappointed by the way math was taught there. They formed a club, called the Analytical Society, which campaigned to get the university to abandon the calculus notation devised by its alumnus Newton, which relied on dots, and replace it with the one devised by Leibniz, which used dx and dy to represent infinitesimal increments and was thus known as “d” notation. Babbage titled their manifesto “The Principles of pure D-ism in opposition to the Dot-age of the University.”27 He was prickly, but he had a good sense of humor.

One day Babbage was in the Analytical Society’s room working on a table of logarithms that was littered with discrepancies. Herschel asked him what he was thinking. “I wish to God these calculations had been executed by steam,” Babbage answered. To this idea of a mechanical method for tabulating logarithms Herschel replied, “It is quite possible.”28 In 1821 Babbage turned his attention to building such a machine.

Over the years, many had fiddled with making calculating contraptions. In the 1640s, Blaise Pascal, the French mathematician and philosopher, created a mechanical calculator to reduce the drudgery of his father’s work as a tax supervisor. It had spoked metal wheels with the digits 0 through 9 on their circumference. To add or subtract numbers, the operator used a stylus to dial a number, as if using a rotary phone, then dialed in the next number; an armature carried or borrowed a 1 when necessary. It became the first calculator to be patented and sold commercially.

Thirty years later, Gottfried Leibniz, the German mathematician and philosopher, tried to improve upon Pascal’s contraption with a “stepped reckoner” that had the capacity to multiply and divide. It had a hand-cranked cylinder with a set of teeth that meshed with counting wheels. But Leibniz ran into a problem that would be a recurring theme of the digital age. Unlike Pascal, an adroit engineer who could combine scientific theories with mechanical genius, Leibniz had little engineering skill and did not surround himself with those who did. So, like many great theorists who lacked practical collaborators, he was unable to produce reliably working versions of his device. Nevertheless, his core concept, known as the Leibniz wheel, would influence calculator design through the time of Babbage.

Babbage knew of the devices of Pascal and Leibniz, but he was trying to do something more complex. He wanted to construct a mechanical method for tabulating logarithms, sines, cosines, and tangents.2 To do so, he adapted an idea that the French mathematician Gaspard de Prony came up with in the 1790s. In order to create logarithm and trigonometry tables, de Prony broke down the operations into very simple steps that involved only addition and subtraction. Then he provided easy instructions so that scores of human laborers, who knew little math, could perform these simple tasks and pass along their answers to the next set of laborers. In other words, he created an assembly line, the great industrial-age innovation that was memorably analyzed by Adam Smith in his description of the division of labor in a pin-making factory. After a trip to Paris in which he heard of de Prony’s method, Babbage wrote, “I conceived all of a sudden the idea of applying the same method to the immense work with which I had been burdened, and to manufacture logarithms as one manufactures pins.”29

Even complex mathematical tasks, Babbage realized, could be broken into steps that came down to calculating “finite differences” through simple adding and subtracting. For example, in order to make a table of squares—1², 2², 3², 4², and so on—you could list the initial numbers in such a sequence: 1, 4, 9, 16. . . . This would be column A. Beside it, in column B, you could figure out the differences between each of these numbers, in this case 3, 5, 7, 9. . . . Column C would list the difference between each of column B’s numbers, which is 2, 2, 2, 2. . . . Once the process was thus simplified, it could be reversed and the tasks parceled out to untutored laborers. One would be in charge of adding 2 to the last number in column B, and then would hand that result to another person, who would add that result to the last number in column A, thus generating the next number in the sequence of squares.
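A short sketch of that arithmetic (my own illustration of the method, not a description of Babbage’s hardware): the whole table of squares can be generated using nothing but the repeated additions described above.

```python
# Minimal sketch of the method of finite differences for squares: column C holds
# the constant second difference (2), column B the running first difference, and
# column A the current value of the function. Only addition is ever performed.

def squares_by_differences(n):
    """Return the first n squares, computed purely by repeated addition."""
    column_a, column_b, column_c = 1, 3, 2   # 1^2, first difference (4 - 1), constant
    table = []
    for _ in range(n):
        table.append(column_a)
        column_a += column_b   # one "labourer" adds column B into column A
        column_b += column_c   # another adds the constant column C into column B
    return table

print(squares_by_differences(8))   # [1, 4, 9, 16, 25, 36, 49, 64]
```

The same scheme works for any polynomial, since some order of difference eventually becomes constant, which is exactly what the Difference Engine exploited.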


Replica of the Difference Engine.


Replica of the Analytical Engine.


The Jacquard loom.


Silk portrait of Joseph-Marie Jacquard (1752–1834) woven by a Jacquard loom.

Babbage devised a way to mechanize this process, and he named it the Difference Engine. It could tabulate any polynomial function and provide a digital method for approximating the solution to differential equations.

How did it work? The Difference Engine used vertical shafts with disks that could be turned to any numeral. These were attached to cogs that could be cranked in order to add that numeral to (or subtract it from) a disk on an adjacent shaft. The contraption could even “store” the interim results on another shaft. The main complexity was how to “carry” or “borrow” when necessary, as we do with pencils when we calculate 36 + 19 or 42 – 17. Drawing on Pascal’s devices, Babbage came up with a few ingenious contrivances that allowed the cogs and shafts to handle the calculation.

The machine was, in concept, a true marvel. Babbage even figured out a way to get it to create a table of prime numbers up to 10 million. The British government was impressed, at least initially. In 1823 it gave him seed money of £1,700 and would eventually sink more than £17,000, twice the cost of a warship, into the device during the decade Babbage spent trying to build it. But the project ran into two problems. First, Babbage and his hired engineer did not quite have the skills to get the device working. Second, he began dreaming up something better.

Babbage’s new idea, which he conceived in 1834, was a general-purpose computer that could carry out a variety of different operations based on programming instructions given to it. It could perform one task, then be made to switch and perform another. It could even tell itself to switch tasks—or alter its “pattern of action,” as Babbage explained—based on its own interim calculations. Babbage named this proposed machine the Analytical Engine. He was one hundred years ahead of his time.

The Analytical Engine was the product of what Ada Lovelace, in her essay on imagination, had called “the Combining Faculty.” Babbage had combined innovations that had cropped up in other fields, a trick of many great inventors. He had originally used a metal drum that was studded with spikes to control how the shafts would turn. But then he studied, as Ada had, the automated loom invented in 1801 by a Frenchman named Joseph-Marie Jacquard, which transformed the silk-weaving industry. Looms create a pattern by using hooks to lift selected warp threads, and then a rod pushes a woof thread underneath. Jacquard invented a method of using cards with holes punched in them to control this process. The holes determined which hooks and rods would be activated for each pass of the weave, thus automating the creation of intricate patterns. Each time the shuttle was thrown to create a new pass of the thread, a new punch card would come into play.

On June 30, 1836, Babbage made an entry into what he called his “Scribbling Books” that would represent a milestone in the prehistory of computers: “Suggested Jacquard’s loom as a substitute for the drums.”30 Using punch cards rather than steel drums meant that an unlimited number of instructions could be input. In addition, the sequence of tasks could be modified, thus making it easier to devise a general-purpose machine that was versatile and reprogrammable.
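As a rough illustration of why that entry mattered (a hypothetical sketch, not a model of the Analytical Engine’s actual card format or mechanism), the simple machine below performs entirely different tasks depending only on the deck of instruction “cards” fed to it; reprogramming means swapping decks, not rebuilding the machine.

```python
# Hypothetical sketch: a single "machine" driven by a deck of instruction cards.
# Changing the deck changes the computation; the machine itself never changes.

def run(cards, accumulator=0):
    """Execute a deck of (operation, operand) cards against one accumulator register."""
    for operation, operand in cards:
        if operation == "ADD":
            accumulator += operand
        elif operation == "SUB":
            accumulator -= operand
        elif operation == "MUL":
            accumulator *= operand
    return accumulator

deck_sum  = [("ADD", 3), ("ADD", 5), ("ADD", 7)]              # a running sum
deck_poly = [("ADD", 2), ("MUL", 2), ("MUL", 2), ("MUL", 2)]  # 2 x 2^3 = 16

print(run(deck_sum))    # 15
print(run(deck_poly))   # 16
```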

Babbage bought a portrait of Jacquard and began to display it at his salons. It showed the inventor sitting in an armchair, a loom in the background, holding a pair of calipers over rectangular punch cards. Babbage amused his guests by asking them to guess what it was. Most thought it a superb engraving. He would then reveal that it was actually a finely woven silk tapestry, with twenty-four thousand rows of threads, each controlled by a different punch card. When Prince Albert, the husband of Queen Victoria, came to one of Babbage’s salons, he asked Babbage why he found the tapestry so interesting. Babbage replied, “It will greatly assist in explaining the nature of my calculating machine, the Analytical Engine.”31

Few people, however, saw the beauty of Babbage’s proposed new machine, and the British government had no inclination to fund it. Try as he might, Babbage could generate little notice in either the popular press or scientific journals.

But he did find one believer. Ada Lovelace fully appreciated the concept of a general-purpose machine. More important, she envisioned an attribute that might make it truly amazing: it could potentially process not only numbers but any symbolic notations, including musical and artistic ones. She saw the poetry in such an idea, and she set out to encourage others to see it as well.

She barraged Babbage with letters, some of which verged on cheeky, even though he was twenty-four years her senior. In one, she described the solitaire game using twenty-six marbles, where the goal is to execute jumps so that only one marble remains. She had mastered it but was trying to derive a “mathematical formula . . . on which the solution depends, and which can be put into symbolic language.” Then she asked, “Am I too imaginative for you? I think not.”32

Her goal was to work with Babbage as his publicist and partner in trying to get support to build the Analytical Engine. “I am very anxious to talk to you,” she wrote in early 1841. “I will give you a hint on what. It strikes me that at some future time . . . my head may be made by you subservient to some of your purposes and plans. If so, if ever I could be worthy or capable of being used by you, my head will be yours.”33

A year later, a tailor-made opportunity presented itself.

LADY LOVELACE’S NOTES

In his quest to find support for his Analytical Engine, Babbage had accepted an invitation to address the Congress of Italian Scientists in Turin. Taking notes was a young military engineer, Captain Luigi Menabrea, who would later serve as prime minister of Italy. With Babbage’s help, Menabrea published a detailed description of the machine, in French, in October 1842.

One of Ada’s friends suggested that she produce a translation of Menabrea’s piece for Scientific Memoirs, a periodical devoted to scientific papers. This was her opportunity to serve Babbage and show her talents. When she finished, she informed Babbage, who was pleased but also somewhat surprised. “I asked why she had not herself written an original paper on a subject with which she was so intimately acquainted,” Babbage said.34 She replied that the thought had not occurred to her. Back then, women generally did not publish scientific papers.

Babbage suggested that she add some notes to Menabrea’s memoir, a project that she embraced with enthusiasm. She began working on a section she called “Notes by the Translator” that ended up totaling 19,136 words, more than twice the length of Menabrea’s original article. Signed “A.A.L.,” for Augusta Ada Lovelace, her “Notes” became more famous than the article and were destined to make her an iconic figure in the history of computing.35

As she worked on the notes at her country estate in Surrey in the summer of 1843, she and Babbage exchanged scores of letters, and in the fall they had numerous meetings after she moved back to her London home. A minor academic specialty and gender-charged debate has grown up around the issue of how much of the thinking was hers rather than his. In his memoirs, Babbage gives her much of the credit: “We discussed together the various illustrations that might be introduced: I suggested several but the selection was entirely her own. So also was the algebraic working out of the different problems, except, indeed, that relating to the numbers of Bernoulli, which I had offered to do to save Lady Lovelace the trouble. This she sent back to me for an amendment, having detected a grave mistake which I had made in the process.”36

In her “Notes,” Ada explored four concepts that would have historical resonance a century later when the computer was finally born. The first was that of a general-purpose machine, one that could not only perform a preset task but could be programmed and reprogrammed to do a limitless and changeable array of tasks. In other words, she envisioned the modern computer. This concept was at the core of her “Note A,” which emphasized the distinction between Babbage’s original Difference Engine and his proposed new Analytical Engine. “The particular function whose integral the Difference Engine was constructed to tabulate is Δ⁷uₓ = 0,” she began, explaining that its purpose was the computation of nautical tables. “The Analytical Engine, on the contrary, is not merely adapted for tabulating the results of one particular function and of no other, but for developing and tabulating any function whatever.”

This was done, she wrote, by “the introduction into it of the principle which Jacquard devised for regulating, by means of punched cards, the most complicated patterns in the fabrication of brocaded stuffs.” Even more than Babbage, Ada realized the significance of this. It meant that the machine could be like the type of computer we now take for granted: one that does not merely do a specific arithmetic task but can be a general-purpose machine. She explained:

The bounds of arithmetic were outstepped the moment the idea of applying cards had occurred. The Analytical Engine does not occupy common ground with mere “calculating machines.” It holds a position wholly its own. In enabling a mechanism to combine together general symbols, in successions of unlimited variety and extent, a uniting link is established between the operations of matter and the abstract mental processes.37

Those sentences are somewhat clotted, but they are worth reading carefully. They describe the essence of modern computers. And Ada enlivened the concept with poetic flourishes. “The Analytical Engine weaves algebraical patterns just as the Jacquard loom weaves flowers and leaves,” she wrote. When Babbage read “Note A,” he was thrilled and made no changes. “Pray do not alter it,” he said.38
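
The contrast at the heart of “Note A” can be made concrete. The Difference Engine tabulated a function by nothing more than the repeated addition of its finite differences; the Analytical Engine was to follow any table of instructions whatever. A rough sketch of the former in modern Python (an anachronism, of course, and not Babbage’s or Ada’s notation) shows how narrow, and how purely mechanical, that fixed task was:

    def initial_differences(f, degree):
        # Build the starting column: f(0), the first difference, the second, ...
        row = [f(x) for x in range(degree + 1)]
        diffs = [row[0]]
        while len(row) > 1:
            row = [b - a for a, b in zip(row, row[1:])]
            diffs.append(row[0])
        return diffs

    def tabulate(f, degree, count):
        # Produce successive values using nothing but addition, the only
        # operation the Difference Engine's wheels could perform.
        diffs = initial_differences(f, degree)
        out = []
        for _ in range(count):
            out.append(diffs[0])
            for i in range(len(diffs) - 1):
                diffs[i] += diffs[i + 1]
        return out

    # Tabulating x squared + x + 41 (the polynomial is chosen only for illustration):
    print(tabulate(lambda x: x * x + x + 41, 2, 8))
    # [41, 43, 47, 53, 61, 71, 83, 97]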

Ada’s second noteworthy concept sprang from this description of a general-purpose machine. Its operations, she realized, did not need to be limited to math and numbers. Drawing on De Morgan’s extension of algebra into a formal logic, she noted that a machine such as the Analytical Engine could store, manipulate, process, and act upon anything that could be expressed in symbols: words and logic and music and anything else we might use symbols to convey.

To explain this idea, she carefully defined what a computer operation was: “It may be desirable to explain that by the word ‘operation,’ we mean any process which alters the mutual relation of two or more things, be this relation of what kind it may.” A computer operation, she noted, could alter the relationship not just between numbers but between any symbols that were logically related. “It might act upon other things besides number, were objects found whose mutual fundamental relations could be expressed by those of the abstract science of operations.” The Analytical Engine could, in theory, even perform operations on musical notations: “Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and of musical composition were susceptible of such expression and adaptations, the engine might compose elaborate and scientific pieces of music of any degree of complexity.” It was the ultimate Ada-like “poetical science” concept: an elaborate and scientific piece of music composed by a machine! Her father would have shuddered.

This insight would become the core concept of the digital age: any piece of content, data, or information—music, text, pictures, numbers, symbols, sounds, video—could be expressed in digital form and manipulated by machines. Even Babbage failed to see this fully; he focused on numbers. But Ada realized that the digits on the cogs could represent things other than mathematical quantities. Thus did she make the conceptual leap from machines that were mere calculators to ones that we now call computers. Doron Swade, a computer historian who specializes in studying Babbage’s engines, has declared this one of Ada’s historic legacies. “If we are looking and sifting history for that transition, then that transition was made explicitly by Ada in that 1843 paper,” he said.39

Ada’s third contribution, in her final “Note G,” was to figure out in step-by-step detail the workings of what we now call a computer program or algorithm. The example she used was a program to compute Bernoulli numbers,3 an exceedingly complex infinite series that in various guises plays a role in number theory.

To show how the Analytical Engine could generate Bernoulli numbers, Ada described a sequence of operations and then made a chart showing how each would be coded into the machine. Along the way, she helped to devise the concepts of subroutines (a sequence of instructions that performs a specific task, such as computing a cosine or calculating compound interest, and can be dropped into larger programs as needed) and a recursive loop (a sequence of instructions that repeats itself).4 These were made possible by the punch-card mechanism. Seventy-five cards were needed to generate each number, she explained, and then the process became iterative as that number was fed back into the process to generate the next one. “It will be obvious that the very same seventy-five variable cards may be repeated for the computation of every succeeding number,” she wrote. She envisioned a library of commonly used subroutines, something that her intellectual heirs, including women such as
Grace Hopper at Harvard and Kay McNulty and Jean Jennings at the University of Pennsylvania, would create a century later. In addition, because Babbage’s engine made it possible to jump back and forth within the sequence of instruction cards based on the interim results it had calculated, it laid the foundation for what we now call conditional branching, changing to a different path of instructions if certain conditions are met.
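
The sequence of operations she charted was written for a machine that never existed, but the structure she described, a reusable subroutine inside a loop that feeds each result back in to produce the next, maps naturally onto modern code. A minimal sketch in Python (using a standard recurrence for the Bernoulli numbers, not Ada’s exact sequence of operations or her sign conventions) conveys the idea:

    from fractions import Fraction
    from math import comb

    def bernoulli_numbers(count):
        # A subroutine in Ada's sense: a fixed block of operations invoked as
        # often as needed.
        B = [Fraction(1)]
        for n in range(1, count):
            # Each new number is produced by feeding all earlier results back
            # through the same steps, the iteration she described with her
            # seventy-five reusable cards.
            acc = sum(comb(n + 1, k) * B[k] for k in range(n))
            B.append(-acc / (n + 1))
        return B

    print(", ".join(str(b) for b in bernoulli_numbers(9)))
    # 1, -1/2, 1/6, 0, -1/30, 0, 1/42, 0, -1/30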

Babbage helped Ada with the Bernoulli calculations, but the letters show her deeply immersed in the details. “I am doggedly attacking and sifting to the very bottom all the ways of deducing the Bernoulli numbers,” she wrote in July, just weeks before her translation and notes were due at the printers. “I am in much dismay at having gotten so amazing a quagmire and botheration with these Numbers that I cannot possibly get the thing done today. . . . I am in a charming state of confusion.”40

When it got worked out, she added a contribution that was primarily her own: a table and diagram showing exactly how the algorithm would be fed into the computer, step by step, including two recursive loops. It was a numbered list of coding instructions that included destination registers, operations, and commentary—something that would be familiar to any C++ coder today. “I have worked incessantly and most successfully all day,” she wrote Babbage. “You will admire the Table and Diagram extremely. They have been made out with extreme care.” From all of the letters it is clear that she did the table herself; the only help came from her husband, who did not understand the math but was willing to methodically trace in ink what she had done in pencil. “Lord L is at this moment kindly inking it all over for me,” she wrote Babbage. “I had to do it in pencil.”41

It was mainly on the basis of this diagram, which accompanied the complex processes for generating Bernoulli numbers, that Ada has been accorded by her fans the accolade of “the world’s first computer programmer.” That is a bit hard to defend. Babbage had already devised, at least in theory, more than twenty explanations of processes that the machine might eventually perform. But none of these was published, and there was no clear description of the way to sequence the operations. Therefore, it is fair to say that the algorithm and detailed programming description for the generation of Bernoulli numbers was the first computer program ever to be published. And the initials at the end were those of Ada Lovelace.

There was one other significant concept that she introduced in her “Notes,” which harked back to the Frankenstein story produced by Mary Shelley after that weekend with Lord Byron. It raised what is still the most fascinating metaphysical topic involving computers, that of artificial intelligence: Can machines think?

Ada believed not. A machine such as Babbage’s could perform operations as instructed, she asserted, but it could not come up with ideas or intentions of its own. “The Analytical Engine has no pretensions whatever to originate anything,” she wrote in her “Notes.” “It can do whatever we know how to order it to perform. It can follow analysis; but it has no power of anticipating any analytical relations or truths.” A century later this assertion would be dubbed “Lady Lovelace’s Objection” by the computer pioneer Alan Turing (see chapter 3).

Ada wanted her work to be regarded as a serious scientific paper and not merely a public advocacy piece, so at the outset of her “Notes” she stated that she would “offer no opinion” on the government’s reluctance to continue funding Babbage’s endeavors. This did not please Babbage, who proceeded to write a screed attacking the government. He wanted Ada to include it in her “Notes,” without his name on it, as if it were her opinion. She refused. She did not want her work compromised.

Without informing her, Babbage sent his proposed appendage directly to Scientific Memoirs. The editors decided that it should appear separately and suggested that he “manfully” sign his name. Babbage was charming when he wished, but he could also be
cranky, stubborn, and defiant, like most innovators. The proposed solution infuriated him, and he wrote Ada asking that she withdraw her work. Now it was her turn to become irate. Using a form of address typically used by male friends, “My Dear Babbage,” she wrote that “withdrawing the translation and Notes” would “be dishonorable and unjustifiable.” She concluded the letter, “Be assured that I am your best friend; but that I never can or will support you in acting on principles which I conceive to be not only wrong in themselves, but suicidal.”42

Babbage backed down and agreed to have his piece published separately in another periodical. That day Ada complained to her mother:

I have been harassed and pressed in a most perplexing manner by the conduct of Mr. Babbage. . . . I am sorry to come to the conclusion that he is one of the most impracticable, selfish, and intemperate persons one can have to do with. . . . I declared at once to Babbage that no power should induce me to lend myself to any of his quarrels or to become in any way his organ. . . . He was furious. I imperturbable and unmoved.43

Ada’s response to the dispute was a bizarre sixteen-page letter to Babbage, poured forth in a frenzy, that vividly displayed her moodiness, exultations, delusions, and passions. She cajoled and berated him, praised and denigrated him. At one point she contrasted their motives. “My own uncompromising principle is to endeavour to love truth and God before fame and glory,” she claimed. “Yours is to love truth and God; but to love fame, glory, honours yet more.” She proclaimed that she saw her own inevitable fame as being of an exalted nature: “I wish to add my might toward expounding and interpreting the Almighty and his laws. . . . I should feel it no small glory if I were able to be one of his most noted prophets.”44

Having laid that groundwork, she offered him a deal: they should forge a business and political partnership. She would apply her connections and persuasive pen to his endeavor to build his Analytical Engine if—and only if—he would let her have control over his business decisions. “I give you the first choice and offer of my services and my intellect,” she wrote. “Do not lightly reject them.” The letter read in parts like a venture capital term sheet or a prenuptial agreement, complete with the possibility of arbitrators. “You will undertake to abide wholly by the judgment of myself (or of any persons whom you may now please to name as referees, whenever we may differ) on all practical matters,” she declared. In return, she promised, she would “lay before you in the course of a year or two explicit and honorable propositions for executing your engine.”45

The letter would seem surprising were it not like so many others that she wrote. It was an example of how her grandiose ambitions sometimes got the best of her. Nevertheless, she deserves respect as a person who, rising above the expectations of her background and gender and defying plagues of family demons, dedicated herself diligently to complex mathematical feats that most of us never would or could attempt. (Bernoulli numbers alone would defeat many of us.) Her impressive mathematical labors and imaginative insights came in the midst of the drama of Medora Leigh and bouts of illness that would cause her to become dependent on opiates that amplified her mood swings. She explained at the end of her letter to Babbage, “My dear friend, if you knew what sad and direful experiences I have had, in ways of which you cannot be aware, you would feel that some weight is due to my feelings.” Then, after a quick detour to raise a small point about using the calculus of finite differences to compute Bernoulli numbers, she apologized that “this letter is sadly blotted” and plaintively asked, “I wonder if you will choose to retain the lady-fairy in your service or not.”46

Ada was convinced that Babbage would accept her offer to become entrepreneurial partners. “He has so strong an idea of the advantage of having my pen as his servant that he will probably yield; though I demand very strong concessions,” she wrote her mother. “If he does consent to what I propose, I shall probably be enabled to keep him out of much hot water and to bring his engine to consummation.”47 Babbage, however, thought it wiser to
decline. He went to see Ada and “refused all the conditions.”48 Although they never again collaborated on science, their relationship survived. “Babbage and I are I think more friends than ever,” she wrote her mother the next week.49 And Babbage agreed the next month to pay a visit to her country home, sending her a fond letter referring to her as “the Enchantress of Numbers” and “my dear and much admired Interpreter.”

That month, September 1843, her translation and “Notes” finally appeared in Scientific Memoirs. For a while she was able to bask in acclaim from friends and to hope that, like her mentor Mary Somerville, she would be taken seriously in scientific and literary circles. Publication made her finally feel like “a completely professional person,” she wrote to a lawyer. “I really have become as much tied to a profession as you are.”50

It was not to be. Babbage got no more funding for his machines; they were never built, and he died in poverty. As for Lady Lovelace, she never published another scientific paper. Instead her life spiraled downward, and she became addicted to gambling and opiates. She had an affair with a gambling partner who then blackmailed her, forcing her to pawn her family jewels. During the final year of her life, she fought an exceedingly painful battle with uterine cancer accompanied by constant hemorrhaging. When she died in 1852, at age thirty-six, she was buried, in accordance with one of her last requests, in a country grave next to the poet father she never knew, who had died at the same age.

The Industrial Revolution was based on two grand concepts that were profound in their simplicity. Innovators came up with ways to simplify endeavors by breaking them into easy, small tasks that could be accomplished on assembly lines. Then, beginning in the textile industry, inventors found ways to mechanize steps so that they could be performed by machines, many of them powered by steam engines. Babbage, building on ideas from Pascal and Leibniz, tried to apply these two processes to the production of computations, creating a mechanical precursor to the modern computer. His most significant conceptual leap was that such machines did not have to be set to do only one process, but instead could be programmed and reprogrammed through the use of punch cards. Ada saw the beauty and significance of that enchanting notion, and she also described an even more exciting idea that derived from it: such machines could process not only numbers but anything that could be notated in symbols.

Over the years, Ada Lovelace has been celebrated as a feminist icon and a computer pioneer. For example, the U.S. Defense Department named its high-level object-oriented programming language Ada. However, she has also been ridiculed as delusional, flighty, and only a minor contributor to the “Notes” that bear her initials. As she herself wrote in those “Notes,” referring to the Analytical Engine but in words that also describe her fluctuating reputation, “In considering any new subject, there is frequently a tendency, first, to overrate what we find to be already interesting or remarkable; and, secondly, by a sort of natural reaction, to undervalue the true state of the case.”

The reality is that Ada’s contribution was both profound and inspirational. More than Babbage or any other person of her era, she was able to glimpse a future in which machines would become partners of the human imagination, together weaving tapestries as beautiful as those from Jacquard’s loom. Her appreciation for poetical science led her to celebrate a proposed calculating machine that was dismissed by the scientific establishment of her day, and she perceived how the processing power of such a device could be used on any form of information. Thus did Ada, Countess of Lovelace, help sow the seeds for a digital age that would blossom a hundred years later.

Vannevar Bush (1890–1974), with his Differential Analyzer at MIT.

Alan Turing (1912–54), at the Sherborne School in 1928.

Claude Shannon (1916–2001) in 1951.

CHAPTER TWO

THE COMPUTER

Sometimes innovation is a matter of timing. A big idea comes along at just the moment when the technology exists to implement it. For example, the idea of sending a man to the moon was proposed right when the progress of microchips made it possible to put computer guidance systems into the nose cone of a rocket. There are other cases, however, when the timing is out of kilter. Charles Babbage published his paper about a sophisticated computer in 1837, but it took a hundred years to achieve the scores of technological advances needed to build one.

Some of those advances seem almost trivial, but progress comes not only in great leaps but also from hundreds of small steps. Take for example punch cards, like those Babbage saw on Jacquard’s looms and proposed incorporating into his Analytical Engine. Perfecting the use of punch cards for computers came about because Herman Hollerith, an employee of the U.S. Census Bureau, was appalled that it took close to eight years to manually tabulate the 1880 census. He resolved to automate the 1890 count.

Drawing on the way that railway conductors punched holes in various places on a ticket in order to indicate the traits of each passenger (gender, approximate height, age, hair color), Hollerith devised punch cards with twelve rows and twenty-four columns that recorded the salient facts about each person in the census. The cards were then slipped between a grid of mercury cups and a set of spring-loaded pins, which created an electric circuit wherever there was a hole. The machine could tabulate not only the raw totals but also combinations of traits, such as the number of married males or foreign-born females. Using Hollerith’s tabulators, the 1890 census was completed in one year rather than eight. It was the first major use of electrical circuits to process information, and the company that Hollerith founded became in 1924, after a series of mergers and acquisitions, the International Business Machines Corporation, or IBM.
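
The logic of the tabulation itself is simple enough to mimic in a few lines of modern Python (the field names below are purely illustrative, not Hollerith’s actual card layout): each card is the set of positions where a hole lets a circuit close, and counters advance both for single traits and for combinations of them.

    from collections import Counter

    # Each "card" records the punched positions for one person.
    cards = [
        {"male", "married", "foreign_born"},
        {"female", "single", "native_born"},
        {"male", "single", "foreign_born"},
        {"female", "married", "native_born"},
    ]

    tally = Counter()
    for card in cards:
        for hole in card:
            tally[hole] += 1                 # a raw total for every closed circuit
        if {"male", "married"} <= card:
            tally["married males"] += 1      # a combination of traits
        if {"female", "foreign_born"} <= card:
            tally["foreign-born females"] += 1

    print(tally["male"], tally["married males"], tally["foreign-born females"])
    # 2 1 0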

One way to look at innovation is as the accumulation of hundreds of small advances, such as counters and punch-card readers. At places like IBM, which specialize in daily improvements made by teams of engineers, this is the preferred way to understand how innovation really happens. Some of the most important technologies of our era, such as the fracking techniques developed over the past six decades for extracting natural gas, came about because of countless small innovations as well as a few breakthrough leaps.

In the case of computers, there were many such incremental advances made by faceless engineers at places like IBM. But that was not enough. Although the machines that IBM produced in the early twentieth century could compile data, they were not what we would call computers. They weren’t even particularly adroit calculators. They were lame. In addition to those hundreds of minor advances, the birth of the computer age required some larger imaginative leaps from creative visionaries.

DIGITAL BEATS ANALOG

The machines devised by Hollerith and Babbage were digital, meaning they calculated using digits: discrete and distinct integers such as 0, 1, 2, 3. In their machines, the integers were
added and subtracted using cogs and wheels that clicked one digit at a time, like counters. Another approach to computing was to build devices that could mimic or model a physical phenomenon and then make measurements on the analogous model to calculate the relevant results. These were known as analog computers because they worked by analogy. Analog computers do not rely on discrete integers to make their calculations; instead, they use continuous functions. In analog computers, a variable quantity such as electrical voltage, the position of a rope on a pulley, hydraulic pressure, or a measurement of distance is employed as an analog for the corresponding quantities of the problem to be solved. A slide rule is analog; an abacus is digital. Clocks with sweeping hands are analog, and those with displayed numerals are digital.

Around the time that Hollerith was building his digital tabulator, Lord Kelvin and his brother James Thomson, two of Britain’s most distinguished scientists, were creating an analog machine. It was designed to handle the tedious task of solving differential equations, which would help in the creation of tide charts and of tables showing the firing angles that would generate different trajectories of artillery shells. Beginning in the 1870s, the brothers devised a system that was based on a planimeter, an instrument that can measure the area of a two-dimensional shape, such as the space under a curved line on a piece of paper. The user would trace the outline of the curve with the device, which would calculate the area by using a small sphere that was slowly pushed across the surface of a large rotating disk. By calculating the area under the curve, it could thus solve equations by integration—in other words, it could perform a basic task of calculus. Kelvin and his brother were able to use this method to create a “harmonic synthesizer” that could churn out an annual tide chart in four hours. But they were never able to conquer the mechanical difficulties of linking together many of these devices in order to solve equations with a lot of variables.

That challenge of linking together multiple integrators was not mastered until 1931, when an MIT engineering professor, Vannevar (rhymes with beaver) Bush—remember his name, for he is a key character in this book—was able to build the world’s first analog electrical-mechanical computer. He dubbed his machine a Differential Analyzer. It consisted of six wheel-and-disk integrators, not all that different from Lord Kelvin’s, that were connected by an array of gears, pulleys, and shafts rotated by electric motors. It helped that Bush was at MIT; there were a lot of people around who could assemble and calibrate complex contraptions. The final machine, which was the size of a small bedroom, could solve equations with as many as eighteen independent variables. Over the next decade, versions of Bush’s Differential Analyzer were replicated at the U.S. Army’s Aberdeen Proving Ground in Maryland, the Moore School of Electrical Engineering at the University of Pennsylvania, and Manchester and Cambridge universities in England. They proved particularly useful in churning out artillery firing tables—and in training and inspiring the next generation of computer pioneers.

Bush’s machine, however, was not destined to be a major advance in computing history because it was an analog device. In fact, it turned out to be the last gasp for analog computing, at least for many decades.

New approaches, technologies, and theories began to emerge in 1937, exactly a hundred years after Babbage first published his paper on the Analytical Engine. It would become an annus mirabilis of the computer age, and the result would be the triumph of four properties, somewhat interrelated, that would define modern computing:

DIGITAL. A fundamental trait of the computer revolution was that it was based on digital, not analog, computers. This occurred for many reasons, as we shall soon see, including simultaneous advances in logic theory, circuits, and electronic on-off switches that made a
digital rather than an analog approach more fruitful. It would not be until the 2010s that computer scientists, seeking to mimic the human brain, would seriously begin working on ways to revive analog computing.

BINARY. Not only would modern computers be digital, but the digital system they would adopt would be binary, or base-2, meaning that it employs just 0s and 1s rather than all ten digits of our everyday decimal system. Like many mathematical concepts, binary theory was pioneered by Leibniz in the late seventeenth century. During the 1940s, it became increasingly clear that the binary system worked better than other digital forms, including the decimal system, for performing logical operations using circuits composed of on-off switches.

ELECTRONIC. In the mid-1930s, the British engineer Tommy Flowers pioneered the use of vacuum tubes as on-off switches in electronic circuits. Until then, circuits had relied on mechanical and electromechanical switches, such as the clacking electromagnetic relays that were used by phone companies. Vacuum tubes had mainly been employed to amplify signals rather than as on-off switches. By using electronic components such as vacuum tubes, and later transistors and microchips, computers could operate thousands of times faster than machines that had moving electromechanical switches.

GENERAL PURPOSE. Finally, the machines would eventually have the ability to be programmed and reprogrammed—and even reprogram themselves—for a variety of purposes. They would be able to solve not just one form of mathematical calculation, such as differential equations, but could handle a multiplicity of tasks and symbol manipulations, involving words and music and pictures as well as numbers, thus fulfilling the potential that Lady Lovelace had celebrated when describing Babbage’s Analytical Engine.

Innovation occurs when ripe seeds fall on fertile ground. Instead of having a single cause, the great advances of 1937 came from a combination of capabilities, ideas, and needs that coincided in multiple places. As often happens in the annals of invention, especially information technology invention, the time was right and the atmosphere was charged. The development of vacuum tubes for the radio industry paved the way for the creation of electronic digital circuits. That was accompanied by theoretical advances in logic that made circuits more useful. And the march was quickened by the drums of war. As nations began arming for the looming conflict, it became clear that computational power was as important as firepower. Advances fed on one another, occurring almost simultaneously and spontaneously, at Harvard and MIT and Princeton and Bell Labs and an apartment in Berlin and even, most improbably but interestingly, in a basement in Ames, Iowa.

Underpinning all of these advances were some beautiful—Ada might call them poetic— leaps of mathematics. One of these leaps led to the formal concept of a “universal computer,” a general-purpose machine that could be programmed to perform any logical task and simulate the behavior of any other logical machine. It was conjured up as a thought experiment by a brilliant English mathematician with a life story that was both inspiring and tragic.

ALAN TURING

Alan Turing had the cold upbringing of a child born on the fraying fringe of the British gentry.1 His family had been graced since 1638 with a baronetcy, which had meandered down the lineage to one of his nephews. But for the younger sons on the family tree, which Turing
and his father and grandfather were, there was no land and little wealth. Most went into fields such as the clergy, like Alan’s grandfather, and the colonial civil service, like his father, who served as a minor administrator in remote regions of India. Alan was conceived in Chhatrapur, India, and born on June 23, 1912, in London, while his parents were on home leave. When he was only one, his parents went back to India for a few years, and handed him and his older brother off to a retired army colonel and his wife to be raised in a seaside town on the south coast of England. “I am no child psychologist,” his brother, John, later noted, “but I am assured that it is a bad thing for an infant in arms to be uprooted and put into a strange environment.”2

When his mother returned, Alan lived with her for a few years and then, at age thirteen, was sent to boarding school. He rode there on his bicycle, taking two days to cover more than sixty miles, alone. There was a lonely intensity to him, reflected in his love of long-distance running and biking. He also had a trait, so common among innovators, that was charmingly described by his biographer Andrew Hodges: “Alan was slow to learn that indistinct line that separated initiative from disobedience.”3

In a poignant memoir, his mother described the son whom she doted upon:

Alan was broad, strongly built and tall, with a square, determined jaw and unruly brown hair. His deep-set, clear blue eyes were his most remarkable feature. The short, slightly retroussé nose and humorous lines of his mouth gave him a youthful—sometimes a childlike— appearance. So much so that in his late thirties he was still at times mistaken for an undergraduate. In dress and habits he tended to be slovenly. His hair was usually too long, with an overhanging lock which he would toss back with a jerk of his head. . . . He could be abstracted and dreamy, absorbed in his own thoughts which on occasion made him seem unsociable. . . . There were times when his shyness led him into extreme gaucherie. . . . Indeed he surmised that the seclusion of a mediaeval monastery would have suited him very well.4

At the boarding school, Sherborne, he realized that he was homosexual. He became infatuated with a fair-haired, slender schoolmate, Christopher Morcom, with whom he studied math and discussed philosophy. But in the winter before he was to graduate, Morcom suddenly died of tuberculosis. Turing would later write Morcom’s mother, “I simply worshipped the ground he trod on—a thing which I did not make much attempt to disguise, I am sorry to say.”5 In a letter to his own mother, Turing seemed to take refuge in his faith: “I feel that I shall meet Morcom again somewhere and that there will be work for us to do together there as I believed there was for us to do here. Now that I am left to do it alone, I must not let him down. If I succeed I shall be more fit to join his company than I am now.” But the tragedy ended up eroding Turing’s religious faith. It also turned him even more inward, and he never again found it easy to forge intimate relationships. His housemaster reported to his parents at Easter 1927, “Undeniably he’s not a ‘normal’ boy; not the worse for that, but probably less happy.”6

In his final year at Sherborne, Turing won a scholarship to attend King’s College, Cambridge, where he went in 1931 to read mathematics. One of three books he bought with some prize money was The Mathematical Foundations of Quantum Mechanics, by John von Neumann, a fascinating Hungarian-born mathematician who, as a pioneer of computer design, would have a continuing influence on his life. Turing was particularly interested in the math at the core of quantum physics, which describes how events at the subatomic level are governed by statistical probabilities rather than laws that determine things with certainty. He believed (at least while he was young) that this uncertainty and indeterminacy at the subatomic level permitted humans to exercise free will—a trait that, if true, would seem to distinguish them from machines. In other words, because events at the subatomic level are not predetermined, that opens the way for our thoughts and actions not to be predetermined. As he explained in a letter to Morcom’s mother:

It used to be supposed in science that if everything was known about the Universe at any particular moment then we can predict what it will be through all the future. This idea was really due to the great success of astronomical prediction. More modern science however has come to the conclusion that when we are dealing with atoms and electrons we are quite unable to know the exact state of them; our instruments being made of atoms and electrons themselves. The conception then of being able to know the exact state of the universe then really must break down on the small scale. This means then that the theory which held that as eclipses etc. are predestined so were all our actions breaks down too. We have a
will which is able to determine the action of the atoms probably in a small portion of the brain, or possibly all over it.7

For the rest of his life, Turing would wrestle with the issue of whether the human mind was fundamentally different from a deterministic machine, and he would gradually come to the conclusion that the distinction was less clear than he had thought.

He also had an instinct that, just as uncertainty pervaded the subatomic realm, there were also mathematical problems that could not be solved mechanically and were destined to be cloaked in indeterminacy. At the time, mathematicians were intensely focused on questions about the completeness and consistency of logical systems, partly due to the influence of David Hilbert, the Göttingen-based genius who, among many other achievements, had come up with the mathematical formulation of the theory of general relativity concurrently with Einstein.

At a 1928 conference, Hilbert posed three fundamental questions about any formal system of mathematics: (1) Was its set of rules complete, so that any statement could be proved (or disproved) using only the rules of the system? (2) Was it consistent, so that no statement could be proved true and also proved false? (3) Was there some procedure that could determine whether a particular statement was provable, rather than allowing the possibility that some statements (such as enduring math riddles like Fermat’s last theorem,5 Goldbach’s conjecture,6 or the Collatz conjecture7) were destined to remain in undecidable limbo? Hilbert thought that the answer to the first two questions was yes, making the third one moot. He put it simply, “There is no such thing as an unsolvable problem.”

Within three years, the Austrian-born logician Kurt Gödel, then twenty-five and living with his mother in Vienna, polished off the first two of these questions with unexpected answers: no and no. In his “incompleteness theorem,” he showed that there existed statements that could be neither proved nor disproved. Among them, to oversimplify a bit, were those that were akin to self-referential statements such as “This statement is unprovable.” If the statement is true, then it decrees that we can’t prove it to be true; if it’s false, that also leads to a logical contradiction. It is somewhat like the ancient Greek “liar’s paradox,” in which the truth of the statement “This statement is false” cannot be determined. (If the statement is true, then it’s also false, and vice versa.)

By coming up with statements that could not be proved or disproved, Gödel showed that any formal system powerful enough to express the usual mathematics was incomplete. He was also able to produce a companion theorem that effectively answered no to Hilbert’s second question.

That left the third of Hilbert’s questions, that of decidability or, as Hilbert called it, the Entscheidungsproblem or “decision problem.” Even though Gödel had come up with statements that could be neither proved nor disproved, perhaps that odd class of statements could somehow be identified and cordoned off, leaving the rest of the system complete and consistent. That would require that we find some method for deciding whether a statement was provable. When the great Cambridge math professor Max Newman taught Turing about Hilbert’s questions, the way he expressed the Entscheidungsproblem was this: Is there a “mechanical process” that can be used to determine whether a particular logical statement is provable?

Turing liked the concept of a “mechanical process.” One day in the summer of 1935, he was out for his usual solitary run along the Ely River, and after a couple of miles he stopped to lie down among the apple trees in Grantchester Meadows to ponder an idea. He would take the notion of a “mechanical process” literally, conjuring up a mechanical process—an imaginary machine—and applying it to the problem.8

The “Logical Computing Machine” that he envisioned (as a thought experiment, not as a real machine to be built) was quite simple at first glance, but it could handle, in theory, any
mathematical computation. It consisted of an unlimited length of paper tape containing symbols within squares; in the simplest binary example, these symbols could be merely a 1 and a blank. The machine would be able to read the symbols on the tape and perform certain actions based on a “table of instructions” it had been given.9

The table of instructions would tell the machine what to do based on whatever configuration it happened to be in and what symbol, if any, it found in the square. For example, the table of instructions for a particular task might decree that if the machine was in configuration 1 and saw a 1 in the square, then it should move one square to the right and shift into configuration 2. Somewhat surprisingly, to us if not to Turing, such a machine, given the proper table of instructions, could complete any mathematical task, no matter how complex.
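
Stated in modern terms, the whole scheme fits in a few lines. The sketch below, in Python, is only a loose illustration and not Turing’s own formalism or one of his 1936 examples; it shows a tape, a current configuration, and a table of instructions of exactly the kind just described.

    from collections import defaultdict

    def run(table, steps):
        tape = defaultdict(lambda: " ")        # unlimited tape, blank by default
        state, head = "config1", 0
        for _ in range(steps):
            write, move, state = table[(state, tape[head])]
            tape[head] = write                 # write a symbol in the current square
            head += 1 if move == "R" else -1   # move one square right or left
        return "".join(tape[i] for i in range(min(tape), max(tape) + 1))

    # A toy table of instructions: print a 1, skip a square, and repeat.
    table = {
        ("config1", " "): ("1", "R", "config2"),
        ("config2", " "): (" ", "R", "config1"),
    }
    print(run(table, 19))   # 1 1 1 1 1 1 1 1 1 1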

How might this imaginary machine answer Hilbert’s third question, the decision problem? Turing approached the problem by refining the concept of “computable numbers.” Any real number that was defined by a mathematical rule could be calculated by the Logical Computing Machine. Even an irrational number such as π could be calculated indefinitely using a finite table of instructions. So could the logarithm of 7, or the square root of 2, or the sequence of Bernoulli numbers that Ada Lovelace had helped produce an algorithm for, or any other number or series, no matter how challenging to compute, as long as its calculation was defined by a finite set of rules. All of these were, in Turing’s parlance, “computable numbers.”
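
Computable, in this sense, simply means that a finite rule can churn out as many digits as anyone cares to ask for. A small illustration in Python (a modern shortcut, obviously, rather than a table of machine instructions) does this for the square root of 2:

    def sqrt2_digits(n):
        # Newton's method on whole numbers: one fixed, finite rule that yields
        # the first n decimal digits of the square root of 2, for any n.
        N = 2 * 10 ** (2 * n)
        x, nxt = N, (N + 1) // 2
        while nxt < x:
            x, nxt = nxt, (nxt + N // nxt) // 2
        return x                               # the floor of sqrt(2) times 10**n

    print(sqrt2_digits(30))
    # 1414213562373095048801688724209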

Turing went on to show that noncomputable numbers also existed. This was related to what he called “the halting problem.” There can be no method, he showed, to determine in advance whether any given instruction table combined with any given set of inputs will lead the machine to arrive at an answer or go into some loop and continue chugging away indefinitely, getting nowhere. The insolvability of the halting problem, he showed, meant that Hilbert’s decision problem, the Entscheidungsproblem, was unsolvable. Despite what Hilbert seemed to hope, no mechanical procedure can determine the provability of every mathematical statement. Gödel’s incompleteness theorem, the indeterminacy of quantum mechanics, and Turing’s answer to Hilbert’s third challenge all dealt blows to a mechanical, deterministic, predictable universe.

Turing’s paper was published in 1937 with the not so snappy title “On Computable Numbers, with an Application to the Entscheidungsproblem.” His answer to Hilbert’s third question was useful for the development of mathematical theory. But far more important was the by-product of Turing’s proof: his concept of a Logical Computing Machine, which soon came to be known as a Turing machine. “It is possible to invent a single machine which can be used to compute any computable sequence,” he declared.10 Such a machine would be able to read the instructions of any other machine and carry out whatever task that machine could do. In essence, it embodied the dream of Charles Babbage and Ada Lovelace for a completely general-purpose universal machine.

A different and less beautiful solution to the Entscheidungsproblem, with the clunkier name “untyped lambda calculus,” had been published earlier that year by Alonzo Church, a mathematician at Princeton. Turing’s professor Max Newman decided that it would be useful for Turing to go there to study under Church. In his letter of recommendation, Newman described Turing’s enormous potential. He also added a more personal appeal based on Turing’s personality. “He has been working without any supervision or criticism from anyone,” Newman wrote. “This makes it all the more important that he should come into contact as soon as possible with the leading workers on this line, so that he should not develop into a confirmed solitary.”11

Turing did have a tendency toward being a loner. His homosexuality made him feel like an outsider at times; he lived alone and avoided deep personal commitments. At one point he
proposed marriage to a female colleague, but then felt compelled to tell her that he was gay; she was unfazed and still willing to get married, but he believed it would be a sham and decided not to proceed. Yet he did not become “a confirmed solitary.” He learned to work as part of a team, with collaborators, which was key to allowing his abstract theories to be reflected in real and tangible inventions.

In September 1936, while waiting for his paper to be published, the twenty-four-year-old doctoral candidate sailed to America in steerage class aboard the aging ocean liner RMS Berengaria, lugging with him a prized brass sextant. His office at Princeton was in the Mathematics Department building, which also then housed the Institute for Advanced Study, where Einstein, Gödel, and von Neumann held court. The cultivated and highly sociable von Neumann became particularly interested in Turing’s work, despite their very different personalities.

The seismic shifts and simultaneous advances of 1937 were not directly caused by the publication of Turing’s paper. In fact, it got little notice at first. Turing asked his mother to send out reprints of it to the mathematical philosopher Bertrand Russell and a half dozen other famous scholars, but the only major review was by Alonzo Church, who could afford to be flattering because he had been ahead of Turing in solving Hilbert’s decision problem. Church was not only generous; he introduced the term Turing machine for what Turing had called a Logical Computing Machine. Thus at twenty-four, Turing’s name became indelibly stamped on one of the most important concepts of the digital age.12

CLAUDE SHANNON AND GEORGE STIBITZ AT BELL LABS

There was another seminal theoretical breakthrough in 1937, similar to Turing’s in that it was purely a thought experiment. This one was the work of an MIT graduate student named Claude Shannon, who that year turned in the most influential master’s thesis of all time, a paper that Scientific American later dubbed “the Magna Carta of the Information Age.”13

Shannon grew up in a small Michigan town where he built model planes and amateur radios, then went on to major in electrical engineering and math at the University of Michigan. In his senior year he answered a help-wanted listing tacked to a bulletin board, which offered a job at MIT working under Vannevar Bush helping to run the Differential Analyzer. Shannon got the job and was mesmerized by the machine—not so much the rods and pulleys and wheels that formed the analog components as the electromagnetic relay switches that were part of its control circuit. As electrical signals caused them to click open and clack closed, the switches created different circuit patterns.

During the summer of 1937, Shannon took a break from MIT and went to work at Bell Labs, a research facility run by AT&T. Located then in Manhattan on the Hudson River edge of Greenwich Village, it was a haven for turning ideas into inventions. Abstract theories intersected with practical problems there, and in the corridors and cafeterias eccentric theorists mingled with hands-on engineers, gnarly mechanics, and businesslike problem-solvers, encouraging the cross-fertilization of theory with engineering. This made Bell Labs an archetype of one of the most important underpinnings of digital-age innovation, what the Harvard science historian Peter Galison has called a “trading zone.” When these disparate practitioners and theoreticians came together, they learned how to find a common parlance to trade ideas and exchange information.14

At Bell Labs, Shannon saw up close the wonderful power of the phone system’s circuits, which used electrical switches to route calls and balance loads. In his mind, he began connecting the workings of these circuits to another subject he found fascinating, the system of logic formulated ninety years earlier by the British mathematician George Boole. Boole revolutionized logic by finding ways to express logical statements using symbols and
equations. He gave true propositions the value 1 and false propositions a 0. A set of basic logical operations—such as and, or, not, either/or, and if/then—could then be performed using these propositions, just as if they were math equations.

Shannon figured out that electrical circuits could execute these logical operations using an arrangement of on-off switches. To perform an and function, for example, two switches could be put in sequence, so that both had to be on for electricity to flow. To perform an or function, the switches could be in parallel so that electricity would flow if either of them was on. Slightly more versatile switches called logic gates could streamline the process. In other words, you could design a circuit containing a lot of relays and logic gates that could perform, step by step, a sequence of logical tasks.

(A “relay” is simply a switch that can be opened and shut electrically, such as by using an electromagnet. The ones that clack open and closed are sometimes called electromechanical because they have moving parts. Vacuum tubes and transistors can also be used as switches in an electrical circuit; they are called electronic because they manipulate the flow of electrons but do not require the movement of any physical parts. A “logic gate” is a switch that can handle one or more inputs. For example, in the case of two inputs, an and logic gate switches on if both of the inputs are on, and an or logic gate switches on if either of the inputs is on. Shannon’s insight was that these could be wired together in circuits that could execute the tasks of Boole’s logical algebra.)
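
Shannon’s observation can be rendered roughly in modern Python, with 1 standing for a closed switch and 0 for an open one (an illustration, not his notation): switches in series behave like Boole’s and, switches in parallel like his or, and a relay arranged to break the circuit gives not.

    def series(a, b):       # current flows only if both switches are closed
        return a & b        # Boolean AND

    def parallel(a, b):     # current flows if either switch is closed
        return a | b        # Boolean OR

    def inverted(a):        # a relay wired to open the circuit when energized
        return 1 - a        # Boolean NOT

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "-> and:", series(a, b), " or:", parallel(a, b))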

When Shannon returned to MIT in the fall, Bush was fascinated by his ideas and urged him to include them in his master’s thesis. Entitled “A Symbolic Analysis of Relay and Switching Circuits,” it showed how each of the many functions of Boolean algebra could be executed. “It is possible to perform complex mathematical operations by means of relay circuits,” he summed up at the end.15 This became the basic concept underlying all digital computers.

Shannon’s ideas intrigued Turing because they neatly related to his own just-published concept of a universal machine that could use simple instructions, expressed in binary coding, to tackle problems not only of math but of logic. Also, since logic was related to the way human minds reason, a machine that performed logical tasks could, in theory, mimic the way humans think.

Working at Bell Labs at the same time was a mathematician named George Stibitz, whose job was to figure out ways to handle the increasingly complicated calculations needed by the telephone engineers. The only tools he had were mechanical desktop adding machines, so he set out to invent something better based on Shannon’s insight that electronic circuits could perform mathematical and logical tasks. Late one evening in November, he went to the stockroom and took home some old electromagnetic relays and bulbs. At his kitchen table, he put the parts together with a tobacco tin and a few switches to form a simple logical circuit that could add binary numbers. A lit bulb represented a 1, and an unlit bulb represented a 0. His wife dubbed it the “K-Model,” after the kitchen table. He took it into the office the next day and tried to convince his colleagues that, with enough relays, he could make a calculating machine.
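
What the kitchen-table circuit did can be written out just as briefly. In later terminology this is a “half adder,” a label Stibitz himself did not use; the sketch below is a modern rendering, not his wiring diagram.

    def half_adder(a, b):
        # Two input bits yield a sum bit and a carry bit: the lit or unlit
        # bulbs on Stibitz's kitchen table.
        carry = a & b        # both relays closed
        total = a ^ b        # exactly one relay closed
        return total, carry

    for a in (0, 1):
        for b in (0, 1):
            total, carry = half_adder(a, b)
            print(f"{a} + {b} -> carry {carry}, sum {total}")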

One important mission of Bell Labs was to figure out ways to amplify a phone signal over long distances while filtering out static. The engineers had formulas that dealt with the amplitude and phase of the signal, and the solutions to their equations sometimes involved complex numbers (ones that include an imaginary unit that represents the square root of a negative number). Stibitz was asked by his supervisor if his proposed machine could handle complex numbers. When he said that it could, a team was assigned to help him build it. The Complex Number Calculator, as it was called, was completed in 1939. It had more than four hundred relays, each of which could open and shut twenty times per second. That made it both blindingly fast compared to mechanical calculators and painfully clunky compared to
the all-electronic vacuum-tube circuits just being invented. Stibitz’s computer was not programmable, but it showed the potential of a circuit of relays to do binary math, process information, and handle logical procedures.16

HOWARD AIKEN

Also in 1937 a Harvard doctoral student named Howard Aiken was struggling to do tedious calculations for his physics thesis using an adding machine. When he lobbied the university to build a more sophisticated computer to do the work, his department head mentioned that in the attic of Harvard’s science center were some brass wheels from a century-old device that seemed to be similar to what he wanted. When Aiken explored the attic, he found one of six demonstration models of Charles Babbage’s Difference Engine, which Babbage’s son Henry had made and distributed. Aiken became fascinated by Babbage and moved the set of brass wheels into his office. “Sure enough, we had two of Babbage’s wheels,” he recalled. “Those were the wheels that I had later mounted and put in the body of the computer.”17

That fall, just when Stibitz was cooking up his kitchen-table demonstration, Aiken wrote a twenty-two-page memo to his Harvard superiors and executives at IBM making the case that they should fund a modern version of Babbage’s digital machine. “The desire to economize time and mental effort in arithmetical computations, and to eliminate human liability to error is probably as old as the science of arithmetic itself,” his memo began.18

Aiken had grown up in Indiana under rough circumstances. When he was twelve, he used a fireplace poker to defend his mother against his drunk and abusive father, who then abandoned the family with no money. So young Howard dropped out of ninth grade to support the family by working as a telephone installer, then got a night job with the local power company so that he could attend a tech school during the day. He drove himself to be a success, but in the process he developed into a taskmaster with an explosive temper, someone who was described as resembling an approaching thunderstorm.19

Harvard had mixed feelings about building Aiken’s proposed calculating machine or holding out the possibility that he might be granted tenure for a project that seemed to be more practical than academic. (In parts of the Harvard faculty club, calling someone practical rather than academic was considered an insult.) Supporting Aiken was President James Bryant Conant, who, as chairman of the National Defense Research Committee, was comfortable positioning Harvard as part of a triangle involving academia, industry, and the military. His Physics Department, however, was more purist. Its chairman wrote to Conant in December 1939, saying that the machine was “desirable if money can be found, but not necessarily more desirable than anything else,” and a faculty committee said of Aiken, “It should be made quite clear to him that such activity did not increase his chances of promotion to a professorship.” Eventually Conant prevailed and authorized Aiken to build his machine.20

In April 1941, as IBM was constructing the Mark I to Aiken’s specifications at its lab in Endicott, New York, he left Harvard to serve in the U.S. Navy. For two years he was a teacher, with the rank of lieutenant commander, at the Naval Mine Warfare School in Virginia. One colleague described him as “armed to the teeth with room-length formulas and ivy-covered Harvard theories” and running “smack into a collection of Dixie dumbbells [none of whom] knew calculus from corn pone.”21 Much of his time was spent thinking about the Mark I, and he made occasional visits to Endicott wearing his full dress uniform.22

His tour of duty had one major payoff: at the beginning of 1944, as IBM was getting ready to ship the completed Mark I to Harvard, Aiken was able to convince the Navy to take over authority for the machine and assign him to be the officer in charge. That helped him circumnavigate the academic bureaucracy of Harvard, which was still balky about granting him tenure. The Harvard Computation Laboratory became, for the time being, a naval
facility, and all of Aiken’s staffers were Navy personnel who wore uniforms to work. He called them his “crew,” they called him “commander,” and the Mark I was referred to as “she,” as if she were a ship.23

The Harvard Mark I borrowed a lot of Babbage’s ideas. It was digital, although not binary; its wheels had ten positions. Along its fifty-foot shaft were seventy-two counters that could store numbers of up to twenty-three digits, and the finished five-ton product was fifty-one feet long and eight feet high. The shaft and other moving parts were turned electrically. But it was slow. Instead of electromagnetic relays, it used mechanical ones that were opened and shut by electric motors. That meant it took about six seconds to do a multiplication problem, compared to one second for Stibitz’s machine. It did, however, have one impressive feature that would become a staple of modern computers: it was fully automatic. Programs and data were entered by paper tape, and it could run for days with no human intervention. That allowed Aiken to refer to it as “Babbage’s dream come true.”24

KONRAD ZUSE

Although they didn’t know it, all of these pioneers were being beaten in 1937 by a German engineer working in his parents’ apartment. Konrad Zuse was finishing the prototype for a calculator that was binary and could read instructions from a punched tape. However, at least in its first version, called the Z1, it was a mechanical, not an electrical or electronic, machine.

Like many pioneers in the digital age, Zuse grew up fascinated by both art and engineering. After graduating from a technical college, he got a job as a stress analyst for an aircraft company in Berlin, solving linear equations that incorporated all sorts of load and strength and elasticity factors. Even using mechanical calculators, it was almost impossible for a person to solve in less than a day more than six simultaneous linear equations with six unknowns. If there were twenty-five variables, it could take a year. So Zuse, like so many others, was driven by the desire to mechanize the tedious process of solving mathematical equations. He converted his parents’ living room, in an apartment near Berlin’s Tempelhof Airport, into a workshop.25
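
To get a feel for why the workload ballooned with the number of unknowns, here is a rough sketch (ordinary Gaussian elimination with a count of the multiply-and-divide steps, not Zuse’s own procedure; the figures only illustrate the scaling):

# Rough sketch: count the multiplications and divisions needed to solve an
# n-by-n linear system by standard Gaussian elimination. Not Zuse's method;
# it only illustrates how the hand labor grows roughly with the cube of n.

def elimination_ops(n):
    ops = 0
    for k in range(n):                # eliminate the k-th column
        for i in range(k + 1, n):     # for every row below the pivot
            ops += 1                  # compute the row multiplier
            ops += n - k              # update the remaining entries and the right-hand side
    ops += n * (n + 1) // 2           # back substitution
    return ops

for n in (6, 25):
    print(n, "unknowns:", elimination_ops(n), "hand calculations")
# Six unknowns need on the order of a hundred steps; twenty-five need thousands,
# every one punched into a mechanical calculator and checked by hand.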

In Zuse’s first version, binary digits were stored by using thin metal plates with slots and pins, which he and his friends made using a jigsaw. At first he used punched paper tape to input data and programs, but he soon switched to discarded 35 mm movie film, which not only was sturdier but happened to be cheaper. His Z1 was completed in 1938, and it was able to clank through a few problems, though not very reliably. All the components had been made by hand, and they tended to jam. He was handicapped by not being at a place like Bell Labs or part of a collaboration like Harvard had with IBM, which would have allowed him to team up with engineers who could have supplemented his talents.

The Z1 did, however, show that the logical concept Zuse had designed would work in theory. A college friend who was helping him, Helmut Schreyer, urged that they make a version using electronic vacuum tubes rather than mechanical switches. Had they done so right away, they would have gone down in history as the first inventors of a working modern computer: binary, electronic, and programmable. But Zuse, as well as the experts he consulted at the technical school, balked at the expense of building a device with close to two thousand vacuum tubes.26

So for the Z2 they decided instead to use electromechanical relay switches, acquired secondhand from the phone company, which were tougher and cheaper, although a lot slower. The result was a computer that used relays for the arithmetic unit. However, the memory unit was mechanical, using movable pins in a metal sheet.

In 1939 Zuse began work on a third model, the Z3, that used electromechanical relays both for the arithmetic unit and for the memory and control units. When it was completed in 1941, it became the first fully working all-purpose, programmable digital computer. Even though it did not have a way to directly handle conditional jumps and branching in the programs, it could theoretically perform as a universal Turing machine. Its major difference from later computers was that it used clunky electromagnetic relays rather than electronic components such as vacuum tubes or transistors.

Zuse’s friend Schreyer went on to write a doctoral thesis, “The Tube Relay and the Techniques of Its Switching,” that advocated using vacuum tubes for a powerful and fast computer. But when he and Zuse proposed it to the German Army in 1942, the commanders said they were confident that they would win the war before the two years it would take to build such a machine.27 They were more interested in making weapons than computers. As a result, Zuse was pulled away from his computer work and sent back to engineering airplanes. In 1943 his computers and designs were destroyed in the Allied bombing of Berlin.

Zuse and Stibitz, working independently, had both come up with employing relay switches to make circuits that could handle binary computations. How did they develop this idea at the same time when war kept their two teams isolated? The answer is partly that advances in technology and theory made the moment ripe. Along with many other innovators, Zuse and Stibitz were familiar with the use of relays in phone circuits, and it made sense to tie that to binary operations of math and logic. Likewise, Shannon, who was also very familiar with phone circuits, made the related theoretical leap that electronic circuits would be able to perform the logical tasks of Boolean algebra. The idea that digital circuits would be the key to computing was quickly becoming clear to researchers almost everywhere, even in isolated places like central Iowa.
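
The leap these inventors made can be shown in miniature. The sketch below treats a relay or tube as nothing more than a value that is either off (0) or on (1); the gate names and the half-adder wiring are a standard textbook illustration, not a description of any particular machine:

# Minimal sketch: a relay (or tube) is modeled as a value that is off (0) or
# on (1). Boolean combinations of such switches are already enough to add
# binary digits; the names and wiring are illustrative.

def AND(a, b): return a & b
def OR(a, b):  return a | b          # rounds out the basic gate set
def XOR(a, b): return a ^ b          # on if exactly one input is on

def half_adder(a, b):
    """Add two one-bit numbers, returning (sum bit, carry bit)."""
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))   # (0, 1): one plus one is binary 10, i.e. two
print(half_adder(1, 0))   # (1, 0): one plus zero is one, no carry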

JOHN VINCENT ATANASOFF

Far from both Zuse and Stibitz, another inventor was also experimenting with digital circuits in 1937. Toiling in a basement in Iowa, he would make the next historic innovation: building a calculating device that, at least in part, used vacuum tubes. In some ways his machine was less advanced than the others. It wasn’t programmable and multipurpose; instead of being totally electronic, he included some slow mechanical moving elements; and even though he built a model that was able to work in theory, he couldn’t actually get the thing reliably operational. Nevertheless, John Vincent Atanasoff, known to his wife and friends as Vincent, deserves the distinction of being the pioneer who conceived the first partly electronic digital computer, and he did so after he was struck by inspiration during a long impetuous drive one night in December 1937.28

Atanasoff was born in 1903, the eldest of seven children of a Bulgarian immigrant and a woman descended from one of New England’s oldest families. His father worked as an engineer in a New Jersey electric plant run by Thomas Edison, then moved the family to a town in rural Florida south of Tampa. At nine, Vincent helped his father wire their Florida house for electricity, and his father gave him a Dietzgen slide rule. “That slide rule was my meat,” he recalled.29 At an early age, he dove into the study of logarithms with an enthusiasm that seems a bit wacky even as he recounted it in earnest tones: “Can you imagine how a boy of nine, with baseball on his mind, could be transformed by this knowledge? Baseball was reduced to near zero as a stern study was made of logarithms.” Over the summer, he calculated the logarithm of 5 to the base e, then, with his mother’s help (she had once been a math teacher), he learned calculus while still in middle school. His father took him to the phosphate plant where he was an electrical engineer, showing him how the generators worked. Diffident, creative, and brilliant, young Vincent finished high school in two years, getting all A’s in his double load of classes.

At the University of Florida he studied electrical engineering and displayed a practical inclination, spending time in the university’s machine shop and foundry. He also remained fascinated by math and as a freshman studied a proof involving binary arithmetic. Creative and self-confident, he graduated with the highest grade point average of his time. He accepted a fellowship to pursue master’s work in math and physics at Iowa State and, even though he later was admitted to Harvard, stuck with his decision to head up to the corn belt town of Ames.

Atanasoff went on to pursue a doctorate in physics at the University of Wisconsin, where he had the same experience as the other computer pioneers, beginning with Babbage. His work, which was on how helium can be polarized by an electric field, involved tedious calculations. As he struggled to solve the math using a desktop adding machine, he dreamed of ways to invent a calculator that could do more of the work. After returning to Iowa State in 1930 as an assistant professor, he decided that his degrees in electrical engineering, math, and physics had equipped him for the task.

There was a consequence to his decision not to stay at Wisconsin or to go to Harvard or a similar large research university. At Iowa State, where no one else was working on ways to build new calculators, Atanasoff was on his own. He could come up with fresh ideas, but he did not have around him people to serve as sounding boards or to help him overcome theoretical or engineering challenges. Unlike most innovators of the digital age, he was a lone inventor, drawing his inspiration during solo car trips and in discussions with one graduate student assistant. In the end, that would prove to be a drawback.

Atanasoff initially considered building an analog device; his love of slide rules led him to try to devise a supersize version using long strips of film. But he realized that the film would have to be hundreds of yards long in order to solve linear algebraic equations accurately enough to suit his needs. He also built a contraption that could shape a mound of paraffin so that it could calculate a partial differential equation. The limitations of these analog devices caused him to focus instead on creating a digital version.

The first problem he tackled was how to store numbers in a machine. He used the term memory to describe this feature: “At the time, I had only a cursory knowledge of the work of Babbage and so did not know he called the same concept ‘store.’ . . . I like his word, and perhaps if I had known, I would have adopted it; I like ‘memory,’ too, with its analogy to the brain.”30

Atanasoff went through a list of possible memory devices: mechanical pins, electromagnetic relays, a small piece of magnetic material that could be polarized by an electric charge, vacuum tubes, and a small electrical condenser. The fastest would be vacuum tubes, but they were expensive. So he opted instead to use what he called condensers—what we now call capacitors—which are small and inexpensive components that can store, at least briefly, an electrical charge. It was an understandable decision, but it meant that the machine would be sluggish and clunky. Even if the adding and subtracting could be done at electronic speeds, the process of taking numbers in and out of the memory unit would slow things down to the speed of the rotating drum on which the condensers were mounted.
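
A back-of-the-envelope sketch makes the trade-off concrete. The switching and rotation times below are illustrative assumptions rather than measurements of Atanasoff’s machine; the point is only that the effective rate is set by how often the drum can present operands, not by how fast the tubes can switch:

# Back-of-the-envelope sketch of the bottleneck. Both figures are assumptions
# made up for illustration, not measurements of Atanasoff's machine.

tube_switch_time_s = 1e-6     # assume an electronic switch settles in about a microsecond
drum_revolution_s = 1.0       # assume the memory drum turns once per second

electronic_rate = 1 / tube_switch_time_s   # what the tubes could do in principle
memory_bound_rate = 1 / drum_revolution_s  # how often the drum can present a stored operand

# However many additions the electronics could perform, each operand has to
# wait for the drum to come around, so throughput is pinned near the lower figure.
print(electronic_rate, memory_bound_rate)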


George Stibitz (1904–95) circa 1945.


Konrad Zuse (1910–95) with the Z4 computer in 1944.


John Atanasoff (1903–95) at Iowa State, circa 1940.


Reconstruction of Atanasoff’s computer.

Once he had settled on the memory unit, Atanasoff turned his attention to how to construct the arithmetic and logic unit, which he called the “computing mechanism.” He decided it should be fully electronic; that meant using vacuum tubes, even though they were expensive. The tubes would act as on-off switches to perform the function of logic gates in a circuit that could add, subtract, and perform any Boolean function.
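
To see what such on-off switches buy, the sketch below chains one-bit full adders (each expressible in the Boolean operations the gates provide) into a multi-bit adder and gets subtraction from the same circuit by adding a complement. The eight-bit width and the helper names are illustrative, not Atanasoff’s design:

# Minimal sketch: a one-bit full adder from Boolean gates, chained into a
# ripple-carry adder. Subtraction comes from the same circuit by adding the
# complement. Eight-bit width and names are illustrative, not Atanasoff's design.

def full_adder(a, b, carry_in):
    total = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return total, carry_out

def add_bits(x_bits, y_bits):
    """Ripple-carry addition of two equal-length bit lists, least significant bit first."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out, carry

def to_bits(n, width=8):
    return [(n >> i) & 1 for i in range(width)]

def from_bits(bits):
    return sum(bit << i for i, bit in enumerate(bits))

print(from_bits(add_bits(to_bits(23), to_bits(19))[0]))          # 42
# Subtraction by complement: 23 - 19 is 23 + (256 - 19), dropping the final carry.
print(from_bits(add_bits(to_bits(23), to_bits(256 - 19))[0]))    # 4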
