
RANDOM HOUSE and the HOUSE colophon are registered trademarks of Random House LLC.

LIBRARY OF CONGRESS CATALOGING-IN-PUBLICATION DATA

Catmull, Edwin E.
Creativity, Inc.: overcoming the unseen forces that stand in the way of true inspiration / Ed Catmull with Amy Wallace.
pages cm

ISBN 978-0-8129-9301-1
eBook ISBN 978-0-679-64450-7

1. Creative ability in business. 2. Corporate culture. 3. Organizational effectiveness. 4. Pixar (Firm) I. Wallace, Amy. II. Title.

HD53.C394 2014 658.4′0714—dc23 2013036026

www.atrandom.com

Jacket design: Andy Dreyfus
Jacket illustration: © Disney • Pixar


CONTENTS

Cover
Title Page
Copyright
Introduction: Lost and Found

PART I: GETTING STARTED
Chapter 1: Animated
Chapter 2: Pixar Is Born
Chapter 3: A Defining Goal
Chapter 4: Establishing Pixar’s Identity

PART II: PROTECTING THE NEW
Chapter 5: Honesty and Candor
Chapter 6: Fear and Failure
Chapter 7: The Hungry Beast and the Ugly Baby
Chapter 8: Change and Randomness
Chapter 9: The Hidden

PART III: BUILDING AND SUSTAINING
Chapter 10: Broadening Our View
Chapter 11: The Unmade Future

PART IV: TESTING WHAT WE KNOW
Chapter 12: A New Challenge
Chapter 13: Notes Day

Afterword: The Steve We Knew
Starting Points: Thoughts for Managing a Creative Culture
Photo Insert
Dedication
Acknowledgments
About the Authors

INTRODUCTION: LOST AND FOUND

Every morning, as I walk into Pixar Animation Studios—past the twenty-foot-high sculpture of Luxo Jr., our friendly desk lamp mascot, through the double doors and into a spectacular glass-ceilinged atrium where a man-sized Buzz Lightyear and Woody, made entirely of Lego bricks, stand at attention, up the stairs past sketches and paintings of the characters that have populated our fourteen films—I am struck by the unique culture that defines this place. Although I’ve made this walk thousands of times, it never gets old.

Built on the site of a former cannery, Pixar’s fifteen-acre campus, just over the Bay Bridge from San Francisco, was designed, inside and out, by Steve Jobs. (Its name, in fact, is The Steve Jobs Building.) It has well-thought-out patterns of entry and egress that encourage people to mingle, meet, and communicate. Outside, there is a soccer field, a volleyball court, a swimming pool, and a six-hundred-seat amphitheater. Sometimes visitors misunderstand the place, thinking it’s fancy for fancy’s sake. What they miss is that the unifying idea for this building isn’t luxury but community. Steve wanted the building to support our work by enhancing our ability to collaborate. The animators who work here are free to—no, encouraged to—decorate their

work spaces in whatever style they wish. They spend their days inside pink dollhouses whose ceilings are hung with miniature chandeliers, tiki huts made of real bamboo, and castles whose meticulously painted, fifteen-foot-high styrofoam turrets appear to be carved from stone. Annual company traditions include “Pixarpalooza,” where our in-house rock bands battle for dominance, shredding their hearts out on stages we erect on our front lawn. The point is, we value self-expression here. This tends to make a big

impression on visitors, who often tell me that the experience of walking into Pixar leaves them feeling a little wistful, like something is missing in their work lives—a palpable energy, a feeling of collaboration and unfettered creativity, a sense, not to be corny, of possibility. I respond by telling them that the feeling they are picking up on—call it exuberance or irreverence, even whimsy—is

integral to our success. But it’s not what makes Pixar special. What makes Pixar special is that we acknowledge we will always have

problems, many of them hidden from our view; that we work hard to uncover these problems, even if doing so means making ourselves uncomfortable; and that, when we come across a problem, we marshal all of our energies to solve it. This, more than any elaborate party or turreted workstation, is why I love coming to work in the morning. It is what motivates me and gives me a definite sense of mission. There was a time, however, when my purpose here felt a lot less clear to me.

And it might surprise you when I tell you when.

On November 22, 1995, Toy Story debuted in America’s theaters and became the largest Thanksgiving opening in history. Critics heralded it as “inventive” (Time), “brilliant” and “exultantly witty” (The New York Times), and “visionary” (Chicago Sun-Times). To find a movie worthy of comparison, wrote The Washington Post, one had to go back to 1939, to The Wizard of Oz. The making of Toy Story—the first feature film to be animated entirely on a

computer—had required every ounce of our tenacity, artistry, technical wizardry, and endurance. The hundred or so men and women who produced it had weathered countless ups and downs as well as the ever-present, hair-raising knowledge that our survival depended on this 80-minute experiment. For five straight years, we’d fought to do Toy Story our way. We’d resisted the advice of Disney executives who believed that since they’d had such success with musicals, we too should fill our movie with songs. We’d rebooted the story completely, more than once, to make sure it rang true. We’d worked nights, weekends, and holidays—mostly without complaint. Despite being novice filmmakers at a fledgling studio in dire financial straits, we had put our faith in a simple idea: If we made something that we wanted to see, others would want to see it, too. For so long, it felt like we had been pushing that rock up the hill, trying to do the impossible. There were plenty of moments when the future of Pixar was in doubt. Now, we were suddenly being held up as an example of what could happen when artists trusted their guts. Toy Story went on to become the top-grossing film of the year and would earn

$358 million worldwide. But it wasn’t just the numbers that made us proud; money, after all, is just one measure of a thriving company and usually not the most meaningful one. No, what I found gratifying was what we’d created. Review after review focused on the film’s moving plotline and its rich, three-dimensional characters—only briefly mentioning, almost as an aside, that it had been made on a computer. While there was much innovation that enabled our work, we had not let the technology overwhelm our real purpose: making a great film.

On a personal level, Toy Story represented the fulfillment of a goal I had

pursued for more than two decades and had dreamed about since I was a boy. Growing up in the 1950s, I had yearned to be a Disney animator but had no idea how to go about it. Instinctively, I realize now, I embraced computer graphics— then a new field—as a means of pursuing that dream. If I couldn’t animate by hand, there had to be another way. In graduate school, I’d quietly set a goal of making the first computer-animated feature film, and I’d worked tirelessly for twenty years to accomplish it. Now, the goal that had been a driving force in my life had been reached, and

there was an enormous sense of relief and exhilaration—at least at first. In the wake of Toy Story’s release, we took the company public, raising the kind of money that would ensure our future as an independent production house, and began work on two new feature-length projects, A Bug’s Life and Toy Story 2. Everything was going our way, and yet I felt adrift. In fulfilling a goal, I had lost some essential framework. Is this really what I want to do? I began asking myself. The doubts surprised and confused me, and I kept them to myself. I had served as Pixar’s president for most of the company’s existence. I loved the place and everything that it stood for. Still, I couldn’t deny that achieving the goal that had defined my professional life had left me without one. Is this all there is? I wondered. Is it time for a new challenge? It wasn’t that I thought Pixar had “arrived” or that my work was done. I knew

there were major obstacles in front of us. The company was growing quickly, with lots of shareholders to please, and we were racing to put two new films into production. There was, in short, plenty to occupy my working hours. But my internal sense of purpose—the thing that had led me to sleep on the floor of the computer lab in graduate school just to get more hours on the mainframe, that kept me awake at night, as a kid, solving puzzles in my head, that fueled my every workday—had gone missing. I’d spent two decades building a train and laying its track. Now, the thought of merely driving it struck me as a far less interesting task. Was making one film after another enough to engage me? I wondered. What would be my organizing principle now? It would take a full year for the answer to emerge.

From the start, my professional life seemed destined to have one foot in Silicon

Valley and the other in Hollywood. I first got into the film business in 1979 when, flush from the success of Star Wars, George Lucas hired me to help him bring high technology into the film industry. But he wasn’t based in Los Angeles. Instead, he’d founded his company, Lucasfilm, at the north end of the San Francisco Bay. Our offices were located in San Rafael, about an hour’s drive from Palo Alto, the heart of Silicon Valley—a moniker that was just gaining traction then, as the semiconductor and computer industries took off. That proximity gave me a front-row seat from which to observe the many emerging hardware and software companies—not to mention the growing venture capital industry—that, in the course of a few years, would come to dominate Silicon Valley from its perch on Sand Hill Road. I couldn’t have arrived at a more dynamic and volatile time. I watched as

many startups burned bright with success—and then flamed out. My mandate at Lucasfilm—to merge moviemaking with technology—meant that I rubbed shoulders with the leaders of places like Sun Microsystems and Silicon Graphics and Cray Computer, several of whom I came to know well. I was first and foremost a scientist then, not a manager, so I watched these guys closely, hoping to learn from the trajectories their companies followed. Gradually, a pattern began to emerge: Someone had a creative idea, obtained funding, brought on a lot of smart people, and developed and sold a product that got a boatload of attention. That initial success begat more success, luring the best engineers and attracting customers who had interesting and high-profile problems to solve. As these companies grew, much was written about their paradigm-shifting approaches, and when their CEOs inevitably landed on the cover of Fortune magazine, they were heralded as “Titans of the New.” I especially remember the confidence. The leaders of these companies radiated supreme confidence. Surely, they could only have reached this apex by being very, very good. But then those companies did something stupid—not just stupid-in-retrospect,

but obvious-at-the-time stupid. I wanted to understand why. What was causing smart people to make decisions that sent their companies off the rails? I didn’t doubt that they believed they were doing the right thing, but something was blinding them—and keeping them from seeing the problems that threatened to upend them. As a result, their companies expanded like bubbles, then burst. What interested me was not that companies rose and fell or that the landscape continually shifted as technology changed but that the leaders of these companies seemed so focused on the competition that they never developed any deep introspection about other destructive forces that were at work. Over the years, as Pixar struggled to find its way—first selling hardware, then

software, then making animated short films and advertisements—I asked myself:

If Pixar is ever successful, will we do something stupid, too? Can paying careful attention to the missteps of others help us be more alert to our own? Or is there something about becoming a leader that makes you blind to the things that threaten the well-being of your enterprise? Clearly, something was causing a dangerous disconnect at many smart, creative companies. What, exactly, was a mystery—and one I was determined to figure out. In the difficult year after Toy Story’s debut, I came to realize that trying to

solve this mystery would be my next challenge. My desire to protect Pixar from the forces that ruin so many businesses gave me renewed focus. I began to see my role as a leader more clearly. I would devote myself to learning how to build not just a successful company but a sustainable creative culture. As I turned my attention from solving technical problems to engaging with the philosophy of sound management, I was excited once again—and sure that our second act could be as exhilarating as our first.

It has always been my goal to create a culture at Pixar that will outlast its founding leaders—Steve, John Lasseter, and me. But it is also my goal to share our underlying philosophies with other leaders and, frankly, with anyone who wrestles with the competing—but necessarily complementary—forces of art and commerce. What you’re holding in your hands, then, is an attempt to put down on paper my best ideas about how we built the culture that is the bedrock of this place. This book isn’t just for Pixar people, entertainment executives, or animators.

It is for anyone who wants to work in an environment that fosters creativity and problem solving. My belief is that good leadership can help creative people stay on the path to excellence no matter what business they’re in. My aim at Pixar— and at Disney Animation, which my longtime partner John Lasseter and I have also led since the Walt Disney Company acquired Pixar in 2006—has been to enable our people to do their best work. We start from the presumption that our people are talented and want to contribute. We accept that, without meaning to, our company is stifling that talent in myriad unseen ways. Finally, we try to identify those impediments and fix them. I’ve spent nearly forty years thinking about how to help smart, ambitious

people work effectively with one another. The way I see it, my job as a manager is to create a fertile environment, keep it healthy, and watch for the things that undermine it. I believe, to my core, that everybody has the potential to be creative—whatever form that creativity takes—and that to encourage such development is a noble thing. More interesting to me, though, are the blocks that

get in the way, often without us noticing, and hinder the creativity that resides within any thriving company. The thesis of this book is that there are many blocks to creativity, but there are

active steps we can take to protect the creative process. In the coming pages, I will discuss many of the steps we follow at Pixar, but the most compelling mechanisms to me are those that deal with uncertainty, instability, lack of candor, and the things we cannot see. I believe the best managers acknowledge and make room for what they do not know—not just because humility is a virtue but because until one adopts that mindset, the most striking breakthroughs cannot occur. I believe that managers must loosen the controls, not tighten them. They must accept risk; they must trust the people they work with and strive to clear the path for them; and always, they must pay attention to and engage with anything that creates fear. Moreover, successful leaders embrace the reality that their models may be wrong or incomplete. Only when we admit what we don’t know can we ever hope to learn it. This book is organized into four sections—Getting Started, Protecting the

New, Building and Sustaining, and Testing What We Know. It is no memoir, but in order to understand the mistakes we made, the lessons we learned, and the ways we learned from them, it necessarily delves at times into my own history and that of Pixar. I have much to say about enabling groups to create something meaningful together and then protecting them from the destructive forces that loom even in the strongest companies. My hope is that by relating my search for the sources of confusion and delusion within Pixar and Disney Animation, I can help others avoid the pitfalls that impede and sometimes ruin businesses of all kinds. The key for me—what has kept me motivated in the nineteen years since Toy Story debuted—has been the realization that identifying these destructive forces isn’t merely a philosophical exercise. It is a crucial, central mission. In the wake of our earliest success, Pixar needed its leaders to sit up and pay attention. And that need for vigilance never goes away. This book, then, is about the ongoing work of paying attention—of leading by being self-aware, as managers and as companies. It is an expression of the ideas that I believe make the best in us possible.

PART I

GETTING STARTED

CHAPTER 1

ANIMATED

For thirteen years we had a table in the large conference room at Pixar that we call West One. Though it was beautiful, I grew to hate this table. It was long and skinny, like one of those things you’d see in a comedy sketch about an old wealthy couple that sits down for dinner—one person at either end, a candelabra in the middle—and has to shout to make conversation. The table had been chosen by a designer Steve Jobs liked, and it was elegant, all right—but it impeded our work. We’d hold regular meetings about our movies around that table—thirty of us

facing off in two long lines, often with more people seated along the walls—and everyone was so spread out that it was difficult to communicate. For those unlucky enough to be seated at the far ends, ideas didn’t flow because it was nearly impossible to make eye contact without craning your neck. Moreover, because it was important that the director and producer of the film in question be able to hear what everyone was saying, they had to be placed at the center of the table. So did Pixar’s creative leaders: John Lasseter, Pixar’s chief creative officer, and me, and a handful of our most experienced directors, producers, and writers. To ensure that these people were always seated together, someone began making place cards. We might as well have been at a formal dinner party.

meaningless. That’s what I believe. But unwittingly, we were allowing this table—and the resulting place card ritual—to send a different message. The closer you were seated to the middle of the table, it implied, the more important—the more central—you must be. And the farther away, the less likely you were to speak up—your distance from the heart of the conversation made participating feel intrusive. If the table was crowded, as it often was, still more people would sit in chairs around the edges of the room, creating yet a third tier of participants (those at the center of the table, those at the ends, and those not at the table at

all). Without intending to, we’d created an obstacle that discouraged people from jumping in. Over the course of a decade, we held countless meetings around this table in

this way—completely unaware of how doing so undermined our own core principles. Why were we blind to this? Because the seating arrangements and place cards were designed for the convenience of the leaders, including me. Sincerely believing that we were in an inclusive meeting, we saw nothing amiss because we didn’t feel excluded. Those not sitting at the center of the table, meanwhile, saw quite clearly how it established a pecking order but presumed that we—the leaders—had intended that outcome. Who were they, then, to complain? It wasn’t until we happened to have a meeting in a smaller room with a square

table that John and I realized what was wrong. Sitting around that table, the interplay was better, the exchange of ideas more free-flowing, the eye contact automatic. Every person there, no matter their job title, felt free to speak up. This was not only what we wanted, it was a fundamental Pixar belief: Unhindered communication was key, no matter what your position. At our long, skinny table, comfortable in our middle seats, we had utterly failed to recognize that we were behaving contrary to that basic tenet. Over time, we’d fallen into a trap. Even though we were conscious that a room’s dynamics are critical to any good discussion, even though we believed that we were constantly on the lookout for problems, our vantage point blinded us to what was right before our eyes. Emboldened by this new insight, I went to our facilities department. “Please,”

I said, “I don’t care how you do it, but get that table out of there.” I wanted something that could be arranged into a more intimate square, so people could address each other directly and not feel like they didn’t matter. A few days later, as a critical meeting on an upcoming movie approached, our new table was installed, solving the problem. Still, interestingly, there were remnants of that problem that did not

immediately vanish just because we’d solved it. For example, the next time I walked into West One, I saw the brand-new table, arranged—as requested—in a more intimate square that made it possible for more people to interact at once. But the table was adorned with the same old place cards! While we’d fixed the key problem that had made place cards seem necessary, the cards themselves had become a tradition that would continue until we specifically dismantled it. This wasn’t as troubling an issue as the table itself, but it was something we had to address because cards implied hierarchy, and that was precisely what we were trying to avoid. When Andrew Stanton, one of our directors, entered the meeting room that morning, he grabbed several place cards and began randomly moving

them around, narrating as he went. “We don’t need these anymore!” he said in a way that everyone in the room grasped. Only then did we succeed in eliminating this ancillary problem. This is the nature of management. Decisions are made, usually for good

reasons, which in turn prompt other decisions. So when problems arise—and they always do—disentangling them is not as simple as correcting the original error. Often, finding a solution is a multi-step endeavor. There is the problem you know you are trying to solve—think of that as an oak tree—and then there are all the other problems—think of these as saplings—that sprouted from the acorns that fell around it. And these problems remain after you cut the oak tree down. Even after all these years, I’m often surprised to find problems that have

existed right in front of me, in plain sight. For me, the key to solving these problems is finding ways to see what’s working and what isn’t, which sounds a lot simpler than it is. Pixar today is managed according to this principle, but, in a way, I’ve been searching all my life for better ways of seeing. It began decades before Pixar even existed.

When I was a kid, I used to plunk myself down on the living room floor of my family’s modest Salt Lake City home a few minutes before 7 P.M. every Sunday and wait for Walt Disney. Specifically, I’d wait for him to appear on our black-and-white RCA with its tiny 12-inch screen. Even from a dozen feet away—the accepted wisdom at the time was that viewers should put one foot between them and the TV for every inch of screen—I was transfixed by what I saw. Each week, Walt Disney himself opened the broadcast of The Wonderful

World of Disney. Standing before me in suit and tie, like a kindly neighbor, he would demystify the Disney magic. He’d explain the use of synchronized sound in Steamboat Willie or talk about the importance of music in Fantasia. He always went out of his way to give credit to his forebears, the men—and, at this point, they were all men—who’d done the pioneering work upon which he was building his empire. He’d introduce the television audience to trailblazers such as Max Fleischer, of Koko the Clown and Betty Boop fame, and Winsor McCay, who made Gertie the Dinosaur—the first animated film to feature a character that expressed emotion—in 1914. He’d gather a group of his animators, colorists, and storyboard artists to explain how they made Mickey Mouse and Donald Duck come to life. Each week, Disney created a made-up world, used cutting-edge technology to enable it, and then told us how he’d done it. Walt Disney was one of my two boyhood idols. The other was Albert Einstein.

To me, even at a young age, they represented the two poles of creativity. Disney was all about inventing the new. He brought things into being—both artistically and technologically—that did not exist before. Einstein, by contrast, was a master of explaining that which already was. I read every Einstein biography I could get my hands on as well as a little book he wrote on his theory of relativity. I loved how the concepts he developed forced people to change their approach to physics and matter, to view the universe from a different perspective. Wild-haired and iconic, Einstein dared to bend the implications of what we thought we knew. He solved the biggest puzzles of all and, in doing so, changed our understanding of reality. Both Einstein and Disney inspired me, but Disney affected me more because

of his weekly visits to my family’s living room. “When you wish upon a star, makes no difference who you are,” his TV show’s theme song would announce as a baritone-voiced narrator promised: “Each week, as you enter this timeless land, one of these many worlds will open to you.… ” Then the narrator would tick them off: Frontierland (“tall tales and true from the legendary past”), Tomorrowland (“the promise of things to come”), Adventureland (“the wonder world of nature’s own realm”), and Fantasyland (“the happiest kingdom of them all”). I loved the idea that animation could take me places I’d never been. But the land I most wanted to learn about was the one occupied by the innovators at Disney who made these animated films. Between 1950 and 1955, Disney made three movies we consider classics

today: Cinderella, Peter Pan, and Lady and the Tramp. More than half a century later, we all remember the glass slipper, the Island of Lost Boys, and that scene where the cocker spaniel and the mutt slurp spaghetti. But few grasp how technically sophisticated these movies were. Disney’s animators were at the forefront of applied technology; instead of merely using existing methods, they were inventing ones of their own. They had to develop the tools to perfect sound and color, to use blue screen matting and multiplane cameras and xerography. Every time some technological breakthrough occurred, Walt Disney incorporated it and then talked about it on his show in a way that highlighted the relationship between technology and art. I was too young to realize such a synergy was groundbreaking. To me, it just made sense that they belonged together. Watching Disney one Sunday evening in April of 1956, I experienced

something that would define my professional life. What exactly it was is difficult to describe except to say that I felt something fall into place inside my head. That night’s episode was called “Where Do the Stories Come From?” and Disney kicked it off by praising his animators’ knack for turning everyday occurrences into cartoons. That night, though, it wasn’t Disney’s explanation that

pulled me in but what was happening on the screen as he spoke. An artist was drawing Donald Duck, giving him a jaunty costume and a bouquet of flowers and a box of candy with which to woo Daisy. Then, as the artist’s pencil moved around the page, Donald came to life, putting up his dukes to square off with the pencil lead, then raising his chin to allow the artist to give him a bow tie. The definition of superb animation is that each character on the screen makes

you believe it is a thinking being. Whether it’s a T-Rex or a slinky dog or a desk lamp, if viewers sense not just movement but intention—or, put another way, emotion—then the animator has done his or her job. It’s not just lines on paper anymore; it’s a living, feeling entity. This is what I experienced that night, for the first time, as I watched Donald leap off the page. The transformation from a static line drawing to a fully dimensional, animated image was sleight of hand, nothing more, but the mystery of how it was done—not just the technical process but the way the art was imbued with such emotion—was the most interesting problem I’d ever considered. I wanted to climb through the TV screen and be part of this world.

The mid-1950s and early 1960s were, of course, a time of great prosperity and industry in the United States. Growing up in Utah in a tight-knit Mormon community, my four younger brothers and sisters and I felt that anything was possible. Because the adults we knew had all lived through the Depression, World War II, and then the Korean War, this period felt to them like the calm after a thunderstorm. I remember the optimistic energy—an eagerness to move forward that was

enabled and supported by a wealth of emerging technologies. It was boom time in America, with manufacturing and home construction at an all-time high. Banks were offering loans and credit, which meant more and more people could own a new TV, house, or Cadillac. There were amazing new appliances like disposals that ate your garbage and machines that washed your dishes, although I certainly did my share of cleaning them by hand. The first organ transplants were performed in 1954; the first polio vaccine came a year later; in 1956, the term artificial intelligence entered the lexicon. The future, it seemed, was already here. Then, when I was twelve, the Soviets launched the first artificial satellite—

Sputnik 1—into earth’s orbit. This was huge news, not just in the scientific and political realms but in my sixth grade classroom at school, where the morning routine was interrupted by a visit from the principal, whose grim expression told us that our lives had changed forever. Since we’d been taught that the

Communists were the enemy and that nuclear war could be waged at the touch of a button, the fact that they’d beaten us into space seemed pretty scary—proof that they had the upper hand. The United States government’s response to being bested was to create

something called ARPA, or the Advanced Research Projects Agency. Though it was housed within the Defense Department, its mission was ostensibly peaceful: to support scientific researchers in America’s universities in the hopes of preventing what it termed “technological surprise.” By sponsoring our best minds, the architects of ARPA believed, we’d come up with better answers. Looking back, I still admire that enlightened reaction to a serious threat: We’ll just have to get smarter. ARPA would have a profound effect on America, leading directly to the computer revolution and the Internet, among countless other innovations. There was a sense that big things were happening in America, with much more to come. Life was full of possibility. Still, while my family was middle-class, our outlook was shaped by my

father’s upbringing. Not that he talked about it much. Earl Catmull, the son of an Idaho dirt farmer, was one of fourteen kids, five of whom had died as infants. His mother, raised by Mormon pioneers who made a meager living panning for gold in the Snake River in Idaho, didn’t attend school until she was eleven. My father was the first in his family ever to go to college, paying his own way by working several jobs. During my childhood, he taught math during the school year and built houses during the summers. He built our house from the ground up. While he never explicitly said that education was paramount, my siblings and I all knew we were expected to study hard and go to college. I was a quiet, focused student in high school. An art teacher once told my

parents I would often become so lost in my work that I wouldn’t hear the bell ring at the end of class; I’d be sitting there, at my desk, staring at an object—a vase, say, or a chair. Something about the act of committing that object to paper was completely engrossing—the way it necessitated seeing only what was there and shutting out the distraction of my ideas about chairs or vases and what they were supposed to look like. At home, I sent away for Jon Gnagy’s Learn to Draw art kits—which were advertised in the back of comic books—and the 1948 classic Animation, written and drawn by Preston Blair, the animator of the dancing hippos in Disney’s Fantasia. I bought a platen—the flat metal plate artists use to press paper against ink—and even built a plywood animation stand with a light under it. I made flipbooks—one was of a man whose legs turned into a unicycle—while nursing my first crush, Tinker Bell, who had won my heart in Peter Pan. Nevertheless, it soon became clear to me that I would never be talented

enough to join Disney Animation’s vaunted ranks. What’s more, I had no idea how one actually became an animator. There was no school for it that I knew of. As I finished high school, I realized I had a far better understanding of how one became a scientist. The route seemed easier to discern. Throughout my life, people have always smiled when I told them I switched from art to physics because it seems, to them, like such an incongruous leap. But my decision to pursue physics, and not art, would lead me, indirectly, to my true calling.

Four years later, in 1969, I graduated from the University of Utah with two degrees, one in physics and the other in the emerging field of computer science. Applying to graduate school, my intention was to learn how to design computer languages. But soon after I matriculated, also at the U of U, I met a man who would encourage me to change course: one of the pioneers of interactive computer graphics, Ivan Sutherland. The field of computer graphics—in essence, the making of digital pictures out

of numbers, or data, that can be manipulated by a machine—was in its infancy then, but Professor Sutherland was already a legend. Early in his career, he had devised something called Sketchpad, an ingenious computer program that allowed figures to be drawn, copied, moved, rotated, or resized, all while retaining their basic properties. In 1968, he’d co-created what is widely believed to be the first virtual reality head-mounted display system. (The device was named The Sword of Damocles, after the Greek myth, because it was so heavy that in order to be worn by the person using it, it had to be suspended from a mechanical arm bolted to the ceiling.) Sutherland and Dave Evans, who was chair of the university’s computer science department, were magnets for bright students with diverse interests, and they led us with a light touch. Basically, they welcomed us to the program, gave us workspace and access to computers, and then let us pursue whatever turned us on. The result was a collaborative, supportive community so inspiring that I would later seek to replicate it at Pixar. One of my classmates, Jim Clark, would go on to found Silicon Graphics and

Netscape. Another, John Warnock, would co-found Adobe, known for Photoshop and the PDF file format, among other things. Still another, Alan Kay, would lead on a number of fronts, from object-oriented programming to “windowing” graphical user interfaces. In many respects, my fellow students were the most inspirational part of my university experience; this collegial, collaborative atmosphere was vital not just to my enjoyment of the program but also to the quality of the work that I did. This tension between the individual’s personal creative contribution and the

leverage of the group is a dynamic that exists in all creative environments, but this would be my first taste of it. On one end of the spectrum, I noticed, we had the genius who seemed to do amazing work on his or her own; on the other end, we had the group that excelled precisely because of its multiplicity of views. How, then, should we balance these two extremes, I wondered. I didn’t yet have a good mental model that would help me answer that, but I was developing a fierce desire to find one. Much of the research being done at the U of U’s computer science department

was funded by ARPA. As I’ve said, ARPA had been created in response to Sputnik, and one of its key organizing principles was that collaboration could lead to excellence. In fact, one of ARPA’s proudest achievements was linking universities with something they called “ARPANET,” which would eventually evolve into the Internet. The first four nodes on the ARPANET were at the Stanford Research Institute, UCLA, UC Santa Barbara, and the U of U, so I had a ringside seat from which to observe this grand experiment, and what I saw influenced me profoundly. ARPA’s mandate—to support smart people in a variety of areas—was carried out based on the unwavering presumption that researchers would try to do the right thing and, in ARPA’s view, overmanaging them was counterproductive. ARPA’s administrators did not hover over the shoulders of those of us working on the projects they funded, nor did they demand that our work have direct military applications. They simply trusted us to innovate. This kind of trust gave me the freedom to tackle all sorts of complex

problems, and I did so with gusto. Not only did I often sleep on the floor of the computer rooms to maximize time on the computer, but so did many of my fellow graduate students. We were young, driven by the sense that we were inventing the field from scratch—and that was exciting beyond words. For the first time, I saw a way to simultaneously create art and develop a technical understanding of how to create a new kind of imagery. Making pictures with a computer spoke to both sides of my brain. To be sure, the pictures that could be rendered on a computer were very crude in 1969, but the act of inventing new algorithms and seeing better pictures as a result was thrilling to me. In its own way, my childhood dream was reasserting itself. At the age of twenty-six, I set a new goal: to develop a way to animate, not

with a pencil but with a computer, and to make the images compelling and beautiful enough to use in the movies. Perhaps, I thought, I could become an animator after all.

In the spring of 1972, I spent ten weeks making my first short animated film—a digitized model of my left hand. My process combined old and new; again, like everyone in this fast-changing field, I was helping to invent the language. First I plunged my hand into a tub of plaster of Paris (forgetting, unfortunately, to coat it in Vaseline first, which meant I had to yank out every tiny hair on the back of my hand to get it free); then, once I had the mold, I filled it with more plaster to make a model of my hand; then, I took that model and covered it with 350 tiny interlocking triangles and polygons to create what looked like a net of black lines on its “skin.” You may not think that a curved surface could be built out of such flat, angular elements, but when you make them small enough, you can get pretty close.
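To see why small, flat pieces can pass for a curve, it helps to work the two-dimensional version of the problem: approximating a circle with straight chords. Here is a minimal Python sketch (mine, not the book's; the function and numbers are illustrative) showing the worst-case gap between chord and curve collapsing as the pieces shrink:

    import math

    def max_deviation(n):
        # Worst-case gap between a unit circle and a chord spanning one of
        # n equal arcs -- the "sagitta" of the arc: 1 - cos(pi / n).
        return 1 - math.cos(math.pi / n)

    for n in (8, 50, 350):
        print(f"{n:4d} flat segments -> max gap {max_deviation(n):.5f}")
    # 8 segments leave a gap of ~8% of the radius; 350 shrink it to ~0.004%.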

I’d chosen this project because I was interested in rendering complex objects and curved surfaces—and I was looking for a challenge. At that time, computers weren’t great at showing flat objects, let alone curved ones. The mathematics of curved surfaces was not well developed, and computers had limited memory capability. At the U of U’s computer graphics department, where every one of us yearned to make computer-generated images look as if they were photographs of real objects, we had three driving goals: speed, realism, and the ability to depict curved surfaces. My film sought to address the latter two. The human hand doesn’t have a single flat plane. And unlike a simpler curved

surface—a ball, for example—it has many parts that act in opposition to one another, with a seemingly infinite number of resulting movements. The hand is

an incredibly complex “object” to try to capture and translate into arrays of numbers. Given that most computer animation at the time consisted of rendering simple polygonal objects (cubes, pyramids), I had my work cut out for me. Once I had drawn the triangles and polygons on my model, I measured the

coordinates of each of their corners, then entered that data into a 3D animation program I’d written. That enabled me to display the many triangles and polygons that made up my virtual hand on a monitor. In its first incarnation, sharp edges could be seen at the seams where the polygons joined together. But later, thanks to “smooth shading”—a technique, developed by another graduate student, that diminished the appearance of those edges—the hand became more lifelike.
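How smooth shading works isn't spelled out in the text, but the idea (usually credited to fellow Utah graduate student Henri Gouraud) is brief: give each vertex a normal averaged from the flat facets that meet there, then interpolate shaded brightness across each facet so the seams fade. A hypothetical Python sketch, assuming the mesh is stored as a vertex array plus triangle index triples:

    import numpy as np

    def vertex_normals(vertices, triangles):
        # vertices: (n, 3) corner coordinates, like those measured off the
        # plaster hand; triangles: triples of vertex indices.
        vertices = np.asarray(vertices, dtype=float)
        normals = np.zeros_like(vertices)
        for i, j, k in triangles:
            a, b, c = vertices[i], vertices[j], vertices[k]
            face_normal = np.cross(b - a, c - a)   # normal of the flat facet
            for v in (i, j, k):
                normals[v] += face_normal          # accumulate per vertex
        lengths = np.linalg.norm(normals, axis=1, keepdims=True)
        return normals / np.where(lengths == 0.0, 1.0, lengths)

    def lambert_shade(normal, light_dir):
        # Brightness of a point facing a directional light; interpolating
        # per-vertex values across each facet hides the polygon edges.
        return max(0.0, float(np.dot(normal, light_dir)))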

The real challenge, though, was making it move. Hand, which debuted at a computer science conference in 1973, caused a bit of a stir because no one had ever seen anything like it before. In it, my hand, which appears at first to be covered in a white net of polygons, begins to open

and close, as if trying to make a fist. Then my hand’s surface becomes smoother, more like the real thing. There is a moment when my hand points directly at the viewer as if to say, “Yes, I’m talking to you.” Then, the camera goes inside the hand and takes a look around, aiming its lens inside the palm and up into each finger, a tricky bit of perspective that I liked because it could be depicted only via computer. Those four minutes of film had taken me more than sixty thousand minutes to complete. Together with a digitized film that my friend Fred Parke made of his wife’s

face around the same time, Hand represented the state-of-the-art in computer animation for years after it was made. Snippets of both Fred’s and my films would be featured in the 1976 movie Futureworld, which—though mostly forgotten by moviegoers today—is still remembered by aficionados as the first full-length feature to use computer-generated animation.

Professor Sutherland used to say that he loved his graduate students at Utah because we didn’t know what was impossible. Neither, apparently, did he: He was among the first to believe that Hollywood movie execs would care a fig about what was happening in academia. To that end, he sought to create a formal exchange program with Disney, wherein the studio would send one of its animators to Utah to learn about new technologies in computer rendering, and the university would send a student to Disney Animation to learn more about how to tell stories. In the spring of 1973, he sent me to Burbank to try to sell this idea to the

Disney executives. It was a thrill for me to drive through the red brick gates and onto the Disney lot on my way to the original Animation Building, built in 1940 with a “Double H” floor plan personally supervised by Walt himself to ensure that as many rooms as possible had windows to let in natural light. While I’d studied this place—or what I could glimpse of it on our 12-inch RCA—walking into it was a little like stepping into the Parthenon for the first time. There, I met Frank Thomas and Ollie Johnston, two of Walt’s “Nine Old Men,” the group of legendary animators who had created so many of the characters in the Disney movies I loved, from Pinocchio to Peter Pan. At one point I was taken into the archives where all the original paper drawings from all the animated films were kept, with rack after rack after rack of the images that had fueled my imagination. I’d entered the Promised Land. One thing was immediately clear. The people I met at Disney—one of whom,

I swear, was named Donald Duckwall—had zero interest in Sutherland’s exchange program. The technically adventuresome Walt Disney was long gone.

My enthusiastic descriptions were met with blank stares. To them, computers and animation simply didn’t mix. How did they know this? Because the one time they had turned to computers for help—to render images of millions of bubbles in their 1971 live-action movie Bedknobs and Broomsticks—the computers had apparently let them down. The state of the technology at the time was so poor, particularly for curved images, that bubbles were beyond the computers’ reach. Unfortunately, this didn’t help my cause. “Well,” more than one Disney executive told me that day, “until computer animation can do bubbles, then it will not have arrived.” Instead, they tried to tempt me into taking a job with what is now called

Disney Imagineering, the division that designs the theme parks. It may sound odd, given how large Walt Disney had always loomed in my life, but I turned the offer down without hesitation. The theme park job felt like a diversion that would lead me down a path I didn’t want to be on. I didn’t want to design rides for a living. I wanted to animate with a computer.

Just as Walt Disney and the pioneers of hand-drawn animation had done decades before, those of us who sought to make pictures with computers were trying to create something new. When one of my colleagues at the U of U invented something, the rest of us would immediately piggyback on it, pushing that new idea forward. There were setbacks, too, of course. But the overriding feeling was one of progress, of moving steadily toward a distant goal. Long before I’d heard about Disney’s bubble problem, what kept me and

many of my fellow graduate students up at night was the need to continue to hone our methods for creating smoothly curved surfaces with the computer—as well as to figure out how to add richness and complexity to the images we were creating. My dissertation, “A Subdivision Algorithm for Computer Display of Curved Surfaces,” offered a solution to that problem. Much of what I spent every waking moment thinking about then was

extremely technical and difficult to explain, but I’ll give it a try. The idea behind what I called “subdivision surfaces” was that instead of setting out to depict the whole surface of a shiny, red bottle, for example, we could divide that surface into many smaller pieces. It was easier to figure out how to color and display each tiny piece—which we could then put together to create our shiny, red bottle. (As I’ve noted, computer memory capacity was quite small in those days, so we put a lot of energy into developing tricks to overcome that limitation. This was one of those tricks.) But what if you wanted that shiny, red bottle to be zebra-striped? In my dissertation, I figured out a way that I could take a zebra-print or

woodgrain pattern, say, and wrap it around any object. “Texture mapping,” as I called it, was like having stretchable wrapping paper

that you could apply to a curved surface so that it fit snugly. The first texture map I made involved projecting an image of Mickey Mouse onto an undulating surface. I also used Winnie the Pooh and Tigger to illustrate my points. I may not have

been ready to work at Disney, but their characters were still the touchstones I referenced. At the U of U, we were inventing a new language. One of us would contribute
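The dissertation's machinery is far more involved, but the essence of texture mapping fits in a few lines: every point on the curved surface carries parameter coordinates (u, v), and it simply inherits the color found at (u, v) in the flat image. A hypothetical Python sketch, with a checkerboard standing in for the zebra print or the Mickey Mouse image:

    import math

    def checker(u, v, squares=8):
        # A stand-in texture: black-and-white squares over the unit square.
        return (int(u * squares) + int(v * squares)) % 2

    def undulating_surface(u, v):
        # A wavy surface; (u, v) in [0, 1] parameterize it, z is the height.
        z = 0.1 * math.sin(2 * math.pi * u) * math.sin(2 * math.pi * v)
        return (u, v, z)

    # The "stretchable wrapping paper": the surface point at (u, v) is
    # painted with the image color at the same (u, v).
    point = undulating_surface(0.3, 0.7)
    color = checker(0.3, 0.7)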

a verb, another a noun, then a third person would figure out ways to string the elements together to actually say something. My invention of something called the “Z-buffer” was a good example of this, in that it built on others’ work. The Z-buffer was designed to address the problem of what happens when one computer-animated object is hidden, or partially hidden, behind another one. Even though the data that describes every aspect of the hidden object is in the computer’s memory (meaning that you could see it, if need be), the desired spatial relationships mean that it should not be fully seen. The challenge was to figure out a way to tell the computer to meet that goal. For example, if a sphere were in front of a cube, partially blocking it, the sphere’s surface should be visible on the screen, as should the parts of the cube that are not blocked by the sphere. The Z-buffer accomplished that by assigning a depth to every object in three-dimensional space, then telling the computer to match each of the screen’s pixels to whatever object was the closest. Computer memory was so limited—as I’ve said—that this wasn’t a practical solution, but I had found a new way of solving the problem. Although it sounds simple, it is anything but. Today, there is a Z-buffer in every game and PC chip manufactured on earth.
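That per-pixel rule (keep whatever is closest) translates almost directly into code. A minimal, hypothetical Python sketch; the buffer sizes, names, and depth values are mine, for illustration:

    import math

    WIDTH, HEIGHT = 64, 48
    zbuffer = [[math.inf] * WIDTH for _ in range(HEIGHT)]      # depth per pixel
    framebuffer = [[(0, 0, 0)] * WIDTH for _ in range(HEIGHT)] # color per pixel

    def plot(x, y, depth, color):
        # Draw only if nothing nearer has already claimed this pixel.
        if depth < zbuffer[y][x]:
            zbuffer[y][x] = depth
            framebuffer[y][x] = color

    # A cube fragment and a sphere fragment landing on the same pixel:
    plot(10, 10, depth=5.0, color=(200, 0, 0))  # cube, farther away
    plot(10, 10, depth=2.0, color=(0, 0, 200))  # sphere, nearer, so it wins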

After receiving my Ph.D. in 1974, I left Utah with a nice little list of innovations under my belt, but I was keenly aware that I’d only done all this in the service of a larger mutual goal. Like my classmates’, the work I’d championed had taken hold largely because of the protective, eclectic, intensely challenging environment I’d been in. The leaders of my department understood that to create a fertile laboratory, they had to assemble different kinds of thinkers and then encourage their autonomy. They had to offer feedback when needed but also had to be willing to stand back and give us room. I felt instinctively that this kind of environment was rare and worth reaching for. I knew that the most valuable thing I was taking away from the U of U was the model my teachers had provided for how to lead and inspire other creative thinkers. The question for me, then, was how to get myself into another environment like this—or how to build one of my own.

I walked away from Utah with a clearer sense of my goal, and I was prepared to devote my life to it: making the first computer-animated film. But getting to that point would not be easy. There were, I guessed, at least another ten years of development needed to figure out how to model and animate characters and render them in complex environments before we could even begin to conceive of making a short—let alone a feature—film. I also didn’t yet know that my self-assigned mission was about much more than technology. To pull it off, we’d have to be creative not only technically but also in the ways that we worked together.

computer-generated film; in fact, each time I expressed that goal in job interviews at universities, it seemed to cast a pall over the room. “But we want you to teach computer science,” my interviewers would say. What I was proposing to do looked, to most academics, like a pipe dream, an expensive fantasy. Then, in November 1974, I received a mysterious call from a woman who said

she worked at something called the New York Institute of Technology. She said she was the secretary to the institute’s president, and she was calling to book my airplane ticket. I didn’t know what she was talking about, and I told her so. What was the name of the institute again? I asked. Why did she want me to fly to New York? There was an awkward silence. “I’m sorry,” she said. “Someone else was supposed to call you before I did.” And with that, she hung up. The next phone call I received would change my

life.

CHAPTER 2

PIXAR IS BORN

What does it mean to manage well? As a young man, I certainly had no idea, but I was about to begin figuring it

out by taking a series of jobs—working for three iconoclastic men with very different styles—that would provide me with a crash course in leadership. In the next decade, I would learn much about what managers should and shouldn’t do, about vision and delusion, about confidence and arrogance, about what encourages creativity and what snuffs it out. As I gained experience, I was asking questions that intrigued me even as they confused me. Even now, forty years later, I’ve never stopped questioning. I want to start with my first boss, Alex Schure—the man whose secretary

called me out of the blue that day in 1974 to book me an airplane ticket and then, realizing her mistake, slammed down the receiver. When the phone rang again, a few minutes later, an unfamiliar voice—this time, a man who said he worked for Alex—filled me in: Alex was starting a research lab on Long Island’s North Shore whose mission was to bring computers into the animation process. Money was not a problem, he assured me—Alex was a multimillionaire. What they needed was someone to run the place. Was I interested in talking? Within weeks I was moving into my new office at the New York Institute of

Technology. Alex, a former college chancellor, had zero expertise in the field of computer

science. At the time, that wasn’t unusual, but Alex himself certainly was. He naïvely thought that computers would soon replace people, and leading that charge was what excited him. (We knew this was a misconception, if a common one at that point, but we were grateful for his eagerness to fund our work.) He had a bizarre way of speaking that mixed bluster, non sequiturs, and even snippets of rhyming verse into a sort of Mad Hatter–ish patois—or “word salad,” as one of my colleagues called it. (“Our vision will speed up time,” he would

say, “eventually deleting it.”) Those of us who worked with him often had trouble understanding what he meant. Alex had a secret ambition—well, not so secret. He said almost every day that he didn’t want to be the next Walt Disney, which only made us all think that he did. When I arrived, he was in the process of directing a hand-drawn animated movie called Tubby the Tuba. Really, the thing never had a chance—no one at NYIT had the training or the story sensibility to make a film, and when it was finally released, it vanished without a trace. Deluded though he may have been about his own skills, Alex was a visionary.

He was incredibly prescient about the role computers would someday play in animation, and he was willing to spend a lot of his own money to push that vision forward. His unwavering commitment to what many labeled a pipe dream—the melding of technology and this hand-drawn art form—enabled much groundbreaking work to be done. Once Alex brought me in, he left it to me to assemble a team. I have to give

that to him: He had total confidence in the people he hired. This was something I admired and, later, sought to do myself. One of the first people I interviewed was Alvy Ray Smith, a charismatic Texan with a Ph.D. in computer science and a sparkling resume that included teaching stints at New York University and UC Berkeley and a gig at Xerox PARC, the distinguished R&D lab in Palo Alto. I had conflicting feelings when I met Alvy because, frankly, he seemed more qualified to lead the lab than I was. I can still remember the uneasiness in my gut, that instinctual twinge spurred by a potential threat: This, I thought, could be the guy who takes my job one day. I hired him anyway. Some might have seen hiring Alvy as a confident move. The truth is, as a

twenty-nine-year-old who’d been focused on research for four years and had never had an assistant, let alone hired and managed a staff, I was feeling anything but confident. I could see, however, that NYIT was a place where I could explore what I’d set out to do as a graduate student. To ensure that it succeeded, I needed to attract the sharpest minds; to attract the sharpest minds, I needed to put my own insecurities away. The lesson of ARPA had lodged in my brain: When faced with a challenge, get smarter. So we did. Alvy would become one of my closest friends and most trusted

collaborators. And ever since, I’ve made a policy of trying to hire people who are smarter than I am. The obvious payoffs of exceptional people are that they innovate, excel, and generally make your company—and, by extension, you— look good. But there is another, less obvious, payoff that only occurred to me in retrospect. The act of hiring Alvy changed me as a manager: By ignoring my fear, I learned that the fear was groundless. Over the years, I have met people

who took what seemed the safer path and were the lesser for it. By hiring Alvy, I had taken a risk, and that risk yielded the highest reward—a brilliant, committed teammate. I had wondered in graduate school how I could ever replicate the singular environment of the U of U. Now, suddenly, I saw the way. Always take a chance on better, even if it seems threatening. At NYIT, we focused on a single goal: pushing the boundaries of what

computers could do in animation and graphics. And as word of our mission spread, we began to attract the top people in the field. The bigger my staff became, the more urgent it was that I figure out how to manage them. I created a flat organizational structure, much like I’d experienced in academia, largely because I naïvely thought that if I put together a hierarchical structure— assigning a bunch of managers to report to me—I would have to spend too much time managing and not enough time on my own work. This structure—in which I entrusted everybody to drive their own projects forward, at their own pace— had its limits, but the fact is, giving a ton of freedom to highly self-motivated people enabled us to make some significant technological leaps in a short time. Together, we did groundbreaking work, much of which was aimed at figuring out how to integrate the computer with hand-drawn animation. In 1977, for example, I wrote a 2D animation program called Tween, which

Another technical challenge that occupied us was the need for something called “motion blur.” With animation in general and computer animation in particular, the images created are in perfect focus. That may sound like a good thing, but in fact, human beings react negatively to it. When moving objects are in perfect focus, theatergoers experience an unpleasant, strobe-like sensation, which they describe as “jerky.” When watching live-action movies, we don’t perceive this problem because traditional film cameras capture a slight blur in the direction an object is moving. The blur keeps our brains from noticing the sharp edges, and our brains regard this blur as natural. Without motion blur, our brains think something is wrong. So the question for us was how to simulate the blur for animation. If the human eye couldn’t accept computer animation, the field would have no future.
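One standard way to simulate that blur is temporal supersampling: rather than sampling a moving object at a single instant per frame, the renderer samples it at several instants spread across the frame’s exposure window and averages the results, so fast-moving edges smear just as they do on film. The sketch below is a generic one-dimensional illustration of the idea, with hypothetical names; it is not the specific method any particular studio used.

```python
# Generic 1-D illustration of motion blur via temporal supersampling
# (hypothetical names; not any studio's actual method).
# Each pixel averages several samples taken at instants spread across
# the frame's exposure window, so a fast-moving object leaves a smear.

def coverage(x, center, radius):
    """Return 1.0 if pixel x is covered by the object, else 0.0."""
    return 1.0 if abs(x - center) <= radius else 0.0

def render_frame(position_at, frame, width=20, samples=8, radius=1.0):
    """Render one blurred scanline; position_at(t) gives the object's center."""
    pixels = []
    for x in range(width):
        hits = sum(
            coverage(x, position_at(frame + s / samples), radius)
            for s in range(samples)  # sample instants within the exposure
        )
        pixels.append(hits / samples)  # fraction of the exposure this pixel was covered
    return pixels

# An object moving five pixels per frame renders as a soft streak,
# not a hard-edged shape.
blurred = render_frame(lambda t: 5.0 * t, frame=1)
print(" ".join(f"{p:.2f}" for p in blurred))
```

Averaging more samples smooths the streak at a proportional cost in computation, the same trade-off real renderers face.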

Among the handful of companies that were trying to solve these problems, most embraced a culture of strictly enforced, even CIA-like secrecy. We were in a race, after all, to be the first to make a computer-animated feature film, so many who were pursuing this technology held their discoveries close to their vests. After talking about it, however, Alvy and I decided to do the opposite—to share our work with the outside world. My view was that we were all so far from
achieving our goal that to hoard ideas only impeded our ability to get to the finish line. Instead, NYIT engaged with the computer graphics community, publishing everything we discovered, participating in committees to review papers written by all manner of researchers, and taking active roles at all the major academic conferences. The benefit of this transparency was not immediately felt (and, notably, when we decided upon it, we weren’t even counting on a payoff; it just seemed like the right thing to do). But the relationships and connections we formed, over time, proved far more valuable than we could have imagined, fueling our technical innovation and our understanding of creativity in general. For all the good work we were doing, however, I found myself in a quandary
at NYIT. Thanks to Alex, we were fortunate to have the funds to buy the equipment and hire the people necessary to innovate in the world of computer animation, but we didn’t have anyone who knew anything about filmmaking. As we developed the ability to tell a story with a computer, we still didn’t have storytellers among us, and we were the poorer for it. So aware were Alvy and I of this limitation that we began making quiet overtures to Disney and other studios, trying to gauge their interest in investing in our tools. If we found an interested suitor, Alvy and I were prepared to leave NYIT and move our team to Los Angeles to partner with proven filmmakers and storytellers. But it was not to be. One by one, they demurred. It’s hard to imagine now, but in 1976, the idea of incorporating high technology into Hollywood filmmaking wasn’t just a low priority; it wasn’t even on the radar. But one man was about to change that, with a movie called Star Wars.

On May 25, 1977, Star Wars opened in theaters across America. The film’s mastery of visual effects—and its record-shattering popularity at the box office—would change the industry forever. And thirty-two-year-old writer-director George Lucas was only getting started. His company, Lucasfilm, and its ascendant Industrial Light & Magic studio had already taken the lead in developing new tools in visual effects and sound design. Now, while no one else in the movie industry evinced even the slightest desire to invest in such things, George resolved in July 1979 to launch a computer division. Thanks to Luke Skywalker, he had the resources to do it right. To run this division, he wanted someone who not only knew computers; he
wanted someone who loved film and believed that the two could not only coexist but enhance one another. Eventually, that led George to me. One of his key people, Richard Edlund, who was a pioneer of special effects, came to see me
one afternoon in my office at NYIT wearing a belt with an enormous buckle that read, in huge letters, “Star Wars.” This was worrisome, given that I was trying to keep his visit a secret from Alex Schure. Somehow, though, Alex didn’t catch on. George’s emissary was apparently pleased with what I showed him, because a few weeks after he left, I was on my way to Lucasfilm in California for a formal interview. My first meeting there was with a man named Bob Gindy, who ran George’s
personal construction projects—not exactly the qualifications you’d expect for a guy spearheading the search for a new computer executive. The first thing he asked me was, “Who else should Lucasfilm be considering for this job?” Meaning, the job I was there to interview for. Without hesitation, I rattled off the names of several people who were doing impressive work in a variety of technical areas. My willingness to do this reflected my worldview, forged in academia, that any hard problem should have many good minds simultaneously trying to solve it. Not to acknowledge that seemed silly. Only later would I learn that the guys at Lucasfilm had already interviewed all the people I listed and had asked them, in turn, to make similar recommendations—and not one of them had suggested any other names! To be sure, working for George Lucas was a plum job that you’d have to be crazy not to want. But to go mute, as my rivals did, when asked to evaluate the field signaled not just intense competitiveness but also lack of confidence. Soon I’d landed an interview with George himself. On my way to meet him, I remember feeling nervous in a way I rarely had
before. Even before Star Wars, George had proved himself as a successful writer-director-producer with American Graffiti. I was a computer guy with an expensive dream. Still, when I arrived at the shooting stage in Los Angeles where he was working, he and I seemed pretty similar: Skinny and bearded, in our early thirties, we both wore glasses, worked with a blinders-on intensity, and had a tendency to talk only when we had something to say. But what struck me immediately was George’s relentless practicality. He wasn’t some hobbyist trying to bring technology into filmmaking for the heck of it. His interest in computers began and ended with their potential to add value to the filmmaking process—be it through digital optical printing, digital audio, digital nonlinear editing, or computer graphics. I was certain that they could, and I told him so. In the intervening years, George has said that he hired me because of my
honesty, my “clarity of vision,” and my steadfast belief in what computers could do. Not long after we met, he offered me the job. By the time I moved into the two-story building in San Anselmo that would
serve as the temporary headquarters of Lucasfilm’s new computer division, I had given myself an assignment: to rethink how I managed people. What George
wanted to create was a far more ambitious enterprise than the one I oversaw at NYIT, with a higher profile, a bigger budget, and, given his ambitions in Hollywood, the promise of much greater impact. I wanted to make sure that I was enabling my team to make the most of that. At NYIT, I’d created a flat structure much like I’d seen at the U of U, giving my colleagues a lot of running room and little oversight, and I’d been relatively pleased with the results. But now I had to admit that our team there behaved a lot like a collection of grad students—independent thinkers with individual projects—rather than a team with a common goal. A research lab is not a university, and the structure didn’t scale well. At Lucasfilm, then, I decided to hire managers to run the graphics, video, and audio groups; they would then report to me. I knew I had to put some sort of hierarchy in place, but I also worried that hierarchy would lead to problems. So I edged in slowly, feeling suspicious of it at first, yet knowing that some part of it was necessary. The Bay Area in 1979 could not have provided a more fertile environment for
our work. In Silicon Valley, the number of computer companies was growing so fast that no one’s Rolodex (yes, we had Rolodexes back then) was ever up to date. Also growing exponentially was the number of tasks that computers were being assigned to tackle. Not long after I got to California, Microsoft’s Bill Gates agreed to create an operating system for the new IBM personal computer—which would, of course, go on to transform the way Americans worked. One year later, Atari released the first in-home game console, meaning that its popular arcade games like Space Invaders and Pac-Man could be played in living rooms across America, opening up a market that now accounts for more than $65 billion in global sales. To get a sense of how quickly things were changing, consider that when I was
a graduate student, in 1970, we’d used huge computers made by IBM and seven other mainframe companies (a group that was nicknamed “IBM and the Seven Dwarves”). Picture a room filled with racks and racks of equipment measuring six feet tall, two feet wide, and thirty inches deep. Five years later, when I arrived at NYIT, the minicomputer—which was about the size of an armoire—was on the rise, with Digital Equipment in Massachusetts being the most significant player. By the time I got to Lucasfilm in 1979, the momentum was swinging to workstation computers such as those made by Silicon Valley upstarts Sun Microsystems and Silicon Graphics, as well as IBM, but by that time, everyone could see that workstations were only another stop on the way to PCs and, eventually, personal desktop computers. The swiftness of this evolution created seemingly endless opportunities for those who were willing and able to innovate. The allure of getting rich was a magnet for bright, ambitious people, and the
resulting competition was intense—as were the risks. The old business models were undergoing continual disruptive change. Lucasfilm was based in Marin County, one hour north of Silicon Valley by car
