Also by Adam Grant
Give and Take: Why Helping Others Drives Our Success
VIKING An imprint of Penguin Random House LLC
375 Hudson Street New York, New York 10014
penguin.com
Copyright © 2016 by Adam Grant Foreword copyright © 2016 by Sheryl Sandberg
Penguin supports copyright. Copyright fuels creativity, encourages diverse voices, promotes free speech, and creates a vibrant culture. Thank you for buying an authorized edition of this book and for complying with copyright laws by not reproducing, scanning, or distributing any part of it in any form without
permission. You are supporting writers and allowing Penguin to continue to publish books for every reader.
ISBN 978-0-698-40577-6
http://penguin.com
For Allison
Contents
Also by Adam Grant
Title Page
Copyright
Dedication
Foreword by Sheryl Sandberg
1. Creative Destruction: The Risky Business of Going Against the Grain
2. Blind Inventors and One-Eyed Investors: The Art and Science of Recognizing Original Ideas
3. Out on a Limb: Speaking Truth to Power
4. Fools Rush In: Timing, Strategic Procrastination, and the First-Mover Disadvantage
5. Goldilocks and the Trojan Horse: Creating and Maintaining Coalitions
6. Rebel with a Cause: How Siblings, Parents, and Mentors Nurture Originality
7. Rethinking Groupthink: The Myths of Strong Cultures, Cults, and Devil’s Advocates
8. Rocking the Boat and Keeping It Steady: Managing Anxiety, Apathy, Ambivalence, and Anger
Actions for Impact
Acknowledgments
References
Index
Foreword
BY SHERYL SANDBERG
Chief operating officer of Facebook and founder of LeanIn.Org
Adam Grant is the perfect person to write Originals because he is one. He is a brilliant researcher who passionately pursues the science of
what motivates people, busting myths and revealing truths. He is an informed optimist who offers insights and advice about how anyone—at home, at work, in the community—can make the world a better place. He is a dedicated friend who inspires me to believe in myself and has helped me understand how I can advocate effectively for my ideas.
Adam is one of the most important influences in my life. Through the pages of this magnificent book, he will enlighten, inspire, and support you as well.
MYTH BUSTER
Conventional wisdom holds that some people are innately creative, while most have few original thoughts. Some people are born to be leaders, and the rest are followers. Some people can have real impact, but the majority can’t.
In Originals Adam shatters all of these assumptions. He demonstrates that any of us can enhance our creativity. He reveals how
we can identify ideas that are truly original and predict which ones will work. He tells us when to trust our gut and when to rely on others. He shows how we can become better parents by nurturing originality in our children and better managers by fostering diversity of thought instead of conformity.
In these pages, I learned that great creators don’t necessarily have the deepest expertise but rather seek out the broadest perspectives. I saw how success is not
usually attained by being ahead of everyone else but by waiting patiently for the right time to act. And to my utter shock, I learned that procrastinating can be good. Anyone who has ever worked with me knows how much I hate leaving things to the last minute, how I always think that anything that can be done should be done right away. Mark Zuckerberg, along with many others, will be pleased if I can let go of the relentless pressure I feel to finish everything early—and, as Adam points out, it might just help me and my teams achieve better results.
INFORMED OPTIMIST
Every day, we all encounter things we love and things that need to change. The former give us joy. The latter fuel our desire to make the world different—ideally better than the way we found it. But trying to change deep-seated beliefs and behaviors is daunting. We accept the status quo because effecting real change seems impossible. Still, we dare to ask: Can one individual make a difference? And, in our bravest moments: Could that one individual be me?
Adam’s answer is a resounding yes. This book proves that any one of us can champion ideas that improve the world around us.
FRIEND
I met Adam just as his first book, Give and Take, was generating buzz in Silicon Valley. I read it and immediately started quoting it to anyone who would listen. Adam was not only a talented researcher but also a gifted teacher and storyteller who was able to explain complicated ideas simply and clearly.
Then my husband invited Adam to speak to his team at work and brought him over for dinner. Adam was every bit as extraordinary in person as he was on paper. His knowledge was encyclopedic and his energy was contagious. He and I started talking about how his research could inform the debate on gender and began working together. We have done so ever since, conducting research and writing a series of op-eds about women and work. LeanIn.Org has benefited immensely from his rigorous analysis and commitment to equality.
Once a year, Facebook brings its global teams together, and in 2015 I invited Adam to give a keynote speech. Everyone was blown away by his wisdom and humor. Months later, the teams are still talking about his insights and putting his advice into action.
Along the way, Adam and I became friends. When tragedy hit and I lost my husband suddenly, Adam stepped up and stepped in as only a true friend would. He approached the worst time of my life as he approaches everything, combining his unique understanding of psychology with his unparalleled generosity. When I thought I would never feel better, he flew across the country to explain what I could do to build my resilience. When I could not figure out how to handle a particularly gut-wrenching situation, he helped me find answers where I thought there were none. When I needed a shoulder to cry on, his was always there.
In the deepest sense of the word, a friend is someone who sees more potential in you than you see in yourself, someone who helps you become the best version of yourself. The magic of this book is that Adam becomes that kind of friend to everyone who reads it. He offers a wealth of advice for overcoming doubt and fear, speaking up and pitching ideas, and finding allies in the least likely of places. He gives practical guidance on how to manage anxiety, channel anger, find the strength in our weaknesses, overcome obstacles, and give hope to others.
—
Originals is one of the most important and captivating books I have ever read, full of surprising and powerful ideas. It will not only change the way you see the world; it might just change the way you live your life. And it could very well inspire you to change your world.
1
Creative Destruction
The Risky Business of Going Against the Grain
“The reasonable man adapts himself to the world; the unreasonable one persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.”
George Bernard Shaw
On a cool fall evening in 2008, four students set out to revolutionize an industry. Buried in loans, they had lost and broken eyeglasses and were
outraged at how much it cost to replace them. One of them had been wearing the same damaged pair for five years: He was using a paper clip to bind the frames together. Even after his prescription changed twice, he refused to pay for pricey new lenses.
Luxottica, the 800-pound gorilla of the industry, controlled more than 80 percent of the eyewear market. To make glasses more affordable, the students would need to topple a giant. Having recently watched Zappos transform footwear by selling shoes online, they wondered if they could do the same with eyewear.
When they casually mentioned their idea to friends, time and again they were blasted with scorching criticism. No one would ever buy glasses over the internet, their friends insisted. People had to try them on first. Sure, Zappos had pulled the concept off with shoes, but there was a reason it hadn’t happened with eyewear. “If this were a good idea,” they heard repeatedly, “someone would have done it already.”
None of the students had a background in e-commerce and technology, let alone in retail, fashion, or apparel. Despite being told their idea was crazy, they walked away from lucrative job offers to start a company. They would sell
eyeglasses that normally cost $500 in a store for $95 online, donating a pair to someone in the developing world with every purchase.
The business depended on a functioning website. Without one, it would be impossible for customers to view or buy their products. After scrambling to pull a website together, they finally managed to get it online at 4 A.M. on the day before the launch in February 2010. They called the company Warby Parker, combining the names of two characters created by the novelist Jack Kerouac, who inspired them to break free from the shackles of social pressure and embark on their adventure. They admired his rebellious spirit, infusing it into their culture. And it paid off.
The students expected to sell a pair or two of glasses per day. But when GQ called them “the Netflix of eyewear,” they hit their target for the entire first year in less than a month, selling out so fast that they had to put twenty thousand customers on a waiting list. It took them nine months to stock enough inventory to meet the demand.
Fast forward to 2015, when Fast Company released a list of the world’s most innovative companies. Warby Parker didn’t just make the list—it came in first. The three previous winners were creative giants Google, Nike, and Apple, all with over fifty thousand employees. Warby Parker, a scrappy new kid on the block, had a staff of just five hundred. In the span of five years, the four friends built one of the most fashionable brands on the planet and donated over a million pairs of glasses to people in need. The company cleared $100 million in annual revenues and was valued at over $1 billion.
Back in 2009, one of the founders pitched the company to me, offering me the chance to invest in Warby Parker. I declined.
It was the worst financial decision I’ve ever made, and I needed to understand where I went wrong.
—
orig•i•nal, adj. The origin or source of something; that from which something springs, proceeds, or is derived.
orig•i•nal, n. A thing of singular or unique character; a person who is different from other people in an appealing or interesting way; a person of fresh initiative or inventive capacity.
Years ago, psychologists discovered that there are two routes to achievement: conformity and originality. Conformity means following the crowd down conventional paths and maintaining the status quo. Originality is taking the road less traveled, championing a set of novel ideas that go against the grain but ultimately make things better.
Of course, nothing is completely original, in the sense that all of our ideas are influenced by what we learn from the world around us. We are constantly borrowing thoughts, whether intentionally or inadvertently. We’re all vulnerable to “kleptomnesia”—accidentally remembering the ideas of others as our own. By my definition, originality involves introducing and advancing an idea that’s relatively unusual within a particular domain, and that has the potential to improve it.
Originality itself starts with creativity: generating a concept that is both novel and useful. But it doesn’t stop there. Originals are people who take the initiative to make their visions a reality. The Warby Parker founders had the creativity to dream up an unconventional way to sell glasses online, but became originals by taking action to make those glasses easily accessible and affordable.
This book is about how we can all become more original. There’s a surprising clue in the web browser that you use to surf the internet.
Finding the Faults in Defaults
Not long ago, economist Michael Housman was leading a project to figure out why some customer service agents stayed in their jobs longer than others. Armed with data from over thirty thousand employees who handled calls for banks, airlines, and cell-phone companies, he suspected that their employment histories would contain telltale signs about their commitment. He thought that people with a history of job-hopping would quit sooner, but they didn’t: Employees who had held five jobs in the past five years weren’t any more likely to leave their positions than those who had stayed in the same job for five years.
Hunting for other hints, he noticed that his team had captured information about which internet browser employees had used when they logged in to apply for their jobs. On a whim, he tested whether that choice might be related to quitting. He didn’t expect to find any correlation, assuming that browser preference was purely a matter of taste. But when he looked at the results, he was stunned: Employees who used Firefox or Chrome to browse the Web remained in their jobs 15 percent longer than those who used Internet Explorer or Safari.
Thinking it was a coincidence, Housman ran the same analysis for absences from work. The pattern was the same: Firefox and Chrome users were 19 percent less likely to miss work than Internet Explorer and Safari fans.
Then he looked at performance. His team had assembled nearly three million data points on sales, customer satisfaction, and average call length. The Firefox and Chrome users had significantly higher sales, and their call times were shorter. Their customers were happier, too: After 90 days on the job, the Firefox and Chrome users had customer satisfaction levels that Internet Explorer and Safari users reached only after 120 days at work.
It’s not the browser itself that’s causing them to stick around, show up dependably, and succeed. Rather, it’s what their browser preference signals about their habits. Why are the Firefox and Chrome users more committed and better performers on every metric?
The obvious answer was that they’re more tech savvy, so I asked Housman if he could explore that. The employees had all taken a computer proficiency test, which assessed their knowledge of keyboard shortcuts, software programs, and
hardware, as well as a timed test of their typing speed. But the Firefox and Chrome group didn’t prove to have significantly more computer expertise, and they weren’t faster or more accurate typists. Even after accounting for those scores, the browser effect persisted. Technical knowledge and skill weren’t the source of their advantage.
What made the difference was how they obtained the browser. If you own a PC, Internet Explorer is built into Windows. If you’re a Mac user, your computer came preinstalled with Safari. Almost two thirds of the customer service agents used the default browser, never questioning whether a better one was available.
To get Firefox or Chrome, you have to demonstrate some resourcefulness and download a different browser. Instead of accepting the default, you take a bit of initiative to seek out an option that might be better. And that act of initiative, however tiny, is a window into what you do at work.
The customer service agents who accepted the defaults of Internet Explorer and Safari approached their job the same way. They stayed on script in sales calls and followed standard operating procedures for handling customer complaints. They saw their job descriptions as fixed, so when they were unhappy with their work, they started missing days, and eventually just quit.
The employees who took the initiative to change their browsers to Firefox or Chrome approached their jobs differently. They looked for novel ways of selling to customers and addressing their concerns. When they encountered a situation they didn’t like, they fixed it. Having taken the initiative to improve their circumstances, they had little reason to leave. They created the jobs they wanted. But they were the exception, not the rule.
We live in an Internet Explorer world. Just as almost two thirds of the customer service reps used the default browser on their computers, many of us accept the defaults in our own lives. In a series of provocative studies, a team led by political psychologist John Jost explored how people responded to undesirable default conditions. Compared to European Americans, African Americans were less satisfied with their economic circumstances but perceived economic inequality as more legitimate and just. Compared to people in the highest income bracket, people in the lowest income bracket were 17 percent more likely to view economic inequality as necessary. And when asked whether they would support laws that limit the rights of citizens and the press to criticize the government if enacting such legislation was necessary to solve our nation’s problems, twice as many people in the lowest income bracket were willing to give up the right to free speech as those in the highest income bracket. After
finding that disadvantaged groups consistently support the status quo more than advantaged groups, Jost and his colleagues concluded: “People who suffer the most from a given state of affairs are paradoxically the least likely to question, challenge, reject, or change it.”
To explain this peculiar phenomenon, Jost’s team developed a theory of system justification. Its core idea is that people are motivated to rationalize the status quo as legitimate—even if it goes directly against their interests. In one study, they tracked Democratic and Republican voters before the 2000 U.S. presidential election. When George W. Bush gained in the polls, Republicans rated him as more desirable, but so did Democrats, who were already preparing justifications for the anticipated status quo. The same happened when Al Gore’s likelihood of success increased: Both Republicans and Democrats judged him more favorably. Regardless of political ideologies, when a candidate seemed destined to win, people liked him more. When his odds dropped, they liked him less.
Justifying the default system serves a soothing function. It’s an emotional painkiller: If the world is supposed to be this way, we don’t need to be dissatisfied with it. But acquiescence also robs us of the moral outrage to stand against injustice and the creative will to consider alternative ways that the world could work.
—
The hallmark of originality is rejecting the default and exploring whether a better option exists. I’ve spent more than a decade studying this, and it turns out to be far less difficult than I expected.
The starting point is curiosity: pondering why the default exists in the first place. We’re driven to question defaults when we experience vuja de, the opposite of déjà vu. Déjà vu occurs when we encounter something new, but it feels as if we’ve seen it before. Vuja de is the reverse—we face something familiar, but we see it with a fresh perspective that enables us to gain new insights into old problems.
Without a vuja de event, Warby Parker wouldn’t have existed. When the founders were sitting in the computer lab on the night they conjured up the company, they had spent a combined sixty years wearing glasses. The product had always been unreasonably expensive. But until that moment, they had taken the status quo for granted, never questioning the default price. “The thought had
never crossed my mind,” cofounder Dave Gilboa says. “I had always considered them a medical purchase. I naturally assumed that if a doctor was selling it to me, there was some justification for the price.”
Having recently waited in line at the Apple Store to buy an iPhone, he found himself comparing the two products. Glasses had been a staple of human life for nearly a thousand years, and they’d hardly changed since his grandfather wore them. For the first time, Dave wondered why glasses had such a hefty price tag. Why did such a fundamentally simple product cost more than a complex smartphone?
Anyone could have asked those questions and arrived at the same answer that the Warby Parker squad did. Once they became curious about why the price was so steep, they began doing some research on the eyewear industry. That’s when they learned that it was dominated by Luxottica, a European company that had raked in over $7 billion the previous year. “Understanding that the same company owned LensCrafters and Pearle Vision, Ray-Ban and Oakley, and the licenses for Chanel and Prada prescription frames and sunglasses—all of a sudden, it made sense to me why glasses were so expensive,” Dave says. “Nothing in the cost of goods justified the price.” Taking advantage of its monopoly status, Luxottica was charging twenty times the cost. The default wasn’t inherently legitimate; it was a choice made by a group of people at a given company. And this meant that another group of people could make an alternative choice. “We could do things differently,” Dave suddenly understood. “It was a realization that we could control our own destiny, that we could control our own prices.”
When we become curious about the dissatisfying defaults in our world, we begin to recognize that most of them have social origins: Rules and systems were created by people. And that awareness gives us the courage to contemplate how we can change them. Before women gained the right to vote in America, many “had never before considered their degraded status as anything but natural,” historian Jean Baker observes. As the suffrage movement gained momentum, “a growing number of women were beginning to see that custom, religious precept, and law were in fact man-made and therefore reversible.”
The Two Faces of Ambition
The pressures to accept defaults start much earlier than we realize. If you consider the individuals who will grow up and make a dent in the universe, the first group that probably comes to mind is child prodigies. These geniuses learn to read at age two, play Bach at four, breeze through calculus at six, and speak seven languages fluently by eight. Their classmates shudder with jealousy; their parents rejoice at having won the lottery. But to paraphrase T. S. Eliot, their careers tend to end not with a bang, but a whimper.
Child prodigies, it turns out, rarely go on to change the world. When psychologists study history’s most eminent and influential people, they discover that many of them weren’t unusually gifted as children. And if you assemble a large group of child prodigies and follow them for their entire lives, you’ll find that they don’t outshine their less precocious peers from families of similar means.
Intuitively, this makes sense. We assume that what gifted kids have in book smarts, they lack in street smarts. While they have the intellectual chops, they must lack the social, emotional, and practical skills to function in society. When you look at the evidence, though, this explanation falls short: Less than a quarter of gifted children suffer from social and emotional problems. The vast majority are well-adjusted—as delightful at a cocktail party as in a spelling bee.
Although child prodigies are often rich in both talent and ambition, what holds them back from moving the world forward is that they don’t learn to be original. As they perform in Carnegie Hall, win the science Olympics, and become chess champions, something tragic happens: Practice makes perfect, but it doesn’t make new. The gifted learn to play magnificent Mozart melodies and beautiful Beethoven symphonies, but never compose their own original scores. They focus their energy on consuming existing scientific knowledge, not producing new insights. They conform to the codified rules of established games, rather than inventing their own rules or their own games. All along the way, they strive to earn the approval of their parents and the admiration of their teachers.
Research demonstrates that it is the most creative children who are the least likely to become the teacher’s pet. In one study, elementary school teachers
listed their favorite and least favorite students, and then rated both groups on a list of characteristics. The least favorite students were the non-conformists who made up their own rules. Teachers tend to discriminate against highly creative students, labeling them as troublemakers. In response, many children quickly learn to get with the program, keeping their original ideas to themselves. In the language of author William Deresiewicz, they become the world’s most excellent sheep.
In adulthood, many child prodigies become experts in their fields and leaders in their organizations. Yet “only a fraction of gifted children eventually become revolutionary adult creators,” laments psychologist Ellen Winner. “Those who do must make a painful transition” from a child who “learns rapidly and effortlessly in an established domain” to an adult who “ultimately remakes a domain.”
Most prodigies never make that leap. They apply their extraordinary abilities in ordinary ways, mastering their jobs without questioning defaults and without making waves. In every domain they enter, they play it safe by following the conventional paths to success. They become doctors who heal their patients without fighting to fix the broken systems that prevent many patients from affording health care in the first place. They become lawyers who defend clients for violating outdated laws without trying to transform the laws themselves. They become teachers who plan engaging algebra lessons without questioning whether algebra is what their students need to learn. Although we rely on them to keep the world running smoothly, they keep us running on a treadmill.
Child prodigies are hindered by achievement motivation. The drive to succeed is responsible for many of the world’s greatest accomplishments. When we’re determined to excel, we have the fuel to work harder, longer, and smarter. But as cultures rack up a significant number of achievements, originality is increasingly left to a specialized few.
When achievement motivation goes sky-high, it can crowd out originality: The more you value achievement, the more you come to dread failure. Instead of aiming for unique accomplishments, the intense desire to succeed leads us to strive for guaranteed success. As psychologists Todd Lubart and Robert Sternberg put it, “Once people pass an intermediate level in the need to achieve, there is evidence that they actually become less creative.”
The drive to succeed and the accompanying fear of failure have held back some of the greatest creators and change agents in history. Concerned with maintaining stability and attaining conventional achievements, they have been
reluctant to pursue originality. Instead of charging full steam ahead with assurance, they have been coaxed, convinced, or coerced to take a stand. While they may seem to have possessed the qualities of natural leaders, they were figuratively—and sometimes literally—lifted up by followers and peers. If a handful of people hadn’t been cajoled into taking original action, America might not exist, the civil rights movement could still be a dream, the Sistine Chapel might be bare, we might still believe the sun revolves around the earth, and the personal computer might never have been popularized.
From our perspective today, the Declaration of Independence seems inevitable, but it nearly didn’t happen due to the reluctance of key revolutionaries. “The men who took commanding roles in the American Revolution were as unlikely a group of revolutionaries as one can imagine,” Pulitzer Prize–winning historian Jack Rakove recounts. “They became revolutionaries despite themselves.” In the years leading up to the war, John Adams feared British retaliation and hesitated to give up his budding law career; he only got involved after being elected as a delegate to the First Continental Congress. George Washington had been focused on managing his wheat, flour, fishing, and horse-breeding businesses, joining the cause only after Adams nominated him as commander in chief of the army. “I have used every endeavor in my power to avoid it,” Washington wrote.
Nearly two centuries later, Martin Luther King, Jr., was apprehensive about leading the civil rights movement; his dream was to be a pastor and a college president. In 1955, after Rosa Parks was tried for refusing to give up her seat at the front of a bus, a group of civil rights activists gathered to discuss their response. They agreed to form the Montgomery Improvement Association and launch a bus boycott, and one of the attendees nominated King for the presidency. “It had happened so quickly that I did not even have time to think it through. It is probable that if I had, I would have declined the nomination,” King reflected. Just three weeks earlier, King and his wife had “agreed that I should not then take on any heavy community responsibilities, since I had so recently finished my thesis, and needed to give more attention to my church work.” He was unanimously elected to lead the boycott. Faced with giving a speech to the community that evening, King “became possessed by fear.” He overcame that trepidation soon enough: in 1963 his thundering voice united a country around an electrifying vision of freedom. But that only happened because a colleague proposed that King should be the closing speaker at the March on Washington and gathered a coalition of leaders to advocate for him.
When the pope commissioned him to paint a fresco on the ceiling of the Sistine Chapel, Michelangelo wasn’t interested. He viewed himself as a sculptor, not a painter, and found the task so overwhelming that he fled to Florence. Two years would pass before he began work on the project, at the pope’s insistence. And astronomy stagnated for decades because Nicolaus Copernicus refused to publish his original discovery that the earth revolves around the sun. Fearing rejection and ridicule, he stayed silent for twenty-two years, circulating his findings only to his friends. Eventually, a major cardinal learned of his work and wrote a letter encouraging Copernicus to publish it. Even then, Copernicus stalled for four more years. His magnum opus only saw the light of day after a young mathematics professor took matters into his own hands and submitted it for publication.
Almost half a millennium later, when an angel investor offered $250,000 to Steve Jobs and Steve Wozniak to bankroll Apple in 1977, it came with an ultimatum: Wozniak would have to leave Hewlett-Packard. He refused. “I still intended to be at that company forever,” Wozniak reflects. “My psychological block was really that I didn’t want to start a company. Because I was just afraid,” he admits. Wozniak changed his mind only after being encouraged by Jobs, multiple friends, and his own parents.
We can only imagine how many Wozniaks, Michelangelos, and Kings never pursued, publicized, or promoted their original ideas because they were not dragged or catapulted into the spotlight. Although we may not all aspire to start our own companies, create a masterpiece, transform Western thought, or lead a civil rights movement, we do have ideas for improving our workplaces, schools, and communities. Sadly, many of us hesitate to take action to promote those ideas. As economist Joseph Schumpeter famously observed, originality is an act of creative destruction. Advocating for new systems often requires demolishing the old way of doing things, and we hold back for fear of rocking the boat. Among nearly a thousand scientists at the Food and Drug Administration, more than 40 percent were afraid that they would face retaliation if they spoke up publicly about safety concerns. Of more than forty thousand employees at a technology company, half felt it was not safe to voice dissenting opinions at work. When employees in consulting, financial services, media, pharmaceuticals, and advertising companies were interviewed, 85 percent admitted to keeping quiet about an important concern rather than voicing it to their bosses.
The last time you had an original idea, what did you do with it? Although America is a land of individuality and unique self-expression, in search of excellence and in fear of failure, most of us opt to fit in rather than stand out. “On matters of style, swim with the current,” Thomas Jefferson allegedly advised, but “on matters of principle, stand like a rock.” The pressure to achieve leads us to do the opposite. We find surface ways of appearing original—donning a bow tie, wearing bright red shoes—without taking the risk of actually being original. When it comes to the powerful ideas in our heads and the core values in our hearts, we censor ourselves. “There are so few originals in life,” says renowned executive Mellody Hobson, because people are afraid to “speak up and stand out.” What are the habits of the people whose originality extends beyond appearance to effective action?
The Right Stuff
To be an original, you need to take radical risks. This belief is embedded so deeply in our cultural psyche that we rarely even stop to think about it. We admire astronauts like Neil Armstrong and Sally Ride for having “the right stuff”—the courage to leave the only planet humans have ever inhabited and venture boldly into space. We celebrate heroes like Mahatma Gandhi and Martin Luther King, Jr., who possessed enough conviction to risk their lives for the moral principles they held dear. We idolize icons like Steve Jobs and Bill Gates for having the audacity to drop out of school and go for broke, holing up in garages to will their technological visions into existence.
When we marvel at the original individuals who fuel creativity and drive change in the world, we tend to assume they’re cut from a different cloth. In the same way that some lucky people are born with genetic mutations that make them resistant to diseases like cancer, obesity, and HIV, we believe that great creators are born with a biological immunity to risk. They’re wired to embrace uncertainty and ignore social approval; they simply don’t worry about the costs of non-conformity the way the rest of us do. They’re programmed to be iconoclasts, rebels, revolutionaries, troublemakers, mavericks, and contrarians who are impervious to fear, rejection, and ridicule.
The word entrepreneur, as it was coined by economist Richard Cantillon, literally means “bearer of risk.” When we read the story of Warby Parker’s stratospheric rise, this theme comes through loud and clear. Like all great creators, innovators, and change agents, the quartet transformed the world because they were willing to take a leap of faith. After all, if you don’t swing for the fences, it’s impossible to hit a home run.
Isn’t it?
—
Six months before Warby Parker launched, one of the founders was sitting in my classroom at Wharton. Tall and affable, with curly black hair and a calm energy, Neil Blumenthal hailed from a nonprofit background and genuinely aspired to make the world a better place. When he pitched the company to me, like many
other doubters, I told him it sounded like an interesting idea, but it was hard to imagine people ordering glasses online.
With a skeptical consumer base, I knew, it would require a herculean effort to get the company off the ground. And when I learned how Neil and his friends were spending their time preparing for the launch, I had the sinking feeling that they were doomed.
The first strike against them, I told Neil, was that they were all still in school. If they truly believed in Warby Parker, they should drop out to focus every waking hour on making it happen.
“We want to hedge our bets,” he responded. “We’re not sure if it’s a good idea and we have no clue whether it will succeed, so we’ve been working on it in our spare time during the school year. We were four friends before we started, and we made a commitment that dealing with each other fairly was more important than success. But for the summer, Jeff got a grant to focus on the business full time.”
What about the other three of you? “We all got internships,” Neil admitted. “I was in consulting, Andy was in venture capital, and Dave was in health care.”
With their time scarce and their attention divided, they still hadn’t built a website, and it had taken them six months just to agree on a name for the company. Strike two.
Before I gave up on them entirely, though, I remembered that they were all graduating at the end of the year, which meant they’d finally have the time to go all in and dedicate themselves completely to the business. “Well, not necessarily,” Neil backpedaled. “We’ve hedged our bets. Just in case things don’t work out, I’ve accepted a full-time job for after graduation. So has Jeff. And to make sure he would have options, Dave did two different internships over the summer, and he’s talking with his former employer about rejoining.”
Strike three. They were out—and so was I. I declined to invest in Warby Parker because Neil and his friends were too
much like me. I became a professor because I was passionate about discovering new insights, sharing knowledge, and teaching the next generations of students. But in my most honest moments, I know that I was also drawn to the security of tenure. I would never have had the confidence to start a business in my twenties. If I had, I certainly would have stayed in school and lined up a job to cover my bases.
When I compared the choices of the Warby Parker team to my mental model of the choices of successful entrepreneurs, they didn’t match. Neil and his
colleagues lacked the guts to go in with their guns blazing, which led me to question their conviction and commitment. They weren’t serious about becoming successful entrepreneurs: They didn’t have enough skin in the game. In my mind, they were destined to fail because they played it safe instead of betting the farm. But in fact, this is exactly why they succeeded.
I want to debunk the myth that originality requires extreme risk taking and persuade you that originals are actually far more ordinary than we realize. In every domain, from business and politics to science and art, the people who move the world forward with original ideas are rarely paragons of conviction and commitment. As they question traditions and challenge the status quo, they may appear bold and self-assured on the surface. But when you peel back the layers, the truth is that they, too, grapple with fear, ambivalence, and self-doubt. We view them as self-starters, but their efforts are often fueled and sometimes forced by others. And as much as they seem to crave risk, they really prefer to avoid it.
—
In a fascinating study, management researchers Joseph Raffiee and Jie Feng asked a simple question: When people start a business, are they better off keeping or quitting their day jobs? From 1994 until 2008, they tracked a nationally representative group of over five thousand Americans in their twenties, thirties, forties, and fifties who became entrepreneurs. Whether these founders kept or left their day jobs wasn’t influenced by financial need; individuals with high family income or high salaries weren’t any more or less likely to quit and become full-time entrepreneurs. A survey showed that the ones who took the full plunge were risk takers with confidence in spades. The entrepreneurs who hedged their bets by starting their companies while still working were far more risk averse and unsure of themselves.
If you think like most people, you’ll predict a clear advantage for the risk takers. Yet the study showed the exact opposite: Entrepreneurs who kept their day jobs had 33 percent lower odds of failure than those who quit.
If you’re risk averse and have some doubts about the feasibility of your ideas, it’s likely that your business will be built to last. If you’re a freewheeling gambler, your startup is far more fragile.
Like the Warby Parker crew, the entrepreneurs whose companies topped Fast Company’s recent most innovative lists typically stayed in their day jobs even
after they launched. Former track star Phil Knight started selling running shoes out of the trunk of his car in 1964, yet kept working as an accountant until 1969. After inventing the original Apple I computer, Steve Wozniak started the company with Steve Jobs in 1976 but continued working full time in his engineering job at Hewlett-Packard until 1977. And although Google founders Larry Page and Sergey Brin figured out how to dramatically improve internet searches in 1996, they didn’t go on leave from their graduate studies at Stanford until 1998. “We almost didn’t start Google,” Page says, because we “were too worried about dropping out of our Ph.D. program.” In 1997, concerned that their fledgling search engine was distracting them from their research, they tried to sell Google for less than $2 million in cash and stock. Luckily for them, the potential buyer rejected the offer.
This habit of keeping one’s day job isn’t limited to successful entrepreneurs. Many influential creative minds have stayed in full-time employment or education even after earning income from major projects. Selma director Ava DuVernay made her first three films while working in her day job as a publicist, only pursuing filmmaking full time after working at it for four years and winning multiple awards. Brian May was in the middle of doctoral studies in astrophysics when he started playing guitar in a new band, but he didn’t drop out until several years later to go all in with Queen. Soon thereafter he wrote “We Will Rock You.” Grammy winner John Legend released his first album in 2000 but kept working as a management consultant until 2002, preparing PowerPoint presentations by day while performing at night. Thriller master Stephen King worked as a teacher, janitor, and gas station attendant for seven years after writing his first story, only quitting a year after his first novel, Carrie, was published. Dilbert author Scott Adams worked at Pacific Bell for seven years after his first comic strip hit newspapers.
Why did all these originals play it safe instead of risking it all?
Why Risks Are Like Stock Portfolios
Half a century ago, University of Michigan psychologist Clyde Coombs developed an innovative theory of risk. In the stock market, if you’re going to make a risky investment, you protect yourself by playing it safe in other investments. Coombs suggested that in their daily lives, successful people do the same thing with risks, balancing them out in a portfolio. When we embrace danger in one domain, we offset our overall level of risk by exercising caution in another domain. If you’re about to bet aggressively in blackjack, you might drive below the speed limit on your way to the casino.
Risk portfolios explain why people often become original in one part of their lives while remaining quite conventional in others. Baseball owner Branch Rickey opened the door for Jackie Robinson to break the color barrier, but refused to go to the ballpark on Sundays, use profanity, or touch a drop of alcohol. T. S. Eliot’s landmark work, The Waste Land, has been hailed as one of the twentieth century’s most significant poems. But after publishing it in 1922, Eliot kept his London bank job until 1925, rejecting the idea of embracing professional risk. As the novelist Aldous Huxley noted after paying him an office visit, Eliot was “the most bank-clerky of all bank clerks.” When he finally did leave the position, Eliot still didn’t strike out on his own. He spent the next forty years working for a publishing house to provide stability in his life, writing poetry on the side. As Polaroid founder Edwin Land remarked, “No person could possibly be original in one area unless he were possessed of the emotional and social stability that comes from fixed attitudes in all areas other than the one in which he is being original.”
But don’t day jobs distract us from doing our best work? Common sense suggests that creative accomplishments can’t flourish without big windows of time and energy, and companies can’t thrive without intensive effort. Those assumptions overlook the central benefit of a balanced risk portfolio: Having a sense of security in one realm gives us the freedom to be original in another. By covering our bases financially, we escape the pressure to publish half-baked books, sell shoddy art, or launch untested businesses. When Pierre Omidyar built eBay, it was just a hobby; he kept working as a programmer for the next nine months, only leaving after his online marketplace was netting him more money
than his job. “The best entrepreneurs are not risk maximizers,” Endeavor cofounder and CEO Linda Rottenberg observes based on decades of experience training many of the world’s great entrepreneurs. “They take the risk out of risk-taking.”
Managing a balanced risk portfolio doesn’t mean constantly hovering in the middle of the spectrum by taking moderate risks. Instead, successful originals take extreme risks in one arena and offset them with extreme caution in another. At age twenty-seven, Sara Blakely generated the novel idea of creating footless pantyhose, taking a big risk by investing her entire savings of $5,000. To balance out her risk portfolio, she stayed in her full-time position selling fax machines for two years, spending nights and weekends building the prototype—and saving money by writing her own patent application instead of hiring lawyers to do so. After she finally launched Spanx, she became the world’s youngest self-made billionaire. A century earlier, Henry Ford started his automotive empire while employed as a chief engineer for Thomas Edison, which gave him the security necessary to try out his novel inventions for a car. He continued working under Edison for two years after building a carburetor and a year after earning a patent for it.
And what about Bill Gates, famous for dropping out of Harvard to start Microsoft? When Gates sold a new software program as a sophomore, he waited an entire year before leaving school. Even then he didn’t drop out, but balanced his risk portfolio by applying for a leave of absence that was formally approved by the university—and by having his parents bankroll him. “Far from being one of the world’s great risk takers,” entrepreneur Rick Smith notes, “Bill Gates might more accurately be thought of as one of the world’s great risk mitigators.”
It was this kind of risk mitigation that was responsible for Warby Parker’s breakthrough. Two of the cofounders, Neil Blumenthal and Dave Gilboa, became the company’s co-CEOs. They rejected advice to conform to the norm of selecting a single leader, believing it was safer to have a pair at the helm—indeed, evidence shows that having co-CEOs elicits positive market reactions and increases firm valuation. From the start, their number-one priority was reducing risk. “Warby Parker wasn’t the basket that I wanted to put all my eggs into,” Dave says. After starting the company he continued exploring other business opportunities by scouting scientific discoveries on campus to see if they had any commercial potential. Having backup plans gave the founders the courage to base their business on the unproven assumption that people would be willing to buy glasses online. Instead of just acknowledging that uncertainty,
they actively worked to minimize it. “We talked constantly about de-risking the business,” Neil says. “The whole journey was a series of go/no-go decisions. At every step of the way, we had checks and balances.”
As part of their protection against risk, the four friends took an entrepreneurship class together and spent months honing their business plan. To make customers more comfortable with the unfamiliar concept of ordering eyewear over the internet, they decided to offer free returns. But in surveys and focus groups, people were still hesitant to buy glasses online. “There were a lot of people who just wouldn’t do it. That really made us question the whole premise of the business,” Neil recalls. “It was a moment of severe self-doubt. That took us back to the drawing board.”