IN DEFENSE of FOOD
ALSO BY MICHAEL POLLAN
Second Nature
A Place of My Own
The Botany of Desire
The Omnivore’s Dilemma
IN DEFENSE of FOOD
AN EATER’S MANIFESTO
MICHAEL POLLAN
THE PENGUIN PRESS
New York 2008
THE PENGUIN PRESS Published by the Penguin Group
Penguin Group (USA) Inc., 375 Hudson Street, New York, New York 10014, U.S.A. • Penguin Group (Canada), 90 Eglinton Avenue East, Suite 700, Toronto, Ontario, Canada M4P 2Y3 (a division of Pearson Penguin Canada Inc.) • Penguin Books Ltd, 80 Strand, London WC2R 0RL, England • Penguin Ireland, 25 St. Stephen’s Green, Dublin 2, Ireland (a division of Penguin Books Ltd) • Penguin Books Australia Ltd, 250 Camberwell Road, Camberwell, Victoria 3124, Australia (a division of Pearson Australia Group Pty Ltd) • Penguin Books India Pvt Ltd, 11 Community Centre, Panchsheel Park, New Delhi-110 017, India • Penguin Group (NZ), 67 Apollo Drive, Rosedale, North Shore 0632, New Zealand (a division of Pearson New Zealand Ltd) • Penguin Books (South Africa) (Pty) Ltd, 24 Sturdee Avenue, Rosebank, Johannesburg 2196, South Africa
Penguin Books Ltd, Registered Offices: 80 Strand, London WC2R 0RL, England
First published in 2008 by The Penguin Press, a member of Penguin Group (USA) Inc.
Copyright © Michael Pollan, 2008 All rights reserved
A portion of this book first appeared in The New York Times Magazine under the title “Unhappy Meals.”
LIBRARY OF CONGRESS CATALOGING-IN-PUBLICATION DATA
Pollan, Michael. In defense of food : an eater’s manifesto / Michael Pollan.
p. cm. Includes bibliographical references and index.
ISBN: 1-4295-8125-5 1. Nutrition. 2. Food habits. I. Title.
RA784.P643 2008 613—dc22 2007037552
Without limiting the rights under copyright reserved above, no part of this publication may be reproduced, stored in or introduced into a retrieval system, or transmitted, in any form or by any means (electronic, mechanical, photocopying, recording or otherwise), without the prior written permission of both the copyright owner and the above publisher of this book.
The scanning, uploading, and distribution of this book via the Internet or via any other means without the permission of the publisher is illegal and punishable by law. Please purchase only authorized electronic editions and do not participate in or encourage electronic piracy of copyrightable materials. Your support of the author’s rights is appreciated.
FOR ANN AND GERRY,
With gratitude for your loyal friendship and inspired editing
CONTENTS
INTRODUCTION An Eater’s Manifesto
I THE AGE OF NUTRITIONISM
ONE From Foods to Nutrients
TWO Nutritionism Defined
THREE Nutritionism Comes to Market
FOUR Food Science’s Golden Age
FIVE The Melting of the Lipid Hypothesis
SIX Eat Right, Get Fatter
SEVEN Beyond the Pleasure Principle
EIGHT The Proof in the Low-Fat Pudding
NINE Bad Science
TEN Nutritionism’s Children
II THE WESTERN DIET AND THE DISEASES OF CIVILIZATION
ONE The Aborigine in All of Us
TWO The Elephant in the Room
THREE The Industrialization of Eating: What We Do Know
1) From Whole Foods to Refined
2) From Complexity to Simplicity
3) From Quality to Quantity
4) From Leaves to Seeds
5) From Food Culture to Food Science
III GETTING OVER NUTRITIONISM
ONE Escape from the Western Diet
TWO Eat Food: Food Defined
THREE Mostly Plants: What to Eat
FOUR Not Too Much: How to Eat
ACKNOWLEDGMENTS
SOURCES
RESOURCES
INDEX
IN DEFENSE of FOOD
INTRODUCTION AN EATER’S MANIFESTO
Eat food. Not too much. Mostly plants. That, more or less, is the short answer to the supposedly incredibly complicated and confusing question of what we humans should eat in order to be maximally healthy.
I hate to give the game away right here at the beginning of a whole book devoted to the subject, and I’m tempted to complicate matters in the interest of keeping things going for a couple hundred more pages or so. I’ll try to resist, but will go ahead and add a few more details to flesh out the recommendations. Like, eating a little meat isn’t going to kill you, though it might be better approached as a side dish than as a main. And you’re better off eating whole fresh foods rather than processed food products. That’s what I mean by the recommendation to “eat food,” which is not quite as simple as it sounds. For while it used to be that food was all you could eat, today there are thousands of other edible foodlike substances in the supermarket. These novel products of food science often come in packages elaborately festooned with health claims, which brings me to another, somewhat counterintuitive, piece of advice: If you’re concerned about your health, you should probably avoid products that make health claims. Why? Because a health claim on a food product is a strong indication it’s not really food, and food is what you want to eat.
You can see how quickly things can get complicated.
I started on this quest to identify a few simple rules about eating after publishing The Omnivore’s Dilemma in 2006. Questions of personal health did not take center stage in that book, which was more concerned with the ecological and ethical dimensions of our eating choices. (Though I’ve found that, in most but not all cases, the best ethical and environmental choices also happen to be the best choices for our health—very good news indeed.) But many readers wanted to know, after they’d spent a few hundred pages following me following the food chains that feed us, “Okay, but what should I eat? And now that you’ve been to the feedlots, the food-processing plants, the organic factory farms, and the local farms and ranches, what do you eat?”
Fair questions, though it does seem to me a symptom of our present confusion about food that people would feel the need to consult a journalist, or for that matter a nutritionist or doctor or government food pyramid, on so basic a question about the conduct of our everyday lives as humans. I mean, what other animal needs professional help in deciding what it should eat? True, as omnivores—creatures that can eat just about anything nature has to offer and that in fact need to eat a wide variety of different things in order to be healthy—the “What to eat” question is somewhat more complicated for us than it is for, say, cows. Yet for most of human history, humans have navigated the question without expert advice. To guide us we had, instead, Culture, which, at least when it comes to food, is really just a fancy word for your mother. What to eat, how much of it to eat, what order in which to eat it, with what and when and with whom have for most of human history been a set of questions long settled and passed down from parents to children without a lot of controversy or fuss.
But over the last several decades, mom lost much of her authority over the dinner menu, ceding it to scientists and food marketers (often an unhealthy alliance of the two) and, to a lesser extent, to the government, with its ever-shifting dietary guidelines, food-labeling rules, and perplexing pyramids. Think about it: Most of us no longer eat what our mothers ate as children or, for that matter, what our mothers fed us as children. This is, historically speaking, an unusual state of affairs.
My own mother grew up in the 1930s and 1940s eating a lot of traditional Jewish-American fare, typical of families who had recently emigrated from Russia or Eastern Europe: stuffed cabbage, organ meats, cheese blintzes, kreplach, knishes stuffed with potato or chicken liver, and vegetables that often were cooked in rendered chicken or duck fat. I never ate any of that stuff as a kid, except when I visited my grandparents. My mother, an excellent and adventurous cook whose own menus were shaped by the cosmopolitan food trends of New York in the 1960s (her influences would have included the 1964 World’s Fair; Julia Child and Craig Claiborne; Manhattan restaurant menus of the time; and of course the rising drumbeat of food marketing), served us a rotating menu that each week completed a culinary world tour: boeuf bourguignon or beef Stroganoff on Monday; coq au vin or oven-fried chicken (in a Kellogg’s Cornflakes crust) on Tuesday; meat loaf or Chinese pepper steak on Wednesday (yes, there was a lot of beef); spaghetti pomodoro with Italian sausages on Thursday; and on her weekend nights off, a Swanson’s TV dinner or Chinese takeout. She cooked with Crisco or Wesson oil rather than chicken or duck fat and used margarine rather than butter because she’d absorbed the nutritional orthodoxy of the time, which held that these more up-to-date fats were better for our health.
(Oops.)
Nowadays I don’t eat any of that stuff—and neither does my mother, who has moved on too. Her parents wouldn’t recognize the foods we put on the table, except maybe the butter, which is back. Today in America the culture of food is changing more than once a generation, which is historically unprecedented—and dizzying.
What is driving such relentless change in the American diet? One force is a thirty-two-billion-dollar food-marketing machine that thrives on change for its own sake. Another is the constantly shifting ground of nutrition science that, depending on your point of view, is steadily advancing the frontiers of our knowledge about diet and health or is just changing its mind a lot because it is a flawed science that knows much less than it cares to admit. Part of what drove my grandparents’ food culture from the American table was official scientific opinion, which, beginning in the 1960s, decided that animal fat was a deadly substance. And then there were the food manufacturers, which stood to make very little money from my grandmother’s cooking, because she was doing so much of it from scratch—up to and including rendering her own cooking fats. Amplifying the “latest science,” they managed to sell her daughter on the virtues of hydrogenated vegetable oils, the ones that we’re now learning may be, well, deadly substances.
Sooner or later, everything solid we’ve been told about the links between our diet and our health seems to get blown away in the gust of the most recent study. Consider the latest findings. In 2006 came news that a low-fat diet, long believed to protect against cancer, may do no such thing—this from the massive, federally funded Women’s Health Initiative, which has also failed to find a link between a low-fat diet and the risk of coronary heart disease. Indeed, the whole nutritional orthodoxy around dietary fat appears to be crumbling, as we will see. In 2005 we learned that dietary fiber might not, as we’d been confidently told for years, help prevent colorectal cancers and heart disease. And then, in the fall of 2006, two prestigious studies on omega-3 fats published at the same time came to strikingly different conclusions. While the Institute of Medicine at the National Academy of Sciences found little conclusive evidence that eating fish would do your heart much good (and might hurt your brain, because so much fish is contaminated with mercury), a Harvard study brought the hopeful piece of news that simply by eating a couple of servings of fish each week (or by downing enough fish oil tablets) you could cut your risk of dying from a heart attack by more than a third. It’s no wonder that omega-3 fatty acids are poised to become the oat bran of our time as food scientists rush to microencapsulate fish and algae oil and blast it into such formerly all-terrestrial foods as bread and pasta, milk and yogurt and cheese, all of which will soon, you can be sure, spout fishy new health claims. (I hope you remember the relevant rule.)
By now you’re probably feeling the cognitive dissonance of the supermarket shopper or science-section reader as well as some nostalgia for the simplicity and solidity of the first few words of this book. Words I’m still prepared to defend against the shifting winds of nutritional science and food-industry marketing, and will. But before I do, it’s important to understand how we arrived at our present state of nutritional confusion and anxiety. That is the subject of the first portion of this book, “The Age of Nutritionism.”
The story of how the most basic questions about what to eat ever got so complicated reveals a great deal about the institutional imperatives of the food industry, nutrition science, and—ahem— journalism, three parties that stand to gain much from widespread confusion surrounding the most elemental question an omnivore confronts. But humans deciding what to eat without professional guidance—something they have been doing with notable success since coming down out of the trees—is seriously unprofitable if you’re a food company, a definite career loser if you’re a nutritionist, and just plain boring if you’re a newspaper editor or reporter. (Or, for that matter, an eater. Who wants to hear, yet again, that you should “eat more fruits and vegetables”?) And so like a large gray cloud, a great Conspiracy of Scientific Complexity has gathered around the simplest questions of nutrition—much to the advantage of everyone involved. Except perhaps the supposed beneficiary of all this nutritional advice: us, and our health and happiness as eaters. For the most important thing to know about the campaign to professionalize dietary advice is that it has not made us any healthier. To the contrary: As I argue in part one, most of the nutritional advice we’ve received over the last half century (and in particular the advice to replace the fats in our diets with carbohydrates) has actually made us less healthy and considerably fatter.
My aim in this book is to help us reclaim our health and happiness as eaters. To do this requires an exercise that might at first blush seem unnecessary, if not absurd: to offer a defense of food and the eating thereof. That food and eating stand in need of a defense might seem counterintuitive at a time when “overnutrition” is emerging as a more serious threat to public health than undernutrition. But I contend that most of what we’re consuming today is no longer, strictly speaking, food at all, and how we’re consuming it—in the car, in front of the TV, and, increasingly, alone—is not really eating, at least not in the sense that civilization has long understood the term. Jean-Anthelme Brillat-Savarin, the eighteenth-century gastronomist, drew a useful distinction between the alimentary activity of animals, which “feed,” and humans, who eat, or dine, a practice, he suggested, that owes as much to culture as it does to biology.
But if food and eating stand in need of a defense, from whom, or what, do they need defending? From nutrition science on one side and from the food industry on the other—and from the needless complications around eating that together they have fostered. As eaters we find ourselves increasingly in the grip of a Nutritional Industrial Complex—comprised of well-meaning, if error-prone, scientists and food marketers only too eager to exploit every shift in the nutritional consensus. Together, and with some crucial help from the government, they have constructed an ideology of nutritionism that, among other things, has convinced us of three pernicious myths: that what matters most is not the food but the “nutrient”; that because nutrients are invisible and incomprehensible to everyone but scientists, we need expert help in deciding what to eat; and that the purpose of eating is to promote a narrow concept of physical health. Because food in this view is foremost a matter of biology, it follows that we must try to eat “scientifically”—by the nutrient and the number and under the guidance of experts.
If such an approach to food doesn’t strike you as the least bit strange, that is probably because nutritionist thinking has become so pervasive as to be invisible. We forget that, historically, people have eaten for a great many reasons other than biological necessity. Food is also about pleasure, about community, about family and spirituality, about our relationship to the natural world, and about expressing our identity. As long as humans have been taking meals together, eating has been as much about culture as it has been about biology.
That eating should be foremost about bodily health is a relatively new and, I think, destructive idea—destructive not just of the pleasure of eating, which would be bad enough, but paradoxically of our health as well. Indeed, no people on earth worry more about the health consequences of their food choices than we Americans do—and no people suffer from as many diet-related health problems. We are becoming a nation of orthorexics: people with an unhealthy obsession with healthy eating.*
The scientists haven’t tested the hypothesis yet, but I’m willing to bet that when they do they’ll find an inverse correlation between the amount of time people spend worrying about nutrition and their overall health and happiness. This is, after all, the implicit lesson of the French paradox, so-called not by the French (Quel paradoxe?) but by American nutritionists, who can’t fathom how a people who enjoy their food as much as the French do, and blithely eat so many nutrients deemed toxic by nutritionists, could have substantially lower rates of heart disease than we do on our elaborately engineered low-fat diets. Maybe it’s time we confronted the American paradox: a notably unhealthy population preoccupied with nutrition and diet and the idea of eating healthily.
I don’t mean to suggest that all would be well if we could just stop worrying about food or the state of our dietary health: Let them eat Twinkies! There are in fact some very good reasons to worry. The rise of nutritionism reflects legitimate concerns that the American diet, which is well on its way to becoming the world’s diet, has changed in ways that are making us increasingly sick and fat. Four of the top ten causes of death today are chronic diseases with well-established links to diet: coronary heart disease, diabetes, stroke, and cancer. Yes, the rise to prominence of these chronic diseases is partly due to the fact that we’re not dying earlier in life of infectious diseases, but only partly: Even after adjusting for age, many of the so-called diseases of civilization were far less common a century ago—and they remain rare in places where people don’t eat the way we do.
I’m speaking, of course, of the elephant in the room whenever we discuss diet and health: “the Western diet.” This is the subject of the second part of the book, in which I follow the story of the most radical change to the way humans eat since the discovery of agriculture. All of our uncertainties about nutrition should not obscure the plain fact that the chronic diseases that now kill most of us can be traced directly to the industrialization of our food: the rise of highly processed foods and refined grains; the use of chemicals to raise plants and animals in huge monocultures; the superabundance of cheap calories of sugar and fat produced by modern agriculture; and the narrowing of the biological diversity of the human diet to a tiny handful of staple crops, notably wheat, corn, and soy. These changes have given us the Western diet that we take for granted: lots of processed foods and meat, lots of added fat and sugar, lots of everything—except vegetables, fruits, and whole grains.
That such a diet makes people sick and fat we have known for a long time. Early in the twentieth century, an intrepid group of doctors and medical workers stationed overseas observed that wherever in the world people gave up their traditional way of eating and adopted the Western diet, there soon followed a predictable series of Western diseases, including obesity, diabetes, cardiovascular diseases, and cancer. They called these the Western diseases and, though the precise causal mechanisms were (and remain) uncertain, these observers had little doubt these chronic diseases shared a common etiology: the Western diet.
What’s more, the traditional diets that the new Western foods displaced were strikingly diverse: Various populations thrived on diets that were what we’d call high fat, low fat, or high carb; all meat or all plant; indeed, there have been traditional diets based on just about any kind of whole food you can imagine. What this suggests is that the human animal is well adapted to a great many different diets. The Western diet, however, is not one of them.
Here, then, is a simple but crucial fact about diet and health, yet, curiously, it is a fact that nutritionism cannot see, probably because it developed in tandem with the industrialization of our food and so takes it for granted. Nutritionism prefers to tinker with the Western diet, adjusting the various nutrients (lowering the fat, boosting the protein) and fortifying processed foods rather than questioning their value in the first place. Nutritionism is, in a sense, the official ideology of the Western diet and so cannot be expected to raise radical or searching questions about it.
But we can. By gaining a firmer grasp on the nature of the Western diet—trying to understand it not only physiologically but also historically and ecologically—we can begin to develop a different way of thinking about food that might point a path out of our predicament. In doing so we have two sturdy—and strikingly hopeful—facts to guide us: first, that humans historically have been healthy eating a great many different diets; and second, that, as we’ll see, most of the damage to our food and health caused by the industrialization of our eating can be reversed. Put simply, we can escape the Western diet and its consequences.
This is the burden of the third and last section of In Defense of Food: to propose a couple dozen personal rules of eating that are conducive not only to better health but also to greater pleasure in eating, two goals that turn out to be mutually reinforcing.
These recommendations are a little different from the dietary guidelines you’re probably accustomed to. They are not, for example, narrowly prescriptive. I’m not interested in telling you what to have for dinner. No, these suggestions are more like eating algorithms, mental devices for thinking through our food choices. Because there is no single answer to the question of what to eat, these guidelines will produce as many different menus as there are people using them.
These rules of thumb are also not framed in the vocabulary of nutrition science. This is not because nutrition science has nothing important to teach us—it does, at least when it avoids the pitfalls of reductionism and overconfidence—but because I believe we have as much, if not more, to learn about eating from history and culture and tradition. We are accustomed in all matters having to do with health to assuming science should have the last word, but in the case of eating, other sources of knowledge and ways of knowing can be just as powerful, sometimes more so. And while I inevitably rely on science (even reductionist science) in attempting to understand many questions about food and health, one of my aims in this book is to show the limitations of a strictly scientific understanding of something as richly complex and multifaceted as food. Science has much of value to teach us about food, and perhaps someday scientists will “solve” the problem of diet, creating the nutritionally optimal meal in a pill, but for now and the foreseeable future, letting the scientists decide the menu would be a mistake. They simply do not know enough.
You may well, and rightly, wonder who am I to tell you how to eat? Here I am advising you to reject the advice of science and industry—and then blithely go on to offer my own advice. So on whose authority do I purport to speak? I speak mainly on the authority of tradition and common sense. Most of what we need to know about how to eat we already know, or once did until we allowed the nutrition experts and the advertisers to shake our confidence in common sense, tradition, the testimony of our senses, and the wisdom of our mothers and grandmothers.
Not that we had much choice in the matter. By the 1960s or so it had become all but impossible to sustain traditional ways of eating in the face of the industrialization of our food. If you wanted to eat produce grown without synthetic chemicals or meat raised on pasture without pharmaceuticals, you were out of luck. The supermarket had become the only place to buy food, and real food was rapidly disappearing from its shelves, to be replaced by the modern cornucopia of highly processed foodlike products. And because so many of these novelties deliberately lied to our senses with fake sweeteners and flavorings, we could no longer rely on taste or smell to know what we were eating.
Most of my suggestions come down to strategies for escaping the Western diet, but before the resurgence of farmers’ markets, the rise of the organic movement, and the renaissance of local agriculture now under way across the country, stepping outside the conventional food system simply was not a realistic option for most people. Now it is. We are entering a postindustrial era of food; for the first time in a generation it is possible to leave behind the Western diet without having also to leave behind civilization. And the more eaters who vote with their forks for a different kind of food, the more commonplace and accessible such food will become. Among other things, this book is an eater’s manifesto, an invitation to join the movement that is renovating our food system in the name of health—health in the very broadest sense of that word.
I doubt the last third of this book could have been written forty years ago, if only because there would have been no way to eat the way I propose without going back to the land and growing all your own food. It would have been the manifesto of a crackpot. There was really only one kind of food on the national menu, and that was whatever industry and nutritionism happened to be serving. Not anymore. Eaters have real choices now, and those choices have real consequences, for our health and the health of the land and the health of our food culture—all of which, as we will see, are inextricably linked. That anyone should need to write a book advising people to “eat food” could be taken as a measure of our alienation and confusion. Or we can choose to see it in a more positive light and count ourselves fortunate indeed that there is once again real food for us to eat.
I
THE AGE OF NUTRITIONISM
ONE FROM FOODS TO NUTRIENTS
If you spent any time at all in a supermarket in the 1980s, you might have noticed something peculiar going on. The food was gradually disappearing from the shelves. Not literally vanishing—I’m not talking about Soviet-style shortages. No, the shelves and refrigerated cases still groaned with packages and boxes and bags of various edibles, more of them landing every year in fact, but a great many of the traditional supermarket foods were steadily being replaced by “nutrients,” which are not the same thing. Where once the familiar names of recognizable comestibles—things like eggs or breakfast cereals or snack foods—claimed pride of place on the brightly colored packages crowding the aisles, now new, scientific-sounding terms like “cholesterol” and “fiber” and “saturated fat” began rising to large-type prominence. More important than mere foods, the presence or absence of these invisible substances was now generally believed to confer health benefits on their eaters. The implicit message was that foods, by comparison, were coarse, old-fashioned, and decidedly unscientific things—who could say what was in them really? But nutrients—those chemical compounds and minerals in foods that scientists have identified as important to our health—gleamed with the promise of scientific certainty. Eat more of the right ones, fewer of the wrong, and you would live longer, avoid chronic diseases, and lose weight.
Nutrients themselves had been around as a concept and a set of words since early in the nineteenth century. That was when William Prout, an English doctor and chemist, identified the three principal constituents of food—protein, fat, and carbohydrates—that would come to be known as macronutrients. Building on Prout’s discovery, Justus von Liebig, the great German scientist credited as one of the founders of organic chemistry, added a couple of minerals to the big three and declared that the mystery of animal nutrition—how food turns into flesh and energy—had been solved. This is the very same Liebig who identified the macronutrients in soil—nitrogen, phosphorus, and potassium (known to farmers and gardeners by their periodic table initials, N, P, and K). Liebig claimed that all that plants need to live and grow are these three chemicals, period. As with the plant, so with the person: In 1842, Liebig proposed a theory of metabolism that explained life strictly in terms of a small handful of chemical nutrients, without recourse to metaphysical forces such as “vitalism.”
Having cracked the mystery of human nutrition, Liebig went on to develop a meat extract—Liebig’s Extractum Carnis—that has come down to us as bouillon and concocted the first baby formula, consisting of cow’s milk, wheat flour, malted flour, and potassium bicarbonate.
Liebig, the father of modern nutritional science, had driven food into a corner and forced it to yield its chemical secrets. But the post–Liebig consensus that science now pretty much knew what was going on in food didn’t last long. Doctors began to notice that many of the babies fed exclusively on Liebig’s formula failed to thrive. (Not surprising, given that his preparation lacked vitamins as well as several essential fats and amino acids.) That Liebig might have overlooked a few little things in food also began to occur to doctors who observed that sailors on long ocean voyages often got sick, even when they had adequate supplies of protein, carbohydrates, and fat. Clearly the chemists were missing something—some essential ingredients present in the fresh plant foods (like oranges and potatoes) that miraculously cured the sailors. This observation led to the discovery early in the twentieth century of the first set of micronutrients, which the Polish biochemist Casimir Funk, harkening back to older vitalist ideas of food, christened “vitamines” in 1912 (“vita-” for life and “-amines” for organic compounds organized around nitrogen).
Vitamins did a lot for the prestige of nutritional science. These special molecules, which at first were isolated from foods and then later synthesized in a laboratory, could cure people of nutritional deficiencies such as scurvy or beriberi almost overnight in a convincing demonstration of reductive chemistry’s power. Beginning in the 1920s, vitamins enjoyed a vogue among the middle class, a group not notably afflicted by beriberi or scurvy. But the belief took hold that these magic molecules also promoted growth in children, long life in adults, and, in a phrase of the time, “positive health” in everyone. (And what would “negative health” be exactly?) Vitamins had brought a kind of glamour to the science of nutrition, and though certain elite segments of the population now began to eat by its expert lights, it really wasn’t until late in the twentieth century that nutrients began to push food aside in the popular imagination of what it means to eat.
No single event marked the shift from eating food to eating nutrients, although in retrospect a little-noticed political dustup in Washington in 1977 seems to have helped propel American culture down this unfortunate and dimly lighted path. Responding to reports of an alarming increase in chronic diseases linked to diet—including heart disease, cancer, obesity, and diabetes—the Senate Select Committee on Nutrition and Human Needs chaired by South Dakota Senator George McGovern held hearings on the problem. The committee had been formed in 1968 with a mandate to eliminate malnutrition, and its work had led to the establishment of several important food-assistance programs. Endeavoring now to resolve the question of diet and chronic disease in the general population represented a certain amount of mission creep, but all in a good cause to which no one could possibly object.
After taking two days of testimony on diet and killer diseases, the committee’s staff—comprised not of scientists or doctors but of lawyers and (ahem) journalists—set to work preparing what it had every reason to assume would be an uncontroversial document called Dietary Goals for the United States. The committee learned that while rates of coronary heart disease had soared in America since World War II, certain other cultures that consumed traditional diets based mostly on plants had strikingly low rates of chronic diseases. Epidemiologists had also observed that in America during the war years, when meat and dairy products were strictly rationed, the rate of heart disease had temporarily plummeted, only to leap upward once the war was over.
Beginning in the 1950s, a growing body of scientific opinion held that the consumption of fat and dietary cholesterol, much of which came from meat and dairy products, was responsible for rising rates of heart disease during the twentieth century. The “lipid hypothesis,” as it was called, had already been embraced by the American Heart Association, which in 1961 had begun recommending a “prudent diet” low in saturated fat and cholesterol from animal products. True, actual proof for the lipid hypothesis was remarkably thin in 1977—it was still very much a hypothesis, but one well on its way to general acceptance.
In January 1977, the committee issued a fairly straightforward set of dietary guidelines, calling on Americans to cut down on their consumption of red meat and dairy products. Within weeks a firestorm of criticism, emanating chiefly from the red meat and dairy industries, engulfed the committee, and Senator McGovern (who had a great many cattle ranchers among his South Dakota constituents) was forced to beat a retreat. The committee’s recommendations were hastily rewritten. Plain talk about actual foodstuffs—the committee had advised Americans to “reduce consumption of meat”—was replaced by artful compromise: “choose meats, poultry, and fish that will reduce saturated fat intake.”
Leave aside for now the virtues, if any, of a low-meat and/or low-fat diet, questions to which I will return, and focus for a moment on language. For with these subtle changes in wording a whole way of thinking about food and health underwent a momentous shift. First, notice that the stark message to “eat less” of a particular food—in this case meat—had been deep-sixed; don’t look for it ever again in any official U.S. government dietary pronouncement. Say what you will about this or that food, you are not allowed officially to tell people to eat less of it or the industry in question will have you for lunch. But there is a path around this immovable obstacle, and it was McGovern’s staffers who blazed it: Speak no more of foods, only nutrients. Notice how in the revised guidelines, distinctions between entities as different as beef and chicken and fish have collapsed. These three venerable foods, each representing not just a different species but an entirely different taxonomic class, are now lumped together as mere delivery systems for a single nutrient. Notice too how the new language exonerates the foods themselves. Now the culprit is an obscure, invisible, tasteless—and politically unconnected—substance that may or may not lurk in them called saturated fat.
The linguistic capitulation did nothing to rescue McGovern from his blunder. In the very next election, in 1980, the beef lobby succeeded in rusticating the three-term senator, sending an unmistakable warning to anyone who would challenge the American diet, and in particular the big chunk of animal protein squatting in the middle of its plate. Henceforth, government dietary guidelines would shun plain talk about whole foods, each of which has its trade association on Capitol Hill, but would instead arrive dressed in scientific euphemism and speaking of nutrients, entities that few Americans (including, as we would find out, American nutrition scientists) really understood but that, with the notable exception of sucrose, lack powerful lobbies in Washington.*
The lesson of the McGovern fiasco was quickly absorbed by all who would pronounce on the American diet. When a few years later the National Academy of Sciences looked into the question of diet and cancer, it was careful to frame its recommendations nutrient by nutrient rather than food by food, to avoid offending any powerful interests. We now know the academy’s panel of thirteen scientists adopted this approach over the objections of at least two of its members who argued that most of the available science pointed toward conclusions about foods, not nutrients. According to T. Colin Campbell, a Cornell nutritional biochemist who served on the panel, all of the human population studies linking dietary fat to cancer actually showed that the groups with higher cancer rates consumed not just more fats, but also more animal foods and fewer plant foods as well. “This meant that these cancers could just as easily be caused by animal protein, dietary cholesterol, something else exclusively found in animal-based foods, or a lack of plant-based foods,” Campbell wrote years later. The argument fell on deaf ears.
In the case of the “good foods” too, nutrients also carried the day: The language of the final report highlighted the benefits of the antioxidants in vegetables rather than the vegetables themselves. Joan Gussow, a Columbia University nutritionist who served on the panel, argued against the focus on nutrients rather than whole foods. “The really important message in the epidemiology, which is all we had to go on, was that some vegetables and citrus fruits seemed to be protective against cancer. But those sections of the report were written as though it was the vitamin C in the citrus or the beta-carotene in the vegetables that was responsible for the effect. I kept changing the language to talk about ‘foods that contain vitamin C’ and ‘foods that contain carotenes.’ Because how do you know it’s not one of the other things in the carrots or the broccoli? There are hundreds of carotenes. But the biochemists had their answer: ‘You can’t do a trial on broccoli.’”
So the nutrients won out over the foods. The panel’s resort to scientific reductionism had the considerable virtue of being both politically expedient (in the case of meat and dairy) and, to these scientific heirs of Justus von Liebig, intellectually sympathetic. With each of its chapters focused on a single nutrient, the final draft of the National Academy of Sciences report, Diet, Nutrition and Cancer, framed its recommendations in terms of saturated fats and antioxidants rather than beef and broccoli.
In doing so, the 1982 National Academy of Sciences report helped codify the official new dietary language, the one we all still speak. Industry and media soon followed suit, and terms like polyunsaturated, cholesterol, monounsaturated, carbohydrate, fiber, polyphenols, amino acids, flavonols, carotenoids, antioxidants, probiotics, and phytochemicals soon colonized much of the cultural space previously occupied by the tangible material formerly known as food.
The Age of Nutritionism had arrived.
TWO NUTRITIONISM DEFINED
The term isn’t mine. It was coined by an Australian sociologist of science by the name of Gyorgy Scrinis, and as near as I can determine first appeared in a 2002 essay titled “Sorry Marge” published in an Australian quarterly called Meanjin. “Sorry Marge” looked at margarine as the ultimate nutritionist product, able to shift its identity (no cholesterol! one year, no trans fats! the next) depending on the prevailing winds of dietary opinion. But Scrinis had bigger game in his sights than spreadable vegetable oil. He suggested that we look past the various nutritional claims swirling around margarine and butter and consider the underlying message of the debate itself: “namely, that we should understand and engage with food and our bodies in terms of their nutritional and chemical constituents and requirements—the assumption being that this is all we need to understand.” This reductionist way of thinking about food had been pointed out and criticized before (notably by the Canadian historian Harvey Levenstein, the British nutritionist Geoffrey Cannon, and the American nutritionists Joan Gussow and Marion Nestle), but it had never before been given a proper name: “nutritionism.” Proper names have a way of making visible things we don’t easily see or simply take for granted.
The first thing to understand about nutritionism is that it is not the same thing as nutrition. As the “-ism” suggests, it is not a scientific subject but an ideology. Ideologies are ways of organizing large swaths of life and experience under a set of shared but unexamined assumptions. This quality makes an ideology particularly hard to see, at least while it’s still exerting its hold on your culture. A reigning ideology is a little like the weather—all pervasive and so virtually impossible to escape. Still, we can try.
In the case of nutritionism, the widely shared but unexamined assumption is that the key to understanding food is indeed the nutrient. Put another way: Foods are essentially the sum of their nutrient parts. From this basic premise flow several others.
Since nutrients, as compared with foods, are invisible and therefore slightly mysterious, it falls to the scientists (and to the journalists through whom the scientists reach the public) to explain the hidden reality of foods to us. In form this is a quasireligious idea, suggesting the visible world is not the one that really matters, which implies the need for a priesthood. For to enter a world where your dietary salvation depends on unseen nutrients, you need plenty of expert help.
But expert help to do what exactly? This brings us to another unexamined assumption of nutritionism: that the whole point of eating is to maintain and promote bodily health. Hippocrates’ famous injunction to “let food be thy medicine” is ritually invoked to support this notion. I’ll leave the premise alone for now, except to point out that it is not shared by all cultures and, further, that the experience of these other cultures suggests that, paradoxically, regarding food as being about things other than bodily health—like pleasure, say, or sociality or identity—makes people no less healthy; indeed, there’s some reason to believe it may make them more healthy. This is what we usually have in mind when we speak of the French paradox. So there is at least a question as to whether the ideology of nutritionism is actually any good for you.
It follows from the premise that food is foremost about promoting physical health that the nutrients in food should be divided into the healthy ones and the unhealthy ones—good nutrients and bad. This has been a hallmark of nutritionist thinking from the days of Liebig, for whom it wasn’t enough to identify the nutrients; he also had to pick favorites, and nutritionists have been doing so ever since. Liebig claimed that protein was the “master nutrient” in animal nutrition, because he believed it drove growth. Indeed, he likened the role of protein in animals to that of nitrogen in plants: Protein (which contains nitrogen) comprised the essential human fertilizer. Liebig’s elevation of protein dominated nutritionist thinking for decades as public health authorities worked to expand access to and production of the master nutrient (especially in the form of animal protein), with the goal of growing bigger, and therefore (it was assumed) healthier, people. (A high priority for Western governments fighting imperial wars.) To a considerable extent we still have a food system organized around the promotion of protein as the master nutrient. It has given us, among other things, vast amounts of cheap meat and milk, which have in turn given us much, much bigger people. Whether they are healthier too is another question.
It seems to be a rule of nutritionism that for every good nutrient, there must be a bad nutrient to serve as its foil, the latter a focus for our food fears and the former for our enthusiasms. A backlash against protein arose in America at the turn of the last century as diet gurus like John Harvey Kellogg and Horace Fletcher (about whom more later) railed against the deleterious effects of protein on digestion (it supposedly led to the proliferation of toxic bacteria in the gut) and promoted the cleaner, more wholesome carbohydrate in its place. The legacy of that revaluation is the breakfast cereal, the strategic objective of which was to dethrone animal protein at the morning meal.
Ever since, the history of modern nutritionism has been a history of macronutrients at war: protein against carbs; carbs against proteins, and then fats; fats against carbs. Beginning with Liebig, in each age nutritionism has organized most of its energies around an imperial nutrient: protein in the nineteenth century, fat in the twentieth, and, it stands to reason, carbohydrates will occupy our attention in the twenty-first. Meanwhile, in the shadow of these titanic struggles, smaller civil wars have raged within the sprawling empires of the big three: refined carbohydrates versus fiber; animal protein versus plant protein; saturated fats versus polyunsaturated fats; and then, deep down within the province of the polyunsaturates, omega-3 fatty acids versus omega-6s. Like so many ideologies, nutritionism at bottom hinges on a form of dualism, so that at all times there must be an evil nutrient for adherents to excoriate and a savior nutrient for them to sanctify. At the moment, trans fats are performing admirably in the former role, omega-3 fatty acids in the latter. It goes without saying that such a Manichaean view of nutrition is bound to promote food fads and phobias and large abrupt swings of the nutritional pendulum.
Another potentially serious weakness of nutritionist ideology is that, focused so relentlessly as it is on the nutrients it can measure, it has trouble discerning qualitative distinctions among foods. So fish, beef, and chicken through the nutritionist’s lens become mere delivery systems for varying quantities of different fats and proteins and whatever other nutrients happen to be on their scope. Milk through this lens is reduced to a suspension of protein, lactose, fats, and calcium in water, when it is entirely possible that the benefits, or for that matter the hazards, of drinking milk owe to entirely other factors (growth hormones?) or relationships between factors (fat-soluble vitamins and saturated fat?) that have been overlooked. Milk remains a food of humbling complexity, to judge by the long, sorry saga of efforts to simulate it. The entire history of baby formula has been the history of one overlooked nutrient after another: Liebig missed the vitamins and amino acids, and his successors missed the omega-3s, and still to this day babies fed on the most “nutritionally complete” formula fail to do as well as babies fed human milk. Even more than margarine, infant formula stands as the ultimate test product of nutritionism and a fair index of its hubris.
This brings us to one of the most troubling features of nutritionism, though it is a feature certainly not troubling to all. When the emphasis is on quantifying the nutrients contained in foods (or, to be precise, the recognized nutrients in foods), any qualitative distinction between whole foods and processed foods is apt to disappear. “[If] foods are understood only in terms of the various quantities of nutrients they contain,” Gyorgy Scrinis wrote, then “even processed foods may be considered to be ‘healthier’ for you than whole foods if they contain the appropriate quantities of some nutrients.”
How convenient.
THREE NUTRITIONISM COMES TO MARKET
No idea could be more sympathetic to manufacturers of processed foods, which surely explains why they have been so happy to jump on the nutritionism bandwagon. Indeed, nutritionism supplies the ultimate justification for processing food by implying that with a judicious application of food science, fake foods can be made even more nutritious than the real thing. This of course is the story of margarine, the first important synthetic food to slip into our diet. Margarine started out in the nineteenth century as a cheap and inferior substitute for butter, but with the emergence of the lipid hypothesis in the 1950s, manufacturers quickly figured out that their product, with some tinkering, could be marketed as better—smarter!—than butter: butter with the bad nutrients removed (cholesterol and saturated fats) and replaced with good nutrients (polyunsaturated fats and then vitamins). Every time margarine was found wanting, the wanted nutrient could simply be added (Vitamin D? Got it now. Vitamin A? Sure, no problem). But of course margarine, being the product not of nature but of human ingenuity, could never be any smarter than the nutritionists dictating its recipe, and the nutritionists turned out to be not nearly as smart as they thought. The food scientists’ ingenious method for making healthy vegetable oil solid at room temperature—by blasting it with hydrogen—turned out to produce unhealthy trans fats, fats that we now know are more dangerous than the saturated fats they were designed to replace. Yet the beauty of a processed food like margarine is that it can be endlessly reengineered to overcome even the most embarrassing about-face in nutritional thinking—including the real wincer that its main ingredient might cause heart attacks and cancer. So now the trans fats are gone, and margarine marches on, unfazed and apparently unkillable. Too bad the same cannot be said of an unknown number of margarine eaters.
By now we have become so inured to fake foods that we forget what a difficult trail margarine had to blaze before it and other synthetic food products could win government and consumer acceptance. At least since the 1906 publication of Upton Sinclair’s The Jungle, the “adulteration” of common foods has been a serious concern of the eating public and the target of numerous federal laws and Food and Drug Administration regulations. Many consumers regarded “oleomargarine” as just such an adulteration, and in the late 1800s five states passed laws requiring that all butter imitations be dyed pink so no one would be fooled. The Supreme Court struck down the laws in 1898. In retrospect, had the practice survived, it might have saved some lives.
The 1938 Food, Drug and Cosmetic Act imposed strict rules requiring that the word “imitation” appear on any food product that was, well, an imitation. Read today, the official rationale behind the imitation rule seems at once commonsensical and quaint: “…there are certain traditional foods that everyone knows, such as bread, milk and cheese, and that when consumers buy these foods, they should get the foods they are expecting…[and] if a food resembles a standardized food but does not comply with the standard, that food must be labeled as an ‘imitation.’”
Hard to argue with that…but the food industry did, strenuously for decades, and in 1973 it finally succeeded in getting the imitation rule tossed out, a little-noticed but momentous step that helped speed America down the path to nutritionism.
Industry hated the imitation rule. There had been such a tawdry history of adulterated foods and related forms of snake oil in American commerce that slapping the word “imitation” on a food product was the kiss of death—an admission of adulteration and inferiority. By the 1960s and 1970s, the requirement that such a pejorative term appear on fake food packages stood in the way of innovation, indeed of the wholesale reformulation of the American food supply—a project that, in the wake of rising concerns about dietary fat and cholesterol, was coming to be seen as a good thing. What had been regarded as hucksterism and fraud in 1906 had begun to look like sound public health policy by 1973. The American Heart Association, eager to get Americans off saturated fats and onto vegetable oils (including hydrogenated vegetable oils), was actively encouraging the food industry to “modify” various foods to get the saturated fats and cholesterol out of them, and in the early seventies the association urged that “any existing and regulatory barriers to the marketing of such foods be removed.”
And so they were when, in 1973, the FDA (not, note, the Congress that wrote the law) simply repealed the 1938 rule concerning imitation foods. It buried the change in a set of new, seemingly consumer-friendly rules about nutrient labeling so that news of the imitation rule’s repeal did not appear until the twenty-seventh paragraph of The New York Times’ account, published under the headline F.D.A. PROPOSES SWEEPING CHANGE IN FOOD LABELING: NEW RULES DESIGNED TO GIVE CONSUMERS A BETTER IDEA OF NUTRITIONAL VALUE. (The second deck of the headline gave away the game: PROCESSORS BACK MOVE.) The revised imitation rule held that as long as an imitation product was not “nutritionally inferior” to the natural food it sought to impersonate—as long as it had the same quantities of recognized nutrients—the imitation could be marketed without using the dreaded “i” word.
With that, the regulatory door was thrown open to all manner of faked low-fat products: Fats in things like sour cream and yogurt could now be replaced with hydrogenated oils or guar gum or carrageenan, bacon bits could be replaced with soy protein, the cream in “whipped cream” and “coffee creamer” could be replaced with corn starch, and the yolks of liquefied eggs could be replaced with, well, whatever the food scientists could dream up, because the sky was now the limit. As long as the new fake foods were engineered to be nutritionally equivalent to the real article, they could no longer be considered fake. Of course the operative nutritionist assumption here is that we know enough to determine nutritional equivalence—something that the checkered history of baby formula suggests has never been the case.
Nutritionism had become the official ideology of the Food and Drug Administration; for all practical purposes the government had redefined foods as nothing more than the sum of their recognized nutrients. Adulteration had been repositioned as food science. All it would take now was a push from McGovern’s Dietary Goals for hundreds of “traditional foods that everyone knows” to begin their long retreat from the supermarket shelves and for our eating to become more “scientific.”
FOUR FOOD SCIENCE’S GOLDEN AGE
In the years following the 1977 Dietary Goals and the 1982 National Academy of Sciences report on diet and cancer, the food industry, armed with its regulatory absolution, set about reengineering thousands of popular food products to contain more of the nutrients that science and government had deemed the good ones and fewer of the bad. A golden age for food science dawned. Hyphens sprouted like dandelions in the supermarket aisles: low-fat, no-cholesterol, high-fiber. Ingredients labels on formerly two- or three-ingredient foods such as mayonnaise and bread and yogurt ballooned with lengthy lists of new additives—what in a more benighted age would have been called adulterants. The Year of Eating Oat Bran—also known as 1988—served as a kind of coming-out party for the food scientists, who succeeded in getting the material into nearly every processed food sold in America. Oat bran’s moment on the dietary stage didn’t last long, but the pattern now was set, and every few years since then, a new oat bran has taken its star turn under the marketing lights. (Here come omega-3s!)
You would not think that common food animals could themselves be rejiggered to fit nutritionist fashion, but in fact some of them could be, and were, in response to the 1977 and 1982 dietary guidelines as animal scientists figured out how to breed leaner pigs and select for leaner beef. With widespread lipophobia taking hold of the human population, countless cattle lost their marbling and lean pork was repositioned as “the new white meat”—tasteless and tough as running shoes, perhaps, but now even a pork chop could compete with chicken as a way for eaters to “reduce saturated fat intake.” In the years since then, egg producers figured out a clever way to redeem even the disreputable egg: By feeding flaxseed to hens, they could elevate levels of omega-3 fatty acids in the yolks. Aiming to do the same thing for pork and beef fat, the animal scientists are now at work genetically engineering omega-3 fatty acids into pigs and persuading cattle to lunch on flaxseed in the hope of introducing the blessed fish fat where it had never gone before: into hot dogs and hamburgers.
But these whole foods are the exceptions. The typical whole food has much more trouble competing under the rules of nutritionism, if only because something like a banana or an avocado can’t quite as readily change its nutritional stripes. (Though rest assured the genetic engineers are hard at work on the problem.) To date, at least, they can’t put oat bran in a banana or omega-3s in a peach. So depending on the reigning nutritional orthodoxy, the avocado might either be a high-fat food to be assiduously avoided (Old Think) or a food high in monounsaturated fat to be embraced (New Think). The fate and supermarket sales of each whole food rise and fall with every change in the nutritional weather, while the processed foods simply get reformulated and differently supplemented. That’s why when the Atkins diet storm hit the food industry in 2003, bread and pasta got a quick redesign (dialing back the carbs; boosting the proteins) while poor unreconstructed potatoes and carrots were left out in the carbohydrate cold. (The low-carb indignities visited on bread and pasta, two formerly “traditional foods that everyone knows,” would never have been possible had the imitation rule not been tossed out in 1973. Who would ever buy imitation spaghetti? But of course that is precisely what low-carb pasta is.)
A handful of lucky whole foods have recently gotten the “good nutrient” marketing treatment: The antioxidants in the pomegranate (a fruit formerly more trouble to eat than it was worth) now protect against cancer and erectile dysfunction, apparently, and the omega-3 fatty acids in the (formerly just fattening) walnut ward off heart disease. A whole subcategory of nutritional science—funded by industry and, according to one recent analysis,* remarkably reliable in its ability to find a health benefit in whatever food it has been commissioned to study—has sprung up to give a nutritionist sheen (and FDA-approved health claim) to all sorts of foods, including some not ordinarily thought of as healthy. The Mars Corporation recently endowed a chair in chocolate science at the University of California at Davis, where research on the antioxidant properties of cacao is making breakthroughs, so it shouldn’t be long before we see chocolate bars bearing FDA-approved health claims. (When we do, nutritionism will surely have entered its baroque phase.) Fortunately for everyone playing this game, scientists can find an antioxidant in just about any plant-based food they choose to study.
Yet as a general rule it’s a whole lot easier to slap a health claim on a box of sugary cereal than on a raw potato or a carrot, with the perverse result that the most healthful foods in the supermarket sit there quietly in the produce section, silent as stroke victims, while a few aisles over in Cereal the Cocoa Puffs and Lucky Charms are screaming their newfound “whole-grain goodness” to the rafters.
Watch out for those health claims.
FIVE THE MELTING OF THE LIPID HYPOTHESIS
Nutritionism is good for the food business. But is it good for us? You might think that a national fixation on nutrients would lead to measurable improvements in public health. For that to happen, however, the underlying nutritional science and the policy recommendations (not to mention the journalism) based on that science would both have to be sound. This has seldom been the case.
The most important such nutrition campaign has been the thirty-year effort to reform the food supply and our eating habits in light of the lipid hypothesis—the idea that dietary fat is responsible for chronic disease. At the behest of government panels, nutrition scientists, and public health officials, we have dramatically changed the way we eat and the way we think about food, in what stands as the biggest experiment in applied nutritionism in history. Thirty years later, we have good reason to believe that putting the nutritionists in charge of the menu and the kitchen has not only ruined an untold number of meals, but also has done little for our health, except very possibly to make it worse.
These are strong words, I know. Here are a couple more: What the Soviet Union was to the ideology of Marxism, the Low-Fat Campaign is to the ideology of nutritionism—its supreme test and, as now is coming clear, its most abject failure. You can argue, as some diehards will do, that the problem was one of faulty execution, or you can accept that the underlying tenets of the ideology contained the seeds of the eventual disaster.
At this point you’re probably saying to yourself, Hold on just a minute. Are you really saying the whole low-fat deal was bogus? But my supermarket is still packed with low-fat this and no-cholesterol that! My doctor is still on me about my cholesterol and telling me to switch to low-fat everything. I was flabbergasted at the news too, because no one in charge—not in the government, not in the public health community—has dared to come out and announce: Um, you know everything we’ve been telling you for the last thirty years about the links between dietary fat and heart disease? And fat and cancer? And fat and fat? Well, this just in: It now appears that none of it was true. We sincerely regret the error.
No, the admissions of error have been muffled, and the mea culpas impossible to find. But read around in the recent scientific literature and you will find a great many scientists beating a quiet retreat from the main tenets of the lipid hypothesis. Let me offer just one example, an article from a group of prominent nutrition scientists at the Harvard School of Public Health. In a recent review of the relevant research called “Types of Dietary Fat and Risk of Coronary Heart Disease: A Critical Review,”* the authors proceed to calmly remove, one by one, just about every strut supporting the theory that dietary fat causes heart disease.
Hu and his colleagues begin with a brief, uninflected summary of the lipophobic era that is noteworthy mostly for casting the episode in the historical past:
During the past several decades, reduction in fat intake has been the main focus of national dietary recommendations. In the public’s mind, the words “dietary fat” have become synonymous with obesity and heart disease, whereas the words “low-fat” and “fat-free” have been synonymous with heart health.
We can only wonder how in the world such crazy ideas ever found their way into the “public’s mind.” Surely not from anyone associated with the Harvard School of Public Health, I would hope. Well, as it turns out, the selfsame group, formerly in thrall to the lipid hypothesis, was recommending until the early 1990s, when the evidence about the dangers of trans fats could no longer be ignored, that people reduce their saturated fat intake by switching from butter to margarine. (Though red flags about trans fats can be spotted as far back as 1956, when Ancel Keys, the father of the lipid hypothesis, suggested that rising consumption of hydrogenated vegetable oils might be responsible for the twentieth-century rise in coronary heart disease.)
But back to the critical review, which in its second paragraph drops this bombshell:
It is now increasingly recognized that the low-fat campaign has been based on little scientific evidence and may have caused unintended health consequences.
Say what?
The article then goes on blandly to survey the crumbling foundations of the lipid hypothesis, circa 2001: Only two studies have ever found “a significant positive association between saturated fat intake and risk of CHD [coronary heart disease]”; many more have failed to find an association. Only one study has ever found “a significant inverse association between polyunsaturated fat intake and CHD.” Let me translate: The amount of saturated fat in the diet probably has little if any bearing on the risk of heart disease, and evidence that increasing polyunsaturated fats in the diet will reduce risk is slim to nil. As for the dangers of dietary cholesterol, the review found “a weak and nonsignificant positive association between dietary cholesterol and risk of CHD.” (Someone should tell the food processors, who continue to treat dietary cholesterol as a matter of life and death.) “Surprisingly,” the authors wrote, “there is little direct evidence linking higher egg consumption and increased risk of CHD”—surprising, because eggs are particularly high in cholesterol.
By the end of the review, there is one strong association between a type of dietary fat and heart disease left standing, and it happens to be precisely the type of fat that the low-fat campaigners have spent most of the last thirty years encouraging us to consume more of: trans fats. It turns out that “a higher intake of trans fat can contribute to increased risk of CHD through multiple mechanisms”; to wit, it raises bad cholesterol and lowers good cholesterol (something not even the evil saturated fats can do); it increases triglycerides, a risk factor for CHD; it promotes inflammation and possibly thrombogenesis (clotting), and it may promote insulin resistance. Trans fat is really bad stuff, apparently, fully twice as bad as saturated fat in its impact on cholesterol ratios. If any of the authors of the critical review are conscious of the cosmic irony here—that the principal contribution of thirty years of official nutritional advice has been to replace a possibly mildly unhealthy fat in our diets with a demonstrably lethal one—they are not saying.
The paper is not quite prepared to throw out the entire lipid hypothesis, but by the end precious little of it is left standing. The authors conclude that while total levels of fat in the diet apparently have little bearing on the risk of heart disease (!), the ratio between types of fats does. Adding omega-3 fatty acids to the diet (that is, eating more of a certain kind of fat) “substantially reduces coronary and total mortality” in heart patients, and replacing saturated fats with polyunsaturated fats lowers blood cholesterol, which they deem an important risk factor for CHD. (Some researchers no longer do, pointing out that half the people who get heart attacks don’t have elevated cholesterol levels, and about half the people with elevated cholesterol do not suffer from CHD.) One other little grenade is dropped in the paper’s conclusion: Although “a major purported benefit of a low-fat diet is weight loss,” a review of the literature failed to turn up any convincing evidence of this proposition. To the contrary, it found “some evidence” that replacing fats in the diet with carbohydrates (as official dietary advice has urged us to do since the 1970s) will lead to weight gain.
I have dwelled on this paper because it fairly reflects the current thinking on the increasingly tenuous links between dietary fat and health. The lipid hypothesis is quietly melting away, but no one in the public health community, or the government, seems quite ready to publicly acknowledge it. For fear of what exactly? That we’ll binge on bacon double cheeseburgers? More likely that we’ll come to the unavoidable conclusion that the emperors of nutrition have no clothes and never listen to them again.
In fact, there have been dissenters to the lipid hypothesis all along, lipid biochemists like Mary Enig (who has been sounding the alarm on trans fats since the 1970s) and nutritionists like Fred Kummerow and John Yudkin (who have been sounding the alarm on refined carbohydrates, also since the 1970s), but these critics have always had trouble getting a hearing, especially after 1977, when the McGovern guidelines effectively closed off debate on the lipid hypothesis.