THREE   THE INDUSTRIALIZATION OF EATING: WHAT WE DO KNOW

I’ve dwelled on the all-but-forgotten ideas of people like Weston Price and Sir Albert Howard—ecological thinkers about the human food chain—because they point us down a path that might lead the way out of the narrow, and ultimately unhelpful, confines of nutritionism: of thinking about food strictly in terms of its chemical constituents. What we need now, it seems to me, is to create a broader, more ecological—and more cultural—view of food. So let us try.

What would happen if we were to start thinking about food as less of a thing and more of a relationship? In nature, that is of course precisely what eating has always been: relationships among species in systems we call food chains, or food webs, that reach all the way down to the soil. Species coevolve with the other species that they eat, and very often there develops a relationship of interdependence: I’ll feed you if you spread around my genes. A gradual process of mutual adaptation transforms something like an apple or a squash into a nutritious and tasty food for an animal. Over time and through trial and error, the plant becomes tastier (and often more conspicuous) in order to gratify the animal’s needs and desires, while the animal gradually acquires whatever digestive tools (enzymes, for example) it needs to make optimal use of the plant.

Similarly, the milk of cows did not start out as a nutritious food for humans; in fact, it made them sick until people who lived around cows evolved the ability to digest milk as adults. The gene for the production of a milk-digesting enzyme called lactase used to switch off in humans shortly after weaning until about five thousand years ago, when a mutation that kept the gene switched on appeared and quickly spread through a population of animal herders in north-central Europe. Why? Because the people possessing the new mutation then had access to a terrifically nutritious new food source and as a consequence were able to produce more offspring than the people who lacked it. This development proved much to the advantage of both the milk drinkers and the cows, whose numbers and habitat (and health) greatly improved as a result of this new symbiotic relationship.

Health is, among other things, the product of being in these sorts of relationships in a food chain—a great many such relationships in the case of an omnivorous creature like man. It follows that when the health of one part of the food chain is disturbed, it can affect all the other creatures in it. If the soil is sick or in some way deficient, so will be the grasses that grow in that soil and the cattle that eat the grasses and the people who drink the milk from them. This is precisely what Weston Price and Sir Albert Howard had in mind when they sought to connect the seemingly distant realms of soil and human health. Our personal health cannot be divorced from the health of the entire food web.

In many cases, long familiarity between foods and their eaters leads to elaborate systems of communication up and down the food chain so that a creature’s senses come to recognize foods as suitable by their taste and smell and color. Very often these signals are “sent” by the foods themselves, which may have their own reasons for wanting to be eaten. Ripeness in fruit is often signaled by a distinctive smell (an appealing scent that can travel over distances), or color (one that stands out from the general green), or taste (typically sweet). Ripeness, which is the moment when the seeds of the plant are ready to go off and germinate, typically coincides with the greatest concentration of nutrients in a fruit, so the interests of the plant (for transportation) align with those of the plant eater (for nutriment). Our bodies, having received these signals and determined this fruit is good to eat, now produce in anticipation precisely the enzymes and acids needed to break it down. Health depends heavily on knowing how to read these biological signals: This looks ripe; this smells spoiled; that’s one slick-looking cow. This is much easier to do when you have long experience of a food and much harder when a food has been expressly designed to deceive your senses with, say, artificial flavors or synthetic sweeteners. Foods that lie to our senses are one of the most challenging features of the Western diet.

Note that these ecological relationships are, at least in the first instance, between eaters and whole foods, not nutrients or chemicals. Even though the foods in question eventually get broken down in our bodies into simple chemical compounds, as corn is reduced mostly to simple sugars, the qualities of the whole foods are not unimportant. The amount and structure of the fiber in that corn, for example, will determine such things as the speed at which the sugars in it will be released and absorbed, something we’ve learned is critical to insulin metabolism. The chemist will tell you the starch in corn is on its way to becoming glucose in the blood, but that reductive understanding overlooks the complex and variable process by which that happens. Contrary to the nutrition label, not all carbohydrates are created equal.

Put another way, our bodies have a long-standing and sustainable relationship to corn that they do not have to high-fructose corn syrup. Such a relationship with corn syrup might develop someday (as people evolve superhuman insulin systems to cope with regular floods of pure fructose and glucose*), but for now the relationship leads to ill health because our bodies don’t know how to handle these biological novelties. In much the same way, human bodies that can cope with chewing coca leaves—a longstanding relationship between native people and the coca plant in parts of South America—cannot cope with cocaine or crack, even though the same active ingredients are present in all three. Reductionism as a way of understanding food or drugs may be harmless, even necessary, but reductionism in practice—reducing food or drug plants to their most salient chemical compounds—can lead to problems.

Looking at eating, and food, through this ecological lens opens a whole new perspective on exactly what the Western diet is: a radical and, at least in evolutionary terms, abrupt set of changes over the course of the last 150 years, not just to our foodstuffs but also to our food relationships, all the way from the soil to the meal. The rise of the ideology of nutritionism is itself part of that change. When we think of a species’ “environment,” we usually think in terms of things like geography, predators and prey, and the weather. But of course one of the most critical components of any creature’s environment is the nature of the food available to it and its relationships to the species it eats. Much is at stake when a creature’s food environment changes. For us, the first big change came ten thousand years ago with the advent of agriculture. (And it devastated our health, leading to a panoply of deficiencies and infectious diseases that we’ve only managed to get under control in the last century or so.) The biggest change in our food environment since then? The advent of the modern diet.

To get a better grip on the nature of these changes is to begin to understand how we might alter our relationship to food—for the better, for our health. These changes have been numerous and far reaching, but consider as a start these five fundamental transformations to our foods and ways of eating. All of them can be reversed, if not perhaps so easily in the food system as a whole, certainly in the life and diet of any individual eater, and without, I hasten to add, returning to the bush or taking up hunting and gathering.

1) From Whole Foods to Refined

The case of corn points to one of the key features of the modern diet: a shift toward increasingly refined foods, especially carbohydrates. People have been refining cereal grains since at least the Industrial Revolution, favoring white flour and white rice over brown, even at the price of lost nutrients. Part of the reason was prestige: Because for many years only the wealthy could afford refined grains, they acquired a certain glamour. Refining grains extends their shelf life (precisely because they are less nutritious to the pests that compete with us for their calories) and makes them easier to digest by removing the fiber that ordinarily slows the release of their sugars. Also, the finer that flour is ground, the more surface area is exposed to digestive enzymes, so the quicker the starches turn to glucose. A great deal of modern industrial food can be seen as an extension and intensification of this practice as food processors find ways to deliver glucose—the brain’s preferred fuel—ever more swiftly and efficiently. Sometimes this is precisely the point, as when corn is refined into corn syrup; other times, though, it is an unfortunate by-product of processing food for other reasons.

Viewed from this perspective, the history of refining whole foods has been a history of figuring out ways not just to make them more durable and portable, but also how to concentrate their energy and, in a sense, speed them up. This acceleration took a great leap forward with the introduction in Europe around 1870 of rollers (made from iron, steel, or porcelain) for grinding grain. Perhaps more than any other single development, this new technology, which by 1880 had replaced grinding by stone throughout Europe and America, marked the beginning of the industrialization of our food—reducing it to its chemical essence and speeding up its absorption. Refined flour is the first fast food.

Before the roller-milling revolution, wheat was ground between big stone wheels, which could make flour only so white. That’s because while stone grinding removed the bran from the wheat kernel (and therefore the largest portion of the fiber), it couldn’t remove the germ, or embryo, which contains volatile oils that are rich in nutrients. The stone wheels merely crushed the germ and released the oil. This had the effect of tinting the flour yellowish gray (the yellow is carotene) and shortening its shelf life, because the oil, once exposed to the air, soon oxidized—turned rancid. That’s what people could see and smell, and they didn’t like it. What their senses couldn’t tell them, however, was that the germ contributed some of the most valuable nutrients to the flour, including much of its protein, folic acid, and other B vitamins; carotenes and other antioxidants; and omega-3 fatty acids, which are especially prone to rancidity.

The advent of rollers that made it possible to remove the germ and then grind the remaining endosperm (the big packet of starch and protein in a seed) exceptionally fine solved the problem of stability and color. Now just about everyone could afford snowy-white flour that could keep on a shelf for many months. No longer did every town need its own mill, because flour could now travel great distances. (Plus it could be ground year-round by large companies in big cities: Heavy stone mills, which typically relied on water power, operated mostly when and where rivers flowed; steam engines could drive the new rollers whenever and wherever.) Thus was one of the main staples of the Western diet cut loose from its moorings in place and time and marketed on the basis of image rather than nutritional value. In this, white flour was a modern industrial food, one of the first.

The problem was that this gorgeous white powder was nutritionally worthless, or nearly so. Much the same was now true for corn flour and white rice, the polishing of which (i.e., the removing of its most nutritious parts) was perfected around the same time. Wherever these refining technologies came into widespread use, devastating epidemics of pellagra and beriberi soon followed. Both are diseases caused by deficiencies in the B vitamins that the germ had contributed to the diet. But the sudden absence from bread of several other micronutrients, as well as omega-3 fatty acids, probably also took its toll on public health, particularly among the urban poor of Europe, many of whom ate little but bread.

In the 1930s, with the discovery of vitamins, scientists figured out what had happened, and millers began fortifying refined grain with B vitamins. This took care of the most obvious deficiency diseases. More recently, scientists recognized that many of us also had a deficiency of folic acid in our diet, and in 1996 public health authorities ordered millers to start adding folic acid to flour as well. But it would take longer still for science to realize that this “Wonder Bread” strategy of supplementation, as one nutritionist has called it, might not solve all the problems caused by the refining of grain. Deficiency diseases are much easier to trace and treat (indeed, medicine’s success in curing deficiency diseases is an important source of nutritionism’s prestige) than chronic diseases, and it turns out that the practice of refining carbohydrates is implicated in several of these chronic diseases as well—diabetes, heart disease, and certain cancers.

The story of refined grain stands as a parable about the limits of reductionist science when applied to something as complex as food. For years now nutritionists have known that a diet high in whole grains reduces one’s risk for diabetes, heart disease, and cancer. (This seems to be true even after you correct for the fact that the kind of people who eat lots of whole grains today probably have lifestyles healthier in other ways as well.) Different nutritionists have given the credit for the benefits of whole grain to different nutrients: the fiber in the bran, the folic acid and other B vitamins in the germ, or the antioxidants or the various minerals. In 2003 the American Journal of Clinical Nutrition* published an unusually nonreductionist study demonstrating that no one of those nutrients alone can explain the benefits of whole-grain foods: The typical reductive analysis of isolated nutrients could not explain the improved health of the whole-grain eaters.

For the study, University of Minnesota epidemiologists David R. Jacobs and Lyn M. Steffen reviewed the relevant research and found a large body of evidence that a diet rich in whole grains did in fact reduce mortality from all causes. But what was surprising was that even after adjusting for levels of dietary fiber, vitamin E, folic acid, phytic acid, iron, zinc, magnesium, and manganese in the diet (all the good things we know are in whole grains), they found an additional health benefit to eating whole grains that none of the nutrients alone or even together could explain. That is, subjects getting the same amounts of these nutrients from other sources were not as healthy as the whole-grain eaters. “This analysis suggests that something else in the whole grain protects against death,” they wrote. The authors concluded, somewhat vaguely but suggestively, that “the various grains and their parts act synergistically” and urged their colleagues to begin paying attention to the concept of “food synergy.” Here, then, is support for an idea revolutionary by the standards of nutritionism: A whole food might be more than the sum of its nutrient parts.

Suffice it to say, this proposition has not been enthusiastically embraced by the food industry, and probably won’t be any time soon. As I write, Coca-Cola is introducing vitamin-fortified sodas, extending the Wonder Bread strategy of supplementation to junk food in its purest form. (Wonder Soda?) The big money has always been in processing foods, not selling them whole, and the industry’s investment in the reductionist approach to food is probably safe. The fact is, there is something in us that loves a refined carbohydrate, and that something is the human brain. The human brain craves carbohydrates reduced to their energy essence, which is to say pure glucose. Once industry figured out how to transform the seeds of grasses into the chemical equivalent of sugar, there was probably no turning back.

And then of course there is sugar itself, the ultimate refined carbohydrate, which began flooding the marketplace and the human metabolism around the same time as refined flour. In 1874, England lifted its tariffs on imported sugar, the price dropped by half, and by the end of the nineteenth century fully a sixth of the calories in the English diet were coming from sugar, with much of the rest coming from refined flour.

With the general availability of cheap pure sugar, the human metabolism now had to contend not only with a constant flood of glucose, but also with more fructose than it had ever before encountered, because sugar—sucrose—is half fructose.* (Per capita fructose consumption has increased 25 percent in the past thirty years.) In the natural world, fructose is a rare and precious thing, typically encountered seasonally in ripe fruit, when it comes packaged in a whole food full of fiber (which slows its absorption) and valuable micronutrients. It’s no wonder we’ve been hardwired by natural selection to prize sweet foods: Sugar as it is ordinarily found in nature—in fruits and some vegetables—gives us a slow-release form of energy accompanied by minerals and all sorts of crucial micronutrients we can get nowhere else. (Even in honey, the purest form of sugar found in nature, you find some valuable micronutrients.)

One of the most momentous changes in the American diet since 1909 (when the USDA first began keeping track) has been the increase in the percentage of calories coming from sugars, from 13 percent to 20 percent. Add to that the percentage of calories coming from carbohydrates (roughly 40 percent, or ten servings, nine of which are refined), and Americans are consuming a diet that is at least half sugars in one form or another—calories providing virtually nothing but energy. The energy density of these refined carbohydrates contributes to obesity in two ways. First, we consume many more calories per unit of food; the fiber that’s been removed from these foods is precisely what would have made us feel full and stop eating. Second, the flash flood of glucose causes insulin levels to spike and then, once the cells have taken all that glucose out of circulation, drop precipitously, making us think we need to eat again.

While the widespread acceleration of the Western diet has given us the instant gratification of sugar, in many people—especially those newly exposed to it—the speediness of this food overwhelms the body’s insulin response, leading to type 2 diabetes and all the other chronic diseases associated with metabolic syndrome. As one nutrition expert put it to me, “We’re in the middle of a national experiment in the mainlining of glucose.” And don’t forget the flood of fructose, which may represent an even greater evolutionary novelty, and therefore challenge to the human metabolism, than all that glucose.

It is probably no accident that rates of type 2 diabetes are lower among ethnic Europeans, who have had longer than other groups to accustom their metabolisms to fast-release refined carbohydrates: Their food environment changed first.* To encounter such a diet for the first time, as when people accustomed to a more traditional diet come to America or when fast food comes to them, delivers a shock to the system. This shock is what public health experts mean by the nutrition transition, and it can be deadly.

So here, then, is the first momentous change in the Western diet that may help to explain why it makes some people so sick: Supplanting tested relationships to the whole foods with which we coevolved over many thousands of years, it asks our bodies now to relate to, and deal with, a very small handful of efficiently delivered nutrients that have been torn from their food context. Our ancient evolutionary relationship with the seeds of grasses and fruit of plants has given way, abruptly, to a rocky marriage with glucose and fructose.

2) From Complexity to Simplicity

At every level, from the soil to the plate, the industrialization of the food chain has involved a process of chemical and biological simplification. It starts with industrial fertilizers, which grossly simplify the biochemistry of the soil. In the wake of Liebig’s identification of the big three macronutrients that plants need to grow—nitrogen, phosphorus, and potassium (NPK)—and Fritz Haber’s invention of a method for synthesizing nitrogen fertilizer from fossil fuels, agricultural soils began receiving large doses of the big three but little else. Just like Liebig, whose focus on the macronutrients in the human diet failed to take account of the important role played by micronutrients such as vitamins, Haber completely overlooked the importance of biological activity in the soil: the contribution to plant health of the complex underground ecosystem of soil microbes, earthworms, and mycorrhizal fungi. Harsh chemical fertilizers (and pesticides) depress or destroy this biological activity, forcing crops to subsist largely on a simple ration of NPK. Plants can live on this fast-food diet of chemicals, but it leaves them more vulnerable to pests and diseases and appears to diminish their nutritional quality.

It stands to reason that a chemically simplified soil would produce chemically simplified plants. Since the widespread adoption of chemical fertilizers in the 1950s, the nutritional quality of produce in America has declined substantially, according to figures gathered by the USDA, which has tracked the nutrient content of various crops since then. Some researchers blame this decline on the condition of the soil; others cite the tendency of modern plant breeding, which has consistently selected for industrial characteristics such as yield rather than nutritional quality. (The next section will take up the trade-off between quality and quantity in industrial food.)

The trend toward simplification of our food continues up the chain. As we’ve seen, processing whole foods—refining, chemically preserving, and canning them—depletes them of many nutrients, a few of which are then added back: B vitamins in refined flour, vitamins and minerals in breakfast cereal and bread. Fortifying processed foods with missing nutrients is surely better than leaving them out, but food science can add back only the small handful of nutrients that food science recognizes as important today. What is it overlooking? As the whole-grain food synergy study suggests, science doesn’t know nearly enough to compensate for everything that processing does to whole foods. We know how to break down a kernel of corn or grain of wheat into its chemical parts, but we have no idea how to put it back together again. Destroying complexity is a lot easier than creating it.

Simplification of the food chain occurs at the level of species diversity too. The astounding variety of foods on offer in today’s supermarket obscures the fact that the actual number of species in the modern diet is shrinking. Thousands of plant and animal varieties have fallen out of commerce in the last century as industrial agriculture has focused its attentions on a small handful of high-yielding (and usually patented) varieties, with qualities that suited them to things like mechanical harvesting and processing. Half of all the broccoli grown commercially in America today is a single variety—Marathon—notable for its high yield. The overwhelming majority of the chickens raised for meat in America are the same hybrid, the Cornish cross; more than 99 percent of the turkeys are Broad-Breasted Whites.

With the rise of industrial agriculture, vast monocultures of a tiny group of plants, most of them cereal grains, have replaced the diversified farms that used to feed us. A century ago, the typical Iowa farm raised more than a dozen different plant and animal species: cattle, chickens, corn, hogs, apples, hay, oats, potatoes, cherries, wheat, plums, grapes, and pears. Now it raises only two: corn and soybeans. This simplification of the agricultural landscape leads directly to the simplification of the diet, which is now to a remarkable extent dominated by—big surprise—corn and soybeans. You may not think you eat a lot of corn and soybeans, but you do: 75 percent of the vegetable oils in your diet come from soy (representing 20 percent of your daily calories) and more than half of the sweeteners you consume come from corn (representing around 10 percent of daily calories).

Why corn and soy? Because these two plants are among nature’s most efficient transformers of sunlight and chemical fertilizer into carbohydrate energy (in the case of corn) and fat and protein (in the case of soy)—if you want to extract the maximum amount of macronutrients from the American farm belt, corn and soy are the crops to plant. (It helps that the government pays farmers to grow corn and soy, subsidizing every bushel they produce.) Most of the corn and soy crop winds up in the feed of our food animals (simplifying their diets in unhealthy ways, as we’ll see), but much of the rest goes into processed foods. The business model of the food industry is organized around “adding value” to cheap raw materials; its genius has been to figure out how to break these two big seeds down into their chemical building blocks and then reassemble them in myriad packaged food products. With the result that today corn contributes 554 calories a day to America’s per capita food supply and soy another 257. Add wheat (768 calories) and rice (91) and you can see there isn’t a whole lot of room left in the American stomach for any other foods.

Today these four crops account for two thirds of the calories we eat. When you consider that humankind has historically consumed some eighty thousand edible species, and that three thousand of these have been in widespread use, this represents a radical simplification of the human diet. Why should this concern us? Because humans are omnivores, requiring somewhere between fifty and a hundred different chemical compounds and elements in order to be healthy. It’s hard to believe we’re getting everything we need from a diet consisting largely of processed corn, soybeans, rice, and wheat.

3) From Quality to Quantity

While industrial agriculture has made tremendous strides in coaxing macronutrients—calories—from the land, it is becoming increasingly clear that these gains in food quantity have come at a cost to its quality. This probably shouldn’t surprise us: Our food system has long devoted its energies to increasing yields and selling food as cheaply as possible. It would be too much to hope those goals could be achieved without sacrificing at least some of the nutritional quality of our food.

As mentioned earlier, USDA figures show a decline in the nutrient content of the forty-three crops it has tracked since the 1950s. In one recent analysis, vitamin C declined by 20 percent, iron by 15 percent, riboflavin by 38 percent, calcium by 16 percent. Government figures from England tell a similar story: declines since the fifties of 10 percent or more in levels of iron, zinc, calcium, and selenium across a range of food crops. To put this in more concrete terms, you now have to eat three apples to get the same amount of iron as you would have gotten from a single 1940 apple, and you’d have to eat several more slices of bread to get your recommended daily allowance of zinc than you would have a century ago.

These examples come from a 2007 report entitled “Still No Free Lunch” written by Brian Halweil, a researcher at the Worldwatch Institute, and published by the Organic Center, a research institute established by the organic food industry. “American agriculture’s single-minded focus on increasing yields created a blind spot,” Halweil writes, “where incremental erosion in the nutritional quality of our food…has largely escaped the notice of scientists, government, and consumers.” The result is the nutritional equivalent of inflation, such that we have to eat more to get the same amount of various essential nutrients. The fact that at least 30 percent of Americans have a diet deficient in vitamin C, vitamin E, vitamin A, and magnesium surely owes more to eating processed foods full of empty calories than it does to lower levels of nutrients in the whole foods we aren’t eating. Still, it doesn’t help that the raw materials used in the manufacture of processed foods have declined in nutritional quality or that when we are eating whole foods, we’re getting substantially less nutrition per calorie than we used to.*

Nutritional inflation seems to have two principal causes: changes in the way we grow food and changes in the kinds of foods we grow. Halweil cites a considerable body of research demonstrating that plants grown with industrial fertilizers are often nutritionally inferior to the same varieties grown in organic soils. Why this should be so is uncertain, but there are a couple of hypotheses. Crops grown with chemical fertilizers grow more quickly, giving them less time and opportunity to accumulate nutrients other than the big three (nutrients in which industrial soils are apt to be deficient anyway). Also, easy access to the major nutrients means that industrial crops develop smaller and shallower root systems than organically grown plants; deeply rooted plants have access to more soil minerals. Biological activity in the soil almost certainly plays a role as well; the slow decomposition of organic matter releases a wide range of plant nutrients, possibly including compounds science hasn’t yet identified as important. Also, a biologically active soil will have more mycorrhizae, the soil fungi that live in symbiosis with plant roots, supplying the plants with minerals in exchange for a ration of sugar.

In addition to these higher levels of minerals, organically grown crops have also been found to contain more phytochemicals—the various secondary compounds (including carotenoids and polyphenols) that plants produce in order to defend themselves from pests and diseases, many of which turn out to have important antioxidant, anti-inflammatory, and other beneficial effects in humans. Because plants living on organic farms aren’t sprayed with synthetic pesticides, they’re forced to defend themselves, with the result that they tend to produce between 10 percent and 50 percent more of these valuable secondary compounds than conventionally grown plants.

Some combination of these environmental factors probably accounts for at least part of the decline in the nutritional quality of conventional crops, but genetics likely plays just as important a role. Very simply, we have been breeding crops for yield, not nutritional quality, and when you breed for one thing, you invariably sacrifice another. Halweil cites several studies demonstrating that when older crop varieties are grown side by side with modern cultivars, the older ones typically have lower yields but substantially higher nutrient levels. USDA researchers recently found that breeding to “improve” wheat varieties over the past 130 years (a period during which yields of grain per acre tripled) had reduced levels of iron by 28 percent and zinc and selenium by roughly a third. Similarly, milk from modern Holstein cows (in which breeders have managed to more than triple daily yield since 1950) has considerably less butterfat and other nutrients than that from older, less “improved” varieties like Jersey, Guernsey, and Brown Swiss.

Clearly the achievements of industrial agriculture have come at a cost: It can produce a great many more calories per acre, but each of those calories may supply less nutrition than it formerly did. And what has happened on the farm has happened in the food system as a whole as industry has pursued the same general strategy of promoting quantity at the expense of quality. You don’t need to spend much time in an American supermarket to figure out that this is a food system organized around the objective of selling large quantities of calories as cheaply as possible.

Indeed, doing so has been official U.S. government policy since the mid-seventies, when a sharp spike in food prices brought protesting housewives into the street and prompted the Nixon administration to adopt an ambitious cheap food policy. Agricultural policies were rewritten to encourage farmers to plant crops like corn, soy, and wheat fencerow to fencerow, and it worked: Since 1980, American farmers have produced an average of 600 more calories per person per day, the price of food has fallen, portion sizes have ballooned, and, predictably, we’re eating a whole lot more, at least 300 more calories a day than we consumed in 1985. What kind of calories? Nearly a quarter of these additional calories come from added sugars (and most of that in the form of high-fructose corn syrup); roughly another quarter from added fat (most of it in the form of soybean oil); 46 percent of them from grains (mostly refined); and the few calories left (8 percent) from fruits and vegetables.* The overwhelming majority of the calories Americans have added to their diets since 1985—the 93 percent of them in the form of sugars, fats, and mostly refined grains—supply lots of energy but very little of anything else.

A diet based on quantity rather than quality has ushered a new creature onto the world stage: the human being who manages to be both overfed and undernourished, two characteristics seldom found in the same body in the long natural history of our species. In most traditional diets, when calories are adequate, nutrient intake will usually be adequate as well. Indeed, many traditional diets are nutrient rich and, at least compared to ours, calorie poor. The Western diet has turned that relationship upside down. At a health clinic in Oakland, California, doctors report seeing overweight children suffering from old-time deficiency diseases such as rickets, long thought to have been consigned to history’s dustheap in the developed world. But when children subsist on fast food rather than fresh fruits and vegetables and drink more soda than milk, the old deficiency diseases return—now even in the obese.

Bruce Ames, the renowned Berkeley biochemist, works with kids like this at Children’s Hospital and Research Center in Oakland. He’s convinced that our high-calorie, low-nutrient diet is responsible for many chronic diseases, including cancer. Ames has found that even subtle micronutrient deficiencies—far below the levels needed to produce acute deficiency diseases—can cause damage to DNA that may lead to cancer. Studying cultured human cells, he’s found that “deficiency of vitamins C, E, B12, B6, niacin, folic acid, iron or zinc appears to mimic radiation by causing single- and double-strand DNA breaks, oxidative lesions, or both”—precursors to cancer. “This has serious implications, as half of the U.S. population may be deficient in at least one of these micronutrients.” Most of the missing micronutrients are supplied by fruits and vegetables, of which only 20 percent of American children and 32 percent of adults eat the recommended five daily servings. The cellular mechanisms Ames has identified could explain why diets rich in vegetables and fruits seem to offer some protection against certain cancers.

Ames also believes, though he hasn’t yet proven it, that micronutrient deficiencies may contribute to obesity. His hypothesis is that a body starved of critical nutrients will keep eating in the hope of obtaining them. The absence of these nutrients from the diet may “counteract the normal feeling of satiety after sufficient calories are eaten,” and such unrelenting hunger “may be a biological strategy for obtaining missing nutrients.” If Ames is right, then a food system organized around quantity rather than quality has a destructive feedback loop built into it, such that the more low-quality food one eats, the more one wants to eat, in a futile—but highly profitable—quest for the absent nutrient.

4) From Leaves to Seeds

It’s no accident that the small handful of plants we’ve come to rely on are grains (soy is a legume); these crops are exceptionally efficient at transforming sunlight, fertilizer, air, and water into macronutrients—carbohydrates, fats, and proteins. These macronutrients in turn can be profitably converted into meat, dairy, and processed foods of every description. Also, the fact that they come in the form of durable seeds which can be stored for long periods of time means they can function as commodities as well as foods, making these crops particularly well adapted to the needs of industrial capitalism.

The needs of the human eater are a very different matter, however. An oversupply of macronutrients, such as we now face, itself represents a serious threat to our health, as soaring rates of obesity and diabetes indicate. But, as the research of Bruce Ames and others suggests, the undersupply of micronutrients may constitute a threat just as grave. Put in the most basic terms, we’re eating a lot more seeds and a lot fewer leaves (as do the animals we depend on), a tectonic dietary shift the full implications of which we are just now beginning to recognize. To borrow, again, the nutritionist’s reductive vocabulary: Leaves provide a host of critical nutrients a body can’t get from a diet of refined seeds. There are the antioxidants and phytochemicals; there is the fiber; and then there are the essential omega-3 fatty acids found in leaves, which some researchers believe will turn out to be the most crucial missing nutrient of all.

Most people associate omega-3 fatty acids with fish, but fish get them originally from green plants (specifically algae), which is where they all originate.* Plant leaves produce these essential fatty acids (we say they’re essential because our bodies can’t produce them on their own) as part of photosynthesis; they occupy the cell membranes of chloroplasts, helping them collect light. Seeds contain more of another kind of essential fatty acid, omega-6, which serves as a store of energy for the developing seedling. These two types of polyunsaturated fats perform very different functions in the plant as well as the plant eater. In describing their respective roles, I’m going to simplify the chemistry somewhat. For a more complete (and fascinating) account of the biochemistry of these fats and the story of their discovery, read Susan Allport’s The Queen of Fats.

Omega-3s appear to play an important role in neurological development and processing (the highest concentrations of omega-3s in humans are found in the tissues of the brain and the eyes), visual acuity (befitting their role in photosynthesis), the permeability of cell membranes, the metabolism of glucose, and the calming of inflammation. Omega-6s are involved in fat storage (which is what they do for the plant), the rigidity of cell membranes, clotting, and the inflammation response. It helps to think of omega-3s as fleet and flexible, omega-6s as sturdy and slow. Because the two fatty acids compete with each other for space in cell membranes and for the attention of various enzymes, the ratio between omega-3s and omega-6s, in the diet and in turn in our tissues, may matter more than the absolute quantity of either fat. So, too much omega-6 may be just as much a problem as too little omega-3.

And that might well be a problem for people eating a Western diet. As the basis of our diet has shifted from leaves to seeds, the ratio of omega-6s to omega-3s in our bodies has changed too. The same is true for most of our food animals, which industrial agriculture has taken off their accustomed diet of green plants and put on a richer diet of seeds. The result has been a marked decline in the amount of omega-3s in modern meat, dairy products, and eggs, and an increase in the amount of omega-6s. At the same time, modern food production practices have further diminished the omega-3s in our diet. Omega-3s, being less stable than omega-6s, spoil more readily, so the food industry, focused on shelf life, was strongly disposed against omega-3s long before we even knew what they were. (Omega-3s weren’t recognized as essential to the human diet until the 1980s—some time after nutritionism’s blanket hostility to fat had already taken hold.) For years plant breeders have been unwittingly selecting for plants that produce fewer omega-3s, because such crops don’t spoil as quickly. (Wild greens like purslane have substantially higher levels of omega-3s than most domesticated plants.) Also, when food makers partially hydrogenate oils to render them more stable, it is the omega-3s that are eliminated. An executive from Frito-Lay told Susan Allport in no uncertain terms that because of their tendency to oxidize, omega-3s “cannot be used in processed foods.”

Most of the official nutritional advice we’ve been getting since the 1970s has, again unwittingly, helped to push omega-3s out of the diet and to elevate levels of omega-6. Besides demonizing fats in general, that advice has encouraged us to move from saturated fats of animal origin (some of which, like butter, actually contain respectable amounts of omega-3s) to seed oils, most of which are much higher in omega-6s (corn oil especially), and even more so after partial hydrogenation. The move from butter (and especially butter from pastured cows) to margarine, besides introducing trans fats to the diet, markedly increased omega-6s at the cost of omega-3s.

Thus without even realizing what we were doing, we dramatically altered the ratio of these two essential fats in our diet and our bodies, with the result that the ratio of omega-6 to omega-3 in the typical American today stands at more than 10 to 1. Before the widespread introduction of seed oils at the turn of the last century, the ratio was closer to 3 to 1.

The precise role of these lipids in human health is still not completely understood, but some researchers are convinced that these historically low levels of omega-3 (or, conversely, historically high levels of omega-6) bear responsibility for many of the chronic diseases associated with the Western diet, including heart disease and diabetes. Population studies suggest that omega-3 levels in the diet are strongly correlated with rates of heart disease, stroke, and mortality from all causes.* For example, the Japanese, who consume large amounts of omega-3s (most of it in fish), have markedly low rates of cardiovascular disease in spite of their high rates of smoking and high blood pressure. Americans consume only a third as much omega-3s as the Japanese and have nearly four times the rate of death from heart disease. But there is more than epidemiology to link omega-3 levels and heart disease: Clinical studies have found that increasing the omega-3s in one’s diet may reduce the chances of heart attack by a third.

What biological mechanism could explain these findings? A couple of theories have emerged. Omega-3s are present in high concentrations in heart tissue where they seem to play a role in regulating heart rhythm and preventing fatal arrhythmias. Omega-3s also dampen the inflammation response, which omega-6s tend to excite. Inflammation is now believed to play an important role in cardiovascular disease as well as in a range of other disorders, including rheumatoid arthritis and Alzheimer’s. Omega-6s supply the building blocks for a class of pro-inflammatory messenger chemicals involved in the body’s rapid-response reaction to a range of problems. One of these compounds is thromboxane, which encourages blood platelets to aggregate into clots. In contrast, omega-3s slow the clotting response, which is probably why populations with particularly high levels of omega-3s, such as the Inuit, are prone to bleeding. (If there is a danger to consuming too much omega-3, bleeding is probably it.)

The hypothesis that omega-3 might protect against heart disease was inspired by studies of Greenland Eskimos, in whom omega-3 consumption is high and heart disease rare. Eskimos eating their traditional marine-based diet also don’t seem to get diabetes, and some researchers believe it is the omega-3s that protect them. Adding omega-3s to the diet of rats has been shown to protect them against insulin resistance. (The same effect has not been duplicated in humans, however.) The theory is that omega-3s increase the permeability of the cell’s membranes and its rate of metabolism. (Hummingbirds have tons of omega-3s in their cell membranes; big mammals much less.) A cell with a rapid metabolism and permeable membrane should respond particularly well to insulin, absorbing more glucose from the blood to meet its higher energy requirements. That same mechanism suggests that diets high in omega-3s might protect against obesity as well.

So why is it, as Susan Allport writes, that “populations, when given the choice, will naturally drift toward foods with lesser amounts of omega-3s”? Because a faster metabolism increases the need for food and therefore the possibility of hunger, she suggests, which is a much less agreeable condition than being overweight. This might help explain why so many groups have adopted Western diets as soon as they get the chance.

It should be said that researchers working on omega-3s can sound a bit like Dr. Casaubon in Middlemarch, hard at work on his “Key to All Mythologies.” Like him, they seem to be in possession of a Theory of Everything, including happiness. The same population studies that have correlated omega-3 deficiency to cardiovascular disease have also found strong correlations between falling levels of omega-3 in the diet and rising rates of depression, suicide, and even homicide. Some researchers implicate omega-3 deficiency in learning disabilities such as attention deficit disorder as well. That omega-3s play an important role in mental function has been recognized since the 1980s, when it was found that babies fed on infant formula supplemented with omega-3s scored significantly higher on tests of both mental development and visual acuity than babies receiving formula supplemented only with omega-6.

Could it be that the problem with the Western diet is a gross deficiency in this essential nutrient? A growing number of researchers have concluded that it is, and they voice frustration that official nutritional advice has been slow to recognize the problem. To do so, of course, would mean conceding the error of past nutritional advice demonizing fats in general and promoting the switch to seed oils high in omega-6. But it seems likely that sooner or later the government will establish minimum daily requirements for omega-3 (several other governments already have) and, perhaps in time, doctors will routinely test us for omega-3 levels the way they already do for cholesterol.

Though maybe they should be testing for omega-6 levels as well, because it’s possible that is the greater problem. Omega-6s exist in a kind of zero-sum relationship with omega-3s, counteracting most of the positive effects of omega-3 throughout the body. Merely adding omega-3s to the diet—by taking supplements, say—may not do much good unless we also reduce the high levels of omega-6s that have entered the Western diet with the advent of processed foods, seed oils, and foods from animals raised on grain. Nine percent of the calories in the American diet today come from a single omega-6 fatty acid: linoleic acid, most of it from soybean oil. Some nutrition experts think that this is fine: Omega-6s, after all, are essential fatty acids too, and their rise to dietary prominence has pushed out saturated fats, usually thought to be a positive development. But others strongly disagree, contending that the unprecedented proportion of omega-6s in the Western diet is contributing to the full range of disorders involving inflammation. Joseph Hibbeln, the researcher at the National Institutes of Health who conducted population studies correlating omega-3 consumption with everything from stroke to suicide, says that the billions we spend on anti-inflammatory drugs such as aspirin, ibuprofen, and acetaminophen are money spent to undo the effects of too much omega-6 in the diet. He writes, “The increases in world [omega-6] consumption over the past century may be considered a very large uncontrolled experiment that may have contributed to increased societal burdens of aggression, depression, and cardiovascular mortality.”*

 

Of all the changes to our food system that go under the heading “The Western Diet,” the shift from a food chain with green plants at its base to one based on seeds may be the most far reaching of all. Nutritional scientists focus on different nutrients—whether the problem with modern diets is too many refined carbohydrates, not enough good fats, too many bad fats, a deficiency of any number of micronutrients, or too many total calories. But at the root of all these biochemical changes is a single ecological change. For the shift from leaves to seeds affects much more than the levels of omega-3 and omega-6 in the body. It also helps account for the flood of refined carbohydrates in the modern diet and the drought of so many micronutrients and the surfeit of total calories. From leaves to seeds: It’s almost, if not quite, a Theory of Everything.

5) From Food Culture to Food Science

The last important change wrought by the Western diet is not, strictly speaking, ecological, at least not in any narrow sense of the word. But the industrialization of our food that we call the Western diet is systematically and deliberately undermining traditional food cultures everywhere. This may be as destructive of our health as any nutritional deficiency.

Before the modern food era—and before the rise of nutritionism—people relied for guidance about what to eat on their national or ethnic or regional cultures. We think of culture as a set of beliefs and practices to help mediate our relationship to other people, but of course culture—at least before the rise of modern science—has also played a critical role in helping to mediate people’s relationship to nature. Eating being one of the most important manifestations of that relationship, cultures have had a great deal to say about what and how and why and when and how much we should eat. Of course when it comes to food, culture is another word for mom, the figure who typically passes on the food ways of the group—food ways that endured, by the way, only because they tended to keep people healthy.

The sheer novelty and glamour of the Western diet, with its seventeen thousand new food products every year and the marketing power—thirty-two billion dollars a year—used to sell us those products, has overwhelmed the force of tradition and left us where we now find ourselves: relying on science and journalism and government and marketing to help us decide what to eat. Nutritionism, which arose to help us better deal with the problems of the Western diet, has largely been co-opted by it: used by the industry to sell more nutritionally “enhanced” processed food and to undermine further the authority of traditional food cultures that stand in the way of fast food. Industry greatly amplifies the claims of nutritional science through its advertising and, through its sponsorship of self-serving nutritional research, corrupts it.* The predictable result is the general cacophony of nutritional information ringing in our ears and the widespread confusion that has come to surround this most fundamental of creaturely activities: finding something good to eat.

You would not have bought this book and read this far into it if your food culture were intact and healthy. And while it is true that most of us unthinkingly place the authority of science above culture in all matters having to do with our health, that prejudice should at least be examined. The question we need to ask is, Are we better off with these new authorities telling us how to eat than we were with the traditional authorities they supplanted? The answer by now should be clear.

It might be argued that at this point we should simply accept that fast food is our food culture and get on with it. Over time, people will get used to eating this way, and our health will improve as we gradually adjust to the new food environment. Also, as nutritional science improves, we should be able to ameliorate the worst effects of this diet. Already food scientists are figuring out ways to microencapsulate omega-3s and bake them into our vitamin-fortified bread. But I’m not sure we should put our faith in food science, which so far has not served us very well, or in evolution, either.

There are a couple of problems with trying simply to get used to the Western diet. You could argue that, compared to the Aborigines, say, or Inuit, we are getting used to it—most of us don’t get quite as fat or diabetic as they do. But our “adjustment” looks much less plausible when you consider that, as mentioned, fully a quarter of all Americans suffer from metabolic syndrome, two thirds of us are overweight or obese, and diet-related diseases are already killing the majority of us. The concept of a changing food environment is not just a metaphor; nor is the idea of adapting to it. In order for natural selection to help us adapt to the Western diet, we’d have to be prepared to let those whom it sickens die. Also, many of the chronic diseases caused by the Western diet come late in life, after the childbearing years, a period of our lives in which natural selection takes no interest. Thus genes predisposing people to these conditions get passed on rather than weeded out.

So we turn for salvation to the health care industry. Medicine is learning how to keep alive the people whom the Western diet is making sick. Doctors have gotten really good at keeping people with heart disease alive, and now they’re hard at work on obesity and diabetes. Much more so than the human body, capitalism is marvelously adaptive, able to turn the problems it creates into new business opportunities: diet pills, heart bypass operations, insulin pumps, bariatric surgery. But though fast food may be good business for the health care industry, the cost to society—an estimated $250 billion a year in diet-related health care costs and rising rapidly—cannot be sustained indefinitely. An American born in 2000 has a 1 in 3 chance of developing diabetes in his lifetime; the risk is even greater for a Hispanic American or African American. A diagnosis of diabetes subtracts roughly twelve years from one’s life, and living with the condition incurs medical costs of $13,000 a year (compared with $2,500 for someone without diabetes).

This is a global pandemic in the making, but a most unusual one, because it involves no virus or bacteria, no microbe of any kind—just a way of eating. It remains to be seen whether we’ll respond by changing our diet or our culture and economy. Although an estimated 80 percent of cases of type 2 diabetes could be prevented by a change of diet and exercise, it looks like the smart money is instead on the creation of a vast new diabetes industry. The mainstream media is full of advertisements for new gadgets and drugs for diabetics, and the health care industry is gearing up to meet the surging demand for heart bypass operations (80 percent of diabetics will suffer from heart disease), dialysis, and kidney transplantation. At the supermarket checkout you can thumb copies of a new lifestyle magazine, Diabetic Living. Diabetes is well on its way to becoming normalized in the West—recognized as a whole new demographic and so a major marketing opportunity. Apparently it is easier, or at least a lot more profitable, to change a disease of civilization into a lifestyle than it is to change the way that civilization eats.
