Everything I Want to Do Is Illegal: War Stories from the Local Food Front
by Joel Salatin
Polyface, 338 pp., $23.95 (paper)
All You Can Eat: How Hungry Is America?
by Joel Berg
Seven Stories, 351 pp., $22.95 (paper)
Eating Animals
by Jonathan Safran Foer
Little, Brown, 341 pp., $25.99
Terra Madre: Forging a New Global Network of Sustainable Food Communities
by Carlo Petrini, with a foreword by Alice Waters
Chelsea Green, 155 pp., $20.00 (paper)
The Taste for Civilization: Food, Politics, and Civil Society
by Janet A. Flammang
University of Illinois Press, 325 pp., $70.00; $25.00 (paper)
It might sound odd to say this about something people deal with at least three times a day, but food in America has been more or less invisible, politically speaking, until very recently. Apart from a moment in the early 1970s, when a bout of food price inflation and the appearance of books critical of industrial agriculture (by Wendell Berry, Frances Moore Lappé, and Barry Commoner, among others) threatened to propel the subject to the top of the national agenda, Americans have not had to think very hard about where their food comes from, or what it is doing to the planet, their bodies, and their society.
Most people count this a blessing. Americans spend a smaller percentage of their income on food than any people in history—slightly less than 10 percent—and a smaller amount of their time preparing it: a mere thirty-one minutes a day on average, including clean-up. The supermarkets brim with produce summoned from every corner of the globe, a steady stream of novel food products (17,000 new ones each year) crowds the middle aisles, and in the freezer case you can find “home meal replacements” in every conceivable ethnic stripe, demanding nothing more of the eater than opening the package and waiting for the microwave to chirp. Considered in the long sweep of human history, in which getting food dominated not just daily life but economic and political life as well, having to worry about food as little as we do, or did, seems almost a kind of dream.
The dream that the age-old “food problem” had been largely solved for most Americans was sustained by the tremendous postwar increases in the productivity of American farmers, made possible by cheap fossil fuel (the key ingredient in both chemical fertilizers and pesticides) and changes in agricultural policies. Asked by President Nixon to try to drive down the cost of food after it had spiked in the early 1970s, Agriculture Secretary Earl Butz shifted the historical focus of federal farm policy from supporting prices for farmers to boosting yields of a small handful of commodity crops (corn and soy especially) at any cost.
The administration’s cheap food policy worked almost too well: crop prices fell, forcing farmers to produce still more simply to break even. This led to a deep depression in the farm belt in the 1980s followed by a brutal wave of consolidation. Most importantly, the price of food came down, or at least the price of the kinds of foods that could be made from corn and soy: processed foods and sweetened beverages and feedlot meat. (Prices for fresh produce have increased since the 1980s.) Washington had succeeded in eliminating food as a political issue—an objective dear to most governments at least since the time of the French Revolution.
But although cheap food is good politics, it turns out there are significant costs—to the environment, to public health, to the public purse, even to the culture—and as these became impossible to ignore in recent years, food has come back into view. Beginning in the late 1980s, a series of food safety scandals opened people’s eyes to the way their food was being produced, each one drawing the curtain back a little further on a food system that had changed beyond recognition. When BSE, or mad cow disease, surfaced in England in 1986, Americans learned that cattle, which are herbivores, were routinely being fed the flesh of other cattle; the practice helped keep meat cheap but at the risk of a hideous brain-wasting disease.
The 1993 deaths of four children in Washington State who had eaten hamburgers from Jack in the Box were traced to meat contaminated with E. coli O157:H7, a mutant strain of the common intestinal bacterium first identified in feedlot cattle in 1982. Since then, repeated outbreaks of food-borne illness linked to new antibiotic-resistant strains of bacteria (campylobacter, salmonella, MRSA) have turned a bright light on the shortsighted practice of routinely administering antibiotics to food animals, not to treat disease but simply to speed their growth and allow them to withstand the filthy and stressful conditions in which they live.
In the wake of these food safety scandals, the conversation about food politics that briefly flourished in the 1970s was picked up again in a series of books, articles, and movies about the consequences of industrial food production. Beginning in 2001 with the publication of Eric Schlosser’s Fast Food Nation, a surprise best-seller, and, the following year, Marion Nestle’s Food Politics, the food journalism of the last decade has succeeded in making clear and telling connections between the methods of industrial food production, agricultural policy, food-borne illness, childhood obesity, the decline of the family meal as an institution, and, notably, the decline of family income beginning in the 1970s.
Besides drawing women into the work force, falling wages made fast food both cheap to produce and a welcome, if not indispensable, option for pinched and harried families. The picture of the food economy Schlosser painted resembles an upside-down version of the social compact sometimes referred to as “Fordism”: instead of paying workers well enough to allow them to buy things like cars, as Henry Ford proposed to do, companies like Wal-Mart and McDonald’s pay their workers so poorly that they can afford only the cheap, low-quality food these companies sell, creating a kind of nonvirtuous circle driving down both wages and the quality of food. The advent of fast food (and cheap food in general) has, in effect, subsidized the decline of family incomes in America.
Cheap food has become an indispensable pillar of the modern economy. But it is no longer an invisible or uncontested one. One of the most interesting social movements to emerge in the last few years is the “food movement,” or perhaps I should say “movements,” since it is unified as yet by little more than the recognition that industrial food production is in need of reform because its social/environmental/public health/animal welfare/gastronomic costs are too high.
As that list suggests, the critics are coming at the issue from a great many different directions. Where many social movements tend to splinter as time goes on, breaking into various factions representing divergent concerns or tactics, the food movement starts out splintered. Among the many threads of advocacy that can be lumped together under that rubric we can include school lunch reform; the campaign for animal rights and welfare; the campaign against genetically modified crops; the rise of organic and locally produced food; efforts to combat obesity and type 2 diabetes; “food sovereignty” (the principle that nations should be allowed to decide their agricultural policies rather than submit to free trade regimes); farm bill reform; food safety regulation; farmland preservation; student organizing around food issues on campus; efforts to promote urban agriculture and ensure that communities have access to healthy food; initiatives to create gardens and cooking classes in schools; farm worker rights; nutrition labeling; feedlot pollution; and the various efforts to regulate food ingredients and marketing, especially to kids.
It’s a big, lumpy tent, and sometimes the various factions beneath it work at cross-purposes. For example, activists working to strengthen federal food safety regulations have recently run afoul of local food advocates, who fear that the burden of new regulation will cripple the current revival of small-farm agriculture. Joel Salatin, the Virginia meat producer and writer who has become a hero to the food movement, fulminates against food safety regulation on libertarian grounds in his Everything I Want to Do Is Illegal: War Stories From the Local Food Front. Hunger activists like Joel Berg, in All You Can Eat: How Hungry Is America?, criticize supporters of “sustainable” agriculture—i.e., producing food in ways that do not harm the environment—for advocating reforms that threaten to raise the cost of food to the poor. Animal rights advocates occasionally pick fights with sustainable meat producers (such as Joel Salatin), as Jonathan Safran Foer does in his recent vegetarian polemic, Eating Animals.
But there are indications that these various voices may be coming together in something that looks more and more like a coherent movement. Many in the animal welfare movement, from PETA to Peter Singer, have come to see that a smaller-scale, more humane animal agriculture is a goal worth fighting for, and surely more attainable than the abolition of meat eating. Stung by charges of elitism, activists for sustainable farming are starting to take seriously the problem of hunger and poverty. They’re promoting schemes and policies to make fresh local food more accessible to the poor, through programs that give vouchers redeemable at farmers’ markets to participants in the Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) and food stamp recipients. Yet a few underlying tensions remain: the “hunger lobby” has traditionally supported farm subsidies in exchange for the farm lobby’s support of nutrition programs, a marriage of convenience dating to the 1960s that vastly complicates reform of the farm bill—a top priority for the food movement.
The sociologist Troy Duster reminds us of an all-important axiom about social movements: “No movement is as coherent and integrated as it seems from afar,” he says, “and no movement is as incoherent and fractured as it seems from up close.” Viewed from a middle distance, then, the food movement coalesces around the recognition that today’s food and farming economy is “unsustainable”—that it can’t go on in its current form much longer without courting a breakdown of some kind, whether environmental, economic, or both.
For some in the movement, the more urgent problem is environmental: the food system consumes more fossil fuel energy than we can count on in the future (about a fifth of the total American use of such energy) and emits more greenhouse gas than we can afford to emit, particularly since agriculture is the one human system that should be able to substantially rely on photosynthesis: solar energy. It will be difficult if not impossible to address the issue of climate change without reforming the food system. This is a conclusion that has only recently been embraced by the environmental movement, which historically has disdained all agriculture as a lapse from wilderness and a source of pollution.1 But in the last few years, several of the major environmental groups have come to appreciate that a diversified, sustainable agriculture—which can sequester large amounts of carbon in the soil—holds the potential not just to mitigate but actually to help solve environmental problems, including climate change. Today, environmental organizations like the Natural Resources Defense Council and the Environmental Working Group are taking up the cause of food system reform, lending their expertise and clout to the movement.
But perhaps the food movement’s strongest claim on public attention today is the fact that the American diet of highly processed food laced with added fats and sugars is responsible for the epidemic of chronic diseases that threatens to bankrupt the health care system. The Centers for Disease Control estimates that fully three quarters of US health care spending goes to treat chronic diseases, most of which are preventable and linked to diet: heart disease, stroke, type 2 diabetes, and at least a third of all cancers. The health care crisis probably cannot be addressed without addressing the catastrophe of the American diet, and that diet is the direct (even if unintended) result of the way that our agriculture and food industries have been organized.
Michelle Obama’s recent foray into food politics, beginning with the organic garden she planted on the White House lawn last spring, suggests that the administration has made these connections. Her new “Let’s Move” campaign to combat childhood obesity might at first blush seem fairly anodyne, but in announcing the initiative in February, and in a surprisingly tough speech to the Grocery Manufacturers Association in March,2 the First Lady has effectively shifted the conversation about diet from the industry’s preferred ground of “personal responsibility” and exercise to a frank discussion of the way food is produced and marketed. “We need you not just to tweak around the edges,” she told the assembled food makers, “but to entirely rethink the products that you’re offering, the information that you provide about these products, and how you market those products to our children.”
Mrs. Obama explicitly rejected the conventional argument that the food industry is merely giving people the sugary, fatty, and salty foods they want, contending that the industry “doesn’t just respond to people’s natural inclinations—it also actually helps to shape them,” through the ways it creates products and markets them.
So far at least, Michelle Obama is the food movement’s most important ally in the administration, but there are signs of interest elsewhere. Under Commissioner Margaret Hamburg, the FDA has cracked down on deceptive food marketing and is said to be weighing a ban on the nontherapeutic use of antibiotics in factory farming. Attorney General Eric Holder recently avowed the Justice Department’s intention to pursue antitrust enforcement in agribusiness, one of the most highly concentrated sectors in the economy.3 At his side was Agriculture Secretary Tom Vilsack, the former governor of Iowa, who has planted his own organic vegetable garden at the department and launched a new “Know Your Farmer, Know Your Food” initiative aimed at promoting local food systems as a way to both rebuild rural economies and improve access to healthy food.
Though Vilsack has so far left mostly undisturbed his department’s traditional deference to industrial agriculture, the new tone in Washington and the appointment of a handful of respected reformers (such as Tufts professor Kathleen Merrigan as deputy secretary of agriculture) has elicited a somewhat defensive, if not panicky, reaction from agribusiness. The Farm Bureau recently urged its members to go on the offensive against “food activists,” and a trade association representing pesticide makers called CropLife America wrote to Michelle Obama suggesting that her organic garden had unfairly maligned chemical agriculture and encouraging her to use “crop protection technologies”—i.e., pesticides.
The First Lady’s response is not known; however, the President subsequently rewarded CropLife by appointing one of its executives to a high-level trade post. This and other industry-friendly appointments suggest that while the administration may be sympathetic to elements of the food movement’s agenda, it isn’t about to take on agribusiness, at least not directly, at least until it senses at its back a much larger constituency for reform.
One way to interpret Michelle Obama’s deepening involvement in food issues is as an effort to build such a constituency, and in this she may well succeed. It’s a mistake to underestimate what a determined First Lady can accomplish. Lady Bird Johnson’s “highway beautification” campaign also seemed benign, but in the end it helped raise public consciousness about “the environment” (as it would soon come to be known) and put an end to the public’s tolerance for littering. And while Michelle Obama has explicitly limited her efforts to exhortation (“we can’t solve this problem by passing a bunch of laws in Washington,” she told the Grocery Manufacturers, no doubt much to their relief), her work is already creating a climate in which just such a “bunch of laws” might flourish: a handful of state legislatures, including California’s, are seriously considering levying new taxes on sugar in soft drinks, proposals considered hopelessly extreme less than a year ago.
The political ground is shifting, and the passage of health care reform may accelerate that movement. The bill itself contains a few provisions long promoted by the food movement (like calorie labeling on fast food menus), but more important could be the new political tendencies it sets in motion. If health insurers can no longer keep people with chronic diseases out of their patient pools, it stands to reason that the companies will develop a keener interest in preventing those diseases. They will then discover that they have a large stake in things like soda taxes and in precisely which kinds of calories the farm bill is subsidizing. As the insurance industry and the government take on more responsibility for the cost of treating expensive and largely preventable problems like obesity and type 2 diabetes, pressure for reform of the food system, and the American diet, can be expected to increase.
It would be a mistake to conclude that the food movement’s agenda can be reduced to a set of laws, policies, and regulations, important as these may be. What is attracting so many people to the movement today (and young people in particular) is a much less conventional kind of politics, one that is about something more than food. The food movement is also about community, identity, pleasure, and, most notably, about carving out a new social and economic space removed from the influence of big corporations on the one side and government on the other. As the Diggers used to say during their San Francisco be-ins during the 1960s, food can serve as “an edible dynamic”—a means to a political end that is only nominally about food itself.
One can get a taste of this social space simply by hanging around a farmers’ market, an activity that a great many people enjoy today regardless of whether they’re in the market for a bunch of carrots or a head of lettuce. Farmers’ markets are thriving, more than five thousand strong, and there is a lot more going on in them than the exchange of money for food. Someone is collecting signatures on a petition. Someone else is playing music. Children are everywhere, sampling fresh produce, talking to farmers. Friends and acquaintances stop to chat. One sociologist calculated that people have ten times as many conversations at the farmers’ market as they do in the supermarket. Socially as well as sensually, the farmers’ market offers a remarkably rich and appealing environment. Someone buying food here may be acting not just as a consumer but also as a neighbor, a citizen, a parent, a cook. In many cities and towns, farmers’ markets have taken on (and not for the first time) the function of a lively new public square.
Though seldom articulated as such, the attempt to redefine, or escape, the traditional role of consumer has become an important aspiration of the food movement. In various ways it seeks to put the relationship between consumers and producers on a new, more neighborly footing, enriching the kinds of information exchanged in the transaction, and encouraging us to regard our food dollars as “votes” for a different kind of agriculture and, by implication, economy. The modern marketplace would have us decide what to buy strictly on the basis of price and self-interest; the food movement implicitly proposes that we enlarge our understanding of both those terms, suggesting that not just “good value” but ethical and political values should inform our buying decisions, and that we’ll get more satisfaction from our eating when they do.
That satisfaction helps to explain why many in the movement don’t greet the spectacle of large corporations adopting its goals, as some of them have begun to do, with unalloyed enthusiasm. Already Wal-Mart sells organic and local food, but this doesn’t greatly warm the hearts of food movement activists. One important impetus for the movement, or at least its locavore wing—those who are committed to eating as much locally produced food as possible—is the desire to get “beyond the barcode”—to create new economic and social structures outside of the mainstream consumer economy. Though not always articulated in these terms, the local food movement wants to decentralize the global economy, if not secede from it altogether, which is why in some communities, such as Great Barrington, Massachusetts, local currencies (the “BerkShare”) have popped up.
In fact it’s hard to say which comes first: the desire to promote local agriculture or the desire to promote local economies more generally by cutting ties, to whatever degree possible, to the national economic grid.4 This is at bottom a communitarian impulse, and it is one that is drawing support from the right as well as the left. Though the food movement has deep roots in the counterculture of the 1960s, its critique of corporate food and federal farm subsidies, as well as its emphasis on building community around food, has won it friends on the right. In his 2006 book Crunchy Cons, Rod Dreher identifies a strain of libertarian conservatism, often evangelical, that regards fast food as anathema to family values, and has seized on local food as a kind of culinary counterpart to home schooling.
It makes sense that food and farming should become a locus of attention for Americans disenchanted with consumer capitalism. Food is the place in daily life where corporatization can be most vividly felt: think about the homogenization of taste and experience represented by fast food. By the same token, food offers us one of the shortest, most appealing paths out of the corporate labyrinth, and into the sheer diversity of local flavors, varieties, and characters on offer at the farmers’ market.
Put another way, the food movement has set out to foster new forms of civil society. But instead of proposing that space as a counterweight to an overbearing state, as is usually the case, the food movement poses it against the dominance of corporations and their tendency to insinuate themselves into any aspect of our lives from which they can profit. As Wendell Berry writes, the corporations
will grow, deliver, and cook your food for you and (just like your mother) beg you to eat it. That they do not yet offer to insert it, prechewed, into your mouth is only because they have found no profitable way to do so.
The corporatization of something as basic and intimate as eating is, for many of us today, a good place to draw the line.
The Italian-born organization Slow Food, founded in 1986 as a protest against the arrival of McDonald’s in Rome, represents perhaps the purest expression of these politics. The organization, which now has 100,000 members in 132 countries, began by dedicating itself to “a firm defense of quiet material pleasure” but has lately waded into deeper political and economic waters. Slow Food’s founder and president, Carlo Petrini, a former leftist journalist, has much to say about how people’s daily food choices can rehabilitate the act of consumption, making it something more creative and progressive. In his new book Terra Madre: Forging a New Global Network of Sustainable Food Communities, Petrini urges eaters and food producers to join together in “food communities” outside of the usual distribution channels, which typically communicate little information beyond price and often exploit food producers. A farmers’ market is one manifestation of such a community, but Petrini is no mere locavore. Rather, he would have us practice on a global scale something like “local” economics, with its stress on neighborliness, as when, to cite one of his examples, eaters in the affluent West support nomad fisher folk in Mauritania by creating a market for their bottarga, or dried mullet roe. In helping to keep alive such a food tradition and way of life, the eater becomes something more than a consumer; she becomes what Petrini likes to call a “coproducer.”
Ever the Italian, Petrini puts pleasure at the center of his politics, which might explain why Slow Food is not always taken as seriously as it deserves to be. For why shouldn’t pleasure figure in the politics of the food movement? Good food is potentially one of the most democratic pleasures a society can offer, and is one of those subjects, like sports, that people can talk about across lines of class, ethnicity, and race.
The fact that the most humane and most environmentally sustainable choices frequently turn out to be the most delicious choices (as chefs such as Alice Waters and Dan Barber have pointed out) is fortuitous to say the least; it is also a welcome challenge to the more dismal choices typically posed by environmentalism, which most of the time is asking us to give up things we like. As Alice Waters has often said, it was not politics or ecology that brought her to organic agriculture, but rather the desire to recover a certain taste—one she had experienced as an exchange student in France. Of course democratizing such tastes, which under current policies tend to be more expensive, is the hard part, and must eventually lead the movement back to more conventional politics lest it be tagged as elitist.
But the movement’s interest in such seemingly mundane matters as taste and the other textures of everyday life is also one of its great strengths. Part of the movement’s critique of industrial food is that, with the rise of fast food and the collapse of everyday cooking, it has damaged family life and community by undermining the institution of the shared meal. Sad as it may be to bowl alone, eating alone can be sadder still, not least because it is eroding the civility on which our political culture depends.
That is the argument made by Janet Flammang, a political scientist, in a provocative new book called The Taste for Civilization: Food, Politics, and Civil Society. “Significant social and political costs have resulted from fast food and convenience foods,” she writes, “grazing and snacking instead of sitting down for leisurely meals, watching television during mealtimes instead of conversing”—40 percent of Americans watch television during meals—”viewing food as fuel rather than sustenance, discarding family recipes and foodways, and denying that eating has social and political dimensions.” The cultural contradictions of capitalism—its tendency to undermine the stabilizing social forms it depends on—are on vivid display at the modern American dinner table.
In a challenge to second-wave feminists who urged women to get out of the kitchen, Flammang suggests that by denigrating “foodwork”—everything involved in putting meals on the family table—we have unthinkingly wrecked one of the nurseries of democracy: the family meal. It is at “the temporary democracy of the table” that children learn the art of conversation and acquire the habits of civility—sharing, listening, taking turns, navigating differences, arguing without offending—and it is these habits that are lost when we eat alone and on the run. “Civility is not needed when one is by oneself.”5
These arguments resonated during the Senate debate over health care reform, when The New York Times reported that the private Senate dining room, where senators of both parties used to break bread together, stood empty. Flammang attributes some of the loss of civility in Washington to the aftermath of the 1994 Republican Revolution, when Newt Gingrich, the new Speaker of the House, urged his freshman legislators not to move their families to Washington. Members now returned to their districts every weekend, sacrificing opportunities for socializing across party lines and, in the process, the “reservoirs of good will replenished at dinner parties.” It is much harder to vilify someone with whom you have shared a meal.
Flammang makes a convincing case for the centrality of food work and shared meals, much along the lines laid down by Carlo Petrini and Alice Waters, but with more historical perspective and theoretical rigor. A scholar of the women’s movement, she suggests that “American women are having second thoughts” about having left the kitchen.6 However, the answer is not for them simply to return to it, at least not alone, but rather “for everyone—men, women, and children—to go back to the kitchen, as in preindustrial days, and for the workplace to lessen its time demands on people.” Flammang points out that the historical priority of the American labor movement has been to fight for money, while the European labor movement has fought for time, which she suggests may have been the wiser choice.
At the very least this is a debate worth having, and it begins by taking food issues much more seriously than we have taken them. Flammang suggests that the invisibility of these issues until recently owes to the identification of food work with women and the (related) fact that eating, by its very nature, falls on the wrong side of the mind–body dualism. “Food is apprehended through the senses of touch, smell and taste,” she points out,
which rank lower on the hierarchy of senses than sight and hearing, which are typically thought to give rise to knowledge. In most of philosophy, religion, and literature, food is associated with body, animal, female, and appetite—things civilized men have sought to overcome with reason and knowledge.
Much to our loss. But food is invisible no longer and, in light of the mounting costs we’ve incurred by ignoring it, it is likely to demand much more of our attention in the future, as eaters, parents, and citizens. It is only a matter of time before politicians seize on the power of the food issue, which besides being increasingly urgent is also almost primal, indeed is in some deep sense proto-political. For where do all politics begin if not in the high chair?—at that fateful moment when mother, or father, raises a spoonful of food to the lips of the baby who clamps shut her mouth, shakes her head no, and for the very first time in life awakens to and asserts her sovereign power.
Al Gore's An Inconvenient Truth made scant mention of food or agriculture, but in his recent follow-up book, Our Choice: A Plan to Solve the Climate Crisis (2009), he devotes a long chapter to the subject of our food choices and their bearing on climate. ↩
Ms. Obama's speech can be read at www.whitehouse.gov/the-press-office/remarks-first-lady-a-grocery-manufacturers-association-conference. ↩
Speaking in March at an Iowa "listening session" about agribusiness concentration, Holder said, "long periods of reckless deregulation have restricted competition" in agriculture. Indeed: four companies (JBS/Swift, Tyson, Cargill, and National Beef Packers) slaughter 85 percent of US beef cattle; two companies (Monsanto and DuPont) sell more than 50 percent of US corn seed; one company (Dean Foods) controls 40 percent of the US milk supply. ↩
For an interesting case study about a depressed Vermont mining town that turned to local food and agriculture to revitalize itself, see Ben Hewitt, The Town That Food Saved: How One Community Found Vitality in Local Food (Rodale, 2009). ↩
See David M. Herszenhorn, "In Senate Health Care Vote, New Partisan Vitriol," The New York Times, December 23, 2009: "Senator Max Baucus, Democrat of Montana and chairman of the Finance Committee, said the political—and often personal—divisions that now characterize the Senate were epitomized by the empty tables in the senators' private dining room, a place where members of both parties used to break bread. 'Nobody goes there anymore,' Mr. Baucus said. 'When I was here 10, 15, 30 years ago, that the place you would go to talk to senators, let your hair down, just kind of compare notes, no spouses allowed, no staff, nobody. It is now empty.'"↩
The stirrings of a new "radical homemakers" movement lend some support to the assertion. See Shannon Hayes's Radical Homemakers: Reclaiming Domesticity from a Consumer Culture (Left to Write Press, 2010).
by Nicholas Lemann

A few days after the September 11th attacks—which killed seven times as many people as any previous act of terrorism—President George W. Bush declared that the United States was engaged in a global war on terror. September 11th seemed to confirm that we were in a clash of civilizations between modernity and radical Islam. We had a worldwide enemy with a cause that was general, not specific (“They hate our freedoms”), and we now had to take on the vast, long-running mission—equal in scope to the Cold War—of defeating all ambitious terrorist groups everywhere, along with the states that harbored them. The war on terror wasn’t a hollow rhetorical trope. It led to the American conquest and occupation first of Afghanistan, which had sheltered the leaders of Al Qaeda, and then of Iraq, which had no direct connection to September 11th.
Today, few consider the global war on terror to have been a success, either as a conceptual framing device or as an operation. President Obama has pointedly avoided stringing those fateful words together in public. His foreign-policy speech in Cairo, last June, makes an apt bookend with Bush’s war-on-terror speech in Washington, on September 20, 2001. Obama not only didn’t talk about a war; he carefully avoided using the word “terrorism,” preferring “violent extremism.”
But if “global war” isn’t the right approach to terror, what is? Experts on terrorism have produced shelves’ worth of new works on this question. For outsiders, reading this material can be a jarring experience. In the world of terrorism studies, the rhetoric of righteousness gives way to equilibrium equations. Nobody is good and nobody is evil. Terrorists, even suicide bombers, are not psychotics or fanatics; they’re rational actors—that is, what they do is explicable in terms of their beliefs and desires—who respond to the set of incentives that they find before them. The tools of analysis are realism, rational choice, game theory, decision theory: clinical and bloodless modes of thinking.
That approach, along with these scholars’ long immersion in the subject, can produce some surprising observations. In “A Question of Command: Counterinsurgency from the Civil War to Iraq” (Yale; $30), Mark Moyar, who holds the Kim T. Adamson Chair of Insurgency and Terrorism at the Marine Corps University, tells us that, in Afghanistan, the Taliban’s pay scale (financed by the protection payments demanded from opium farmers) is calibrated to be a generous multiple of the pay received by military and police personnel (financed by U.S. aid); no wonder official Afghan forces are no match for the insurgents. Audrey Kurth Cronin, a professor of strategy at the National War College, reminds us, in “How Terrorism Ends: Understanding the Decline and Demise of Terrorist Campaigns” (Princeton; $29.95), that one can find out about Al Qaeda’s policy for coördinating attacks by reading a book called “The Management of Barbarism,” by Abu Bakr Naji, which has been available via Al Qaeda’s online library. (Naji advises that, if jihadis are arrested in one country after an attack, a cell elsewhere should launch an attack as a display of resilience.) In “Radical, Religious, and Violent: The New Economics of Terrorism” (M.I.T.; $24.95), Eli Berman traces the origins of the Taliban to a phenomenon that long preceded the birth of modern radical Islam: they are a direct descendant of the Deobandi movement, which began in nineteenth-century India in opposition to British colonial rule and, among other things, established a system of religious schools.
What is terrorism, anyway? The expert consensus converges on a few key traits. Terrorists have political or ideological objectives (the purpose can’t be mere profiteering). They are “non-state actors,” not part of conventional governments. Their intention is to intimidate an audience larger than their immediate victims, in the hope of generating widespread panic and, often, a response from the enemy so brutal that it ends up backfiring by creating sympathy for the terrorists’ cause. Their targets are often ordinary civilians, and, even when terrorists are trying to kill soldiers, their attacks often don’t take place on the field of battle. The modern age of suicide terrorism can be said to have begun with Hezbollah’s attack, in October of 1983, on U.S. marines who were sleeping in their barracks in Beirut.
Once you take terrorists to be rational actors, you need a theory about their rationale. Robert Pape, a political scientist at the University of Chicago, built a database of three hundred and fifteen suicide attacks between 1980 and 2003, and drew a resoundingly clear conclusion: “What nearly all suicide terrorist attacks have in common is a specific secular and strategic goal: to compel modern democracies to withdraw military forces from territory that the terrorists consider to be their homeland.” As he wrote in “Dying to Win: The Strategic Logic of Suicide Terrorism” (2005), what terrorists want is “to change policy,” often the policy of a faraway major power. Pape asserts that “offensive military action rarely works” against terrorism, so, in his view, the solution to the problem of terrorism couldn’t be simpler: withdraw. Pape’s “nationalist theory of suicide terrorism” applies not just to Hamas and Hezbollah but also to Al Qaeda; its real goal, he says, is the removal of the U.S. military from the Arabian Peninsula and other Muslim countries. Pape says that “American military policy in the Persian Gulf was most likely the pivotal factor leading to September 11”; the only effective way to prevent future Al Qaeda attacks would be for the United States to take all its forces out of the Middle East.

By contrast, Mark Moyar dismisses the idea that “people’s social, political, and economic grievances” are the main cause of popular insurgencies. He regards anti-insurgent campaigns as “a contest between elites.” Of the many historical examples he offers, the best known is L. Paul Bremer’s de-Baathification of Iraq, in the spring of 2003, in which the entire authority structure of Iraq was disbanded at a stroke, creating a leadership cadre for a terrorist campaign against the American occupiers.
One of Moyar’s chapters is about the uncontrollably violent American South during Reconstruction—a subject that a number of authors have turned to during the war on terror—and it demonstrates better than his chapter on Iraq the power of his theory to offend contemporary civilian sensibilities. Rather than disempowering the former Confederates and empowering the freed slaves, Moyar says, the victorious Union should have maintained order by leaving the more coöperative elements of the slaveholding, seceding class in control. Effective counterinsurgency, he says, entails selecting the élites you can work with and co-opting them.
In “Talking to Terrorists: Why America Must Engage with Its Enemies” (Basic; $26.95), Mark Perry describes a little-known attempt to apply Moyar’s model in Iraq. The book jacket identifies Perry as “a military, intelligence, and foreign affairs analyst and writer,” but his writing conveys a strong impression that he has not spent his career merely watching the action from a safe seat in the bleachers. Much of the book is devoted to a detailed description, complete with many on-the-record quotes, of a series of meetings in Amman, Jordan, in 2004, between a group of Marine officers based in Anbar province, in western Iraq, and an Iraqi businessman named Talal al-Gaood. Gaood, a Sunni and a former member of Saddam Hussein’s Baath Party, suggested he could broker a deal that would make the horrific, almost daily terrorist attacks in western Iraq go away.
Perry’s tone calls to mind a Tom Clancy novel. Tough, brave, tight-lipped officers do endless battle not just with the enemy in the field but also with cowardly, dissembling political bureaucrats in the Pentagon, the State Department, and the White House. The crux of his story is that a promising negotiation was tragically cut short, just as it was about to bear fruit, when the key negotiator, a Marine colonel, was “PNG’d”—declared persona non grata—by Washington and denied entry to Jordan. Not long after that, Gaood died suddenly, of a heart ailment, at the age of forty-four (according to Perry, he was so beloved that his wake had to be held in a soccer stadium), putting an end to any possibility of further talks. It’s startling to read about American military commanders in the field taking on a freelance diplomatic mission of this magnitude, and to imagine that there was a businessman in Amman who, on the right terms, could have snapped his fingers and ended what we back home thought of as pervasive, wild-eyed jihad.
What dominates the writing of experts about terrorism, however, is a more fine-grained idea of terrorists’ motives—at the level of ethnic group, tribe, village, and even individual calculation. Pape thinks of terrorists as being motivated by policy and strategic concerns; Cronin, of the National War College, shares Pape’s view that most terrorists are, essentially, terroirists—people who want control of land—but she is also attuned to their narrower, more local considerations. The odds are against them, because of the natural forces of entropy and their lack of access to ordinary military power and other resources, but, if they do succeed, they can be counted upon to try to ascend the ladder of legitimacy, first to insurgency, then to some kind of governing status. (Examples of that ultimate kind of success would be the Irgun and the Stern Gang, in Israel, Sinn Fein and the Provisional I.R.A., in Northern Ireland, and the Palestine Liberation Organization, in the West Bank and Gaza.)
Cronin goes through an elaborate menu of techniques for hastening the end of a terrorist campaign. None of them rise to the level of major policy, let alone a war on terror; in general, the smaller their scope the more effective Cronin finds them to be. She believes, for instance, that jailing the celebrated head of a terrorist organization is a more effective countermeasure than killing him. (Abimael Guzmán, the head of the Shining Path, in Peru, was, after his capture in 1992, “displayed in a cage, in a striped uniform, recanting and asking his followers to lay down their arms.” That took the wind out of the Shining Path’s sails. A surprise ambush that martyred him might not have.) Negotiating with terrorists—a practice usually forsworn, often done—can work in the long term, Cronin says, not because it is likely to produce a peace treaty but because it enables a state to gain intelligence about its opponents, exploit differences and hive off factions, and stall while time works its erosive wonders.
Cronin offers a confident prescription, based on her small-bore approach to terrorism, for defeating the apparently intractable Al Qaeda. The idea is to take advantage of the group’s highly decentralized structure by working to alienate its far-flung component parts, getting them to see their local interests as being at odds with Al Qaeda’s global ones. “Bin Laden and Zawahiri have focused on exploiting and displacing the local concerns of the Chechens, the Uighurs, the Islamic Movement of Uzbekistan, the Salafist Group for Call and Combat in Algeria, and many others, and sought to replace them with an international agenda,” Cronin writes. The United States should now try to “sever the connection between Islamism and individualized local contexts for political violence, and then address them separately.” It should work with these local groups, not in an effort to convert them to democracy and love of America but in order to pry them away, one by one, from Al Qaeda. (“Calling the al-Qaeda movement ‘jihadi international,’ as the Israeli intelligence services do,” she writes, “encourages a grouping together of disparate threats that undermines our best counterterrorism. It is exactly the mistake we made when we lumped the Chinese and the Soviets together in the 1950s and early 1960s, calling them ‘international Communists.’ ”)
Eli Berman, an economist who has done field work among ultra-orthodox religious groups in Israel, is even more granular in his view of what terrorists want: he stresses the social services that terror and insurgent groups provide to their members. Berman’s book is an extended application to terrorism of an influential 1994 article by the economist Laurence Iannaccone, called “Why Strict Churches Are Strong.” Trying to answer the question of why religious denominations that impose onerous rules and demand large sacrifices of their members seem to thrive better than those which do not, Iannaccone surmised that strict religions function as economic clubs. They appeal to recruits in part because they are able to offer very high levels of benefits—not just spiritual ones but real services—and this involves high “defection constraints.” In denominations where it’s easy for individual members to opt out of an obligation, it is impossible to maintain such benefits. Among the religious groups Iannaccone has written about, impediments to defection can be emotionally painful, such as expulsion or the promise of eternal damnation; in many terrorist groups, the defection constraints reflect less abstract considerations: this-worldly torture, maiming, and murder.
Berman’s main examples are Hamas, Hezbollah, Moqtada al-Sadr’s Mahdi Army, in Iraq, and the Taliban, whom Berman calls “some of the most accomplished rebels of modern times.” All these organizations, he points out, are effective providers of services in places where there is dire need of them. Their members are also subject to high defection constraints, because their education and their location don’t put them in the way of a lot of opportunity and because they know they will be treated brutally if they do defect.
Like most other terrorism experts, Berman sees no crevasse between insurgents and terrorists. Instead, he considers them to be members of a single category he calls “rebels,” who use a variety of techniques, depending on the circumstances. Suicide bombing represents merely one end of the spectrum; its use is an indication not of the fanaticism or desperation of the individual bomber (most suicide bombers—recall Muhammad Atta’s professional-class background—are not miserably poor and alienated adolescent males) but of the supremely high cohesion of the group. Suicide bombing, Berman notes, increases when the terrorist group begins to encounter hard targets, like American military bases, that are impervious to everything else. The Taliban used traditional guerrilla-warfare techniques when they fought the Northern Alliance in the mountains. When their enemies became Americans and other Westerners operating from protected positions and with advanced equipment, the Taliban were more likely to resort to suicide bombing. How else could a small group make a big impact?
The idea of approaching terrorists as rational actors and defeating them by a cool recalibration of their incentives extends beyond the academic realm. Its most influential published expression is General David Petraeus’s 2006 manual “Counterinsurgency.” Written in dry management-ese, punctuated by charts and tables, the manual stands as a rebuke of the excesses of Bush’s global war on terror.
“Soldiers and Marines are expected to be nation builders as well as warriors,” the introduction to the manual declares. “They must be prepared to help reestablish institutions and local security forces and assist in rebuilding infrastructure and basic services. They must be able to facilitate establishing local governance and the rule of law.” The manual’s most famous formulation is “clear-hold-build,” and its heaviest emphasis is on the third of those projects; the counterinsurgent comes across a bit like a tough but kindhearted nineteen-fifties cop, walking a beat, except that he does more multitasking. He collects garbage, digs wells, starts schools and youth clubs, does media relations, improves the business climate. What he doesn’t do is torture, kill in revenge, or overreact. He’s Gandhi in I.E.D.-proof armor.
Petraeus has clearly absorbed the theory that terrorist and insurgent groups are sustained by their provision of social services. Great swaths of the manual are devoted to elaborating ways in which counterinsurgents must compete for people’s loyalty by providing better services in the villages and tribal encampments of the deep-rural Middle East. It’s hard to think of a service that the manual doesn’t suggest, except maybe yoga classes. And, like Berman, the manual is skeptical about the utility, in fighting terrorism, of big ideas about morality, policy, or even military operations. Here’s a representative passage:
One problem with such programs is that they can be too small, and too nice, to win the hearts and minds of the populace away from their traditional leaders. The former civil-affairs officer A. Heather Coyne tells the story, recounted in Berman’s book, of a program that offered people in Sadr City ten dollars a day to clean the streets—something right out of the counterinsurgency manual. The American colonel who was running the program went out to talk to people and find out how effective the program was at meeting its larger goal. This is what he heard: “We are so grateful for the program. And we’re so grateful to Muqtada al-Sadr for doing this program.” Evidently, Sadr had simply let it be known that he was behind this instance of social provision, and people believed him. For Berman, the lesson is “a general principle: economic development and governance can be at odds when the territory is not fully controlled by the government.” That’s a pretty discouraging admission—it implies that helping people peacefully in an area where insurgents are well entrenched may only help the insurgents.
One could criticize the manual from a military perspective, as Mark Moyar does, for being too nonviolent and social-worky. Moyar admires General Petraeus personally (Petraeus being the kind of guy who, while recuperating from major surgery at a hospital after taking a bullet during a live-ammunition exercise, had his doctors pull all the tubes out of his arm and did fifty pushups to prove that he should be released early). But Moyar is appalled by the manual’s tendency to downplay the use of force: “The manual repeatedly warned of the danger of alienating the populace through the use of lethal force and insisted that counterinsurgents minimize the use of force, even if in some instances it meant letting enemy combatants escape. . . . As operations in Iraq and elsewhere have shown, aggressive and well-led offensive operations to chase down insurgents have frequently aided the counterinsurgent cause by robbing the insurgents of the initiative, disrupting their activities, and putting them in prison or in the grave.”
Because terrorism is such an enormous problem—it takes place constantly, all over the world, in conflict zones and in big cities, in more and less developed countries—one can find an example of just about every anti-terrorist tactic working (or failing to). One of the most prolific contemporary terrorist groups, the Tamil Tigers, of Sri Lanka, appears to have been defeated by the Sinhalese Buddhist-dominated government, through a conventional, if unusually violent, military campaign, which ended last spring. In that instance, brutal repression seems to have been the key. But the Russians have tried that intermittently in Chechnya, without the same effect; the recent suicide bombing in the Moscow subway by Chechen terrorists prompted an Op-Ed piece in the Times by Robert Pape and two associates, arguing that the answer is for Russia to dial back its “indirect military occupation” of Chechnya.
The point of social science is to be careful, dispassionate, and analytical, to get beyond the lure of anecdote and see what the patterns really are. But in the case of counterterrorism the laboratory approach can’t be made to scan neatly, because there isn’t a logic that can be counted upon to apply in all cases. One could say that the way to reduce a group’s terrorist activity is by reaching a political compromise with it; Northern Ireland seems to be an example. But doing that can make terrorism more attractive to other groups—a particular risk for the United States, which operates in so many places around the world. After the Hezbollah attack on the Marine barracks, in 1983, President Ronald Reagan pulled out of Lebanon, a decision that may have set off more terrorism in the Middle East over the long term. Immediate, savage responses—George W. Bush, rather than Reagan—can work in one contained area and fail more broadly. If the September 11th attacks were meant in part to provoke a response that would make the United States unpopular in the Muslim world, they certainly succeeded.
Even if one could prove that a set of measured responses to specific terrorist acts was effective, or that it’s always a good idea to alter terrorists’ cost-benefit calculations, there’s the problem implied by the tactic’s name: people on the receiving end of terrorism, and not just the immediate victims, do, in fact, enter a state of terror. The emotion—and its companion, thirst for revenge—inevitably figure large in the political life of the targeted country. As Cronin dryly notes, “In the wake of major attacks, officials tend to respond (very humanly) to popular passions and anxiety, resulting in policy made primarily on tactical grounds and undermining their long-term interests. Yet this is not an effective way to gain the upper hand against nonstate actors.” The implication is that somewhere in the world there might be a politician with the skill to get people to calm down about terrorists in their midst, so that a rational policy could be pursued. That’s hard to imagine.
Another fundamental problem in counterterrorism emerges from a point many of the experts agree on: that terrorism, uniquely horrifying as it is, doesn’t belong to an entirely separate and containable realm of human experience, like the one occupied by serial killers. Instead, it’s a tactic whose aims bleed into the larger, endless struggle of people to control land, set up governments, and exercise power. History is about managing that struggle, sometimes successfully, sometimes not, rather than eliminating the impulses that underlie it.
For Americans, the gravest terrorist threat right now is halfway across the world, in Iraq, Afghanistan, and Pakistan. On paper, in all three countries, the experts’ conceptual model works. Lesser terrorist groups remain violent but seem gradually to lose force, and greater ones rise to the level of political participation. At least some elements of the Taliban have been talking with the Afghan government, with the United States looking on approvingly. In Iraq, during the recent elections, some Sunni groups set off bombs near polling places, but others won parliamentary seats. Yet this proof of concept does not solve the United States’ terrorism problem. Iraq, Afghanistan, and Pakistan all have pro-American governments that are weak. They don’t have firm control over the area within their borders, and they lack the sort of legitimacy that would make terrorism untempting. Now that General Petraeus is the head of the Central Command and has authority over American troops in the region, our forces could practice all that he has preached, achieve positive results, and still be unable to leave, because there is no national authority that can be effective against terrorism.
Long ago, great powers that had vital interests far away simply set up colonies. That wound up being one of the leading causes of terrorism. Then, as an alternative to colonialism, great powers supported dictatorial client states. That, too, often led to terrorism. During the Bush Administration, creating democracies (by force if necessary) in the Middle East was supposed to serve American interests, but, once again, the result was to increase terrorism. Even if all terrorism turns out to be local, effective, long-running counterterrorism has to be national. States still matter most. And finding trustworthy partner states in the region of the world where suicide bombers are killing Americans is so hard that it makes fighting terrorism look easy.
THE HISTORY OF WHITE PEOPLE
by Nell Irvin Painter
Illustrated. 496 pp. W. W. Norton & Company. $27.95
Review by Linda Gordon
Nell Irvin Painter’s title, “The History of White People,” is a provocation in several ways: it’s monumental in sweep, and its absurd grandiosity should call to mind the fact that writing a “History of Black People” might seem perfectly reasonable to white people. But the title is literally accurate, because the book traces characterizations of the lighter-skinned people we call white today, starting with the ancient Scythians. For those who have not yet registered how much these characterizations have changed, let me assure you that sensory observation was not the basis of racial nomenclature.
Some ancient descriptions did note color, as when the ancient Greeks recognized that their “barbaric” northern neighbors, Scythians and Celts, had lighter skin than Greeks considered normal. Most ancient peoples defined population differences culturally, not physically, and often regarded lighter people as less civilized. Centuries later, European travel writers regarded the light-skinned Circassians, a k a Caucasians, as people best fit only for slavery, yet at the same time labeled Circassian slave women the epitome of beauty. Exoticizing and sexualizing women of allegedly inferior “races” has a long and continuous history in racial thought; it’s just that today they are usually darker-skinned women.
“Whiteness studies” have so proliferated in the last two decades that historians might be forgiven a yawn in response to being told that racial divisions are fundamentally arbitrary, and that deciding who is white has been not only fluid but also heavily influenced by class and culture. In some Latin American countries, for example, the term blanquearse, to bleach oneself, is used to mean moving upward in class status. But this concept — the social and cultural construction of race over time — remains harder for many people to understand than, say, the notion that gender is a social and cultural construction, unlike sex. As recently as 10 years ago, some of my undergraduate students at the University of Wisconsin heard my explanations of critical race theory as a denial of observable physical differences.
I wish I had had this book to offer them. Painter, a renowned historian recently retired from Princeton, has written an unusual study: an intellectual history, with occasional excursions to examine vernacular usage, for popular audiences. It has much to teach everyone, including whiteness experts, but it is accessible and breezy, its coverage broad and therefore necessarily superficial.
The modern intellectual history of whiteness began among the 18th-century German scholars who invented racial “science.” Johann Joachim Winckelmann made the ancient Greeks his models of beauty by imagining them white-skinned; he may even have suppressed his own (correct) suspicion that their statues, though copied by the Romans in white marble, had originally been painted. The Dutchman Petrus Camper calculated the proportions and angles of the ideal face and skull, and produced a scale that awarded a perfect rating to the head of a Greek god and ranked Europeans as the runners-up, earning 80 out of 100. The Englishman Charles White collected skulls that he arranged from lowest to highest degree of perfection. He did not think he was seeing the gradual improvement of the human species, but assumed rather the polygenesis theory: the different races arose from separate divine creations and were designed with a range of quality.
The modern concept of a Caucasian race, which students my age were taught in school, came from Johann Friedrich Blumenbach of Göttingen, the most influential of this generation of race scholars. Switching from skulls to skin, he divided humans into five races by color — white, yellow, copper, tawny, and tawny-black to jet-black — but he ascribed these differences to climate. Still convinced that people of the Caucasus were the paragons of beauty, he placed residents of North Africa and India in the Caucasian category, sliding into a linguistic analysis based on the common derivation of Indo-European languages. That category, Painter notes, soon slipped free of any geographic or linguistic moorings and became a quasi-scientific term for a race known as “white.”
Some great American heroes, notably Thomas Jefferson and Ralph Waldo Emerson, absorbed Blumenbach’s influence but relabeled the categories of white superiority. They adopted the Saxons as their ideal, imagining Americans as direct and unalloyed descendants of the English, later including the Germans. In general, Western labels for racial superiority moved thus: Caucasian → Saxon → Teutonic → Nordic → Aryan → white/Anglo.
The spread of evolutionary theory required a series of theoretical shifts, to cope with changing understandings of what is heritable. When hereditary thought produced eugenics, the effort to breed superior human beings, it relied mostly on inaccurate genetics. Nevertheless, eugenic “science” became authoritative from the late 19th century through the 1930s. Eugenics gave rise to laws in at least 30 states authorizing forced sterilization of the ostensibly feeble-minded and the hereditarily criminal. Painter cites an estimate of 65,000 sterilized against their will by 1968, after which a combined feminist and civil rights campaign succeeded in radically restricting forced sterilization. While blacks and American Indians were disproportionately victimized, intelligence testing added many immigrants and others of “inferior stock,” predominantly Appalachian whites, to the rolls of the surgically sterilized.
In the long run, the project of measuring “intelligence” probably did more than eugenics to stigmatize and hold back the nonwhite. Researchers gave I.Q. tests to 1,750,000 recruits in World War I and found that the average mental age, for those 18 and over, was 13.08 years. That experiment in mass testing failed owing to the Army’s insistence that even the lowest ranked usually became model soldiers. But I.Q. testing achieved success in driving the anti-immigration movement. The tests allowed calibrated rankings of Americans of different ancestries — the English at the top, Poles on the bottom. Returning to head measurements, other researchers computed with new categories the proportion of different “blood” in people of different races: Belgians were 60 percent Nordic (the superior European race) and 40 percent Alpine, while the Irish were 30 percent Nordic and 70 percent Mediterranean (the inferior European race). Sometimes politics produced immediate changes in these supposedly objective findings: World War I caused the downgrading of Germans from heavily Nordic to heavily Alpine.
Painter points out, but without adequate discussion, that the adoration of whiteness became particularly problematic for women, as pale blue-eyed blondes became, like so many unattainable desires, a reminder of what was second-class about the rest of us. Among the painfully comic absurdities that racial science produced was the “beauty map” constructed by Francis Galton around the turn of the 20th century: he classified people as good, medium or bad; he categorized those he saw by using pushpins and thus demonstrated that London ranked highest and Aberdeen lowest in average beauty.
Rankings of intelligence and beauty supported escalating anti-Catholicism and anti-Semitism in early-20th-century America. Both prejudices racialized non-Protestant groups. But Painter misses some crucial regional differences. While Jews and Italians were nonwhite in the East, they had long been white in San Francisco, where the racial “inferiors” were the Chinese. Although the United States census categorized Mexican-Americans as white through 1930, census enumerators in the Southwest, working from a different racial understanding, ignored those instructions and marked them “M” for Mexican.
In the same period, anarchist or socialist beliefs became a sign of racial inferiority, a premise strengthened by the presence of many immigrants and Jews among early-20th-century radicals. Whiteness thus became a method of stigmatizing dissenting ideas, a marker of ideological respectability; Painter should have investigated this phenomenon further. Also missing from the book is an analysis of the all-important question: Who benefits and how from the imprimatur of whiteness? Political elites and employers of low-wage labor, to choose just two groups, actively policed the boundaries of whiteness.
But I cannot fault Nell Painter’s choices — omissions to keep a book widely readable. Often, scholarly interpretation is transmitted through textbooks that oversimplify and even bore their readers with vague generalities. Far better for a large audience to learn about whiteness from a distinguished scholar in an insightful and lively exposition.
Linda Gordon is a professor of history at New York University and the author, most recently, of “Dorothea Lange: A Life Beyond Limits.”