Archive for the ‘Eating Plants’ Category
What follows are some thoughts I’m guessing most Americans will not be entertaining as they fire up the grill to celebrate the 4th of July. Consider the whole thing a trigger warning for your holiday happiness. –jm
If you’re occasionally confounded by the persistence of American optimism in the midst of ongoing socioeconomic despair, it helps to revisit the driving themes of American history during the nation’s infancy: slavery, republican ideology, Manifest Destiny. As a historian, this is what I do to make sense of the many contradictions at the core of American life.
It’s also, as a professor, what I teach.
Working together, these facets of the American experience fueled national development and generated relative prosperity for white men willing to pull up stakes and Go West. They also made possible the wildly quixotic idea of an America “for the people,” a bold conceit that whiggish historians insist—events such as Ferguson, Missouri notwithstanding—we’re getting closer and closer and closer to achieving. Proof that Americans—and Americans alone—have swallowed the pill of historical optimism comes from a recent Pew study showing Americans to have the most positive outlook on life. Even as other wealthy nations grow increasingly depressed in the face of global events, Americans, well, we just keep on shining.
What accounts for our sunny disposition? In a word: delusion. There’s really no other way to explain it. It’s at our core. What the dominant narrative of American history routinely fails to note is that each of the defining phenomena (slavery, republicanism, and Manifest Destiny) emerged from self-serving and carefully crafted delusions—delusions sown in the colonial period and delusions that bloomed like a field of dandelions after the American Revolution to perpetuate the fiction that the pursuit of happiness was integral to a concept that today seems more rhetorically relevant than ever: “American exceptionalism.”
Systematic self-deception began with Native Americans and private property. English settlers such as John Winthrop fully understood that the Native American conception of property—based on what Jefferson would later call “usufruct rights”—ran counter to the English conception of property (based on “fee-simple” ownership). Rather than acknowledge this difference, English settlers (with the exception of, say, Roger Williams) exploited it. They acted as if Native Americans lacked requisite long-term interest in the land they farmed and hunted and, based on this assessment, acquired that land through twisted and cynical legal fiat. Eventually, as the stereotype of native savagery became established in the white American mind, the foundation solidified for Andrew Jackson’s extermination project (1811-1836), a historically underplayed event that cleared space for an especially crazed delusion of Manifest Destiny and the concomitant notion that God personally chose white Americans to settle the west.
Expansion required slavery and slavery was an even more insidious form of the delusional thinking rotting the core of America’s founding. Diaries of slave owners (Virginia’s Landon Carter’s is a remarkable example) repeatedly confirmed the obvious reality of slave personhood. Slaves and masters interacted routinely as human beings mutually engaged in the project of plantation development and export trade (not to mention abuse, trade, and sex). But these very same white men were the ones who crafted constitutional compromises (three-fifths being the most notable) that explicitly belied the social history of white-black relationships as they played out on the ground. American “freedom” was, as the historian Edmund Morgan has argued, impossible without American slavery. This country—which was twenty-five-percent slave-based at its inception—has yet to confront the deceptive habit of mind that shackled these two realities—freedom and slavery—so tightly that we’re still trying to pry off the irons today.
Finally, there was the ideological appeal of republicanism. This imported way of thinking, coming from the motherland in the 1720s, resonated throughout the colonies as both a bulwark against corruption and an affirmation of natural English rights, representative government, and independence. Few espoused the virtues of independence—and translated them into revolutionary action—more triumphantly than the tobacco growers of Virginia. Tobacco was deeply intertwined with the republican project being forged by the founders.
But economic reality turned both ideology and tobacco into smoke. History books rarely note that those who most passionately espoused the virtues of independence were truly enervated men who were enslaved by debt to English lenders. When Americans were most in thrall to the idea of liberty, they were also at their most vulnerable and dependent. What enabled them to ignore the contradiction was nothing short of delusion—a self-serving one, given that independence would temporarily release them from economic bondage, authorize them to sell slaves down river, and fund the deep southern transition to cotton, empowering a contradiction so deep that it would take a war to resolve it.
American life today is infused with this legacy of deception. It has become an unthinking habit obscured in layers of upbeat historical narration. The Occupy Movement is one example of a momentary blip of consciousness, a fierce little awakening, that spurred us to wonder: Why do so many Americans choose groundless optimism over justifiable rebellion against an elite that gives new meaning to the term “super-rich” every day? Why do people still believe they can hit rock bottom, work three jobs, and thrive when, in fact, they sink deeper into debt? Why do we fail to see the oppression of immigrants as a modern-day version of slavery?
The answers to these questions (and others) lie in our history of delusion, a history that thrives on the perpetuation of a fiction, a fiction so intoxicating that we structure our lives all too successfully to avoid confronting it. Our optimism is, in this sense, our addiction.
From the look of things, you’d be correct in thinking that a revolution in food production was underway. Calls for local, sustainable, slow, humane, organic, non-genetically modified, fair-wage, “real” food are not only ubiquitous, they’ve inspired a farm-to-table movement that seeks to end industrialized agriculture, empower small farmers, and replace Walmart with farmers markets. Hundreds if not thousands of books, articles, foundations, academic conferences, and documentaries have joined the cause, rallying around the idea that industrial agriculture should—and can—and will be stopped.
These efforts have spawned a unique public discourse, one that relentlessly reiterates the message that industrial agriculture wreaks ecological havoc, endangers human health, and exploits workers in order to produce food that’s overly processed, overly cheap, and overly globalized. Given the intensity of this culinary zeitgeist (not to mention the fact that it gets very little critical inquiry from an adoring media), there’s every reason to think that food-reform-minded Americans, voting with their forks, are finally changing how Americans eat.
It is always difficult to get beyond the rhetoric and quantify such trends, but one assumption seems safe: if the movement were working, factory farms would be in decline. But, as a report just released by Food and Water Watch reveals, the exact opposite is happening. While muckrakers have been exposing every hint of corruption in corporate agriculture, and while reformers have been busy creating programs to combat industrial agriculture with localized, “real food” alternatives, factory farms have been proliferating like superweeds in a field of Monsanto corn.
Last February the Waste and Resources Action Programme—a British anti-waste organization—reported that one-third of the food produced globally is never eaten. That’s roughly two billion metric tons of edible waste that ends up in landfills, where it emits about seven percent of the world’s total greenhouse gas emissions. Environmentalists have taken notice and the problem—food waste—is now a serious environmental concern. “If more and more people recognize their own food waste,” writes Jonathan Bloom, author of American Wasteland, “we can take a bite out of this problem.”
As Bloom suggests, reformers have largely placed responsibility for reducing food waste in the hands of consumers. More often than not we are rightfully admonished to eat the whole metaphorical hog as an act of ecological redemption.
“Leftovers can be turned into completely different meals,” writes one reader of Bloom’s blog, Wasted Food. “To better utilize food,” writes another, “use the whole animal” (the reader suggests bone broth). “STIR-FRY,” says a third, who claims to “live in a forest.” Throw in the prevailing advocacy for eating “ugly fruit” and the intrepid dumpster divers and you have a landscape of waste warriors crusading to achieve meaningful reform by piling our plates with food items we’d normally toss (or have already tossed).
The most conspicuous example of this eat-the-leftovers approach to reducing food waste recently came from celebrity chef Dan Barber. For a stretch of time in March, Barber cleared out his famous Blue Hill restaurant and replaced it with a “pop-up” creation—called WastED—that served food scraps salvaged from commercial kitchens. For $85 a meal, patrons could sample an array of dishes cooked with recycled culinary debris, including pickle butts, carrot tops, offal, and skate-wing cartilage. Exchanging spare ribs for kale ribs, diners were able to experience a culinary novelty while doing a good deed for the environment. It was the American way of reform epitomized: Fix the problem by buying something that makes you happy.
Read more here.
“We had no idea that we were going to see what we saw.” These are never words you want to hear about a slaughterhouse. But they’re exactly what Adam Wilson, the director of investigations at Last Chance for Animals, a Los Angeles-based animal advocacy group, said about his organization’s recent investigation of Pel-Freez, the nation’s largest rabbit processing plant, located in Rogers, Arkansas.
The details, obtained by an undercover agent who worked at Pel-Freez as a “blood catcher” for six weeks last fall, are, even by abattoir standards, morbid. Slaughterhouse workers were filmed improperly stunning rabbits by whacking them in the face with the dull side of a knife (electrical stunning is the norm); they broke the legs of conscious rabbits to better fit them onto J-hooks designed for poultry; they decapitated fully conscious rabbits; and they ignored grievous rabbit injuries. Wilson noted how, in one instance, a worker encountered an abscessed wound on a rabbit so filled with pus that he retched.
When the media first started covering the California drought it did so from the perspective of the specific foods we eat. Given that 80 percent of the state’s water is used for agriculture, this would seem to make sense. Mother Jones crusaded against the water-hogging impact of nuts, especially almonds. Michael Pollan, seizing on an illuminating Los Angeles Times infographic, took to Twitter and declared California lentils verboten. I highlighted the disproportionate share of the state’s water consumed by beef and dairy, specifically the alfalfa crop that helps sustain these industries.
The obvious benefit of this approach is that it empowers consumers. As a consumer, I feel good about not eating beef and a little guilty about the almond milk in my fridge. I feel compelled to purchase lentils from France but comforted by the fact that beer has a relatively low water footprint. I agree that much of the produce grown in the Central and Imperial Valleys should be grown in the Midwest, but until that happens (don’t hold your breath), I’m motivated to make concrete choices that address California’s water crisis. Hard data about specific foods helps me do this.
An aggressive strain of avian flu—the largest to appear in the United States in over 30 years—has forced Midwestern chicken and turkey producers to cull over 15.1 million birds since early March. Most of these losses have occurred since mid-April. The virus, which doesn’t appear to pose an immediate threat to humans, has spread to 10 states. Iowa and Minnesota have been hit the hardest.
The economic impact of the virus—called H5N2—has been severe. Mexico and China have halted the importation of U.S. birds and eggs. Hormel Foods Corp., the nation’s second largest supplier of turkey meat, highlights “significant challenges” as it forecasts lower earnings. Contract growers, who have little recourse under such circumstances, are stuck with mortgaged farms and no income. At a meeting in Minnesota some of these growers broke down in tears. “Are we done?” Iowa’s Secretary of Agriculture Bill Northey asked about the flu. The answer, it seems, is no. Not even close.
How should consumers interpret this situation? The conventional critique of such epidemics is that they result from industrial over-crowding—cramming too many birds into a tight space. GRAIN, a non-profit organization dedicated to sustainable agriculture, articulated this position during the 2006 H5N1 virus outbreak. The virus, it contended, was “essentially a problem of industrial poultry practices.” The proper response, it implied, was obvious: a transition to non-industrial, pasture-based management. Commenting on the current outbreak, Wayne Pacelle, CEO of the Humane Society of the United States, agreed with this perspective, writing that “the root cause” of the bird flu is “inhumane, overcrowded conditions in the poultry industry.”
A direct, causal relationship between avian flu and industrial conditions would be fantastic news. Most notably, it would allow us to begin systematically fighting the disease through a surefire method: providing chickens and turkeys more space to roam. Unfortunately, the etiology of avian flu doesn’t support this connection. The problem of avian flu, it turns out, transcends farm size and stocking density and cuts right to the core of animal domestication per se.
Read more here.
As far as media attention goes, April 11, 2014, was a banner day for Greg Finch. As the lone supplier of antibiotic-free, pastured Vermont pork to the highly acclaimed 5-Knives, a specialized supplier of local pork, Finch was offered what amounted to subsidized advertising space in the Burlington Free Press. The paper’s staff reporter, Sally Pollak—who told me she met Finch at a coffeehouse—served as stenographer for Finch, who delivered his talking points:
“To [raise pigs] without the modern crutches of medicine, it’s management that makes you successful…. Doing things the right way all the time…. I take the best information I can find and adapt it to what I do.”
“This time around, with local foods, the farmer is a big part of the market, which is the exciting part of it…. It’s more of a collaboration. It’s much better for the farmer, and more vibrant for the farm.”
“I’m very, very careful about bio-security.”
Experienced observers will recognize these remarks as boilerplate rhetoric, the kind that characterizes much of today’s food writing. A year later, though, Finch finds himself mired in media muck rather than admiration.
The Vermont Agency of Agriculture recently revealed that much of Finch’s “Vermont” pork came from Pennsylvania pigs. Twice a month Finch headed south to an auction house in New Holland, purchased 50 or so conventionally raised pigs, and hauled them back to the Green Mountain State, where he had them processed into “local” bellies, hams, and other choice cuts.
It was a profitable move while it lasted. Read more.
The following review essay appeared in the Spring 2015 edition of The Virginia Quarterly Review. A link to the complete article is below. Please leave comments there.
The worst thing about sausage is that it has to be made. We know this because a generation of journalists has infiltrated North America’s feedlots and slaughterhouses to expose the apparatus that churns out mass quantities of commodity meat. American agribusiness—wreaking havoc on animals, laborers, consumers, and planet Earth—is generally understood to be irredeemable. Today, enlightened consumers wouldn’t be caught dead near a Big Mac. For what it’s worth, that’s progress.
The reformist lexicon that fuels the outrage resonates with the political right, left, and everyone in between. A libertarian Virginia farmer fumes over the “industrial agriculture complex.” An Oxford-educated activist vents that “globalized corporate agriculture” has left us “stuffed and starved.” A poet-farmer whose horse-drawn plow breaks up Kentucky soil laments how “the ideal industrial food consumer would be strapped to a table with a tube running from the food factory directly into his or her stomach.” Yikes (and yuck).
Such visceral disgust makes one wonder: Just who are these people monopolizing the world’s food supply? Indeed, the strangest thing about antiagribusiness angst is that it rages full tilt without a real understanding of the machinations that empower the corporate leviathan. We’re routinely hit with dramatic visuals: the slaughterhouses, endless corn and soy fields, obesity charts, deforestation photos, undercover animal-abuse films, and battery-caged birds. But we ignore the sterile office space where the sausage-making playbook is written.
Two books—Ted Genoways’s The Chain: Farm, Factory, and the Fate of Our Food and Christopher Leonard’s The Meat Racket: The Secret Takeover of America’s Food Business—begin to fill this gap. Genoways, a contributing writer at Mother Jones (and former editor of this publication), and Leonard, an investigative reporter, offer respective portrayals of Hormel and Tyson Foods that show how the brutality of the abattoir reflects the sangfroid of the boardroom, where cuts of a more metaphorical sort enhance the wealth of salaried executives at the expense of disposable wage workers.
My son, 13, has started a small business selling some of the photos he’s taken. He decided to do this after having unexpected success selling prints at an Austin arts fair. Feel free to check out his website and share it with others who might be interested. Most importantly, enjoy.