Is a vegan muffin a form of animal activism?
I ask this question because it seems that every other tweet that enters my cultivated Twittersphere is a celebratory shout-out for some new vegan food product. Vegan donuts! Vegan cookies! Vegan bean burgers at Wendy’s!!
How to interpret these products? Of course, they offer vegans more commercial options and, as we conventionally understand matters, having more commercial options is a good thing. Likewise, there’s always the possibility that a non-vegan will see the vegan option and think, “You know, I’ll have the more humane pancake today.”
But the idea that a deeper respect for animals will emerge from greater consumer choice seems like a flimsy prospect at best. I mean, the vast majority of natural food that currently exists is already vegan. Shouldn’t we be promoting the consumption of apples and carrots rather than vegan versions of crap food? Plus, isn’t there a danger here? What if your vegan muffin tastes like a hockey puck? You could be affirming more non-veganism than veganism.
That said, I’m thrilled by genuine vegan substitutes—rather than supplements—that enter the food system closer to the point of production. Think about what Josh Tetrick is doing at Hampton Creek. Rather than add a vegan product to a shelf already sagging with non-vegan options, he’s aiming (in part) to enter the ingredient stream at an earlier stage, ensuring that what makes it to the shelves does so because of cheaper vegan substitutes (pea plant “eggs” rather than real egg whites).
Vegans are so besieged by the carnivorous reality that suffuses daily life that we tend to overplay the meaning of vegan versions of products that, by their nature, are sort of inextricably linked with animal products or, again, are just crap. Honestly, a vegan muffin is, in the grand scheme of things, just another worthless pile of calories.
Don’t get me wrong. A mock tuna sandwich is a delightful thing when done right. And I love that I can choose a mock tuna sandwich under certain circumstances. But it’s still a mock, an approximation of what’s “real.” It’s fun to think we can co-opt authenticity, raise our vegan muffins skyward, and call them, simply, muffins (veganism implied). But come on.
Why not just go with the apple?
Yesterday a reporter asked me why so many athletes who went vegan found themselves feeling weak and sick. It’s a narrative that, as a runner, I hear a lot. My first reaction, and I think the most sensible one, is to explain that many vegans simply do it wrong. They replace calories once obtained from animal products with processed junk food rather than nutrient-dense plants. And they feel like shit.
I tell them about my vegan friend Yetik, with whom I’m currently training for a 50-mile race that we’re doing at the end of September. At 80+ miles a week in hot, humid conditions, our physical and nutritional needs are especially intense. I’m adding a lot of legumes, peanut butter, root veggies, and quinoa to my diet, and feeling great. Yetik, though, was flagging for several weeks. But when I suggested he add more B-12 and quinoa, he did, and he had a noticeable rebound and is feeling strong. Fact is, he’s a running demon.
But still, there are cases in which athletes eat a smart vegan diet and still feel that something is missing, that some level of energy has been lost. And it is also very likely true that a piece of lean meat or a bowl of yogurt would ameliorate the situation for that runner, even if the effect were more placebo than real. In these situations, I find myself less able to offer advice that will realistically be accepted.
Going vegan is a wonderfully pragmatic way to respond to the myriad ecological and ethical problems endemic to the American way of eating. Do it. But it’s also a radically counter-cultural thing to do. Those who make the transition and see the benefits, as I have, are far more likely to embrace and stick with veganism than those who are asked not only to make a socially ostracizing, counter-cultural shift but, at the same time, to suffer a physical consequence, however seemingly minor, as a result.
This scenario raises many interesting questions. To what extent is an individual obligated to sacrifice a personal sense of physical health in order to stick to the moral ideals of veganism? Is there a point at which an individual’s sense of physical well-being becomes so compromised that the morality of eating meat changes, such that eating a piece of lean salmon once a week becomes more justifiable than it would be for a non-compromised person?
I have no answers, but I’m in an inquisitive mood as I contemplate a return to daily blogging here at the Pitchfork. Looking forward to your thoughts.
Every diet is an aspiration to an ideal. Consequently, every diet is easy to criticize. Vegans aspire to avoid harming animals, but critics note that plant crops require the mass extermination of innumerable wild critters. Weight Watchers aspires to reduce body mass index with a calories-in/calories-out approach, but critics note that not all calories are equal. The macrobiotic diet aspires to balance the yin with the yang; critics note that they have absolutely no earthly idea what this might mean.
If every diet is open to criticism, the paleo diet—also called the “caveman diet”—is in a league of its own. The dietary practices of the Paleolithic period centered on hunted-and-gathered meat, seafood, fruits, nuts, seeds, and vegetables. It excluded grains, legumes, dairy, and refined sugar. Paleo advocates argue that cavemen thrived on these foods, growing tall and avoiding the lifestyle diseases that plague “the moderns,” as some paleos prefer to call the rest of us. But critics deem the quest to replicate the pre-agrarian diet not only delusional—primarily because equivalent foods no longer exist—but also ignorant of human evolution. Recently, as a sort of nail in the coffin of the diet’s besieged reputation, a much-anticipated book on raising paleo babies was pulled at the last minute for lack of scientific evidence.
I’ve been critical of the paleo diet in the past, primarily because of its heavy reliance on meat consumption. But recently I wondered: What would happen if I examined the diet differently? That is, what if I examined its aspirations rather than its failure to achieve an ideal? What if I watched the diet at work in the hands of a master, a true believer, a genuine beneficiary of what’s too easy to dismiss as a fad?
To explore these questions I shelved my presuppositions and went to central Maine to visit Arthur Haines. Haines is an ethno-botanist and paleo advocate who runs the Delta Institute of Natural History, a program that organizes workshops on “neoaboriginal lifeways.” In an attempt to reach “everyone seeking an alternative to the current paradigm of living,” he instructs students on how to eat an aboriginal diet, focusing on trapping, foraging, and hunting skills, as well as wild medicinal cures and the finer points of ancestral child rearing. For what it’s worth, Haines, who has developed a loyal YouTube following, is as sturdy as an ox, healthy as a horse, and has a gentle, understated presence.
But there’s nothing gentle or understated about what he eats for lunch. On the occasion of my visit, it’s a heap of pre-agrarian grub. Haines piles his plate with wild rice he recently harvested, gravy made from reduced bone broth, and venison shot and processed last autumn (before being canned for preservation). He leans over the table and eats with urgency. He scoops out seconds while his partner, Nicole Leavitt, and their 18-month-old daughter, Samara, work more deliberately through their first servings. Samara eats exactly what her parents eat—she always has (her parents chewed her meat for her before she teethed). Just as I’m wondering to myself how Arthur and Nicole make it through Maine winters, in relative isolation, without so much as a warming drop of alcohol, Nicole plunks down a bottle of homemade mead on the table. Mead is a fermented honey drink that tastes something like Riesling. It is thus with a stomach full of wild rice and a head buzzing with mead that I finally see what I came to see: Haines in action.
Read more here.
What follows are some thoughts I’m guessing most Americans will not be celebrating as they fire up the grill to celebrate the 4th of July. The whole thing is a trigger warning to your holiday happiness. –jm
If you’re occasionally confounded by the persistence of American optimism in the midst of ongoing socioeconomic despair, it helps to revisit the driving themes of American history during the nation’s infancy: slavery, republican ideology, Manifest Destiny. As a historian, this is what I do to make sense of the many contradictions at the core of American life.
It’s also, as a professor, what I teach.
Working together, these facets of the American experience fueled national development and generated relative prosperity for white men willing to pull up stakes and Go West. They also made possible the wildly quixotic idea of an America “for the people,” a bold conceit that whiggish historians insist—events such as Ferguson, Missouri notwithstanding—we’re getting closer and closer and closer to achieving. Proof that Americans—and Americans alone—have swallowed the pill of historical optimism comes from a recent Pew study showing Americans to have the most positive outlook on life. Even as other wealthy nations grow increasingly depressed in the face of global events, Americans, well, we just keep on shining.
What accounts for our sunny disposition? In a word: delusion. There’s really no other way to explain it. It’s at our core. What the dominant narrative of American history routinely fails to note is that each of the defining phenomena (slavery, republicanism, and Manifest Destiny) emerged from self-serving and carefully crafted delusions—delusions sown in the colonial period and delusions that bloomed like a field of dandelions after the American Revolution to perpetuate the fiction that the pursuit of happiness was integral to a concept that today seems more rhetorically relevant than ever: “American exceptionalism.”
Systematic self-deception began with Native Americans and private property. English settlers such as John Winthrop fully understood that the Native American conception of property—based on what Jefferson would later call “usufruct rights”—ran counter to the English conception of property (based on “fee-simple” ownership). Rather than acknowledge this difference, English settlers (with the exception of, say, Roger Williams) exploited it. They acted as if Native Americans lacked requisite long-term interest in the land they farmed and hunted and, based on this assessment, acquired that land through twisted and cynical legal fiat. Eventually, as the stereotype of native savagery became established in the white American mind, the foundation solidified for Andrew Jackson’s extermination project (1811-1836), a historically underplayed event that cleared space for an especially crazed delusion of Manifest Destiny and the concomitant notion that God personally chose white Americans to settle the west.
Expansion required slavery, and slavery was an even more insidious form of the delusional thinking rotting the core of America’s founding. Diaries of slave owners (the diary of Virginia’s Landon Carter is a remarkable example) repeatedly confirmed the obvious reality of slave personhood. Slaves and masters interacted routinely as human beings mutually engaged in the project of plantation development and export trade (not to mention abuse, trade, and sex). But these very same white men were the ones who crafted constitutional compromises (three-fifths being the most notable) that explicitly belied the social history of white-black relationships as they played out on the ground. American “freedom” was, as the historian Edmund Morgan has argued, impossible without American slavery. This country—which was twenty-five-percent slave-based at its inception—has yet to confront the deceptive habit of mind that shackled these two realities—freedom and slavery—so tightly that we’re still trying to pry off the irons today.
Finally, there was the ideological appeal of republicanism. This imported way of thinking, coming from the motherland in the 1720s, resonated throughout the colonies as both a bulwark against corruption and an affirmation of natural English rights, representative government, and independence. Few espoused the virtues of independence—and translated them into revolutionary action—more triumphantly than the tobacco growers of Virginia. Tobacco was deeply intertwined with the republican project being forged by the founders.
But economic reality turned both ideology and tobacco into smoke. History books rarely note that those who most passionately espoused the virtues of independence were truly enervated men, enslaved by debt to English lenders. When Americans were most in thrall to the idea of liberty, they were also at their most vulnerable and dependent. What enabled them to ignore the contradiction was nothing short of delusion—a self-serving one, given that independence would temporarily release them from economic bondage, authorize them to sell slaves down river, and fund the deep southern transition to cotton, empowering a contradiction so deep that it would take a war to resolve it.
American life today is infused with this legacy of deception. It has become an unthinking habit obscured in layers of upbeat historical narration. The Occupy Movement is one example of a momentary blip of consciousness, a fierce little awakening, that spurred us to wonder: Why do so many Americans choose groundless optimism over justifiable rebellion against an elite that gives new meaning to the term “super-rich” every day? Why do people still believe they can hit rock bottom, work three jobs, and thrive when, in fact, they sink deeper into debt? Why do we fail to see the oppression of immigrants as a modern-day version of slavery?
The answers to these questions (and others) lie in our history of delusion, a history that thrives on the perpetuation of a fiction, a fiction so intoxicating that we structure our lives all too successfully to avoid confronting it. Our optimism is, in this sense, our addiction.
From the look of things, you’d be forgiven for thinking that a revolution in food production was underway. Calls for local, sustainable, slow, humane, organic, non-genetically modified, fair-wage, “real” food are not only ubiquitous, they’ve inspired a farm-to-table movement that seeks to end industrialized agriculture, empower small farmers, and replace Walmart with farmers markets. Hundreds if not thousands of books, articles, foundations, academic conferences, and documentaries have joined the cause, rallying around the idea that industrial agriculture should—and can—and will be stopped.
These efforts have spawned a unique public discourse, one relentlessly reiterating the message that industrial agriculture wreaks ecological havoc, endangers human health, and exploits workers in order to produce food that’s overly processed, overly cheap, and overly globalized. Given the intensity of this culinary zeitgeist (not to mention the fact that it gets very little critical inquiry from an adoring media), there’s every reason to think that food-reform-minded Americans, voting with their forks, are finally changing how Americans eat.
It is always difficult to get beyond the rhetoric and quantify such trends, but one assumption seems safe: If the movement were working, factory farms would be in decline. But, as a report just released by Food and Water Watch reveals, the exact opposite is happening. While muckrakers have been exposing every hint of corruption in corporate agriculture, and while reformers have been busy creating programs to combat industrial agriculture with localized, “real food” alternatives, factory farms have been proliferating like superweeds in a field of Monsanto corn.
Last February the Waste and Resources Action Program—a British anti-waste organization—reported that one-third of the food produced globally is never eaten. That’s roughly two billion metric tons of edible waste that ends up in landfills, where it emits about seven percent of the world’s total greenhouse gas emissions. Environmentalists have taken notice and the problem—food waste—is now a serious environmental concern. “If more and more people recognize their own food waste,” writes Jonathan Bloom, author of American Wasteland, “we can take a bite out of this problem.”
As Bloom suggests, reformers have largely placed responsibility for reducing food waste in the hands of consumers. More often than not we are rightfully admonished to eat the whole metaphorical hog as an act of ecological redemption.
“Leftovers can be turned into completely different meals,” writes one reader of Bloom’s blog, Wasted Food. “To better utilize food,” writes another, “use the whole animal” (the reader suggests bone broth). “STIR-FRY,” says a third, who claims to “live in a forest.” Throw in the prevailing advocacy for eating “ugly fruit” and the intrepid dumpster divers and you have a landscape of waste warriors crusading to achieve meaningful reform by piling our plates with food items we’d normally toss (or have already tossed).
The most conspicuous example of this eat-the-leftovers approach to reducing food waste recently came from celebrity chef Dan Barber. For a stretch of time in March, Barber cleared out his famous Blue Hill restaurant and replaced it with a “pop-up” creation—called WastED—that served food scraps salvaged from commercial kitchens. For $85 a meal, patrons could sample an array of dishes cooked with recycled culinary debris, including pickle butts, carrot tops, offal, and skate-wing cartilage. Exchanging spare ribs for kale ribs, diners were able to experience a culinary novelty while doing a good deed for the environment. It was the American way of reform epitomized: Fix the problem by buying something that makes you happy.
Read more here.
“We had no idea that we were going to see what we saw.” These are never words you want to hear about a slaughterhouse. But they’re exactly what Adam Wilson, the director of investigations at Last Chance for Animals, a Los Angeles-based animal advocacy group, said about his organization’s recent investigation of Pel-Freez, the nation’s largest rabbit processing plant, located in Rogers, Arkansas.
The details, obtained by an undercover agent who worked at Pel-Freez as a “blood catcher” for six weeks last fall, are, even by abattoir standards, morbid. Slaughterhouse workers were filmed improperly stunning rabbits by whacking them in the face with the dull side of a knife (electrical stunning is the norm); they broke the legs of conscious rabbits to better fit them onto J-hooks designed for poultry; they decapitated fully conscious rabbits; and they ignored grievous rabbit injuries. Wilson noted how, in one instance, a worker encountered an abscessed wound on a rabbit so filled with pus that he retched.
When the media first started covering the California drought it did so from the perspective of the specific foods we eat. Given that 80 percent of the state’s water is used for agriculture, this would seem to make sense. Mother Jones crusaded against the water-hogging impact of nuts, especially almonds. Michael Pollan, seizing on an illuminating Los Angeles Times infographic, took to Twitter and declared California lentils verboten. I highlighted the disproportionate share of the state’s water consumed by beef and dairy, specifically the alfalfa crop that helps sustain these industries.
The obvious benefit of this approach is that it empowers consumers. As a consumer, I feel good about not eating beef and a little guilty about the almond milk in my fridge. I feel compelled to purchase lentils from France but comforted by the fact that beer has a relatively low water footprint. I agree that much of the produce grown in the Central and Imperial Valleys should be grown in the Midwest, but until that happens (don’t hold your breath), I’m motivated to make concrete choices that address California’s water crisis. Hard data about specific foods helps me do this.
An aggressive strain of avian flu—the largest to appear in the United States in over 30 years—has forced Midwestern chicken and turkey producers to cull over 15.1 million birds since early March. Most of these losses have occurred since mid-April. The virus, which doesn’t appear to pose an immediate threat to humans, has spread to 10 states. Iowa and Minnesota have been hit the hardest.
The economic impact of the virus—called H5N2—has been severe. Mexico and China have halted the importation of U.S. birds and eggs. Hormel Foods Corp., the nation’s second largest supplier of turkey meat, highlights “significant challenges” as it forecasts lower earnings. Contract growers, who have little recourse under such circumstances, are stuck with mortgaged farms and no income. At a meeting in Minnesota some of these growers broke down in tears. “Are we done?” Iowa’s Secretary of Agriculture Bill Northey asked about the flu. The answer, it seems, is no. Not even close.
How should consumers interpret this situation? The conventional critique of such epidemics is that they result from industrial over-crowding—cramming too many birds into a tight space. GRAIN, a non-profit organization dedicated to sustainable agriculture, articulated this position during the 2006 H5N1 outbreak. The virus, it contended, was “essentially a problem of industrial poultry practices.” The proper response, it implied, was obvious: a transition to non-industrial, pasture-based management. Commenting on the current outbreak, Wayne Pacelle, CEO of the Humane Society of the United States, agreed with this perspective, writing that “the root cause” of the bird flu is “inhumane, overcrowded conditions in the poultry industry.”
A direct, causal relationship between avian flu and industrial conditions would be fantastic news. Most notably, it would allow us to begin systematically fighting the disease through a surefire method: providing chickens and turkeys more space to roam. Unfortunately, the etiology of avian flu doesn’t support this connection. The problem of avian flu, it turns out, transcends farm size and stocking density and cuts right to the core of animal domestication per se.
Read more here.