A central chapter in the “man was meant to eat meat” narrative insists that animal domestication reflected the natural human quest for flesh. That is to say, the biological impulse to eat animals was so powerful that it led humans to isolate chosen members of a wild species, coax them into genetic tractability, and then exploit them for food. On the surface, this claim seems sensible enough—if not beyond question.
But there’s a much more interesting (and historically accurate) way of thinking about the origins of animal domestication. In his excellent book Hunters, Herders, and Hamburgers, Richard Bulliet argues that animal domestication was almost certainly not a conscious strategy driven by an explicit desire to eat penned or pastured animals. Eating domesticated animals, according to Bulliet, was likely an afterthought, an unintended consequence of a lurching process that happened so gradually, and over so many generations, that humans didn’t even know it was taking place. “It is unimaginable,” he writes, “that the humans who ultimately reaped the benefits of domestication had any clear recollection of how their domestic stock originated.”
This line of investigation is necessarily speculative, but Bulliet keeps it real with thrilling hypotheses and convincing results. Painstakingly, he makes the case that animals might very well have been passively domesticated and maintained in an increasingly tractable state in order to control trash (pigs), play roles in rituals (cows), provide amulets (bull’s penis as “a sign of power”), serve as status symbols, pull/carry things (horses), protect humans (wolves/dogs), and even provide immediate aesthetic gratification (birds). Nothing in his analysis prevents us from rightfully thinking that humans may even have wanted animals closer to them because we were curious, intrigued, and even overwhelmed by their beauty. All these motivations likely interacted and overlapped, all the while preceding the decision to domesticate animals for the primary purpose of eating them.
I think this is a truly important possibility to consider. Complicating the conventional domestication hypothesis is critical to countering the essentialist nature of the dominant carnivorous narrative, one that fails to question the primacy and centrality of meat consumption in human history. The whole debate about “were humans meant to eat meat” quite simply bores me. It bores me because it doesn’t matter what we were meant to eat. We eat it—and that is that. But what is relevant is the fact that today we control billions of animals to consume and this behavior seems perfectly normal—if not worthy of celebration—to most people, even people who think about these sorts of issues. But it may not be “normal” in any true sense of the word.
Humans have been around in our current anatomical form for about 200,000 years. It is only in the last 6,000 or so that we have started to systematically consume the flesh of domesticated beasts. The fact that we have only been doing so for about 3 percent of human history should be enough to give us pause about the place of this behavior in the human condition. The possibility that we only came to it as a secondary or tertiary endeavor should convince us to stop elevating the act of eating animals to the status of sleeping and breathing.
Here’s a nugget of advice for writers covering stories about the largely hidden emotional lives of animals: as you document nonhuman sentience, don’t mention how delectable the animals are to eat. That’s bad form. It’s like writing about war and cracking jokes, or covering a house fire and joshing about all those zany pyromaniacs.
In a way, it’s remarkable that one has to even note such an obvious point of writerly etiquette. But when it comes to journalism and animals, there are no codified rules, no standards that journalists need follow. So, when tasked with writing about a serious discovery bearing on animal cognition, journalists too often resort to inane attempts at cute humor in an effort to make the piece “entertaining.” This is especially the case when the topic is technical in nature.
But for anyone who knows anything about animal ethics, it’s not entertaining. It’s offensive. A recent article at Smithsonian.com reiterates why. The writer, a freelancer and Smithsonian contributor named Rachel Nuwer, opened with the news that crawfish—invertebrates—turn out to experience anxiety. That’s cool, and important. The author rightly notes that the conventional wisdom was once that only vertebrates worried. She suggests that the kind of anxiety under discussion is the kind that humans experience. In any other realm, this kind of connection would warrant a tone of gravitas, especially given the seriousness with which the scientists undertook their work (described quite well by Nuwer).
But animals don’t get the gravitas treatment. Nuwer, after reporting the critical kernel of news, somehow feels compelled to pepper her report with fluffy and whimsical asides, as if she were writing for fifth graders. She refers to “those delectable freshwater crustaceans,” which is a ridiculous thing to say about an animal whose sophisticated intelligence you’re reporting on. (Plus, it’s subjective. When I ate animals I found crawfish disgusting to eat.) Dumbing down the matter to an unprecedented degree, the author next includes a recipe for cooking crawfish, noting that “those [crawfish] that come with a boiling cauldron of Cajun spices, corn and potatoes (mmmm delicious)” will have undergone especially high levels of anxiety. Well, yeah.
Articles in which the writer clearly knows nothing about animal ethics typically include an unintentional contradiction—done by way of evasion—regarding the moral implications of the scientific discovery being described. Nuwer scores big on this front. She ignores several hundred years of ethical thinking about animals when she blithely assumes that human emotions are “more sophisticated.” She writes, “Crawfish, the team thinks, could serve as excellent study subjects for future anxiety research, as well as for exploring the evolutionary origins of more sophisticated (read: more distressing) forms of anxiety that occur in humans.” More sophisticated? How? What do we mean by sophistication? Has this writer heard the word “speciesist”? Comments such as these are understandable, given the peripheral nature of so much work being done on animal ethics and behavior. But they scream for a corrective.
As proof that the author has no idea of her own complicity in fostering attitudes inimical to the findings she writes about, Nuwer concludes, “Unfortunately for the crustaceans, crawfish’s status as invertebrates means that many of the ethical protections their rodent counterparts enjoy are not extended to them.” With articles like this one, it’s not hard to see why.
Update: Please do not post comments personally lambasting the writer mentioned in this post. The Pitchfork is better than that! The point of this piece is to educate, not to insult. Calling the writer names will hardly initiate a change in her perspective. Thank you. -jm
“When it comes to restoring grasslands, ecologists may have another way to evaluate their progress — ants.” So begins Science Daily‘s recently featured research on the ecological impact of ants. Maybe the organizers of Slow Meat 2014—dedicated as they all are to restoring grasslands—should have invited the great myrmecologist E. O. Wilson to discuss pasture restoration rather than Allan Savory, who wants to stack global deserts cheek by jowl with cattle in order to make the dry lands bloom. As the lead researcher involved in the ant study, Laura Winkler, said, the impact of ants—who aerate the soil, protect plants, and attract wildlife—is “like having dairy cattle.” And, if we are carnivorously intent on taking a pound or two of flesh from the pasture, ants don’t have to go to the slaughterhouse. Plus, they do better in a drought. Read more about it here.
(Thanks to Mary Finelli for the tip.)
Next week Slow Food USA will host an event called Slow Meat 2014. Allan Savory, the current guru of rotational grazing, will deliver the keynote address. Obscuring the ethics of slaughter behind culinary rhetoric, the event—among other stunts—will “honor” the American bison (the meeting is in Colorado) with an “artistic, narrated breakdown.” Which basically means Slow Foodies will slaughter the bison, butcher him, and discuss their actions with high-minded intentionality. They will not rush.
Ellen Kennelly (a frequent participant at the Pitchfork) and I recently discussed the importance of getting ahead of the media on these issues by attempting to preempt predictably uncritical coverage. Any reporter covering this event, for example, should understand that Allan Savory’s colleagues have seriously questioned his research. They should also know that there are ethical implications to killing a sentient animal for the purposes of entertainment and culinary indulgence. Fast food or slow, these issues should be addressed, or at least fall on the media radar. In an important respect, there’s a reason that thousands of people will gather to witness the slaughter of a bison and not question the act: a lack of knowledge.
To that end, Ellen—who is one of those people who is constantly engaging the public on animal issues in the most tactful and effective manner—wrote the following letter to her acquaintances in the Slow Food club. It’s an invitation to discuss the issues that concern animal advocates. Not fight over. Discuss. She also outlined for me the kind of information we should seek to present to those who attend and write about this event. I think it’s all very smart.
The meeting will happen. Slow Food will go on. The bison will die and be eaten. But that doesn’t mean our outrage can’t exist more publicly, in the mainstream media, rather than merely seething in the confines of our little blogs.
I trust you will share this information far and wide.
I’ve been doing some historical research lately. One of the rewards that comes from investigating the details of 18th-century agriculture is that an unexpected discovery can cast doubt on common assumptions about the way agriculture works today (or is supposed to work). One of the more tenacious beliefs common in contemporary agriculture is the idea that the best way to keep soil healthy is to graze animals on it. Defenders of rotational grazing insist they’re farming the way nature intended us to farm, and the way farmers have been doing it for centuries. It was thus more than a little gratifying to stumble upon the following account, published in Maryland in March of 1789, by a sheep farmer, who was not so convinced of this received wisdom:
So far as dung improves soil, it ought to be allowed for; and this is for all dung applied from winter littering or summer folding; but how far, if at all, it is to be prized, when slowly dropt about in pasturing, is a question. Beasts constantly ramming the soil of a pasture into a close compact state, hurt it more than is commonly apprehended. That the foot of the beast does more damage to soil than his dung so dispersed and exposed to exhalation does good, is probable from several instances related by serious good people of clover fields having been divided, and the one half pastured on, all the summer, the other mown twice and both sown at the same time with wheat on one plowing, when the mown gave considerably the best crops of wheat. Let us suppose a lay of grass has been left unpastured, and even uncut, for three years; another like field at the same time is pastured close as is usual during the same three years; now let the farmer walk into these and observe how mellow, light, and lively the one is,—how firm the other. Which of these will he prefer for a crop of grain? . . . It then may be suspected that pasturing doth not improve the soil; that on the whole it even injures it.
It really makes you wonder in what other ways we’ve twisted the agrarian past to fulfill today’s utopian visions, or at least what sources we’ve based our contemporary ideas on.
The long duration of human history creates a slow burn effect on repetitive human behavior, habituating our thoughts and actions in ways we easily underestimate or forget. When I recently highlighted one bittersweet manifestation of this slow burn—the hard won and long-tested omnivorous knowledge about what was or wasn’t safe to eat—several readers countered that the weight of the past was in fact easily shucked off because, to paraphrase, “I did it with no problem.” But here’s the thing: when you talk about the long sweep of human history, you don’t matter. We’re talking large patterns, not small blips in time, such as your existence.
Other readers didn’t necessarily contend with my argument so much as wonder why I would offer ammunition to “the carnivores.” I need to be clear on this: I don’t think that way. I don’t see animal advocacy in such dichotomized terms. It’s not a zero-sum game, one in which information is deployed to save souls from carnivorous damnation. Instead, I think in terms of broader trends, trends that shape human mentalities and moralities, integrating ethics into culture in subtle and effective ways. I’m all for protesting a Chipotle or marching in the streets for animal liberation. But I see the impact of those actions in the framework of how they shape broader cultural mentalities. If you write about the love of your pet, that’s lovely. But I’m only concerned with how it shapes our transcendent understanding of the human-animal relationship.
This perspective can lead to some counterintuitive ideas. For example, I’m much less concerned with whether or not an individual is vegan than with how the ideological substrate that supports basic human behavior is shifting. So, the reason why I bring up issues such as our million years of inherited meat-eating choices is that they comprise the substrate that I want to see changed. We need to grasp that reality before we work to change it. It is also for this reason that, when my veganism inadvertently slips, I don’t lose a moment to stress. Just as it’s not about you, it’s not about me. If I hid in a closet and choked down a burger, it wouldn’t matter in the least.
Change will take time. Not another million years—revolutions in communications have changed the game. But we’re unlikely to see systemic changes with respect to eating animals in the course of our lifetimes (or at least mine). I make this claim not to extinguish our activist fire, but to acknowledge how deeply the act of eating animals has shaped human identity, one that’s as pre-programmed as it is adaptable. Rage on, people. But know the depth of the history we confront.
My son and I just finished the final episode of Masterchef Junior. It’s a compelling show. I’d even say that—the obvious detriment notwithstanding—it’s a great show. Kids aged 8-13 draw upon a wealth of hidden talent and experience to cook meals that could be featured in any top restaurant in the country. The show offers an important reminder that, when parents back off a bit and let kids loose, they can force you to redefine “age appropriate” behavior. Inspired, my son has practically commandeered the kitchen and gone to work. I’m now watching him make a vegan tempura batter. Yum.
Now, to that obvious detriment. The recipes on Masterchef Junior would make a carnivore quiver like a tuning fork. When the judges taste the superlative results emanating from the kids’ kitchen, their reaction is visceral. They lurch forward, brows furrowed, eyes rolling back into their heads. “I want to give you a hug,” one of them might say. Or, “this meal could be a restaurant’s signature dish.” The kids beam.
The suffering behind it all is the furthest thing from anyone’s mind. The failure to connect the animal products to the animals was especially clear when the kid who ended up winning the competition—aged 13—cooked a meal that the judges said was the best they’d seen and tasted in four years. It was a veal chop. With that victory, the essence of our culture’s disconnection from the food we eat was affirmed. My son and I could not help but note that the winner won because he cooked a baby cow.
Vegans are increasingly getting involved in all sorts of video projects, many of which are necessarily dark and depressing. A cooking show that featured kids cooking first rate vegan meals, being judged by top vegan chefs, could be a great way to raise awareness, celebrate kids’ interest in plant-based cuisine, and demonstrate that a vegan, too, can melt with pleasure when tasting finely prepared food.
There are few ambitions more audacious than trying to convince someone to change his diet on ethical grounds. Eating is, at its core, an act of personal intimacy and nobody really wants to be instructed on how to be intimate. We like to think we’ve got that one figured out.
This point is one that a lot of animal advocates forget when we present our case with what seems like airtight moral logic, only to then be ignored or scorned by folk who are perfectly happy with their BLT and BBQ, thank you very much.
It drives animal advocates a bit nuts. When we congregate we will often say, somewhat plaintively, “when will people realize what they’re doing to animals? When will change happen?!” Brows furrow. Heads shake in frustration.
The problem with the question, of course, is that it rests on the flawed assumption that humans respond to moral logic with appropriate behavioral change. Not only is this wrong, and not only is the moral logic we espouse rarely as persuasive as we think it is, but the truly daunting fact of the matter is that, in politely asking humanity to stop eating animals, we overlook how eating animals is more than a cultural choice. It’s a biological act rooted in our deepest eco-evolutionary past.
Now, that doesn’t make it right (as readers know I’m well aware). But let’s take this seriously: a couple million years of hunting and gathering, not to mention the adaptation of the human brain to life on the African savannah, makes the carnivorous diet a pretty freakin’ tenacious habit.
Three aspects of our evolutionary legacy strike me as particularly relevant to the claim that, when it comes to eating animals, the past has a commanding and assuredly long-term hold on the present. In this post, I’ll note just one.
It has to do with the experimental backstory to the human carnivore’s diet. As a species, hominids did not burst on the scene and start eating a standard diet appropriate to their needs.
The standard diet appropriate for hominids (and homo sapiens) in the pre-agricultural era was forged through endless and terrifying trial and error. Plants and animals were tested, accepted, rejected. They were savored, flavored, regurgitated. People routinely died because of poor dietary choices. Others thrived.
In other words, an essential part of becoming and being human involved testing the natural world to see what would keep us in the game. A tremendous amount of evolutionary energy was invested in this process, one that, by virtue of hundreds of thousands of years of accumulated assessments of what “works,” made eating at least some meat central to being human.
That fact might run counter to contemporary animal ethics, but it’s a heavy anchor to pull up all at once.
Here is what an orca whale eats, according to the National Oceanic and Atmospheric Administration: “a wide range of prey, including fish, seals, and big whales such as blue whales.” They also consume herring, cod, squid, and octopus. They are actually the largest known marine mammals to kill and eat other mammals, consuming 375-500 pounds of food a day.
In no way bound to the ethical standards of humans, they are, nonetheless, massive destroyers of sentient life. They have to be in order to live. But they don’t, as it turns out, eat humans. Not because they take pity on us. But because we’re too bony and don’t smell right. Plus, they never see us in the wild. We generally don’t swim in their waters.
I mention these details not as a lead-in into yet another story on SeaWorld, but as an attempt to make sense of Jeffrey Moussaieff Masson’s truly bizarre recent claim that, “I would rather have been born an orca.” Evidently he’s serious. “No kidding,” he writes, “I really would.”
Why would he want to be an orca? Humans, so quick to pull the trigger on each other, have dismayed Masson so thoroughly—we killed 200 million of our own in the 20th century—that he wants to join the orca clan because orcas have “killed exactly zero of their kind.” In this respect—the fact that they spare their own—he adores their “gentle lives.” Intended to be a plea for compassion, Masson’s gambit is really an expression of self-preservation and moral exoneration.
First the self-preservation. Masson’s main problem with humans, at least as he articulates it in the article, is that we kill other humans. This intra-species violence is why he wants to jump ship from humanity and join the “gentle” orca community, a species that shreds to death some of the smartest creatures alive and eats their children. Note that Masson doesn’t condemn humans for killing other species, as orcas do (and he would get to do as an orca), but only for killing each other. So the claim, although hardly his intention, ultimately suggests that Masson wants to leave the human world and join the orcas because he’d be safer and maybe live longer.
Being an orca would also let Masson off the ethical hook, allowing him to become a guiltless shredder of ocean animals while garnering respect from Masson-types as peace-lovers for not killing their own (or humans). In his tirade against his own, Masson writes, “There may be no orca heroes, but nor are there orca psychopaths . . .” What Masson fails to acknowledge is that, because there are no orca psychopaths, there are also no orca animal rights activists. There are no orcas who will stand up and say, “stop the slaughter of our octopi brethren!”
So, if Masson were an orca he would a) destroy blue whales while b) bearing none of the ethical responsibility for doing so. As a human, though, he admirably has assumed the burden of responsibility (writing wonderful books on animals), a burden that he would, if his wish came true, toss off as casually as he did this silly article.
If you care about honeybees, you probably know about colony collapse disorder (CCD). The disappearance of the world’s greatest living pollinators evokes an especially uneasy kind of ecological discomfort. After all, honeybee pollination brings us much of our food.
It would therefore seem especially critical—if only in a self-interested way—to understand the causes of honeybee collapse. And quickly. A wide range and combination of circumstances have been proposed over the years as factors contributing to the disorder. So diverse are the causal possibilities that the complexity of the problem has become legendary among entomologists worldwide. It’s therefore not at all surprising that what’s missing from the CCD debate is a smoking gun. THE answer.
But if you read Mother Jones (May 23), you’d be forgiven for thinking that the ever elusive smoking gun was, at long last, discovered. “Did Scientists Just Solve the Bee Collapse Mystery?” ran the headline. It’s a thoroughly Mother Jones-ish tactic. Strongly suggest a clear answer to a multifaceted problem—that smoking gun—while staying aware that the issue is really a lot more complex than the article will make it seem. Then add a question mark to cover everyone’s ass while still allowing the reader to feel the satisfaction of a clear and singular answer, not to mention righteous outrage at the dastardly menaces behind this ecological tragedy.
This is good guys/bad guys journalism.
The MJ article breezily relies on a single study, one that happens to have a Harvard imprimatur on it (along with that of a beekeeping association), to argue that the “key driver” of colony collapse disorder has once and for all been identified: a class of pesticides known as neonicotinoids. The study identifying neonicotinoids as the cause of CCD is praised by MJ for its clarity (“The experiment couldn’t have been simpler”), its brilliance (“What makes the new Harvard study remarkable . . .”), and its conclusiveness (“the CCD mystery has been solved”). The author of the MJ article, Tom Philpott, effectively blames CCD on Bayer, the manufacturer of the pesticides in question.
But then, as always happens with MJ’s coverage of agriculture—coverage driven first and foremost by an inveterate hatred of industrial structures (Bayer in this case)—the other shoe drops with a thud. It happens a lot at MJ. I’ve noted as much in the past with respect to GMOs and an eventually retracted French study on rats. In the CCD case, the backlash against the “Harvard study” fingering neonicotinoids was unusually swift. The more you learn about the study used to play the role of the smoking gun, the harder it is to believe that it was given so much weight in a major magazine to explain one of the most mysterious ecological phenomena on earth.
The best critique of the “Harvard study” that MJ placed on a pedestal is here. Suspicions begin with the journal in which the paper was published—an obscure Italian publication called the Bulletin of Insectology. Critics note that the study’s author, Chensheng Lu, “has had trouble getting his work on honeybees past peer review in many US journals.” The study’s sample included only 18 honeybee colonies, all located in central Massachusetts. The researcher “had exposed his bees to an unrealistically high dose of pesticides,” a level that Bayer itself agrees would be lethal for bees. “This study is a total distraction,” said an entomologist at the University of Maryland. “It’s not surprising that those bees died — those doses weren’t field realistic. The only surprise was that the bees didn’t all die right away.”
But a distraction is what Philpott and MJ are all about. They will shamelessly deploy the flimsiest science to bash industrial agriculture. Which is sad because industrial agriculture is so thoroughly flawed on its own terms, and its flaws are so readily obvious, that nobody should have to rely on questionable science to expose those problems. I’m all for sticking it to Big Ag—which is why I advocate for veganism—but let’s not resort to deception to do it. Follow the money, for sure. But follow the science as well.