Archive for the ‘Uncategorized’ Category
As an advocate for both books and therapy, I determined, upon first hearing the word “bibliotherapy,” that this might be my bespoke profession. I go to group therapy. I read a lot of novels. I’m constantly recommending novels to my group. Members struggling with various problems typically don’t count on me to empathize through personal experience. They count on me for book recommendations. Your adult son is an expat in Europe and is exploring his sexuality? See Caleb Crain’s Necessary Errors. You feel alienated from your wealthy family but drawn to nagging spiritual questions about existence? Walker Percy’s The Moviegoer is for you. Gutted by the loss of a loved one? You could do worse than James Agee’s A Death in the Family (Men’s therapy group, by the way).
Read more here.
On September 20th, the United Nations Secretary-General Ban Ki-moon presided over the U.N.’s first-ever general assembly meeting on the topic of “superbugs.” Superbugs are drug-resistant bacteria that kill over 700,000 people a year. They develop in response to one general cause: the overuse of antibiotics.
“If we fail to address this problem quickly and comprehensively,” Ban said, “antimicrobial resistance will make providing high-quality universal health-care coverage more difficult if not impossible.” The World Health Organization (the U.N.’s public-health division) urged medical professionals to radically reduce antibiotic use.
That’s sensible advice. The fewer antibiotics prescribed, the fewer the opportunities for superbugs to develop. But it shouldn’t obscure the fact that the leading driver of antibiotic resistance is not the treatment of gonorrhea or tuberculosis—both mentioned by the U.N.—but animal agriculture.
What follows is the intro to my recent piece in the Virginia Quarterly Review. Click the link below to go to the full article.
Dylan, a three-year-old yellow Lab, leaped from the van and made a beeline for Beth, a volunteer who was standing alone in her driveway, pressing a black blindfold to her eyes. This was their second meeting. Judging from Dylan’s demeanor (his tail wagged like a metronome), his final days as a guide-dog-in-training were happy ones. His trainer, Natalie Garza, introduced Dylan to my daughter and me, and although he bowed his head for a scratch, you could tell his heart wasn’t in it. Instead, he was focused on the task at hand: leading Beth, who otherwise has normal vision, on a test walk through a suburban neighborhood in Austin, Texas.
Guiding a sighted volunteer, Garza explained, required Dylan to work harder than usual. Specifically, he had to exaggerate his signals with Beth, signals that a visually impaired person accustomed to sightless navigation might not require. Dylan, who was soon to be matched with his first visually impaired guardian, couldn’t afford to cut corners today. “These dogs save people’s lives,” Beth reminded me as Garza lowered a harness over Dylan’s head and adjusted it for comfort. He had to be on his game.
Complete article here.
Like most people who vow to lose weight, Becca Reed—51, diabetic, confined to a wheelchair, taking nearly a dozen medications—had precise goals. Unlike most people, she’s willing to share them. On a piece of crinkled notebook paper she wrote in bubbly cursive script what she hoped her future would be like:
- Reach 225 pounds or less.
- Feel sexy and buy an outfit at a regular store.
- Have James look at me with that sparkle in his eye.
- Feel better able to clean the house.
- Walk five minutes straight.
- Bench press 100 pounds or more.
- Back fat and fat above butt—get rid of it.
- Strengthen arms—flabby upper part.
- Get off medication!
- Ride horses on Padre Beach with James.
- Be able to stand long enough to sing more than one song.
James Reed, Becca’s husband, is a baby-faced 57 who suffers from his own obesity-related problems, namely his high blood pressure, which was 250/190 before he was finally medicated for it. He, too, made a list of weight-loss goals and agreed to share it:
- Get below 250 pounds.
- Feel better about myself.
- Bench 400 pounds.
- Leg press 600 pounds.
- No blood pressure meds.
- Weigh 210–215 pounds.
- Make it to retirement.
- Don’t become diabetic.
I recently sat down with Becca, James, and their 26-year-old son Drew . . . . Read More
It probably wasn’t on your calendar. But between June 21 and June 30 the annual Lychee and Dog Meat Festival took place in Yulin, China. While declining in popularity, eating dog meat remains common enough throughout Asia. In Guangxi province, it’s more than common: It’s a delicacy considered central to the region’s culinary identity.
This dietary preference has spelled trouble for Asian canines. In years past, over 10,000 dogs (some strays, some farmed, some possibly stolen from pet owners) were slaughtered to sate the palates of festivalgoers eager to sample such fare as “crispy dog” and “dog hot pot.” The prevailing belief that the taste of dog meat improves when the animal is killed while in distress hasn’t helped the festival’s global appeal and, with reports of horrific slaughter accumulating, this year’s attendance numbers (as well as the number of dogs killed) have dropped. Still, for the diehard aficionados of dog meat, the festival remains an annual rager to be defended at all costs, on grounds both culinary and cultural.
“If the Yulin dog meat festival weren’t real, philosophers would have dreamt it up,” says Bob Fischer, a Texas State philosopher and author of The Moral Complexities of Eating Meat. Indeed, dog meat presents conscientious Westerners with a perfect conundrum. It pits an enlightened expectation of cultural tolerance (live and let live!) against our deep emotional attachment to dogs as companion animals, an attachment that makes eating them seem, at the least, morally repugnant.
Despite a 2015 report from the Centers for Disease Control and Prevention suggesting that childhood obesity was in decline, the numbers—when properly interpreted (and supplemented with more recent research)—confirm the opposite. As they have for decades, children between the ages of two and 19 are, in fact, becoming overweight or obese at a steadily increasing rate.
Today, 33.4 percent of kids are considered overweight, with 17.4 percent of them qualifying as obese (defined as having a body mass index [BMI] of 30 or more) or severely obese (a BMI over 40). To put these measurements in perspective, a healthy person who is 5’9″ and 150 pounds will have a BMI of around 22.
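For readers who want to check these figures themselves, BMI is just weight divided by height squared, with a standard conversion factor of 703 when using pounds and inches; here is a minimal sketch (the `bmi` helper is illustrative, not from the article):

```python
def bmi(weight_lb: float, height_in: float) -> float:
    """Body mass index from imperial units; 703 converts lb/in^2 to kg/m^2."""
    return 703 * weight_lb / height_in ** 2

# The benchmark from the text: 5'9" (69 inches) and 150 pounds.
print(round(bmi(150, 69), 1))  # 22.1 -- "around 22," as the text says
```

The same function confirms the cutoffs above: at 5’9″, obesity (BMI 30) begins around 203 pounds, and severe obesity (BMI 40) around 271 pounds.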
These numbers intersect with an especially compelling sociological observation: As childhood obesity becomes commonplace, parents are increasingly unable to recognize the condition in their children. Writing in Scientific American, Jane Ogden explained that “as populations get fatter, the new normal has become overweight and therefore invisible.”
If your brand has anything to do with food, the last place you want it to appear is the Food Poison Journal. But that’s exactly where Chipotle Mexican Grill recently found itself, prominently placed in a headline confirming its most recent E. coli outbreak.
Thirty-nine people in Washington and Oregon came down with E. coli O26 after eating at a Chipotle restaurant in late October. Twelve were hospitalized. The source of the outbreak has yet to be verified, but experts suspect tomatoes. Chipotle shuttered 43 stores and tossed all remaining ingredients into the trash, patting itself on the back for its “abundance of caution.”
The problem with this response, though, is that Chipotle—whose defining creed is “food with integrity”—has assured consumers that an “abundance of caution” was integral to its mission from the start. Chipotle’s much-touted cautionary approach has underscored such definitive moves as banning genetically modified organisms and supporting locally sourced produce. Thus the “fast casual” alternative has been able to transform a burrito—as one of its advertisements proclaims—into a “food-culture changing cylinder of deliciousness.”
Read more here.
A version of this piece ran in the New York Times in 2006.
This time of year, the windows of America are dotted with carefully carved jack-o’-lanterns, but in a week or so, the streets will be splotched with pumpkin guts. Orange gourds will fly from car windows, fall from apartment balconies, career like cannon fire from the arms of pranksters craving the odd satisfaction of that dull thud.
There are, to be sure, more productive ways to deploy a Halloween pumpkin. Post-holiday, composting is a noble option. A pumpkin grower in Wisconsin once turned a 500-pound Atlantic Giant into a boat.
But what we almost certainly won’t do is eat it. First cultivated more than 10,000 years ago in Mexico, cucurbits were mainstays of the Native American diet. If for no other reason than its status as one of America’s oldest cultivated crops, an honest pumpkin deserves our reverence.
The current batches that will soon litter the pavement, however, are for the most part irreverent fabrications, cheap replicas inflated for the carving knife. Food in name only, they’re a culinary trick without the treat. For those of us who value America’s culinary past, smashing a generic pumpkin is more of a moral obligation than an act of vandalism.
During the colonial era, the pumpkin was just one squash among dozens, a vine-ripening vegetable unmarked by a distinctive color, size or shape. Native Americans grew it to be boiled, roasted and baked. They routinely prepared pumpkin pancakes, pumpkin porridge, pumpkin stew and even pumpkin jerky.
Europeans readily incorporated the pumpkin into their own diet. Peter Kalm, a Swede visiting colonial America, wrote approvingly about “pumpkins of several kinds, oblong, round, flat or compressed, crook-necked, small, etc.” He noted in his journal — on, coincidentally, Oct. 31, 1749 — how Europeans living in America cut them through the middle, take out the seeds, put the halves together again, and roast them in an oven, adding that “some butter is put in while they are warm.”
Sounds tasty. But one would be ill advised to follow Kalm’s recipe with the pumpkins now grown on commercial farms. The most popular pumpkins today are grown to be porch décor rather than pie filling. They dominate the industry because of their durability, uniform size (about 15 pounds), orange color, wart-less texture and oval shape. Chances are good that the specimen you’re displaying goes by the name of Trick or Treat, Magic Lantern, or Jumpin’ Jack. Chances are equally good that its flesh is bitter and stringy.
In contrast, pumpkins grown in the 19th and early 20th centuries — the hybridized descendants of those cultivated by Native Americans — were soft, rich and buttery. They came in numerous colors, shapes and sizes and were destined for the roasting pan.
The Tennessee Sweet Potato pumpkin looked more like a pear than a modern pumpkin and, as its name implies, was baked and eaten like the sweet potato. The Winter Luxury Pie pumpkin, first introduced in 1893, became so popular for pies that it posed a fresh challenge to the canned stuff. These pumpkin varieties, and scores of others, were once the most flavorful vegetables in the American diet.
Fortunately, the edible pumpkin is not completely lost. While akin to endangered species, heirloom seeds are only a few mouse clicks and a credit card number away. By growing heirloom pumpkins, you can have your jack-o’-lantern and eat it too. More immediately, you can search out heirloom pumpkins at some farmers’ markets.
Of course, advocating a shift in any holiday tradition seems like a futile exercise in a nation that (perhaps because we’re so young) takes its traditions rather seriously. But it’s not as if there’s much of a Halloween tradition to violate. Halloween is relatively new to America. The Irish brought the holiday to the United States in the 1840’s (and used turnips as jack-o’-lanterns). But Halloween didn’t become profitable enough for commercial growers to produce decorative pumpkins until the suburbanized 1950’s.
Edible pumpkins were driven near extinction in the early 1970’s when a farmer named John Howden started to mass-produce a firm, deep orange, rotund pumpkin endowed with thick vines to create a fat handle to hold while carving. The $5-billion-a-year industry that developed around Howden’s inedible creation is, historically speaking, still in its infancy.
And thus the “tradition” is ripe for improvement. Next year, let’s do something not so different. Let’s replace a fake pumpkin with a real one. The face you carve into it might be more distorted, and it might cost a bit more, but there will finally be a credible reason not to smash the thing at the end of the evening. And most important, as Peter Kalm observed back in 1749, we could once again split it open, roast it, add butter and remind ourselves that some traditions — like cultivating vegetables to eat — should never be destroyed.
Last month the United States Department of Agriculture and the Environmental Protection Agency agreed to establish the “first ever national food waste reduction goal.” The program is notable not only for its ambition—it aims to reduce food waste 50 percent by 2030—but also for the diversity of its participants. An array of churches, corporations, charitable organizations, and local governments has been asked to play a role. The plan, anodyne though it may be, will surely get the lion’s share of (dull) media attention.
But the one relevant group that’s been overlooked has the most to offer when it comes to reducing food waste: freegans. Freegans encourage eating food sourced from various waste streams pouring from the cracks of an excessively abundant food system. They’re scrappy scavengers who frequent grocery store alleyways, restaurant dumpsters, un-cleared food court tables, and anywhere else that yields a free meal and keeps freegan cash out of Big Food coffers—which kind of explains why the USDA and EPA aren’t terribly impressed. Freegans, who root their lifestyle in 1960s Berkeley-ish activism, package themselves as a subversive social movement.
Precisely what kind of movement—anarchist? socialist? punk?—is difficult to say. The freegan manifesto, as it were, reads as if it were written by a precocious if rant-prone high-schooler. It describes freeganism as a “withdrawal from the consumer death culture,” observes that “working sucks!,” condemns “the all oppressive dollar,” and implores us not to sacrifice “humanity to the evil demon of wage slavery.” Couching the generic dumpster dive in this rhetorically shrill language—a “stick-it-to-the-man” posture that supports an “anti-consumeristic ethic of eating”—the freegan manifesto might inspire angrier souls to thrust a fist skyward. But, for the sober-minded reformer, it threatens to condemn the movement to a kind of self-imposed solipsism. This is, after all, America.
Still, we cannot afford to dismiss freegans. . .
My last three columns have explored philosophical defenses for eating animals. I’ve done this from the perspectives of utilitarianism, animal rights, and contractualism. My intention with this series has been, in part, to reiterate how difficult it can be to justify eating animals, but also to defuse the off-putting “total abstinence” dictum inherent in the vegan ideology. There is, after all, almost certainly moral space for consuming animals.
But it’s not necessarily an easy space to find. It’s often neither consistent with the way we currently source meat nor tolerant of a business-as-usual approach to agriculture. It may require radical behavioral changes as well as structural shifts that are pragmatically beyond our control. Ironically, given the current configuration of our food system, these changes may be so difficult to achieve that choosing veganism by default could prove to be the easier option.
That said, there appear to be legitimate ways to eat meat, ways that are consistent with the ethical principles that we rely on to guide us through life, and ways that the future’s food architects might consider accommodating.
Wendell Berry has famously declared eating to be “an agricultural act.” This phrase has become a rallying cry for an agrarian reform movement that seeks to know the sources of our food supply. But, perhaps even more so, eating is also an ecological act, an elemental behavior that extends beyond the local farm and the farmers’ market to the endlessly interrelated biotic community. It is from this latter perspective—deep ecology—that I want to suggest a fourth philosophical defense for eating meat.
Read about it here.