On September 20th, the United Nations Secretary-General Ban Ki-moon presided over the U.N.’s first-ever General Assembly meeting on the topic of “superbugs.” Superbugs are drug-resistant bacteria that kill over 700,000 people a year. They develop in response to one general cause: the overuse of antibiotics.
“If we fail to address this problem quickly and comprehensively,” Ban said, “antimicrobial resistance will make providing high-quality universal health-care coverage more difficult, if not impossible.” The World Health Organization (the U.N.’s public-health division) urged medical professionals to radically reduce antibiotic use.
That’s sensible advice. The fewer antibiotics prescribed, the fewer the opportunities for superbugs to develop. But it shouldn’t obscure the fact that the leading driver of antibiotic resistance is not the treatment of gonorrhea or tuberculosis—both mentioned by the U.N.—but animal agriculture.
What follows is the intro to my recent piece in the Virginia Quarterly Review. Click the link below to go to the full article.
Dylan, a three-year-old yellow Lab, leaped from the van and made a beeline for Beth, a volunteer who was standing alone in her driveway, pressing a black blindfold to her eyes. This was their second meeting. Judging from Dylan’s demeanor (his tail wagged like a metronome), his final days as a guide-dog-in-training were happy ones. His trainer, Natalie Garza, introduced Dylan to my daughter and me, and although he bowed his head for a scratch, you could tell his heart wasn’t in it. Instead, he was focused on the task at hand: leading Beth, who otherwise has normal vision, on a test walk through a suburban neighborhood in Austin, Texas.
Guiding a sighted volunteer, Garza explained, required Dylan to work harder than usual. Specifically, he had to exaggerate his signals with Beth, signals that a visually impaired person accustomed to sightless navigation might not require. Dylan, who was soon to be matched with his first visually impaired guardian, couldn’t afford to cut corners today. “These dogs save people’s lives,” Beth reminded me as Garza lowered a harness over Dylan’s head and adjusted it for comfort. He had to be on his game.
Complete article here.
Like most people who vow to lose weight, Becca Reed—51, diabetic, confined to a wheelchair, taking nearly a dozen medications—had precise goals. Unlike most people, she’s willing to share them. On a piece of crinkled notebook paper she wrote in bubbly cursive script what she hoped her future would be like:
- Reach 225 pounds or less.
- Feel sexy and buy an outfit at a regular store.
- Have James look at me with that sparkle in his eye.
- Feel better able to clean the house.
- Walk five minutes straight.
- Bench press 100 pounds or more.
- Back fat and fat above butt—get rid of it.
- Strengthen arms—flabby upper part.
- Get off medication!
- Ride horses on Padre Beach with James.
- Be able to stand long enough to sing more than one song.
James Reed, Becca’s husband, is a baby-faced 57 who suffers from his own obesity-related problems, namely high blood pressure, which was 250/190 before he was finally medicated for it. He, too, made a list of weight-loss goals and agreed to share it:
- Get below 250 pounds.
- Feel better about myself.
- Bench 400 pounds.
- Leg press 600 pounds.
- No blood pressure meds.
- Weigh 210–215 pounds.
- Make it to retirement.
- Don’t become diabetic.
I recently sat down with Becca, James, and their 26-year-old son, Drew . . . Read More
It probably wasn’t on your calendar. But between June 21 and June 30, the annual Lychee and Dog Meat Festival took place in Yulin, China. While declining in popularity, eating dog meat remains common enough throughout Asia. In Guangxi, the province where Yulin sits, it’s more than common: It’s a delicacy considered central to the region’s culinary identity.
This dietary preference has spelled trouble for Asian canines. In years past, over 10,000 dogs (some strays, some farmed, some possibly stolen from pet owners) were slaughtered to sate the palates of festivalgoers eager to sample such fare as “crispy dog” and “dog hot pot.” The prevailing belief that the taste of dog meat improves when the animal is killed while in distress hasn’t helped the festival’s global appeal and, with reports of horrific slaughter accumulating, this year’s attendance numbers (as well as the number of dogs killed) have dropped. Still, for the diehard aficionados of dog meat, the festival remains an annual rager to be defended at all costs, on grounds both culinary and cultural.
“If the Yulin dog meat festival weren’t real, philosophers would have dreamt it up,” says Bob Fischer, a Texas State philosopher and author of The Moral Complexities of Eating Meat. Indeed, dog meat presents conscientious Westerners with a perfect conundrum. It pits an enlightened expectation of cultural tolerance (live and let live!) against our deep emotional attachment to dogs as companion animals, an attachment that makes eating them seem, at the least, morally repugnant.
Despite a 2015 report from the Centers for Disease Control and Prevention suggesting that childhood obesity was in decline, the numbers—when properly interpreted (and supplemented with more recent research)—confirm the opposite. As they have for decades, children between the ages of two and 19 are, in fact, becoming overweight or obese at a steadily increasing rate.
Today, 33.4 percent of kids are considered overweight, with 17.4 percent of them qualifying as obese (defined as having a body mass index [BMI] of 30 or more) or severely obese (a BMI over 40). To put these measurements in perspective, a healthy person who is 5’9″ and 150 pounds will have a BMI of around 22.
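For readers who want to check the arithmetic behind that 5’9″/150-pound example: BMI in imperial units is weight in pounds times 703, divided by height in inches squared. A minimal sketch (the function name is mine, for illustration only):

```python
# BMI from imperial units: 703 * weight (lb) / height (in)^2.
# The 703 factor converts lb/in^2 to the metric kg/m^2 scale BMI is defined on.

def bmi_imperial(pounds: float, inches: float) -> float:
    return 703 * pounds / inches ** 2

# The reference case above: 5'9" is 69 inches, at 150 pounds.
print(round(bmi_imperial(150, 69), 1))  # → 22.1, squarely in the healthy range
```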
These numbers intersect with an especially compelling sociological observation: As childhood obesity becomes commonplace, parents are increasingly unable to recognize the condition in their children. Writing in Scientific American, Jane Ogden explained that “as populations get fatter, the new normal has become overweight and therefore invisible.”
It’s late afternoon at the Town Lake YMCA in Austin, Texas, and a man in Lane Two is gliding through the pool with fearless perfection. His movements are languid; his breathing, metronomic; his pace, effortless. He completes lap after lap with such ease of motion that the only word that keeps coming to mind as I watch him move down the lane is natural. That’s a natural-born swimmer.
In fact, he’s nothing of the sort. No human being is a natural-born swimmer. To confirm, I need only look over to the YMCA’s instructional pool, where the whole notion of a natural-born swimmer is quickly dispelled by a clutch of six physically fit adults milling anxiously in waist-deep water around Dena Garcia, a swim instructor.
They’re participants in a TOW—“Terrified of Water”—class, and the contrast with what’s happening in Lane Two illustrates something important. At some point (probably very early in life), the impossibly elegant lap swimmer had to do exactly what these courageous adults are now doing in the instructional pool: confront his fears by gripping the edge and kicking, placing his face in the water and making bubbles, and allowing his body to float while avoiding a panic attack.
Exactly why we don’t instinctively swim is a mystery. But as Daniel E. Lieberman, professor of human evolutionary biology at Harvard University, explains, “It is possible that some of our ancestors swam or occasionally waded into marshes to collect sedges, but there is very little evidence that natural selection acted much on human abilities to swim.” He calls humans “slow, inefficient, and awkward as swimmers.”
Read the full story here.
It’s an interesting time for luxury voyeurism. The obvious expressions of rare wealth — the 50,000-square-foot homes, the $250,000 cars, the private jets, the stratospheric penthouses, the monthlong trips around the globe — are, at least in terms of shock value, fading. These acquisitions have become so Disneyland-ish in their well-publicized glitz that they barely register on the over-the-top meter.
But what’s more notable, if only sociologically, is how the wealthy elite, perhaps bored with its own ostentation, has unleashed its purchasing power on the more commonplace aspects of life. Indeed, once-humble acts that the rest of us could feel a flash of superiority about performing, such as gardening, are being colonized by a kind of extravagant, trickle-down consumption.
This observation came to me while running through Houston’s River Oaks neighborhood, the residential Eden of the city’s moneyed elite. You know you’re in River Oaks because pets are walked by hired help, leaf blowers provide the white noise, faux chateaux compete for manorial distinction, and a private police force keeps the peace (well, actually, they pick up newspapers left out too long). But what stopped me while I was running was an activity that had never before registered, in my experience, as a spectacle: Someone was planting a tree. Only the act involved an aerial lift crane, a truckload of labor, and a backhoe.
The idea had never occurred to me: The super-rich generally don’t drop to their knees and plant saplings. To the contrary, they outsource photosynthesis, allowing annual tree rings to accumulate on someone else’s time, with someone else’s labor, with the nutrients from someone else’s soil. On this occasion, a fully formed willow oak protruded from a root ball the size of a large pickup truck. It swung from a crane that was slowly lowering it into an even larger hole. Six men leaned into the thing. They attempted to hold it steady while shuffling around the canyon’s edge, negotiating the precipice while pushing against the swaying heap, one terrifying misstep away from being smashed between root ball and hole. A master gardener I know (my mother) later told me, in a rather unfazed manner, that the entire operation probably cost well over $100,000.
In the early 1970s, John Tarrant, a British ultramarathoner who set world records in the forty- and hundred-mile distances, suffered a hemorrhaging stomach ulcer that occasionally sent him to the hospital for tests and blood transfusions. Tarrant despised the interruptions to his training schedule, and during at least one stay, he ducked into the bathroom, changed into running gear beneath his hospital gown, and snuck outside for a quick five-miler. As Bill Jones recounts in his book The Ghost Runner, Tarrant sacrificed everything for his sport—his work, his family, and, evidently, his better judgment.
Today is the 120th Boston Marathon [this piece was originally published on April 18, 2016], and I’d wager that nearly every runner in the race would understand Tarrant’s impulse. Training for long-distance races breeds a restless need to elevate the heart rate, score an endorphin hit, and achieve what Tarrant called that “magnificent feeling of well being.” Running begets more running, an insidious cycle that can become, over time, a game of high-mileage brinkmanship that blurs the line between dedication and obsession. At the peak of his training, Tarrant was logging 180 miles a week—an addiction, no doubt, but a healthy addiction, at least according to the runners. (The doctors aren’t convinced: by 2015, running-related cardiology concerns had crystallized into something called the excessive-endurance hypothesis; google the phrase and “scarring of the heart” comes up a lot.)
In a paper about to be published in The Proceedings of the Royal Society, a team of researchers identifies something they call the “paradox of unanimity.” If you’ve ever smelled a rat when everyone else is celebrating an idea, then this paradox is for you. While unanimous agreement (or something close to it) might suggest that a particular claim is right, the researchers, led by Lachlan J. Gunn, an engineer at the University of Adelaide in Australia, found the opposite to be true. Rather than confirming truth, unanimity indicates that something went wrong, that a “systemic failure” undermined popular judgment, that the confidence of the crowd has been skewed by bias.
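The intuition can be made concrete with a toy Bayesian calculation. The numbers here are mine, chosen for illustration rather than taken from Gunn’s paper: suppose each honest witness identifies a suspect correctly 90 percent of the time, and there is a small prior chance that a systemic bias (a flawed lineup, say) forces every witness to agree regardless of the truth. The more witnesses who concur, the more the unanimity itself points to the bias.

```python
# Toy sketch of the "paradox of unanimity": honest witnesses agree
# independently with probability p each; with prior probability eps the
# process is systemically biased, in which case all witnesses agree anyway.

def p_bias_given_unanimity(n: int, p: float = 0.9, eps: float = 0.01) -> float:
    """P(systemic bias | all n witnesses agree), by Bayes' rule."""
    p_unanimous_honest = (1 - eps) * p ** n  # all n agree by chance, honestly
    p_unanimous_biased = eps * 1.0           # bias guarantees agreement
    return p_unanimous_biased / (p_unanimous_biased + p_unanimous_honest)

for n in (3, 10, 30):
    print(n, round(p_bias_given_unanimity(n), 3))
```

With these assumptions, the posterior probability of systemic failure climbs from about 1 percent with three unanimous witnesses to nearly 20 percent with thirty: the more overwhelming the agreement, the more it should make you suspicious, which is the paradox in miniature.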
As it’s currently framed, the paradox applies primarily to criminal justice concerns—police line-ups and the like. But it also has implications for food and agriculture. Few fields of popular interest have cultivated a wider array of glib axioms of empowerment than food: genetically modified organisms are bad, local is better, you shouldn’t eat food your grandmother wouldn’t eat, and so on. In the context of Main Street foodie wisdom, these claims enjoy something close to unanimity. But, for all their support, none comes closer to unanimity than the gilded assertion that organic food is food grown without pesticides.
This past year, 2015, may go down as the year we admitted that our relationship with digital media was getting a little dysfunctional. Several recent books—notably Sven Birkerts’ Changing the Subject: Art and Attention in the Internet Age and Sherry Turkle’s Reclaiming Conversation: The Power of Talk in a Digital Age—persuasively argue that an unhealthy immersion in online culture is blurring our focus. Meaningful tasks such as reading serious books or having face-to-face conversations have gotten more challenging. Fretful articles appearing in places such as the Chronicle of Higher Education have covered the emotional fallout (among Millennials in particular). We’re anxious, it turns out, from too much digital investment, and while few of us are taking a digital detox, the idea is starting to sound pretty nice.
These studies—which became a media staple in 2015—sketch a dire portrait of chronic digital distraction, maybe at times too dire. But they effectively highlight notable aspects of our social media habits that, because of digital culture’s comprehensive claim on our lives, we may be too distracted to recognize as detrimental to our well-being. Although the connection is by no means obvious, the digital trap—whether overstated or not—illuminates the intractable struggle we have with another vexed consumer behavior: eating unhealthy food.
The parallel begins with an elemental point: Modernized humans dedicate much of their lives to avoiding discomfort. The quest to live as stress-free as possible is such a fundamental desire that it almost seems self-justifying. Not surprisingly, fast food and digital applications speak persuasively to this primal urge, promising as they do to ease us through the day with miraculous shortcuts and life hacks that never cease to wow us with their novelty. An app that directs us exactly where to drive, not unlike a pizza with cheese jammed into the crust, offers enough convenience and novelty—granted, in very different ways—to keep us in thrall to the charms of modernity.