The Imperial Calorie

Is it better to know, or not to know?  Are there certain pieces of information of which you’re happy to remain ignorant?  At what point does “knowledge is power” get subsumed by “ignorance is bliss”?  And what happens when all of these considerations involve the number of calories in your food?

Thanks to a new federal regulation that kicked in earlier this month, those sorts of questions have become slightly less theoretical than they were before.  In compliance with the Affordable Care Act—and following years of resistance by special interest groups—all food establishments in the U.S. with at least 20 outlets are now required to post calorie counts of all their products in all their stores.

While many chains have been doing this voluntarily for years, the practice became law on May 7, which means you can no longer order a muffin at Dunkin’ Donuts without learning that it contains nearly twice as many calories as a bagel, nor can you finish a meal at Olive Garden without willfully consuming more caloric energy than the average American burns in an entire day—with or without breadsticks.

Of course, maybe this new law means nothing to you.  Maybe you are a knowledgeable, health-conscious consumer who knows exactly what you’re putting into your body at all times.  Maybe you’ve long been aware of how deadly chain restaurant food tends to be for your waistline and cholesterol levels, and you tread carefully whenever you indulge—as you do when eating at home, at work or at Thanksgiving dinner.

However, this would hardly make you a prototypical American.  Some 160 million of your fellow countrymen are either overweight or obese—a jaw-dropping figure suggesting that a majority of them either don’t understand how their digestive systems work or don’t care, and that they pose an existential threat to our national healthcare system in either case.

As a matter of public health, then, requiring eating establishments to disclose nutrition information is a no-brainer and a win-win, and has largely been accepted as such in recent years.  By listing calorie counts on the menu, a restaurant provides valuable, potentially life-saving information to those who might need it, while still honoring every citizen’s God-given right to eat whatever they damn well please.

The problem here—as I suggested at the top—is that you cannot un-see what is written directly in front of you, and there’s a certain group of Americans who really, desperately wish they could.  Some people want to know how many calories they’re consuming, and others are indifferent, but there is also a third category:  Those (sometimes including me) whose culinary pleasure depends on not knowing, chemically speaking, exactly what it is they’re eating, and for whom, once facts and figures enter into it, the whole experience turns sour.

I don’t know about you, but when I was younger and first scanning the nutrition labels on every foodstuff in the kitchen, the whole point of dining out was to eat as much as humanly possible, because you had no earthly idea how many calories were involved and could therefore assume there were none at all.  As any corrupt politician will tell you, plausible deniability is a powerful thing.

Admittedly, one cannot responsibly live in such utter obliviousness forever—aforementioned 160 million Americans notwithstanding—and as I’ve grown older, I’ve become considerably more informed and mindful about the science of nutrition and human metabolism, which has enabled me to balance the books in my eating and exercise routines, as well as to perform ballpark calorie calculations in my head in almost any setting—a superpower that is both highly useful and profoundly irritating.

On the one hand, becoming educated about food has unlocked the secret to losing (or at least not gaining) weight and feeling generally in control of my destiny.  By turning meals into a math problem—or, more accurately, a budget—I am considerably less likely to stuff my face for the hell of it and then feel like crap for the rest of the day.

On the other hand, by being super-vigilant about what I deposit into my pie hole—say, by scarfing down three slices of pizza for lunch instead of six—I risk turning eating into a purely clinical and joyless act—something every diet fad in history has expressly tried to avoid, because why on Earth would you remove the pleasure from the most inherently pleasurable activity of your day?

It has taken me several years—and one rather dramatic period of weight loss—to reconcile those twin urges without driving myself completely crazy.  (As Oscar Wilde put it, “Everything in moderation, including moderation.”)  While I don’t regret this strange journey to enlightenment (such as it was), I often wonder whether I’d be happier if I’d remained fat and ignorant instead of thin and neurotic—and whether America as a whole is feeling similarly now that it’s become virtually impossible to eat anything without the terrible knowledge of how much it’s costing us (in all senses of the word).  Whether our ability to live longer and healthier is necessarily making us live better.

There’s a saying amongst dieters, “Nothing tastes as good as skinny feels.”  How wonderful life would be if such a thing were actually true.


The Man Who Wouldn’t Be King

It says a lot about America that John McCain was never elected president.  It says even more that, in retrospect, we sort of wish he had been.

Indeed, all the way back in 2001, during an interview with Charlie Rose (ahem), Bill Maher cited McCain—recently defeated in the GOP primaries by George W. Bush—as among his favorite Republican politicians.  “He’s everyone’s favorite,” said Rose, to which Maher dismissively retorted, “Then why doesn’t he win?”

It’s a damn good question, and a useful lens through which to view our entire political system.  As McCain clings ever more precariously to life—having spent the last 10 months ravaged by glioblastoma, an aggressive form of brain cancer—we might reflect on the strange way that our most accomplished and admired public officials tend not to rise all the way to the Oval Office—and why a great many more never bother to run in the first place.

On paper, McCain would seem exactly the sort of person the Founding Fathers had in mind as a national leader:  A scrappy rebel from a distinguished family who proves his mettle on the battlefield, then parlays that fame into a steady career in public service.  (He was first elected to Congress in 1982 and has never held another job.)

While he was hardly a first-class intellect—he famously graduated near the bottom of his class at Annapolis—McCain’s grit and endurance through five-and-a-half years of torture and deprivation in a Vietnamese prison forever burnished his reputation as among the most indefatigable men in American life—someone who would speak truth to bullshit and hold no loyalties except to his own conscience.  Having cheated death multiple times, here was a man with precious little to fear and even less to lose.

Against this noble backdrop, it would be the understatement of the year to say that, as a two-time presidential candidate, John McCain was a complicated and contradictory figure—perhaps even a tragic one.  In 2000, he established his political persona as a crusty, “straight-talking” “maverick,” only to be felled in South Carolina by a racist Bush-sanctioned robocall operation that McCain was too gentlemanly to condemn.  (The robocalls implied, among other things, that McCain’s adopted daughter from Bangladesh was an out-of-wedlock “love child.”)

Eight years later, having learned a thing or three about brass-knuckles campaigning, McCain scraped and clawed his way to the Republican nomination—besting no fewer than 11 competitors—only to throw it all away with the single most irresponsible decision of his life:  His selection of Alaska Governor Sarah Palin as his running mate.

With nearly a decade of hindsight, the science is in that choosing Palin—a world-class ignoramus and America’s gateway drug to Donald Trump—constituted the selling of McCain’s soul for the sake of political expediency.  Rather than running with his good friend (and non-Republican) Joe Lieberman and losing honorably, he opted to follow his advisers’ reckless gamble and win dishonorably.  That he managed to lose anyway—the final, unalterable proof that the universe has a sense of humor—was the perfect denouement to this most Sisyphean of presidential odysseys.  He was damned if he did and damned if he didn’t.

The truth is that McCain wouldn’t have won the 2008 election no matter what he did, and this had very little to do with him.  After eight years of George W. Bush—a member of McCain’s party, with approval ratings below 30 percent in his final months—the thrust of history was simply too strong for anyone but a Democrat to prevail that November.  (Since 1948, only once has the same party won three presidential elections in a row.)

If McCain was ever going to become president, it would’ve been in 2000.  Pre-9/11, pre-Iraq War and post-Bill Clinton, a colorful, self-righteous veteran could’ve wiped the floor with a stiff, boring policy wonk like Al Gore.

Why didn’t he get that chance?  The official explanation (as mentioned) is the reprehensible smear campaign Team Bush unloaded in the South Carolina primary.  However, the more complete answer is that Republican primary voters throughout the country simply didn’t view McCain as one of their own.  Compared to Bush—a born-again Christian with an unambiguously conservative record—McCain was a quasi-liberal apostate who called Jerry Falwell an “agent of intolerance” and seemed to hold a large chunk of the GOP base in bemused contempt.

McCain’s problem, in other words, was the primary system itself, in which only the most extreme and partisan among us actually participate, thereby disadvantaging candidates who—whether through their ideas or their character—might appeal to a wider, more ideologically diverse audience later on.  Recent casualties of this trend range from John Kasich and Jon Huntsman on the right to John Edwards and (arguably) Bernie Sanders on the left.

On the other hand, sometimes primary voters will do precisely the opposite by selecting nominees whom they perceive to be the most “electable”—a strategy that, in recent decades, has produced an almost perfect record of failure, from John Kerry to Mitt Romney to Hillary Clinton.

By being his best self in 2000 and his worst self in 2008, McCain managed to fall into both traps and end up nowhere.  Indeed, he may well have been a victim of bad timing more than anything else—as was, say, Chris Christie by not running in 2012 or Hillary Clinton by not running in 2004.

Then again, all of history is based on contingencies, and it is the job of the shrewd politician to calibrate his strengths to the tenor of the moment without sacrificing his core identity.  However appealing he may be in a vacuum, he must be the right man at the right time—the one thing Barack Obama and Donald Trump had in common.

As Brian Wilson would say, maybe John McCain just wasn’t made for these times.  Maybe he wasn’t elected president because America didn’t want him to be president.  Maybe his purpose in life was to be exactly what he was:  A fiery renegade senator who drove everybody a little crazy and loved every minute of it.  Maybe he wouldn’t have been any good as commander-in-chief anyhow—too impulsive, too hawkish—and maybe we’re better off not knowing for sure.

Will someone of McCain’s ilk ever rise to the nation’s highest office in the future?  Wouldn’t it be nice if they did?

Get on the Cannabus

I did not smoke pot this past April 20.  Truth be told, I haven’t smoked pot at all since the summer of 2010—and only a handful of times before that.  I don’t say this to impress you.  Were a joint to spontaneously appear in front of me, I’d likely grab it faster than Donald Trump grabs a Filet-O-Fish (and to greater effect).

I first encountered (read: inhaled) marijuana during my freshman year of college—specifically, in my dorm’s communal bathroom on Good Friday—because some guy down the hall had a secret stash and I happened to be idling nearby.  While I wouldn’t call that evening life-changing—if memory serves, it consisted mainly of eating a family-sized bag of Doritos and avoiding eye contact with the RA—it set the template for every weed-smoking episode that followed:  I didn’t actively seek it out, but when the opportunity presented itself—invariably through some vague acquaintance whom I’d probably never see again—I didn’t put up much resistance.  Following years of curiosity—and all the hysterical anti-drug propaganda that went with it—I wanted to understand what the fuss was about, and I was seldom disappointed with the result.

That was then—a blessedly distant world of prohibition in which to get high was to put oneself at the mercy of the American legal system—a risk that, as with underage drinking, undoubtedly added to the allure and pleasure of the overall experience.  (White privilege probably helped, too.)

In the intervening years, however, something rather strange has happened:  Marijuana has become legal.  As of this writing, nine states and the District of Columbia have OK’d the personal recreational use of the cannabis plant in all its forms, while another 20 states have sanctioned it for medicinal purposes—a gateway maneuver if I ever saw one.

Among the nine-and-a-half states that have gone whole hog on the pot question is my home commonwealth of Massachusetts, whose voters approved a pro-pot ballot referendum on November 8, 2016—an admittedly ironic day for such a liberal, forward-thinking decision.

Strictly speaking, marijuana became legal in Massachusetts less than six weeks after Election Day, with residents allowed to grow, possess and consume small amounts of the substance to their hearts’ content in the privacy of their own homes.  However, government bureaucracy being what it is, it will not be until July 1—fully 20 months after the vote—that recreational pot shops will open their doors and, for the first time, their products will be commercially available to those, like me, who have been largely cut off from the cannabis black market up to now.

Of course, the $1 billion question is whether the normalization of weed will turn me—and, in time, the entire state—into a lazy-eyed smokestack who spends all day listening to Pink Floyd and giggling at the wallpaper.  Whether ease of access will translate into frequency of use, and all the productivity-depleting horrors that supposedly follow.

Having never tended my own private marijuana nursery, I cannot know that answer for sure until the magic hour arrives.  However, my hunch is that very little will change in my consumption habits overall, and I would wager the same about most of the fellow inhabitants of my state.

How so?  First, because, as a rule, the per-serving market rate for legal weed tends to exceed that of alcohol—already the far more entertaining of the two drugs—and I am nothing if not a cheap date.  Second—and speaking of booze—I can’t help but notice that, pound-for-pound, I imbibed a lot more liquor before turning 21 than after.  As enjoyable as moderate drinking can and will always be, once all the legal barriers fell—once I could walk into a package store without a fake ID and emerge with a six-pack of Sam Adams unmolested—the temptation to overindulge was just never the same.  Call me an old fogy, but I find that spending the majority of one’s Sunday hunched over a toilet bowl isn’t nearly as fun at age 30 as it is at 18, 19 or 20.

The dirty little secret about drugs—as with pretty much everything—is that nothing dulls the appetite like legalization, and the most surefire way to create a culture of addicts is to take their favorite product away from them.  History is littered with examples of this very phenomenon—not least in the United States between 1920 and 1933—although my personal favorite is the observation made in the 1990s to Salman Rushdie—then under fatwa for writing The Satanic Verses—that “in Egypt, your book is totally banned—totally banned!—but everyone has read it.”

To be honest, it’s unlikely I’ll be smoking cannabis ever again—even after July 1.  Having never learned to roll a joint properly and not wanting to set off smoke alarms in my own house, my pot consumption, such as it is, will almost surely come in edible form, be it candy, chocolate or whatever else the kids are cooking up these days.  While I understand the pitfalls of ingesting marijuana-laced baked goods for the first time—elucidated most memorably by Maureen Dowd in a 2014 New York Times column—the notion of sucking smoke deep into my lungs has struck me as an increasingly unappetizing means of getting high when biting into a slightly odd-tasting cookie will produce more-or-less the same result.

But that’s just me.