Great Scot!

The most surprising and remarkable figure from last week’s referendum on Scottish independence was not the 10-point gap between the unionists and the nationalists in the final vote tally. (The “no” side won.)

Rather, it was the overall voter turnout, which rang in at nearly 85 percent.

You read that right: Last Thursday, more than five out of six eligible voters went out to cast a ballot on whether Scotland should secede from the United Kingdom and become an independent country. And this from a potential voter pool of people as young as 16.

Here I was, thinking that Scotland and the United States had some kind of kinship. On the basis of the above, I was wrong twice.

As it turns out, the Scots are less keen than we are on extricating themselves from the (former) British Empire. At the same time, however, they are considerably keener on formally expressing their political views in any case.

Analysts noted that the 85 percent turnout was unusually high for Scotland. That’s a gross understatement. Among the world’s true democracies, such a stratospheric participation rate is unusually high for anywhere.

You know the last time 85 percent of voting-age Americans participated in a national election? Never, that’s when.

In fact, no more than 60 percent of eligible voters have turned out for a U.S. presidential election since 1968; no more than 65 percent since 1908; and no more than 70 percent since 1900.

On this vital metric of the strength of a healthy democracy, we Yanks have been utterly shamed by a population bloc roughly the size of Wisconsin.

I know what you’re gonna say: “Apples and oranges.” Apart from anything else, the decision to declare independence is several orders of magnitude more consequential—and more rare—than the choice of whether to be led by a Democrat or a Republican. For an individual voter, opting whether to secede is almost surely a once-in-a-lifetime opportunity and a monumental responsibility to assume, not to mention a great honor. (Thomas Paine, reflecting on his experiences in America and France, said that having “a share in two revolutions is living to some purpose.”)

In other words, of course virtually everyone in Scotland who was offered this chance seized it by the horns. Only someone in the deepest throes of abject apathy could possibly choose to sit this one out.

For that matter, isn’t it wonderful that, for once, the level of interest in public affairs is so great that we can be very nearly certain that the final vote tally is representative of what the entire country—that is to say, every last citizen—actually thinks about the issue at hand, thereby allowing the leadership to run the government according to their wishes?

The British newspaper The Guardian, for its part, argued on the eve of the vote, “[A] decision of such gravity—to break away from a 300-year-old union—should be the settled will of a nation. The very fact that Scottish opinion is so closely divided is itself a weakness in the case for independence. Moves of such import should command enduring and overwhelming support.”

As a general principle, this is a compelling and worthy point to make. Couple the Scottish people’s near-total interest in the question of separation with the surprisingly clear answer (“no, thanks”), and the whole matter would seem to have been settled.

That is, except for a possibly interesting historical footnote: The American Revolution in 1776 was, itself, not especially popular among Americans.

While comprehensive public polling did not yet exist, John Adams—as fierce an advocate for separation as anybody—estimated that the inhabitants of the thirteen colonies were split evenly three ways. As Adams put it, “We were about one third Tories, and [one] third timid, and one third true blue.” As such, had the matter been put to a popular vote, rather than a convention and then a war, it very easily could have failed.

And this for a cause that, in retrospect, was so much more just and necessary than the prospective Scottish split from the same mother country. (To wit: The Scots are not subject to taxation without representation or an occupying army.) If the most successful political rebellion in history had to be dragged kicking and screaming into effect, what hope does any other such effort have?

So perhaps we ask too much for a people to fully agree on a radical course of action before taking it. As we have learned from our present Congress, to expect our representatives to forge a consensus on anything is, to quote Colin Firth in The King’s Speech, to “wait rather a long wait.”

The best for which we can hope is precisely what the world was gifted by the people of Scotland last week: a vigorous and comprehensive show of civic pride.

So long as the people—all the people—play a part in the democratic process and elect intelligent representatives who respect their views (an admittedly tall order), they will get the society they deserve. A society that, with any luck at all, is also one that is truly worth living in.

A Football Fantasy

The first thing to notice about America’s ongoing NFL problems is how very easy they are to solve.

The epidemic of concussions and other injuries, including brain damage, among players past and present.  The flagrant cheating by the New England Patriots by means of secretly recording other teams’ coaching signals.  The “bounties” offered by players and coaches on the New Orleans Saints to mutilate their opponents on the field.  The residual queasiness among teams about accepting openly gay teammates.  And, of course, the recent kerfuffles involving star players abusing women and children and more or less getting away with it.

These incidents and more—specifically, the league’s reaction to them—have drawn enormous attention and wide condemnation from all corners of the United States, prompting everything from congressional hearings to calls for the sacking of Commissioner Roger Goodell.

No doubt these limited efforts to coax the NFL to get its act together will generate real results—indeed, some already have—but there is one obvious way the entire American public could force the National Football League to fly right once and for all, and it dwarfs all the others.

America should boycott the NFL.

Take the year off.  Don’t watch the games.  Don’t buy the jerseys.  Don’t join an online fantasy league.

Quit professional football cold turkey.  Yes, even on Thanksgiving, when you can easily occupy yourself with hot turkey.

Let me be clear:  I’m not talking about a series of piddling one-person protests involving some inane, self-righteous Facebook rants orchestrated by a bunch of people who, for the most part, don’t really care about football, anyway.

I’m talking about season ticket holders not showing up on game day, and everyone else not purchasing tickets at all.  I’m talking about dropping your NFL Sunday Ticket subscription and not so much as flipping past ESPN or NBC when a game is on.

I’m talking about abstaining from the NFL’s regular promotions, and not buying products with the league’s logo printed on the packaging.

Starve the beast.  Make the league’s front office go into a white hot panic.  Force the commissioner and his underlings to reform the NFL’s practices and policies as a matter of maintaining not just the league’s image, but its very livelihood.

If the people of America are even half as outraged at the behavior of the National Football League as they claim to be—if the league deserves to be punished and browbeaten into taking the concerns of women and gays and retired players as seriously as everyone insists—then the threat of a nationwide football fan strike is the only logical course of action.

After all, up until now, the main rationale for the NFL doing as little as possible in solving its myriad PR problems is that it is, in the end, the most powerful sports organization on planet Earth, generating nearly $10 billion in revenue each year.

Well, what could possibly be more effective in getting its attention than striking a blow to every last source of this pile of gold?  Issuing statements of condemnation is all well and good.  But actually withholding money from an organization that cares about nothing but money?  Now we’re talking.

Instituting an embargo against the NFL is the obviously right thing to do.  So why don’t we do it?

Well, that’s an easy one, isn’t it?  It’s because Americans love football just a little bit too much.  It’s because the game is too firmly threaded into the national fabric for us to simply turn our backs.

It’s because, when push comes to shove, we think football is more important than defending the rights of women and children.

In an ideal world, that would be a provocative and outrageous assertion.  In the world in which we actually live, it’s a simple statement of fact.  With the NFL’s TV ratings as strong as ever, in spite of all the controversy, how could it be otherwise?

Certainly, that’s not the way individual football fans explain it to themselves.  (Who would admit to being an enabler of wife beaters and child abusers?)  Instead, they rationalize their sports viewing habits either by proclaiming their own team guiltless in the current scandals, or, failing that, insisting that the league is, in fact, slowly learning its lesson and that this whole mess will be cleaned up shortly and everything will go back to normal.  Forgive and forget, as they say.

Then there are the honest fans, who either a) admit to being hypocritical about this and are comfortable living with the contradiction, or b) are simply too oblivious to understand that a contradiction is occurring.

I confess that I speak from the convenient position of someone who hasn’t watched an NFL game from start to finish since the Giants defeated the Patriots three Super Bowls ago, and who would not be terribly broken up if my apocalyptic proposal actually played itself out.  I don’t have to experience an endless series of dark nights of the soul, reconciling an uncontrollable affection for something that is self-evidently rotten to the core.

However, untold millions of my fellow Americans do, and since a massive shakeup like the one I have outlined will never occur, the country’s loyal football fans should at least recognize their culpability in a continuing national disgrace, and acknowledge that there is far more they can do to rectify it than simply penning an angry tweet and sitting right back in front of the TV in time for opening kickoff.

David vs. David

Here’s some good news:  The race for Massachusetts governor is between two certified losers.

In one corner is the state’s attorney general, Martha Coakley, whose signature political achievement is having been defeated by Scott Brown in the 2010 Senate special election to succeed the late Ted Kennedy.

And in the other corner is businessman Charlie Baker, who ran for governor four years ago against the incumbent, Deval Patrick, and lost by six and a half points.

Of these two dubious distinctions, Coakley’s is widely viewed as the more embarrassing, owing (if nothing else) to her status as a Democrat in one of the most liberal states in the Union.  Then and now, it raises the question:  How inept a candidate does one need to be to lose a statewide vote to a conservative, pickup-truck-driving good old boy in the land of the Kennedys?

Not that we should let Baker, the Republican, off the hook.  In fact, before Patrick’s election in 2006, the commonwealth was presided over by Republican chief executives for 16 years running.  It may seem counterintuitive that Massachusetts voters would so regularly entrust the keys to the State House to members of the minority party—including one Willard Romney—but they did it nonetheless.

And so the gubernatorial race that will be decided on November 4 is as wide-open as one can be, and my fellow Bay State residents ought to consider themselves highly fortunate to have these two pitiful failures from which to choose.

It is fairly well-known among political and historical junkies how personal setbacks tend to turn losing political candidates into victors—and, with any luck, into considerably better people.

Richard Nixon endured two bitter electoral defeats, in 1960 and 1962, before roaring back in 1968, having evidently learned the secret to securing the American people’s trust and affection (undeservingly, as it turned out).  Ronald Reagan lost the Republican nomination for president twice, in 1968 and 1976, before finding his golden moment in 1980.

More recently, Mitt Romney learned a thing or two in 2008 about how to weave his way through the Republican primaries four years later, even if it wasn’t quite enough to carry him all the way to the White House.  Similarly, Hillary Clinton demonstrated a definite adaptability in appealing to the Democratic base in the course of the 2008 primaries against Barack Obama.  Most analysts agree:  Obama’s success forced Clinton to become a better candidate.

For all that separates these disparate test cases, they all demonstrate that Charles Darwin’s theory of evolution by natural selection can be applied to the unnatural world of electoral politics:  Faced with past and potentially future defeat, a candidate must either change his or her behavior or die.

Oftentimes in politics, it’s all just a matter of luck.  For instance, Nixon in 1968 had the enormous built-in advantage of the nationwide disillusionment with Lyndon Johnson and the Democratic Party, whereas in 1960 Nixon himself represented the outgoing administration against an appealing young whippersnapper, John Kennedy.  Same for Reagan in 1980, running against a rather feckless Jimmy Carter.  Sometimes the country is simply in the mood for an insurgent, and all you have to do is play the part.

What makes the current contest in Massachusetts unique and interesting—and, in my view, potentially welcome—is that both candidates (not just one) can be considered underdogs and would-be “comeback” stories, since both of them lost the last time around.

(Note:  Coakley did, in fact, win her last race for attorney general, but it was not nearly as competitive or consequential as the Senate campaign against Brown.  As such, nobody cares.)

Because of this dynamic, neither candidate can take comfort in any assumption of “inevitability,” or even of merely being the front-runner.  Accordingly, neither Baker nor Coakley has any cause to take anything for granted or become complacent or arrogant.  They have both sobered up, as it were, and understand that votes will truly need to be earned, and that the office of governor should be considered neither a birthright nor a foregone conclusion.

In short, both candidates will need to take the race seriously, and very probably will.  In a state that is traditionally dominated by one political party—a state in which all nine sitting congresspersons are Democrats, six of whom do not currently face a Republican opponent—this is something to savor and to celebrate, for it may not happen again anytime soon.

And what a shame if it doesn’t.

By all means, a state in which a supermajority of the public agrees about the major issues of the day ought to elect public officials who share those views.  However, this does not negate the necessity to debate such issues, forcefully and thoroughly, all the way to Election Day (and beyond).  And the most effective way to do this is to have a serious and formidable member of the opposition with whom to argue, forcing the campaign’s presumed “favorite” not to coast to victory on a wave of entitlement.

In a battle between two people with a great deal to prove, neither of whom can really be considered the favorite at all, the commonwealth of Massachusetts may, for the next two months, play host to a rare and real outbreak of democracy within its borders.  I sure hope we’re ready for it.

Baker’s Dozen

Roughly a year from now, I will have lived in a post-9/11 world longer than I ever lived in a pre-9/11 world.

Presumably this means nothing to you, but it sure scares the hell out of me.

Know what’s even scarier?  Last month I attended the bar mitzvah—the Jewish coming-of-age—of a cousin for whom the memory of the September 11, 2001, attacks is no memory at all, because he was two months old at the time.

Worse still:  Last week I shot hoops and played wiffle ball with another cousin, aged four and a half, who probably doesn’t yet know what “9/11” is, and when he does, it will present simply as one more event in history, much as Watergate and the Iran hostage crisis did for me.

To my fellow twentysomethings, I ask:  Have we already reached that point where we talk to young people about September 11 the way our grandparents always talked to us about World War II?  I can’t believe I’m saying this, but:  Where does the time go?

While debate still rages, up until now my own definition of what it means to be a Millennial has been that the formative global event of your life—even if only viewed on television—was the act of evil committed in New York and Arlington, Virginia, 13 years ago today. For me and pretty much everyone in my graduating class, it most certainly was, if only because nothing else was quite so interesting.

Yet here are members of my generation—contemporaries, as it were—for whom September 11 means nothing because they were born just a few years later than I.

Indeed, Richard Linklater’s seismic new movie Boyhood, which effectively bottles up the Millennial experience for all future generations to consider, begins sometime in 2002, with a protagonist just old enough to be aware of the attack but too young to understand what it means.

Even as the film progresses—it covers 12 years in all—the only allusions to 9/11 are indirect or after-the-fact, such as when a young soldier recounts his tours of duty in the Middle East or when the boy’s dad rants about how the Iraq War was one big scam.

But the event itself seems to have had no immediate effect on this family.  It’s just something that happened far away at some point in the past.  So far as the movie is concerned, the world prior to September 11, 2001, is not worth mentioning.

So perhaps I had everything all wrong:  When the dust clears and the timelines are adjusted, maybe Millennials will be defined not as the generation on which 9/11 had the deepest impact, but as the first generation on which it had none at all.

In any case, it’s not like September 11 has grown any less important over time.  Au contraire.  With each passing year, it becomes ever clearer how the reality of so much of today’s world, good and bad, is a direct consequence of that horrible day, whether it should be or not.

To wit:  With no 9/11, there would have been no Iraq War.  With no Iraq War, there would have been no opportunity for a young, charismatic state senator from Illinois to oppose said war and rise to national prominence just in time for the anti-Bush backlash in 2008.  And with no President Barack Obama…well, I leave you to fill in the blanks.

(This is to say nothing of the effects of the Iraq War on the Middle East itself, but in the interest of time, I’ll say nothing of them.)

Every big political event has a way of altering the assumed trajectory of history, but 9/11 is still the Big Kahuna of our time.  It may not have “changed everything” right away, but 13 years out, we find there is very little about our lives that it did not change.  Its shadow only grows.

So in a way, it almost doesn’t matter that an increasing proportion of the world’s population didn’t experience the attack in real time.  For those born from the late 1990s onward, the post-9/11 world is the only world they know, and since it’s the only world we now occupy, there is little cause for alarm.

As someone who was already a teenager on the fateful day, and who saw the smoke billowing from Ground Zero from the top of a hill in my hometown in Westchester County, I guess I just didn’t expect this moment to come so quickly.  I wasn’t prepared to treat my own personal memory of 9/11 as something precious—something that wasn’t also shared, in one way or another, by every other person on planet Earth.

For this emerging generation—Millennial 2.0, perhaps?—I don’t know whether to feel sorry or envious.  On the one hand, today’s teens have never known the relative peace, quiet and civil liberties of the pre-9/11 era.  On the other hand, they also do not know what it is like to lose them.

We Still Can’t Handle the Truth

Sometimes our politicians lie to us.  Sometimes we give them no choice.

Toward the end of a debate last week among the three Democratic candidates for Massachusetts governor, the moderator asked each of them, “What is your biggest weakness?”

Steve Grossman, the long-shot in the race, answered, “When I should be using 10 words, I tend to use 20.  I’m a little long-winded.”

Martha Coakley, the front-runner, offered, “Coffee.”

And from Donald Berwick, who has been polling in the single digits:  “I have a very big heart.  My compassion drives me to want to help, sometimes more than I can.  But I’ll never stop trying.”

I recount these responses in ascending order of hilariousness (in real time, they were given in reverse) to illustrate how queries that look valuable on paper can become utterly useless in practice, how some political clichés write themselves, and how sometimes it’s our own damn fault.

Let us consider this question, “What is your biggest weakness?”

On the surface, it seems like a perfectly sensible thing to ask the people who wish to become our representatives in government.  Knowing a prospective leader’s virtues and strengths is all well and good, but it is often in one’s faults that one’s true measure can be taken.  In any case, there is enormous benefit to gleaning as much info about such a person as possible.

(To wit:  In 1992, Bill Clinton struck most voters as intelligent and empathetic, but knowing that he was also a horndog and a liar would have saved us a great deal of grief later on.)

However, this ostensible advantage of probing for one’s personal flaws is immediately negated by a very obvious problem:  Very few politicians are going to offer up their ugliest warts voluntarily.  Doubtless some don’t think that they have any, while the rest will simply exercise their Fifth Amendment right not to incriminate themselves in front of an unforgiving public in the middle of a campaign.  How stupid do we think they are?

What we are really doing with this “greatest weakness” question, and others like it, is setting our candidates a big, fat trap.  We are daring them to be honest about themselves in a way that would immediately repel us if they actually complied.  We are making hypocrites of ourselves and liars of our candidates.

We say we want our leaders to be straight with us, but what would happen if they actually were?  What if, say, Walter Mondale said in 1984 that he would raise all of our taxes because it was the fiscally responsible thing to do, or if Jimmy Carter in 1979 implored everyone to use less gasoline for the same reason?

Oh that’s right:  They did say those things, and it cost them their political lives.

Strictly on the matter of one’s personal shortcomings, the principle is the same:  Honesty is demanded, and then punished.  We know this to be true, and so do our politicians.

The result, then, is the string of predictably stupid responses like we got in Massachusetts last week.  They range from the safely banal—Coakley’s professed powerlessness in the face of caffeine, as if anyone today would ever dock points for such a thing—to the full-throated humble brag—Berwick’s apparently crippling empathy and sense of social justice (the horror!).

Steve Grossman, citing long-windedness as his most regrettable quality, came the closest to actually answering the question, although even his response could hardly be called politically risky:  Presumably his uncommon verbosity was already known to those following the race closely, and anyway, isn’t “I talk too much” essentially a modest-sounding way of saying, “I’m just so darned smart that I can’t help myself”?

But at least he recognized, however sheepishly, that he is not perfect and, unlike his opponents, is capable of genuine introspection in front of a camera.

That, in the end, is the secret to mastering (or at least surviving) this whole “straight talk” dance with the public:  You have to be honest, but not too honest.  You have to be prepared to acknowledge certain faults, but not ones that might actually get the voters to think twice about supporting you.

You have to speak the truth, but not the whole truth.  We, the people, are far too fragile to handle it.

Not All Clowns Are Sad

Dying is easy.  Comedy is hard.

This quip, or some variation thereof, has been attributed to just about every great comedian who has ever died.  Few doubt that it’s true—particularly the second part—although even fewer understand how very true it is.

Of course, the only people who can fully appreciate the singular challenges of stand-up comedy are those who have actually done it.  We who haven’t can only use our imaginations.

In light of the recent suicide of Robin Williams, our culture has come to conflate humor with sadness and dysfunction.  As a rule, we are told, America’s funniest citizens are also its most insecure, owing either to a traumatic childhood (and/or adulthood) or some mental illness that cannot quite be accounted for.

“While I don’t know what percentage of funny people suffer from depression, from a rough survey of the ones I know and work with, I’d say it’s approximately all of them,” wrote David Wong of Cracked.com.  “Comedy, of any sort, is usually a byproduct of a tumor that grows on the human soul.”

Reading such things, both before and after Williams took his own life, I could not help but think, “Thank God I’m not funny.”  The gift of comedy might allow one to bring joy to millions, but if it also requires—and is a direct consequence of—incalculable misery within oneself, I would just as soon do without.  I understand the notion of “suffering for one’s art,” but personally, I’d prefer not to suffer and not be called an artist.  Seems like a reasonable trade-off to me.

However, many folks are unwilling or unable to settle for a life of comfort and risk-aversion—they’re just too damned funny—and last week we lost another such specimen in the person of Joan Rivers.

Watching Joan Rivers: A Piece of Work, the 2010 documentary that follows its subject for a year and also serves as a career retrospective, we find a natural-born comedienne afflicted with all sorts of personal and familial quirks, though depression was not necessarily among them.

Rather, what the documentary portrays above all is a woman who achieved great fame and success as a comedic performer through good old-fashioned hard work.  In so doing, it shows stand-up comedy itself to be not just a calling—something either you have or you don’t—but a job like any other, requiring perseverance and resolve, raw talent and the understanding that you could be rejected a thousand times in spite of it, as Rivers most assuredly was.

There is one moment in particular in Joan Rivers: A Piece of Work that brings the preeminence of a strong work ethic into sharp relief.  It comes when Rivers directs us to an old filing cabinet in her Upper East Side penthouse—a set of drawers much like those one used to find in a library—and we are informed that it contains every joke that Rivers has ever written, organized alphabetically by subject matter.

In other words, Rivers didn’t become a comedy legend because she was depressed.  She became a comedy legend because she harnessed every iota of comedic potential in her politically incorrect brain, wrote it down, worked it out, and never took a day off.

Certainly, one can do those things and also be depressed.  One can also be a brilliant improvisational star, as Robin Williams was, without doing any particular prep work.  No two comics work in exactly the same way.

What Rivers demonstrated, in any case, is that sometimes the secret to comedy is not as dark as we are often led to believe.  Sometimes a clever mind, a strong constitution and a little bit of luck is all it takes.

The scene with Rivers’ filing cabinet put me in mind of an equally hard-working contemporary of hers, George Carlin.  Known above all as a zany anarchist on stage, Carlin could easily give the impression of improvising on the spot.  In fact, Carlin, who died in 2008, was a meticulous craftsman and wordsmith who spent months composing, revising and fine-tuning his act on paper before trying it out in front of an audience.  He was as much a writer as a performer.  It’s a testament to his skill at both that you would never know it from watching him.

Carlin was one other thing, too: happy.  Raised by a single mother, he had a fairly typical childhood in an agreeable middle-class neighborhood in northern Manhattan.  While he regularly went after the Catholic Church in his routines (along with every other religion), he insisted that his actual Catholic school experience was utterly benign and sometimes outright enjoyable.  He was married to the same woman for 36 years (until her death), and then to another woman for 10 years (until his death).  While he more than dabbled in every illicit substance he could get his hands on, his drug use never seemed to have a deleterious effect on his life or his career.

Perhaps Carlin is simply an anomaly in this respect, as he is in most other respects.  Or perhaps he had demons like everyone else and was just really good at concealing them.  We’ll never know for sure.

But so far as we can reasonably surmise, he was a normal, healthy guy who conquered the world of stand-up comedy through sheer determination and uncommon intellect, and without the supposedly necessary baggage of depression and perpetual discontent.

Much like Joan Rivers.

Fat Facts

To the layperson, it is emblematic of the irony and delight of science that eating foods with high levels of fat can help one lose weight.

It seems counterintuitive, if not outright impossible, that such a thing should be true—how can fat make you less fat?—yet it has apparently been confirmed in the past week in what has been billed as a major new study on health and nutrition funded by the National Institutes of Health and published in the Annals of Internal Medicine.

For this experiment, researchers recruited 150 people to follow a certain type of diet for one full year.  Half of the subjects were instructed to eat mostly foods rich in carbohydrates and low in fat—bread, cereal and the like—while the other half did the reverse, roughly mimicking the Atkins diet of meat, eggs, cheese and other staples high in protein and fat.  Neither group was required to alter its exercise habits, nor, interestingly, was either subject to any calorie limits.

When the year was up, both groups had managed to lose weight, but the people in the low-carb, high-fat group had lost eight pounds more on average.  Moreover, the Atkins-like regimen proved more successful in reducing total body fat while increasing total muscle mass, thereby making one less susceptible to heart disease and similar maladies down the road.

In fact, the notion that dietary fat can help you lose weight is not a new discovery.  To the contrary, many nutrition experts (not just Dr. Atkins) have advocated such an approach to dieting for quite some time.

What is more, the basic science behind this phenomenon is not terribly complicated.  In brief:  High-fat, high-protein foods fill you up and keep you feeling full, while certain carbohydrates—particularly sugar—fill you up but make you think you’re still hungry.

In other words, the difference between fats and carbs is as much a mental issue as a physical one.  A serving of meat may contain more calories than a serving of cake, but the meat will satisfy your hunger until your next meal, while the cake will just make you eat more cake.

This much was old news to most health experts (if not to ordinary Americans), and it helps explain why the people in this new study, permitted to consume as many calories as their hearts desired, would yield the results that they did.

It’s reassuring when things happen exactly as science says they will.

As to the matter of high-fat foods strengthening muscle mass (and vice versa), I can affirm the hypothesis from personal experience:  As a little leaguer in my mid-teens, I could crank baseballs over the outfield fence with minimal effort.  Today, in the aftermath of a sustained culinary regimen consisting largely of bagels and whiskey, I can barely lift the bat over my shoulders.  (Not that I ever try.)

My object at the time was simply to lose weight, which I did, but it hardly occurred to me that merely avoiding fatty foods was not the only way to do it—nor, more importantly, that it was probably the worst way of all, inasmuch as it would result in depleting precisely the sort of bodily material—namely, lean muscle mass—that one ought to retain no matter how many pounds one wishes to shed.

I’d like to think that I was just being an idiot and that everyone else knows these sorts of vital physiological facts.  Yet I suspect that this is not actually the case, and that much of today’s dieting community is still going about it all wrong.

They are hardly to blame for it.  For starters, America’s health professionals, for all their tireless research, have hardly arrived at a consensus as to what is truly “right” and “wrong” when it comes to leading a healthy lifestyle.  We’re pretty sure about fruits, vegetables and exercise, but everything else is forever subject to revision and to an individual’s own requirements.  As Lewis Black said, “What’s good for one of you will kill the person sitting next to you.”

Further, as the articles about this new study point out, the idea that high-fat foods are good for weight loss was anathema to most Americans until fairly recently.  The conventional wisdom used to be precisely the opposite, hence the proliferation of food products advertised as being “low-fat” and “fat-free,” while all the time containing the sorts of added sugars and other undesirables that, as the science would now have it, have probably been slowly killing us all along.

But that, in the end, is what science is all about.  It’s a continual search for truth that rarely occurs in a straight line.  Often, it is an exercise in irony, as we find things to be true that we always assumed to be false, and vice versa.  In one sense, this means taking a leap of faith in the promise of the scientific method itself.  But then the scientific method is specifically and painstakingly designed to require no such faith at all.