Sore Losers and Abysmal Winners

There has been no finer thing to happen to the New England Patriots in the last 15 years than losing Super Bowl XLII to the New York Giants in 2008.

With the possible exception of losing Super Bowl XLVI, again to the Giants, four years later.

Those twin defeats, lest we forget, followed a veritable golden age of Patriots football—the brief but intensely gratifying era, beginning in 2001, that saw Bill Belichick’s boys secure three Super Bowl victories in a mere five seasons, sittin’ pretty atop the world of professional sports.

Suffice it to say, so much success in so little time had rendered the entirety of Patriot Nation a little too self-satisfied.  They needed a handful of particularly stinging losses to bring them back down to Earth, up to and including the team’s rather embarrassing performance in this month’s AFC Championship game against the Baltimore Ravens.

The fact is, there is nothing more corrosive, boring or insufferable in all of American sports fandom than a team that does nothing but win.

In the NFL for the last decade, that team has indeed often been the Pats, and it was very much in the national public interest for them to fall on some (relatively) hard times.

I say this as a sort-of Patriots fan—that is, as a person who enjoys football, lives in New England and doesn’t particularly like any other team.

More to the point, however, I say this as an old-time supporter of another storied local sports franchise, the Boston Red Sox.

You see, although I was born in Boston, the formative sports-spectating experience of my life was spending my adolescence in New York during the late ’90s glory days of the Yankees, for whom a World Series championship was considered nothing less than a birthright.

Day in and day out, the Evil Empire would pull off one improbable victory after another, rendering their supporters drunk on success, and there I stood in the corner, a loyal fan of an opposition for whom disappointment and defeat—often at the hands of the Bronx Bombers themselves—was a fall ritual as inevitable as the changing of the leaves.

Accordingly, it is with considerable experience and authority that I can attest to the horrible things that tend to happen to a cluster of fans that comes to feel entitled:  The arrogance.  The rudeness.  The lack of introspection, self-awareness or any very advanced sense of humor.

None of these are personality traits that endear one to one’s community—or at least to those who are not fanatic fellow travelers.

As a Sox fan, I was naturally repulsed by the behavior of my Yankee peers throughout the regular season and the playoffs, and almost relieved by my own team’s knack for losing, knowing that, for all its agonies, it would at least keep me humble.

In a way, I feared what might happen should the Red Sox suddenly end a century of tradition and actually win the damn championship.

When they finally did, in 2004, I certainly fell victim to waves of euphoria and a general feeling of invincibility.  In time, however—to my enormous relief—I came to agree with Boston Globe columnist Dan Shaughnessy’s half-tongue-in-cheek remark that he rather missed the old team—the one who could be counted upon to disappoint in the end.

Winning it all just felt odd, and I confess I abandoned much of my old passion for the club following a second championship in 2007.  A triumph once in a while is nice, but let’s not get carried away.

Much the same was the case for me with the Patriots:  They and their many victories were making me feel too much like a Yankees fan circa 1999.  I was dangerously close to becoming what I used to detest.

In one of the many choice dialogue exchanges in Quentin Tarantino’s Django Unchained, Leonardo DiCaprio’s Calvin Candie spews to Christoph Waltz’s Dr. King Schultz, “You, sir, are a sore loser,” to which Schultz retorts, “And you are an abysmal winner.”

My experience with sports has taught me that the cliché is true:  “It’s not whether you win or lose, but how you play the game.”  It is infinitely better to lose with grace than to win without, because that other cliché—the one about how character counts above all else—is true as well.

Losing imparts wisdom, where winning tends only to impart hubris.  Victims of the latter—from the New England Patriots to the Republican Party—have learned this lesson the hard way.  Sooner or later, they might be very thankful that they did.

Our Sloppy Selves

Two recent news items for your consideration.

First, a story from the New York Times Green blog about “Think, Eat, Save,” a global initiative to combat the various ways in which the world squanders valuable food, be it through poor harvesting, poor transportation or simply poor shopping habits.

(A representative of the Food and Agriculture Organization makes the point that one quarter of the food we waste annually would be enough to feed all the world’s hungry.)

And second, the suggestion by bioethicist Daniel Callahan, as part of a new paper, that the solution to America’s famed obesity epidemic is to stigmatize the overweight as much as possible—a practice otherwise known as “fat-shaming.”

A former smoker, Callahan writes, “The force of being shamed and beat upon socially was as persuasive for me to stop smoking as the threats to my health […] why is obesity said to be different from smoking?”

Taking these two snapshots of today’s world in tandem, we are left with a doozy of a mixed message:  In our culinary activities, we Americans have simultaneously managed to be extremely wasteful and extremely gluttonous.

Perhaps this is what we mean by “American exceptionalism.”

Of course, there are all sorts of avenues we might traverse in order to reconcile our culture’s weird, complicated and unholy relationship with food—hopefully in the service of national self-improvement.

For instance:  Why not view our massive national waistline positively by concluding that, when our mothers commanded us to finish our supper because there are children starving all over the world, we obeyed?  As the numbers thunderously suggest, we finished our supper and kept on going.  Way to go, us!

Please do not think me overly flippant in saying this.  Eating at restaurants growing up, I always made a point of licking my plate clean, knowing that whatever food didn’t end up in my mouth would end up in the trash.  I figured the former was preferable to the latter, and food is delicious, so it was a win-win—environmentalism used as a cover for my natural piggishness.

Even as I have since reined myself in, discovering such wondrous innovations as the doggie bag and basic table manners, I find myself nonetheless casting a critical eye upon the heaping piles of uneaten, perfectly palatable grub my fellow primates manage to accumulate.

At a movie house, for instance, once you’ve shelled out an hour’s wage for a single bag of popcorn, wouldn’t you then at least go to the trouble of gobbling up more than three or four handfuls of it?  The number of nearly-full bags one finds as the end credits roll is striking.

Having said all of this, I cannot quite bring myself to recommend mindless face stuffing merely for the sake of saving a few square inches of landfill somewhere down the road.

For any halfway-savvy consumer, this is a false choice.  The real trap door out of the waste-gluttony loop is nothing more complicated than buying only the food you intend to eat and not eating too much of it at any one time.  Voila.

Accordingly, one can understand the sentiment behind Daniel Callahan’s proposal for an all-out assault on obesity, rendering supersized appetites a socially unacceptable public health pariah.  If Americans could be made to curb their culinary urges before they begin, there would be far less food available to waste.  Two birds, one stone.

For his part, Massachusetts Governor Deval Patrick last week introduced a budget proposal for his state that includes higher taxes on candy, soda and cigarettes, raising the inevitable and crucial question of what role the government ought to play in sorting out all of this hullabaloo.

One could argue that even if the state has no business telling you what you may or may not consume, it nonetheless has a direct and vital interest in controlling the nation’s annual waste deposits—a task that simply cannot be handled by individuals alone.

This is the balance we ought to strike if we are to reckon with our twin manifestations of sloppiness, as we very probably should.  So long as we have only one life to live and one planet on which to live it, let us do so with the utmost care and precision.

It would be a waste not to.

Civil Rights Status Update

I must admit, the first time I watched President Obama’s second inaugural, I completely missed Stonewall.

It came toward the end of the president’s address, and evidently my attention was beginning to drift.

So I watched the speech a second time, and I missed the reference a second time.  It took reading the transcript for me to finally track down the first-ever citation of the gay rights movement’s inception in a major presidential address.

As something of a history buff, I am prone to lapse into fits of nostalgia.  Because I find certain junctures in the past so very interesting, I cannot help but think how much fun it would be to fire up my flux capacitor and spend some time living there.

It is a common American practice, even in the best of times, to wallow in our greatest hits, pining wistfully for our country’s innocent early days.  Life was so much better back then, was it not?

President Obama’s inaugural speech was a useful reminder, if we needed one, of how very wrong that is.

For the uninitiated:  Stonewall is the name of a gay bar in Greenwich Village that was raided by police in June 1969, provoking a series of riots that led, in the messiest possible way, to the modern push for gay liberation and gay rights.

The president paired his mention of Stonewall with those of Seneca Falls and Selma, comparable flashpoints in the struggles, respectively, for the rights of women and blacks.

The message of Obama’s alliterative allusion—intended or not—is that there has never been a better time to be alive in America than now.  As exciting as the big bangs of various civil rights movements must have been for those on the front lines, to actually live in those times, on a day-to-day basis, was infinitely more painful than it is today, and is a fate not to be wished upon anyone, not least upon oneself.

We romanticize the past, but we do so at our extreme peril.

Surely no cognizant person today needs to be informed why, say, a typical black person might prefer life in the present to, say, the Deep South in 1955.  Or why a woman would think twice before voluntarily zapping back to 1919.  The indictments against America’s past indiscretions write themselves.

It is simply a fact that the majority of U.S. history has pretty well sucked for members of most minority groups.  To recognize this is a healthy and necessary check on the usual banging on about how America can do no wrong and has, from day one, been the greatest country in the history of forever.

As if this truth were not uncomfortable enough, we must then face the logical next step—and the secondary implication of the president’s speech—which is that, just as surely as we view the present as a vast improvement over the past, we will someday regard the time we currently occupy with a very critical eye as well.

It is extraordinary that same-sex marriage is now legal in nine states, when less than a decade ago it was legal in none.  (For good measure, consensual gay sex was itself illegal in 14 states until the Supreme Court intervened in 2003.)  But what will we think in, say, another decade or two when the practice is universal?

I suppose it is a half-empty vs. half-full situation, realizing how far we have come but also how much farther we have yet to go.

The essential point is that one is entitled—perhaps even duty-bound—to take both views, not having to choose one over the other.  America is a land of contradictions, and we have no cause to deny it.

After all, it is our contradictions that compel us to continually push ourselves to be better.  They are what led our defense secretary, just this week, to announce our armed forces will finally permit women to serve in combat.

And they are what led a U.S. president who eight months ago was publicly opposed to same-sex marriage to proclaim to a worldwide audience, “If we are truly created equal, then surely the love we commit to one another must be equal as well.”

Surely, indeed.

The Row Over ‘Roe’

Do you ever suspect that some issues exist merely so that we can argue about them?

Lewis Black surmised as much, in one of his stand-up routines, about the scuttlebutt surrounding “under God” in the Pledge of Allegiance.  The ongoing national debate about whether the appositive phrase is unconstitutional, Black dryly assured his audience, is the kind of contention we should “save for a peaceful time when it makes sense to have an argument.”

Which brings us to abortion.

It was forty years ago today that the U.S. Supreme Court established the right to a first trimester abortion in Roe v. Wade.  We have been fighting about it ever since.

One reason I include abortion among the subjects destined for eternal argument is the fact that it is a practice about which one is almost not allowed to be neutral or conflicted.  Nuance is highly frowned upon.

Consider Mitt Romney.  Running for president in 2008, the former Massachusetts governor attempted to reconcile his perceived shiftiness on the subject by saying the following:  “I’ve always been personally pro-life, but for me, it was a great question about whether or not government should intrude in that decision.  And when I ran for office, I said I’d protect the law as it was, which is effectively a pro-choice position.”

This is a wholly consistent and honorable position to assume—recognizing the difference between individual views and the purpose of government—but few let Romney have it.  It was just too darned complicated.

And why is complexity so often the enemy of all sides in the abortion discussion?  A big, fat clue can be found in the language we employ.

As we know, neither side views itself as being “against” anything.  Those who support a right to abortion call themselves “pro-choice,” while those who oppose it call themselves “pro-life.”

As though such terms were not loaded enough, neither side accepts the other’s self-classification, opting to up the ante further:  The “pro-choice” group describes its adversaries as “anti-choice,” while “pro-lifers” tar their dissenters as simply “pro-abortion.”

For something that is a literal matter of life and death, those on the rhetorical front lines can be forgiven for intemperance.  But that does not make them any less unhelpful in the process of grappling with this most serious and personal of concerns.

Abortion is a subject that demands the rigorous intellectual work of adults.  It would be nice if the adults conducting the debate would cease behaving like children.

It is therefore my humble wish, on this anniversary of a Supreme Court case that turned out to be the beginning of a great national debate rather than a conclusion, that we temper the language with which we conduct this argument, showing each other a bit more respect.  As it stands, we don’t.

To wit:  By calling foes of abortion rights “anti-choice,” one is effectively laying a charge of authoritarianism.  I ask:  When was the last time calling someone authoritarian made him or her more sympathetic to the logic of one’s argument?  I would hazard a guess that precisely the opposite is usually the case.

To call oneself “pro-life” is no less incendiary, as it dismisses the other side as a gaggle of unholy nihilists.  Of course, as the “pro-life” crowd regularly refers to abortion itself as “murder,” it cannot be faulted for a lack of ideological clarity.

But what these purveyors of rhetorical fireballs equally project is a deep-seated immaturity that hurts their own cause as much as it might puncture that of the opposing team.

The success of any movement ought to rest on the strength of its arguments, not the passion of its participants.

If the “pro-life” side truly believes in the validity of its case—that a fetus is a human being entitled to the rights and privileges of any other human being—it should not need to call its opponents cold-blooded murderers.

If the “pro-choice” clan truly thinks abortion is justified, it might consider acknowledging that not everyone who disagrees is a fascist and a misogynist.

These are not concessions.  They are mere recognitions of reality that would do wonders in civilizing and clarifying a debate that deserves worthy debaters.

This week we marked the second inauguration of a man who sought to “change the tone” of American politics.  Thus far, he has failed, but that does not mean the rest of us cannot succeed.

Discounting Seniors

Why are young people so bad at interacting with old people?

It is something I have wondered about for a very long time, and the thought sprang to mind again as I watched Michael Haneke’s Amour, which centers on an elderly man and woman, Georges and Anne, and their struggles to coexist with the much younger folk in their lives.

First is their daughter, Eva, who sits at her stricken mother’s bedside talking incessantly about money, oblivious to Anne’s obvious physical discomfort and utter lack of interest in such trivial matters.

Then there is Alexandre, a piano prodigy and former student of Anne’s, who visits shortly after her stroke and lacks the tact and maturity to overlook the physical maladies she plainly does not wish to talk about.

Finally, the last straw:  The horrible visiting nurse, devoid of any and all empathy for her patients, who is shocked—shocked!—to be dismissed for incompetence, huffily telling Georges she has never been so reprimanded in her life.

There is a disconnect here that extends far beyond the terrain of Haneke’s film.

On certain social and political matters, the existence of a marked generational divide is both clear and uncontroversial.

The story of the 2008 presidential election was precisely one of the young versus the old.  Among voters under 30, Barack Obama beat John McCain by a score of 66-32, while McCain beat Obama 51-47 among voters 60 and older.  The 2012 election saw similar figures for Obama and Mitt Romney.

The hot social issue of the time, same-sex marriage, follows the same thrust, supported by a supermajority of young’uns and opposed by a majority of old fogies.

Of course, we could recite statistics till the cows come home.  It doesn’t mean that we necessarily know anything.

The case for the inevitability of gay marriage acceptance is as follows:  Today’s old people are the final holdouts against the cause, so it is only a matter of waiting for them to die before support becomes universal.

The problem with this formulation (apart from its abject callousness) is the assumption that the opinions and general disposition of most old people are a simple function of the time in which they came of age.  That is to say, a person’s outlook stays more or less the same throughout his or her life—whatever one believed about gay marriage in 1963 will be retained in 2013 and forevermore.

Thinking in this way, we neglect to consider the effect of aging itself on a person’s inner and outer natures.

Recall the old witticism, widely but falsely attributed to Winston Churchill, “If you’re not a liberal when you’re 25, you have no heart.  If you’re not a conservative by the time you’re 35, you have no brain.”

While the particular sentiment here is a bit flippant, its broader implication is an essential one:  With age, people change.

We do wrong, in other words, in regarding people of different ages as if they are somehow of a separate species that we will never completely understand.

They are, rather, a species we will ourselves one day become.  We should be careful not to assume otherwise so boldly, on marriage or anything else.

I hasten to add that I make these observations not merely as a check against treating one’s elders with scorn, but also against treating one’s elders with an overindulgence of affection.

A particular bugaboo of mine is the way so many of us—with the purest of intentions, to be sure—speak to old folks as if they are infants, our voices assuming an unnaturally high pitch and our words carrying an air of condescension, as though old age and senility were indistinguishable from one another and the former implied the latter.

I am skeptical whether most of the recipients of this odd behavior appreciate it.  While I certainly should not presume to speak for an entire generation of people, I am fairly confident that in my own later years, should I reach them, I would hope to be spoken to like the adult I would then be.

In the meanwhile, I try my level best (with varying results) to regard all my fellow Homo sapiens as if age did not separate us, and was nothing more than the series of cosmic accidents that it is.

We are all one.  That’s amour.

That’s ‘Amour’

In this month’s 60 Minutes/Vanity Fair poll, all the questions were about love.

While most of the queries were boring and predictable—“Do you believe in love at first sight?” “How important is good sex to a successful relationship?”—there was one consideration that caught my eye, and is worth pondering at greater length:  “Which marriage vow is the hardest to keep?”

Is it “To always be faithful”?  “In sickness and in health”?  “For richer and for poorer”?  Or perhaps, simply, “For better and for worse”?

In other words, how might we truly take the measure of one’s love for someone else?  That is, of course, assuming such a thing can be measured at all.

These impossible questions are the subject of Amour, the amazing new movie by Michael Haneke, which opens in Boston this weekend.

What the film is about—indeed, all that the film is about—is that it’s easy enough for two halves of a marriage to declare their love for one another when they’re young, healthy and relatively carefree.  It is with the arrival of difficulty, disease and death that the measure of one’s devotion is put to the test.

Amour is the story of Anne and Georges, a long-married couple now in their 80s.  After a lifetime of mutual self-sufficiency, Anne suffers a stroke and requires Georges’s support—moral and physical—in ways neither of them is used to or particularly adept at handling.

What makes Amour great—nay, what makes it tolerable—is its understanding that true love, in the context of a long marriage, has very little to do with sex or even romance, and everything to do with commitment, sacrifice and accepting that some things are more important than your own happiness.

In one sequence, we see Georges helping Anne drink a glass of water through a straw, something she is no longer able to do herself.  Anne is deeply demoralized by having to go about such a basic task in this manner, and Georges’s own impatience is evident as well.

Georges’s measure of devotion here is proved not by the pleasure he might derive from assisting his wife, but by the obvious agony.  Scenes of him helping Anne off the toilet, raising her from bed and cutting up her vegetables make a similar point:  He doesn’t particularly enjoy doing any of these things, but his marriage vows demand it.

The movie contains no musical score, no moments of overt melodrama, no yelling and shouting—no “action,” at least by the standards of conventional cinema.  Amour is largely a series of long, static shots as the characters carry on their lives as best they know how.

As a movie, Amour would be unbearably tedious were it not so well-acted, well-directed and, well, true.  It is dramatic in the sense that life itself is dramatic.  It works because we understand why Anne and Georges behave as they do—even if we might have acted differently in a comparable situation.

But then we can’t know such things until they actually happen.  People express love in different ways, and there are certain forms we might not notice or appreciate until after the fact.  In his first Late Late Show monologue following the death of his father, Craig Ferguson very affectingly recounted the way his father never expressed emotion, but that through four decades of hard work as a postal worker, providing steady support for his wife and kids, “I was never in any doubt that he loved me.”

In its way, Amour is a cautionary tale against entering into a marriage lackadaisically, not taking the commitment seriously and not thinking things through.  It is an institution that is not for the fainthearted.

As America grapples with the changing meaning of marriage in today’s society, we have come to recognize that for a time marriage was largely about commitment, but that today it is largely about love.

What Amour suggests above all else is that these two enigmatic concepts are not mutually exclusive.  Those traditional marriage vows, as old as the hills, are not a hindrance to true love, but rather are the means to its fullest expression.  For better and for worse.

Trivial Matters

Perhaps you missed the news, but something mildly remarkable—yet largely unremarked-upon—occurred at last Thursday’s announcement of this year’s Academy Award nominees.

That is, David O. Russell’s screwball romantic comedy Silver Linings Playbook became the first movie in 31 years to secure nominations in all four acting categories, with citations for leads Jennifer Lawrence and Bradley Cooper and supporting players Jacki Weaver and Robert De Niro.

Never in the history of the Oscars has a single film won in all four departments (Silver Linings Playbook is not expected to, either); on only two occasions has a single film won in three (A Streetcar Named Desire and Network).

A much likelier headline on Hollywood’s biggest night, February 24, is for Daniel Day-Lewis to become the first man ever to win three Oscars for performances in leading roles, having won in 1989 for My Left Foot and in 2007 for There Will Be Blood.  Katharine Hepburn won for Lead Actress four times; to date, no man has won more than two.

If all else fails, perhaps Quvenzhané Wallis will win and become the youngest Best Actress in history or, alternatively, Emmanuelle Riva will win and become the oldest.

What does this all mean?  You guessed it:  Not a goddamn thing.

It’s trivia—the little bits of information we have no reason to know or care about, except it’s just too much fun.

Everyone has their area of expertise—the one subject on which their breadth of knowledge is unquestioned and completely out of proportion.

Like Bradley Cooper’s character in Silver Linings Playbook, one of my pet subjects is U.S. presidents.  Whenever I meet a fellow presidential trivia buff, it is only a matter of time before one of us asks the other what the “S” in “Harry S Truman” stood for.  (Answer:  Nothing.)

My inkling is that everyone wants to be a go-to human encyclopedia about something.  To an extent, it doesn’t really matter what it is.  The more obscure, the better—after all, you don’t want too much competition.  Otherwise, you risk being Steve Carell in Little Miss Sunshine, whose claim to fame is being the country’s No. 2 scholar of Marcel Proust.

The reason our obsession with trivia is so vexing, and so interesting, is that it is so meaningless.

For all the fun facts about U.S. presidents I have managed to cram into my head, I am under no illusion that they are of any practical use.  That the Maxwell House slogan “Good to the Last Drop” was coined by Teddy Roosevelt might be amusing, but it tells us absolutely nothing about U.S. history.

I wonder:  Is there some evolutionary reason for this seemingly irrational attraction to the inconsequential?

Dave Barry has expounded at length about the curious way our brains seem wired to store utterly useless information (and really annoying pop songs) at the expense of more pertinent things like credit card information and where we left our keys.

Could the cause of this phenomenon also explain our inclination to memorize frivolous data on purpose?  Are we trying to protect ourselves from seriousness and profundity, from exerting ourselves beyond what is absolutely necessary?

No doubt there is a wealth of research by cognitive scientists that can provide explanations for all of these questions, which I could freely look up at any old time.  But somehow I’m not feeling up to it today.  Too much Oscar trivia to get to.

The President’s Pastor Problem

President Barack Obama will formally begin his second term next Sunday, January 20, and on the following day the nation will mark his second inauguration on the steps of the U.S. Capitol in Washington, D.C.

While provisions for the ceremony have largely proceeded according to plan, the administration ensnared itself in one significant controversy by, for the second time, hiring a radioactive clergyman to deliver the pageant’s benediction.

In this case, the pastor in question is a gentleman by the name of Louie Giglio, an evangelical from Georgia best known as the founder of the Passion Movement, an extremely well-attended outfit that holds conferences and events throughout the country.  Giglio has been roundly praised for drawing attention to the horrors of human trafficking.

However, upon the announcement of Giglio’s participation in Monday’s festivities, he became equally known for having delivered a sermon in the 1990s in which he condemned homosexuality in no uncertain terms, calling it “less than God’s best for his creation,” and assailed the gay rights movement as having an “aggressive agenda […] to seize by any means necessary the feeling and the mood of the day, to the point where the homosexual lifestyle becomes accepted as a norm in our society and is given full standing as any other lifestyle, as it relates to family.”

While the latter portion of this thought is undoubtedly true, various gay rights organizations did not care for the pastor’s tone and applied pressure on the Presidential Inaugural Committee to rescind its invitation to Giglio, who swiftly withdrew from the program in the interests of not being a distraction.  The Inaugural Committee, for its part, said it had been unaware of the dicey sermon when it selected Giglio for the Inauguration Day gig.

Predictably—and in light of the similar spectacle of Rick Warren at the 2009 inauguration—many on the left excoriated Obama for again anointing such a divisive figure to the ostensibly unifying role of wishing the president and the country well.  What, they ask, is so terribly difficult in finding a member of the clergy who does not have a record of public condemnation toward gays or any other group?

For me, however, the real question is the one nobody seems to be asking:  Why does the inauguration of the president include a benediction in the first place?

The United States, one must never tire of saying, is a secular country bound by a secular constitution.  We have no official state religion, and our founding documents’ only mention of religious faith is to limit its role in the public square.

We rightly prohibit government-sponsored religious displays on public property, recognizing that the freedom of religion guaranteed by the First Amendment is the business of individuals and private organizations, not the government.  The right to worship includes the right not to worship as well.  As far as the government is concerned, no particular faith is superior to any other, and none at all shall be observed or practiced on the proverbial public dime.

The inclusion of a clergyman invoking religious language in a foundational American public exercise such as a presidential inauguration—as has occurred at every such ceremony since 1937—would seem to be a textbook violation of this most sacred of national principles.

Even if one rejects this whole premise—as the present administration evidently does—it takes no great effort to see why the task of locating a preacher with an unblemished record of inclusion is a troublesome one.

Churches are, by their very nature, exclusionary.  To believe in one god is to reject all the others.  As Richard Dawkins cheekily put it, “We are all atheists about most of the gods that societies have ever believed in.  Some of us just go one god further.”

There is very little, if anything, that all of America’s houses of worship agree on.  Accordingly, anything that one priest, rabbi or imam says that is particular to his or her own faith is destined to offend adherents of other faiths, not to mention some within the speaker’s own church.

Should a religious leader manage an entire address without inciting umbrage from a sizable chunk of the public, he or she does so through an appeal to a common humanism, which only further suggests that said speaker need not be associated with a religious organization.

Why do we need our national quadrennial transfer of power to be “blessed”?  Why invite such controversy to a setting in which it is not welcome and does not belong?  Constitutional questions aside, is it really worth all the trouble?

Etiquette of Fingering

Here is a heart-warming story left over from the holidays.

A woman in Denham Springs, Louisiana, suspecting a neighbor had kidnapped her dog, erected a Christmas-style light display on her roof in the shape of a giant middle finger, directed at said neighbor.

The woman, Sarah Childs, along with the American Civil Liberties Union of Louisiana, is currently suing the city of Denham Springs, after being told by police that she must remove the display, which the authorities said constitutes “disturbing the peace” and is not protected speech under the U.S. Constitution.

In the English language—nay, in all languages—few mysteries are more vexing than the existence of profanity.  How very odd it is that we would invent words only to forbid their use.

The comic George Carlin explored this curiosity to great effect throughout his career, most famously when he singled out the seven most frowned-upon words of all—these being “shit,” “piss,” “fuck,” “cunt,” “motherfucker,” “cocksucker” and “tits”—later expanding the list well into the thousands.

The whole concept of words being offensive might have long died out by now, except that enough of us Homo sapiens have opted to be co-conspirators in this dance by being offended by words.

The middle finger—rather, the extending thereof—is more or less a silent variation of this same concept.  We, as a culture, have decided such a gesture is offensive—an affront to decency—without quite being able to explain why.

A central question in the Denham Springs case—and a worthy question in general—is whether “the finger” is merely rude and disrespectful or is, in fact, obscene.

This distinction is not without precedent.  In 1976, an appellate court in Hartford, Connecticut ruled that raising one’s middle finger at another person is “offensive, but not obscene,” the judge reasoning that “for the finger gesture to be obscene it must be significantly erotic or arouse sexual interest.”  The Connecticut Supreme Court upheld the decision.

As the appellate court noted at the time, “flipping the bird” can be traced as far back as Ancient Greece, where the gesture did indeed carry an explicitly sexual connotation, appearing in the work of Aristophanes and elsewhere, in both playful and contemptuous contexts.

In America, anthropologists trace use of “the finger” to Italian immigrants, with the first recorded instance in the States occurring in 1886 during a baseball game between the Boston Beaneaters (later known as the Braves) and the New York Giants.

The other crucial consideration as to whether Sarah Childs should be made to remove the light display from her roof is the all-important quandary as to when speech can be defined as an action, and therefore restricted.

In the seminal 1919 Supreme Court case Schenck v. United States, Justice Oliver Wendell Holmes, Jr., popularized the hypothetical scenario of “falsely shouting fire in a theatre and causing a panic” as an example of “words used […] of such a nature as to create a clear and present danger.”

In point of fact, Childs was informed by police that her unsightly roof message could be interpreted as threatening toward her neighbors.  While it is undeniable that the display’s intended and perceived message is one of hostility, I find it difficult to entertain the proposition that one could then come to feel physically threatened by it—unless, of course, Childs rigged it to snap off her roof and tumble onto her neighbor’s driveway as he walked out to get the mail.

My own quaint view is that the right to free expression must include expressions that are rude, disrespectful and—within the boundaries of local laws—intimidating.  Displaying a giant, illuminated middle finger might not endear one to one’s community, and may well incite some kind of a backlash, but that is hardly grounds for prohibiting such behavior.

Letting your entire neighborhood know that you are an immature, sociopathic nut is what freedom is all about.

Light and ‘Dark,’ Part 2

In a recent column, I spent so much time excoriating Zero Dark Thirty—in particular, the disingenuousness of director Kathryn Bigelow and screenwriter Mark Boal—that I failed to mention how very much I enjoyed it.

As much as Bigelow’s follow-up to The Hurt Locker demanded a thorough rebuke, so also does it deserve a resounding defense.

I began last time with a plea from Gustave Flaubert regarding Harriet Beecher Stowe’s Uncle Tom’s Cabin:  “Is it necessary to utter one’s ideas about slavery?  Show it, that’s enough.”  My recommendation was, and is, that Zero Dark Thirty ought to be considered in this “show, don’t tell” context with respect to its depictions of torture.

As an extension of this thought, I offer the following quotation from Andrew Sullivan:  “The case against torture is simply that it is torture, that capturing human beings and ‘breaking’ them physically, mentally, spiritually is a form of absolute evil that negates the core principle of human freedom and autonomy on which the West is founded.  It is more fatal to our way of life and civilization than terrorism.”

If Sullivan is correct, then Flaubert is correct.  If torture is axiomatically, viscerally and morally repugnant, then Bigelow’s film need not make any comment on it other than simply showing it being done.  Those who are repulsed by torture will conclude the movie is against its use, while those who are not might think differently.

It is suggestive of the film’s greatness, not failure, that its politics can be subject to utterly contradictory interpretations by its viewers.  The very existence of a debate over the film’s intentions is the most persuasive argument yet for Bigelow’s and Boal’s contention that Zero Dark Thirty is a movie without an agenda.

I am reminded of the brief, but passionate, brouhaha that erupted in early 2005 regarding Clint Eastwood’s film Million Dollar Baby, in which (spoiler alert!) the character played by Eastwood is compelled to assist in ending the life of a stricken dear friend.  Critics argued that because Eastwood’s character was clearly intended to be sympathetic—the “hero,” as it were—the film was effectively in favor of assisted suicide.

To this, Roger Ebert countered that a freethinking person could just as easily see the film and conclude that Eastwood was a good man who made a bad decision, and that such a phenomenon does not diminish the movie one whit.

I would optimistically wager that a similar case could be made about Zero Dark Thirty, although here it is a bit more complicated—first because Bigelow’s film is based on real events, and second because its implications reach far beyond the conscience of a single person.

My own view, having seen the thing once, is that Zero Dark Thirty does not glorify or justify torture, although one can be forgiven for concluding to the contrary.

The film shows the employment of waterboarding, stress positions and so forth as part of the amassing of intelligence that led to the killing of Osama bin Laden, but it does not suggest that the intelligence that actually cracked the case was a direct result of said “techniques.”

What we see, rather, is a prisoner providing valuable information to two CIA agents as they offer him hummus and fruit across a picnic table in the warm sunshine—that is, as they treat him with basic human dignity.

The complication is that this follows a sequence in which this man is indeed tortured, and his present tip-off might well spring from fear of being tortured again.  Would he not have cooperated had he been treated humanely the whole time?  Or, perhaps, might a lack of torture have made his information even better?

It is a complex and nasty business.  Good on Bigelow for dealing with complexity and nastiness.  Few American filmmakers go to such trouble.  I wish more of them did.

Of course, we are hardly done with the hard questions about the long journey from September 11, 2001 to May 1, 2011.  Was torture necessary to gather the intelligence we required to conduct the so-called war on terror?  If so, does that axiomatically make it justified?  Or is Andrew Sullivan correct that some things—certain fundamental American values—are simply more important?

On the practicality question, I refer you to Nice Guy Eddie from Reservoir Dogs, who cautioned a pair of cop-torturers, “If you beat this guy long enough, he’ll tell you he started the goddamn Chicago fire—now that don’t necessarily make it so!”

On the moral question, I leave it up to you.