The Definition of Family

My mom and dad got married 28 years ago today, and in spite of all they have in common, they are still together.

On this final day of the most popular month of the year for weddings, let us reflect upon the rather momentous recent development in the history of marriage in the United States—namely, the twin Supreme Court rulings last Wednesday that struck down Section 3 of the Defense of Marriage Act and, in effect, rendered California’s Proposition 8 moot.

The first decision establishes the equal treatment of same-sex unions under federal law, while the second allows such unions to resume and flourish in America’s most populous state.

Asked for his reaction to these pronouncements, New Jersey’s governor, Chris Christie, reiterated his opposition to gay marriage and his view that the issue ought to be adjudicated by the public rather than the judiciary, offering the familiar trope, “You’re talking about changing an institution that’s over 2,000 years old.”

Indeed, this question about “changing the definition of marriage”—the cornerstone of the argument against gay marriage—is one that must always be addressed, and which holds particular interest for your humble servant at the present time.

In addition to my parents’ anniversary, this weekend sees a sizable family reunion on my mother’s side—an assembly, as such events are, of all sorts of couples (and non-couples) marking their places within the larger family unit, which itself serves as a microcosm of the melting pot that is America.  In discussing the meaning of the country, one cannot escape considering the definition of marriage.

For example:  While both my parents are Jewish, three of their four combined siblings married people who were not.  (One spouse later converted, but the others have retained their religion of birth.)  Under traditional Jewish law, these so-called “interfaith marriages” are invalid:  The Talmud expressly forbids them, and most rabbis refuse to officiate at interfaith wedding ceremonies in their synagogues.  Only through contemporary civil laws are they allowed to exist at all.

Another of my family’s matrimonial duos consists of my white cousin and her black husband, a union that in 1967 would have been illegal in 16 states and, in 1948, in 14 others.  What is more, American public approval of interracial marriage—Governor Christie’s standard for determining which marriages are legitimate and which are not—did not eclipse 50 percent until 1994.  In the early 1980s, when my cousin and her husband were born, only a third of their fellow citizens thought their eventual merger was a good idea.

(As a footnote:  Despite the Supreme Court ruling anti-miscegenation laws unconstitutional in Loving v. Virginia in 1967, the state of Alabama did not remove its own ban from the books until a 2000 ballot proposal to do so, which passed with 59 percent of the vote.)

This is all an illustration of a very simple point:  My family, as I know it, only exists because the definition of marriage has changed on several occasions within my parents’ own lifetimes, both in law and in the minds of the people.

When my uncle brought a nice Christian girl home for dinner for the first time, my traditional Jewish grandparents were slightly less than welcoming toward the idea, and toward her.  With time, however, they came to accept a non-Jew into the family, growing to love their daughter-in-law as their own.

A generation later, faced with a suitor for their granddaughter who was not Jewish and (gasp!) not even white, no objection was raised because no objection was felt.  He was a great guy, they were in love, and that was that.

Neither of my grandparents lived to see their great-grandson be born and quickly become the most delightful member of every family gathering, so it is left to the rest of us to appreciate the radical changes to a 2,000-year-old institution that allowed him to come into existence.

I recount this set of personal anecdotes in light of the latest turn in America’s understanding of marriage because I suspect mine is not the only family affected by such turns in the past.  To the contrary, I cannot imagine there are many families that have not been.

Accordingly, in a culture that has come to regard the phrase “changing the definition of marriage” negatively—people such as Christie use it as a slur, while members of Team Gay tend to avoid it altogether—I offer the humble proposal that, in light of the facts, we instead treat the concept as a necessary and welcome one, and something to which all of us, in one way or another, literally owe our lives.

In Vino Veritas?

One would not expect a man who was filmed saying “I love Hitler” to earn a Jew’s pity and (dare I say) affection.  But that John Galliano is one charming guy.

Galliano, you might recall, is the influential fashion guru who ruled over Christian Dior until his career came to a spectacular, crashing end in February 2011, when a cell phone video revealed him hurling anti-Semitic and other insults at fellow patrons in a Parisian bar.  Galliano has been lying low as a social pariah and persona non grata ever since.

Until now.  Galliano’s first post-exile interview appears in this month’s issue of Vanity Fair, and he also spoke in recent weeks with PBS’s Charlie Rose.  Galliano used both platforms as a means of explaining himself and making amends, and he proves a highly compelling subject.

The fascination with the former king of couture springs from the fact that the abjectly hateful comments in question—informing Jewish customers, “your mothers, your forefathers would all be fucking gassed,” among other things—genuinely seemed to come out of nowhere.

If all one knew of him were the footage of his fateful night in Paris, one would simply write him off as a British Mel Gibson minus the subtlety:  A boorish, arrogant, bigoted little puke who enjoys the fine art of imbibing just a little too much.

In point of fact, alcohol was indeed a main character in the story of Galliano’s fall—far more so than in Gibson’s, and in a far more intriguing way.

In the case of Mr. Braveheart, who in 2006 famously ranted about “the Jews” while being arrested for driving while intoxicated, alcohol’s magical powers of removing one’s inhibitions led him to express views that the world already knew (or suspected) he possessed.  Gibson might not have uttered such anti-Semitic slurs in a state of sobriety, but such slurs uttered under the influence nonetheless reflected what he truly thinks about God’s chosen tribe.

Galliano, by contrast, has never been known to harbor anti-Semitic sentiments in his life, either in word or in deed.  His outburst came amidst an extended period of alcoholism so pronounced that he has no memory of the incident ever occurring.  At the time, he was operating as a (fairly high-functioning) blackout drunk, not fully cognizant of the thoughts that were forming in his head and spilling out of his mouth.

To the extent that Galliano’s comments reflected views heretofore stowed safely in his subconscious, they were buried so profoundly deep as to call into question whether they can fairly be categorized as his.

The relevant adage we must address, as Vanity Fair does, is in vino veritas—the notion that alcohol is a truth serum and the key to our real selves.  That the things we say in sobriety are tailored to political correctness and social mores, while our drunken musings are the genuine article.

A great deal depends upon the veracity of this maxim, and yet the science on the matter remains highly unsettled.  As we drinkers all know from experience, liquor certainly can induce one to speak truths otherwise left unsaid.  However, it can equally provoke meditations and wisecracks whose origins we cannot quite place, even within our own heads, the effect of which can be quite unnerving, indeed.

Are we prepared to indict ourselves for every impolitic remark that has ever passed our lips at 3 o’clock in the morning after a few dozen glasses of scotch?  Should we treat what we say while drunk as if it were said while sober?  If we are willing to declare—as much of the culture was—that what John Galliano said in a blackout state warranted the end of his career and two years of public exile, are we equally willing to levy the same upon ourselves?

None of this is to relieve Galliano—or anyone else—of responsibility for irresponsible drinking.  However culpable one’s peers might be for acting as enablers and eggers-on—particularly of someone rich and famous, who is that much more prone to spiral out of control with little advance warning—we are all free agents, and must try our level best to extricate ourselves from habits that are harmful to us and to others.

Yet I nonetheless wonder if our culture’s policy of having “zero tolerance” for those who say and do repulsive things, without examining the particular context of those words and acts, is not itself harmful to our society, which is supposedly rooted in a Judeo-Christian tradition that prizes forgiveness, understanding and mercy as the highest of earthly virtues.

One-Party State

A fellow named Edward Markey has just been elected senator by the good folks of Massachusetts, who picked Markey to succeed John Kerry, who surrendered his seat in February to be secretary of state.

The run-up to yesterday’s special election generated extremely limited interest all the way through, with the commonwealth’s attention largely focused on the Boston Marathon bombing and its aftermath.  Indeed, awareness of Tuesday’s vote was so low that both candidates were compelled to expend considerable resources simply to remind voters which day the election was to be held.

While we could drone on ad infinitum about how depressing it is that Americans take their most sacred rights so much for granted that they sometimes forget about them entirely, the fact is that the Massachusetts election never carried anything in the way of real tension or urgency, its result never much in doubt.

Markey’s opponent, Gabriel Gomez, was by no means lacking in positive appeal.  The son of immigrants, Gomez graduated from the U.S. Naval Academy to become an aircraft carrier pilot and later a Navy SEAL.  After retiring from the Armed Forces, he embarked upon a business career that has proved quite profitable.

However, Gomez was persistently (and fatally) handicapped in the campaign by his most marked characteristic of all:  He is a Republican.

What is worse, his opponent, Markey, is a Democrat.

Considerable scholarship has been done on the eternally complicated balancing act that is required for a Republican to win elected office in a state such as Massachusetts, where a highly disproportionate number of voters are registered with the other team.

The “trick” is simple enough:  Pledge to cut taxes and not restrict the rights to abortion and same-sex marriage.  As in so much of America, to be socially liberal and fiscally conservative allows one a fighting chance for electoral success regardless of party affiliation.

All the same, events such as Scott Brown’s victory in the 2010 vote to replace Ted Kennedy are exceptions to the rule.  In the Massachusetts House of Representatives, Democrats outnumber Republicans by a score of 128-32; and in the State Senate, 32-4.

What I would argue, in the context of the special election just past, is that far more alarming than the inherent disadvantage in being a Massachusetts Republican is the inherent advantage in being a Massachusetts Democrat.

Case in point:  What was arguably the most substantively damning charge against Markey was simultaneously the most advantageous.

That is, the assertion that in his 37 years as a U.S. congressman, Markey had not once bucked the party line on any major piece of legislation or any major issue.  Time and again, he proved a reliable rubber stamp for Democrats in Washington, with seemingly no interest in assuming a contrary view.

According to conventional wisdom, such a record is supposed to be the kiss of death.  In every last opinion poll, Americans claim to value nothing so much as bipartisanship, and will vote with happy abandon for those who credibly vow to “cross the aisle” in the interest of “getting things done.”

This week in Massachusetts?  Not so much.  In a state with more than three registered Democrats for every registered Republican, compromise is all well and good, but you know what is even better?  Liberalism, that’s what.

The people of Massachusetts were told a vote for Markey was a vote for every Democratic Party policy in the book, and they responded, “Yes, please.”

On an individual basis, this is entirely rational.  I cannot see myself dissuading someone from voting for a candidate with whom he or she agrees on practically every issue.

Yet I despair, nonetheless, that the “D” at the end of a candidate’s name makes his or her ultimate victory more or less inevitable, just as it is still very much true that the word “Kennedy” at the end of a candidate’s name ensures the same.

This trend, as long as it persists, tends to engender a sense of entitlement amongst its beneficiaries and a sense of bitterness amongst those not already in the club.  Members of the first group are given every last benefit of the doubt by John Q. Voter, while those in the latter are provided none at all, and must prove themselves far more rigorously as a consequence.

Somehow this does not seem fair.  No election should be a foregone conclusion, not least on the basis of party affiliation.  No instance of the mass exercise of the right to vote should be so assured as to empower a lowly scribbler to muse upon the meaning of the results—as this particular scribbler has—before said results have even trickled in.

Clinically Fat

To America’s overweight:  Rejoice!

You have long been tarred as a group of undisciplined, lazy gluttons in desperate need of lifestyle refinement and reform.

But no more.  As of today, you are all officially diseased.

On Tuesday, the American Medical Association voted to classify obesity as a disease, following a yearlong study by the organization’s Council on Science and Public Health, which concluded precisely the opposite and recommended the AMA do no such thing.

While one might wonder why the country’s largest group of physicians would disregard its own findings, the discrepancy is no big mystery.  In its official statements this week, the AMA has made it clear the vote to establish obesity as a disease, rather than a mere disorder or condition, was a matter of practicalities rather than science.

“Recognizing obesity as a disease will help change the way the medical community tackles this complex issue that affects approximately one in three Americans,” explained Dr. Patrice Harris, an AMA board member.

Morgan Downey, author of a recent report on the subject (unaffiliated with the AMA), said, “I think you will probably see from this physicians taking obesity more seriously, counseling their patients about it […] Companies marketing [anti-obesity medication] will be able to take this to physicians and point to it and say, ‘Look, the mother ship has now recognized obesity as a disease.’”

In other words, the fact that so many Americans are fat axiomatically justifies altering our understanding of fatness itself, as if medical science were some sort of popularity contest, rather than a set of objective truths amendable only in the face of empirical evidence.

The trapdoor that allows the AMA to get away with playing fast and loose with scientific definitions is the fact that, properly speaking, there is no agreed-upon scientific definition for “disease” in the first place.  The concept is so broad, even within the medical community, that it can essentially be made to mean whatever one wants it to mean.

Accordingly, if the AMA wishes to exploit this linguistic loophole to coax doctors into medicating their patients thin, it is free to give it the old college try.

Unfortunately, there are a couple of actual scientific facts on the subject of fatness from which we cannot completely will ourselves free.

To wit:  The Council on Science and Public Health’s primary argument against reclassification was to point out that, from the start, obesity has been a highly problematic gauge of one’s overall health, perhaps deserving less attention from the medical profession than it currently receives, rather than more.

The word itself, along with “overweight” and “underweight,” simply encompasses a particular range of body mass index, or BMI, which is the ratio of one’s body weight to the square of one’s height.  As countless experts have argued for years, such a formula to assess an individual’s physical wellness is limited at best and counterproductive at worst.

By design, BMI does not reflect the nuances of one’s diet or exercise routine, nor does it distinguish between good body fat and bad body fat—the muscle vs. the flab.  A person in prime physical condition—someone who spends all afternoon at the gym, consumes hundreds of grams of lean protein and could easily live to age 90 and beyond—can nonetheless be termed “obese,” thanks to the collective weight of all that muscle mass.
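For reference, and to make the point about muscle mass concrete, here is the standard calculation, worked through with a hypothetical athlete’s numbers (the figures are invented purely for illustration):

\[
\mathrm{BMI} = \frac{\text{weight in kilograms}}{(\text{height in meters})^2}
\qquad\text{e.g.}\qquad
\frac{104\ \text{kg}}{(1.83\ \text{m})^2} \approx 31
\]

Under the conventional cutoffs (below 18.5 is “underweight,” 25 and above is “overweight,” 30 and above is “obese”), that 6-foot, 230-pound athlete registers as clinically “obese,” muscle and all.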

In other words, this entire conversation is based upon a faulty premise.  Whatever definition one might devise for what constitutes a “disease,” we can probably agree that any such condition is inherently undesirable and unhealthy.  Obesity, by its sole defining variable, does not meet even this most basic of standards.

Of course, we shouldn’t be overly cute about this.  Most people who qualify as being overweight are, in fact, the blubbery couch potatoes we envision whenever the term “obesity epidemic” pops up.

By making the concept more clinical, the AMA and its supporters hope to remove the stigma that comes with packing on a few hundred extra pounds, insisting that the causes are more complex than simply eating too much and exercising too little.  For some people, the argument goes, maintaining a healthy body weight is beyond one’s control.  While my own experience has shown the reverse to be the case—without fail, eating less ice cream and going on more bike rides has yielded a slimmer waistline, and vice versa—I would not presume myself to represent everyone else.

In any case, if the aim of this semantic silliness is to induce Americans to be healthier, I cannot mount an argument to the contrary.  Epidemic or not, physicians ought to inform their plainly out-of-shape patients of the dangers of eating crap and never getting any exercise, and encourage behavioral modifications.

Nonetheless, one should remain skeptical that establishing fatness as inherently toxic is either a correct or a useful means of bringing about a healthier America.

Wash Your Hands

I do not mean to sound like a prude, but I make a point of washing my hands every time I go to the bathroom.

Call it a personality quirk.  It was the hygienic habit upon which my beloved grandmother most insisted, and one I have never quite kicked.  Indeed, on occasion I will even take to the sink before sitting down to dinner, using both the water and the soap.  If I’m feeling especially hoity-toity, I will do the same before breakfast and lunch as well.

Nonetheless, I am acutely aware that many of my fellow Americans do not trouble themselves with this admittedly time-consuming activity, and I would not presume to impose my own snooty traditions upon them.

In a fresh study published by the Journal of Environmental Health, researchers found that a resounding 5.3 percent of people using public restrooms washed their hands for at least 15 seconds.  (The Centers for Disease Control and Prevention recommend scrubbing with soap and water for at least 20 seconds.)

Among the subjects observed, 10.3 percent did not wash their hands at all, while an additional 22.8 percent used water but not soap.  The report notes that “the presence of even discreet observers could have affected behavior, probably encouraging more hand washing,” which somehow does not make me feel any better.

For that matter, neither do the signs in the restrooms at Starbucks, which provide a step-by-step explanation (complete with accompanying illustrations) about how hand-washing works.

Ostensibly, these diagrams serve to reassure Starbucks customers that the coffee behemoth takes the “all employees must wash hands” policy seriously.  To the contrary, I dare say we can be forgiven for growing suspicious of the company’s hiring practices, if these signs fairly reflect its estimation of the intelligence of the average employee.

I remember well the moment in my junior high school cooking class when the teacher cautioned us, “If you saw what happens in the kitchen of any fast food restaurant, you’d never eat there again.”

All these years later, thanks to a plethora of enterprising documentarians and other journalists who have done the dirty work for us, we no longer have to idly speculate about what atrocities our Big Mac suffered on its journey from the slaughterhouse to our plate, let alone the much shorter, but no less treacherous, path from the grill to the counter.

It was only last week that America was subjected to a photograph of a Taco Bell worker using his tongue to polish a tall stack of hard taco shells, prompting Bill Maher to observe, rightly enough, “What?  Like Taco Bell was health food before?”

The debate we might have, from these anecdotes and many others, is to what extent hygiene and cleanliness are a personal choice, and at what point they become a societal imperative.

The matter of fast food workers is among the easier ones to adjudicate.  It is fairly difficult to argue that the folks who handle your food are not responsible for ensuring their paws are reasonably sterile.  While we cannot expect our lunch to be free of every last microscopic particle that could possibly cause us harm, it would seem a reasonable enough request for it to be devoid of the urinary and fecal matter of the good folks who prepared it.

But what of the rest of us?  Do we not owe the same courtesy to our fellow customers?

As the cliché goes, every time you touch a doorknob, you are also touching the hand of every other person who touched that doorknob since it was last cleaned.  The same goes for every tabletop, every salt shaker and, of course, every coin and dollar bill.  And how often does anyone bring his stack of legal tender to the cleaners?

We know we inhabit a dirty world, and most of the time we passively accept it and hope for the best.  What other choice do we have?  In a culture in which one in three does not believe in using soap, things do not look promising.

The challenge for us—individually and collectively—is to stop viewing hand-washing and the like as mere matters of personal preference, and to regard them instead as mandatory prerequisites for living in a civilized society, in which our own behavior affects others in ways of which we are not always aware but ought to be.

As a nominal libertarian, I take no pleasure in the notion that I should behave for the benefit of people other than me, and that my actions have consequences outside my own self.  However, part of becoming an enlightened person is to face unpleasant facts, and the moral imperative of washing one’s hands is one of them.

A Day in the Life

Quentin Tarantino has proclaimed the biopic the film genre that least rouses his interest.  To make a movie that chronicles the life and times of a particular individual is, he assures us, not something on which he intends to expend his talents.

Tarantino does, however, permit himself an addendum on this subject:  He could be persuaded, in the right circumstances, to undertake a work of historical drama that concerns a particular event in a particular man’s journey from the cradle to the grave.  A day in the life, as it were.  A singular moment to reflect all the others.

What a curious conceit it is that by spending a mere hour or two in another’s company, one can be made to feel as if having peered into that person’s soul.

On this point, allow me to draw your attention to Richard Linklater’s Before Midnight, the new movie starring Julie Delpy and Ethan Hawke that checks in on the relationship that began in 1995’s Before Sunrise and was rekindled in 2004’s Before Sunset.

Like all great film romances, the one in the Before saga fills its adoring audiences with a real sense of insight into what makes its characters tick.  After three films whose subjects started as 23-year-olds and are now 41, Celine and Jesse (Delpy and Hawke) seem as much like real people as any fictional couple in the cinema.

What makes this more intriguing still is the fact (easily overlooked) that none of the three movies encompasses more than 24 hours of Celine’s and Jesse’s lives.  Before Sunrise begins in the evening and ends the following morning, while Before Sunset and Before Midnight both exist within a single calendar day.  They are mere snapshots—the most fleeting of glimpses into the comings and goings of strangers.

The catch is that these are no ordinary days.  Rather, they are the moments of high drama—major turning points, in some cases—that command our attention in ways that a more typical slice of life might not.

The arresting third act of Before Midnight, in which Delpy and Hawke lay down their emotional cards in an elegant hotel room, is the sort of domestic squabble that we do not normally witness in real life, although we know bloody well that it really happens.  Even the most seemingly harmonious duos are susceptible to the occasional battle.  The days of wine and roses are even shorter than we might presume.

These fairly commonsensical facts have never not been true, but we might spend an extra moment or two reflecting upon them in light of the recent news—which, in point of fact, is neither news nor recent—that the U.S. government has granted itself the authority to tap our phones and read our e-mails whenever its heart desires.  Privacy as we know it, already in a highly fragile state, is rapidly becoming a thing of the past.

While reaction to this so-called revelation has hardly been uniform, an alarmingly high number of citizens have essentially shrugged off the prospect that whatever privacy they had left has been forfeited in the name of fighting evildoers.  The refrain “I’ve got nothing to hide” has become all too common in the national lexicon.

On this particular claim, I can only stand back in awe.

If you can genuinely assert that you would feel no discomfort were everything you say and do to be secretly recorded and splattered across the front page of the New York Times, then please accept my congratulations.  You possess a level of self-confidence to which I can only aspire.

True, even with the powers now attained by the National Security Agency, it is fairly unlikely that a typical American will find his most sensitive personal business nationally broadcast without probable cause.  But the point is that it could be, and there would be very little one could do about it.

If one is prepared to surrender more and more of one’s civil liberties, this is the sort of culture one will need to accept.  When you say, “I have nothing to hide,” realize that soon enough this will become literally true:  You will not be able to hide anything, even if you want to.

Celine and Jesse’s midnight fight will no longer be between just them, and we will no longer feel like eavesdroppers in witnessing it.

Indeed, we will not require the penetrating power of film to peer into the lives of others.  We can simply ask the government.

Great Expectations

There is no term in the English language more overrated than “overrated,” and no approach to judging art more corrosive than on the basis of preconceived expectations.

This weekend sees the wide release of Richard Linklater’s new film Before Midnight, which has been riding a wave of critical euphoria since its premiere at Sundance in January.  The movie, starring Julie Delpy and Ethan Hawke, is the third in a series that began with Before Sunrise in 1995 and continued with Before Sunset in 2004.

Presumably, most audience members for the new installment have followed it from the start, swept up in the continuing story of Celine and Jesse, the characters played by Delpy and Hawke, whose on-again, off-again relationship is among the most intriguing and unusual of all cinematic couplings.

However, because the word-of-mouth for Before Midnight has been so ecstatic, it stands to reason that a fair number of curious moviegoers heretofore unfamiliar with the saga will wander into art houses showing it, if only to see what all the fuss is about.

My fear is that these latecomers will enter the theater with impossibly high expectations, fueled by the film’s near-universal critical acclaim, and exit in a state of profound disappointment.

It is inevitable:  When you are told the movie you are about to experience is the greatest thing since sliced bread, the only possible result is to feel let down—first, because nothing could truly be that good in any case; and second, because the bar has been set at a level that cannot conceivably be met.

The trend is so common and predictable in American culture, you can set your watch to it:  A movie or TV show accrues a reputation for being unquestionably great, which in a few weeks’ time erodes into a reputation for being over-esteemed, as a second wave of viewers washes out the early enthusiasm of the first.

Speaking with the bias of someone who is often an early adopter of such cursed creative confections, I find it acutely irritating that Americans’ relationship with popular art functions in such a way with such frequency.

For starters, it strikes me as profoundly unfair to the artists themselves, who produce the works in question long before the expectations game has been set into motion.  The average author or director often has no idea how his or her work will be judged by the public; the honest ones are simply trying their best to create something worthwhile, and might not even truly care if their audiences don’t like them.

In any case, these finished products ought to be considered with this dynamic in mind.  That is to say, they should be judged on their own merits, rather than being weighed against the opinions of those who happened to see them before you did.

To declare something “overrated” is, after all, nothing more than a critique of other people’s tastes, not a critique of the object itself.  The art is guilty of being liked more than it perhaps should be.  Why should this be the fault of the artist?

Gaze upon the brushstrokes of Michelangelo and da Vinci as if five centuries had not passed since those men last roamed the earth.  Read Adventures of Huckleberry Finn without your mind clogged by a hundred years’ worth of political correctness and academic canonization.  Watch the first five seasons of Mad Men without reading every last comment on every last goddamned message board, as if some anonymous stranger on the Internet had any more wisdom or taste than you do.

Of course, this is all easier said than done.  Such a feat as viewing a work of art with total objectivity and freshness would require one of two rather herculean feats:  Either draining one’s mind of everything one has ever heard about said work, or not hearing anything about it in the first place.  The former is impossible (or nearly so), and the latter is paradoxical (how could you know to see something of which you are unaware?).

All we can reasonably do is try the best we can to be fair and open-minded, which requires the much more modest task of not taking other people’s opinions too seriously, tuning out the prevailing view about a particular piece of pop art until one has digested it for oneself.

And if you have the chance, be sure to catch Before Midnight.  It is an extraordinary movie, and I know you’ll just love it.

Profiles in Disingenuousness

Following the death on Monday of New Jersey’s senior senator, Frank Lautenberg, the Garden State’s governor, Chris Christie, had several options (well, two) as to how to go about replacing him.

One such avenue, if traversed, would prove to be straightforward and inexpensive; the other, rather complicated and quite costly.  Given the political landscape of the moment, however, the former approach would likely prove to Governor Christie’s disadvantage, while the latter would allow his political fortunes to flourish.

Can you guess in which direction he proceeded?

The governor’s decision, announced with impressive swiftness, is for New Jersey to hold a special election for the now-open Senate seat on October 16, a seemingly random date just three weeks shy of the already-scheduled contest for governor—i.e. Election Day—when Christie could have opted to hold the Senate vote as well.

Creating this second event on a separate day, rather than simply adding an additional column to a preexisting ballot, is expected to cost the state $24 million.  As a strident fiscal conservative—in both word and deed—Christie would surely wish to avoid imposing such costs on his constituents if at all possible.

What is more, holding an election in October rather than November gives prospective candidates less time to cobble together their campaigns, and voters less time to digest and ruminate on them.

Why, then, did Christie nonetheless choose to unleash this needless, unsightly mess?

Because, dear reader, were the votes for senator and governor to occur simultaneously, a whole lot more Democrats would turn up to the polls and Christie’s prospects for a commanding victory would be jeopardized.

In short, Christie did what is best for Christie at the expense of the state he rules.  In this case, his appetite for fiscal prudence ended where his future plans for high office began.

It is, of course, entirely within Christie’s power to behave in this way.  New Jersey’s laws on special elections permit him, as the chief executive, to do pretty much whatever he wants, and his ultimate decision is (according to most analysts) a politically shrewd move.  No rational, ambitious public official so empowered could, or should, be expected to go out of his way to make his own life difficult.

However, what Christie does owe the good citizens of New Jersey, and has thus far failed to deliver, is a simple acknowledgement that this—in the words of Walter Cronkite—is the way it is.

If the governor insists upon behaving as the calculating political animal that he is—acting in his own interests first, and those of his constituents second—he ought to say so.  Insisting, as he has, that an October vote is the most favorable option for the voters, well worth the added expense, is an exercise in disingenuousness that ought to strike any Jerseyan as a wee bit insulting.

By dropping the righteous theatrics and leveling with us, Christie would merely be recognizing what is a wholly common practice in state-level politics, as all who follow it are well aware.

In Massachusetts in 2009, for example, when Governor Deval Patrick was granted the power to schedule the replacing of the late Senator Edward Kennedy, he did not for a moment hesitate to change a state law that he, himself, had previously supported in order to produce a result that was to his own personal liking.

The question in the Kennedy case was whether the governor should have the authority to appoint a temporary seat-filler to cast Senate votes between a vacancy and the installation of the winner of a special election.  In 2004, the Massachusetts legislature had removed such a prerogative in anticipation of Senator John Kerry being elected president and the state’s then-governor, Mitt Romney, appointing a Republican to replace him.  Five years later, having a Democratic governor suddenly made the idea easier for the deep-blue state to swallow.  What are the odds?

A lesson we might draw is that any process left in the hands of politicians is doomed to be sorted out in a political manner.  This is not necessarily something over which to despair; however, it does require constant vigilance by us, the people.  What we would hope, and what we ought to demand, is for the officials upon whom we bestow certain authority to exercise it in an open and intellectually honest fashion.

Is that really too much to ask?

Ain’t Too Proud

We are in the midst of Gay Pride Week here in the capital city of the first state to legalize same-sex marriage, with a full roster of activities and events planned across the greater Boston area, culminating in the annual Pride Parade on Saturday.

Your humble servant will not be out marching this weekend.  Nor do I plan to participate in “Queeraoke” in Jamaica Plain or attend “Pride Night” at the Red Sox-Rangers game at Fenway.

None of these goings-on particularly piques my interest, and engaging in any of them would be an empty gesture on my part.

I am not proud to be gay.

Now, I understand that striking such an attitude in today’s particular political and cultural environment can come off as an act of heresy, since gay people in 2013 are not allowed to carry any ambivalence whatever about their gayness.

What is more, this Thursday holds enormous significance for me on this front, being the anniversary of the day I announced my orientation on Facebook.  While the positioning of this moment in the middle of Pride Week was coincidental at the time, I can accept the overlap today, as my series of comings-out are, in fact, among the proudest acts of my life.

But simply being gay?  What’s so special about that?

This, in so many words, is the distinction we ought more precisely to draw:  Acts vs. facts.

I am proud to have publicly acknowledged my homosexuality because I could have chosen not to.  Because it required subjecting myself to the possibility of certain social hardships that could have been completely avoided by my simply shutting up.  (I hasten to add that, to date, no such unpleasantness has ever occurred.)

By contrast, the gayness itself is just one of those freak accidents over which I have no control.  Like the color of my eyes or the geographic origins of my kin, my sexuality is a simple fact of life, completely uninteresting in and of itself.

To be sure, there are many people who define themselves entirely on the basis of their sexuality.  (Most of these people are straight, but never mind.)  Historically, the gay rights movement has been nothing so much as an expression of identity politics, its leaders regarding themselves as homosexuals first and Americans second.

In earlier, more repressive days, such an approach made some sense:  Building a strong movement required unquestioned solidarity amongst its members, which in turn required a strident emphasis on the one characteristic that bound them all together.

The trauma that is coming out can operate under a similar dynamic.  As a means of alleviating the inherent loneliness that living in the closet engenders, one tends to hew to every gay stereotype one can get one’s hands on.  To announce one’s homosexuality is to officially join the gay community; on such an occasion, it is only polite to defer to the community’s values and traditions, no matter how ridiculous and antiquated they might seem.

However, in the years since that big bang of my existence as a publicly gay person, I have parted ways with certain gay orthodoxies, realizing that, in many cases, I never truly believed them in the first place.  They did not represent the “real” me.  Like a kid who takes up smoking in the hope that it will make the cool kids accept him, I was merely going through the motions.

As time has marched onward, the fact of my homosexuality has diminished in importance in my daily life, and would rank fairly low on a list of ways in which I might define myself—an exercise I generally resist in any case.

More important still, I have come to terms with the twin facts of, first, knowing that I do not easily mesh with whatever it is the “gay community” today represents, and second, that I do not particularly care.

For all that I hoped coming out would provide me with a “crowd” with whom I could find comfort, inclusion and common cause, the experience of being openly gay has proved to be one more illustration of my true self as a perennial outsider—even within groups that are, themselves, a collection of cultural misfits.

Perhaps this all sounds a bit dreary, but it is something which I have long come to accept and would have no other way.

Of this, I suppose I can also be proud, although I do not need to march in a parade to express it.

Dispensable Pop Stars

Two significant cultural events occurred in recent days, with parallels so obvious they were impossible to miss.

First, Minnesota Congresswoman Michele Bachmann announced she would not seek reelection in 2014.

And second, the media pronounced dead the career of Justin Bieber.

OK, perhaps the connection is not so self-evident, but allow me to explain.

Bachmann, of course, is the Tea Party-styled four-term representative who, in her 2012 presidential campaign, cemented a reputation for making patently false assertions with unwavering conviction, such as when she claimed the HPV vaccine causes mental retardation, because a random woman at a campaign event told her so.

Her most recent House race was also her most competitive, and when she announced her impending retirement from Congress, the man who nearly defeated her in 2012, Jim Graves, declared, “Mission accomplished.”

As he would have it, Graves regarded himself as a mere vessel, on behalf of the good people of Minnesota, to remove Bachmann from the national scene.  While he might have preferred to do so by personally unseating her, achieving the same ends by alternative means—namely, spooking her into fleeing politics the way she tends to flee reporters—was good enough for him.

As for Justin Bieber, who I trust requires no introduction, it is probably a bit soon to proclaim the sun has set upon a pop sensation still in his teens, and what is more, one with such natural charisma and (on good days) a knack for navigating the labyrinthine world of celebrity.

All the same, various pop culture news outlets have done exactly that, weaving together a “meltdown” narrative from a series of unfortunate events in Bieber’s recent past, including such crimes against humanity as smoking marijuana, turning up late for concerts and, most amusingly, having his pet monkey indefinitely detained by the Federal Republic of Germany.

As we come to grips with the prospect of a world with a few square inches not inhabited by Justin Bieber, I am drawn back to the dawn of his young career, which to me and my particular circle of acquaintances signaled precisely one thing:  The effective end of the Jonas Brothers in the cultural bloodstream.

It is easy to forget today, but the tender trio with matching purity rings and, like Bieber, a supposed connection to the music industry was quite the commodity for a good couple of years, enjoying precisely the sort of wall-to-wall coverage in all the usual celebrity rags (and a rabid fan following to boot) that Bieber commands now.

No more.  In effect (if not by design), Bieber served the same purpose in his pursuit of fame as Jim Graves in his pursuit of political office:  Knocking the reigning “it girl” off the pedestal.

It is often said (accurately enough) that America will build up its celebrities only to destroy them later on.  Fame is ephemeral.  Today’s rock star is tomorrow’s has-been.  Our so-called heroes in the world of entertainment, with precious few exceptions, are ultimately disposable and replaceable.

The useful connection we should draw, then, is that this principle applies as much to politics as to entertainment.

Lest we forget, Bachmann’s primacy in the Tea Party universe was itself a product of the waning influence of the former queen bee of the proverbial far right, Sarah Palin, for whom Bachmann was viewed as something of a surrogate in the 2012 GOP primaries.

This is no small fact, when we reflect upon the degree to which the world of punditry managed to convince itself and many others that Palin would be a force in American politics for many years to come.  For a good long while—particularly during the early months of the Obama administration—it seemed inconceivable that Palin would all but vanish from the scene and become irrelevant.

And then she all but vanished from the scene and became irrelevant.

As a nation, America might well be “indispensable,” as President Obama asserted in one of his debates against Mitt Romney.  However, the same is not true about any individual American.

To wit:  We take it on faith that Franklin Roosevelt was the “only” man who could possibly have won World War II.  But then how do we explain how his unknown, untested successor, Harry Truman, managed to patch the world back together again when the war was over?  It could not have been entirely a matter of luck, could it?

Accordingly, we should resist the temptation to anoint political and cultural saviors for ourselves, as if we would be lost without them.  Happily, in point of fact, we would not.

America is a big country, and there are plenty of clever, talented people who live here.  By no means are we all truly created equal, but we are equally human.  We should rejoice at this news, rather than constantly rebel against it.