Gay on the Gridiron

It was a very gay week.

The U.S. Supreme Court, having never before considered the question of same-sex marriage, did so twice on consecutive days.  One case, Hollingsworth v. Perry, concerns the constitutionality of California’s Proposition 8, which banned gay marriages in the state; the other, United States v. Windsor, challenges Section 3 of the Defense of Marriage Act, which effectively refuses to acknowledge such unions at the federal level.  The Court is expected to decide both cases in June.

To mark this unprecedented confluence of official interest in homosexual matters, throngs of supporters and opponents of same-sex couplings (mainly supporters) turned out in Washington, D.C., to register their views, and Facebook quickly grew saturated with red equal signs in a nationwide show of solidarity with Team Gay.

In this environment, bursting with respect and goodwill, we have again been confronted by the prospect of an openly gay athlete in a major American sport, and the question of whether we are “ready” for such an event.

The cause this time, rather than some stray anti-gay comment or the like, was a rumor that a particular member of the National Football League (as yet unnamed) is considering coming out in the near future, something no active NFL player has ever done, thereby lending fresh urgency to the question of national readiness.

This would be the moment—as pertinent as any—to define our terms.  In asking “Is the NFL ready for an openly gay player?” we would do well to clarify what exactly we mean by “the NFL” and what “ready” entails.

To encompass both considerations at once, we might fairly and bluntly acknowledge that the most immediate concern is how often a gay footballer could expect to be called “faggot,” whether by NFL fans, by rival players or, most disturbingly, in his own locker room.

The fear, after all, is of a repeat of the kind of hostile environment that ensued when Jackie Robinson joined the Brooklyn Dodgers in 1947.  As Major League Baseball’s first black player, Robinson was called “nigger” not only by idiotic fans, but also by idiotic members of opposing teams, and was subject to physical abuse and intimidation on the field of play itself.

Is homophobia as bad in sports today as racism was in 1947?  I suppose we’re about to find out.

While analogies are often drawn (not least by me) between the black civil rights and gay rights movements, the two are distinct in at least one critical way:  Gay people have the luxury of hiding their homosexuality, if they wish.  Concealing one’s skin color is a more exacting trick to pull off.

What this means, in all likelihood, is that the first NFL player to come out will already be known to a public that, until now, will simply have assumed him to be straight.  His sexuality will be new, but his presence on the gridiron will not.

The NFL doesn’t need a Branch Rickey figure to scout out promising gay players to add to its ranks.  They are already there.

On that point, I close with a math problem.

A 2012 Gallup survey found that 3.4 percent of Americans identify as lesbian, gay, bisexual or transgender.  We can only speculate about the number of Americans in the closet, but we can safely assume it to be considerable.

Supposing, for the moment, a true gay population figure of 5 percent, and further supposing that gays are evenly distributed across all walks of life, we find that the NFL, with more than 1,600 players on its active rosters, presently contains somewhere in the vicinity of 80 gay souls—an average of two or three per team.
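For the curious, the back-of-the-envelope arithmetic can be sketched in a few lines (the 5 percent share and the even distribution are, as noted, assumptions rather than established facts, and the roster count is approximate):

```python
# A sketch of the column's estimate: assume a true gay population share
# of 5 percent, evenly distributed across all walks of life.
GAY_SHARE = 0.05        # assumed share (Gallup's self-reported figure is 3.4%)
ACTIVE_PLAYERS = 1600   # "more than 1,600 players" on NFL active rosters
TEAMS = 32              # NFL franchises

league_wide = GAY_SHARE * ACTIVE_PLAYERS  # about 80 players league-wide
per_team = league_wide / TEAMS            # about 2.5, i.e. two or three per team

print(round(league_wide), per_team)  # 80 2.5
```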

I perform these calculations as a form of good news for whoever the Jackie Robinson of gays turns out to be:  However long it takes to verify, he will not be the only one.

And it probably won’t take long, because coming out is as much in vogue as ever it has been, and has proven susceptible to the domino effect:  When one closet door opens, a thousand others follow.

Therein (maybe) lies our answer to the “ready” riddle:  Like the country at large, the NFL and the other major sports have never really had the chance to prove themselves capable of accepting gays, because until quite recently, gays have chosen to be invisible.

Today, with this no longer being the case and homosexuals having shown themselves to be identical to heterosexuals in every way but one, all the old stigmas and prejudices are slowly but steadily washing away, and it is only a matter of time before the NFL sees what is directly in front of its nose and responds the only way that it can:  With professionalism, respect and, ultimately, a hearty sigh of indifference.

Never Finished

I, like so many Americans, have a complicated and uneasy relationship with food, and on roughly one day each month I go on a madcap culinary bender to remind myself why I don’t on the remaining 29.

My most recent bingeing adventure went better than most.  It was a large, potluck-style family gathering, and between my aunt’s sweet potatoes and my mother’s brisket, I had not even reached the dessert table before the stitching in my half-zipped hoodie burst from the fabric and unspooled clear across the dining room.  (Or did I imagine that part?)  From this one meal, I probably packed in enough fat and protein to last me until the Fourth of July.

Nonetheless, my general disposition the following morning was much the same as when I have overindulged in nutritional culs-de-sac like cake, ice cream and chocolate—that is, one of abject self-disgust, followed by the vow never to repeat such a ridiculous face-stuffing ever again.

Certainly, to the typical American college student this groggy “never again” feeling is a familiar one, generally occurring sometime around 2 o’clock on a Sunday afternoon, as one begins the arduous process of rolling out of bed and coming to grips with just how much liquor one managed to choke down the previous evening.

“I’m never drinking like that again,” we have all heard ourselves say, only to be seen engaging in precisely the same pastime, right on schedule, the following week.

Albert Einstein is famously, if almost certainly apocryphally, credited with defining insanity as “doing the same thing over and over again and expecting different results.”  Yet it is a near-universal way of life.

We all possess an inherent desire—manifested in myriad forms, not just food and drink—to lick a bad habit once and for all, and most of us believe it within our grasp to do so.  That if we simply hunker down, focus our gaze and stop goofing off, the desirable will become achievable.

The assumption underpinning this notion, which extends beyond New Year’s resolutions and the positive thinking movement, is that certain problems contain an endpoint.  That there is a dragon to be slain, and once the dragon is dead, the problem is solved forever.  Done and done.

Sadly, as most current and former addicts (to anything) will affirm, real life tends not to be so simple, orderly and neat.

I am reminded of a wonderful exchange in David Fincher’s 2010 movie The Social Network.  Discussing the prospects for their promising little web startup called “The Facebook,” Eduardo asks Mark, “So when will it be finished?”  “It won’t be finished,” Mark matter-of-factly retorts.  “That’s the point—the way fashion is never finished.”

That is to say (for those who have never encountered Vogue or Project Runway) that a dress is never perfect—it can be continuously altered and improved, but can never truly be said to be complete.  At a journalism seminar I attended years ago, movie critic Lisa Schwarzbaum made the same observation in postulating, “A review is never done—you just run out of time.”

I must say that I find enormous appeal in this concept, in spite of myself and my many attempts to disprove it.  Schwarzbaum’s observation has rung true for me with distressing regularity.  I can imagine rewriting and revising this column for the remainder of my natural life and never being confident of arriving at a “definitive” version of it.

President Obama is also a subscriber to the “never finished” theory, having underlined in numerous speeches—most famously his “race speech” of March 2008—the phrase “to form a more perfect Union” from the Preamble to the U.S. Constitution.  America is indeed not perfect—as Obama is endlessly criticized for pointing out—and the story of both the country and the Constitution is one of continuous improvement.

The good news in all of this disappointment is that what we humans lack in perfection, we compensate for in self-correction.

I may never overcome my fits of concentrated gluttony, but I can winnow them down to manageable levels.  In like spirit, Mark Zuckerberg can continue to refine Facebook, Tom Ford can forge ever-new frontiers in the world of spectacles and bow ties, and the Supreme Court can help to coax the United States (or not) into establishing gay folks as citizens equal under the law.

The work goes on.  May it never stop.

American Tall Tales

Monday marks the opening night of Passover, the weeklong Jewish festival that commemorates the Jews’ famed Exodus from slavery in ancient Egypt, followed by their 40-year safari through the desert toward Mt. Sinai and the Promised Land.

It is an exciting, inspiring story—and also complete and utter nonsense.

To date, no archaeological excavation of the area alleged to have hosted the Passover saga has uncovered verifying evidence of any sort.  Nor should we expect it to.  After all, what is a profession of “faith” if not a tacit acknowledgment that one’s beliefs are not supported by facts?  If they were, faith would not be necessary.

Of course, in carrying on Passover traditions, Jews are hardly the first or only people to cull a major day of remembrance from a narrative that is less than historically accurate.

While we could go on for days about questions of veracity in Judaism and other religions, the truth is that America’s secular roster of official holidays is a veritable treasure trove of myths, half-truths and outright falsities.

To begin:  We celebrate our nation’s birth every year on July 4th, even though it was actually two days earlier in 1776 that the Continental Congress in Philadelphia voted to declare independence.  In correspondence with his wife, John Adams wrote, “The second day of July, 1776, will be the most memorable epocha in the history of America.  I am apt to believe that it will be celebrated by succeeding generations as the great anniversary festival.”

So it should have been.  Very little actually happened at Independence Hall on July 4, 1776, beyond a bit of tidying up, but the fourth somehow wound up as the “official” date stamped atop the sacred document, and that was the one we got stuck with.

Then again, this whole discrepancy regarding the precise moment we declared our independence is essentially a minor accounting error.  The central narrative we annually commemorate with fireworks and barbecue otherwise happened more or less as we say it did.

Compare this, for instance, with something like Columbus Day, observed on the second Monday of October, on which we perpetuate the great myth that a guy called Christopher Columbus “discovered” the “New World.”

We have long known and recognized two giant monkey wrenches in the old orthodoxy—first, that Columbus came upon the American continent mistakenly, thinking he had reached the Indies; and second, that there was nothing particularly virginal about the place, inasmuch as it was already inhabited by tens of millions of unassuming natives—but we have yet to fully abandon the legend of Columbus as a pre-Revolutionary founding father.

Returning to Judaism and Passover, it is curious that the world’s oldest monotheism would settle on an easily disproved legend as the central narrative of its survival, when there are so many actual, albeit less romantic, examples of Jewish perseverance from which to choose.

The same is true of the story of America, which boasts no shortage of genuine heroism and goodwill from its earliest days onward.

In the conclusion of John Ford’s The Man Who Shot Liberty Valance, we are famously told, “When the legend becomes fact, print the legend.”  Certainly, America is nothing if not a land of legends.

Still, as the citizenry of a country that has so much about which to be proud, why do we so strongly feel the need to embellish?  To turn everything into a matter of black and white when the gray matter is so much more interesting?  Why can’t we handle the truth?

Out of Sight

Here is a man who does not know when to quit.

On Monday, exactly one week after being told he cannot force New Yorkers to drink less soda, Mayor Michael Bloomberg announced his newest idea to force New Yorkers to smoke less tobacco.

The idea is to prohibit most businesses that sell cigarettes from displaying them publicly, relegating their presence to behind and beneath the counter.  Customers of legal age would retain the right to purchase them, of course, but a greater effort would be required simply to determine whether the establishment sells them at all.

The psychological assumptions underpinning this scheme are fairly self-evident, but fascinating nonetheless.  In promoting this “out of sight, out of mind” approach to tobacco sales, Bloomberg and his allies are banking on the theory, supported by research, that dangling a product directly in front of a person’s nose has a measurable influence on his or her decision to purchase it.  By extension, then, forbidding such displays will reduce an item’s sales figures.

Some critics of the proposal have swiftly and predictably adopted the “slippery slope” argument for this case.  If cigarettes are made fair game for government regulation of this sort, where does it end?  Are other health hazards such as candy and soda next?

This is, I must say, a highly amusing prospect.  After all, if convenience stores really were induced to conceal any product that could possibly do a customer harm, their shelves would very quickly come to resemble the liquor store whiskey aisle on the day after St. Patrick’s Day, and the back of the counter, the stateroom in A Night at the Opera.

That is to say, the experience of convenience store shopping would become much more similar to the current experience of shopping online.

Consider:  When we wander onto a site such as Amazon, we might not know precisely what we intend to buy, but we nonetheless need to have a general idea in order to find the bloody thing.  We cannot scan the virtual shelves, as it were, for they are nearly infinite.  Yes, the Internet is now capable of recommending products based on past purchases, but all the same, shopping in today’s world is essentially an exercise in personal initiative.

In this way, funnily enough, I am put in mind of the evolution of Facebook.  While the social networking behemoth has undergone myriad cosmetic alterations in its near-decade of life, probably the most significant was the introduction of the “news feed” in the fall of 2006.  In Facebook’s earliest days, if you wanted to know what a particular “friend” was up to, you needed to run a search to find out.  The onus for acquiring information was on you.

With the news feed, by contrast, you are positively inundated with your fellows’ cyber activities, regardless of your interest in them.  The initiative on your part has shifted from seeking content to averting it—if you truly wish to avoid the comings and goings of certain friends, you must adjust your personal settings accordingly.  Otherwise, they will continue to present themselves before your very eyes.

By no means is this trend toward unlimited, unsought information universal.  On the contrary, many have made the eloquent case that newspapers retain an inherent strength over Internet-based news:  To read a newspaper, page by page, is to encounter stories one might easily and unwittingly ignore on the web, which offers users the freedom to decide, in advance, which sorts of headlines their browsers will collect and display.

So maybe Mayor Bloomberg’s latest evil plot to make New York City a healthy place to live will fare better than the last, with a fair number of people paying no attention to those smokes behind the curtain.  The science behind the power of visual cues is on Bloomberg’s side, as is our culture’s natural tendency to be distracted and drawn away from what is not directly in front of us.

What is left to combat, however, is our equally natural and equally strong inclination to get what we want, no matter the cost or inconvenience.  Not to mention the rather troublesome “forbidden fruit” dynamic, whereby the more rigorously a particular product is withheld, the more passionately it is desired.

Which of these considerations will prevail in New York?  We shall see.

Scouting Report

I have never engaged in a formal one-on-one debate with anyone on the subject of same-sex marriage, but were such an event to occur, I am fairly certain my line of inquiry would begin with the following challenge:  “Please explain why marriage between a black person and a white person should be legal.”

You see, my thinking is (and long has been) that in making the case for interracial marriage, one makes the case for gay marriage as well.  I understand there are many folks, black and white, who get annoyed when bits of the gay rights movement are likened to bits of the black civil rights movement—they insist any similarities between the two are superficial—but I find the parallels a trifle too linear to ignore.

The most salient point of all—the bottom-est of the bottom lines—is that once one establishes that civil marriage in the United States is and ought to be a union of two consenting adults pledging themselves to each other in good faith forever and ever—nothing more, nothing less—all arguments for limiting the institution to only certain types of people evaporate on contact.

The legalization and subsequent propagation of interracial marriage in America demonstrated the validity of this point, both for itself and for all forms of so-called “non-traditional” marriage thereafter.  We didn’t need to worry about any horrid unintended consequences of marriage between gays—miscegenation had already proved such fears utterly and blessedly unfounded.

I offer this throat clearing because I have just read the survey the Boy Scouts of America is distributing to its scouts and their parents, and in mulling its contents, I have been struck with the most acute sense of déjà vu.

As you must surely have heard, America’s revered coming-of-age organization has faced such concentrated criticism in recent years over its longstanding prohibition on openly gay scouts and scoutmasters that it is very seriously considering dropping the policy outright in the near future.

To prepare itself for such an eventuality, the Boy Scouts commissioned this new questionnaire to take the temperature of its present membership on homosexuality.  While the queries cover a range of hypothetical scenarios, they are essentially different ways of asking the same basic question:  What would actually happen, on a troop-by-troop basis, were the Scouts to welcome open homosexuals into its ranks?

If this all sounds terribly familiar, it might be because it is precisely the same process undertaken by the U.S. Armed Forces in 2010 to ascertain the effects of repealing its own anti-gay policy, “Don’t Ask, Don’t Tell.”

Like the Boy Scouts today, military leaders wondered how inclusion of gays would affect “unit cohesion.”  They fretted about recruitment and funding.  They broached questions of morality and ethics.  And they were skeptical about upending years and decades of tradition to embark upon a journey with an uncertain destination.

Gay soldiers have now served openly for a year and a half.  While this is far too short a time span from which to draw definitive conclusions, a think tank called the Palm Center produced an “assessment” one year into DADT’s repeal, which concluded that the change “had no negative impact on overall military readiness or its component parts.”

“While repeal produced a few downsides for some military members—mostly those who personally opposed the policy change,” the report expounded, “we identified important upsides as well, and in no case did negative consequences outweigh advantages.  On balance, DADT repeal appears to have slightly enhanced the military’s ability to do its job by clearing away unnecessary obstacles to the development of trust and bonding.”

From this (admittedly tentative) account, the Boy Scouts can perhaps derive some clues as to how its own adventures in gayification might play out.

While the Boy Scouts of America is no more equivalent to the U.S. military than is same-sex marriage to miscegenation, one is nonetheless compelled to ask:  If the Armed Forces are capable of operating with homosexuals in their midst, why not the Scouts?

Of course, a possible reason the repeal of DADT turned into something of a “non-event”—and a cause to think the prospective inclusion of gays in the Scouts would as well—is the rather inconvenient (and obvious) fact that gay people had served in the military all along, albeit silently.  Perhaps some soldiers and higher-ups convinced themselves to the contrary, certain that any and all traces of gayness had been thoroughly cleansed from their platoons, but they were only fooling themselves.

In truth, we already know the Boy Scouts can handle the presence of gays, for it always has.  The choice it faces, then, is whether to continue to engage in an elaborate self-deception, or whether instead to face the world as it really is.  Would the latter not be the more honorable—dare I say, the more meritorious—thing to do?

Bloomberg, Revisited

Last July, I sprinkled faint praise on Michael Bloomberg, the mayor of New York City, for having the gumption to push through his initiative to ban large containers of sugary beverages in his city’s theaters and restaurants, no matter how many people objected.

In a political world of timidity and pandering, I wrote, here at least was a guy with the courage of his convictions and the force of will to get the job done.

It appears that not everyone in America agrees.

Last Tuesday was the day New York’s famous (or infamous) soda ban was to take effect, except that on the previous day, a New York Supreme Court judge invalidated the whole bloody thing, calling it “arbitrary and capricious.”

And the floodgates of schadenfreude burst open.

As reported pretty much everywhere, response to the news of the soda ban’s sudden demise, both within and without New York’s city limits, has been decidedly of the “good riddance” variety.  The Onion, reliable as ever, summed things up nicely with the headline, “Opposition To Soda Ban Sad Proof That Americans Still Fight For What They Believe In.”

A good deal of this antipathy seems as much against Bloomberg himself as against his latest public health pet project.

In point of fact, for as long as he has been mayor, Bloomberg has invited intense feedback from his not-always-adoring public.  He could fairly be described as a “polarizing” figure, not least for his ability to provoke a polarized reaction in a single person.

As a case in point, it is worth recalling Bloomberg’s rather memorable 2008 campaign to extend his own tenure.  Faced with an imminent forced retirement from the mayoralty thanks to a 1993 term limit law, Bloomberg successfully lobbied the City Council to extend the mayor’s maximum reign from two four-year terms to three.

Tellingly, New York public opinion was largely against the term extension idea—voters have affirmed such limits every time they have been given the chance, including in 2010—yet Bloomberg nonetheless won a third term in 2009 and has maintained relatively high approval ratings for most of his tenure.

In short, however the good people of New York have judged Bloomberg’s ideas and works, they have come to stridently disapprove of his methods.  The ends do not justify the means, and in this case, the people didn’t much care for the ends, either.

The scorn of the Onion notwithstanding, I can only applaud this state of the public mind as a rare and admirable defense of principle over personality.

To be sure, the principle being defended in the present controversy is not a noble one.  The right to pour indiscriminate amounts of high fructose corn syrup down one’s gullet without having to shuffle back to the counter for a refill is (probably) not quite what Patrick Henry had in mind in proclaiming, “Give me liberty or give me death.”

But that does not make such a concern illegitimate, for the principle behind the principle—the right to do what one damn well pleases—is as central to the American way of life as ever it has been, and must always be guarded and reaffirmed.

The challenge, then, is to direct these healthy and essential affirmations of one’s liberties toward more weighty matters.

I am reminded of the old gripe about how much better shape America might be in if the folks who spend an entire weekend camped outside Best Buy to purchase the newest iPhone were able to summon equal passion and dedication toward, say, eradicating HIV in Africa or combating climate change here in the States.

Our great country is not suffering from an enthusiasm deficit so much as a seriousness deficit.

As for Mayor Bloomberg himself:  With this setback in his quest for a healthier New York (following so many successes), he has perhaps been humbled to learn, at long last, that the keys to Gracie Mansion and a few billion dollars license a person to accomplish only so much unilaterally.  That if the common folk are truly disdainful of their leader’s actions, they will eventually rebel.

Then again, Bloomberg has vowed in no uncertain terms to continue his war against liquid sugar to the bitter, bubbly end.  In the arrogance department he, like Charles Foster Kane, may “need more than one lesson.”  The question is, with less than a year left in his (apparently final) term, whether there is time enough for him to receive it.

Too Sacred to Repeal?

In the nation’s capital today, there is no easier task than balancing the federal budget.

If resolving our so-called debt crisis is what we really wanted to do, there is no mystery to it whatever.

All you’d have to do is get rid of Social Security and Medicare.

By no means do I know the first thing about economics.  Nor am I so stupid or naïve as to think (as an alarming number of my fellow Americans do) that because I can balance my own checkbook, I am therefore qualified to balance the country’s.

But I am capable of glancing at the 2012 federal budget and observing that the U.S. ran a deficit of $1.3 trillion last year and expended $1.3 trillion on Social Security and Medicare.

Accordingly, the most dramatic yet straightforward means of erasing the deficit (if not the debt) is staring us squarely in the face.  I don’t know a lot, but I am fairly confident about what happens when you subtract $1.3 trillion from itself.

Of course, the United States is not about to abolish two of its signature grand entitlement programs, enacted, respectively, as part of the New Deal and the Great Society.  Heck, it requires a Herculean effort to raise the official retirement age by six months.

And why is this the case, ladies and gentlemen?  Why is it a vain hope that we will ever balance our budget by the most surefire means available to us?

In the specific case of entitlements for old folks, the usual assumption is that no politician dares to say a negative word because it would annoy seniors, the most reliable voting bloc in America.

However, the deeper impediment to broad structural change in our system is much more interesting.  It’s the simple fact that massive government programs such as Medicare have existed long enough that we have come to accept them as an inevitable piece of American life.  Once something so intricate has been done, it becomes almost impossible to undo.

I make this observation in the aftermath of a recent tizzy surrounding Supreme Court Justice Antonin Scalia and comments he made about the Voting Rights Act of 1965, a key provision of which is currently under challenge before the Court.

Chief among Scalia’s incendiary assertions was his employment of the term “racial entitlement.”  “Whenever a society adopts racial entitlements,” Scalia intoned, “it is very difficult to get out of them through the normal political processes.”

“I don’t think there is anything to be gained by any senator to vote against continuation of this act,” he continued.  “Even the name of it is wonderful:  The Voting Rights Act.  Who is going to vote against that in the future?”

For these and other comments, Scalia has been lacerated from one end of the culture to the other.  Indeed, the objections have been so high-pitched, the ad hominem attacks so fierce, that we have had very little time and space to consider whether Scalia might be correct after all.

For starters, he is undoubtedly on to something on the matter of legislators being scared of voting against a bill with an appealing name.  Indeed, I would venture to guess the matter was more or less settled with the USA PATRIOT Act in October 2001—voted for by nearly all, actually read by almost none.

As for the “racial entitlement” charge:  Remove the word “racial” from the above quotation and you are left with an utterly uncontroversial statement of fact, and the point with which I began.

The term I would invoke here is stare decisis.  Latin for “let the decision stand,” it is the judicial principle of respecting precedent and recognizing that once the Court has pronounced judgment on a particular issue, the debate on that issue is effectively over forever.

Certainly we have witnessed rather dramatic exceptions to this philosophy in the last two and a half centuries.  The principle of holding human beings as property was a precedent worth not respecting, for instance, as were the high court’s past decisions on segregation, sodomy and women’s rights.

Returning from the judiciary to the legislature, the question we might ask, in this epoch of businesses that are “too big to fail,” is whether it is wise to assume that certain legislation is too sacred to repeal.

The American republic existed for nearly two centuries without Medicare, yet we now regard it as an indispensable birthright, completely untouchable in all considerations of getting the federal budget under control.

Sooner or later, we will need to reckon with the fact that, if we are serious in our budget talks about nothing being “off the table,” we would do well to negotiate around a less porous table.