Don’t Say ‘Gay’

Mahmoud Ahmadinejad, the outgoing president of Iran, famously made an unholy spectacle of himself in a 2007 speech at Columbia University when he asserted, “In Iran, we don’t have homosexuals, like in your country.”

Today in Russia, President Vladimir Putin is trying to make this literally true.

As part of a broader crackdown on gay rights activities of various sorts, Russia recently passed a law against “homosexual propaganda,” making it illegal to “spread information about non-traditional sexual behavior” to children under 18 years old.  Transgressors will be penalized with heavy fines, and violators from other countries will be subject to deportation.

This “propaganda” legislation joins similar measures against public homosexuality in the Russian Federation, such as the prohibition on foreign same-sex couples adopting Russian children, as well as the general practice by Russian police of breaking up gay rights marches and demonstrations, often violently, and detaining some of their participants.

As well, assaults on gay Russians by straight Russians have run rampant in the country for a long time.  Anti-gay sentiment apparently cuts wide and deep, with 88 percent of the public giving the new “propaganda” law a thumbs-up, according to a state-run polling organization.  (The trustworthiness of such opinion-gathering outfits is in some dispute, but one suspects this one is not too far off.)

Here in America, most of this official anti-gay policy strikes us as positively barbaric, and is no longer tolerated in our open, pluralistic culture.

Or is it?

Reading about the “propaganda” law, I could not help but be reminded of the kerfuffle in Tennessee at the beginning of this year over what came to be known as the “Don’t Say ‘Gay’ bill.”  Proposed by State Senator Stacey Campfield, the bill, if passed, would have effectively banished all discussion of homosexuality from the state’s elementary schools.

“At grade levels pre-K through eight,” the bill stipulated, “any such classroom instruction, course materials or other informational resources that are inconsistent with natural human reproduction shall be classified as inappropriate for the intended student audience and, therefore, shall be prohibited.”

Sound familiar?  Is there any part of that sentence of which Vladimir Putin would not approve?

This is not to say that Tennessee is as bad on gay rights as Russia, per se.  Campfield’s bill never actually passed muster in either house of the state’s legislature, having died in committee.  Further, no American state government is systematically rounding up pro-gay rights agitators, as they regularly are under the Putin regime and in many other hellholes around the world, particularly in Africa.

What should nonetheless command our attention here in the states—the one way in which the shenanigans in Tennessee mirror the shenanigans in Russia—is the leading role that language plays in the battle over gay civil rights around the world.

The central irony of the new “propaganda” law—unmistakable and essential—is how it is, itself, an exercise in propaganda.

While everyone with any sense knows that homosexuality in some Homo sapiens is an objective fact of life that cannot simply be wished away, this legislation seeks to do precisely that.

In a Russia whose population has flatlined in the last several decades, where the prospect of insufficient occurrences of heterosexual congress presents as an existential threat, homosexual intimacy can reasonably be seen (by the homophobia-inclined) as slightly beside the point.

The Russian government has been actively encouraging procreative sex for years.  In this way, the “propaganda” law can be seen as complementary and then some—a means not merely to discourage one form of non-procreative sex, but to deny its very existence.

Conceivably, this would make sense if human sexuality were a choice, as some apparently still believe, and homosexual relations were merely a form of rebellion against social mores, as some apparently also still believe.

The problem is that this is not the case.  Homosexuality exists whether a government wants it to or not, which means any attempt to argue or legislate to the contrary will ultimately be futile and subject to the sort of ridicule President Ahmadinejad faced when he suggested Iran was immune to the gay germ.

“Don’t Say ‘Gay’” policies are not merely an affront to gay people, you see, but an affront to truth.

Not unlike neo-Nazi denial of the Holocaust or Turkish denial of the Armenian Genocide, they are an attempt not merely to attack a particular group of people, but to delegitimize them outright by withholding from them the most basic component of human dignity:  Acknowledging that they exist at all.

A Fresh Take on Tobacco

The U.S. government is thinking about severely regulating the sale of menthol cigarettes, if not banning them outright.

Why is that?

Because menthol cigarettes might be hazardous to your health.

Yes, I was shocked, too.  But apparently it is true that the countless toxins packed into your friendly neighborhood cigarette are not magically disappeared by the addition of a fresh, minty aftertaste.

Who knew?

The particular concern about menthol cigarettes—as might interest those who, like your humble servant, assumed all cigarettes were more or less interchangeable—is that they are more addictive than non-menthol cigarettes, and therefore a riskier habit for young, first-time smokers to take up.

“There is little evidence to suggest that menthol cigarettes are more or less toxic or contribute to more disease risk to smokers than regular cigarettes,” according to a recent Food and Drug Administration review.  “However, there is adequate data to suggest that menthol use is likely associated with increased smoking initiation by younger people and that menthol smokers have a harder time quitting.”

The review went on to explain that “there’s also evidence indicating that menthol’s cooling properties can reduce the harshness of cigarette smoke and that menthol cigarettes are marketed as a smoother alternative.”

This new FDA report makes no explicit recommendation as to whether, and how, the government should act on these fresh findings, although a similar report in 2011 noted that “removing menthol cigarettes from the market would benefit public health.”

In making the case against any further tobacco regulation, one is tempted merely to fall back on all the usual tropes.  You know, the ones about how smoking is an individual’s right and choice—two values upon which the American republic is founded—and that if one is not granted the right to make the wrong choices, one has no rights at all.

Further, that while it is regrettable that the age limit for purchasing cigarettes has long proved to be of limited practical use, we cannot and should not prohibit adults from engaging in adult activities simply because they might also be engaged in by children.

And that it is beyond the competency of the government to determine which activities are good and which are bad.

And that there is no sentient being left in the United States who does not know that, in health terms, smoking is a breathtakingly stupid thing to do.

All of these things are as true as ever they have been.  Any libertarian-minded person can rest content that the moral argument against smoking prohibitions was complete many years ago and requires no further comment.

And yet, one feels somehow obligated to revisit and perhaps recalibrate this pro-tobacco line of logic in light of the unique challenge that menthol cigarettes present.

I noted at the start how, until presented with this information about the effects of menthol, I had assumed all cigarettes were created equal.  While I knew that, like liquor or coke, they came in many colors, names and brands, I nonetheless figured that their overall effect on one’s system was the same.

My inkling, and my concern, is that many other people are equally unaware of the difference between menthol and non-menthol cigarettes, not knowing that the former, by design, tend to be more addictive than the latter.

Taking this to be true, it stands to reason that an aspiring teenage smoker who might be capable of handling the poisons in non-menthol cigarettes will, for reasons of taste, opt for menthols under the impression that they are no more crippling than the regular brands, and wind up with an addiction that proves to be a bit more than he bargained for.

Accordingly, the case for applying special legal scrutiny to menthol, relative to non-menthol, would seem to rest on the principle of full disclosure.

Lest we forget, the original war on Big Tobacco was based not on the fact that cigarettes are poison, but rather that the companies selling them insisted that they were not.

If we are to regulate menthol in a stronger way, that is the basis on which we should do so:  By informing menthol’s current and potential users precisely what it is they are putting in their mouths, thereby allowing them to smoke, suffer and die in the most intellectually honest possible way.

Monsters Among Us

Maybe it’s just me, but I found the kid who was drenched in blood, with a red laser dot pointed directly at his forehead, far more sympathetic than the cute, cleaned-up one on the cover of Rolling Stone.

No one likes a preening, narcissistic prima donna with perfect skin.  But someone who spent an entire day rotting in a pool of his own fluids, unable to tend to several dozen open wounds, looking positively defeated when finally taken into custody?  The poor dear.

Rolling Stone ruffled all sorts of feathers with the release of its current issue, whose cover is occupied by the pretty-boy face of Dzhokhar Tsarnaev, the survivor of the two Boston Marathon bombers.  Critics howled that the image romanticizes Tsarnaev, making him out to be some sort of “rock star.”

In response, a Massachusetts State Police sergeant leaked heretofore unreleased photographs depicting the capture of Tsarnaev at the end of a daylong manhunt, to remind everyone of the barbarian he really is.

Reading “Jahar’s World,” the article about Tsarnaev in Rolling Stone—which I strongly recommend to everyone—I sensed the light bulb above my head illuminate.  For the first time since the April 15 attack, I felt like I really got it.

By “it,” I do not mean that I understood Tsarnaev himself, and the reasons he might have had to join his brother, Tamerlan, in executing a savage and unforgivable assault on our free society.  Indeed, such insights are beyond the faculties of anyone outside Tsarnaev’s own head, and perhaps of Tsarnaev himself.

Rather, I now better comprehend the particular context from which Tsarnaev emerged—the one that led Rolling Stone editors to depict him as they did.

Janet Reitman, the author of the piece, quotes Peter Payack, Tsarnaev’s high school wrestling coach, who says of him, “I knew this kid, and he was a good kid.  And, apparently, he’s also a monster.”

That’s it.  That’s the key to the whole business:  The terrible, frightening prospect that the bad guy was also a good guy.

This is entirely distinct from the profile of the typical teenaged mass murderer—the “loner” who “kept to himself” and “didn’t seem to have any friends” and one day brought his machine gun to school and massacred a few dozen classmates.

As Reitman’s article makes clear, Dzhokhar Tsarnaev wasn’t like that at all.  He wasn’t a loner, he didn’t keep to himself, and he had plenty of friends, several of whom Reitman interviewed for the piece.  They seem like friendly, normal folk, and so does he.  No one suspected he was capable of committing an act of terrorism because he never gave anyone a reason to be suspicious.

Indeed, Dzhokhar here is seen not merely as a nice guy, but also as uncommonly generous and a positive influence on his community.

He would often go out of his way to do favors for friends.  He seemed to be perennially carefree and at ease—aided, no doubt, by his apparently bottomless supply of marijuana.

He cared passionately about his Muslim faith, but unlike his elder brother, showed no desire to impose its values upon others.  Indeed, he talked about religion so infrequently that many of his friends—Christians and Jews among them—did not know he was a Muslim at all.

Details like these, taken together, lead one to an inevitable and frightening conclusion:  With just a bit of cosmic shuffling—slight alterations of time and space—Dzhokhar could have been a friend of yours or of mine, and neither of us would necessarily have felt a fool for forging such an acquaintance.

He could have been you or me.

To restate the point made by Payack, the wrestling coach:  Dzhokhar was not a supervillain, devilishly biding his time until the perfect opportunity to unleash holy hell finally presented itself.  Rather, he was a decent kid who committed an evil act, for which he cannot and should not be forgiven.

The question then becomes:  What do we do with this information?  Does any of it really matter?

In legal terms, it matters not one whit.  A crime is a crime, and any good that Dzhokhar might have done prior to April 15 is irrelevant background noise in a court of law.

Probably the only lasting use of the details in “Jahar’s World” will be sociological, forcing us to reconsider what we think we know about human nature and the people with whom we surround ourselves every day.

The problem, as this whole sordid episode suggests, is that the conclusions to which we might ultimately be led may well be too horrible to contemplate.

Open-Ended Grievance

Barack Obama is one of the most thoughtful men ever to occupy the Oval Office.  He is the rare president—nay, the rare politician of any sort—who is a true intellectual, effectively reasoning his way through his job.

Anyone who still doubted the commander-in-chief’s cerebral capacities, having heretofore attributed his rhetorical magic to speechwriters and Teleprompters, was given a rather stern rebuke by the president’s comments on Friday regarding the shooting of Trayvon Martin and subsequent trial of George Zimmerman, who was acquitted of both murder and manslaughter charges last week.

The 17-minute quasi-speech, extemporaneous and flowing directly from the president’s heart, covered much the same territory as his celebrated “race speech” of March 2008, and seemed to make the same broad point:  On matters of race relations, the United States has progressed and matured by leaps and bounds, but is still very much a work in progress.  Racism in America is not nearly as bad as it once was—not by a long shot—but it has not altogether disappeared.  It has merely grown more subtle.

Obama’s central plea in these addresses is for white Americans to understand why many black Americans still feel they have gotten a raw deal from their mother country.  That nearly every black person, at one time or another, has found himself the object of a white person’s fear and/or suspicion for no reason except that he is black.

The implication, in light of the Zimmerman verdict, is that a white person’s irrational, prejudicial views about black people can lead to a senseless killing and, more alarming still, allow one to literally get away with murder.  In other words, this is not merely a philosophical problem.

The popular view about George Zimmerman is that the only reason he considered Trayvon Martin “suspicious,” following him across the neighborhood and thereby provoking a scuffle that led to him shooting Martin dead, is that Martin was black.  Had Martin been white, the theory goes, Zimmerman would not have given Martin’s behavior a second thought and the shooting would never have occurred.

We have no idea if this is true.  Zimmerman denies it, although he could be lying.  The audio of his phone conversation with police has him commenting, “These assholes, they always get away,” but we have no particular cause to assume he had black people in mind.  For years, his and Martin’s gated community had been rife with burglaries, break-ins and the like, committed by people of many skin colors.  Racially speaking, “these assholes” is fairly all-encompassing.

It is with these details in mind that we must consider the president’s observation that personal experiences of white people’s prejudices “inform how the African-American community interprets what happened one night in Florida.  And it’s inescapable for people to bring those experiences to bear.”

My question is this:  For how long will it be “inescapable”?  Under what circumstances will it no longer be morally justified to infer racist motives in cases where such prejudices are not necessarily borne out by the facts?  Assuming a white person harbors racist views is certainly justified by history, but what happens when it’s not justified by the evidence?

The president didn’t say, and I rather wish that he had.

My primary concern (beyond the Zimmerman case) is that the heretofore understandable black suspicion toward white suspicion will endure far beyond its natural lifespan.  That the notion that white people assault black people for purely racial reasons will continue to be accepted as a given, thereby allowing America’s residual racial divides to survive to fight another day.

As a highly imperfect analogy, one might consider certain Jews’ attitudes toward the Federal Republic of Germany.

In the early years following the end of the Second World War, members of the Twelve Tribes could be forgiven for suspecting that folks with German blood were, shall we say, out to get them.  A crime committed by a German against a Jew could reasonably be assumed to have been anti-Semitic in nature.

Today, nearly seven decades since the last gas chambers were shut down, Germany has all but outlawed anti-Semitism within its borders—denying the Holocaust is a criminal offense—and individual Germans tend not to be any more anti-Jewish than other Europeans; if anything, they are less so.

Yet there are countless Jews who still refuse to buy a German car or patronize German businesses, even here in the states.  No one has to explain why this happens, yet we are nonetheless entitled to question whether such behavior is any longer rational or even ethical.  Why should a German teenager automatically suffer for the sins of his grandfather?

The message is not “forgive and forget.”  Some people don’t deserve to be forgiven, having committed crimes that ought always to be remembered as sharply as possible.  Some modern-day Germans (and non-Germans) really are out to get the Jews, just as some white folk really do profile black folk, sometimes in a lethal fashion.

Rather, one should refrain, as much as one can, from combating bad faith with bad faith.  A right, two wrongs do not make.

The ultimate solution, as President Obama correctly noted, is for those still in need of enlightenment on the issue to be given the education they so urgently require.  As we wait for such an eventuality to occur (not that such a project will ever truly be complete), we would do well for ourselves and our society—if I may coin a phrase—to give each other the benefit of the doubt.

Leave Florida Alone

When I was in high school, the concept of self-defense did not exist.

In my high school’s official student handbook, it was made plain that, in the event that two students fought on school grounds, neither one would be granted the presumption of having acted in self-defense.

As a classmate aptly put it, “If someone starts hitting you, the correct answer is to just stand there and keep getting hit until a teacher happens to walk by and break it up.”  Indeed, it seemed that an abjectly passive response to physical harassment was the only way you could be certain not to face disciplinary actions later on.

In the real world, of course, the effective prohibition on defending oneself from harm is utterly unworkable and, in point of fact, morally repugnant.  In practice, it would render one either a sitting duck or an unwitting future prison inmate.  It leaves only the bullies to decide who gets to live or die.

The principle of self-defense is something on which nearly everyone agrees.  The controversy lies only in the details.

There was a great deal of debate about the minutiae of self-defense laws during the trial of George Zimmerman, who last week was acquitted of second-degree murder and manslaughter charges in the shooting of 17-year-old Trayvon Martin.

Indeed, self-preservation was the long and the short of Team Zimmerman’s case, which argued that Martin’s behavior on the night of February 26, 2012, caused Zimmerman to fear for his physical well-being to such a degree that he had no choice but to shoot Martin, which he did.

This rationale proved persuasive enough to the six-member jury, which ruled that Zimmerman’s actions were within the boundaries of Florida state law on the matter, and that he was not to be held criminally liable for Martin’s death.

In the meantime, the state of Florida has been subject to enormous critical ire.  Stevie Wonder vowed never to perform in the state again, while temporary Daily Show host John Oliver declared it the “worst state” in the union.  A spliced-together clip of Bugs Bunny circumcising Florida from the continent with a handsaw circulated across online social networks, and most applauded the idea.

The basis for all this antipathy is the provision in Florida’s self-defense laws known as “stand your ground,” enacted in 2005, which licenses anyone fearing for his life to use deadly force against the person he perceives to be threatening him.

The prevailing view is that the looseness of Florida’s policy is sui generis and the only reason George Zimmerman is now a free man.

The prevailing view is wrong.  In fact, it’s wrong twice.

For all the press that “stand your ground” has received throughout this ordeal, Zimmerman’s attorneys did not specifically cite it in their argument for acquittal.  Instead, they relied on state laws that existed before “stand your ground” was written—clauses that entitle one to use deadly force if one is being savagely attacked, as Zimmerman allegedly was by Martin.  If one takes Zimmerman’s version of events at face value, as the jury did, then the case for self-defense writes itself.

What is more, on the matter of “stand your ground,” Florida is by no means the only state with such low standards for what constitutes justified self-defense.  Not even close.

The “duty to retreat” doctrine—an attempt-to-flee-before-shooting principle that used to underlie common law on the matter—has effectively been done away with in no fewer than two dozen states, which have followed Florida’s lead in putting the onus on the prosecution to prove defensive lethal force was not necessary in a given situation, rather than on the defense to prove that it was.

“Castle law,” the 17th century English concept that one can shoot a threatening person with happy abandon should he enter one’s home, has been expanded to include cars and various public places in a similar number of states.  The details are by no means identical from state to state, but the principle is the same:  If you feel personally endangered and you happen to be armed, fire away.

The fact must be faced:  In today’s America, George Zimmerman could have been acquitted of murdering Trayvon Martin in jurisdictions from coast to coast.  It is not simply a problem for one particular state, even one as silly, dysfunctional and backward as Florida.

Birth of a Monarch

Catherine, Duchess of Cambridge—otherwise known as Kate Middleton, wife of Prince William—is soon to deliver the couple’s first child, an event that has been dubbed “the most anticipated birth since the dawn of Twitter.”

This child, when it does appear, will have precisely one official duty with which to occupy its time:  Waiting for its great-grandmother, grandfather and father to die.

When the last of these dreary eventualities finally comes to pass, the “royal baby” will at last assume what had all along been his or her birthright:  Sitting in a comfy chair and passively waving to the good citizens of Great Britain.

With such exciting prospects for the young whippersnapper, one can understand what all the fuss has been about.

Here in the United States, we have a firmly-entrenched concept known as “American royalty.”  These are our fellow citizens whom we have collectively decided to treat as slightly above us on the cultural food chain—a designation typically earned through a combination of descending from a distinguished family and having good fashion sense.

Ostensibly, we engage in this odd activity to compensate for the fact that America has no actual royalty of its own, as stipulated by our Constitution.

However, rather than explaining away our collective fawning over holier-than-us celebrities, this serves only to raise a further question:  What good is there in doing so in the first place?  Why do we want royalty of any sort?

After all, were America to abruptly fall under an actual hereditary dictatorship, with a leader sitting atop his throne until exhaling his final breath, we would find the whole business intolerable—as we did in the latter years of the 18th century, when we decided to turn our backs on the British Empire and give democratic republicanism the old college try.

We fashion ourselves a fake monarchy as a luxury of not having a real one.

What is most robustly demonstrated by our infatuation with the current British Royal Family, from Princess Diana onward, is our related tendency here in the states to generate cult followings for people who are, as the saying goes, “famous for being famous.”  That is, individuals whose cultural prominence seemingly stems from nothing at all and is, accordingly, utterly undeserved.

In point of fact, the British Royal Family—indeed, any royal family—is the perfect and absolute encapsulation of this horrid concept, demonstrated in no finer way than in the present preoccupation with the incoming heir to the throne.

This royal baby, having yet accomplished nothing other than simply existing, shall for a long time be “famous for being famous” by definition.  He or she will be subject to bottomless media coverage from dawn to dusk, from cradle to grave, for no reason except the cosmic accident of having Alfred the Great for a great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great grandfather.

This dynamic is no less true for Princes William and Harry, their father Prince Charles, or Her Majesty herself, Queen Elizabeth II.  They are all essentially unremarkable people whose high positions in English society were sealed at the moment of conception.

What makes the tabloid-style worship of the Windsor clan just the slightest bit creepy—particularly in the British Isles themselves, where nearly 80 percent of the citizenry still supports the monarchy as an institution—is that the objects of this affection really are royalty.

Lest we forget, the Queen is officially the head of state, as well as the head of the Church of England.  She could, if she wished, unilaterally dismiss the sitting prime minister and appoint, in his place, anyone her heart desired.  As well, she could declare war on Iran or Syria and later bestow pardons on those who committed atrocities in the process.

In practice, such “royal prerogatives” have fallen almost entirely by the wayside.  The Queen has shown no interest in exercising them, instead delegating all business of government to Parliament.  None of her eventual heirs has indicated any intentions to the contrary.

That is entirely beside the point.  That the monarch declines to assume certain absolute powers does not negate the absurdity of the existence of such powers in the first place.  The real test of the monarchy’s so-called popularity will come when the monarch attempts to assert a level of authority that is, after all, nothing less than his or her birthright.

The magnificence and genius of the American presidency, by contrast, lies not in its great powers, but in its great limitations, which are established and enforced not merely by tradition, but by actual written documents by which the president is legally, constitutionally and morally bound.  If and when he attempts to assume too much authority, Congress and the courts are enjoined to intervene.  This does not always work in practice, but that is the fault of individuals, rather than the system itself.

From all this, what we might say of ourselves in the Anglo-American world, on both sides of the Atlantic, is that we prefer our royalty in name but not in practice, with the maximum of glamour and the minimum of power.  In the name of all that is good and holy, let’s keep it that way.

Commentator-in-Chief

President Barack Obama has long been criticized for his reluctance to involve himself in the business of Congress, so often declining to march down Pennsylvania Avenue to personally harangue members of the House and Senate to pass a particular bill, as past presidents have been known to do.

This charge, while sometimes exaggerated, is true enough.

Considering the president’s general aloofness on matters of American governance, it is rather curious that he has no such reticence on matters of the American culture, into which he seems positively itching to dive.

There he was, mere hours after a Florida jury acquitted George Zimmerman of second-degree murder, issuing an official statement reading, in part, “The death of Trayvon Martin was a tragedy.  Not just for his family, or for any one community, but for America.”

This was not the first time the commander-in-chief chimed in on the murder trial that captured the nation’s imagination.  In March, as coverage of the case reached saturation levels, the president intoned, “When I think about this boy, I think about my own kids, and I think every parent in America should be able to understand why it is absolutely imperative that we investigate every aspect of this […] If I had a son, he would look like Trayvon.”

We may well ask:  Why is the leader of the free world commenting about a matter that is the business of local Florida law enforcement?  What concern is the killing of one private citizen by another private citizen to the most powerful man on Earth, that he cannot help but offer his own personal musings about it?

But then, we know the answer to these queries, at least in this particular case.  Obama insinuated himself into the Trayvon Martin conversation because he views it as a “teachable moment” for America on the issue of gun violence.  It is, in his words, an opportunity to “ask ourselves if we’re doing all we can to stem the tide of gun violence,” and to figure out “how we can prevent future tragedies like this.”

Indeed, you might say that, by exploiting a local incident to push his national agenda, Obama is doing culturally what he sometimes fails to do legislatively:  Claiming the moral high ground.  If this is what it takes to govern, he might argue, then so be it.

All the same, this does not make the general practice of presidential involvement in ostensibly low-level news events any less dubious.

The question we must ask is simply this:  Being a figurehead, not just an individual, is the president not obligated to position himself above and slightly removed from the friction of daily life in the United States?  Should he not recuse himself from matters that do not require his attention, in the interest of at least appearing to be disinterested and objective?

Consider a slightly less serious example than Trayvon Martin:  The NCAA tournament.

Every year while he has occupied the Oval Office, Obama has filled out his own March Madness bracket and broadcast it to the nation.  (Indeed, his predictions have proved quite prescient.)

While he has every right to join the millions of his fellow Americans in this sacred spring ritual—as a lifelong basketball enthusiast, he presumably would be doing it anyway—I nonetheless wonder if it is not improper to do so at the White House desk.  It somehow seems beneath the majesty of the office he presently holds.

What must it be like to be, say, a promising 19-year-old freshman point guard and be told the president of the United States has penciled your team in for a first round loss?  It cannot feel great.  It seems to me that the nation’s highest officeholder should have the courtesy to keep out of it.

To be the president—a one-man institution—is to surrender certain privileges for as long as your term endures.  For instance, you cannot run out to the corner florist to buy a bouquet for your mistress (unless, of course, you are Michael Douglas in The American President).  You cannot drive a car or drink to excess, nor can you call in sick because of a particularly splitting headache.

And you can’t offer your opinion on every last aspect of the American culture, no matter how persistently members of the press corps might ask for it.  Some impulses ought to be resisted for the sake of old-fashioned propriety.

Sometimes, the most essential duty of the president is not to participate, but simply to preside.

Core Culinary Competency

Dunkin’ Donuts is in the midst of an identity crisis.

As reported by NBC News (and noticed by regular customers, I suppose), the Massachusetts-based coffee mega-chain has tweaked the look of its cafés in recent weeks, with the sudden appearance of big, cushy chairs, quasi-casual soft lighting and free Wi-Fi service.

That’s right:  Dunkin’ Donuts is turning itself into Starbucks.

As explained in a press release, “The modernized design incorporates many new features to create a warm environment for guests who seek a longer, more relaxed visit to Dunkin’ Donuts as part of their day.”

The article quotes a retail strategist, Todd Hooper, who reasons, “People are eating and drinking around the clock now and working wherever they have to.”  To better serve a growing demand for public spaces in which to do so, he continues, fast food establishments such as Dunkin’ are “going from just a kitchen to being an out-of-home den or office or conference room.”

Certainly, this phenomenon of coffee shop-as-office is not new.  For most people it has existed, albeit in increasingly ubiquitous forms, for as long as Starbucks has—the Seattle-based java house (and Dunkin’s primary competitor) having more or less invented the concept.

But that is precisely my point:  If I want a place to buy coffee and sit down to read or bang on my laptop for the better part of an afternoon, I will go to Starbucks.

However, if I am rather in the mood for patronizing a massive coffee behemoth but do not wish to linger—and if I am perhaps also nursing a hankering for Boston Kreme—then I opt for Dunkin’.

This is the dynamic we, the people, have agreed upon for a very long time, and the only one with which I am truly comfortable.

The term that springs to mind is “core competency.”  This is the notion that a successful business tends to excel at one particular thing, and should simply concentrate on perfecting its aptitude for said thing, rather than attempting to mimic said success on something entirely different.

The current proliferation of long-established fast food joints drastically altering their looks and menus in order to compete for the dollars of busy working folk would seem a flagrant violation of this principle.

For all the business sense it might make to evolve one’s identity to cater to America’s changing needs and tastes, I lament this trend nonetheless, for the intrinsic dishonesty of which it smacks.

In point of fact, Dunkin’ Donuts reupholstering its interior is among the least alarming examples of this.

Within the big name food service industry, the true offenders against the “core competency” rule are the ones undertaking full-scale recalibrations of their actual food, and, in turn, their very selves.

After an increasingly lucrative period of revolutionizing the semi-healthy lunch market, Subway sandwich shops started selling breakfast food in 2010—a move perhaps not quite as odd as the one four years earlier, when the chain introduced pizza.

Amidst the worldwide weight loss craze of the last decade, the undisputed king of unholy portion sizes, the Cheesecake Factory, unveiled its “SkinnyLicious” menu in 2011, featuring dishes of more reasonable acreage and nutritional value.  Comparable “family” restaurants such as Applebee’s and Olive Garden have made similar moves to attract those rare Americans who do not wish to exit in a Category 5 food coma.

And of course there are the enduring hamburger staples like McDonald’s, Wendy’s and Burger King, which have bent over backwards to sell products not completely drenched in salt and trans fats.

From an illuminating recent article in the New York Times, we learn that despite having these plentiful new “healthy options,” even the most otherwise health-conscious McDonald’s customers, by and large, are still ordering deep-fried crap.

While the article quotes all sorts of experts in an attempt to solve this so-called mystery, I suspect the true answer comes from food consultant Darren Tristano, who poses the radical theory that “consumers don’t see fast food as a place to eat healthy.”

You don’t say.

In his book Food Rules, Michael Pollan issues the injunction, “Avoid foods that are pretending to be something they are not.”  For example:  If you want ice cream, eat ice cream.  Don’t waste your time with those non-fat, no-sugar-added “frozen dairy dessert” imposters.

In like spirit, we just might need to face the awful truth that when a person enters McDonald’s, it’s because he wants a goddamned hamburger, because that’s what McDonald’s is for.

Establishments that peddle culinary garbage ought to embrace their unique place in the food universe, and not attempt to be something they are not.  There are certain destinations whose core purpose is, and has always been, to contribute to the great American tradition of slowly eating ourselves to death, and by God, they should not stop now.

To Not Believe

A new study suggests that people who believe in God tend to respond better to psychological therapy than those who do not.

To this, I cannot help but respond:  So what?

Here’s what happened:  In an experiment involving 159 men and women undergoing various forms of counseling, researchers at McLean Hospital in Belmont, Mass., asked the participants about the degree to which they believe, or do not believe, in some sort of God.

As the therapy sessions progressed, the researchers observed that those who professed a belief in the supernatural were faring quite well, while those who denied (or highly doubted) God’s existence were not, relatively speaking.

As well, the study found that participants who expected to be helped by the counseling saw that hope prove self-fulfilling, and that the reverse was also true, with skeptics of the sessions’ promise finding them to be of limited assistance in alleviating their various psychological maladies.

Data points such as these raise all sorts of questions regarding causality and the possibility that God has nothing to do with it.  After all, once we have established (as this study apparently has) that simply wishing for a positive outcome from therapy will nearly always produce one, has the “mystery” not already been solved?

Nonetheless, I view the results and conclusions of this research as a sparkling opportunity to examine another inevitable question that it raises:  What does it mean to be an atheist?

As a nonbeliever, I find myself possessing a particular stake in the answer to this question.  What is more, in light of the uncommonly high saturation of press coverage the cause of atheism has accrued in recent years, I feel compelled to address and, in some cases, correct a series of assumptions the godly community has about us heathens, some of which (in their defense) the above study has seemingly confirmed.

Stephen Fry, the polymathic British actor, recently affirmed his own godlessness on the Late Late Show with Craig Ferguson, only to then add, “Being an atheist doesn’t mean anything to me.”  By way of explanation, he continued thusly:

If there was a word to describe someone who doesn’t believe in the Tooth Fairy—a ‘flimpist,’ for example—I’d have to say I’m a flimpist.  But being a flimpist is meaningless.  It just means I don’t believe in the Tooth Fairy.  It doesn’t involve a set of values.

My thoughts exactly.  So far as I am concerned, to not believe in God means precisely that.  By definition, all those who fashion themselves atheists share the view that there is no reason to think the universe we inhabit was designed, and is lorded over, by a singular, intelligent being that sees us when we’re sleeping and knows when we’re awake.

However, that is all that we have in common.  To take this single shared conviction and infer a laundry list of additional characteristics—well, it would be as silly as for me to say that all religious folk think the Earth is flat and that men and dinosaurs once lived side by side.

Beware the hazards of painting with a broad brush.

Probably the most damning charge against the atheist community—were it to be true—is that not to believe in God is not to believe in objective morality.  That without God, life is cold and meaningless.

Are there people who believe such things to be true?  Of course there are.  In fact, we have ready-made terms for them, such as “nihilist,” “sociopath” and “party pooper.”

What this has to do with atheism, I cannot say.

One can, if one desires, extrapolate the view that the universe contains no divine father figure to mean there is no reason to treat others with dignity and respect.  Equally, one can surmise (as many do) that God’s eternal presence and grace licenses one to strap on a vest, totter into a crowded marketplace and blow up a few hundred innocent men, women and children.

However, neither of these trains of thought is inevitable, universal or (most importantly) the slightest bit rational.  They are non sequiturs of the most profound sort.  That they are sometimes true matters not one whit.

That is why I am skeptical that the correlation the McLean Hospital study appears to have drawn between belief in God and susceptibility to psychological counseling has any real meaning.

Sometimes therapy works, and sometimes it does not.  Does the wiring in one’s brain that leads one to doubt the existence of the supernatural make one impervious to therapy?  Perhaps it does.  Let us investigate further.

However, I would humbly advise that one take care not to infer more than one possibly could about what these findings truly say about what it means to not believe in God.

Watching ‘Big Brother’

When one is a chronicler of, and complainer about, the myriad times when things in the American culture are done wrong, one is especially grateful for, and duty-bound to report upon, the instances when things are done exactly right.

I must confess that I have never subjected myself to a single episode of the CBS reality series Big Brother.  It’s nothing personal.  I have never subjected myself to any other reality TV series, either.  The genre has never much appealed to me, as witnessing a group of silly, exhibitionistic goobers locked in a giant house yelling at each other somehow is not my idea of an enjoyable evening in front of the old set.

However, I concede that for millions of Americans, the exact opposite is the case.  In light of the controversy that erupted on the set of Big Brother in recent days, I can almost understand why.

The kerfuffle involves a duo of participants in the program’s 15th season, which premiered on June 26, who apparently do not think very highly of their non-white fellow competitors.

As first revealed in the show’s live feed—the Truman Show-like setup that paying customers can access via YouTube—a cast member named Aaryn Gries has delivered such charming turns of phrase as suggesting the show’s Asian contestant should “shut up [and] go make some rice,” and musing that a gay contestant might win the competition because “everybody loves the queers.”  Another cast member, GinaMarie Zimmerman, has uttered similar slurs against Asians, as well as referring to welfare programs as “nigger insurance.”

There are plentiful other comments by Gries and Zimmerman—and by others, for that matter—that we might cite, but I dare say a rough sketch of the atmosphere on the set has been sufficiently rendered.

That is the setup.  The question was how the various parties involved would react.  To a person, they have all performed precisely as they should.

CBS, the network that produces the program, did not air every last racist, sexist and homophobic barb that has crossed the transom (there aren’t enough hours in the day), but it included an impressive cross section in the episodes in which they occurred—enough of them to prove that network executives are prepared to stand behind the content of their reality TV project, no matter how offensive it might be to the unsuspecting viewer.

(Naturally, CBS was compelled to release a disclaimer that the views expressed by Big Brother cast members are not necessarily those of the network.)

Let us not delude ourselves into thinking the decision to enforce a no-holds-barred policy toward its programming is motivated by anything other than money and ratings.  Conflict and bad attitudes are the lifeblood of reality TV, and there is nothing particularly noble about embracing and exploiting them when they become especially ugly.  (Thus far, the show’s ratings have held steady.)

Indeed, the premise upon which Big Brother is founded is designed to generate precisely this sort of controversy.  For the uninitiated:  The program is a series of contests among its group of misfits, who live together in a mansion under continuous surveillance, are “evicted” one by one, and until then are completely isolated from the outside world.

CBS would be disingenuous to feign innocence about what it was getting itself into in green-lighting such a concept.  The network has not flinched, seeing its conceit through to its logical, wretched conclusion.  To that extent, it deserves our grudging respect.

In that same spirit of openness and intellectual honesty, people like Gries and Zimmerman can be said to have done the world a service by revealing themselves to be the almost comically awful human specimens that they are.  There is a special sort of gall in saying what no one wants to hear in the full knowledge that everyone is listening.  It shows character, albeit the poorest, most reprehensible sort.

As the icing on this sad, sorry cake, we find that among those listening were Gries’ and Zimmerman’s employers—a modeling agency and a pageant planning company, respectively—which relieved the women of their duties shortly after their poisonous comments became known and widely distributed.

Were they wrong to do so?  Is a modeling agency duty-bound to stand by one of its cover girls after she is shown spewing racial slurs on camera with no apparent regret?  Don’t make me laugh.  The First Amendment allows contestants on a game show to say stupid things, but it does not prevent disagreeable consequences from befalling them as a result.

In sum, this Big Brother business is a sordid, silly affair with no great cultural significance, except as an illustration—as I suggested at the top—that sometimes the messy minutiae of such squabbles play out precisely as they ought to.  We should rejoice when such phenomena occur.

Well done, everyone.  Keep up the good work.