Don’t Say ‘Gay’

Mahmoud Ahmadinejad, the outgoing president of Iran, famously made an unholy spectacle of himself in a 2007 speech at Columbia University when he asserted, “In Iran, we don’t have homosexuals, like in your country.”

Today in Russia, President Vladimir Putin is trying to make this literally true.

As part of a broader crackdown on gay rights activities of various sorts, Russia recently passed a law against “homosexual propaganda,” making it illegal to “spread information about non-traditional sexual behavior” to children under 18 years old.  Transgressors will be penalized with heavy fines, and violators from other countries will be subject to deportation.

This “propaganda” legislation joins similar measures against public homosexuality in the Russian Federation, such as the prohibition on foreign same-sex couples adopting Russian children, as well as the general practice by Russian police of breaking up gay rights marches and demonstrations, often violently, and detaining some of their participants.

As well, assaults on gay Russians by straight Russians have run rampant in the country for a long time.  Anti-gay sentiment apparently cuts wide and deep, with 88 percent of the public giving the new “propaganda” law a thumbs-up, according to a state-run polling organization.  (The trustworthiness of such opinion-gathering outfits is in some dispute, but one suspects this one is not too far off.)

Here in America, most of this official anti-gay policy strikes us as positively barbaric, the sort of thing no longer tolerated in our open, pluralistic culture.

Or is it?

Reading about the “propaganda” law, I could not help but be reminded of the kerfuffle in Tennessee at the beginning of this year over what came to be known as the “Don’t Say ‘Gay’ bill.”  Proposed by State Senator Stacey Campfield, the bill, if passed, would have effectively banished all discussion of homosexuality in the state’s elementary schools.

“At grade levels pre-K through eight,” the bill stipulated, “any such classroom instruction, course materials or other informational resources that are inconsistent with natural human reproduction shall be classified as inappropriate for the intended student audience and, therefore, shall be prohibited.”

Sound familiar?  Is there any part of that sentence of which Vladimir Putin would not approve?

This is not to say that Tennessee is as bad on gay rights as Russia, per se.  Campfield’s bill never actually passed muster in either house of the state’s legislature, having died in committee.  Further, no American state government is systematically rounding up pro-gay rights agitators, as they regularly are under the Putin regime and in many other hellholes around the world, particularly in Africa.

What should nonetheless command our attention here in the states—the one way in which the shenanigans in Tennessee mirror the shenanigans in Russia—is the leading role that language plays in the battle over gay civil rights around the world.

The central irony of the new “propaganda” law—unmistakable and essential—is how it is, itself, an exercise in propaganda.

While everyone with any sense knows that homosexuality in some Homo sapiens is an objective fact of life that cannot simply be wished away, this legislation seeks to do precisely that.

In a Russia whose population has flatlined over the last several decades, where the prospect of insufficient heterosexual congress presents itself as an existential threat, homosexual intimacy can reasonably be seen (by the homophobia-inclined) as slightly beside the point.

The Russian government has been actively encouraging procreative sex for years.  In this way, the “propaganda” law can be seen as complementary and then some—a means not merely to discourage one form of non-procreative sex, but to deny its very existence.

Conceivably, this would make sense if human sexuality were a choice, as some apparently still believe, and homosexual relations were merely a form of rebellion against social mores, as some apparently also still believe.

The problem is that this is not the case.  Homosexuality exists whether a government wants it to or not, which means any attempt to argue or legislate to the contrary will ultimately be futile and subject to the sort of ridicule President Ahmadinejad faced when he suggested Iran was immune to the gay germ.

“Don’t Say ‘Gay’” policies are not merely an affront to gay people, you see, but an affront to truth.

Not unlike neo-Nazi denial of the Holocaust or Turkish denial of the Armenian Genocide, they are an attempt not merely to attack a particular group of people, but to delegitimize them outright by withholding from them the most basic component of human dignity:  Acknowledging that they exist at all.

A Fresh Take on Tobacco

The U.S. government is thinking about severely regulating the sale of menthol cigarettes, if not banning them outright.

Why is that?

Because menthol cigarettes might be hazardous to your health.

Yes, I was shocked, too.  But apparently it is true that the countless toxins embedded in your friendly neighborhood cigarette are not magically disappeared by the addition of a fresh, minty aftertaste.

Who knew?

The particular concern about menthol cigarettes—as might interest those who, like your humble servant, assumed all cigarettes were more or less interchangeable—is that they are more addictive than non-menthol cigarettes, and therefore a riskier habit for young, first-time smokers to take up.

“There is little evidence to suggest that menthol cigarettes are more or less toxic or contribute to more disease risk to smokers than regular cigarettes,” according to a recent Food and Drug Administration review.  “However, there is adequate data to suggest that menthol use is likely associated with increased smoking initiation by younger people and that menthol smokers have a harder time quitting.”

The review went on to explain that “there’s also evidence indicating that menthol’s cooling properties can reduce the harshness of cigarette smoke and that menthol cigarettes are marketed as a smoother alternative.”

This new FDA report makes no explicit recommendation as to whether, and how, the government should act on these fresh findings, although a similar report in 2011 noted that “removing menthol cigarettes from the market would benefit public health.”

In making the case against any further tobacco regulation, one is tempted merely to fall back on all the usual tropes.  You know, the ones about how smoking is an individual’s right and choice—two values upon which the American republic is founded—and that if one is not granted the right to make the wrong choices, one has no rights at all.

Further, that while it is regrettable that the age limit for purchasing cigarettes has long proved to be of limited practical use, we cannot and should not prohibit adults from engaging in adult activities simply because they might also be engaged in by children.

And that it is beyond the competency of the government to determine which activities are good and which are bad.

And that there is no sentient being left in the United States who does not know that, in health terms, smoking is a breathtakingly stupid thing to do.

All of these things are as true as ever they have been.  Any libertarian-minded person can rest content that the moral argument against smoking prohibitions was settled many years ago and requires no further comment.

And yet, one feels somehow obligated to revisit and perhaps recalibrate this pro-tobacco line of logic in light of the unique challenge that menthol cigarettes present.

I noted at the start how, until presented with this information about the effects of menthol, I had assumed all cigarettes were created equal.  While I knew that, like liquor or coke, they came in many colors, names and brands, I nonetheless figured that their overall effect on one’s system was the same.

My inkling, and my concern, is that many other people are equally unaware of the difference between menthol and non-menthol cigarettes, not knowing that the former, by design, tend to be more addictive than the latter.

Taking this to be true, it stands to reason that an aspiring teenage smoker who might be capable of handling the poisons in non-menthol cigarettes will, for reasons of taste, opt for menthols under the impression that they are no more crippling than the regular brands, and wind up with an addiction that proves to be a bit more than he bargained for.

Accordingly, the case for applying special legal scrutiny to menthol, relative to non-menthol, would seem to rest on the principle of full disclosure.

Lest we forget, the original war on Big Tobacco was based not on the fact that cigarettes are poison, but rather that the companies selling them insisted that they were not.

If we are to regulate menthol in a stronger way, that is the basis on which we should do so:  By informing menthol’s current and potential users precisely what it is they are putting in their mouths, thereby allowing them to smoke, suffer and die in the most intellectually honest possible way.

Monsters Among Us

Maybe it’s just me, but I found the kid who was drenched in blood, with a red laser dot pointed directly at his forehead, far more sympathetic than the cute, cleaned-up one on the cover of Rolling Stone.

No one likes a preening, narcissistic prima donna with perfect skin.  But someone who spent an entire day rotting in a pool of his own fluids, unable to tend to several dozen open wounds, looking positively defeated when finally taken into custody?  The poor dear.

Rolling Stone ruffled all sorts of feathers with the release of its current issue, whose cover is occupied by the pretty-boy face of Dzhokhar Tsarnaev, the surviving Boston Marathon bomber.  Critics howled that the image romanticizes Tsarnaev, making him out to be some sort of “rock star.”

In response, a Massachusetts State Police sergeant leaked heretofore classified photographs depicting the capture of Tsarnaev at the end of a daylong manhunt, to remind everyone of the barbarian he really is.

Reading “Jahar’s World,” the article about Tsarnaev in Rolling Stone—which I strongly recommend to everyone—I sensed the light bulb above my head illuminate.  For the first time since the April 15 attack, I felt like I really got it.

By “it,” I do not mean that I understood Tsarnaev himself, and the reasons he might have had to join his brother, Tamerlan, in executing a savage and unforgivable assault on our free society.  Indeed, such insights are beyond the faculties of anyone outside Tsarnaev’s own head, and perhaps Tsarnaev himself.

Rather, I now better comprehend the particular context from which Tsarnaev emerged—the one that led Rolling Stone editors to depict him as they did.

Janet Reitman, the author of the piece, quotes Peter Payack, Tsarnaev’s high school wrestling coach, who says of him, “I knew this kid, and he was a good kid.  And, apparently, he’s also a monster.”

That’s it.  That’s the key to the whole business:  The terrible, frightening prospect that the bad guy was also a good guy.

This is entirely distinct from the profile of the typical teenaged mass murderer—the “loner” who “kept to himself” and “didn’t seem to have any friends” and one day brought his machine gun to school and massacred a few dozen classmates.

As Reitman’s article makes clear, Dzhokhar Tsarnaev wasn’t like that at all.  He wasn’t a loner, he didn’t keep to himself, and he had plenty of friends, several of whom Reitman interviewed for the piece.  They seem like friendly, normal folk, and so does he.  No one suspected he was capable of committing an act of terrorism because he never gave anyone a reason to be suspicious.

Indeed, Dzhokhar here is seen not merely as a nice guy, but also as uncommonly generous and a positive influence on his community.

He would often go out of his way to do favors for friends.  He seemed to be perennially carefree and at ease—aided, no doubt, by his apparently bottomless supply of marijuana.

He cared passionately about his Muslim faith, but unlike his elder brother, showed no desire to enforce its values upon others.  Indeed, he talked about religion so infrequently that many of his friends—Christians and Jews among them—did not know he was a Muslim at all.

Details like these, taken together, lead one to an inevitable and frightening conclusion:  With just a bit of cosmic shuffling—slight alterations of time and space—Dzhokhar could have been a friend of yours or of mine, and neither of us would necessarily have felt a fool for forging such an acquaintance.

He could have been you or me.

To restate the point made by Payack, the wrestling coach:  Dzhokhar was not a supervillain, devilishly biding his time until the perfect opportunity to unleash holy hell finally presented itself.  Rather, he was a decent kid who committed an evil act, for which he cannot and should not be forgiven.

The question then becomes:  What do we do with this information?  Does any of it really matter?

In legal terms, it matters not one whit.  A crime is a crime, and any good that Dzhokhar might have done prior to April 15 is irrelevant background noise in a court of law.

Probably the only lasting use of the details in “Jahar’s World” will be sociological, forcing us to pause and reconsider what we think we know about human nature and about the people with whom we surround ourselves every day.

The problem, as this whole sordid episode suggests, is that the conclusions to which we might ultimately be led may well be too horrible to contemplate.

Open-Ended Grievance

Barack Obama is one of the most thoughtful men ever to occupy the Oval Office.  He is the rare president—nay, the rare politician of any sort—who is a true intellectual, effectively reasoning his way through his job.

Anyone who still doubted the commander-in-chief’s cerebral capacities, having heretofore attributed his rhetorical magic to speechwriters and Teleprompters, was given a rather stern rebuke by the president’s comments on Friday regarding the shooting of Trayvon Martin and subsequent trial of George Zimmerman, who was acquitted of both murder and manslaughter charges last week.

The 17-minute quasi-speech, extemporaneous and flowing directly from the president’s heart, covered similar territory as his celebrated “race speech” of March 2008, and seemed to make the same broad point:  On matters of race relations, the United States has progressed and matured by leaps and bounds, but is still very much a work in progress.  Racism in America is not nearly as bad as it once was—not by a long shot—but it has not altogether disappeared.  It has merely grown more subtle.

Obama’s central plea in these addresses is for white Americans to understand why many black Americans still feel they have gotten a raw deal from their mother country.  That nearly every black person, at one time or another, has found himself the object of a white person’s fear and/or suspicion for no reason except that he is black.

The implication, in light of the Zimmerman verdict, is that a white person’s irrational, prejudicial views about black people can lead to a senseless killing and, more alarming still, allow one to literally get away with murder.  In other words, this is not merely a philosophical problem.

The popular view about George Zimmerman is that the only reason he considered Trayvon Martin “suspicious,” following him across the neighborhood and thereby provoking a scuffle that led to his shooting Martin dead, was that Martin was black.  Had Martin been white, the theory goes, Zimmerman would not have given Martin’s behavior a second thought and the shooting would never have occurred.

We have no idea if this is true.  Zimmerman denies it, although he could be lying.  The audio of his phone conversation with police has him commenting, “These assholes, they always get away,” but we have no particular cause to assume he had black people in mind.  For years, his and Martin’s gated community had been rife with burglaries, break-ins and the like, committed by people of many skin colors.  Racially speaking, “these assholes” is fairly all-encompassing.

It is with these details in mind that we must consider the president’s observation that personal experiences of white people’s prejudices “inform how the African-American community interprets what happened one night in Florida.  And it’s inescapable for people to bring those experiences to bear.”

My question is this:  For how long will it be “inescapable”?  Under what circumstances will it no longer be morally justified to infer racist motives in cases where such prejudices are not necessarily borne out by the facts?  Assuming a white person harbors racist views is certainly justified by history, but what happens when it’s not justified by the evidence?

The president didn’t say, and I rather wish that he had.

My primary concern (beyond the Zimmerman case) is that the heretofore understandable black suspicion toward white suspicion will endure far beyond its natural lifespan.  That the notion that white people assault black people for purely racial reasons will continue to be accepted as a given, thereby allowing America’s residual racial divides to survive to fight another day.

As a highly imperfect analogy, one might consider certain Jews’ attitudes toward the Federal Republic of Germany.

In the early years following the end of the Second World War, members of the Twelve Tribes could be forgiven for suspecting that folks with German blood were, shall we say, out to get them.  A crime committed by a German against a Jew could reasonably be assumed to have been anti-Semitic in nature.

Today, nearly seven decades since the last gas chambers were extinguished, Germany has all but outlawed anti-Semitism within its borders—denying the Holocaust is a criminal offense—and individual Germans tend not to be any more anti-Jewish than other Europeans; if anything, they are less so.

Yet there are countless Jews who still refuse to buy a German car or patronize German businesses, even here in the states.  No one has to explain why this happens, yet we are nonetheless entitled to question whether such behavior is any longer rational or even ethical.  Why should a German teenager automatically suffer for the sins of his grandfather?

The message is not “forgive and forget.”  Some people don’t deserve to be forgiven, having committed crimes that ought always to be remembered as sharply as one can muster.  Some modern-day Germans (and non-Germans) really are out to get the Jews, just as some white folk really do profile black folk, sometimes in a lethal fashion.

Rather, one should refrain, as much as one can, from combating bad faith with bad faith.  A right, two wrongs do not make.

The ultimate solution, as President Obama correctly noted, is for those still in need of enlightenment on the issue to be given the education they so urgently require.  As we wait for such an eventuality to occur (not that such a project will ever truly be complete), we would do well for ourselves and our society—if I may coin a phrase—to give each other the benefit of the doubt.

Leave Florida Alone

When I was in high school, the concept of self-defense did not exist.

In my high school’s official student handbook, it was made plain that, in the event that two students fought on school grounds, neither one would be granted the presumption of having acted in self-defense.

As a classmate aptly put it, “If someone starts hitting you, the correct answer is to just stand there and keep getting hit until a teacher happens to walk by and break it up.”  Indeed, it seemed that an abjectly passive response to physical harassment was the only way you could be certain not to face disciplinary actions later on.

In the real world, of course, the effective prohibition on defending oneself from harm is utterly unworkable and, in point of fact, morally repugnant.  In practice, it would render one either a sitting duck or an unwitting future prison inmate.  It leaves only the bullies to decide who gets to live or die.

The principle of self-defense is something on which nearly everyone agrees.  The controversy lies only in the details.

There was a great deal of debate about the minutiae of self-defense laws during the trial of George Zimmerman, who last week was acquitted of second-degree murder and manslaughter charges in the shooting of 17-year-old Trayvon Martin.

Indeed, self-preservation was the long and the short of Team Zimmerman’s case, which argued that Martin’s behavior on the night of February 26, 2012, caused Zimmerman to fear for his physical well-being to such a degree that he had no choice but to shoot Martin, which he did.

This rationale proved persuasive enough to the six-member jury, which ruled that Zimmerman’s actions were within the boundaries of Florida state law on the matter, and that he was not to be held criminally liable for Martin’s death.

In the meantime, the state of Florida has been subject to enormous critical ire.  Stevie Wonder vowed never to perform in the state again, while temporary Daily Show host John Oliver declared it the “worst state” in the union.  A spliced-together clip of Bugs Bunny circumcising Florida from the continent with a handsaw circulated across online social networks, and most who saw it applauded the idea.

The basis for all this antipathy is the provision in Florida’s self-defense laws known as “stand your ground,” enacted in 2005, which licenses anyone fearing for his life to use deadly force against the person he perceives to be threatening him.

The prevailing view is that the looseness of Florida’s policy is sui generis and the only reason George Zimmerman is now a free man.

The prevailing view is wrong.  In fact, it’s wrong twice.

For all the press that “stand your ground” has received throughout this ordeal, Zimmerman’s attorneys did not specifically cite it in their argument for acquittal.  Instead, they relied on state laws that existed before “stand your ground” was written—clauses that entitle one to execute deadly force if one is being savagely attacked, as Zimmerman allegedly was by Martin.  If one takes Zimmerman’s version of events at face value, as the jury did, then the case for self-defense writes itself.

What is more, on the matter of “stand your ground,” Florida is by no means the only state with such low standards for what constitutes justified self-defense.  Not even close.

The “duty to retreat” doctrine—the common-law requirement that one attempt to flee before resorting to deadly force—has effectively been done away with in no fewer than two dozen states, which have followed Florida’s lead in putting the onus on the prosecution to prove that defensive lethal force was not necessary in a given situation, rather than on the defense to prove that it was.

“Castle law,” the 17th-century English concept that one can shoot a threatening person with happy abandon should he enter one’s home, has been expanded to include cars and various public places in a similar number of states.  The details are by no means identical from state to state, but the principle is the same:  If you feel personally endangered and you happen to be armed, fire away.

The fact must be faced:  In today’s America, George Zimmerman could have been acquitted of murdering Trayvon Martin in jurisdictions from coast to coast.  It is not simply a problem for one particular state, even one as silly, dysfunctional and backward as Florida.

Birth of a Monarch

Catherine, Duchess of Cambridge—otherwise known as Kate Middleton, wife of Prince William—is soon to deliver the couple’s first child, an event that has been dubbed “the most anticipated birth since the dawn of Twitter.”

This child, when it does appear, will have precisely one official duty with which to occupy its time:  Waiting for its great-grandmother, grandfather and father to die.

When the last of these dreary eventualities finally comes to pass, the “royal baby” will at last assume what had all the time been his or her birthright:  Sitting in a comfy chair and passively waving to the good citizens of Great Britain.

With such exciting prospects for the young whippersnapper, one can understand what all the fuss has been about.

Here in the United States, we have a firmly-entrenched concept known as “American royalty.”  These are our fellow citizens whom we have collectively decided to treat as slightly above us on the cultural food chain—a designation typically earned through a combination of descending from a distinguished family and having good fashion sense.

Ostensibly, we engage in this odd activity to compensate for the fact that America has no actual royalty of its own, as stipulated by our Constitution.

However, rather than explaining away our collective fawning over holier-than-us celebrities, this serves only to raise a further question:  What good is there in doing so in the first place?  Why do we want royalty of any sort?

After all, were America to abruptly fall under an actual hereditary dictatorship, with a leader sitting atop his throne until exhaling his final breath, we would find the whole business intolerable—as we did in the latter years of the 18th century, when we decided to turn our backs on the British Empire and give democratic republicanism the old college try.

We fashion ourselves a fake monarchy as a luxury of not having a real one.

What is most robustly demonstrated by our infatuation with the current British Royal Family, from Princess Diana onward, is our related tendency here in the states to generate cult followings for people who are, as the saying goes, “famous for being famous.”  That is, individuals whose cultural prominence seemingly stems from nothing at all and is, accordingly, utterly undeserved.

In point of fact, the British Royal Family—indeed, any royal family—is the perfect and absolute encapsulation of this horrid concept, demonstrated in no finer way than in the present preoccupation with the incoming heir to the throne.

This royal baby, having yet accomplished nothing other than simply existing, shall for a long time be “famous for being famous” by definition.  He or she will be subject to bottomless media coverage from dawn to dusk, from cradle to grave, for no reason except the cosmic accident of having Alfred the Great for a great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great-great grandfather.

This dynamic is no less true for Princes William and Harry, their father Prince Charles, or Her Majesty herself, Queen Elizabeth II.  They are all essentially unremarkable people whose high positions in English society were sealed at the moment of conception.

What makes the tabloid-style worship of the Windsor clan just the slightest bit creepy—particularly in the British Isles themselves, where nearly 80 percent of the citizenry still supports the monarchy as an institution—is that the objects of this affection really are royalty.

Lest we forget, the Queen is officially the head of state, as well as the head of the Church of England.  She could, if she wished, unilaterally dismiss the sitting prime minister and appoint, in his place, anyone her heart desired.  As well, she could declare war on Iran or Syria and later bestow pardons on those who committed atrocities in the process.

In practice, such “royal prerogatives” have fallen almost entirely by the wayside.  The Queen has shown no interest in exercising them, instead delegating all business of government to Parliament.  None of her eventual heirs has indicated any intentions to the contrary.

That is entirely beside the point.  That the monarch declines to assume certain absolute powers does not negate the absurdity of the existence of such powers in the first place.  The real test of the monarchy’s so-called popularity will come when the monarch attempts to assert a level of authority that is, after all, nothing less than his or her birthright.

The magnificence and genius of the American presidency, by contrast, lies not in its great powers, but in its great limitations, which are established and enforced not merely by tradition, but by actual written documents by which the president is legally, constitutionally and morally bound.  If and when he attempts to assume too much authority, Congress and the courts are enjoined to intervene.  This does not always work in practice, but that is the fault of individuals, rather than the system itself.

From all this, what we might say of ourselves in the Anglo-American world, on both sides of the Atlantic, is that we prefer our royalty in name but not in practice, with the maximum of glamour and the minimum of power.  In the name of all that is good and holy, let’s keep it that way.

Commentator-in-Chief

President Barack Obama has long been criticized for his reluctance to involve himself in the business of Congress, so often declining to march down Pennsylvania Avenue to personally harangue members of the House and Senate to pass a particular bill, as past presidents have been known to do.

This charge, while sometimes exaggerated, is true enough.

Considering the president’s general aloofness on matters of American governance, it is rather curious that he shows no such reticence on matters of American culture, into which he seems positively itching to dive.

There he was, mere hours after a Florida jury acquitted George Zimmerman of second-degree murder, issuing an official statement reading, in part, “The death of Trayvon Martin was a tragedy.  Not just for his family, or for any one community, but for America.”

This was not the first time the commander-in-chief had chimed in on the case that captured the nation’s imagination.  In March, as coverage of the case reached saturation levels, the president intoned, “When I think about this boy, I think about my own kids, and I think every parent in America should be able to understand why it is absolutely imperative that we investigate every aspect of this […] If I had a son, he would look like Trayvon.”

We may well ask:  Why is the leader of the free world commenting about a matter that is the business of local Florida law enforcement?  What concern is the killing of one private citizen by another private citizen to the most powerful man on Earth, that he cannot help but offer his own personal musings about it?

But then, we know the answer to these queries, at least in this particular case.  Obama insinuated himself into the Trayvon Martin conversation because he views it as a “teachable moment” for America on the issue of gun violence.  It is, in his words, an opportunity to “ask ourselves if we’re doing all we can to stem the tide of gun violence,” and to figure out “how we can prevent future tragedies like this.”

Indeed, you might say that, by exploiting a local incident to push his national agenda, Obama is doing culturally what he sometimes fails to do legislatively:  Claiming the moral high ground.  If this is what it takes to govern, he might argue, then so be it.

All the same, this does not make the general practice of presidential involvement in ostensibly low-level news events any less dubious.

The question we must ask is simply this:  As a figurehead, and not merely an individual, is the president not obligated to position himself above, and slightly removed from, the friction of daily life in the United States?  Should he not recuse himself from matters that do not require his attention, in the interest of at least appearing to be disinterested and objective?

Consider a slightly less serious example than Trayvon Martin:  The NCAA tournament.

Every year while he has occupied the Oval Office, Obama has filled out his own March Madness bracket and broadcast it to the nation.  (Indeed, his predictions have proved quite prescient.)

While he has every right to join the millions of his fellow Americans in this sacred spring ritual—as a lifelong basketball enthusiast, he presumably would be doing it anyway—I nonetheless wonder if it is not improper to do so at the White House desk.  It somehow seems beneath the majesty of the office he presently holds.

What must it be like to be, say, a promising 19-year-old freshman point guard and be told the president of the United States has penciled your team in for a first round loss?  It cannot feel great.  It seems to me that the nation’s highest officeholder should have the courtesy to keep out of it.

To be the president—a one-man institution—is to surrender certain privileges for as long as your term endures.  For instance, you cannot run out to the corner florist to buy a bouquet for your mistress (unless, of course, you are Michael Douglas in The American President).  You cannot drive a car or drink to excess, nor can you call in sick because of a particularly splitting headache.

And you can’t offer your opinion on every last aspect of American culture, no matter how persistently members of the press corps might ask for it.  Some impulses ought to be resisted for the sake of old-fashioned propriety.

Sometimes, the most essential duty of the president is not to participate, but simply to preside.