Not Pleading the Fifth

Of all the reactions to Donald Sterling, I liked President Obama’s most of all.

“When ignorant folks want to advertise their ignorance,” said the commander-in-chief upon being asked his take, “you don’t really have to do anything—you just let them talk.”

I couldn’t have put it better or more succinctly myself.  In many ways, that is all that really needs to be said about the episode to which we refer.  At any rate, it would certainly be nice to think so.

As a non-fan of the NBA, I didn’t know who Donald Sterling was until this past weekend.  Now that I do, I wish I still didn’t.

For those who have managed to remain blissfully unaware until now:  Sterling is the longtime owner of the Los Angeles Clippers, and is on record as being a blithering, racist thug.

In the audio of a conversation with his then-girlfriend, recently obtained by the ever-illuminating TMZ, Sterling says, “It bothers me a lot that you want to broadcast that you’re associating with black people,” later adding, “You can sleep with [black people].  You can bring them in, you can do whatever you want.   The little I ask you is not to promote it on that…and not to bring them to my games.”

This plea was apparently triggered by an Instagram photo the woman uploaded, which depicted her with some black guy by the name of Magic Johnson.

An official statement by the Clippers insists that “what is reflected on that recording is not consistent with, nor does it reflect [Sterling’s] views, beliefs or feelings.”  While that may be true, it is nonetheless consistent with pretty much everything else Sterling has ever said, according to various sources who have had the misfortune of hearing him talk over the years.

Which raises the question:  If Sterling does not think himself racially prejudiced, why do racially prejudicial sentiments emanate from his mouth so consistently?  Isn’t there an old saying, “If it looks like a racist and quacks like a racist…”?

As such, I think President Obama’s aforementioned assessment is instructive, and it directs us to the fairly reliable truism that, given enough time, a stupid person is just as likely to reveal his stupidity on his own, unprompted, as he is under some sort of interrogation.  People’s most repulsive opinions can only be suppressed for so long before they come blundering out, in all their wretched glory.

Confirmation of this (in case anyone needed it) came at roughly the same time in neighboring Nevada in the form of one Cliven Bundy, the Tea Party flavor of the week, whose widely circulated insistence on grazing cattle for free (and illegally) on federally owned land somehow gave way to his musing about whether black people’s socioeconomic problems occur “because they never learned how to pick cotton.”

“I’ve often wondered,” added Bundy, “are [black people] better off as slaves, picking cotton and having a family life and doing things, or are they better off under government subsidy?”  When asked to clarify, Bundy only proved capable of digging himself into an even deeper hole.

Of course, the media and the rest of us shine a light on cretins like Bundy and Sterling primarily for comic relief.  We might all say and do things of which we are not especially proud, but once you hear somebody start a sentence with, “I want to tell you one more thing I know about the Negro”—well, you begin to feel quite a bit better about yourself.

Yet there is at least one way in which we should take this ridiculousness seriously, and that is to ask:  So long as ignorance of this sort continues to exist in our culture, would we prefer that it be kept hidden by those most acutely afflicted by it, or should we be grateful for the way it tends to spill out into the open for all to see and hear?

I must say I am rather glad that so many of our most despicable citizens seem so incapable of concealing what they really think about various minority groups.  I think it is just as well.

I would prefer that we be made aware exactly who the idiots among us are, so that we then know in which direction our public shaming campaigns should be directed.

As well, having all the cards on the table should reassure the 90-plus percent of us who feel patronized whenever a “national conversation about race” is proposed that the idea is not completely without merit or necessity.

However regrettably, you see, some problems cannot be worked around.  They can only be worked through.

Whose Revolution Is It, Anyway?

When will we know for sure that “Obamacare” is a success?

When Republicans stop calling it “Obamacare.”

I’ve been carting around that joke for a while now.  I’m pretty sure I didn’t come up with it, although I certainly wish I had.

It’s the perfect little joke, because it’s founded on a basic truth about human nature, and about the nature of partisan politics in particular.  No one needs to explain the joke, because everyone understands the dynamics of taking credit and assigning blame.  I refer you to the old proverb, “Success has many fathers, but failure is an orphan.”

Indeed, if we are to devise a general rule of thumb from both the joke and the proverb, it is that the success of a given policy or social movement is directly related to the number of people claiming credit for it.

That is why I am so delighted by the petty squabbling that has broken out in the last week over the legacy of gay marriage.

Here’s what happened.  This past Tuesday saw the release of a new book called Forcing the Spring: Inside the Fight for Marriage Equality.  Written by Jo Becker, a Pulitzer Prize-winning New York Times reporter, the tome purports to be the inside story of the gay marriage movement from 2008 to the present—in particular, the effort to overturn Proposition 8 in California and, in so doing, attempt to bring gay marriage to all 50 states.

The book is told through first-hand accounts of several key players, including Chad Griffin, now president of the Human Rights Campaign; screenwriter Dustin Lance Black; and the famed legal team of Ted Olson and David Boies.

Fair enough, except that according to a veritable avalanche of critics, Forcing the Spring presents these characters and their legal adventures not as simply the most recent (and most fruitful) phase of the struggle for gay marriage rights in America, but as the whole damn story.

(I have not read the book, apart from a few excerpts.)

According to Becker, Griffin et al. were bold revolutionaries—Griffin himself is compared to Rosa Parks on the very first page—who rebelled against a do-nothing gay “establishment” that had effectively driven the cause into a ditch.

As the book would have it (according to these naysayers), nothing that occurred in the struggle for same-sex marriage really, truly mattered until the moment in 2008 when Griffin and like-minded allies made wholesale changes in strategy—legally and rhetorically—that would lead directly to the string of successes the country has experienced ever since.

Long story short (too late?), the charge against Forcing the Spring is that Becker allows her sources to claim nearly all the credit for the fact that gay marriage is now legal in 17 states and is endorsed by a clear majority of the American public, at the expense of countless others who deserve equal, if not greater, credit for carrying the fight as far as they did.

Among these unacknowledged factors are a legal showdown in Hawaii in the 1990s that set the template for all that would follow; people like Andrew Sullivan and Evan Wolfson, who articulated the now-mainstream arguments for gay marriage decades before they were taken seriously; and the mere fact that, before public support for gay marriage rose from 40 percent to 54 percent between 2008 and 2013, it rose from 27 percent to 46 percent between 1996 and 2007.  (Yes, apparently support dropped six points between 2007 and 2008.)

As the book’s dissenters make plain, to say the anti-Prop 8 crowd is singularly responsible for effecting same-sex marriage, as the book implies, is analogous to crediting the Civil Rights Act of 1964 entirely to Lyndon Johnson, while failing to even mention figures like Martin Luther King, A. Philip Randolph or, indeed, Rosa Parks.

For the most part, this contest over the history of the gay rights movement can be categorized as a family quarrel.  All sides wish to achieve the same ends; they disagree, if at all, only about the means.

What is most encouraging is that this argument is happening at all, because it means that the history of bringing same-sex marriage to America is one that its participants can be proud of.  Those on the struggle’s front lines are falling all over each other to claim responsibility because their efforts have proved successful after a long period of failure, and they feel they deserve their due.  I am positively thrilled that we, as a society, have come this far.

Of course, we still await the moment when we’ll know for sure, and beyond all doubt, that gay marriage is here to stay:  That is, when members of the GOP begin to claim that it was their idea all along.

Still Whistling ‘Dixie’

As the United States approaches the 150th anniversary of the end of the Civil War, it has become increasingly common for relics from the old Confederacy to recede from public view.

While there are undoubtedly certain corners of America in which warm feelings toward the slave-owning Deep South still burn, as a general rule, a given locale or organization today has precious little to lose—and often much to gain—from abandoning whatever residual Confederate loyalties it might yet possess.  Particularly when it is under public pressure to do so.

But what happens when the entity in question is so deeply and inextricably tethered to a component of the Confederacy itself that to renounce such ties would be to hollow out its own soul?

It looks like we’re about to find out.

Down in the sleepy Virginia town of Lexington, there lies a small liberal arts college called Washington and Lee.  Founded in 1749, the school assumed George Washington’s name in 1796, following a hefty donation from the man himself.  When the Civil War ended in 1865, the school recruited Robert E. Lee, the former general of the Confederate Army, to be its president.  Lee accepted, and held the post until his death in 1870.

So mighty was Lee’s impact in transforming Washington College into a serious and respected institution of higher learning that the place was swiftly rechristened Washington and Lee University in his honor.

To this day, W&L defines itself by the general’s personal code of conduct from his days as chief administrator.  “We have but one rule here,” said Lee, “and it is that every student must be a gentleman.”  (The school has been co-ed since 1985; today, the women outnumber the men.)

From this credo, W&L maintains an honor system that most American students would find both odd and terrifying, and the result is a university that ranks in the top tier of just about every “best colleges” list and, according to at least one survey, boasts the strongest alumni network in all the United States.

(Full disclosure:  My younger brother is one such alumnus, and, in point of fact, has become as much of a gentleman as anyone I know.)

Against the clear benefits of a university adhering to the values of this particular man, there is at least one equally obvious drawback:  the fact that this same Robert E. Lee spent four years fighting in the defense of slavery in the United States.

Whatever his personal views might have been about America’s peculiar institution—they were complicated, to say the least—Lee functioned as the seceded states’ rebel-in-chief during the climactic months of the war, thereby endorsing the proposition that the holding of human beings as property was a principle worth fighting, dying and killing for.

If a university is prepared to assume the totality of a man’s strengths as part of its core identity, must it not also be prepared to answer for that man’s most unattractive faults—not least when they involve the trafficking and torture of people he would otherwise wish to be educated?  Can this wrinkle in Lee’s makeup really be so easily glossed over?

Such an intellectual reckoning is, in so many words, the primary intent of an intriguing new list of demands, submitted last week to the board of trustees, from a group of seven W&L law students calling themselves “The Committee.”

To be precise, these stipulations are for the school to remove the Confederate battle flags that adorn the inside of Lee Chapel, where the late general is buried; to prohibit pro-Confederacy groups from demonstrating on school grounds; to suspend classes on Martin Luther King Day; and, perhaps most dramatically, to “issue an official apology for the University’s participation in chattel slavery and a denunciation of Robert E. Lee’s participation in slavery.”

Doth the Committee protest too much?  Does W&L have a moral obligation to the whole story of Robert E. Lee, and not just the bits that serve its interests?

It is critical to note that, in its official policies and practices, the school today cannot credibly be accused of harboring neo-Confederate or anti-black biases.  (In its letter, the Committee refers to “racial discrimination found on our campus,” but does not cite specific examples.)

The town of Lexington, which has historical ties to Stonewall Jackson as well as Lee, naturally contains many citizens who hold such repugnant views, and who sometimes express them through marches or other forms of public demonstration.  However, this is not, as it were, Washington and Lee’s problem.

It is precisely because W&L makes no formal overtures toward the pre-war South’s view of civilization that it could seemingly afford to differentiate its latter-day founding father’s virtues from his vices.  The university’s president, Kenneth P. Ruscio, suggested as much in a magazine article in 2012, writing, “Blindly, superficially and reflexively rushing to [Lee’s] defense is no less an affront to history than blindly, superficially and reflexively attacking him.”

So why not put real muscle behind this plea for historical nuance by acceding to the Committee’s fourth and final demand (if not the first three)?  What does W&L stand to lose by looking reality in the eye and acknowledging a few unpleasant facts?

Wouldn’t that be the gentlemanly thing to do?

Back to Normal

When I was in college, Marathon Monday simply meant getting drunk and having a great time.

The Boston Marathon begins in Hopkinton at 10 a.m.  For us in our dorms near Kenmore Square, that meant waking up at 9, breaking out a 30-pack of Bud Light a few minutes later, and eventually hobbling over to Beacon Street to see the race’s leading men, women and wheelchair racers pass by, followed soon thereafter by 20,000 or so runners-up.

It’s a perfectly sensible tradition in our fair city.  Drinking beer with friends is a joyous experience, and cheering on thousands of ironclad runners is a joyous experience as well.  To do both things simultaneously—well, the word “orgasmic” would not be too far off.

The theory is that the third Monday in April—Patriots’ Day, as it is officially known—is the one day when Boston cops don’t bother citing people for public intoxication.  (Note:  This is not necessarily true.)  For spectators, the Boston Marathon is such a merry, mellow and family-friendly event that any outbreak of inebriation is of a decidedly harmless and good-natured sort.  (Patriots’ Day is also understood as the one time in which you can drink in the morning and not be considered an alcoholic.)

A decade and a half earlier, the relevant marathon liquids were water and orange juice, which my short schoolboy self would help distribute to runners along the course.  This was an especially high honor in 1992, when my father and uncle ran the marathon together.  I remember it well:  I had lined up two plastic cups along the curb—one for Dad, one for Uncle Roy.  As soon as we spotted the duo coming up the hill, I dashed for the first cup, handed it off to Dad with perfect precision, then dashed back for the second.  When I returned to the edge of the street, expecting Roy’s outstretched arm, he and Dad were both long gone.  I guess they had some place to be.

There are plenty more Marathon Monday stories I could recount, and they are all indicative of the Boston Marathon’s core cultural purpose, which is to bring together virtually every resident of the Boston metro area in a display of total, unadulterated gaiety.  If you aren’t an actual participant, you attend the marathon for no reason except that it’s so goddamned enjoyable.  For all the boozing and tomfoolery, it’s just about the most innocent mass gathering in all of the United States.

And now, of course, it’s not.

As the city executes its final preparations for Monday’s race—the first since the moment when last year’s went horribly, horribly wrong—we are forced once again to deal with this concept known as the “new normal.”  Like boarding an airplane or sending an e-mail, the act of watching (let alone running) the Boston Marathon is no longer as innocuous or carefree as we long assumed it to be.

As a consequence of last year’s madness, this year’s festivities will be subject to extraordinary security provisions, including new restrictions on bags and other personal items, random searches by police, and prohibitions on strollers, large bottles and costumes.

Some of these regulations are perfectly reasonable; others seem needlessly excessive.  In any case, they illustrate how the Boston Marathon has joined the ever-growing list of public spaces subject to uncommonly intense scrutiny by the authorities, in the interest of keeping the peace and ensuring that nothing goes awry—a task that is ultimately impossible, since any marathon is an inherently open event.

In essence, the “new normal” is about the tension between security and freedom, with the implication that the former has taken precedence over the latter.

On better days, I take the view that safety in America has always been something of an illusion, that one assumes a million and one risks the moment one steps out the front door, and that the notion of national “innocence” is an absurdity that never existed and never will.

And yet when it comes to the Boston Marathon, I prefer the old normal.  I wish we could have it back.  I wish we didn’t have to think about the possibility of terror and violence at such an otherwise happy occasion, and I think it’s an obscenity that it took just two people with one bad idea to force us to think otherwise.

But our hand has indeed been forced, and there is no turning back.  As such, we are left with the second-best course of action, and that is to descend upon Monday’s race in record numbers and have the time of our lives.  Just like we always used to.

Traditions Passed Over

The Passover Seder is among the most sacred, enduring and universal of all Jewish traditions.

It is worth noting, then, that no two Seders are ever exactly alike.

When I was younger, for instance, my family’s service would be led by my grandfather, who had us read very solemnly from an ancient edition of the Passover Haggadah, replete with arcane, sexist language that we kids could not begin to understand.  Our recitation of the Exodus story and its implications left no detail unuttered.  Including the meal, a Seder begun at 6 o’clock could be expected to carry on until well past 9.

In more recent years, my folks, my brother and I have sometimes joined close friends of ours in their more modern, “family-friendly” event, featuring a homemade, illustrated version of the Haggadah that abbreviates and clarifies the text, eliminating the dull, sluggish bits while emphasizing the songs and encouraging audience participation—not least in flinging plastic frogs at each other while recounting the Ten Plagues.

This year, our clan was graciously included in a large-ish gathering that took the “do-it-yourself” approach several steps further.  The “Four Questions” were asked not only in English and Hebrew, but also in Spanish, Polish, Latin and sign language (even though no one present was deaf or foreign-born).  The singing of “Chad Gadya” became a competition as to who could complete the most verses in a single breath.  (The eventual winner nearly fainted in the process.)  The hidden afikoman, or middle matzo, was found not by the children, but by one of the host’s teddy bears.

This is a mere sampling of the Seders I have personally experienced here in my own tiny corner of Judaism.  How the world’s remaining 14 million or so Jews conduct their annual Passover observances, I can only guess, but I suspect that they, too, are all over the map.

Admittedly—crucially, in fact—all of the disparate spins on Passover described above adhered to the same general rubric, and all contained the same essential elements:  the Exodus narrative, the Seder plate, the cup for Elijah, and so forth.  You might say the differences from year to year were ones of style more than substance.  And as many would argue, you can alter a holiday’s details without destroying its essence.

Except that for many Jews, the details are the essence, whether during festivals like Passover or a typical Shabbat service.  In the minds of folks like my late grandfather, one must never stray from the original script; it would be an insult to our ancestors if we did.

This mindset looks upon alternative approaches to Judaism with a mixture of sadness and contempt, viewing them as acts of cultural and religious effacement.  Owing to the Jewish people’s history of being nearly exterminated over and over again, historical continuity is essential—a means of bridging one generation to the next.

And yet I, for one, have drawn far more meaning from our recent “revisionist” Seders than from the old-school, rabbinically sanctioned ones of my upbringing.  They are more enjoyable, yes, but also more adept at communicating Passover’s actual significance, thereby imparting to us why we bother to observe it in the first place.

Is tradition-for-tradition’s-sake really more important than ensuring that the basis of the tradition is widely understood?  Don’t let anyone tell you this is an easy question to answer.  It most assuredly is not.

The tension between old customs and new sensibilities is real, and it assumes many forms.  Further, we can probably abandon any hope that such a clash will ever completely go away.

To wit:  We young people can pooh-pooh the “we’ve always done it this way” argument all we want—as a supporter of gay marriage rights, I do this quite often—but what happens when we’re faced with people for whom the very fact of an act’s infinite and unchanging repetition is what gives the act its meaning?

What happens when it’s our own sacred traditions that fall under scrutiny?  Will we be as open to change as we demand others to be?  What makes us so special?

The Seder I attended this week was as memorable and entertaining as any I can recall, organized and led by people who take their faith seriously but also aren’t afraid to defy certain conventions for the sake of setting a lively table.

Yet as I shot a plastic green frog into my brother’s wine glass, I could faintly hear my grandfather’s harrumph of disapproval in a back corner of my mind, and I had to concede that his view of what constitutes a proper Seder is as valid as anyone else’s.

What is more, he could rest assured that, even at this table, at least it was Manischewitz in the glass.

Muhammad vs. Ali

Meet Ayaan Hirsi Ali. Born in Somalia in 1969, she was subjected to genital mutilation at age 5—among other physical abuse—and, later on, forced into an arranged marriage to a distant cousin. That is, until she escaped to the Netherlands in 1992, where she worked various jobs that eventually led to a seat in the Dutch parliament and a career as a campaigner for women’s rights—particularly within Islam, her religion of birth, which she would ultimately renounce.

In 2004, she penned the screenplay for Submission, a short film criticizing the treatment of women in Muslim culture. The movie’s director, Theo van Gogh, was subject to an especially harsh critique in the form of being murdered in the street by a member of a Dutch terrorist organization called the Hofstad Network. What is more, attached to the knife that killed van Gogh was a note to Hirsi Ali, informing her that she was the next person on Hofstad’s hit list.

As a result, Hirsi Ali briefly went into hiding, before resuming her work as an advocate for the empowerment of women, including by founding the AHA Foundation in 2007, “to help protect and defend the rights of women in the West from oppression justified by religion and culture.”

For these efforts, Hirsi Ali was to be given an honorary degree from Brandeis University next month. However, last week the distinguished Waltham, Mass., institution opted to un-invite her from its commencement exercises, following on-campus protests by students, faculty and others.

What was their grievance? It was that this woman, who had spent the balance of her adolescence being tortured by practices ordained and justified by a particular wing of Islam, has had a few disparaging things to say about Islam.

Case in point: In a now-infamous 2007 interview in Reason Magazine, Hirsi Ali asserted, “I think that we [in the West] are at war with Islam,” that the religion is inherently violent and extreme, and the only way for Islam to “mutate into something peaceful” is for it to be “defeated.” In a separate interview in the same year, she called Islam “a destructive, nihilistic cult of death.”

Naturally, the wide dissemination of these contentions led many to tar Hirsi Ali as hateful, bigoted, Islamophobic and all the rest. Petitions were circulated across the campus, demanding the school rethink its decision to honor Hirsi Ali.

Last Tuesday, it did exactly that. I greatly wish that it hadn’t.

In examining this whole brouhaha, we probably need not expend much time on the question of rights. To wit: No one has the “right” to an honorary degree from Brandeis or any other great American university. An institution of higher learning has the full freedom to make such decisions however it deems fit.

What happened here, however, is that Brandeis specifically chose Hirsi Ali for the privilege of addressing its graduating class, only to then rescind the invitation when it became clear that too many members of the Brandeis community were afraid to hear what she might have had to say.

I say “afraid” because that’s what they were. They couldn’t handle facing an opinion about the world around them with which they do not agree—a sentiment that might force them to turn their brains on and exercise some critical thinking. And a university is no place for that.

A highly pertinent question in this case is whether Brandeis was aware of Hirsi Ali’s more inflammatory statements when it first tapped her as an honoree. In an official statement, the school says it was not—that the administration regards her as “a compelling public figure and advocate for women’s rights,” but that it “cannot overlook certain of her past statements that are inconsistent with Brandeis University’s core values,” adding, “we regret that we were not aware of these statements earlier.”

One would like to take Brandeis’s powers-that-be at their word—namely, to accept that Hirsi Ali’s stridently anti-Islam comments came as a surprise. To swallow this, however, one would need to believe that no one on the degree-granting committee thought to consult Wikipedia, Google or any other source of basic biographical information about a woman upon whom the university evidently saw fit to bestow such a distinction in the first place.

Does this sound plausible to you? If conferring an honorary degree truly is “akin to affirming the body of a recipient’s work,” as the New York Times put it, why did Brandeis perform such apparently shoddy research on this particular would-be recipient?

No, I think it’s perfectly reasonable to conclude that the administration knew it was taking a gamble—an admirable one, in my view—and that, when it realized it had a small mutiny on its hands, it panicked and bowed to the will of the mob.

Is this what constitutes “Brandeis University’s core values” nowadays? I dearly hope not.

Mozilla’s Fired Fox

Let’s do a bit of supposin’, shall we?

Suppose, for instance, that a flourishing technology company hires some guy to be its CEO, and shortly thereafter it is revealed that this man once donated $1,000 to the Ku Klux Klan—a contribution he does not regret.  Following a public outcry from both within and without the company, the CEO finds the pressure too great for him to continue, and he resigns.

Nothing wrong with this, right?  In the world we now inhabit, to express white supremacist views—and financially support white supremacist groups—is perfectly legitimate grounds for the face of a large (or small) corporation to be effectively hounded from his post.

Yes, one has the right to say anything one wants and to spend one’s cash as one sees fit.  However, this does not prevent a company from concluding that such an official holding such views could yield catastrophic economic consequences (read: a massive exodus of customers) and thus, out of prudence, getting rid of this cretin as swiftly as possible.

In short:  Freedom of speech does not guarantee freedom from consequences of that speech.

Now suppose, however, that instead of having given $1,000 to the KKK, our hypothetical CEO had contributed to some anti-abortion group, such as National Right to Life or Pro-Life Action League.  Were this disclosure to lead to the same series of events described above—outrage, cries for dismissal, and actual dismissal—would it not be considered a scandal?

To express any opinion whatever about abortion is all but destined to cause controversy, but since the American public is divided on the question, it would be absurd to contend that any particular opinion is effectively “out of bounds” in the national discourse.

As such, to dismiss or otherwise ostracize the head of a company that has nothing to do with abortion on the basis of his views on abortion smacks just the slightest bit of totalitarianism, does it not?  Do we really want to be a country that simply gets rid of people who say things that might make us uncomfortable?

I float these hypothetical scenarios in response to the non-hypothetical occurrence last week, in which a CEO named Brendan Eich was forced to resign from the Mozilla Corporation because of his $1,000 donation in 2008 in support of California’s Proposition 8.  You know, the one that outlawed gay marriage.

In this kerfuffle’s wake, the central question—duly hashed out across the blogosphere for the past week—is whether Eich’s opposition to same-sex marriage was, all by itself, a valid reason for him to be induced to abandon his position atop Mozilla, which makes the Firefox web browser.

In other words, is the view that gay marriage is a bad idea now among those thoughts that a person can no longer express without fear of losing his or her job?

I must confess that I am conflicted.

On one hand, I put tremendous stock in my position as a First Amendment absolutist.  I would prefer that everyone be able to say exactly what they think, and that everyone else allow them to do so.  As a supporter of gay marriage rights, I am rather horrified by the possibility that anyone with an opposing view would decline to state it for reasons of political correctness or outright fear of persecution.

Frankly, I think some people make too big a deal of what a particular CEO thinks when making decisions as consumers.  As blogger Andrew Sullivan so crisply observed, it is not a little ironic that the very people who have long demanded “tolerance” from their rhetorical foes are, themselves, now acting so very intolerantly toward those who refuse to toe the party line on the matter of gay rights.

There is just one thing preventing me from vocalizing the principle of open discourse at the absolute top of my lungs, and that is my unadulterated delight that this party line on gay rights—and specifically gay marriage—is now the majority view in the United States.

To be clear:  At this point in time, to say that gay people are not entitled to marriage is not as horrific as saying black people are not entitled to any and all rights accorded white people.  However, our culture is plainly, steadily and irreversibly moving in that direction, and I am pleased as punch that this is the case.  So far as I’m concerned, the day that anti-gay words and actions become utterly verboten in polite society is one that cannot possibly come soon enough.

But until that day dawns, let us resist the urge to impose it by force, like some politically correct mob.  It’s a highly unattractive means of getting one’s way, whatever the issue might be, and we Americans are supposed to be better than that.

Suppose, in the future, we make more of an effort to prove it?