Eye of the Beholder

Can a piece of art ever exist entirely on its own, or is it always tethered to the context of its creation?

For instance, is it possible to listen to the Ring Cycle without remembering that Richard Wagner was an anti-Semitic prick whose music inspired the rise of Hitler?

Can one watch Manhattan—the story of a 42-year-old man’s love affair with a 17-year-old girl—and not be distracted and/or repulsed by the personal life of its writer, director and star, Woody Allen?

As a society, we’ve had a version of this argument many times before, trying to figure out how to separate the art from the artist, while also debating whether such a thing is even desirable in the first place.  (The answer to both:  “It depends.”)

Lately, however, this perennial question has assumed a racial dimension, compelling us to re-litigate it anew—this time with considerably higher stakes.

Here’s what happened.  Over at New York’s Whitney Museum of American Art, the curators of the institution’s 78th biennial—an exhibition of hundreds of contemporary works by dozens of artists—chose to include Open Casket, a semi-abstract painting that depicts the mutilated corpse of Emmett Till, the 14-year-old African-American boy who was tortured and lynched in Mississippi in 1955 for allegedly whistling at a white girl.  (The woman in question later admitted she made the whole thing up, but that’s another story.)

As a painting, Open Casket is arresting, with the oils so thickly layered that Till’s mangled face literally protrudes from the canvas, as if calling out to us from beyond the grave.  As a political statement, it fits comfortably into our uncomfortable era of police brutality and racial unease—a natural, even obvious, choice for any socially conscious art show in 2017.

There was just one little problem:  The creator of Open Casket is white.  Specifically, a Midwestern white woman living in Brooklyn named Dana Schutz.

Upon hearing that a Caucasian had dared to tackle Emmett Till as the subject for a painting, many patrons demanded the Whitney remove Open Casket from its walls, while condemning Schutz for attempting to profit off of black pain—a practice, they argued, that has defined—and defiled—white culture since before the founding of the republic, and should be discouraged at all costs.  The message, in effect, was that white people should stick to their own history and allow black people to deal with theirs.

In response to this brouhaha, the Whitney defended its inclusion of Schutz’s work without directly addressing the race question, while Schutz herself issued a statement that read, in part, “I don’t know what it is like to be black in America.  But I do know what it is like to be a mother.  Emmett was Mamie Till’s only son.  I thought about the possibility of painting it only after listening to interviews with her.  In her sorrow and rage she wanted her son’s death not just to be her pain but America’s pain.”

In other words:  Far from being exploitative or opportunistic, Open Casket is meant as an act of compassion and empathy toward black America from an artist who views Emmett Till’s death as a tragedy for all Americans—not just black ones.

Of course, that is merely Dana Schutz’s own interpretation of her work, and if history teaches us anything, it’s that the meaning of a given cultural artifact is never limited to what its creator might have intended at the time.  The artist Hannah Black, one of Schutz’s critics, is quite right in observing, “[I]f black people are telling her that the painting has caused unnecessary hurt, she […] must accept the truth of this.”

The real question, then, is whether offensiveness—inadvertent or not—is enough to justify removing a piece of art from public view, as Black and others have advocated in this case.

If, like me, you believe the First Amendment is more or less absolute—that all forms of honest expression are inherently useful in a free society—then the question answers itself.  Short of inciting a riot (and possibly not even then), no art museum should be compelled to censor itself so as not to hurt the feelings of its most sensitive patrons, however justified those feelings might be.  Au contraire:  If a museum isn’t offending somebody—thereby sparking a fruitful conversation—it probably isn’t worth visiting in the first place.

Unfortunately, in the Age of Trump, the American left has decided the First Amendment is negotiable—that its guarantee of free speech can, and should, be suspended whenever the dignity of a vulnerable group is threatened.  That so-called “hate speech” is so inherently destructive—so wounding, so cruel—that it needn’t be protected by the Constitution at all.  As everyone knows, if there was one thing the Founding Fathers could not abide, it was controversy.

What is most disturbing about this liberal drift toward total political correctness is the creative slippery slope it has unleashed—and the abnegation of all nuance and moral perspective that goes with it—of which the Whitney kerfuffle is but the latest example.

See, it’s one thing if Open Casket had been painted by David Duke—that is, if it had been an openly racist provocation by a callous, genocidal lunatic.  But it wasn’t:  It was painted by a mildly entitled white lady from Brooklyn who has a genuine concern for black suffering and wants more Americans to know what happened to Emmett Till.

And yet, in today’s liberal bubble factory, even that is considered too unseemly for public consumption and must be stamped out with all deliberate speed.  Here in 2017, the line of acceptable artistic practice has been moved so far downfield that an artist can only explore the meaning of life within his or her own racial, ethnic or socioeconomic group, because apparently it’s impossible and counterproductive to creatively empathize with anyone with a different background from yours.

By this standard, Kathryn Bigelow should not have directed The Hurt Locker, since, as a woman, she could not possibly appreciate the experience of being a male combat soldier in Iraq.  Nor, for that matter, should Ang Lee have tackled Brokeback Mountain, because what on Earth does a straight Taiwanese man like him know about surreptitious homosexual relationships in the remote hills of Wyoming?  Likewise, light-skinned David Simon evidently had no business creating Treme or The Wire, while Bob Dylan should’ve steered clear of Hattie Carroll and Rubin Carter as characters in two of his most politically charged songs.

Undoubtedly there are some people who agree with all of the above, and would proscribe any non-minority from using minorities as raw material for his or her creative outlet (and vice versa).

However, if one insists on full-bore racial and ethnic purity when it comes to the arts, one must also reckon with its consequences—namely, the utter negation of most of the greatest art ever created by man (and woman).  As I hope those few recent examples illustrate, this whole theory that only the members of a particular group are qualified to tell the story of that group is a lie.  An attractive, romantic and sensible lie, to be sure—but a lie nonetheless.

The truth—for those with the nerve to face it—is that although America’s many “communities” are ultimately defined by the qualities that separate them from each other—certainly, no one would mistake the black experience for the Jewish experience, or the Chinese experience for the Puerto Rican experience—human nature itself remains remarkably consistent across all known cultural subgroups.  As such, even if an outsider to a particular sect cannot know what it is like to be of that group, the power of empathy is (or can be) strong enough to allow one to know—or at least estimate—how such a thing feels.

As a final example, consider Moonlight—the best movie of 2016, according to me and the Academy (in that order).  A coming-of-age saga told in three parts, Moonlight has been universally lauded as one of the great cinematic depictions of black life in America—and no wonder, since its director, Barry Jenkins, grew up in the same neighborhood as the film’s hero, Chiron, and is, himself, black.

Slightly less commented on—but no less noteworthy—is Moonlight’s masterful meditation on what it’s like to be gay—specifically, to be a gay, male teenager in an environment where heterosexuality and masculinity are one and the same, and where being different—i.e., soft-spoken, sensitive and unsure—can turn you into a marked man overnight, and the only way to save yourself is to pretend—for years on end—to be someone else.

Now, my own gay adolescence was nowhere near as traumatic as Chiron’s—it wasn’t traumatic at all, really—yet I found myself overwhelmed by the horrible verisimilitude of every detail of Chiron’s reckoning with his emerging self.  Here was a portrait of nascent homosexuality that felt more authentic than real life—something that cannot possibly be achieved in film unless the men on both sides of the camera have a deep and intimate understanding of the character they’re developing.

Well, guess what:  They didn’t.  For all the insights Moonlight possesses on this subject, neither Barry Jenkins, the director, nor a single one of the leading actors is gay.  While they may well have drawn from their own brushes with adversity to determine precisely who this young man is—while also receiving a major assist from the film’s (gay) screenwriter, Tarell Alvin McCraney—the finished product is essentially a bold leap of faith as to what the gay experience is actually like.

Jenkins and his actors had no reason—no right, according to some—to pull this off as flawlessly as they did, and yet they did.  How?  Could it be that the condition of being black in this country—of feeling perpetually ill at ease, guarded and slightly out of place in one’s cultural milieu—has a clear, if imprecise, parallel to the condition of being gay, such that a deep appreciation of one gives you a pretty darned good idea of the other?  And, by extension, that to be one form of human being is to be empowered to understand—or attempt to understand—the point of view of another?  And that this just might be a good thing after all?

Whodunit?

There’s an old personality test—introduced to me in middle school and lovingly preserved on the interwebs—involving a woman who gets herself killed journeying between her husband and her lover.  The “test,” as it were, centers on the question of who is most to blame for the woman’s untimely death.  Is it the bored husband who neglected to take his wife along on his business trip?  Is it the greedy boatman who refused to ferry her across the river to safety?  Is it the heartless boyfriend who didn’t lift a finger in her defense?  Or is it the woman herself for being unfaithful and blundering into the wrong place at the wrong time?

It’s a ridiculous conceit, but the idea is that how you assign blame for the woman’s murder is determined by what you value most in life.  The options, in this case, include such things as “fun,” “sex,” “money” and, my personal favorite, “magic.”

Anyway, that story’s been on my mind for the last few days as I’ve seen Donald Trump campaign events descend into violence and mayhem whenever a gaggle of anti-Trump agitators has sneaked its way into the arena.

With regard to these unholy scuffles, everyone seems to have a firm opinion about who is most at fault.  Interestingly, however—and I think you know where I’m going with this—no one can quite agree on who, exactly, that is.

Obviously, then, what we need is to update that silly game about the two-timing wife so that it applies to our own time and our own values.  With Trump—a man who stands as America’s signal Rorschach test of 2016—we can learn a great deal about how each of us thinks just by how we interpret what is happening directly in front of our eyes.

From a sampling of reactions, we find that most people trace the cause of this campaign unrest to either a) the protesters, b) Trump supporters or c) Trump himself.  To an extent, one’s opinion of these incidents is merely a function of one’s politics:  If you find Donald Trump generally detestable, you generally attribute all detestable acts to the man himself.  Conversely, if you think Trump speaks truth to political correctness, you find fault only with those who are preventing him from speaking.  It’s confirmation bias in action:  You see what you want to see and filter out everything else.

But of course, all of that is but the tip of the bloody, bloody iceberg.  However illuminating it might be to debate which side threw the first punch, it’s not until folks start to blame those who weren’t even in the room that the real fun begins.

We might start with the Donald himself, who has fingered Bernie Sanders as the main culprit for the madness, saying that the party crashers at his gatherings are on direct marching orders from the socialist from Vermont.  It is noteworthy that Trump bases this claim on no evidence whatsoever, while he has simultaneously blamed other outbursts on ISIS—yes, that ISIS—due to a YouTube video that was swiftly exposed as a typical Internet hoax.  As Trump explained on Meet the Press, “All I know is what’s on the Internet,” reminding us that he is apparently the one person in America who believes, with all his heart, that if it’s online, it must be true.

Farce that this undeniably is, such behavior nonetheless offers real insights into Trump’s personality and that of his fellow travelers.  Strongest among these, perhaps, is the value of “truthiness,” a.k.a. believing something to be true simply because your gut tells you so.

In fact, Trump’s entire movement is dependent on truthiness, since at least 80 percent of his campaign’s major claims are demonstrably false and his promise of “restoring America’s greatness” is one big fatuous smoke-and-mirrors routine containing nary a whiff of substance or honest reporting.  If all presidential candidates engage in hyperbole, Trump is unique for engaging in absolutely nothing else.

The real problem, though, is how sinister that hyperbole has been for the last nine months and how deeply it has metastasized within the GOP.  While this week’s outright physical violence might be relatively new, the truth is that Trump and his flock have been blaming other people for America’s problems for his entire presidential run.  Like any seasoned demagogue, Trump has invented most of this blame from whole cloth, while at other times he has even managed to invent the problems themselves.  (Who would ever know, for instance, that net immigration from Mexico is actually negative over the last five years, or that U.S. military spending increased from 2014 to 2015?)

Which leads us, as it must, to the most disturbing personality quirk of all:  The one that blames all of this turmoil on African-Americans and views the entire American experience in terms of white supremacy.

While it would be irresponsible to peg every Trump voter as a white supremacist—or, specifically, a Nazi or a Klansman—the point is that Trump rallies have become a safe space—if not a veritable breeding ground—for white people who think that punching, kicking and spitting on black people is their God-given right as members of a privileged race.  For all Trump’s claims that the protesters are the true instigators of these melees, most video clips suggest otherwise:  Largely, we just keep seeing groups of young, mostly black people nonviolently holding up signs and chanting cheeky slogans while white guards and white attendees proceed to manhandle them with the greatest possible force—egged on, every single time, by the candidate himself.

You see pictures like these—paired with people like Mike Huckabee calling the protesters “thugs,” a word that Republicans only ever use to describe African-Americans—and you realize all that’s missing are the dogs and the fire hoses.

Among the many sick ironies of Donald Trump is his supposed fidelity to the First Amendment, which he claims the dissenters at his rallies are attempting to suppress (as if Trump has ever lacked an outlet for expressing himself on a moment’s notice).  Historical ignoramus that he is, he doesn’t seem to realize that, when it comes to muzzling free speech, few things are more effective than riling up a large gang of angry white people by telling them how to mistreat a small gang of dark-skinned antagonists.  (And then, of course, pleading ignorance when those same white people do exactly what you suggest.)

Even if there were nothing at all race-based in Trump and company’s behavior, we would still be left with this profoundly dangerous idea that all problems can, and should, be solved with physical violence.  To hear Trump talk, you’d think his were the first-ever campaign events to feature any sort of disruptors and that there is no rational response except to treat them like enemy combatants.  (How long before Trump recommends waterboarding?)

The relevant terms here are “escalate” and “de-escalate.”  As any honest police officer knows, whenever you are faced with a potentially explosive situation, it is your moral responsibility to try to de-escalate tensions and not make matters worse.  Indeed, for anyone who wields authority or influence over others—not least in politics—the obligation to lead by example and get your minions under control is absolute and non-negotiable.

Donald Trump has failed that charge over and over again.  In so doing, he has revealed which values he holds dear and which values he does not—if, that is, he can be said to possess any values at all.

It proved quite prescient that Trump opened his campaign while riding an escalator in Trump Tower in Manhattan:  As it turns out, he is an escalator.

Best of Enemies

It’s almost too obvious to mention, but when it comes to religious liberty in America, we are in the midst of a veritable golden age.

The First Amendment to our Constitution begins, “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof,” and damned if we haven’t nailed it in the last many years.  The right to live according to the dictates of one’s faith has never been stronger, and there is little indication that this will change in our lifetimes.  As ever, we don’t realize how lucky we are.

Whether you are a Christian, a Sikh or a Seventh Day Adventist, you can travel to your place of worship on Sunday (or whenever) totally unmolested by your government or, with rare exceptions, your fellow citizens.  Observant Jews can wear kipot and refrain from eating pork, while Muslims can pray five times a day and…refrain from eating pork.

While being a member of the “wrong” religion can get you shunned, maimed or murdered in many other countries of the world, America is truly a land of pluralism—a nation that, at least on paper, protects its most vulnerable citizens just as robustly as its most populous.

Indeed, the inclination toward granting each other religious freedom is so forceful—such a prevailing view—that we are now having a semi-serious debate about whether the right to one’s faith-based opinions actually entitles an individual to break the law and deny the civil rights of other individuals.  Yes, even if that particular individual happens to work for the government.

Of course, I am referring to the one-woman crusade currently being waged by a Kentucky county clerk named Kim Davis.  As an observant Christian, Davis has refused to issue marriage licenses to same-sex couples, because doing so would violate her religious beliefs.  This in spite of the fact that, since June 26, gay marriage has been the law of the land in all 50 states.

In effect, the issue is whether the First Amendment’s “free exercise” clause can ever supersede the rule of law.  In other words, can the word of God take legal precedence over the word of Congress or the Supreme Court?

As we have seen, this question has precisely one correct answer.  By refusing to issue marriage licenses to couples who have every right to obtain one—even after the nation’s highest court explicitly ordered her to do so—Davis has been held in contempt and carted off to jail.  While, as an elected official, she cannot technically be “fired,” it doesn’t look terribly likely that she will remain in this job much longer.  And rightly so:  Why should Kentucky taxpayers be compelled to pay a clerk for not doing her job?

Much has been made of the disclosure that Davis herself has been married four times and divorced thrice.  Personally, I’m still reeling from the fact that, five months after divorcing Husband No. 1, she gave birth to twins who were adopted by Husband No. 2 but were, in fact, fathered by Husband No. 3.  (Feel free to read that sentence again.)

Of course, all of that is perfectly legal and we should never judge or make assumptions about anyone’s marital history.  Relationships are complicated, and marriage is messy even under the most ideal circumstances.

On the other hand, marital infidelity is clearly and definitively condemned in the Bible and, in Deuteronomy, is punishable by death.

Kim Davis has said she performs her official duties in accordance with the Biblical definition of marriage.  Which raises the question:  If she really means that, then why hasn’t she hired someone to kill her?

Happily for everyone, she plainly doesn’t mean it.  She is against homosexuality for reasons all her own and, like every Christian, she handpicks the Biblical passages that align with her views and ignores the ones that don’t.

This is not to suggest that her beliefs are not sincerely held.  It just means they are not held for the reasons she claims and that she is a massive glittering hypocrite when it comes to enforcing holy writ.

Of course, as an American, she is fully entitled to be the horrible person that she is and to believe whatever the hell she wants.  That’s the very definition of religious liberty and no one would dare force her to think differently.  If we all agreed about everything, we wouldn’t need a First Amendment in the first place.

We are nonetheless a society in which laws reign supreme over religion, precisely because we have so many different religions that can each be interpreted in a billion different ways.  While it might be amusing to imagine a culture in which everyone can ignore any rule they disagree with, the idea of actually doing it doesn’t even pass the laugh test.

Put simply:  To say the First Amendment includes the right to deny someone else a marriage license makes no more sense than saying the Second Amendment includes the right to commit murder.

Certainly, there are countries in which “the authority of God” (as Davis called it) has final say over who gets to live or die, let alone who can get married or not.  Of course, these countries tend to be predominantly Muslim and their system, known as “sharia,” is universally condemned—particularly by American conservatives—as medieval and antithetical to everything that Americans hold sacred.

How curious, then, that many of these same conservatives (read: half the GOP presidential candidates) are now defending this very same principle when the God in question is a Christian one.  How peculiar that defying settled law through Islam is repulsive, but doing the same through Christianity is just fine.  I’m sure there’s a non-racist, non-homophobic explanation for this somewhere.  As an atheist, I regret I’m not the best person to find it.

In any case, I didn’t come here to talk about Kim Davis, as such.  Really, I would just like to take a moment to underline how unbelievably lucky the gay community has been lately with respect to its would-be antagonists.

It would have been one thing if the self-appointed poster child for upholding “traditional marriage” were someone who actually engaged in the practice herself.  Someone who could credibly claim to be holier than thou.

That this particular mascot for following “God’s will” happens to be a raging phony is not merely hilarious; it also demonstrates just how phony her entire argument is.

To be clear:  Davis’ personal morality has absolutely no bearing on the legal arguments vis-à-vis her behavior as the Rowan County clerk.  Her actions would be contemptuous and absurd regardless of how many husbands she has had.

That, in so many words, is the point:  The law does not care about morality.  The law exists whether you agree with it or not, and applies to all citizens equally.  Further, if you happen to be a public official whose one and only job is to carry out the law, then your opinion of the law does not matter.  Either you do your job or you resign.

But of course, this doesn’t negate the role that ethics play in our day-to-day lives, and this is where Davis has become the gay rights movement’s new best friend.

Now that same-sex marriage is legal in all 50 states—and will almost certainly remain that way forever—there is nothing left to concern ourselves with except for the proverbial “changing hearts and minds.”

And where persuading people of gays’ inherent humanity is concerned, what finer image could there be than a thrice-divorced heterosexual turning her back on a homosexual couple attempting to get married just once?  In what possible universe does the person who has cheated her way through three marriages assume the moral high ground over couples who are embracing this sacred institution afresh?  What possible threat do those couples pose to society or morality, other than the possibility that, in time, they may turn into people like Kim Davis?

The G Word

Today in Germany, it’s against the law to deny the existence of the Holocaust.

Today in Turkey, it’s against the law to affirm the existence of the Holocaust.

We’re talking here about two different Holocausts, but the point is the same:  Some countries have the courage to fess up to past atrocities, while others are abject cowards.

For us Americans, the responsibility to acknowledge other countries’ grievous sins would seemingly be straightforward.  And yet, in practice, it has become so fraught and complicated that you’d think we’d committed the crimes ourselves.

I’m speaking, of course, of the annual disgrace that is the American president’s failure to call the Armenian genocide by its rightful name.

Beginning on April 24, 1915—exactly a century ago—the Ottoman Empire in present-day Turkey began a process of premeditated, systematic murder against Christian Armenians living within its borders.  Generally, this was done either through outright slaughter or through prolonged “death marches,” whereby victims would ultimately starve.

At the start of World War I, Armenians numbered roughly two million within the empire itself.  By 1922, about 400,000 were left.

While there remains a debate about the exact numbers, a broad historical consensus has emerged that what happened to Armenians under the Ottoman Turks was, in fact, genocide.  That is, it was a deliberate attempt to annihilate an entire people on the basis of their ethnicity.

(An interesting linguistic footnote:  The word “genocide” did not exist until 1943.  In 1915, U.S. Ambassador Henry Morgenthau referred to the Ottomans’ treatment of Armenians as “race extermination”—a term that, as Christopher Hitchens observed, is “more electrifying” than the one we now use.)

A century on, the legacy of the Armenian Holocaust is as contentious as ever.  However, the basic facts are only “controversial” in the sense that the basic facts about climate change are “controversial.”  Politicians continue to argue, but among the folks who actually know what they’re talking about—in this case, historians—the science is resoundingly settled.

Which brings us to the unnervingly Orwellian chapter of this story:  The careful refusal by every American president to utter the word “genocide” whenever the subject comes up.

It’s weird and frightening that this is the case, and in more ways than one—even when just considering the present occupant of the Oval Office.

You see, it’s not as if Barack Obama avoids the issue altogether.  Thanks to the efforts of the Armenian community in America and elsewhere, he doesn’t have a choice.

During this centennial week, Obama aides have met with several Armenian-American groups, and Treasury Secretary Jacob Lew is in Armenia’s capital to mark the anniversary.  National Security Advisor Susan Rice, meeting with Turkish officials, called for “an open and frank dialogue in Turkey about the atrocities of 1915.”

Nor—while we’re at it—does Obama himself deny the truth that is staring him directly in the face.  In January 2008, as a presidential candidate, he said, “The Armenian genocide is not an allegation, a personal opinion or a point of view, but rather a widely documented fact.”

And yet, in the six-plus years of the Obama administration, the word “genocide” has never passed the lips of any American official.

The explanation for this is depressingly straightforward:  Turkey, a strategic U.S. ally, denies that such a genocide ever took place, and the U.S. is terrified that if we declare otherwise, our relationship with Turkey will suffer irreparable harm.

That’s right:  Our government, in our name, is publicly maintaining a major historical lie in order to placate a foreign country that murdered a million and a half of its own citizens and, a hundred years later, still pretends that it didn’t.

By comparison, just imagine a world in which it was official U.S. policy not to formally recognize an organized plot by Hitler’s Germany to eradicate the Jewish population of Eastern Europe.  (To say nothing of the continent’s gays, Gypsies, Poles and others.)  Imagine if Germany today claimed that the six million Jewish casualties were essentially a fog-of-war coincidence.  Imagine if Angela Merkel arrested and jailed anyone who implied otherwise and the U.S. did nothing meaningful to stop her.

We don’t need to imagine it.  Replace “Germany” with “Turkey” and “Jews” with “Armenians,” and you’re left, more or less, with the world we have.

The Turkish government acknowledges that a great many Armenians were killed in the First World War, but denies that it was the Ottomans’ fault.  Further, thanks to Article 301 of the Turkish Penal Code, anyone who argues to the contrary can be imprisoned for the crime of “denigrating the Turkish Nation.”  By not going all the way in our condemnation, we Americans—the people who are supposed to be leading the world in justice and freedom—allow the practice to continue.

It’s a moral disgrace by all involved—an insult to Armenians, to history and to truth itself.  And everybody knows it.

That’s the creepiest part:  It’s not just that so many officials are saying something untrue.  They’re saying something untrue that everybody knows is untrue.

It’s the very essence of totalitarianism:  Create your own reality and exert no effort in making anyone believe it.

In actual dictatorships, this strategy works because the leaders wield absolute control over their citizens.  (To wit:  If you’re being starved, tortured, raped, etc., the fact that your government is also duplicitous is not a particularly high concern.)

On the other hand, such transparent dishonesty never works in democracies like ours, because our system is designed to make it impossible.  So long as we retain the freedom of expression, the separation of powers and a reasonably competent press corps, the truth will (eventually) rise to the surface.

So the president will eventually come around on this issue, and the Republic of Turkey will just have to deal with it.

Until that happens, however, Obama’s ongoing squeamishness will continue to validate the pessimism of many voters that the promise of “change” in Washington is an illusion.  That campaign pledges, however sincere at the time, will always ultimately be overruled by entrenched interests at home and abroad.  That insurgents who vow to “shake things up” are no match for the status quo.

To be sure, there’s no point in being naïve about these things.  If you’re the leader of the free world, you can’t just go insulting other countries willy-nilly and expect nothing bad to happen in return.  You have to accept the world as it is, politics is the art of the possible, blah blah blah.

But does the bar for political pragmatism really have to be set this low?  By acceding to other nations’ fantasies about the facts of history, aren’t we diminishing not just history but ourselves?  Are we not paying a ransom that any other wrongheaded country could demand as well?

Why would we do this?  Why should the bad guys win?

It’s certainly not inevitable.  Just look at Germany.

A mere seven decades after committing the most horrible crime against humanity in modern times, the Federal Republic of Germany stands not just as a stable, functioning, open society, but as Europe’s premier economic power and—crucially—just about as un-anti-Semitic as it’s possible for such a country to be.

Of course, in a nation so large, pockets of anti-Jewish sentiment still percolate, some of which manifest themselves through violence.  However, the overall prevalence of German anti-Semitism today is no greater than that of most other nations in Western Europe, and is considerably lower than in some (looking at you, France).

More to the point:  Since completely reinventing itself during and after the Cold War, Germany, in its official acts, has never stopped apologizing for its wretched past, even going so far (as I noted earlier) as to punish anyone who “approves of, denies or belittles an act committed under the rule of National Socialism,” along with anyone who “assaults the human dignity of others by insulting, maliciously maligning, or defaming segments of the population.”  This might explain why the country’s Jewish population doubled in the first five years after reunification, and then doubled again over the next decade and a half.

In America, of course, those sorts of laws would be completely unconstitutional, as the First Amendment guarantees the right to insult whoever you want.  However, as both a Jew and a defender of human dignity, I appreciate the sentiment.  Better to outlaw lies than truth.

This is all to say that Turkey will ultimately come to terms with the darkest period in its history, with all the reconciliation that entails.  We can’t be sure how long it will take for such a proud nation to own up to its past cruelties.  But there is one thing of which we can be sure:  It will have no reason to take that leap until it stops being enabled into complacency by superpowers like us.

The End of Comedy

Should today’s comedians tailor their material for people with no sense of humor?

Obviously the answer is no.  But you’d never know it from the past few weeks, in which far too many humorless rubes have had far too much say—and sway—over what cheeky, intelligent comics are allowed to say.

Increasingly, we are becoming a society in which every public statement—be it serious or in jest—must be understood by the dumbest, most literal-minded person in the room, and in which irony and sophistication are punished and looked upon with scorn.

It’s a form of cultural suicide.  Shame on us for doing so little to stop it.

We could look just about anywhere for examples, but at this moment, we might as well begin with Trevor Noah.

A stand-up comedian by trade, Noah was unknown to most Americans until the fateful moment two weeks ago when he was given the job of a lifetime:  Successor to Jon Stewart as host of The Daily Show on Comedy Central.

Naturally, this announcement led Daily Show viewers to plumb the Internet for clues about who the heck Trevor Noah is.  As it turns out, he is an uncommonly deft and sneakily subversive 31-year-old from South Africa who found great success in his country of birth—in radio, television and onstage—before wafting over to the United States in 2011.

He is also an extremely active presence on Twitter.  Since joining in 2009, he has issued nearly 9,000 tweets in all.  (That’s roughly four per day, in case you didn’t want to do the math.)

Like the rest of us, Noah tweets pretty much every half-clever thought that pops into his head, and because he tells jokes for a living, the entirety of his Twitter output covers an awful lot of ground.

By itself, this fact is not especially interesting—and certainly not “newsworthy”—but then the world made a horrifying discovery from which it has not yet recovered:  Some of those 9,000 tweets were politically incorrect.

The horror.

I confess that I have not personally read all six years’ worth of brain droppings from an entertainer who’s been culturally relevant for 15 days.  However, many people apparently have, because within hours of Noah’s hire, they produced the aforementioned damning tweets, about which two facts stand out:  First, none of them is less than three years old.  And second, you can count them on the fingers of one hand.

What is their content, you ask?  Which 140-character quips are so horrible—so appallingly beyond the pale—that their existence is germane to us several years after the fact, and possibly constitutes grounds for dismissing the man who quipped them?

They were, in no particular order:  A putdown of Nazi Germany.  A mild critique of Israel.  An observation about the scarcity of white women with curves.  And a musing about the value of alcohol for women with a few too many curves.

And.  That’s.  About.  It.

At this juncture, we could go into further depth, if we were so inclined.  We could follow the lead of Noah’s critics, attempting to connect a handful of disparate tweets to the inner workings of Noah’s soul.

Or we could choose option B:  Grow up, get a life and stop throwing a tantrum every time someone says something that makes us uncomfortable.

I’ll keep it simple:  If a biracial comedian’s cracks about white women are too much for you to handle, then you have no business watching Comedy Central.  If you cannot stomach the notion of an émigré from South Africa having a critical view of Israel—a country that tacitly supported South Africa’s apartheid government until the bitter end—then you’d better steer clear of any newspaper or magazine that crosses your desk, because it just might give you a heart attack.

Sorry to break the news, but one of the consequences of living in a country with freedom of speech is that people will occasionally speak freely, and you might not agree with all of them.

Or, in this case, even understand what they’re saying.

My fear, you see, is not just that free expression itself is under attack, but that a great deal of this offense-taking is based on misapprehensions.  That smart people cannot say anything in public without worrying how their words might be interpreted by idiots.

Case in point:  Note the stupidity surrounding Bill Maher’s recent throwaway gag about how Zayn Malik, the now-ex-member of One Direction, bears a passing resemblance to Boston Marathon bomber Dzhokhar Tsarnaev.

Juvenile, yes.  But the logic could not have been more obvious:  Person A looks like Person B, end of joke.  It’s funny (or not) because one is evil while the other is an innocuous pop star, and that’s what irony is all about.

No one could possibly have understood the joke in any other way.  And so, of course, everyone did.

OK, not everyone.  But there were enough complaints about Maher “comparing” Malik to Tsarnaev—paired with the fact that Malik is Muslim, which no one outside the One Direction fan club would have known—for this to become a news story in many major publications.  For a solid few days, an HBO talk show host was compelled to explain the comedic concept of implying that one famous person looks a little bit like another famous person.

Has America really become that intellectually infantile?  Is this the level to which our public discourse has plunged?  How long will our best and brightest continue to shoulder this burden before everyone else finally wises up?

Certainly, it’s not a new phenomenon that an entire culture can get dragged down by its lowest-hanging fruit—our so-called “bad apples.”  Just look at how a handful of corrupt, racist cops have tarnished the image of their entire profession, even as 90-something percent of their colleagues are doing their jobs exactly as they should.

But it’s even trickier when it comes to the militant enforcement of political correctness, because unlike killing unarmed black people, being offended by a joke as a result of your own ignorance is not against the law.  As my eighth grade history teacher said, “In this country, you’re allowed to be stupid.”

And it’s not just about jokes.  The tendency to lazily misinterpret a sophisticated public statement has consequences for our political leaders, too.  And, indeed, for the very language we speak.

I am reminded, for instance, of candidate Mitt Romney touting his family’s support for civil rights by saying, “I saw my father march with Martin Luther King.”  George Romney was, indeed, a strong ally of the Civil Rights Movement, consistently supporting Dr. King’s efforts and even leading a Michigan march (as the state’s governor) to protest the police brutality in Selma, Alabama in 1965. However, according to newspaper reports, Romney and Dr. King never literally appeared at the same event on the same day.  This led the media to tar Mitt Romney as a liar for implying that they had.

In one sense, the media were right to call Romney out for saying something that was technically untrue.  However, considering the full context of Romney’s statement—namely, the fact that his father was a champion of black civil rights, despite being a white Republican—we can accept the words “march with” as a rhetorical device in service to a broader truth, rather than as a bald-faced fabrication.

Except that we don’t accept such things anymore, because we’re too busy setting mousetraps for our public servants to get caught in.  Thanks to the wonders of the interwebs, we live in an age in which every statement is maniacally fact-checked and a politician can’t get away with anything.

For the most part, this is a good thing, because it means that true deceptions get exposed within minutes of being uttered and our leaders are kept relatively honest.

However, this instinct toward righteous, ruthless truth-seeking can be taken too far, leading us to take down politicians for transcendently silly reasons, and possibly dissuading future leaders from ever entering the arena.

So long as our public figures have reason to worry that everything they say will be taken literally—including words and phrases that are self-evidently figurative—they will have no choice but to dumb down their oratory and rhetoric until all the poetic flourishes are gone—and, with them, any hint of inspiration or linguistic flair.

That’s how our future is looking, so you’d better prepare yourself.  At long last, we are fulfilling the prophecy of Vanity Fair editor Graydon Carter, who remarked one week after 9/11, “It’s the end of the age of irony.”

It took 13 years, but we’ve finally achieved a culture in which no one is allowed to be funny.

That is, unless one of two things happens:  Either the dolts who can’t take a joke suddenly acquire the powers of subtlety, or the rest of us stop giving them the time of day.  I don’t know about you, but I have a pretty good idea about which of those scenarios is more likely to occur in our lifetime.

If history has taught us anything, it’s that stupidity cannot be eradicated.  It can only be marginalized, ridiculed and ultimately ignored.

Terrorism is a Cliché

If there is anything more depressing about the attack on Charlie Hebdo than the attack itself, it is the fact that there is nothing new or interesting to be said about it.  The context and apparent reasons for the assault are old news; as such, everything has already been said many times before.

Indeed, as I attempt to formulate my own response to this latest obscenity against human decency and the freedom of expression, I find myself merely repeating other people’s responses to other such obscenities over the last many years, both before and after September 11, 2001.

Charlie Hebdo—for the few of you who miraculously still do not know—is a French satirical newspaper operating out of Paris.  It ran continuously from 1970 to 1981, and then again from 1992 to the present day.  (“Hebdo” is French for “weekly,” and “Charlie” is an inside joke involving both Charles de Gaulle and Charlie Brown.)

Like The Onion here in the States, Charlie Hebdo operates on the principle that just about everything is fair game for parody and ridicule, including and especially organized religion.  As a result, the publication has regularly come under fire for its treatment of such revered figures as the Prophet Muhammad, among others.  In November 2011, such ire turned violent when the paper’s headquarters was firebombed by Muslim extremists, in response to an edition featuring a cartoon of Muhammad on its cover.

Further threats of violence against Charlie Hebdo have periodically surfaced in the three years since, and this past Wednesday, two would-be jihadists made good on that threat by storming the paper’s newsroom and murdering 12 people, including its editor-in-chief and several of its famed cartoonists.  On their way out, the assailants were heard shouting, “We have avenged the prophet!”  The killers have since been killed.  They are believed to have been connected to al Qaeda, although many details are yet unclear.

For those of us on the sidelines—we who have taken it upon ourselves merely to make sense of senseless acts like this—there is a great deal to say:  many principles to defend, many facts to establish.  However, in doing so, we are forced to repeat ourselves rather than come up with anything new.  It’s a shame we have to expend such efforts in the first place—we are, after all, applying reason to people who have none—but then again, it seems we have no other choice.  Better to reintroduce ancient clichés than bear witness to barbarism in silence.

We could start, for instance, with the old trope, “Not all Muslims are terrorists”—an assertion that is invariably preceded and/or followed by its rejoinder, “Yes, but virtually all terrorists are Muslim.”  The first statement is obviously true—only a complete idiot would argue otherwise—while the second is obviously false and yet is nonetheless, shall we say, a bit more true than most of us would like to admit.

In other words, the argument here is exactly the same one we had after the September 11 attacks—namely, “Is Islam the problem?”  If Islam is truly “a religion of peace,” then why are there so many officially Muslim nations that traffic in violence and war, using certain Islamic doctrine as justification?

Alternatively, we could expand the question to encompass religion as a whole, since there is no shortage of Christian and Jewish extremists who also take the dictates of their faiths into their own hands.  Could the root cause of ideological mass murder in the 21st century not be Islam but rather religious-based intolerance of every sort?  Have we really made no progress in this debate since the Twin Towers fell?

Whichever side you take (there are more than two), perhaps the more salient point in the present context is the level of risk one assumes in broaching this subject at all.  The way that Bill Maher’s old joke, “Never say Islam isn’t a religion of peace, because if you do, they’ll kill you,” manages to be funnier than it should be.

Because of course our primary subject of concern in the Charlie Hebdo assault is the inalienable right to express one’s views—yes, even when such views make some people uncomfortable, angry or—perish the thought!—offended.

As many of us well know, Charlie Hebdo is not the first Western publication to be physically targeted for printing provocative caricatures of the Prophet Muhammad.  In 2005, a Danish newspaper called Jyllands-Posten similarly rendered Muhammad in cartoon form, in order to make a few points about free speech and religious prohibitions thereof, and within days all hell proceeded to break loose from one end of the continent to the other—an uproar that included riots, attacks on multiple European diplomatic missions and some 200 deaths, all told.

Because exactly this sort of thing has happened before—and quite recently, at that—we don’t need to wonder what it all means:  We can just dig up what all the smart people wrote in 2005 and 2006.

As it happens, one of the smartest and sharpest of those reactions came from an old favorite of mine, Christopher Hitchens, who wrote passionately in favor of the right to insult organized religion at all costs.  (“The babyish rumor-fueled tantrums that erupt all the time, especially in the Islamic world, show yet again that faith belongs to the spoiled and selfish childhood of our species.”)  And so we have a perfectly cogent analysis of the Charlie Hebdo situation penned by someone who’s been dead for three years.

As well, in case you need further proof of the dull repetitiveness of the West’s run-ins with theocratic loony toons, I would direct you to a wonderfully illuminating chat in 2010 between Hitchens and Salman Rushdie—a man who, despite radical Islam’s best efforts, is still very much alive.  Their talk considers several key points about the Danish cartoon fiasco, and watching it today, one is taken aback by how perfectly it corresponds to the mess at Charlie Hebdo, as if the two events were completely interchangeable.  In many respects, they are.

For instance, Rushdie proposes dividing the central question about free speech into two parts.  First:  Are news outlets duty-bound to reprint offensive cartoons out of solidarity with a publication that has been attacked?  And second:  Should that first paper have been more circumspect about printing those images in the first place, knowing the fuss that it would cause?

In other words, is the right to be offensive sometimes trumped by the wisdom to hold back?  Is there a distinction between offending in order to make a point and offending for its own sake?  Is the First Amendment not always as important as good taste?

By now, there has been exhaustive back-and-forth online and in print about these very important questions, including the charge that some of the folks at Charlie Hebdo are just plain racist.  That the “I am Charlie” solidarity is a function of the relatively high level of anti-Muslim prejudice around the world today, and that a comparable paper that had published anti-Semitic or anti-Catholic cartoons would not enjoy such international goodwill following a terrorist attack.  As many have said, it’s easy to defend free speech when you happen to agree with the speech in question.

My answer to this:  Who cares?

The right to free expression should be defended regardless of the content, and the fact that we’re less likely to defend speech we don’t like is precisely why we have the First Amendment in the first place.

The question about good taste is an interesting one, but in this instance it’s also, finally, beside the point.  The only reason we’re wondering whether the editors of Charlie Hebdo should have used more discretion is because their cartoons yielded a violent response.  If satirical images of the Prophet Muhammad were not so radioactive—if they didn’t so predictably lead some people to go out and commit mass murder—then taste would be the only thing to discuss, and the First Amendment would hardly enter into it.  We would talk about provocative religious images the way we talk about provocative non-religious images:  With passion and indignation, but without the hysterical claim that they should not exist at all.

No, the real problem here is the lack of sophistication inherent in those who don’t have the stomach for ideas they don’t share, and who would rather such ideas not be uttered and are prepared to threaten and/or attack those who utter them.

And the problem behind the problem, like every other cliché I’ve noted, has been astutely articulated in the past, in this case by comedian Lewis Black.  The central fact about al Qaeda and their ilk, Black surmised on his album The End of the Universe, is that they have no sense of humor.  That they take their faith literally and without a whiff of irony or self-criticism, resulting in untold misery for millions of people.

“Patriotism is important, and religion is vital,” said Black, “but without a sense of humor, religion and patriotism can get crazy […] and we see that in our enemy.”

Black once wrote a memoir titled Nothing’s Sacred, and satire is founded upon that very notion:  No subject is out of bounds, nor should it be.  This means that so long as satirists exist, someone somewhere is going to be offended by what they have to say.  There is no getting around this fact.  Individual writers and publications are free to self-censor for reasons of taste, but it should be their decision alone, and they should never be compelled to restrict their content out of fear of violence.

The problem, you see, is not the people who offend.  The problem is the people who (to quote Hitchens again) are determined to be offended and, paradoxically, will stop at nothing to prevent the rest of us from offending them.

Maybe I could explain this phenomenon better, but it would just be one more cliché.

The Right to Hate

I have no evidence that the Westboro Baptist Church is secretly a pro-gay rights organization masquerading as a gang of religious extremists in order to make anti-gay groups look ridiculous.

However, if such a cheeky cabal were formed, I suspect it wouldn’t look a heck of a lot different.

For the past many years, the Westboro Baptist Church has served two essential purposes in American public life.  First, to be arguably the most universally detested organization in our 50 states united.  And second, to ensure, beyond all doubt, that the First Amendment to the U.S. Constitution is as healthy and muscular now as ever it has been.

To review:  The WBC are the folks who shuttle from place to place wielding signs with such heart-warming messages as “God Hates Fags,” “God Hates America” and “Thank God For Dead Soldiers.”  Most of its members are related, either by blood or marriage, to its founder and patriarch, Fred Phelps, who died on March 19, at age 84.

The group is perhaps most notorious for its practice of picketing the funerals of U.S. soldiers, whom it claims were killed as a consequence of America’s tolerance for homosexuality, among other things.  In 2010, this ritual led to a Supreme Court case, Snyder v. Phelps, in which the Court ruled in favor of the church, arguing that protesting a funeral is a form of free expression protected by the First Amendment.

While the death of Fred Phelps does not necessarily mark the demise of the Westboro Baptist Church itself, it may well hasten the group’s fade from the public eye.  As such, we might entertain the notion of referring to the WBC in the past tense, if only for its cathartic effects.

On this subject, I have but one question:  On balance, has the Phelps family been good for America?

My answer:  Yes, but it’s complicated.

I say the WBC is the most hated organization in America—a fairly uncontroversial sentiment—but we might also say it has come by this distinction rather lazily, as far as generating mass hatred goes.

After all, what could be more of a “slam dunk” in the quest for amassing public scorn than to spit on the graves of fallen soldiers and to craft placards with the sort of radioactive language that leads even those who otherwise agree with you to recoil in disgust?

The WBC can be accused of being any number of things, but subtle is not one of them.

Quite to the contrary, they are cartoon characters—hysterical, childish, simplistic, ideologically absolutist to an extent previously not thought possible, and—surprise, surprise—completely convinced of their moral rightness on all fronts.

Indeed, the more time one spends reading the WBC’s various statements on matters of public import, the more one feels the weight of precious seconds of one’s life being irretrievably wasted.

In other words, the WBC seems to incite the world’s rage and indignation for their own sake, as if it were all one big piece of performance art.  As such, the church can hardly be taken seriously in the first place.  To coin a phrase:  Its antics are not worth dignifying with a response.

Yet we have done exactly that, be it through satire and counter-protests, or in the case of people like Albert Snyder, through lawsuits alleging the infliction of deep emotional distress.

And we cannot blame some folks for taking WBC at face value, since its views do not exactly come from nowhere.  In point of fact, the church’s basic beliefs about homosexuality are drawn directly from the Old Testament, and its musing that God kills Americans as punishment for homosexuality is an almost word-for-word plagiarism of Jerry Falwell’s infamous explanation for the attacks of September 11, 2001.

In any case, their flagrant ridiculousness has proved exceedingly useful in reminding us that enforcement of the First Amendment can be a very nasty business, since the right to free expression must be extended even to those whose views no one else on planet Earth wishes to hear.

In this way, the Phelps family’s victory at the Supreme Court was a great relief, because it demonstrated that—at least in this case—our federal institutions still take the Bill of Rights seriously.  That our most sacred liberties apply even to those who probably don’t deserve them.  Yes, even organizations like the Westboro Baptist Church, which expresses nothing but scorn toward the very country in which these liberties are practiced.

For better and for worse, that is what America is all about.