Eye of the Beholder

Can a piece of art ever exist entirely on its own, or is it always tethered to the context of its creation?

For instance, is it possible to listen to the Ring Cycle without remembering that Richard Wagner was an anti-Semitic prick whose music inspired the rise of Hitler?

Can one watch Manhattan—the story of a 42-year-old man’s love affair with a 17-year-old girl—and not be distracted and/or repulsed by the personal life of its writer, director and star, Woody Allen?

As a society, we’ve had a version of this argument many times before, trying to figure out how to separate the art from the artist, while also debating whether such a thing is even desirable in the first place.  (The answer to both:  “It depends.”)

Lately, however, this perennial question has assumed a racial dimension, compelling us to re-litigate it anew—this time with considerably higher stakes.

Here’s what happened.  Over at New York’s Whitney Museum of American Art, the curators of the institution’s 78th biennial—an exhibition of hundreds of contemporary works by dozens of artists—chose to include Open Casket, a semi-abstract painting that depicts the mutilated corpse of Emmett Till, the 14-year-old African-American boy who was tortured and lynched in Mississippi in 1955 for allegedly whistling at a white girl.  (The woman in question later admitted she made the whole thing up, but that’s another story.)

As a painting, Open Casket is arresting, with the oils so thickly layered that Till’s mangled face literally protrudes from the canvas, as if calling out to us from beyond the grave.  As a political statement, it fits comfortably into our uncomfortable era of police brutality and racial unease—a natural, even obvious, choice for any socially conscious art show in 2017.

There was just one little problem:  The creator of Open Casket is white.  Specifically, a Midwestern white woman living in Brooklyn named Dana Schutz.

Upon hearing that a Caucasian had dared to tackle Emmett Till as the subject for a painting, many patrons demanded the Whitney remove Open Casket from its walls, while condemning Schutz for attempting to profit off of black pain—a practice, they argued, that has defined—and defiled—white culture since before the founding of the republic, and should be discouraged at all costs.  The message, in effect, was that white people should stick to their own history and allow black people to deal with theirs.

In response to this brouhaha, the Whitney defended its inclusion of Schutz’s work without directly addressing the race question, while Schutz herself issued a statement that read, in part, “I don’t know what it is like to be black in America.  But I do know what it is like to be a mother.  Emmett was Mamie Till’s only son.  I thought about the possibility of painting it only after listening to interviews with her.  In her sorrow and rage she wanted her son’s death not just to be her pain but America’s pain.”

In other words:  Far from being exploitative or opportunistic, Open Casket is meant as an act of compassion and empathy toward black America from an artist who views Emmett Till’s death as a tragedy for all Americans—not just black ones.

Of course, that is merely Dana Schutz’s own interpretation of her work, and if history teaches us anything, it’s that the meaning of a given cultural artifact is never limited to what its creator might have intended at the time.  The artist Hannah Black, one of Schutz’s critics, is quite right in observing, “[I]f black people are telling her that the painting has caused unnecessary hurt, she […] must accept the truth of this.”

The real question, then, is whether offensiveness—inadvertent or not—is enough to justify removing a piece of art from public view, as Black and others have advocated in this case.

If, like me, you believe the First Amendment is more or less absolute—that all forms of honest expression are inherently useful in a free society—then the question answers itself.  Short of inciting a riot (and possibly not even then), no art museum should be compelled to censor itself so as not to hurt the feelings of its most sensitive patrons, however justified those feelings might be.  Au contraire:  If a museum isn’t offending somebody—thereby sparking a fruitful conversation—it probably isn’t worth visiting in the first place.

Unfortunately, in the Age of Trump, the American left has decided the First Amendment is negotiable—that its guarantee of free speech can, and should, be suspended whenever the dignity of a vulnerable group is threatened.  That so-called “hate speech” is so inherently destructive—so wounding, so cruel—that it needn’t be protected by the Constitution at all.  As everyone knows, if there was one thing the Founding Fathers could not abide, it was controversy.

What is most disturbing about this liberal drift toward total political correctness is the creative slippery slope it has unleashed—and the abnegation of all nuance and moral perspective that goes with it—of which the Whitney kerfuffle is but the latest example.

See, it’s one thing if Open Casket had been painted by David Duke—that is, if it had been an openly racist provocation by a callous, genocidal lunatic.  But it wasn’t:  It was painted by a mildly entitled white lady from Brooklyn who has a genuine concern for black suffering and wants more Americans to know what happened to Emmett Till.

And yet, in today’s liberal bubble factory, even that is considered too unseemly for public consumption and must be stamped out with all deliberate speed.  Here in 2017, the line of acceptable artistic practice has been moved so far downfield that an artist can only explore the meaning of life within his or her own racial, ethnic or socioeconomic group, because apparently it’s impossible and counterproductive to creatively empathize with anyone with a different background from yours.

By this standard, Kathryn Bigelow should not have directed The Hurt Locker, since, as a woman, she could not possibly appreciate the experience of being a male combat soldier in Iraq.  Nor, for that matter, should Ang Lee have tackled Brokeback Mountain, because what on Earth does a straight Taiwanese man like him know about surreptitious homosexual relationships in the remote hills of Wyoming?  Likewise, light-skinned David Simon evidently had no business creating Treme or The Wire, while Bob Dylan should’ve steered clear of Hattie Carroll and Rubin Carter as characters in two of his most politically charged songs.

Undoubtedly there are some people who agree with all of the above, and would proscribe any non-minority from using minorities as raw material for his or her creative outlet (and vice versa).

However, if one insists on full-bore racial and ethnic purity when it comes to the arts, one must also reckon with its consequences—namely, the utter negation of most of the greatest art ever created by man (and woman).  As I hope those few recent examples illustrate, this whole theory that only the members of a particular group are qualified to tell the story of that group is a lie.  An attractive, romantic and sensible lie, to be sure—but a lie nonetheless.

The truth—for those with the nerve to face it—is that although America’s many “communities” are ultimately defined by the qualities that separate them from each other—certainly, no one would mistake the black experience for the Jewish experience, or the Chinese experience for the Puerto Rican experience—human nature itself remains remarkably consistent across all known cultural subgroups.  As such, even if an outsider to a particular sect cannot know what it is like to be of that group, the power of empathy is (or can be) strong enough to allow one to know—or at least estimate—how such a thing feels.

As a final example, consider Moonlight—the best movie of 2016, according to me and the Academy (in that order).  A coming-of-age saga told in three parts, Moonlight has been universally lauded as one of the great cinematic depictions of black life in America—and no wonder, since its director, Barry Jenkins, grew up in the same neighborhood as the film’s hero, Chiron, and is, himself, black.

Slightly less commented on—but no less noteworthy—is Moonlight’s masterful meditation on what it’s like to be gay—specifically, to be a gay, male teenager in an environment where heterosexuality and masculinity are one and the same, and where being different—i.e., soft-spoken, sensitive and unsure—can turn you into a marked man overnight, and the only way to save yourself is to pretend—for years on end—to be someone else.

Now, my own gay adolescence was nowhere near as traumatic as Chiron’s—it wasn’t traumatic at all, really—yet I found myself overwhelmed by the horrible verisimilitude of every detail of Chiron’s reckoning with his emerging self.  Here was a portrait of nascent homosexuality that felt more authentic than real life—something that cannot possibly be achieved in film unless the men on both sides of the camera have a deep and intimate understanding of the character they’re developing.

Well, guess what:  They didn’t.  For all the insights Moonlight possesses on this subject, neither Barry Jenkins, the director, nor a single one of the leading actors is gay.  While they may well have drawn from their own brushes with adversity to determine precisely who this young man is—while also receiving a major assist from the film’s (gay) screenwriter, Tarell Alvin McCraney—the finished product is essentially a bold leap of faith as to what the gay experience is actually like.

Jenkins and his actors had no reason—no right, according to some—to pull this off as flawlessly as they did, and yet they did.  How?  Could it be that the condition of being black in this country—of feeling perpetually ill at ease, guarded and slightly out of place in one’s cultural milieu—has a clear, if imprecise, parallel to the condition of being gay, such that to have a deep appreciation of one is to give you a pretty darned good idea of the other?  And, by extension, that to be one form of human being is to be empowered to understand—or attempt to understand—the point of view of another?  And that this just might be a good thing after all?

Character Is Destiny

Donald Trump has been president for all of two weeks, yet already he has proved himself the most brazenly Nixonian person to ever sit in the Oval Office—Richard Nixon included.

How much of a paranoid megalomaniac is our new commander-in-chief?  Well, for starters, it took Nixon a full four-and-a-half years to dismiss his own attorney general for failing to carry out the president’s imperial agenda.  Trump?  He took care of that on Day 11.

There’s a classic saying, “History doesn’t repeat itself—but it rhymes.”  Of course, historians love to draw parallels between the past and the present in any case, but the truth is that some connections are so blindingly obvious that we needn’t even bring experts to the table.  We can do the rhyming ourselves, thank you very much.

At this absurdly premature juncture in the life of the new administration, it has become evident—to the shock of no one—that the Trump White House is destined to most resemble Nixon’s in both form and effect, and there may be no surer means of anticipating this West Wing’s machinations—good and bad, but mostly bad—than through a close study of the one that dissolved, oh-so-ignominiously, on August 9, 1974.

In light of recent events, we might as well begin with the Saturday Night Massacre.

In the fall of 1973, President Nixon was drowning in controversy about his role in the Watergate caper, thanks largely to the efforts of Special Prosecutor Archibald Cox.  Suddenly, on October 20, Nixon decided he had had enough and ordered his attorney general, Elliot Richardson, to fire Cox ASAP.  Having promised to respect Cox’s independence, Richardson refused to comply and promptly resigned, as did his deputy shortly thereafter.

Once the dust settled and Cox was finally sacked by Solicitor General Robert Bork (yes, that Robert Bork), it became clear to every man, woman and child in America that the president of the United States was a crook and a scumbag—albeit a cartoonishly sloppy one—and so began the suddenly inevitable march to impeachment that would end only with Nixon’s resignation in August of the following year.

What’s the lesson in all of this?  For my money, it’s that if the president feels he cannot do his job without depriving America’s chief law enforcement officer of his, something extraordinarily shady is afoot, and it’s only a matter of time before the public—and Congress—demands some manner of accountability.

Cut to the present day, and the constitutional (and humanitarian) crisis that Donald Trump pointlessly unleashed by banning all Syrian refugees from entering the U.S.—along with immigrants from seven Muslim-majority countries—and then firing Acting Attorney General Sally Yates when she proclaimed the order illegal and instructed the Justice Department to ignore it.

For all that differentiates the Saturday Night Massacre from the Muslim ban and its aftermath, both events present a commander-in-chief with an utter, self-defeating contempt for basic rule of law and all institutional checks on his authority.  Just as Nixon believed he could sweep Watergate under the rug by canning its lead investigator, so does Trump think he can essentially wipe out an entire religion’s worth of immigrants from the United States by disappearing any Justice Department official who regards the First Amendment as constitutionally binding.

(Notice how Trump justified the firing of Yates by accusing her of “betrayal”—as if the attorney general’s loyalty to the president supersedes her loyalty to the law.)

Of course, the nice thing about the Constitution is that it exists whether or not the president believes in it (as Neil deGrasse Tyson didn’t quite say).  The trouble—as the nation learned so painfully with Nixon—is that justice can take an awfully long time to catch up to the president’s many dogged attempts to dodge it—especially if he has a gang of willing collaborators in Congress.

In the end, the reason Watergate exploded into a full-blown cataclysm was that Richard Nixon was a fundamentally rotten human being—a callous, cynical, friendless sociopath whose every move was calibrated for political gain and without even a passing consideration for the public good.  For all that he spoke about standing up for the common man, when push came to shove the only person he really gave a damn about—the only person he ever lifted a finger to protect—was Richard Nixon.

Does any of this sound familiar?  You bet your sweet bippy it does.  In the frightfully short time he’s been president, Trump has shown a remarkable knack for mimicking every one of Nixon’s faults—his vindictiveness, his contempt for the press, his insecurity, his dishonesty, his propensity for surrounding himself with racists and anti-Semites—while somehow skirting any redeeming qualities that might make his presidency tolerable, despite all of the above.

Indeed, to the extent that Trump is not the absolute spitting image of America’s all-time champion of corruption, he is demonstrably worse.  After all, Nixon was historically literate, intellectually curious and, from his experience as a congressman and vice president, highly knowledgeable about the nuts and bolts of Washington deal making.  He was a scoundrel, but a reasonably competent one with several major accomplishments to his name.

Can we expect Trump to achieve any sort of greatness in the teeth of his many weaknesses?  If these first two weeks are at all predictive of the next four years, I see no reason to think so.  Whereas Nixon was a gifted strategic thinker with a deep sense of history and geopolitics, Trump has over and over again professed a proud and stubborn ignorance of any matter that does not directly involve himself, and seems to derive all his information about a given subject from the last person he spoke to about it.

The Greeks had it right:  Character is destiny, and there’s just no coming back from a veritable avalanche of fatal flaws.  We can pray all we want that the president will suddenly discover the value of temperance, deliberation and any hint of public virtue, but we’d only be denying a truth that has been staring us in the face from the moment Trump announced himself as a figure of national consequence.  He is who he is, he will never get better, and our only hope is that this new national nightmare won’t last quite as long as the last one did.

Love the Bubble

There’s an old story that when Richard Nixon was re-elected president in 1972 by a score of 49 states to one, the legendary New Yorker film critic Pauline Kael remarked, “How could Nixon possibly have won?  Nobody I know voted for him!”

In truth, Kael said nothing of the sort.  Or rather, she said the exact opposite of the above, but because life is one long game of telephone, over time her words have been misinterpreted to within an inch of their life, so that now she comes off as an oblivious, left-wing stooge.  Oh well:  When the legend becomes fact, print the legend.

All the same, those exact words have been bouncing around my head a lot these days, following the even more inexplicable election of an even more inappropriate candidate to that very same high office.  If the gist of Kael’s (fictional) lament is that Americans are so ideologically tribal that we’ve essentially walled ourselves off from those with whom we disagree, I’ve certainly done my part to make matters worse.

Indeed, months before Donald Trump became America’s president-elect, I couldn’t help but marvel at the fact that, so far as I could tell, not a single person I’ve ever known was prepared to cast a vote for him.  Nor, for that matter, was any writer, elected official or celebrity in my intellectual orbit for whom I hold even a modicum of respect—including many conservatives who would normally support the Republican candidate as reflexively as I would support the Democrat.

Is this because, like Pauline Kael, I live inside an elitist, left-wing bubble and spent the entirety of 2016 subconsciously avoiding any views I would rather not hear?  Probably.

Is it also because Donald Trump was the most unserious and morally repugnant presidential candidate in a century, and therefore liable to turn off virtually any honest person who knows a vulgar charlatan when they see one?  Once again:  All signs point to yes.

Because those two things are equally true—not one more than the other—I’ve had real trouble feeling guilty about contributing to America’s increasing divide between Team Red and Team Blue.  I don’t doubt that if I put in more effort to reach out to folks in the heartland and elsewhere who do not share my values, I would likely emerge a fuller, more empathetic human being.  But there is no amount of ideological ecumenicalism that could negate all the terrible things Trump has said and that innumerable supporters of his have done:  He and they are as contemptible today as they’ve ever been—if not worse—and I have no desire to treat their particular views on race, religion and gender as if they are deserving of my respect.

Remember:  One’s politics are not some ingrained, immovable phenomenon like ethnicity or sexual orientation.  They are a choice.  They reflect how you think—as opposed to who you are—and that makes them fair game for the condemnation of others.

Which brings us—improbably enough—to Meryl Streep.

At Sunday’s Golden Globes, Streep chose to accept the Cecil B. DeMille Lifetime Achievement Award by expressing her revulsion toward the president-elect and all that he represents—specifically, his disdain for multiculturalism and a free press, as well as his pathological inability to ever behave like a mature, compassionate adult.  Predictably, the crowd inside the Beverly Hilton went wild, while right-wingers online condemned Streep as an arrogant liberal nut.  And so it goes.

From a close reading of Streep’s remarks, we find that—apart from an unfair crack about mixed martial arts—she didn’t make a single statement that any decent person could possibly disagree with.  Every factual assertion was objectively correct (e.g., Trump is a bully, Hollywood actors have geographically diverse backgrounds), while every value judgment was so basic and obvious that a kindergartner could understand it (e.g., “disrespect invites disrespect, violence incites violence”).

Substantively, there was absolutely nothing controversial in Streep’s comments.  The uproar, then, was entirely a function of Streep’s status as Hollywood royalty—and, thus, a spokeswoman for the cultural left—which led those on the right to denounce her purely out of partisan vindictiveness, just the way congressional Republicans have opposed much of what President Obama has said simply because he was the one saying it.

That, my friends, is the real danger of living in a bubble:  Your ideological bias can become so overpowering that you decide, in advance, that those in the other bubble could never possibly say something true.  And that is the moment at which all good governance—nay, all good citizenship—ends.

I, for one, am entirely comfortable with the fact that, during the next four years, Donald Trump will occasionally say and do things of which I completely approve.  When that happens, I hope I will have the decency and integrity to say so.  All I ask in return is for everyone else—no matter which bubble they call home—to meet me halfway.

Whodunit?

There’s an old personality test—introduced to me in middle school and lovingly preserved on the interwebs—involving a woman who gets herself killed journeying between her husband and her lover.  The “test,” as it were, centers on the question of who is most to blame for the woman’s untimely death.  Is it the bored husband who neglected to take his wife along on his business trip?  Is it the greedy boatman who refused to ferry her across the river to safety?  Is it the heartless boyfriend who didn’t lift a finger in her defense?  Or is it the woman herself for being unfaithful and blundering into the wrong place at the wrong time?

It’s a ridiculous conceit, but the idea is that how you assign blame for the woman’s murder is determined by what you value most in life.  The options, in this case, include such things as “fun,” “sex,” “money” and, my personal favorite, “magic.”

Anyway, that story’s been on my mind for the last few days as I’ve seen Donald Trump campaign events descend into violence and mayhem whenever a gaggle of anti-Trump agitators has sneaked its way into the arena.

With regards to these unholy scuffles, everyone seems to have a firm opinion about who is most at fault.  Interestingly, however—and I think you know where I’m going with this—no one can quite agree on who, exactly, that is.

Obviously, then, what we need is to update that silly game about the two-timing wife so that it applies to our own time and our own values.  With Trump—a man who stands as America’s signal Rorschach test of 2016—we can learn a great deal about how each of us thinks just by how we interpret what is happening directly in front of our eyes.

From a sampling of reactions, we find that most people trace the cause of this campaign unrest to either a) the protesters, b) Trump supporters or c) Trump himself.  To an extent, one’s opinion of these incidents is merely a function of one’s politics:  If you find Donald Trump generally detestable, you generally attribute all detestable acts to the man himself.  Conversely, if you think Trump speaks truth to political correctness, you find fault only with those who are preventing him from speaking.  It’s confirmation bias in action:  You see what you want to see and filter out everything else.

But of course, all of that is but the tip of the bloody, bloody iceberg.  However illuminating it might be to debate which side threw the first punch, it’s not until folks start to blame those who weren’t even in the room that the real fun begins.

We might start with the Donald himself, who has fingered Bernie Sanders as the main culprit for the madness, saying that the party crashers at his gatherings are on direct marching orders from the socialist from Vermont.  It is noteworthy that Trump bases this claim on no evidence whatsoever, while he has simultaneously blamed other outbursts on ISIS—yes, that ISIS—due to a YouTube video that was swiftly exposed as a typical Internet hoax.  As Trump explained on Meet the Press, “All I know is what’s on the Internet,” reminding us that he is apparently the one person in America who believes, with all his heart, that if it’s online, it must be true.

Farce that this undeniably is, such behavior nonetheless offers real insights into Trump’s personality and that of his fellow travelers.  Strongest among these, perhaps, is the value of “truthiness,” a.k.a. believing something to be true simply because your gut tells you so.

In fact, Trump’s entire movement is dependent on truthiness, since at least 80 percent of his campaign’s major claims are demonstrably false and his promise of “restoring America’s greatness” is one big fatuous smoke-and-mirrors routine containing nary a whiff of substance or honest reporting.  If all presidential candidates engage in hyperbole, Trump is unique for engaging in absolutely nothing else.

The real problem, though, is how sinister that hyperbole has been for the last nine months and how deeply it has metastasized within the GOP.  While this week’s outright physical violence might be relatively new, the truth is that Trump and his flock have been blaming other people for America’s problems for his entire presidential run.  Like any seasoned demagogue, Trump has invented most of this blame from whole cloth, while at other times he has even managed to invent the problems themselves.  (Who would ever know, for instance, that net immigration from Mexico is actually negative over the last five years, or that U.S. military spending increased from 2014 to 2015?)

Which leads us, as it must, to the most disturbing personality quirk of all:  The one that blames all of this turmoil on African-Americans and views the entire American experience in terms of white supremacy.

While it would be irresponsible to peg every Trump voter as a white supremacist—or, specifically, a Nazi or a Klansman—the point is that Trump rallies have become a safe space—if not a veritable breeding ground—for white people who think that punching, kicking and spitting on black people is their God-given right as members of a privileged race.  For all Trump’s claims that the protesters are the true instigators of these melees, most video clips suggest otherwise:  Largely, we just keep seeing groups of young, mostly black people nonviolently holding up signs and chanting cheeky slogans while white guards and white attendees proceed to manhandle them with the greatest possible force—egged on, every single time, by the candidate himself.

You see pictures like these—paired with people like Mike Huckabee calling the protesters “thugs,” a word that Republicans only ever use to describe African-Americans—and you realize all that’s missing are the dogs and the fire hoses.

Among the many sick ironies of Donald Trump is his supposed fidelity to the First Amendment, which he claims the dissenters at his rallies are attempting to suppress (as if Trump has ever lacked an outlet for expressing himself on a moment’s notice).  Historical ignoramus that he is, he doesn’t seem to realize that, when it comes to muzzling free speech, few things are more effective than riling up a large gang of angry white people by telling them how to mistreat a small gang of dark-skinned antagonists.  (And then, of course, pleading ignorance when those same white people do exactly what you suggest.)

Even if there were nothing at all race-based in Trump and company’s behavior, we would still be left with this profoundly dangerous idea that all problems can, and should, be solved with physical violence.  To hear Trump talk, you’d think his were the first-ever campaign events to feature any sort of disruptors and that there is no rational response except to treat them like enemy combatants.  (How long before Trump recommends waterboarding?)

The relevant terms here are “escalate” and “de-escalate.”  As any honest police officer knows, whenever you are faced with a potentially explosive situation, it is your moral responsibility to try to de-escalate tensions and not make matters worse.  Indeed, for anyone who wields authority or influence over others—not least in politics—the obligation to lead by example and get your minions under control is absolute and non-negotiable.

Donald Trump has failed that charge over and over again.  In so doing, he has revealed which values he holds dear and which values he does not—if, that is, he can be said to possess any values at all.

It proved quite prescient that Trump opened his campaign while riding an escalator in Trump Tower in Manhattan:  As it turns out, he is an escalator.

Freedom From Fear

When it comes to terrorism, how did we suddenly become such a nation of scaredy cats?

Sure, each of us has our own private set of fears—things that add unwelcome tension to our day and maybe even keep us up at night.  Some of these are perfectly rational, while others seem to have been invented from whole cloth.

I don’t know about you, but I certainly know a few things that frighten me.  Failure.  Poverty.  Writer’s block.  Cancer.  Bugs.

But you know one thing that doesn’t scare me at all?  Being killed in a terrorist attack.

On any given day, I am far more concerned about a beetle wandering into my bed than a suicide bomber wandering onto my subway car.  Why?  Because I’m a reasonably logical human being who realizes that the former is infinitely more likely than the latter, and I’m not about to waste my time fretting about every last terrible thing that could possibly happen to me.

Could I find myself in some kind of active shooter/bomber/hostage situation?  Sure, why not?  Bad guys exist and somebody has to be their victim.  I lived in New York on September 11, 2001, and in Boston on April 15, 2013, so I’m not entirely naïve about the horrors that Islamic (and non-Islamic) extremists can unleash upon unwitting bystanders.

All the same, there is something to which I am equally attuned:  statistics.

You’ve read the actuarial tables.  All things equal, each of us is roughly 35,000 times more likely to die from heart disease than from a terrorist attack.  Heck, we are 350 times likelier to die from gravity (read:  falling off a roof) and four times likelier to be struck by lightning.  According to at least one study, the average American’s lifetime odds of being killed as the result of terrorism are approximately 1 in 20 million.

On one level, these numbers serve as amusing, if abstract, pieces of trivia.  On a deeper level, they reflect what a colossal waste of time it is to actively fear being caught up in an act of mass violence.  The probability of such a thing is so remote, you might as well get worked up over being eaten by Bigfoot.

And yet, from a new poll, a record-high number of Americans claim to be more fearful of terrorism now than at any time since September 11, 2001.  Thanks to the atrocities in Paris and San Bernardino—and the increasing reach of ISIS in general—the super-low risk of being the victim of a similar attack now strikes many of us as entirely plausible, if not outright imminent.

It’s not, and it never will be.  Get it together, people.  Don’t be such drama queens.  Keep calm and…well, you know.

Look:  I watch Woody Allen movies.  I understand that if someone is determined to freak out about an imaginary bogeyman, there’s nothing you can do to stop them.  Then there’s the fact that this particular bogeyman is not completely a figment of our collective imagination.  In Syria and Iraq, it’s a lot worse than that.

But realize that, here in America, by being afraid of a hypothetical attack by a gang of faceless, radical Muslims, you are—by definition—letting the terrorists win.

Not to get too cute or cliché, but the object of terrorism is to generate terror.  For the jihadist, committing random mass murder is the means, not the end.  Whenever a follower of ISIS or al Qaeda opens fire in a crowded marketplace or plants a bomb on a city bus, the point isn’t merely to kill a bunch of people; rather, it’s to make everyone else nervous about entering a marketplace or boarding a bus, because, hey, they might be next.

George W. Bush was absolutely right to say that the best way to fight back is to continue going about our lives as if nothing has changed.  In the most fundamental sense, nothing has:  America remains an exceptionally open society in which all citizens can come and go as they please.  Our economy and armed forces continue to be the envy of the world.  The First Amendment is in such strong shape that a private business denying service to gay people is now considered a form of free expression.  And—sorry to be so repetitive—the likelihood of being personally affected by terrorism is all but microscopic.

To be sure, the government does not have the same luxury as individuals to adopt such a blasé attitude toward the global struggle against violent extremism (or whatever you want to call it).  Having the means to actually disrupt organized terrorism originating in the Middle East, our military and intelligence agencies are obligated to take the ISIS threat seriously, thereby giving us private citizens the freedom to leave our houses every morning with the confidence that we will return in one piece.

But here’s the main point:  There’s absolutely no reason why we shouldn’t adopt this optimistic attitude anyway.  There is much our government can do to keep us safe, but there is just as much that it can’t.  Islamic terrorism—like Christian terrorism—cannot be eliminated completely.  More perpetrators will fall through the cracks and more innocent people will be killed.

But so what?  There’s very little we civilians can contribute to this struggle—other than the whole “see something, say something” initiative, which has produced mixed results—so what do we gain by being terrified?  Death itself is unavoidable, and death by terrorism is on roughly the same plane of probability as death by asteroid—and nearly as futile to prevent in advance.

What we should do, then, is take a cue from Franklin Roosevelt, who in January 1941 outlined the “four freedoms” to which all inhabitants of the Earth should be entitled.  While he merely plagiarized from the First Amendment for two of them—“freedom of speech” and “freedom of worship”—and paraphrased the Constitution’s preamble for the third—“freedom from want”—the fourth was an invention all his own:  “freedom from fear.”

Whatever such a concept meant at the outset of World War II—a reduction in global arms, mostly—today we can accept it as a right we grant to ourselves:  The freedom to go about our lives as if they were actually controlled by us.

Speak No Evil

Thomas McCarthy’s new movie Spotlight has been likened, in both form and quality, to Alan J. Pakula’s All the President’s Men.  The comparison may at first seem like a cliché, but after seeing both movies this past weekend—the latter for the fourth or fifth time—I realize the connection is both unavoidable and entirely germane.

When you get right down to it, Spotlight isn’t a companion to All the President’s Men so much as a remake.  While the two films are by no means identical—they take place in different cities at different times and have completely different plots—their agendas are one and the same, and they succeed for exactly the same reasons.

The agenda, then, is to demonstrate how justice and democracy cannot exist anywhere without freedom of the press, and how investigative journalism itself is a long, difficult, boring process that—counterintuitively and against all common sense—makes for positively riveting cinema.

It’s easy enough to talk about the preeminence of the First Amendment and of speaking truth to power, but Spotlight goes a step further by showing us how near-impossible that task really is—even for the most well-equipped and widely circulated newspaper in town.

McCarthy’s movie—in case you’ve been kept out of the loop—is about how the Boston Globe in 2001 uncovered evidence of rampant sexual abuse of children within the Roman Catholic Archdiocese of Boston, which the church’s leadership—up to and including Archbishop Bernard Law himself—spent decades covering up.  All told, some 250 Boston-area priests were alleged to have sexually molested young boys and girls, with the “official” victim count at 552.  God knows what the true figure really is.

Like Pakula’s film, which famously recounted the Washington Post’s investigations into Watergate, Spotlight features a small group of unknown reporters undertaking a methodical, comprehensive, long-shot project to reveal that a gargantuan and revered American institution is rotten to the core.  In All the President’s Men, that institution was the federal government during the Nixon administration.  In Spotlight, it’s the Catholic Church.

The bottom line—the implied moral to the story—is that were it not for the Spotlight team’s exhaustive and heroic efforts, the Globe’s revelations about predatory priests and a corrupt hierarchy could very easily have remained a secret forever, denying justice and any kind of closure to an entire generation of molested kids, not to mention all the generations that came before.

To this conclusion, one might naturally ask, “How?”  How could that many children be raped—physically and emotionally—without a single one of them speaking up and being heard?

Spotlight’s answer:  Many of them did speak up, but nobody wanted to listen.

Fourteen years after the fact, we now know beyond doubt that the Catholic Church in Boston engaged in a conspiracy of silence on the epidemic of sexual abuse, moving problem priests from one parish to another while saying nothing publicly about what those priests were up to.  (It was as a direct consequence of the Boston revelations that similar scandals came to light in virtually every Christian country on Earth.)

What we didn’t know—at least not as fully as we should have—is that the archdiocese was not the only player in this terrible drama.  Far from acting alone, Cardinal Law and his gang received a crucial assist from the people in the pews:  those in the Catholic community who loved and respected the Church and would never, for a moment, have entertained the possibility that a member of this institution could do something wrong—let alone something criminal and obscene.

For millennia, clergymen of all faiths have served (often rightly) as the most trustworthy members of society—men of education, wisdom and unimpeachable moral fiber.  It didn’t hurt that, in the case of Catholicism, these men also had God on their speed dial and could invoke divine punishment or reward to bend their parishioners to their will.

And so whenever there was a whisper about some priest doing this or that to the altar boys under his tutelage, most Catholics—including a few who worked at the Globe—dismissed the allegation before the thought could even settle into their minds.

Like a parent who hears that his son is dealing drugs or a Patriots fan who learns that Tom Brady was doing something fishy with those footballs—or, indeed, an idealistic citizen who views the government as a benevolent force for good—churchgoers could not bring themselves to see what was directly in front of their noses.

They didn’t want it to be true, so they convinced themselves it was false.

That, in so many words, is what journalism is for:  To tell you what you’d rather not hear.  Reporters have resources and privileges that ordinary citizens do not, which makes it inevitable that the press will disappoint your rosy views of humanity every now and again.

As such, it also means that news outlets will forever be on the front lines in the battle for truth, justice and accountability.  Reporters and editors have no professional obligation except to find out what the hell’s going on.  That’s why movies about newspapermen tend to be so entertaining:  Sometimes fact really is more compelling than fiction.  In a moral universe, it’s the only thing that matters.

As with all major institutional scandals, the details mean everything.  The triumph of Spotlight is that it allows the survivors of the Church rape epidemic to have their day in court—that is, to explain precisely what being held captive by a priest entailed—along with those who had everything to lose from the publication of this story, from the archbishop to the priests to the family members who chose to look the other way.

In so doing, the film shows how the process of newsgathering is inherently a dreary, depressing, often hostile endeavor in which powerful forces will try everything they can to prevent you from doing your job.  A job, we might add, that depends overwhelmingly on cooperation from the public, which includes those same institutions.  The term “Gordian Knot” leaps oddly to mind.

To break the Watergate caper, Carl Bernstein and Bob Woodward needed to draw connections between key White House figures—“follow the money,” they were sagely told—and they did so by extracting information from witnesses who had every reason to keep their insights to themselves.  Some of this involved misdirection and sleight of hand—asking a witness to “confirm” information you don’t actually know is always a neat trick—some involved dumb luck, and some required nothing except patience, asking the right questions and a near-pathological refusal to take “no” for an answer.

With Spotlight, it’s déjà vu all over again.  The Globe team, working separately and together, accumulates its information through a combustible mixture of instinct, legal wrangling, library basement research and good old-fashioned interviewing.  A late-inning confrontation between Michael Keaton and a representative for child victims is a virtual carbon copy of Dustin Hoffman’s run-in with a Florida lawyer with a cabinet full of crucial documents.  In both instances, the reporter explains that his paper is running the story with or without this person’s cooperation, so he might as well stand on the right side of justice.

Long story short (too late?):  Revelatory, in-depth reporting does not happen by accident.  It’s the result of thousands of man hours of detective work—and all the court orders, dead ends, slammed doors and wrecked lives that go with it—and when it is done carefully and seen through to the end, it can change the world.

The Globe would eventually print more than 600 articles in connection with the Church abuse tragedy, for which the paper was awarded a Pulitzer Prize in 2003.  It’s worth noting that the award—the most prestigious in all of American journalism—was not in the category of “Investigative Reporting,” “Explanatory Reporting” or “Local Reporting,” although any of those would surely have fit the bill.

Rather, the Pulitzer Prize Board reserved the Globe’s work for its most distinguished category of all:  “Public Service.”

Republican Holy War

The problem isn’t that Ben Carson wouldn’t vote for a Muslim president.

The problem is that few other Republicans would, either.

The problem isn’t that Donald Trump dignified the insane anti-Islam rants of some random crank.

The problem is that a massive chunk of all GOP voters share those same toxic views.

It would be bad enough if the men representing one of America’s two major political parties happened to be a bunch of xenophobic cretins.  But it’s worse than that because, as it turns out, a plurality of their fans are, too.

In other words, the GOP primary’s rank bigotry isn’t a bug.  It’s a feature.

Nor is the party’s contempt for certain Americans limited to Muslims.  At various junctures, Republican candidates have demonstrated robust, unchained hostility toward immigrants, women, homosexuals and unbelievers, among others.  And their supporters have followed them every step of the way.

Not all of them, of course.  Perhaps not even a majority.

But if there is any measurable difference between Democrats and Republicans, it is that the latter are significantly more likely to harbor open suspicion and disapproval of minorities—individually and collectively—on the basis of their minority status.

In a recent Gallup poll, we find that while 73 percent of Democratic respondents would vote for a qualified presidential candidate who happened to be Muslim, only 45 percent of Republican respondents would do the same.  Similarly, although 85 percent of Democrats would vote for a gay candidate, only 61 percent of Republicans would.  For an atheist candidate, the party split was 64 percent versus 45 percent, respectively.

While those numbers are nothing for either faction to brag about, the gulf between the two is unmistakable, and it leads us to a fairly obvious conclusion:  As it currently stands, the Republican Party is a one-stop shop for paranoia, hatred and prejudice toward anyone who seems even slightly foreign to some preconceived, mythical idea of what makes someone a “real American.”

Yes, many self-identified Republicans are sane, decent folks.  Yes, there are many components of GOP dogma that have nothing to do with shunning minorities and other undesirables.  Yes, conservatism itself is still a perfectly legitimate means of thinking about the world.

And yet I wonder:  Why are there any “moderate Republicans” left?  At this point, isn’t that phrase a contradiction in terms?

Case in point:  If you happen to think that all Muslims are terrorists and all gays are perverts, then it makes perfect sense that you would align with today’s GOP.  Their values are your values.

But if you don’t think those things—if you find the denigration of entire classes of people to be juvenile, unattractive and dangerous—then why would you throw in with a political party that loudly and proudly does?

Notwithstanding whatever else you might believe—say, about taxes or foreign policy—why would you join arms with an organization that—at least in its presidential candidates—has adopted enmity and ignorance as its defining characteristics?  What’s the appeal in belonging to a gang so fundamentally unappealing?  After all, you can always vote for Republicans without being one yourself.

The explanation, I suppose, is roughly the same as why so many Catholics remain committed to their church, in spite of its history of raping innocent children and using every means necessary to cover it up.

That is:  Many people are quite skilled at keeping utterly contradictory ideas in their heads and somehow still getting through the day.  They compartmentalize, embracing virtue while ignoring or overlooking vice.

And in the end, it is in matters of religion that the Republican Party performs its most breathtaking feats of hypocrisy and self-deception.

In fact, Ben Carson’s infamous rumination on Meet the Press about the dangers of electing a Muslim president contained the most telling statement any candidate has yet made on the subject of mixing religion and politics.

To the question, “Should a president’s faith matter?” Carson responded, “I guess it depends on what that faith is.”  As far as most Republican candidates are concerned, that’s exactly right.

The GOP fashions itself as the champion of religious freedom—defender of the clause in the First Amendment that says, “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof.”

Don’t believe it for a minute.  The GOP would love Congress to make a law respecting the establishment of religion, and the only religion its leaders are interested in exercising freely is their own.

When that ridiculous Kentucky clerk refused to issue marriage licenses to gay couples because she is personally opposed to same-sex marriage, she informed the media that “God’s law” takes precedence over man’s law, and when certain Republicans defended her willful disregard of the latter, they defined her “struggle” precisely in terms of a religious war.

How often we have heard—from nearly every major and minor candidate—that Christianity is “under attack” and being “criminalized” because those who don’t believe in gay marriage—ostensibly for Biblical reasons—now have to grin and bear the fact that the Supreme Court has ruled against those beliefs.  Mike Huckabee, the self-appointed leader of the cause, said, “No man […] has the right to redefine the laws of nature or of nature’s God.”

I wonder:  What exactly is the difference between that statement and Sharia law?  The latter, of course, is the idea—popular in the Middle East—of running a legal system based on teachings in the Quran and other Islamic holy works, rather than on any precepts devised by man.

In principle, there is no difference at all.  Huckabee and the king of Saudi Arabia apparently agree that the word of God is more important than the rule of law, and that an individual’s own religious convictions can and should override any law that comes into conflict with them.

And yet—amazingly—it is these same cultural conservatives who attack and condemn Sharia law at every opportunity, insisting that some nefarious Islamic cabal is secretly plotting to bring Sharia to the United States and is this close to succeeding and—my God!—what a horrible world it would be if America became an oppressive, Bronze Age theocracy.

Read those last few paragraphs again and tell me this isn’t the most spectacular double standard in recent American politics.  Taking them at their word, GOP leaders evidently think that religion in the public square is both good and bad, that holy books are simultaneously more and less authoritative than the Constitution, and that Christians—who represent 70 percent of the U.S. population—are under threat, while Muslims—who make up less than 1 percent—are on the verge of taking over the whole damn country.

The logical cartwheels in this reasoning are enough to give you whiplash.  The term “Schrödinger’s cat” springs curiously to mind.

In reality, though, the thinking is straightforward and simple, and it’s exactly like Ben Carson said:  Christianity good, Islam bad.  God is great, except when his name is Allah.

Once you convince yourself—as Carson and company have—that Islam is fundamentally incompatible with living in a free society like ours and that no individual Muslim could possibly adopt America’s values as his or her own—a self-evidently absurd idea—then it becomes quite easy to make comically hypocritical statements like the above and somehow think you’re being principled and consistent.

But these guys aren’t.  They believe in religious freedom when the religion is Christianity and when the “freedom” involves preventing gay people from leading fulfilling lives.  I’m sure the irony of the latter will sink in sooner or later, although we probably shouldn’t hold our breath.

In the meantime, we would all do well to remind ourselves that freedom means nothing if it only applies to certain people and that the United States, for all its religious citizens, does not have an official state religion and does not take sides in religious fights.

This did not happen by accident.  In the fall of 1801, a group of Connecticut Baptists sent an urgent letter to the new president, Thomas Jefferson, pleading for protection against religious tyranny by a rival sect.  Jefferson’s famous response, which guaranteed such protection, intoned that “religion is a matter which lies solely between man and his God” and that the Establishment Clause of the First Amendment amounted to “a wall of separation between Church and State.”

As Christopher Hitchens used to say:  Mr. Jefferson, build up that wall.