The Last Laugh

The trouble with being a free speech absolutist (as I am) is that you often find yourself having to defend awful people who say awful things. Even if you truly believe (as I do) that the First Amendment’s guarantee of free expression applies to all manner thereof—not just the pleasant, uncontroversial sort—there will inevitably be moments that test the limits of our most elemental constitutional right, leading reasonable people to wonder if some restrictions on public speech are in order.

This is not one of those moments.

In the first week of January—as you might recall—the United States very nearly started a war with Iran when President Trump ordered the killing of noted general/terrorist Qasem Soleimani. This in turn led the Islamic Republic to bomb U.S. bases in Iraq, at which point Trump threatened, via Twitter, to strike various locations inside Iran, potentially including ones “important to Iran [and] the Iranian culture.”

Ultimately, the administration ruled out the possibility of targeting culturally significant sites—presumably after being informed that such a move would constitute a war crime—but not before a gentleman named Asheen Phansey, the director of sustainability at Babson College in Wellesley, Massachusetts, mischievously posted on Facebook:

“In retaliation, Ayatollah Khomenei [sic] should tweet a list of 52 sites of beloved American cultural heritage that he would bomb. Um…Mall of America? Kardashian residence?”

It was a cheap, naughty little quip—something many of us undoubtedly thought and/or said ourselves—but it evidently rubbed someone at Babson (Phansey’s employer) the wrong way, because four days later, Phansey was fired.

In a statement, the school wrote, “Babson College condemns any type of threatening words and/or actions condoning violence and/or hate. This particular post from a staff member on his personal Facebook page clearly does not represent the values and culture of Babson College,” adding, “[W]e are cooperating with local, state and federal authorities.”

In the weeks since Phansey’s sacking, there has been considerable pushback against Babson by the likes of Salman Rushdie, Joyce Carol Oates and PEN America. Thus far, however, Phansey has not been rehired, nor has the college shown any interest in doing so.

Speaking as someone who lives nearby and has attended on-campus events every now and again, I would advise Babson to offer Phansey his old job back and for Phansey to reject it out of hand. An institution so idiotic as to fire an employee for making a joke is unworthy of Phansey’s talents, whatever they might be.

I wrote at the outset that, as a First Amendment issue, this one is not a close call. Viewed in its full context—or, I would argue, in any context whatsoever—Phansey’s Facebook post was very obviously written in jest—an ironic commentary about a serious, if absurd, world event. In categorizing a knock on the Kardashians and the Mall of America as “threatening words and/or actions,” Babson seems to imply that it can’t distinguish humor from sincerity, raising the question of how it ever achieved accreditation in the first place.

More likely, of course, is that Babson was simply intimidated by the bombardment of complaints it apparently received following Phansey’s original post (which he swiftly deleted) and decided it would be easier and more prudent to cave in to the demands of a mob and fire Phansey on the spot, rather than defend Phansey’s right—and, by extension, the right of any faculty member—to comment on news stories in his spare time.

It was a terrible, stupid decision for Babson to make, and its silence in the intervening days has only brought further dishonor upon an otherwise sterling institute of higher learning. While it is undeniably true that a private employer has the discretion to dismiss employees who have violated official policy—the First Amendment’s provisions are explicitly limited to the public sector—the notion that making a mildly off-color remark on one’s own Facebook page constitutes a fireable offense is a horrifying precedent for a college to set, and is among the most egregious examples yet of the general squeamishness on so many American campuses toward the very concept of free expression.

As a cultural flashpoint, I am reminded of the (much larger) brouhaha surrounding Kathy Griffin in 2017, when an Instagram photo of the comedienne hoisting what appeared to be the severed head of President Trump led to Griffin being treated as a national security threat by the Justice Department and effectively banished from society for the better part of a year.

As with Phansey, no honest person could look at Griffin’s gag and say it was anything other than gallows humor—albeit an exceptionally tasteless manifestation thereof. We might agree, on reflection, that Griffin should’ve thought twice before posting an image of a bloodied Trump mask out into the world—to paraphrase Rabbi Mendel, not everything that is thought should be instagrammed—but there is a giant chasm between being a low-rent comedian and being a threat to public safety, and I am very, very worried that our politically correct, nuance-free culture is increasingly unable and/or unwilling to separate one from the other.

In short, we are slowly but surely devolving into a society without a sense of humor, a tendency that—if I may allow myself a moment of hysterical overstatement—is a gateway drug to totalitarianism. A nation without wit is a nation without a soul, and a culture that doesn’t allow its citizens to make jokes without fear of losing their livelihoods is one that has no claim to moral superiority and no right to call itself a democracy. What a shame that our universities, of all places, haven’t quite figured this out yet.

Only Words

Bitch, please.

If a certain civic-minded Massachusetts resident has her way, that two-word phrase could someday cost me as much as a speeding ticket—and without the adrenaline rush that comes from barreling down the Mass Pike at 88 miles per hour.

As reported last week in the Boston Globe, back in May a state representative from Boston, Daniel Hunt, submitted a bill in the Massachusetts legislature, titled, “An act regarding the use of offensive words,” which stipulated that a person who “uses the word ‘bitch’ directed at another person to accost, annoy, degrade or demean the other person shall be considered to be a disorderly person” and fined up to $200. The text went on to say that “a violation […] may be reported by the person to whom the offensive language was directed at or by any witness to such incident.”

Commentators across the political spectrum have been having a lot of fun with this story in the days since it broke, pointing out—among other things—that any bill regulating the use of certain words in the public square would be in direct conflict with the First Amendment’s guarantee of free expression and would presumably be ruled unconstitutional by every court in the land.

What’s more, it turns out that the bill in question was conceived and drafted not by Representative Hunt himself, but rather by an unnamed constituent of his, who submitted the proposal through a “direct petition” process that Massachusetts—unique among the 50 states—allows its residents to use. In short, Hunt moved this bill forward because the Massachusetts Constitution required it, not because he necessarily thinks it’s a good idea or will ever even be put to a vote.

And of course, it’s not a good idea to levy fines on American citizens for uttering four (or five)-letter words, however objectionable they might be, nor does any measurable chunk of the citizenry seem to think otherwise. While “swear jars” may work fine within one’s own home—particularly when small children are nearby—enshrining the concept into law would prove problematic at best and unenforceable at worst. Call me insensitive, but the prospect of adult women running to the nearest cop saying, “That man over there just called me a bitch!” doesn’t seem like the best use of our law enforcement officers’ time.

Having said that—in my capacity as a First Amendment near-absolutist, I might add—I cannot help but admire the underlying feminist sentiment behind the anti-“bitch” bill—in particular, its precision in only punishing those who utter the B-word “to accost, annoy, degrade or demean.” It would not, for instance, target an individual who spontaneously yells “Son of a bitch!” to no one in particular, nor (one imagines) would it proscribe the playing of Elton John’s “The Bitch is Back” or Nathaniel Rateliff’s “S.O.B.” at some public gathering.

In other words, at least this pipe dream of an anti-obscenity law acknowledges the existence of nuance and context in the language we use. It allows—if only implicitly—that so-called “bad words” are not inherently objectionable, but rather are expressions of objectionable ideas under certain, limited circumstances—a distinction George Carlin made on his comedy album Class Clown in 1972 and which the rest of America has been stumbling toward ever since.

I mention this in light of a recent incident at a high school in Madison, Wisconsin, where an African-American security guard was fired for telling a student not to call him the N-word after said student called said security guard the N-word 15 times in rapid succession.

I will repeat that: A teenager spewed the word “nigger” repeatedly at a black security guard, the guard repeated the word in order to explain its inappropriateness, and the guard lost his job.

And we claim to live in an enlightened society.

Happily, the guard in question, Marlon Anderson, has since been rehired following an extremely public backlash. But the “zero tolerance” reasoning behind his firing remains endemic in the American body politic, which insists on removing all complexity from a given situation in favor of lazy, politically correct pablum. As a wise man once said, zero tolerance equals zero thinking.

To be sure, the N-word can be considered a special case—a term that, for obvious historical reasons, need not pass the lips of anyone under any circumstances—and while Anderson’s initial sacking was plainly an overreaction, it was done in the spirit of enforcing a total and justifiable taboo against a word that is inextricably synonymous with the ugliest and most irredeemable facet of the American character.

And yet the N-word has very definitely retained a certain ironic cachet within the African-American community, and as a Privileged White Person, I feel more than a little silly discouraging black people from reclaiming and re-appropriating it in order to blunt its toxic impact, much as the LGBT community has assumed ownership of words like “faggot” and “queer” and—come to think of it—women have with the word “bitch,” among others.

The solution, in short, is to treat each other with due respect as individualized human beings, avoiding hurtful stereotypes whenever possible, while also recognizing that the negative power of offensive words derives not from the literal stringing-together of a series of sounds, but rather from a complicated matrix of ideas and prejudices that will regrettably endure long after the current verbal manifestations of them have been vanquished from the American lexicon.

Ain’t that a bitch?

A Grand Compromise

Last Wednesday, a 19-year-old lunatic opened fire at a Florida high school, killing 17 students and teachers and wounding several others.  This Valentine’s Day massacre was the 30th mass shooting in the United States so far this year, and the most deadly.

As our fellow citizens raced into their predictable opinion bubbles, ruminating on how to properly react to yet another instance of pointless American carnage, one sentiment struck me with particular force:  “If you oppose gun control, you can’t call yourself pro-life.”

On the one hand, an assertion like that speaks for itself.  Guns equal death; therefore, to foster life, eliminate the guns.  Surely the “pro-life” movement, whose entire platform is based on protecting the young and vulnerable, can appreciate this as well as anyone.

And yet, unfortunately, the world is more complicated than that, if only because of the apparently intractable politics that have enabled America to become the most trigger-happy advanced nation on Earth.  Even when overwhelming majorities of the public support certain basic changes to who gets to own deadly weapons in this country—and who doesn’t—the financial tyranny of the NRA over our elected officials guarantees a bloody status quo on guns for many years to come.

Into this breach, I offer a modest proposal:  Repeal the Second Amendment once and for all, and in exchange, allow the Supreme Court to overturn Roe v. Wade.

That’s right:  I’m suggesting a good old-fashioned trade-off whereby two groups claiming the mantle of “pro-life” can put their money where their mouths are, and two major issues can be addressed in one fell swoop.

Obtuse as it may sound, there is a certain symmetry in tethering gun rights to abortion rights.  After all, both are rooted in core constitutional principles—the former in the aforementioned Second Amendment; the latter in the Fourteenth.  Both involve the direct, deliberate taking of human life, sometimes for morally dubious reasons.  Both provoke deep, painful and ultimately irresolvable debates about what it means to be a free American.  Finally, both hinge on the question of federalism and what it would mean, practically-speaking, if we were to radically decentralize certain rights we have heretofore regarded as (to coin a phrase) inalienable.

Of course, we’ll never find out the exact answer to that question, since neither the Second Amendment nor Roe v. Wade will be disappearing any time in the foreseeable future—a fact that leaves me half-relieved and half-depressed (not necessarily in that order).

All the same, having witnessed lawmakers’ shameful abdication of leadership in the teeth of one heinous—and utterly preventable—mass shooting after another, I have reached the dispiriting conclusion that our national epidemic of gun violence will never abate unless and until we decide, as a people, that there shouldn’t be a right to bear arms in the first place.  While such seemingly obvious fixes as an assault weapons ban or robust background checks would undoubtedly save countless lives, neither addresses the fundamental collective psychosis that is Americans’ fetishization of hand-held killing machines, for which the Second Amendment provides both legal and cultural cover.

Were I to become king, I would gut the Second Amendment tomorrow and hurl every firearm into a volcano.  However, since I am not king and we live in a republic, I recognize that, one way or another, effecting truly transformational gun reform will come at a price—and a painful one at that.  In a country with such wildly divergent views of liberty and freedom and right and wrong, no major ideological settlement can be made cleanly or simply:  There must be a fight, and both sides must be prepared to give at least as much as they are hoping to take.

It’s hard to believe today, but this was something that America used to be able to accomplish.  Indeed, look closely enough and you’ll notice a large chunk of modern American life came about through incongruous—if not outright ludicrous—grand compromises, many of them sealed in proverbial smoke-filled rooms or around dinner tables in between bottles of Port.  Think Jefferson and Madison agreeing to let Hamilton’s federal government assume the states’ war debts in exchange for moving the capital from New York to the banks of the Potomac.  Or the Compromise of 1850, which gave us the horrid Fugitive Slave Act, but also California.  (The former was eventually repealed.  The latter, not yet.)  Or the fact that the Constitutional Convention itself gifted us a bicameral legislature, with one house favoring small states and the other favoring large ones.

That was then.  Now, of course, we are represented by a Congress that can’t seem to pass laws everyone likes, let alone ones that divide America straight down the middle.  Because our body politic has become so irretrievably tribal—so blindingly partisan, so stubbornly zero-sum—the very notion of compromise has increasingly been conflated with weakness, capitulation and ideological selling-out, rather than recognized for what it actually is:  the only known way to run a goddamned country.

Hence the rank impossibility of a comprehensive immigration deal—something that could be resolved in an hour if Democrats merely agreed to fund a wall along the Mexican border.  Hence the absence of a plan to strengthen Obamacare, which the GOP prefers to cripple out of spite than make work for its own constituents.

Our leaders would rather get nothing than give their opponents anything, and we are all living with the consequences.  It would be a terribly unfair quandary for this great country to find itself in, except for the pesky fact that every one of those representatives was democratically elected by us, the people.  This is what we wanted, folks, and the madness will continue until we choose—say, on November 6—to make it stop.

The Limits of Loyalty

Is loyalty a virtue or a sin?  Does the world need more of it, or less?

Donald Trump, in a controversial speech to the Boy Scouts of America on Monday, endorsed the former in no uncertain terms, rambling to the gathering of thousands of teenage boys, “As the Scout Law says, ‘A scout is trustworthy, loyal’—we could use some more loyalty, I will tell you that.”

The subtext of this remark was clear enough to anyone paying attention to current events.  Throughout the past week, the president has been very publicly steaming about Attorney General Jeff Sessions, whom Trump feels betrayed him by recusing himself from the administration’s Russia imbroglio—and also, apparently, by not investigating Hillary Clinton for God knows what.  In an ongoing series of tweets, Trump has tarred Sessions as “beleaguered” and “VERY weak,” effectively goading him into resigning, lest the abuse continue indefinitely.

The implication—or explication, as the case may be—is that Sessions’s duty as America’s chief law enforcement officer is to protect Donald Trump from the law, not to defend the law against those who violate it, up to and including the commander-in-chief himself.  As Trump made plain in an interview with the New York Times, his hiring of Sessions was predicated on the AG serving the president—not the Constitution.

But then it’s not only Sessions who has found himself the object of Trump’s wrath on the question of absolute allegiance.  Let’s not forget James Comey, the former director of the FBI, who famously met with the president in January, when the latter said, point-blank, “I need loyalty; I expect loyalty.”  Comey’s eventual sacking—like Sessions’s, should it occur—was the result of being insufficiently faithful to the man in the Oval Office.  Of daring to think, and act, for himself.

As someone who has never been leader of the free world—nor, for that matter, held any position of real responsibility—I must confess that I remain skeptical about the value of unconditional submission in one’s day-to-day life and generally regard free agency as the far superior of the two virtues.  Indeed, I would argue (to answer my own question) that “virtue” might be altogether the wrong word to use in this context.

When thinking about loyalty, the question you must ask yourself is:  What, exactly, am I being loyal to?  Is it to a set of principles, or to another human being?  And if you are merely dedicating yourself to a person, what has he or she done to deserve it, and what, if anything, will you be getting in return?

Certainly, the spectacle of Trump demanding total fealty to Trump is the most extreme—and most cartoonish—manifestation of this latter category, since the president has shown minimal interest in reciprocating whatever devotion happens to come his way.  Except with members of his immediate family (so far, anyway), Trump’s modus operandi is to ask for everything and give nothing back.  Part and parcel of being a textbook sociopath, Trump views his fellow humans purely as a means to an end and rarely, if ever, stops to think how he might make their lives easier in the process.  It does not occur to him to treat people with respect for its own sake.  If anything, he views empathy as a sign of weakness.

This behavior may well represent an abuse and perversion of an otherwise useful human trait, but that hardly makes a difference when considering the enormous political power of the man doing the perverting.

Which brings us—by way of analogy—to Adolf Hitler.

In Germany, beginning in 1934, all members of the armed forces were required to swear a solemn oath—not to Germany, mind you, but to the man at the top.  This vow, or Reichswehreid, read, in part, “To the Leader of the German Empire and people, Adolf Hitler, supreme commander of the armed forces, I shall render unconditional obedience and […] at all times be prepared to give my life for this oath.”  As you might’ve guessed, soldiers who refused to comply tended not to live very long.

If that seems like an extreme and sui generis example of a personality cult run amok, let me remind you of the moment in March 2016 when, at a campaign rally in Florida, Donald Trump implored his adoring crowd to raise their right hands and pledge, “I do solemnly swear that I—no matter how I feel, no matter what the conditions, if there’s hurricanes or whatever—will vote […] for Donald J. Trump for president.”

While a stunt like that doesn’t exactly sink to the depths of the Hitler oath—Trump wasn’t about to jail or murder anyone who opted out—it is nonetheless a profoundly creepy thing for a presidential candidate in a democratic republic to say—particularly when you recall that Trump once reportedly kept an anthology of Hitler’s speeches at his bedside table.  This for a man who can otherwise go years without reading a single book.

That Trump evidently views Hitler as some sort of role model—and is haphazardly aping the Führer’s stylistic flourishes on the campaign trail—ought to give us serious pause about where his own fidelity lies—is it to the nation or himself?—and about whether his pronouncement at the Republican National Convention that he—and he alone—is capable of steering America forward was less an expression of supreme confidence than a barely-veiled threat against those who doubt that a serially-bankrupt con artist is the best man to preside over the largest economy in the world.

The problem, you see, is not that Trump is Hitler.  (He’s not.)  The problem is that he wants to be Hitler—and Mussolini and Saddam Hussein and Vladimir Putin and every other national figurehead who has managed to wield near-absolute authority over his citizenry—often with suspiciously high approval ratings and totally unburdened by the institutional checks and balances that America’s founders so brilliantly installed in 1787.

While Trump’s ultimate ambitions might not be as violent or imperial as those of the men I just listed—in the end, he seems to care about little beyond self-enrichment—the central lesson of the first six months of his administration—plus the first 71 years of his life—is that there is nothing he will not try to get away with at least once.  No sacred cow he will not trample.  No rule he will not bend.  No sin he will not commit.  He is a man of bottomless appetites and zero restraint.  Left to his own devices, he would spend his entire presidency arranging meetings—like the one with his cabinet last month—whose participants did nothing but praise him for being the greatest man in the history of the world.  A Kim Jong-un of the West.

Remember:  The sole reason Trump hasn’t already turned the United States into a full-blown banana republic is that he can’t.  Constitutionally-speaking, the only things stopping him from indulging his basest instincts are Congress, the courts and the American public, and we’ve seen how tenuous all three of those institutions can be.  Should the remaining branches of government fulfill their obligations as a check on executive overreach and malfeasance, we’ll be fine.  Should they falter—thereby providing Trump the untrammeled loyalty he demands—we’ll be in for the longest eight years of our lives.

Eye of the Beholder

Can a piece of art ever exist entirely on its own, or is it always tethered to the context of its creation?

For instance, is it possible to listen to the Ring Cycle without remembering that Richard Wagner was an anti-Semitic prick whose music inspired the rise of Hitler?

Can one watch Manhattan—the story of a 42-year-old man’s love affair with a 17-year-old girl—and not be distracted and/or repulsed by the personal life of its writer, director and star, Woody Allen?

As a society, we’ve had a version of this argument many times before, trying to figure out how to separate the art from the artist, while also debating whether such a thing is even desirable in the first place.  (The answer to both:  “It depends.”)

Lately, however, this perennial question has assumed a racial dimension, compelling us to re-litigate it anew—this time with considerably higher stakes.

Here’s what happened.  Over at New York’s Whitney Museum of American Art, the curators of the institution’s 78th biennial—an exhibition of hundreds of contemporary works by dozens of artists—chose to include Open Casket, a semi-abstract painting that depicts the mutilated corpse of Emmett Till, the 14-year-old African-American boy who was tortured and lynched in Mississippi in 1955 for allegedly whistling at a white girl.  (The woman in question later admitted she made the whole thing up, but that’s another story.)

As a painting, Open Casket is arresting, with the oils so thickly layered that Till’s mangled face literally protrudes from the canvas, as if calling out to us from beyond the grave.  As a political statement, it fits comfortably into our uncomfortable era of police brutality and racial unease—a natural, even obvious, choice for any socially conscious art show in 2017.

There was just one little problem:  The creator of Open Casket is white.  Specifically, a Midwestern white woman living in Brooklyn named Dana Schutz.

Upon hearing that a Caucasian had dared to tackle Emmett Till as the subject for a painting, many patrons demanded the Whitney remove Open Casket from its walls, while condemning Schutz for attempting to profit off of black pain—a practice, they argued, that has defined—and defiled—white culture since before the founding of the republic, and should be discouraged at all costs.  The message, in effect, was that white people should stick to their own history and allow black people to deal with theirs.

In response to this brouhaha, the Whitney defended its inclusion of Schutz’s work without directly addressing the race question, while Schutz herself issued a statement that read, in part, “I don’t know what it is like to be black in America.  But I do know what it is like to be a mother.  Emmett was Mamie Till’s only son.  I thought about the possibility of painting it only after listening to interviews with her.  In her sorrow and rage she wanted her son’s death not just to be her pain but America’s pain.”

In other words:  Far from being exploitative or opportunistic, Open Casket is meant as an act of compassion and empathy toward black America from an artist who views Emmett Till’s death as a tragedy for all Americans—not just black ones.

Of course, that is merely Dana Schutz’s own interpretation of her work, and if history teaches us anything, it’s that the meaning of a given cultural artifact is never limited to what its creator might have intended at the time.  The artist Hannah Black, one of Schutz’s critics, is quite right in observing, “[I]f black people are telling her that the painting has caused unnecessary hurt, she […] must accept the truth of this.”

The real question, then, is whether offensiveness—inadvertent or not—is enough to justify removing a piece of art from public view, as Black and others have advocated in this case.

If, like me, you believe the First Amendment is more or less absolute—that all forms of honest expression are inherently useful in a free society—then the question answers itself.  Short of inciting a riot (and possibly not even then), no art museum should be compelled to censor itself so as not to hurt the feelings of its most sensitive patrons, however justified those feelings might be.  Au contraire:  If a museum isn’t offending somebody—thereby sparking a fruitful conversation—it probably isn’t worth visiting in the first place.

Unfortunately, in the Age of Trump, the American left has decided the First Amendment is negotiable—that its guarantee of free speech can, and should, be suspended whenever the dignity of a vulnerable group is threatened.  That so-called “hate speech” is so inherently destructive—so wounding, so cruel—that it needn’t be protected by the Constitution at all.  As everyone knows, if there was one thing the Founding Fathers could not abide, it was controversy.

What is most disturbing about this liberal drift toward total political correctness is the creative slippery slope it has unleashed—and the abnegation of all nuance and moral perspective that goes with it—of which the Whitney kerfuffle is but the latest example.

See, it’s one thing if Open Casket had been painted by David Duke—that is, if it had been an openly racist provocation by a callous, genocidal lunatic.  But it wasn’t:  It was painted by a mildly-entitled white lady from Brooklyn who has a genuine concern for black suffering and wants more Americans to know what happened to Emmett Till.

And yet, in today’s liberal bubble factory, even that is considered too unseemly for public consumption and must be stamped out with all deliberate speed.  Here in 2017, the line of acceptable artistic practice has been moved so far downfield that an artist can only explore the meaning of life within his or her own racial, ethnic or socioeconomic group, because apparently it’s impossible and counterproductive to creatively empathize with anyone with a different background from yours.

By this standard, Kathryn Bigelow should not have directed The Hurt Locker, since, as a woman, she could not possibly appreciate the experience of being a male combat soldier in Iraq.  Nor, for that matter, should Ang Lee have tackled Brokeback Mountain, because what on Earth does a straight Taiwanese man like him know about surreptitious homosexual relationships in the remote hills of Wyoming?  Likewise, light-skinned David Simon evidently had no business creating Treme or The Wire, while Bob Dylan should’ve steered clear of Hattie Carroll and Rubin Carter as characters in two of his most politically-charged songs.

Undoubtedly there are some people who agree with all of the above, and would proscribe any non-minority from using minorities as raw material for his or her creative outlet (and vice versa).

However, if one insists on full-bore racial and ethnic purity when it comes to the arts, one must also reckon with its consequences—namely, the utter negation of most of the greatest art ever created by man (and woman).  As I hope those few recent examples illustrate, this whole theory that only the members of a particular group are qualified to tell the story of that group is a lie.  An attractive, romantic and sensible lie, to be sure—but a lie nonetheless.

The truth—for those with the nerve to face it—is that although America’s many “communities” are ultimately defined by the qualities that separate them from each other—certainly, no one would mistake the black experience for the Jewish experience, or the Chinese experience for the Puerto Rican experience—human nature itself remains remarkably consistent across all known cultural subgroups.  As such, even if an outsider to a particular sect cannot know what it is like to be of that group, the power of empathy is (or can be) strong enough to allow one to know—or at least estimate—how such a thing feels.

As a final example, consider Moonlight—the best movie of 2016, according to me and the Academy (in that order).  A coming-of-age saga told in three parts, Moonlight has been universally lauded as one of the great cinematic depictions of black life in America—and no wonder, since its director, Barry Jenkins, grew up in the same neighborhood as the film’s hero, Chiron, and is, himself, black.

Slightly less commented on—but no less noteworthy—is Moonlight’s masterful meditation on what it’s like to be gay—specifically, to be a gay, male teenager in an environment where heterosexuality and masculinity are one and the same, and where being different—i.e., soft-spoken, sensitive and unsure—can turn you into a marked man overnight, and the only way to save yourself is to pretend—for years on end—to be someone else.

Now, my own gay adolescence was nowhere near as traumatic as Chiron’s—it wasn’t traumatic at all, really—yet I found myself overwhelmed by the horrible verisimilitude of every detail of Chiron’s reckoning with his emerging self.  Here was a portrait of nascent homosexuality that felt more authentic than real life—something that cannot possibly be achieved in film unless the men on both sides of the camera have a deep and intimate understanding of the character they’re developing.

Well, guess what:  They didn’t.  For all the insights Moonlight possesses on this subject, neither Barry Jenkins, the director, nor a single one of the leading actors is gay.  While they may well have drawn from their own brushes with adversity to determine precisely who this young man is—while also receiving a major assist from the film’s (gay) screenwriter, Tarell Alvin McCraney—the finished product is essentially a bold leap of faith as to what the gay experience is actually like.

Jenkins and his actors had no reason—no right, according to some—to pull this off as flawlessly as they did, and yet they did.  How?  Could it be that the condition of being black in this country—of feeling perpetually ill at ease, guarded and slightly out of place in one’s cultural milieu—has a clear, if imprecise, parallel to the condition of being gay, such that a deep appreciation of one gives you a pretty darned good idea of the other?  And, by extension, that to be one form of human being is to be empowered to understand—or attempt to understand—the point of view of another?  And that this just might be a good thing after all?

Character Is Destiny

Donald Trump has been president for all of two weeks, yet already he has proved himself the most brazenly Nixonian person to ever sit in the Oval Office—Richard Nixon included.

How much of a paranoid megalomaniac is our new commander-in-chief?  Well, for starters, it took Nixon a full four-and-a-half years to dismiss his own attorney general for failing to carry out the president’s imperial agenda.  Trump?  He took care of that on Day 11.

There’s a classic saying, “History doesn’t repeat itself—but it rhymes.”  Of course, historians love to draw parallels between the past and the present in any case, but the truth is that some connections are so blindingly obvious that we needn’t even bring experts to the table.  We can do the rhyming ourselves, thank you very much.

At this absurdly premature juncture in the life of the new administration, it has become evident—to the shock of no one—that the Trump White House is destined to most resemble Nixon’s in both form and effect, and there may be no surer means of anticipating this West Wing’s machinations—good and bad, but mostly bad—than through a close study of the one that dissolved, oh-so-ignominiously, on August 9, 1974.

In light of recent events, we might as well begin with the Saturday Night Massacre.

In the fall of 1973, President Nixon was drowning in controversy about his role in the Watergate caper, thanks largely to the efforts of Special Prosecutor Archibald Cox.  Suddenly, on October 20, Nixon decided he had had enough and ordered his attorney general, Elliot Richardson, to fire Cox ASAP.  Having promised to respect Cox’s independence, Richardson refused to comply and promptly resigned, as did his deputy shortly thereafter.

Once the dust settled and Cox was finally sacked by Solicitor General Robert Bork (yes, that Robert Bork), it became clear to every man, woman and child in America that the president of the United States was a crook and a scumbag—albeit a cartoonishly sloppy one—and so began the suddenly inevitable march to impeachment that would end only with Nixon’s resignation in August of the following year.

What’s the lesson in all of this?  For my money, it’s that if the president feels he cannot do his job without depriving America’s chief law enforcement officer of his, something extraordinarily shady is afoot, and it’s only a matter of time before the public—and Congress—demands some manner of accountability.

Cut to the present day, and the constitutional (and humanitarian) crisis that Donald Trump pointlessly unleashed by banning all Syrian refugees from entering the U.S.—along with immigrants from seven Muslim-majority countries—and then firing Acting Attorney General Sally Yates when she proclaimed the order illegal and instructed the Justice Department to ignore it.

For all that differentiates the Saturday Night Massacre from the Muslim ban and its aftermath, both events reveal a commander-in-chief with an utter, self-defeating contempt for basic rule of law and all institutional checks on his authority.  Just as Nixon believed he could sweep Watergate under the rug by canning its lead investigator, so does Trump think he can essentially wipe out an entire religion’s worth of immigrants from the United States by disappearing any Justice Department official who regards the First Amendment as constitutionally binding.

(Notice how Trump justified the firing of Yates by accusing her of “betrayal”—as if the attorney general’s loyalty to the president supersedes her loyalty to the law.)

Of course, the nice thing about the Constitution is that it exists whether or not the president believes in it (as Neil deGrasse Tyson didn’t quite say).  The trouble—as the nation learned so painfully with Nixon—is that justice can take an awfully long time to catch up to the president’s many dogged attempts to dodge it—especially if he has a gang of willing collaborators in Congress.

In the end, the reason Watergate exploded into a full-blown cataclysm was that Richard Nixon was a fundamentally rotten human being—a callous, cynical, friendless sociopath whose every move was calibrated for political gain and without even a passing consideration for the public good.  For all that he spoke about standing up for the common man, when push came to shove the only person he really gave a damn about—the only person he ever lifted a finger to protect—was Richard Nixon.

Does any of this sound familiar?  You bet your sweet bippy it does.  In the frightfully short time he’s been president, Trump has shown a remarkable knack for mimicking every one of Nixon’s faults—his vindictiveness, his contempt for the press, his insecurity, his dishonesty, his propensity for surrounding himself with racists and anti-Semites—while somehow skirting any redeeming qualities that might make his presidency tolerable, despite all of the above.

Indeed, to the extent that Trump is not the absolute spitting image of America’s all-time champion of corruption, he is demonstrably worse.  After all, Nixon was historically literate, intellectually curious and, from his experience as a congressman and vice president, highly knowledgeable about the nuts and bolts of Washington deal making.  He was a scoundrel, but a reasonably competent one with several major accomplishments to his name.

Can we expect Trump to achieve any sort of greatness in the teeth of his many weaknesses?  If these first two weeks are at all predictive of the next four years, I see no reason to think so.  Whereas Nixon was a gifted strategic thinker with a deep sense of history and geopolitics, Trump has over and over again professed a proud and stubborn ignorance of any matter that does not directly involve himself, and seems to derive all his information about a given subject from the last person he spoke to about it.

The Greeks had it right:  Character is destiny, and there’s just no coming back from a veritable avalanche of fatal flaws.  We can pray all we want that the president will suddenly discover the value of temperance, deliberation and any hint of public virtue, but we’d only be denying a truth that has been staring us in the face from the moment Trump announced himself as a figure of national consequence.  He is who he is, he will never get better, and our only hope is that this new national nightmare won’t last quite as long as the last one did.

Against All Enemies

The election of Donald Trump was arguably the worst disaster to befall the United States since September 11, 2001.  But if you ask what will keep me up at night once Trump assumes power, the answer is:  Whatever disaster comes next.

I say “whatever,” but really, I mean terrorism.  If not a large-scale, years-in-the-making cataclysm like 9/11, then perhaps a series of multi-city, mass-casualty suicide bombings like we’ve seen throughout Europe the last several years:  Barbarous, politically motivated strikes that, individually, are not destructive enough to bring America to its collective knees but, taken together, have the effect of radicalizing ordinary citizens into seeking extraordinary, extralegal measures to ensure such death and disruption doesn’t become (to use the buzzword of the moment) normalized.

You can see it coming from 100 miles away:  Trump conditions his supporters to view all Muslims with suspicion as potential ISIS recruits.  Then one day, their worst fears are realized when actual radical Islamists commit an actual act of terrorism on American soil.  As a consequence, those citizens who for years have been fed a steady diet of revulsion and contempt toward the entire Islamic faith will feel emboldened to act on those worst instincts.

At the street level, this will inevitably take the form of countless assaults and harassment against any and all perceived “foreigners” by brainless white thugs cloaking themselves in the mantle of “patriotism,” cheered on by fellow white thugs waving the flag of white supremacy.

We know this is what would happen following the next terrorist attack because it’s happening right now in the absence of it:  Every other day, we hear about some Muslim-American or other being targeted by deranged white idiots for the sole crime of reading from the wrong bible and praying to the wrong god.  Never mind that virtually every major act of violence in America since 9/11 has been committed by white Christians; never mind that you’re more likely to be killed by a piece of furniture than a terrorist attack; and never mind that, within the United States, organized Islamic jihad isn’t even remotely a thing.

Nope:  We are now firmly entrenched in a post-fact environment, and there’s no amount of data or common sense that will prevent several million of our dumbest countrymen from viewing several million of their fellow citizens as avowed enemies of our very way of life.

It’s an insane, racist, destructive way to think, and the incoming commander-in-chief has been enabling it every step of the way.

Without much doubt, a Trump administration will be lousy for women, lousy for African-Americans, lousy for gays, lousy for Hispanics and lousy for Jews.  But for my money, it is America’s Muslims who are the most vulnerable group of all, because their “otherness” is so completely (and irrationally) tethered to a gang of murderers 5,000 miles away over whose actions they have absolutely no control.

Like German Jews in the 1930s or the young women of Salem, Mass., in 1692, Muslims have become the designated scapegoats for most, if not all, social unrest in the 21st century, and it is entirely up to us—the non-Muslim majority—to ensure they don’t suffer a similar historical fate.

As with all other heretofore-unthinkable scenarios, we have little cause for complacency on this front.  Never forget:  During the campaign—in response to no specific threat—Trump suggested a blanket prohibition on all Muslims entering the United States “until we know what’s going on,” and also insinuated—albeit in his characteristically slippery, incoherent way—that the government should create some sort of “registry” to keep an eye on Muslims already living in the U.S.  You know, just in case.

The point isn’t whether he really meant it.  As anyone with half a brain ought to know by now, Trump doesn’t really mean anything.

The point—chilling and undeniable—is that, in Trump’s mind, absolutely nothing is out of bounds.  To him, there is no limit to what the president can do for the sake of “national security”:  The ends justify the means, even when the ends themselves are unclear.  Having never read a word of the Constitution, the Bill of Rights, the Geneva Conventions or, for that matter, the Old and New Testaments, he believes himself immune to the institutional checks and basic ethical norms that every other democratically elected official takes for granted and that serve as the societal glue that holds this crazy world together.

Fundamentally, our next president possesses the mind of a dictator, waking up every morning thinking, “If it can be done, why shouldn’t it be?”

Hence the profound unease we should all feel about how he might behave in an emergency—particularly given our country’s abysmal track record in this department.

Remember:  In response to World War II, Franklin Roosevelt systematically violated the constitutional rights of 120,000 American citizens on the off chance they were Japanese sleeper agents—and he is considered the greatest president of the 20th century.  Eight decades earlier, Abraham Lincoln reacted to the Civil War by unilaterally suspending habeas corpus—a highly unconstitutional move that was roundly condemned by Chief Justice Roger Taney, whose judgment the president then promptly ignored.  And Lincoln was the greatest man in the history of everything.

You don’t think Trump’s advisers have studied up on those cases and are prepared to use them as a pretext for rounding up Muslims en masse in the aftermath of the next big national calamity?  More worrying still:  Are we at all confident that, in a 9/11-like situation, Republicans in Congress will summon the courage to defend America’s core principles and prevent Trump from assuming dictatorial powers from now until the end of time?

They won’t if they live in competitive districts and fear being “primaried” in the next election.  They won’t if they expect to be labeled unpatriotic and “soft on terror” if they dare suggest that not all Muslims pose a national security risk.  And they certainly won’t if there is a groundswell of support from America’s basket of deplorables to turn the world’s greatest democracy into a perpetual police state with the sole objective of making white people feel safe.

It’s a central—and oft-repeated—lesson of world history:  Republics cannot be destroyed except from within.  In 1787, our founders designed a system of government—subject to layer upon layer of checks and balances—that could withstand every imaginable challenge to its viability save one:  The failure of all three branches to uphold it.

On January 20, Donald Trump will raise his right hand and swear an oath to “preserve, protect and defend the Constitution of the United States.”  If his public statements over the last 18 months are any indication, he will probably violate that oath midway through his inaugural address, at which point Congress will need to decide whether it truly values country over party, and whether the principles established in that very Constitution are still worth defending against all enemies, foreign and domestic.

Particularly when one of those enemies is sitting in the Oval Office.