The Last Laugh

The trouble with being a free speech absolutist (as I am) is that you often find yourself having to defend awful people who say awful things. Even if you truly believe (as I do) that the First Amendment’s guarantee of free expression applies to all manner thereof—not just the pleasant, uncontroversial sort—there will inevitably be moments that test the limits of our most elemental constitutional right, leading reasonable people to wonder if some restrictions on public speech are in order.

This is not one of those moments.

In the first week of January—as you might recall—the United States very nearly started a war with Iran when President Trump ordered the killing of noted general/terrorist Qasem Soleimani. This in turn led the Islamic Republic to bomb U.S. bases in Iraq, at which point Trump threatened, via Twitter, to strike various locations inside Iran, potentially including ones “important to Iran [and] the Iranian culture.”

Ultimately, the administration ruled out the possibility of targeting culturally significant sites—presumably after being informed that such a move would constitute a war crime—but not before a gentleman named Asheen Phansey, the director of sustainability at Babson College in Wellesley, Massachusetts, mischievously posted on Facebook:

“In retaliation, Ayatollah Khomenei [sic] should tweet a list of 52 sites of beloved American cultural heritage that he would bomb. Um…Mall of America? Kardashian residence?”

It was a cheap, naughty little quip—something many of us undoubtedly thought and/or said ourselves—but it evidently rubbed someone at Babson (Phansey’s employer) the wrong way, because four days later, Phansey was fired.

In a statement, the school wrote, “Babson College condemns any type of threatening words and/or actions condoning violence and/or hate. This particular post from a staff member on his personal Facebook page clearly does not represent the values and culture of Babson College,” adding, “[W]e are cooperating with local, state and federal authorities.”

In the weeks since Phansey’s sacking, there has been considerable pushback against Babson by the likes of Salman Rushdie, Joyce Carol Oates and PEN America. Thus far, however, Phansey has not been rehired, nor has the college shown any interest in doing so.

Speaking as someone who lives nearby and has attended on-campus events every now and again, I would advise Babson to offer Phansey his old job back, and I would advise Phansey to reject it out of hand. An institution so idiotic as to fire an employee for making a joke is unworthy of Phansey’s talents, whatever they might be.

I wrote at the outset that, as a First Amendment issue, this one is not a close call. Viewed in its full context—or, I would argue, in any context whatsoever—Phansey’s Facebook post was very obviously written in jest—an ironic commentary about a serious, if absurd, world event. In categorizing a knock on the Kardashians and the Mall of America as “threatening words and/or actions,” Babson seems to imply that it can’t distinguish humor from sincerity, raising the question of how it ever achieved accreditation in the first place.

More likely, of course, is that Babson was simply intimidated by the bombardment of complaints it apparently received following Phansey’s original post (which he swiftly deleted) and decided it would be easier and more prudent to cave in to the demands of a mob and fire Phansey on the spot, rather than defend Phansey’s right—and, by extension, the right of any faculty member—to comment on news stories in his spare time.

It was a terrible, stupid decision for Babson to make, and its silence in the intervening days has only brought further dishonor upon an otherwise sterling institution of higher learning. While it is undeniably true that a private employer has the discretion to dismiss employees who have violated official policy—the First Amendment’s provisions are explicitly limited to the public sector—the notion that making a mildly off-color remark on one’s own Facebook page constitutes a fireable offense is a horrifying precedent for a college to set, and is among the most egregious examples yet of the general squeamishness on so many American campuses toward the very concept of free expression.

As a cultural flashpoint, I am reminded of the (much larger) brouhaha surrounding Kathy Griffin in 2017, when an Instagram photo of the comedienne hoisting what appeared to be the severed head of President Trump led to Griffin being treated as a national security threat by the Justice Department and effectively banished from society for the better part of a year.

As with Phansey, no honest person could look at Griffin’s gag and say it was anything other than gallows humor—albeit an exceptionally tasteless manifestation thereof. We might agree, on reflection, that Griffin should’ve thought twice before posting an image of a bloodied Trump mask out into the world—to paraphrase Rabbi Mendel, not everything that is thought should be instagrammed—but there is a giant chasm between being a low-rent comedian and being a threat to public safety, and I am very, very worried that our politically correct, nuance-free culture is increasingly unable and/or unwilling to separate one from the other.

In short, we are slowly but surely devolving into a society without a sense of humor, a tendency that—if I may allow myself a moment of hysterical overstatement—is a gateway drug to totalitarianism. A nation without wit is a nation without a soul, and a culture that doesn’t allow its citizens to make jokes without fear of losing their livelihoods is one that has no claim to moral superiority and no right to call itself a democracy. What a shame that our universities, of all places, haven’t quite figured this out yet.

Gray Lady Splits the Baby

Lest you think I am in any way a well-adjusted individual, last Sunday night—when I could’ve tuned in to the season premiere of “Curb Your Enthusiasm”—I found myself spending an hour with “The Weekly” on FX, in which the New York Times editorial board met with seven of the leading Democratic presidential candidates, one by one, as it decided which one to formally endorse. (Two others were interviewed but not included in the show.) In the end, the Times opted for a choose-your-own-adventure approach to field-winnowing, selecting both Elizabeth Warren and Amy Klobuchar as its preferred nominees, leaving it to readers to figure it out from there.

Given both the import and weirdness of the Times’ verdict, this would seem the ideal moment to reflect on the broader question of how much impact endorsements of office seekers actually have in this third decade of the 21st century: whether the recommendations of media outlets—newspapers in particular—directly influence people’s votes and, if so, to what degree.

The premise is sound enough: While ordinary citizens may be too busy or ill-informed to fully understand weighty matters of state and determine which candidates for office are best-equipped to handle them, newspapermen and women devote their lives to exactly that and are presumably experts in their field. Like movie critics, their judgment is theoretically deeper and more informed than yours or mine, and their recommendations—while hardly etched in marble—can serve as a useful exercise in edifying those who wish to be edified.

As to whether this works in practice, the honest answer is that we’ll never know for sure. The act of voting is complicated—the result of a million small considerations congealing into a particular shape at a specific moment in time—and generally not attributable to any one thing. This is especially true for the country’s impressionable swing voters, whose ultimate decision at the ballot box may well be determined by the last TV ad they see or the last tweet they read. And even in the cases where an endorsement does play a major—or merely an ancillary—role, few voters will explicitly tell a pollster, “I voted for Amy Klobuchar because the New York Times told me to.”

Recalling my own voting history in high-stakes races—which, if you count primaries, includes four votes for president, four for governor, five for senator, and two for mayor—I can identify exactly one instance in which a newspaper endorsement actually swayed me from one candidate to the other. It was during the Massachusetts gubernatorial race in 2014, when the Boston Globe—an otherwise left-wing outfit—sided with the Republican, Charlie Baker, over his Democratic opponent, Martha Coakley, on the grounds that Baker, a former healthcare CEO, had proved himself a competent and effective chief executive, while Coakley’s most notable accomplishment was to have lost a U.S. Senate race—in Massachusetts!—to a conservative Republican who wore denim jackets and drove a pickup truck.

Liberal that I am, it would’ve been the default move to vote for Coakley anyway; her Senate loss notwithstanding, she had served two perfectly respectable terms as the state’s attorney general. However, once the Globe made its case for Baker, I felt as if I had been given permission—and cover—to cross the aisle in favor of the guy who I suspected was, in fact, the stronger choice of the two. Had the Globe gone with Coakley, I doubt I would’ve had the nerve.

Of course, this was all predicated on the aforementioned idea that editorial boards are these faceless, all-knowing philosopher kings, smarter and more dispassionate than us mere mortals, endowed with the wisdom of the ages and concerned solely with the well-being of the republic.

Deep down, we know this isn’t entirely true—indeed, one of the delights of “The Weekly” is to see the Times editorial writers in all their quirky, bumbling glory—and I would be remiss not to mention that only two of the 100 largest American newspapers endorsed Donald Trump in 2016, and look how well that went. Undoubtedly, the influence of major publications’ editorial pages has been on the wane for quite some time, and that pattern is likely to continue.

Nonetheless, for those of us who still read the paper every morning and believe a free press is all that stands between the United States and tyranny, news publications will remain a beacon in the search for truth and justice in the world and a bulwark against the corruptions and obfuscations of public men. If their views on presidential candidates don’t come directly from God and no longer count as the proverbial last word on the matter—if, indeed, they ever did—they should nonetheless be taken seriously and with the deference owed to an institution whose core mission—guaranteed by the First Amendment—is to ensure the survival of liberty and freedom in our society, now more than ever.

In the future, though, it would perhaps be most prudent to endorse only one candidate at a time.

Only Words

Bitch, please.

If a certain civic-minded Massachusetts resident has her way, that two-word phrase could someday cost me as much as a speeding ticket—and without the adrenaline rush that comes from barreling down the Mass Pike at 88 miles per hour.

As reported last week in the Boston Globe, back in May a state representative from Boston, Daniel Hunt, submitted a bill in the Massachusetts legislature, titled, “An act regarding the use of offensive words,” which stipulated that a person who “uses the word ‘bitch’ directed at another person to accost, annoy, degrade or demean the other person shall be considered to be a disorderly person” and fined up to $200. The text went on to say that “a violation […] may be reported by the person to whom the offensive language was directed at or by any witness to such incident.”

Commentators across the political spectrum have been having a lot of fun with this story in the days since it broke, pointing out—among other things—that any bill regulating the use of certain words in the public square would be in direct conflict with the First Amendment’s guarantee of free expression and would presumably be ruled unconstitutional by every court in the land.

What’s more, it turns out that the bill in question was conceived and drafted not by Representative Hunt himself, but rather by an unnamed constituent of his, who submitted the proposal through a “direct petition” process that Massachusetts—unique among the 50 states—allows its residents to use. In short, Hunt moved this bill forward because the Massachusetts Constitution required it, not because he necessarily thinks it’s a good idea or expects it ever to come to a vote.

And of course, it’s not a good idea to levy fines on American citizens for uttering four (or five)-letter words, however objectionable they might be, nor does any measurable chunk of the citizenry seem to think otherwise. While “swear jars” may work fine within one’s own home—particularly when small children are nearby—enshrining the concept into law would prove problematic at best and unenforceable at worst. Call me insensitive, but the prospect of adult women running to the nearest cop saying, “That man over there just called me a bitch!” doesn’t seem like the best use of our law enforcement officers’ time.

Having said that—in my capacity as a First Amendment near-absolutist, I might add—I cannot help but admire the feminist sentiment behind the anti-“bitch” bill—in particular, its precision in punishing only those who utter the B-word “to accost, annoy, degrade or demean.” It would not, for instance, target an individual who spontaneously yells “Son of a bitch!” to no one in particular, nor (one imagines) would it proscribe the playing of Elton John’s “The Bitch is Back” or Nathaniel Rateliff’s “S.O.B.” at some public gathering.

In other words, at least this pipe dream of an anti-obscenity law acknowledges the existence of nuance and context in the language we use. It allows—if only implicitly—that so-called “bad words” are not inherently objectionable, but rather are expressions of objectionable ideas under certain, limited circumstances—a distinction George Carlin made on his comedy album Class Clown in 1972 and which the rest of America has been stumbling toward ever since.

I mention this in light of a recent incident at a high school in Madison, Wisconsin, where an African-American security guard was fired for telling a student not to call him the N-word after said student called said security guard the N-word 15 times in rapid succession.

I will repeat that: A teenager spewed the word “nigger” repeatedly at a black security guard, the guard repeated the word in order to explain its inappropriateness, and the guard lost his job.

And we claim to live in an enlightened society.

Happily, the guard in question, Marlon Anderson, has since been rehired following an extremely public backlash. But the “zero tolerance” reasoning behind his firing remains endemic in the American body politic, which insists on removing all complexity from a given situation in favor of lazy, politically correct pablum. As a wise man once said, zero tolerance equals zero thinking.

To be sure, the N-word can be considered a special case—a term that, for obvious historical reasons, need not pass the lips of anyone under any circumstances—and while Anderson’s initial sacking was plainly an overreaction, it was done in the spirit of enforcing a total and justifiable taboo against a word that is inextricably synonymous with the ugliest and most irredeemable facet of the American character.

And yet the N-word has very definitely retained a certain ironic cachet within the African-American community, and as a Privileged White Person, I feel more than a little silly discouraging black people from reclaiming and re-appropriating it in order to blunt its toxic impact, much as the LGBT community has assumed ownership of words like “faggot” and “queer” and—come to think of it—women have with the word “bitch,” among others.

The solution, in short, is to treat each other with due respect as individualized human beings, avoiding hurtful stereotypes whenever possible, while also recognizing that the negative power of offensive words derives not from the literal stringing-together of a series of sounds, but rather from a complicated matrix of ideas and prejudices that will regrettably endure long after the current verbal manifestations of them have been vanquished from the American lexicon.

Ain’t that a bitch?

All That Is Written

“All that is thought should not be said, all that is said should not be written, all that is written should not be published, and all that is published should not be read.”

Those words were coined by a Polish rabbi named Menachem Mendel of Kotzk in the middle of the 19th century.  Surely they have never been more urgently needed than in the United States in 2019.

Just the other day, for instance, the venerable Boston Globe published an online op-ed by Luke O’Neil, a freelance columnist, expressing his rather pointed thoughts about the recently sacked homeland security secretary, Kirstjen Nielsen.  Its throat-clearing opening line:  “One of the biggest regrets of my life is not pissing in Bill Kristol’s salmon.”  (Kristol, you’ll recall, was a leading cheerleader for the Iraq War.)

The rest of the column continued in the same vein, castigating Nielsen for her complicity in President Trump’s policy of separating children from their parents at the Mexican border, and advocating for a general shunning of Nielsen from polite society, up to and including doing unsavory things to her food whenever she turns up at a fancy restaurant.

Following a small uproar among its readers, the Globe subsequently rewrote parts of O’Neil’s piece—cutting out the word “pissing,” among other things—before ultimately removing it from its webpages entirely.  (It never appeared in print in any form.)  All that currently remains of the thing is an editor’s note explaining that the column “did not receive sufficient editorial oversight and did not meet Globe standards,” adding, rather gratuitously, “O’Neil is not on staff.”

Locally, much has been said and written about the Globe’s (lack of) judgment in ever believing an op-ed about tampering with a public official’s dinner—however cheeky—was fit to publish in the first place.  For all of its obvious liberal biases, the Globe opinion page is a fundamentally grown-up, establishmentarian space, suggesting this episode was a bizarre, one-off aberration and nothing more.

The deeper question, however, is what brings an uncommonly thoughtful and clever writer to put such infantile thoughts to paper in the first place.

And I’m not just talking about Luke O’Neil.

Let’s not delude ourselves:  Ever since Secretary Nielsen was hounded from a Mexican restaurant last summer in response to her department’s repugnant immigration policies, every liberal in America has had a moment of silent contemplation about what he or she would do or say to Nielsen given the chance.  That’s to say nothing of her former boss, the president, and innumerable other members of this wretched administration.

Indeed, plumb the deepest, darkest thoughts of your average politically-aware American consumer, and you’re bound to emerge so covered in sludge that you may spend the rest of your life trying to wash it off.

This is why we differentiate thoughts from actions—morally and legally—and why the concept of “thought crime” is so inherently problematic.  Outside of the confessional, no one truly cares what goes on inside your own head so long as it remains there, and most of us have the good sense to understand which thoughts are worth expressing and which are not.

Except when we don’t, and in the age of Trump—with a major assist from social media platforms whose names I needn’t mention—an increasing number of us don’t.

Because it is now possible for any of us to instantaneously broadcast our basest and most uninformed impressions on any subject to the entire world, we have collectively decided—however implicitly—that there needn’t be any filter between one’s mind and one’s keyboard, and that no opinion is more or less valid than any other.  In the Twitterverse, “Let’s expand health insurance coverage” and “Let’s defecate in Kirstjen Nielsen’s salad” carry equal intellectual weight.

As a free speech near-absolutist, I can’t deny the perverse appeal of having no meaningful restrictions on what one can say in the public square.  With political correctness exploding like a cannonball from America’s ideological extremes, it’s heartening to know that reports of the death of the First Amendment have been greatly exaggerated, indeed.

Or it would be—until, say, a newly-elected congresswoman from Minnesota tells a group of supporters, “We’re gonna go in there and we’re gonna impeach the motherfucker,” and suddenly discretion seems very much the better part of valor.

Among the many truisms that life under the Trump regime has clarified is that just because something can be done doesn’t mean it should be done.  And the same is true—or ought to be—of how we express ourselves to the wider world.

I don’t mean to sound like a total prude.  After all, I’m the guy who wrote a column in mid-November 2016 calling the newly-elected president a selfish, narcissistic, vindictive prick, and who tried to cheer my readers up the day after the election by noting that Trump could drop dead on a moment’s notice.

With two-and-a-half years of hindsight, I’m not sure I should’ve written either of those things, not to mention a few other snide clauses and ironic asides here and there ever since.  They weren’t necessary to make my larger points, and like the opening quip in Luke O’Neil’s Globe column, their rank immaturity and meanness only served to cheapen whatever it was I was trying to say.

As someone who claims to be a writer, I try to choose my words carefully and with at least a small degree of charity.  With great power—in this case, the power of words—comes great responsibility.  And that includes leaving Kirstjen Nielsen’s salmon alone.

Eye of the Beholder

Can a piece of art ever exist entirely on its own, or is it always tethered to the context of its creation?

For instance, is it possible to listen to the Ring Cycle without remembering that Richard Wagner was an anti-Semitic prick whose music inspired the rise of Hitler?

Can one watch Manhattan—the story of a 42-year-old man’s love affair with a 17-year-old girl—and not be distracted and/or repulsed by the personal life of its writer, director and star, Woody Allen?

As a society, we’ve had a version of this argument many times before, trying to figure out how to separate the art from the artist, while also debating whether such a thing is even desirable in the first place.  (The answer to both:  “It depends.”)

Lately, however, this perennial question has assumed a racial dimension, compelling us to re-litigate it anew—this time with considerably higher stakes.

Here’s what happened.  Over at New York’s Whitney Museum of American Art, the curators of the institution’s 78th biennial—an exhibition of hundreds of contemporary works by dozens of artists—chose to include Open Casket, a semi-abstract painting that depicts the mutilated corpse of Emmett Till, the 14-year-old African-American boy who was tortured and lynched in Mississippi in 1955 for allegedly whistling at a white girl.  (The woman in question later admitted she made the whole thing up, but that’s another story.)

As a painting, Open Casket is arresting, with the oils so thickly layered that Till’s mangled face literally protrudes from the canvas, as if calling out to us from beyond the grave.  As a political statement, it fits comfortably into our uncomfortable era of police brutality and racial unease—a natural, even obvious, choice for any socially conscious art show in 2017.

There was just one little problem:  The creator of Open Casket is white.  Specifically, a Midwestern white woman living in Brooklyn named Dana Schutz.

Upon hearing that a Caucasian had dared to tackle Emmett Till as the subject for a painting, many patrons demanded the Whitney remove Open Casket from its walls, while condemning Schutz for attempting to profit off of black pain—a practice, they argued, that has defined—and defiled—white culture since before the founding of the republic, and should be discouraged at all costs.  The message, in effect, was that white people should stick to their own history and allow black people to deal with theirs.

In response to this brouhaha, the Whitney defended its inclusion of Schutz’s work without directly addressing the race question, while Schutz herself issued a statement that read, in part, “I don’t know what it is like to be black in America.  But I do know what it is like to be a mother.  Emmett was Mamie Till’s only son.  I thought about the possibility of painting it only after listening to interviews with her.  In her sorrow and rage she wanted her son’s death not just to be her pain but America’s pain.”

In other words:  Far from being exploitative or opportunistic, Open Casket is meant as an act of compassion and empathy toward black America from an artist who views Emmett Till’s death as a tragedy for all Americans—not just black ones.

Of course, that is merely Dana Schutz’s own interpretation of her work, and if history teaches us anything, it’s that the meaning of a given cultural artifact is never limited to what its creator might have intended at the time.  The artist Hannah Black, one of Schutz’s critics, is quite right in observing, “[I]f black people are telling her that the painting has caused unnecessary hurt, she […] must accept the truth of this.”

The real question, then, is whether offensiveness—inadvertent or not—is enough to justify removing a piece of art from public view, as Black and others have advocated in this case.

If, like me, you believe the First Amendment is more or less absolute—that all forms of honest expression are inherently useful in a free society—then the question answers itself.  Short of inciting a riot (and possibly not even then), no art museum should be compelled to censor itself so as not to hurt the feelings of its most sensitive patrons, however justified those feelings might be.  Au contraire:  If a museum isn’t offending somebody—thereby sparking a fruitful conversation—it probably isn’t worth visiting in the first place.

Unfortunately, in the Age of Trump, the American left has decided the First Amendment is negotiable—that its guarantee of free speech can, and should, be suspended whenever the dignity of a vulnerable group is threatened.  That so-called “hate speech” is so inherently destructive—so wounding, so cruel—that it needn’t be protected by the Constitution at all.  As everyone knows, if there was one thing the Founding Fathers could not abide, it was controversy.

What is most disturbing about this liberal drift toward total political correctness is the creative slippery slope it has unleashed—and the abnegation of all nuance and moral perspective that goes with it—of which the Whitney kerfuffle is but the latest example.

See, it would be one thing if Open Casket had been painted by David Duke—that is, if it had been an openly racist provocation by a callous, genocidal lunatic.  But it wasn’t:  It was painted by a mildly entitled white lady from Brooklyn who has a genuine concern for black suffering and wants more Americans to know what happened to Emmett Till.

And yet, in today’s liberal bubble factory, even that is considered too unseemly for public consumption and must be stamped out with all deliberate speed.  Here in 2017, the line of acceptable artistic practice has been moved so far downfield that an artist can only explore the meaning of life within his or her own racial, ethnic or socioeconomic group, because apparently it’s impossible and counterproductive to creatively empathize with anyone with a different background from yours.

By this standard, Kathryn Bigelow should not have directed The Hurt Locker, since, as a woman, she could not possibly appreciate the experience of being a male combat soldier in Iraq.  Nor, for that matter, should Ang Lee have tackled Brokeback Mountain, because what on Earth does a straight Taiwanese man like him know about surreptitious homosexual relationships in the remote hills of Wyoming?  Likewise, light-skinned David Simon evidently had no business creating Treme or The Wire, while Bob Dylan should’ve steered clear of Hattie Carroll and Rubin Carter as characters in two of his most politically-charged songs.

Undoubtedly there are some people who agree with all of the above, and would proscribe any non-minority from using minorities as raw material for his or her creative outlet (and vice versa).

However, if one insists on full-bore racial and ethnic purity when it comes to the arts, one must also reckon with its consequences—namely, the utter negation of most of the greatest art ever created by man (and woman).  As I hope those few recent examples illustrate, this whole theory that only the members of a particular group are qualified to tell the story of that group is a lie.  An attractive, romantic and sensible lie, to be sure—but a lie nonetheless.

The truth—for those with the nerve to face it—is that although America’s many “communities” are ultimately defined by the qualities that separate them from each other—certainly, no one would mistake the black experience for the Jewish experience, or the Chinese experience for the Puerto Rican experience—human nature itself remains remarkably consistent across all known cultural subgroups.  As such, even if an outsider to a particular sect cannot know what it is like to be of that group, the power of empathy is (or can be) strong enough to allow one to know—or at least estimate—how such a thing feels.

As a final example, consider Moonlight—the best movie of 2016, according to me and the Academy (in that order).  A coming-of-age saga told in three parts, Moonlight has been universally lauded as one of the great cinematic depictions of black life in America—and no wonder, since its director, Barry Jenkins, grew up in the same neighborhood as the film’s hero, Chiron, and is, himself, black.

Slightly less commented on—but no less noteworthy—is Moonlight’s masterful meditation on what it’s like to be gay—specifically, to be a gay, male teenager in an environment where heterosexuality and masculinity are one and the same, and where being different—i.e., soft-spoken, sensitive and unsure—can turn you into a marked man overnight, and the only way to save yourself is to pretend—for years on end—to be someone else.

Now, my own gay adolescence was nowhere near as traumatic as Chiron’s—it wasn’t traumatic at all, really—yet I found myself overwhelmed by the horrible verisimilitude of every detail of Chiron’s reckoning with his emerging self.  Here was a portrait of nascent homosexuality that felt more authentic than real life—something that cannot possibly be achieved in film unless the men on both sides of the camera have a deep and intimate understanding of the character they’re developing.

Well, guess what:  They didn’t.  For all the insights Moonlight possesses on this subject, neither Barry Jenkins, the director, nor a single one of the leading actors is gay.  While they may well have drawn from their own brushes with adversity to determine precisely who this young man is—while also receiving a major assist from the film’s (gay) screenwriter, Tarell Alvin McCraney—the finished product is essentially a bold leap of faith as to what the gay experience is actually like.

Jenkins and his actors had no reason—no right, according to some—to pull this off as flawlessly as they did, and yet they did.  How?  Could it be that the condition of being black in this country—of feeling perpetually ill at ease, guarded and slightly out of place in one’s cultural milieu—has a clear, if imprecise, parallel to the condition of being gay, such that to have a deep appreciation of one is to give you a pretty darned good idea of the other?  And, by extension, that to be one form of human being is to be empowered to understand—or attempt to understand—the point of view of another?  And that this just might be a good thing after all?

Character Is Destiny

Donald Trump has been president for all of two weeks, yet already he has proved himself the most brazenly Nixonian person to ever sit in the Oval Office—Richard Nixon included.

How much of a paranoid megalomaniac is our new commander-in-chief?  Well, for starters, it took Nixon a full four-and-a-half years to dismiss his own attorney general for failing to carry out the president’s imperial agenda.  Trump?  He took care of that on Day 11.

There’s a classic saying, “History doesn’t repeat itself—but it rhymes.”  Of course, historians love to draw parallels between the past and the present in any case, but the truth is that some connections are so blindingly obvious that we needn’t even bring experts to the table.  We can do the rhyming ourselves, thank you very much.

At this absurdly premature juncture in the life of the new administration, it has become evident—to the shock of no one—that the Trump White House is destined to most resemble Nixon’s in both form and effect, and there may be no surer means of anticipating this West Wing’s machinations—good and bad, but mostly bad—than through a close study of the one that dissolved, oh-so-ignominiously, on August 9, 1974.

In light of recent events, we might as well begin with the Saturday Night Massacre.

In the fall of 1973, President Nixon was drowning in controversy about his role in the Watergate caper, thanks largely to the efforts of Special Prosecutor Archibald Cox.  Suddenly, on October 20, Nixon decided he had had enough and ordered his attorney general, Elliot Richardson, to fire Cox ASAP.  Having promised to respect Cox’s independence, Richardson refused to comply and promptly resigned, as did his deputy shortly thereafter.

Once the dust settled and Cox was finally sacked by Solicitor General Robert Bork (yes, that Robert Bork), it became clear to every man, woman and child in America that the president of the United States was a crook and a scumbag—albeit a cartoonishly sloppy one—and so began the suddenly-inevitable march to impeachment that would end only with Nixon’s resignation in August of the following year.

What’s the lesson in all of this?  For my money, it’s that if the president feels he cannot do his job without depriving America’s chief law enforcement officer of his, something extraordinarily shady is afoot, and it’s only a matter of time before the public—and Congress—demands some manner of accountability.

Cut to the present day, and the constitutional (and humanitarian) crisis that Donald Trump pointlessly unleashed by banning all Syrian refugees from entering the U.S.—along with immigrants from seven Muslim-majority countries—and then firing Acting Attorney General Sally Yates when she proclaimed the order illegal and instructed the Justice Department to ignore it.

For all that differentiates the Saturday Night Massacre from the Muslim ban and its aftermath, both events reveal a commander-in-chief with an utter, self-defeating contempt for the basic rule of law and for all institutional checks on his authority.  Just as Nixon believed he could sweep Watergate under the rug by canning its lead investigator, so does Trump think he can essentially wipe out an entire religion’s worth of immigrants from the United States by disappearing any Justice Department official who regards the First Amendment as constitutionally binding.

(Notice how Trump justified the firing of Yates by accusing her of “betrayal”—as if the attorney general’s loyalty to the president supersedes her loyalty to the law.)

Of course, the nice thing about the Constitution is that it exists whether or not the president believes in it (as Neil deGrasse Tyson didn’t quite say).  The trouble—as the nation learned so painfully with Nixon—is that justice can take an awfully long time to catch up to the president’s many dogged attempts to dodge it—especially if he has a gang of willing collaborators in Congress.

In the end, the reason Watergate exploded into a full-blown cataclysm was that Richard Nixon was a fundamentally rotten human being—a callous, cynical, friendless sociopath whose every move was calibrated for political gain and without even a passing consideration for the public good.  For all that he spoke about standing up for the common man, when push came to shove the only person he really gave a damn about—the only person he ever lifted a finger to protect—was Richard Nixon.

Does any of this sound familiar?  You bet your sweet bippy it does.  In the frightfully short time he’s been president, Trump has shown a remarkable knack for mimicking every one of Nixon’s faults—his vindictiveness, his contempt for the press, his insecurity, his dishonesty, his propensity for surrounding himself with racists and anti-Semites—while somehow skirting any redeeming qualities that might make his presidency tolerable, despite all of the above.

Indeed, to the extent that Trump is not the absolute spitting image of America’s all-time champion of corruption, he is demonstrably worse.  After all, Nixon was historically literate, intellectually curious and, from his experience as a congressman and vice president, highly knowledgeable about the nuts and bolts of Washington deal making.  He was a scoundrel, but a reasonably competent one with several major accomplishments to his name.

Can we expect Trump to achieve any sort of greatness in the teeth of his many weaknesses?  If these first two weeks are at all predictive of the next four years, I see no reason to think so.  Whereas Nixon was a gifted strategic thinker with a deep sense of history and geopolitics, Trump has over and over again professed a proud and stubborn ignorance of any matter that does not directly involve himself, and seems to derive all his information about a given subject from the last person he spoke to about it.

The Greeks had it right:  Character is destiny, and there’s just no coming back from a veritable avalanche of fatal flaws.  We can pray all we want that the president will suddenly discover the value of temperance, deliberation and any hint of public virtue, but we’d only be denying a truth that has been staring us in the face from the moment Trump announced himself as a figure of national consequence.  He is who he is, he will never get better, and our only hope is that this new national nightmare won’t last quite as long as the last one did.

Love the Bubble

There’s an old story that when Richard Nixon was re-elected president in 1972 by a score of 49 states to one, the legendary New Yorker film critic Pauline Kael remarked, “How could Nixon possibly have won?  Nobody I know voted for him!”

In truth, Kael said nothing of the sort.  Or rather, she said the exact opposite of the above, but because life is one long game of telephone, over time her words have been misinterpreted to within an inch of their life, so that now she comes off as an oblivious, left-wing stooge.  Oh well:  When the legend becomes fact, print the legend.

All the same, those apocryphal words have been bouncing around my head a lot these days, following the even more inexplicable election of an even more inappropriate candidate to that very same high office.  If the gist of Kael’s (fictional) lament is that Americans are so ideologically tribal that we’ve essentially walled ourselves off from those with whom we disagree, I’ve certainly done my part to make matters worse.

Indeed, months before Donald Trump became America’s president-elect, I couldn’t help but marvel at the fact that, so far as I could tell, not a single person I’ve ever known was prepared to cast a vote for him.  Nor, for that matter, was any writer, elected official or celebrity in my intellectual orbit for whom I hold even a modicum of respect—including many conservatives who would normally support the Republican candidate as reflexively as I would support the Democrat.

Is this because, like Pauline Kael, I live inside an elitist, left-wing bubble and spent the entirety of 2016 subconsciously avoiding any views I would rather not hear?  Probably.

Is it also because Donald Trump was the most unserious and morally repugnant presidential candidate in a century, and therefore liable to turn off virtually any honest person who knows a vulgar charlatan when they see one?  Once again:  All signs point to yes.

Because those two things are equally true—not one more than the other—I’ve had real trouble feeling guilty about contributing to America’s increasing divide between Team Red and Team Blue.  I don’t doubt that if I put in more effort to reach out to folks in the heartland and elsewhere who do not share my values, I would likely emerge a fuller, more empathetic human being.  But there is no amount of ideological ecumenicalism that could negate all the terrible things Trump has said and that innumerable supporters of his have done:  He and they are as contemptible today as they’ve ever been—if not worse—and I have no desire to treat their particular views on race, religion and gender as if they are deserving of my respect.

Remember:  One’s politics are not some ingrained, immovable phenomenon like ethnicity or sexual orientation.  They are a choice.  They reflect how you think—as opposed to who you are—and that makes them fair game for the condemnation of others.

Which brings us—improbably enough—to Meryl Streep.

At Sunday’s Golden Globes, Streep chose to accept the Cecil B. DeMille Lifetime Achievement Award by expressing her revulsion toward the president-elect and all that he represents—specifically, his disdain for multiculturalism and a free press, as well as his pathological inability to ever behave like a mature, compassionate adult.  Predictably, the crowd inside the Beverly Hilton went wild, while right-wingers online condemned Streep as an arrogant liberal nut.  And so it goes.

From a close reading of Streep’s remarks, we find that—apart from an unfair crack about mixed martial arts—she didn’t make a single statement that any decent person could possibly disagree with.  Every factual assertion was objectively correct (e.g., Trump is a bully, Hollywood actors have geographically diverse backgrounds), while every value judgment was so basic and obvious that a kindergartner could understand it (e.g., “disrespect invites disrespect, violence incites violence”).

Substantively, there was absolutely nothing controversial in Streep’s comments.  The uproar, then, was entirely a function of Streep’s status as Hollywood royalty—and, thus, a spokeswoman for the cultural left—which led those on the right to denounce her purely out of partisan vindictiveness, just the way congressional Republicans have opposed much of what President Obama has said simply because he was the one saying it.

That, my friends, is the real danger of living in a bubble:  Your ideological bias can become so overpowering that you decide, in advance, that those in the other bubble could never possibly say something true.  And that is the moment at which all good governance—nay, all good citizenship—ends.

I, for one, am entirely comfortable with the fact that, during the next four years, Donald Trump will occasionally say and do things of which I completely approve.  When that happens, I hope I will have the decency and integrity to say so.  All I ask in return is for everyone else—no matter which bubble they call home—to meet me halfway.