Nanny Statecraft

All things being equal, I’d prefer the government not tell me how to live my life.  While I don’t call myself a libertarian—or anything else, for that matter—I agree with libertarianism’s central ethos that, in a free society, one should have the right to do absolutely anything so long as it doesn’t harm anyone else.

The issue, of course, is what exactly we mean by “harm.”  It’s all well and good to advocate for abstract notions like “freedom” and “individual liberty,” but sooner or later we must face the fact that no person is an island and that all of our actions affect those around us, whether we realize it or not.  The First Amendment pointedly doesn’t give a damn about other people’s wellbeing when it comes to free speech, but actual behavior is another matter altogether.

In our present crisis—the COVID-19 pandemic and America’s faltering effort to mitigate its worst impacts—the most interesting and relevant debate concerns whether governors should order residents and out-of-state visitors to take steps beyond masking and physical distancing to keep themselves and others safe from the virus, and—more important still—whether and how those orders should be enforced.

In my home state of Massachusetts, Governor Charlie Baker just last week issued a directive, effective August 1, requiring those entering Massachusetts from most other states either to produce a negative COVID test result from the previous 72 hours or to quarantine for 14 days upon crossing into the Bay State.  Under Baker’s order, those who violate these rules are subject to fines of up to $500 per day.

All of which raises the question:  How on Earth will any of this be enforced, and who will be doing the enforcing?  Will there be border guards on every street and highway crossing in from Rhode Island, Connecticut, New Hampshire, Vermont or New York?  Will state police pull over any car with an out-of-state license plate?  Are hotels responsible for monitoring un-tested, un-quarantined guests?  What about restaurants and public beaches?

For that matter, what if you get tested 72 hours before your trip but—through no fault of your own—the results don’t come back in time?  What if, as a consequence, you decide to scrap your trip entirely and are subject to exorbitant hotel cancellation fees?  Will Massachusetts reimburse you?  Considering how abruptly these new rules have been implemented, it seems only fair that the commonwealth absorb at least some of the cost.

Thus far, the Baker administration has addressed none of these concerns and provided no details about the mechanics of its elaborate visitation policies.  The operating idea, it would seem, is to make the act of entering Massachusetts from a COVID hot spot so onerous and convoluted that potential visitors will conclude that it simply isn’t worth the trouble, thereby keeping the commonwealth’s hospitalization and death rates low. 

At a press conference early in the state’s lockdown, Baker said explicitly, “I don’t believe I can or should order U.S. citizens to be confined to their homes for days on end.”  Ever since, Baker has consistently expressed confidence that, overall, the people of Massachusetts have been—and will continue to be—smart and conscientious about following his administration’s advisories about masks, social distancing and the like, thereby rendering the need for strict oversight moot.

But this attitude only brings us to James Madison’s immortal observation, “If men were angels, no government would be necessary.”  As 200,000 years of trial and error have shown, they are not and, therefore, it is.  Savvy politician that he is, Baker is most certainly aware that, in the dead of summer, a great many travelers will call his bluff, slinking off to Cape Cod and the Islands in spite of the health risks, assuming they won’t run into any trouble with the authorities when they get there.

When—not if—that happens, Baker’s administration will have to decide whether to take its own orders seriously—either by bringing the hammer down on COVID scofflaws without mercy, or by admitting that, like all previous virus-related policies within the commonwealth, the travel restrictions are merely a suggestion, however essential they might be in preventing a new outbreak.

Certainly, the threat of a $500 daily fine is quite the sword of Damocles to hang over the public’s head.  But if word gets around in the coming weeks that no such fines have been levied and the state’s infection rate remains relatively low, there will be precious little to prevent unwitting COVID carriers from flocking here in increasingly large numbers.  This, in turn, may well trigger the next wave of infections we’ve all tried so hard to avert, and by the time the administration responds—presumably through new shutdowns and/or actual sanctions for those not following the rules—it will be too late.

Such is the risk any government takes by being neither clear nor consistent in the enforcement of its own laws.  Amid a pandemic in which thousands of lives hang in the balance, our leaders would do well to suppress their libertarian instincts as best they can, recognizing that if ever there were a time to order their constituents to follow instructions and punish them when they don’t, it would be now.

To be sure, individual liberty is a cornerstone of the American republic—as foundational a national value as representative democracy, a free press and due process of law.  However, the right to pursue one’s happiness does not include willfully and recklessly spreading a deadly virus that has already killed more than 150,000 Americans with no end in sight, and our elected officials have every prerogative—subject to checks by the legislature and the courts—to temporarily limit or restrict our movements in service of preventing future deaths.

Keeping our fellow Americans safe and healthy is a patriotic act, and if it takes punitive measures by our leaders to make us behave responsibly in the middle of a plague, so be it.

The Goalpost

Back in 2009, then-blogger and LGBTQ advocate Andrew Sullivan asserted, “The goal of the gay rights movement should be to cease to exist.”  Last month, after the U.S. Supreme Court ruled that LGBTQ people are protected from employment discrimination by Title VII of the Civil Rights Act of 1964, Sullivan essentially declared victory, writing, “Every single goal the gay-rights movement set out to achieve in my lifetime has now been won.  Gays can marry; we can serve our country openly with pride; we are categorically protected from discrimination in employment and accommodations in every state. […] It’s done.  Finished.  Accomplished.”

As the United States today finds itself grappling with multiple social justice struggles—none more prominent than Black Lives Matter—it is well worth asking whether Sullivan’s perspective is correct and should be applied across the board.

The question is simple:  Should civil rights movements be fought with a particular end goal in mind?  Is their fundamental purpose to achieve a specific and clearly-defined set of legal victories, or is it rather to effect a broad, gradual, ongoing change in the minds of the people?

Ideally, of course, the answer is both, and one presumes that success on one front will naturally lead to progress on the other.  Certainly, the cascade of legal protections afforded to LGBTQ folk over the past decade would’ve been inconceivable without the extraordinary spike in public support on issues like same-sex marriage.  Nor, in all likelihood, would the recent visibility of transgender concerns have been possible without the Supreme Court rulings that effectively put gay and lesbian issues to bed, thereby clearing the field for the next oppressed minority to step up to the plate.

But suppose the two lanes of progress didn’t align so neatly.  Suppose, say, that in 2015 same-sex marriage weren’t all that popular nationally but the Supreme Court legalized it anyway.  In that case, would gay rights agitators still have been justified in considering the issue closed, or would the imperative of winning hearts and minds remain a vital, unfinished order of business?  How far should activists go, how hard should they push, how much should they demand, before concluding that the mission has been accomplished, the war has been won, and now it’s time to pack everything up and get on with the business of living happily ever after?

To be sure, the raisons d’être of some civil rights causes necessarily evolve over time—as they have, for instance, with women’s rights, a cause that 100 years ago was concerned primarily with suffrage but subsequently expanded to include everything from birth control to pay equity to employment discrimination to college sports.  Admittedly, “women’s rights” is an umbrella term applied retroactively to every disparate struggle in history involving women—a fact that only serves to demonstrate just how bottomless the struggle for a sexually equitable society is and will probably always be.

But what about movements that define themselves a bit more narrowly and do not necessarily intend to stick around forever?

In this moment, what about Black Lives Matter?

As America continues to reckon with its 400-year-old problem with institutional racism, what precisely does Black Lives Matter intend to accomplish, and are they things that might actually be achieved in any of our lifetimes?  In other words, is this particular undertaking a means to an end or a way of life?

Case in point:  One of BLM’s central claims is that Black people are disproportionately targeted and mistreated by police departments all across the country, and that the systemic racism that enables this must be addressed, fought and ultimately eliminated.  Supposing we all agree about this, by what metrics should this effort be monitored and assessed? 

Considering that so much of the recent energy for racial justice stems from amateur videos of Black people being needlessly tortured by police—which are presumed, rightly or wrongly, to be representative of a much larger pattern of institutional bias—should we remain unsatisfied until these egregious incidents cease happening altogether, even if the data show they are exceedingly rare overall?  And if so, wouldn’t any new video of such misconduct cast us right back to square one, all but dooming us to a perpetual loop of righteous outrage from now until the end of time?

To what extent are our perceptions of racial progress—or any progress—shaped more by anecdotes than by cold, hard facts?  And what happens when there is a clear conflict between the two?  What should we believe:  Statistics or our own eyes?

Here’s one anecdote for consideration.  In 1831, the legendary Boston abolitionist William Lloyd Garrison founded a newspaper called The Liberator for the express purpose of agitating to end slavery in the United States.  In 1865, when ratification of the Thirteenth Amendment to the Constitution did exactly that, Garrison ceased publication of his newspaper, writing, “The object for which the Liberator was commenced […] having been gloriously consummated, it seems to be especially appropriate to let its existence cover the historic period of the great struggle; leaving what remains to be done to complete the work of emancipation to other instrumentalities (of which I hope to avail myself) under new auspices.”

That, in so many words, is what I would recommend for any honest effort to effect a more perfect union:  To establish a clear objective, to fight like hell to achieve it and, once achieved, to declare victory and move on.  Even as William Lloyd Garrison knew the struggle for racial equality was far from over, he understood there was only so much one man—or one movement—could accomplish without the risk of losing focus or completely jumping the shark.

The goal of Black Lives Matter should be to cease to exist—and to have the wisdom to recognize that fateful moment when it comes. As with toxins like anti-Semitism and misogyny, racism will never be fully eradicated in a free society, no matter how hard we try. But that should not preclude us from reaching a tipping point whereby we can look ourselves in the mirror and conclude that, as far as America is concerned, Black lives have value, after all.

Rowling in the Deep

The finest college commencement address I’ve ever seen in person was delivered by J.K. Rowling at Harvard in 2008.  While I have no personal connection to America’s oldest university, my dad—a proud member of the Class of 1974—managed to snag us tickets to the annual spring exercises, knowing they would be headlined by the most beloved novelist of the past several decades, and she did not disappoint.

What so charmed me about Rowling’s remarks—which can be viewed on YouTube and have also been published in book form—was their near-total lack of references to Harry Potter.  Confident that her singular literary contribution to society could speak for itself, Rowling devoted the balance of her time to her personal story of failure and struggle, as if to remind Harvard’s privileged graduating seniors that not everything in life is served on a silver platter.  Particularly not if you’re a working-class single mom navigating an industry dominated by wealthy, patriarchal men.

Smash cut to 2020, and Rowling has suddenly become Voldemort in the Anglo-American culture wars, castigated by many of her most loyal admirers—both online and in real life—for sustained and unforgivable moral transgressions that necessitate her expulsion from polite society unless and until she repents and begs forgiveness from the cultural powers that be.

Rowling’s crime—for those who missed it—is to have asserted on Twitter (where else?) that people born with female reproductive organs are, in fact, biologically female.

The horror.

Had Rowling written such a thing, say, two or three years ago, it likely would have gone unremarked upon, possibly owing to the fact that it’s an objectively true statement borne out by basic common sense and 200,000 years of scientific observation (give or take a few millennia).

What made the comment “problematic” here in 2020 was its implied diss to the transgender community, whose entire existence is grounded in the proposition that gender transcends biology—that one’s sex at birth need not align with the gender with which that person identifies once they are old enough to think for themselves.

Substantively, this issue largely hinges on the distinction between sex and gender, and the fact that many people treat the two terms interchangeably when they shouldn’t—which, ironically, prevents them from realizing that both sides in this debate are right:  Rowling is right about the immutable biological nature of sex, while the transgender community is right that gender is the result of a complex and often painful journey of self-discovery—a fact that Rowling herself has acknowledged multiple times.

In a mature society, any disagreements on this subject—either of fact or opinion—would be hashed out on their merits in the spirit of open debate and the free exchange of ideas.  If, indeed, Rowling were mistaken in her characterization of the nature of womanhood in the 21st century, she would be presented with a reasoned counterargument that she could accept or reject, and the conversation would continue from there.

In the society we actually inhabit—the one played out in the Twitterverse and adjudicated by whoever possesses the greatest mixture of self-righteousness and spare time—Rowling was simply called “transphobic,” and that was the end of that.

Or rather, it wasn’t, since Rowling chose to stand her ground and further elucidate her views on the matter, yielding additional ad hominem non-arguments from her antagonists, including the charge of being a “TERF,” or trans-exclusionary radical feminist, a slur that at least has the virtue of being somewhat specific.

Amidst all this internet squabbling, my question is as follows:  If the LGBTQ+ community has decided that Public Enemy No. 1 in the fight for transgender rights is J.K. Rowling, is it possible the whole movement has ever-so-slightly lost focus and jumped the proverbial shark?  What’s more, if the only rejoinder to Rowling’s commentary on gender issues is to call her names and treat her like a toxic substance that must be disposed of with all deliberate speed, why should anyone take the gender equity cause seriously in the first place?

Call me old-fashioned, but I’ve long been taught that, in a country founded on pluralism and free speech, the best response to a bad argument is a better argument.  If your first and only instinct when faced with criticism of your identity or ideas is to attack the character of your opponent, could it be because you have no counterargument to offer them?  If you’re so sure you’re right and they’re wrong, why not prove it with reason and logic, rather than channeling Donald Trump and resorting to petulant schoolyard taunts?

Given the extraordinary courage and nerve of the rising generation of transgender people, who must find their way in the world while enduring the very real—and very cruel—hostility of the Trump administration toward their rights and dignity as human beings—not to mention the physical violence to which they are subjected on an alarmingly disproportionate scale—I refuse to believe this same group could be so weak and brittle in the face of snarky tweets by a woman who, in every other respect, is regarded as a leading voice for the world’s underdogs. What a shame it would be for the folks so heroically teaching the nation to be respectful and tolerant to end up being so disrespectful and intolerant themselves.

I trust this disconnect shan’t last much longer. That it will vanish soon enough. You know: Like magic.

Not All Statues

Like any civic-minded American, I spent a portion of my Fourth of July weekend basking in silent, awed reverence toward a man who owned more than 200 human beings—including six of his own children—and who believed to his dying day that Black people are biologically inferior to white people.

The reason for my reverence, of course, is that in June 1776, that same man, Thomas Jefferson, sat at a small desk in Philadelphia and wrote the 35 words—beginning with “We hold these truths to be self-evident”—that formed the entire foundation for the country he was helping to create. 

These were the same words that allowed another noted politician—some four score and seven years down the road—to ensure, by proclamation and later by constitutional amendment, that no person would be enslaved on American soil ever again, and which, a century after that, provided righteous fodder for a famous minister to intone that “all men are created equal” means exactly that:  “All” men, regardless of race, color or creed.  That Jefferson, by his actions, clearly meant otherwise at the time ultimately doesn’t matter:  His declaration speaks for itself.

In a typical year, these historical tidbits would serve as mere trivia—basic facts that one hopes are still being taught in every middle school social studies class in this country.

Here in 2020—a year not typical by any conceivable metric—we are in the middle of a debate about whether the likes of Jefferson, George Washington and other Great Men of History should be “cancelled” from our public squares for their crimes against humanity—most of all, the crime of slavery.  As a consequence, all our old assumptions about the nation’s founders—as well as latter-day figures such as Andrew Jackson and Robert E. Lee—have been cast to the winds amid demands that statues and monuments to these dead white men be torn down once and for all and that the teaching of U.S. history be reformed from the inside out to accommodate the non-white, non-male perspectives that are so often left out of the narrative.

The premise of this argument is that, by holding large numbers of slaves and lifting not a finger to end the practice in their lifetimes, Washington, Jefferson, et al., were little more than racist, barbaric perpetuators of white supremacy who have nothing useful to teach us today and deserve none of the deification and respect we have given them for the past 244 years.

While this view of the founding period might satisfy our need for moral rectitude and is a convenient framing device for movements such as Black Lives Matter, it is also ahistorical and wholly ignorant of the debt we owe to the past, whether we like it or not.

Certainly, it would feel empowering to simply write off every historical figure who ever held repugnant views and/or committed repugnant acts—at least according to the mores of the present day.  Indeed, in the age of Black Lives Matter and woke-ism writ large, defending the honor of patriarchal, aristocratic slavers has never been less fashionable or politically correct.

Nonetheless, as an amateur historian, I am compelled to do exactly that.

With regard to Washington and Jefferson in particular, the question isn’t whether they were white supremacists (they were) or whether their intimate involvement in the slave economy was wrong (it was, and they knew it).  The question, rather, is whether the United States would exist in anything close to its current form without their efforts as revolutionaries and, later, as symbols and elder statesmen.

Answer:  It wouldn’t.

As I suggested earlier, the only reason we can express ourselves freely today—the reason we live in a democracy where thousands can demonstrate peaceably in the streets—is Thomas Jefferson, author not only of “life, liberty and the pursuit of happiness” but also of the Virginia Statute for Religious Freedom, the linguistic and ideological forerunner to the First Amendment to the Constitution. (The latter, of course, was written by yet another slaveowner, James Madison.)

What’s more, the only reason Donald Trump is a president and not a king is that, as president, George Washington refused to exercise more power than Congress and the Constitution granted him, thereby setting a template that has been followed dutifully by every one of his successors.  More elemental still was Washington’s earlier role as commander-in-chief of the Continental Army, without whose victory against Great Britain the words of the Declaration of Independence would’ve been functionally meaningless, and an independent United States would’ve remained an impossible dream.

These truths matter and, like history itself, cannot simply be papered over or wished away.

Just as each of us owes a baseline debt to our parents for literally bringing us into the world—no matter how imperfectly they might’ve treated us thereafter—so, too, does the nation itself owe both its existence and durability to the seeds planted by the men toward whom we now bear a self-righteous moral grudge.  One needn’t defend or excuse their spectacular shortcomings—seen by many at the time as glaring, odious and howlingly hypocritical—in order to acknowledge their singular achievement in creating the most successful democratic republic in the history of planet Earth.

Yes, it would’ve been nice if those men had bothered to live up to the very ideals they fought for and espoused, rather than forcing the issue on subsequent generations, thereby extending the long tentacles of white supremacy (and related toxins) into the next century and beyond.  Generally speaking, it would be nice if all—or, indeed, any—of our national heroes were one-tenth as upstanding and ethically consistent as our shrines to them would suggest.

The plain truth is that if we insist on erecting monuments only to those without any flaws—or, at any rate, without complexities, contradictions or personal peccadilloes—America will contain no monuments at all.  Show me a person who has never sinned and I’ll show you an infant who has yet to leave the maternity ward.

People are complicated and history is messy.  This is true whether or not we admit it, so we might as well admit it.  That the world’s most dynamic and enduring democracy was founded by slaveowners is an irony we should confront and embrace, rather than deny or reject.  Like F. Scott Fitzgerald—and unlike so many in today’s all-or-nothing social justice movements—I believe the human mind is mature and sophisticated enough to hold two opposing ideas simultaneously without going completely mad.

What truly worries me is how little the emerging generation seems to know about where their freedoms come from in the first place; how little they have been taught about the history and civic structure of their country; and how little they understand about why it took so damn long for “all men are created equal” to be transformed from a mere declaration into something close to a reality.

As Jefferson himself once wrote, “If a nation expects to be ignorant and free […] it expects what never was and never will be”—a statement that is, dare I say, self-evident.

The 94-Year-Old Man

What does it say that when the news broke last week that Carl Reiner had died at age 98, my first and only thought was, “How’s Mel doing?”

It says quite a lot.

Mel, of course, is Mel Brooks, the 94-year-old elder statesman of American comedy, who formed both a professional and personal kinship with Reiner in 1950 on the set of Sid Caesar’s “Your Show of Shows,” and the two men remained close for the next seven decades—a relationship that outlasted both of their marriages (Reiner’s wife, Estelle, died in 2008; Brooks’s wife, Anne Bancroft, died in 2005) and would surely be included in any pantheon of great celebrity friendships.

Onstage, their most enduring partnership was unquestionably “The 2,000-Year-Old Man,” their recurring standup bit with Brooks as the titular über-senior citizen and Reiner as the unflappable reporter who draws him out.  Offstage, they became known—thanks, in part, to a priceless episode of Jerry Seinfeld’s “Comedians in Cars Getting Coffee”—for the dinners they would have at each other’s homes on a near-daily basis, which they enjoyed on plastic tray tables in the living room while watching “Jeopardy!” or some old movie.  While Seinfeld offered them such delicacies as corned beef and brisket, Reiner was reportedly never happier than with a good old-fashioned hot dog.

In truth, my exposure to Carl Reiner is relatively limited:  I have never watched, say, “The Dick Van Dyke Show,” or any of the Steve Martin movies Reiner directed in the 1970s and 1980s.  (His son, Rob, is another story.)

Brooks, by contrast, looms in my mind as very close to a deity.  As a movie director, his work proved as foundational to my budding interest in cinema as any:  I watched (and re-watched) “Young Frankenstein” long before stumbling upon the “Frankenstein” classics from the 1930s, “Blazing Saddles” before discovering the Westerns of the 1940s and 1950s, and “High Anxiety” before delving endlessly into the collected works of Alfred Hitchcock.  That my ignorance of the source material had no adverse impact on my enjoyment of Brooks’s parodies is a testament both to Brooks’s writing and the zany genius of his actors, including such comic luminaries as Gene Wilder, Madeline Kahn, Cloris Leachman and Kenneth Mars.

Culturally, Brooks’s crowning achievement may well be “The Producers”—both the 1968 film and the 2001 Broadway musical—which told the story of a financially desperate theatrical shyster who plots to swindle millions from his investors by way of a maximally appalling production called “Springtime for Hitler,” a scheme that goes haywire when—surprise!—the show becomes a smash hit.  I’ve had few experiences of live performance to match seeing Nathan Lane and Matthew Broderick devouring the scenery at the St. James Theatre in New York during a brief reunion of the show’s original cast in early 2004.  While Brooks himself didn’t appear onstage, he played a leading role in a 90-minute PBS program about the recording of the cast album, which saw him at his high-octane, free-associating best.

Increasingly, Mel Brooks is to me what Ruth Bader Ginsburg is to most liberals:  My own mental equilibrium is dependent upon his being alive and in relatively good health.  In a way, this makes little sense, as Brooks hasn’t directed a new movie since 1995 and his more recent credits—mostly voiceover work—have flown straight under my radar.

On the other hand, it was less than four years ago that I actually saw the great man in person (albeit from a distance) when he came to Boston’s Wang Theatre to screen and reminisce about “Blazing Saddles”—an evening made more poignant by the fact that Gene Wilder had died just a few weeks earlier.

Now that Carl Reiner, too, has shuffled off to the great beyond, I worry—nay, dread—that Brooks will soon follow.  Just as couples who’ve been married since World War II tend to synchronize their deaths to within days, hours or minutes of each other, it would make a certain cosmic sense for these platonic partners-in-comedy to depart Earth in rapid succession.

I hope I’m wrong, of course, as I’m not sure a world without Mel Brooks is one I’d want to live in.  While some figures in popular culture seem to be truly immortal—looking at you, Queen Elizabeth and Betty White—the specter of death hangs over us all, causing us, in our better moments, to appreciate those we have for as long as we have them.

Speaking of appreciation, at this juncture I feel duty-bound to note that the Kennedy Center in Washington, D.C., has been handing out the Mark Twain Prize for American Humor annually since 1998—including to Carl Reiner in 2000—but that not one of those honors has ever gone to Mel Brooks.  While Brooks has certainly amassed his share of recognition through the decades (including by the Kennedy Center in 2009), it does seem a bit odd that the nation’s most prestigious lifetime achievement award for comedy—one that has already been bestowed on the likes of Will Ferrell and Tina Fey—has somehow eluded the man who all but invented an entire genre of film humor and is widely beloved because of it.

While it is generally true that taste in comedy is subjective, Gene Siskel was correct in noting, “There is a point when a personal opinion shades off into an error of fact.”  It’s high time this particular error were corrected once and for all.  Lest we forget, the Twain Prize can only be given to a living person, and Mel Brooks ain’t getting any younger.