The 94-Year-Old Man

What does it say that when the news broke last week that Carl Reiner had died at age 98, my first and only thought was, “How’s Mel doing?”

It says quite a lot.

Mel, of course, is Mel Brooks, the 94-year-old elder statesman of American comedy, who formed both a professional and personal kinship with Reiner in 1950 on the set of Sid Caesar’s “Your Show of Shows,” and the two men remained close for the next seven decades—a relationship that outlasted both of their marriages (Reiner’s wife, Estelle, died in 2008; Brooks’s wife, Anne Bancroft, died in 2005) and would surely be included in any pantheon of great celebrity friendships.

Onstage, their most enduring partnership was unquestionably “The 2,000-Year-Old Man,” their recurring standup bit with Brooks as the titular über-senior citizen and Reiner as the unflappable reporter who draws him out.  Offstage, they became known—thanks, in part, to a priceless episode of Jerry Seinfeld’s “Comedians in Cars Getting Coffee”—for the dinners they would have at each other’s homes on a near-daily basis, which they enjoyed on plastic tray tables in the living room while watching “Jeopardy!” or some old movie.  While Seinfeld offered them such delicacies as corned beef and brisket, Reiner was reportedly never happier than with a good old-fashioned hot dog.

In truth, my exposure to Carl Reiner is relatively limited:  I have never watched, say, “The Dick Van Dyke Show,” nor any of the Steve Martin movies Reiner directed in the 1970s and 1980s.  (His son, Rob, is another story.)

Brooks, by contrast, looms in my mind as very close to a deity.  As a movie director, his work proved as foundational to my budding interest in cinema as any:  I watched (and re-watched) “Young Frankenstein” long before stumbling upon the “Frankenstein” classics from the 1930s, “Blazing Saddles” before discovering the Westerns of the 1940s and 1950s, and “High Anxiety” before delving endlessly into the collected works of Alfred Hitchcock.  That my ignorance of the source material had no adverse impact on my enjoyment of Brooks’s parodies is a testament both to Brooks’s writing and the zany genius of his actors, including such comic luminaries as Gene Wilder, Madeline Kahn, Cloris Leachman and Kenneth Mars.

Culturally, Brooks’s crowning achievement may well be “The Producers”—both the 1968 film and the 2001 Broadway musical—which told the story of a financially desperate theatrical shyster who plots to swindle millions from his investors by way of a maximally appalling production called “Springtime for Hitler,” a scheme that goes haywire when—surprise!—the show becomes a smash hit.  I’ve had few experiences of live performance to match seeing Nathan Lane and Matthew Broderick devouring the scenery at the St. James Theatre in New York during a brief reunion of the show’s original cast in early 2004.  While Brooks himself didn’t appear onstage, he played a leading role in a 90-minute PBS program about the recording of the cast album, which saw him at his high-octane, free-associating best.

Increasingly, Mel Brooks is to me what Ruth Bader Ginsburg is to most liberals:  My own mental equilibrium is dependent upon his being alive and in relatively good health.  In a way, this makes little sense, as Brooks hasn’t directed a new movie since 1995 and his more recent credits—mostly voiceover work—have flown straight under my radar.

On the other hand, it was less than four years ago that I actually saw the great man in person (albeit from a distance) when he came to Boston’s Wang Theatre to screen and reminisce about “Blazing Saddles”—an evening made more poignant by the fact that Gene Wilder had died just a few weeks earlier.

Now that Carl Reiner, too, has shuffled off to the great beyond, I worry—nay, dread—that Brooks will soon follow.  Just as couples who’ve been married since World War II tend to synchronize their deaths to within days, hours or minutes of each other, it would make a certain cosmic sense for these platonic partners-in-comedy to depart Earth in rapid succession.

I hope I’m wrong, of course, as I’m not sure a world without Mel Brooks is one I’d want to live in.  While some figures in popular culture seem to be truly immortal—looking at you, Queen Elizabeth and Betty White—the specter of death hangs over us all, causing us, in our better moments, to appreciate those we have for as long as we have them.

Speaking of appreciation, at this juncture I feel duty-bound to note that the Kennedy Center in Washington, D.C., has been handing out the Mark Twain Prize for American Humor annually since 1998—including to Carl Reiner in 2000—but that not one of its honors has ever gone to Mel Brooks.  While Brooks has certainly amassed his share of recognition through the decades (including by the Kennedy Center in 2009), it does seem a bit odd that the nation’s most prestigious lifetime achievement award for comedy—one that has already been bestowed on the likes of Will Ferrell and Tina Fey—has somehow eluded the man who all but invented an entire genre of film humor and is widely beloved because of it.

While it is generally true that taste in comedy is subjective, Gene Siskel was correct in noting, “There is a point when a personal opinion shades off into an error of fact.”  It’s high time this particular error were corrected once and for all.  Lest we forget, the Twain Prize can only be given to a living person, and Mel Brooks ain’t getting any younger.

The Less We Know

I spent my last post on this page rhapsodizing about my family’s recent weeklong trip to Martha’s Vineyard, which we enjoyed immensely despite the ongoing fears of COVID-19 and the cumbersome restrictions enacted to fight it. 

The question I did not address is whether we should’ve traveled there in the first place.  Whether it was irresponsible for us, in the middle of a pandemic, to embark upon an utterly indulgent and non-essential vacation to a small island populated by a sizeable number of old people and drunk college students on summer break.  Whether we were, in effect, part of the problem.

Our official defense—which has the added value of being true—is that while on the Vineyard, as on the mainland, we didn’t go anywhere in public without face coverings on our persons, which we slipped dutifully onto our faces the moment we got anywhere close to another human being, be it on the street, the ferry, the bike path or the hiking trail.  What’s more, we observed all the local social distancing regulations to the best of our abilities—personally, I’ve spent years turning people-avoidance into a fine art, so it wasn’t a major change in routine—and infused as much cash into the island’s economy as our modest budget allowed.

As far as we’re concerned, we were model tourists in the time of coronavirus, and should there be a spike in positive cases somewhere between Edgartown and Aquinnah over the next few weeks, we’re pretty sure it won’t be our fault.

That said, wasn’t it only a few weeks ago that we New Englanders were chastising all those spring breakers in Missouri for having the temerity to enjoy the beautiful spring weather in a public pool at Lake of the Ozarks, as if nothing out of the ordinary were happening?  Did we not spend the entire months of April and May ridiculing the state of Florida for keeping its beaches open, despite the obvious epidemiological dangers therein?

After all that highfalutin judginess, does it not reek of hypocrisy for us to turn around and retreat to our own personal island paradise, engaging in much the same behavior we had just finished condemning in others?  Even as we were taking commonsense precautions that the folks in Missouri and Florida plainly were not, were we not nonetheless taking an unnecessary risk and setting a poor example for our fellow Bay Staters at a moment when the battle against COVID-19 has not yet been won?

To be fair, the COVID infection rate in Massachusetts has been trending downward for quite a while now—so much so that even our maddeningly cautious governor, Charlie Baker, has given many previously forbidden leisure activities the green light—with the key metrics down 80-90 percent from their peak in mid-April.  By all outward appearances, the odds of catching the virus in my home state are as low as they’ve ever been, and getting lower as we speak.

But what if the numbers are wrong?  Or rather, what if the true nature of the virus is not reflected in the official tallies?  Knowing, as we do, that the effects of COVID-19 often take days, if not weeks, to present themselves in its victims, are we so sure that shopping on Circuit Ave or brunching at the Black Dog Tavern today will not have dire consequences two or three weeks hence?  And if that burst of social intimacy does, indeed, produce a new wave of sickness and death, what will our excuse be?  “The numbers were looking good”?  Wasn’t that the case in Miami Beach two months ago?  What makes us so special?

As Donald Rumsfeld used to say, there are known knowns, there are known unknowns, and then there are unknown unknowns.  When it comes to COVID-19, we seem to have a fair amount of all three.  For all we’ve learned about how the virus spreads and what it does to its hosts, we have also seen many of our assumptions proved wrong through a combination of incompetence, miscommunication and good old trial and error.  While we’d undoubtedly like to think we finally have this deadly riddle figured out—Outside is better than inside!  Six feet is safe!  Masks definitely work!—we still need to account for the unknown unknowns, i.e., all the horrible ways the virus will surprise us in the future without any warning whatsoever.

All of which is to say that we should be humble about what we think we know about how best to continue fighting the worst epidemiological challenge in America since 1919, and not be any more judgmental toward our supposedly less-enlightened fellow citizens than strictly necessary.  We won’t know for sure whether the actions we’ve taken over the last three months—individually and collectively—have been responsible or reckless until the consequences have fully played out.

In the meantime, all we can reasonably do is continue heeding the advice of those who are probably more knowledgeable on these matters than we are, and to exercise the same sort of common sense that protects us from burning our hands on the kitchen stove or following our car’s GPS into a lake.

And if that requires putting off a treasured vacation until further notice—well, do as I say, not as I do.

The First Dance

On Wednesday, March 11, our family dined at a local Italian restaurant to celebrate my dad’s birthday.  By the time we got home that evening, most of America had been shut down until further notice because of the looming coronavirus pandemic.  Consequently, other than takeout, we didn’t patronize another restaurant for the next 13-plus weeks.

Then we hopped a ferry to Martha’s Vineyard on June 15, and between then and our departure on June 22, we ate out a total of 15 times, in a combination of cafés, diners, seafood shacks, gastropubs and creameries; on porches, patios, picnic tables and lawn chairs; under a tent, on the sidewalk, among the flowers and along the beach.

It was one of the most glorious eight-day stretches of my life.  As Joni Mitchell didn’t quite say, you don’t know what you’ve got till it’s gone for three months and then suddenly comes back with full table service and a side of fries.

I should mention that my family has spent a few days on Martha’s Vineyard—the tony island seven miles off Cape Cod—virtually every summer since before I was born.  (That’s right, my first visit there was in utero.)  As always, this year’s sojourn was planned many months in advance—those hotel and ferry reservations aren’t going to book themselves—without the knowledge that, when the blessed moment arrived, all of planet Earth would be engulfed in an ongoing public health emergency involving a deadly and highly infectious disease, rendering virtually all travel inadvisable and virtually all commerce moribund.

Indeed, as we hunkered down in our COVID bunker in mid-March, abstaining from all non-essential human contact while awaiting further instruction from the authorities, the notion that we might not make it to our island paradise this year was never far from mind—along with the assumption that, should we somehow get there on schedule and in one piece, there would be precious little to do once we arrived.  Lazing in an Adirondack chair reading Elin Hilderbrand for 16 straight hours is all well and good, but sooner or later you want some soft serve and tequila.  You know, just to break up the monotony.

As it turned out, our timing was fortuitous, if not outright providential.  Per the guidance of Governor Charlie Baker, our entire stay coincided with the rollout of Phase 2 of the state’s reopening plan, whereby all restaurants and retail establishments could resume operations immediately, if they so chose.  On the Vineyard—a New England tourist mecca whose entire economy is dependent upon its summer crowds—that decision was an existential no-brainer.

By the time we turned up, the overwhelming majority of such businesses had their doors wide open to the public, while the few that were still closed made it clear their return was imminent.  As our budget allowed, we took as full advantage of this dynamic as we possibly could, eating and shopping our way through the week to our hearts’ desire, more or less as we always do.

To be sure, the environment in which these activities occurred wasn’t anything close to normal or routine.  In accordance with Massachusetts COVID guidelines, most of our dine-in meals were ordered with paper or online menus, served on paper plates with plastic silverware—courtesy of wait staff clad in masks and gloves—and sometimes delivered in large brown takeout bags, even though we were consuming them on-site.  All retail shops required face coverings and enforced strict capacity limits, which occasionally meant waiting outside for 5-10 minutes just to purchase a cheap t-shirt—or decide not to after a round of fruitless browsing.

That’s to say nothing of the many elements of the pre-COVID universe that remain verboten until further notice.  Unlike so many previous Vineyard jaunts, this year there were no movies to see.  No tours to sign up for.  No concerts or plays to attend.  The island’s recently refurbished museum was bolted shut, as were the Flying Horses carousel and (most distressing of all) a great many public restrooms.

And yet, for all the inconveniences and constraints imposed by a once-in-a-century pandemic, we enjoyed ourselves about as much as we ever have, and for reasons no more complicated than that we had endured more than three months locked inside our own heads—unable to enjoy all the commercial and cultural pleasures we had taken as our national birthright—and were now given a momentary reprieve. 

In the parlance of Tomas Pueyo’s influential Medium post, “The Hammer and the Dance,” Massachusetts has flattened the COVID curve through extraordinary public policy measures (i.e., the hammer) and now—bit by bit and with the utmost caution—it’s time to dance.

For me, the most revelatory development of last week was discovering—much to my surprise—that the Vineyard’s many modest-sized art galleries were among the institutions that had risen from the dead, proudly displaying and selling their wares to a public hungry to consume them once again.  Stumbling into one on Water Street in Edgartown on our first day, I realized—with unexpected emotional force—just how much I had missed gazing at pretty pictures in an intimate setting, as I had done on a regular basis before the world shut down, but not at all since a visit to Boston’s Isabella Stewart Gardner Museum on February 29.

At the venerable and quirky Field Gallery in West Tisbury, I found myself locking eyes with seemingly random works—a black and white photograph of a dock, an oil-and-glitter print of a giraffe with a rainbow dangling from its mouth—and being unable to look away, my feet superglued to the floor, savoring every last inch of canvas, knowing that—depending on how the coming weeks and months unfold—it could be a very long time before I get this opportunity again.

In another gallery, on Main Street in Vineyard Haven, I had a brief chat with the owner—at the time, the only other person in the room—who bemoaned the maddening lack of clarity she had received from the state about how and when she could re-start her business.  As I surveyed the numinous landscapes and pastel portraits adorning the walls—the ones she had managed to install in the few days since re-opening her doors—she asked me if I was looking for any artist in particular.  I responded, “No, I’m just puttering around.”  Sighing through her mask, she said, “It feels good to putter, doesn’t it?”

It does, indeed.

The End of History

To the best of my recollection, I have seen “Gone With the Wind” in its entirety exactly once in my life.  While I admired its scope, its high drama and of course its immortal performances by the likes of Vivien Leigh, Clark Gable and the Oscar-winning Hattie McDaniel, I have never been particularly compelled to watch it a second time.  If I wanted to devote four hours of my life to any single piece of cinema, I’d just as soon re-watch “The Irishman,” followed by 30 minutes of thoughtful contemplation.

Nonetheless, should the urge to revisit the 1939 blockbuster ever set in—for whatever reason—I would hope to be able to access the classic film with the ease that the Streaming Revolution has wrought for virtually every work of visual art ever created since the dawn of time.

As of this week, however, that will not necessarily be the case, following the announcement by HBO Max that it will temporarily purge “Gone With the Wind” from its library, on the grounds that the movie presents an essentially rose-colored view of slavery in the antebellum South, which might implant the wrong ideas about 19th century race relations in the minds of unsuspecting viewers.

The context of this decision is clear enough:  Amidst nationwide protests against the institutional racism that caused the murders of George Floyd, Breonna Taylor, Ahmaud Arbery, et al., no streaming service wants to be associated with material that puts it on the wrong side of history.

The problem is that, in the world of film, “Gone With the Wind” is history—an essential artifact of erstwhile American values alongside “The Birth of a Nation,” “Song of the South” and numerous other now-execrable expressions of what the United States used to produce and represent.  While we now generally view these movies with cringing, eye-rolling horror—if, indeed, we watch them at all—their mere existence serves as a critical and indispensable reminder of what our country once stood for, and of how far we have (or have not) progressed ever since.

Hollywood’s adaptation of Margaret Mitchell’s best-selling novel may strike us today as irredeemably racist.  However, adjusted for inflation, it is also the highest-grossing film ever made in the United States.  Rather than simply stowing the movie away in the national attic—never to be seen or heard from again—would it not be more fruitful, more educational and ultimately more enlightening to attempt to reconcile those twin realities and hopefully arrive at some greater truth, however unpleasant it might be?

I say yes.  As with all “problematic” cultural touchstones, I say we should keep everything out in the open for all to see, and to engage in the messy, uncomfortable conversations we claim to value and desire in this supposedly free-thinking society.  Arguing—as many do—that some works of art are simply too toxic to exist is to deny each of us the opportunity to make that decision for ourselves—and, more importantly, to more fully reckon with our collective past, rather than sweeping it under the rug, as if that would solve anything. (In the “Gone With the Wind” case, HBO has said, to its credit, that it will eventually re-install the movie with “a discussion of its historical context.”)

What most animates my resistance to this censorious approach to the past—and, more broadly, to the “cancel culture” movement that believes in purging society of all forms of political incorrectness—is the dangerous question of who gets to decide which art is appropriate for public consumption and which is unsuitable, offensive or otherwise beyond the pale.  In a nation that supposedly holds the freedom of expression sacrosanct—believing that the Ku Klux Klan has the same right to the public square as the Lions Club or Black Lives Matter—the notion of appointing an individual (or group of individuals) to determine what we can all watch and what we can’t ought to strike us as fundamentally repugnant.  Indeed, even if 99 percent of us agree a particular film, book, etc., has worn out its welcome, what right do we have to deny the other 1 percent the opportunity to make themselves look ridiculous?

We have no such right, nor should we.  If the First Amendment means anything, it’s that repulsive speech—such as a movie that effectively glorifies slavery—deserves just as much protection from censorship as speech that is wholesome and uncontroversial.  If we only cared about the latter, we wouldn’t really need a First Amendment in the first place, would we?

To put it more bluntly still:  If the state of the American culture is such that the film that (to repeat) has sold more tickets than any other film in history is vulnerable to being disappeared in the interest of good taste, what does that portend for every other flawed classic (and non-classic) in the canon?

You may not give a damn about “Gone With the Wind”—frankly, I don’t—but what happens when your own politically incorrect favorites are on the chopping block?  When “Blazing Saddles” is no longer deemed satire?  When the appropriation of “Johnny B. Goode” makes “Back to the Future” anathema to enlightened eyes and ears?  When 50 years from now, after we’ve finally decided that killing animals for food is unconscionable and barbaric, it is no longer kosher to watch every 2020 release that depicts someone nonchalantly eating a cheeseburger?

Values and mores change and evolve over time, and it is morally disingenuous to impose today’s truth on yesterday’s fiction. To erase the past is to doom the future.

As for the very real problem of those viewers who cannot tell the difference between film history and real history:  Well, that’s what education is for.

The Wrong Thing

While there has never really been a bad time to re-watch “Do the Right Thing,” you’d be hard-pressed to find a better one than now.

Spike Lee’s 1989 film—arguably the finest he has ever made, and among the finest ever made by anyone—is the story of one day in the life of a Brooklyn neighborhood that begins innocently enough and ends in a deadly race riot.

Technically, the movie is fiction.  In every other sense, it is as truthful as the word of God itself.

(Warning:  Spoilers ahead!)

The film’s emotional climax—following nearly two hours of buildup—is the murder of Radio Raheem, an endearing and sympathetic supporting character, by an NYPD officer, followed by the torching of the neighborhood pizza joint, Sal’s Famous—where all the trouble began—in the neighborhood-wide conflagration that ensues.

If the proximate cause of the unrest—that most hideously euphemistic of terms—is a garbage can hurled through the pizzeria’s front window by Mookie, the deliveryman played by Lee himself, that particular act can be seen, as Roger Ebert wrote in 2001, as being “propelled by misunderstandings, suspicions, insecurities, stereotyping and simple bad luck.”  “Racism,” Ebert continued, “is so deeply ingrained in our society that the disease itself creates mischief, while most blacks and whites alike are only onlookers.”

Which brings us, in a way, to the Memorial Day slaying of George Floyd by a Minneapolis police officer, and all that has transpired in its wake.

So far as I can tell, there is not a person in America who has publicly argued that Floyd’s death—caused by (former) officer Derek Chauvin kneeling on his neck for nearly nine minutes—was anything less than an appalling and indefensible act.  Nonetheless, the fact that it happened—in broad daylight, surrounded by eyewitnesses—was sufficient to inspire outrage from coast to coast in the form of large-scale—and overwhelmingly non-violent—demonstrations in the streets of virtually every big city in America (and a lot of small ones, too).

And of course—to Ebert’s point—the reaction to Floyd’s murder is not so much about Floyd, per se, as it is about every black person in America who has been killed by the police for no apparent reason—who posed no threat, put up no struggle and, in many cases, had committed no crime in the first place.  It’s about four centuries of racial oppression that led a white policeman to believe he could snuff out a harmless black person with impunity, as innumerable other law enforcement officials had done before him—and, in all likelihood, will continue to do in the future.

In making “Do the Right Thing,” Spike Lee anticipated the future with shocking clarity, even if all he was doing was reflecting the reality of the present.  The movie is old enough that Barack and Michelle Obama saw it on their first date, yet the escalating series of events it depicts—and all the grievances and bitterness bubbling just beneath the surface—are virtually indistinguishable from the countless racial powder kegs that have exploded in recent days, weeks, months and years.  Radio Raheem in 1989 is George Floyd in 2020, to say nothing of the countless victims in between, many of whose names we have been chanting over the last week.

The basic conclusion to draw from this is that, when it comes to racism in America—by police and non-police alike—not much has changed since 1989 and things are not getting any better.  That for all the goodwill expressed by a majority of our fellow countrymen who genuinely long for a society in which one’s fate is not determined by the color of one’s skin, it would appear the gravitational pull of racial bias in this country is so profound that even the best-laid plans of mice and men don’t stand a chance against its insidious and toxic allure among those with the power to decide who lives and who dies.

All of which raises the obvious question of how we might possibly break free from this cycle of violence, mistrust and general ill will.  How we might render the likes of “Do the Right Thing” as antiquity rather than prophecy.

In my capacity as a Privileged White Person whose view on this matter is necessarily limited and arguably totally superfluous, I will merely observe that while changing human nature is difficult, changing one’s leaders is comparatively simple.  Lame as it may sound, elected officials wield enormous influence over how policy is carried out—up to and including policy on which racial and ethnic groups count as fully human and which ones can expect a knee on their throats—and their power extends only as far as their constituents—you and me—allow.

To quote Mister Señor Love Daddy, the radio DJ played by Samuel L. Jackson, in the closing moments of “Do the Right Thing”:  “Register to vote.  The election is coming up.”

Not Deadly Enough

Here in Coronaland, we have been awash in so many depressing COVID-19 statistics that we have nearly become inured to their real-world meaning.  Nonetheless, like the virus itself, some numbers still have the power to take your breath away and stop you dead in your tracks.

That was certainly the case last week, when a study by Columbia University estimated that some 36,000 fewer people would’ve died had the United States—and New York in particular—enacted social distancing measures one week earlier than it did—and an additional 18,000 could’ve been saved had those lockdown procedures begun seven days sooner than that.

The implications of these staggering figures are clear enough:  First, in retrospect, the authorities were catastrophically slow in responding to the initial COVID outbreak.  And second, should the country re-open too fast and too sloppily—as it now threatens to do—there is every reason to assume the next wave of infections will be as bad as—or worse than—the first one.

On the first point, I would advise caution in judging our leaders for their slow-footedness more harshly than is strictly necessary, bearing in mind how little they (and we) knew at the time and how wholly unprecedented the notion of sheltering-in-place was once the trigger was finally pulled.

To be clear, I am not referring here to Donald Trump, whose willful, callous indifference to the entire problem—including the withholding of critical supplies to states that urgently needed them—has been a singular failure of leadership in every imaginable context.

However, when it comes to the state and local leaders making the real on-the-ground, hour-by-hour assessments—particularly New York Governor Andrew Cuomo and New York City Mayor Bill de Blasio—it is worth reminding ourselves that on March 8—one week before its lockdown began—the city of New York had a total of 142 known infections of COVID-19 and zero known deaths.

Ask yourself:  Without the benefit of hindsight and with human nature being what it is, would it really have been feasible for either Cuomo or de Blasio to have stepped in front of a microphone on March 8—or any date prior—and ordered the residents of the nation’s largest metropolitan area to lock themselves inside their homes, suspending all but their most essential life activities, in order to prevent the spread of a virus that, at that moment in time, had not killed a single person within the five boroughs and showed no obvious signs of becoming a once-in-a-century epidemic?

Yes, even at that relatively early date, infectious disease experts had warned of COVID’s high level of contagion—as had been seen in places like China, Italy, Iran and elsewhere.  Nonetheless, for an American political leader to unilaterally shut down his own state or city—immediately and profoundly upending the life of every man, woman and child living there—on the mere presumption that things could get real bad, real quick, would have been an enormous pill for any sizeable population center to swallow.  Frankly, there just wasn’t enough carnage to convince us it would’ve been worth it.

Thus was the Catch-22 by which many public officials were constrained:  The only way to avoid extreme casualties from the virus was to take extraordinary measures, yet the only politically palatable means of enacting those measures in the first place was to passively allow some of those casualties to occur, thereby proving how dire the situation actually was.  While obviously not the official plan, that was effectively how the tragedy unfolded.

And now—100,000 U.S. deaths later—we are seeing this very same dynamic playing out in the minds and Twitter feeds of millions of Americans who are fed up with being confined mostly to their apartments with nothing to do, itching to resume life as it used to be.

The argument today—if only implicitly—is whether the nationwide economic disruption of the past two-plus months was, in the final analysis, a good idea.  Whether putting the country in a state of suspended animation was an overreaction and a folly, rather than smart public health policy that saved countless lives.  Whether (to put it bluntly) the loss of 100,000 of our fellow citizens to an insidious virus was essentially unavoidable and thus not worth the trouble of kneecapping our GDP and driving unemployment rates through the roof.

As with the initial lockdown advisories, the debate invites a vicious paradox:  A six-figure death toll might lead the lay person to believe—falsely—that the mitigation efforts were futile or counterproductive, rather than an indication that the mitigation efforts worked.

As horrific as the COVID fatalities have been with social distancing practices in place, the fairly obvious truth is that a less draconian version of them—let alone none at all—would almost certainly have produced an exponentially higher death toll—possibly above 2 million souls, according to an early projection by Imperial College London—and, conversely, that better overall adherence to such practices would have yielded marginally more tolerable results.

In short, as with so many things, we cannot assess the effectiveness of a given action without considering the alternative—the proverbial road not taken—which in this case would’ve been for all of us to carry on our lives semi-normally, allowing the virus to “wash over the country” (in the president’s words) and hope all the scientific models were wrong.

That, in effect, is the decision many of us have collectively made by opting to resume certain social activities—and the industries that provide them—for the sake of enjoying the summer warmth that is just beginning to settle in.  Despite all we have learned over the last several months—how COVID spreads, who is most vulnerable and what it does to the human respiratory system—we are betting that, with enough social distancing and mask-wearing (or not), we can simply ride out whatever’s coming next and hope the consequences aren’t as dire as they were (and still are) the first time around.

It’s a hell of a gamble for a first-world country to take, and we shouldn’t expect it to end well.  To paraphrase Boss Jim Gettys in “Citizen Kane”:  We’re going to need more than one lesson.  And we’re going to get more than one lesson.

American Idols

“Where have you gone, Joe DiMaggio? A nation turns its lonely eyes to you. Woo woo woo.”

In a strong field, that may well be the finest lyric Paul Simon has ever written—and for reasons that have nothing at all to do with the late former Mr. Marilyn Monroe.

Americans need their heroes—be they in sports, entertainment or maybe even politics—and they feel acutely vulnerable and adrift when those idols seem to vanish from the scene. This is particularly true in times of extraordinary distress and upheaval, such as (to pick a random example) a global public health emergency, when inspiring moral leadership is so urgently required.

For liberals who’ve been trapped in an existential funk since November 2016, one such hero is of course Barack Obama, the last U.S. president to exhibit any sort of compassion for his fellow human beings, who, unlike his wife, has made himself relatively scarce since exiting the White House more than three years ago.

That was until last weekend, when Obama made highly anticipated dual virtual appearances before college and high school graduating classes of 2020—the latter televised in prime time—during which he intoned, “More than anything, this pandemic has fully, finally torn back the curtain on the idea that so many of the folks in charge know what they’re doing. A lot of them aren’t even pretending to be in charge.” The speeches did not include the word “Trump,” but we’re not stupid.

Whether by accident or design, these commencement addresses came on the heels of “leaked” remarks by the former president in a “private” conference call that saw him loudly and explicitly castigating the current administration both for its abysmal response to the coronavirus outbreak and its corrupt handling of the Michael Flynn case—words so forceful that Mitch McConnell, the Senate majority leader, responded, “I think President Obama should have kept his mouth shut.”

As a matter of political timing, Obama’s sort-of reentry into the cultural bloodstream is quite obviously related to the sort-of beginning of the 2020 presidential campaign, and the presumed crowning of Obama’s former wingman, Joe Biden, as the Democratic Party nominee. And certainly the party’s de facto standard-bearer has every right to publicly advocate for his hoped-for inheritor and the values he represents.

Beyond that, however, we, the people, have every reason to question whether McConnell had a point. That is, whether Obama’s broader commentary on the Trump administration is either wise or becoming of a member of the nation’s most exclusive club—namely, those who once had access to the nuclear codes and enjoy Secret Service protection to this day.

Indeed, the question of how ex-presidents should behave in retirement has been a matter of debate since March 1801, when John Adams opted to slip out of Washington, D.C., by public stage in the pre-dawn darkness rather than attend the inauguration of Thomas Jefferson later that morning. In our own time—as with virtually everything else—the issue has broken along partisan lines, with Democrats like Jimmy Carter and Bill Clinton maintaining high profiles and busy schedules deep into their post-presidential years while Republicans like the Georges Bush have made a point of receding serenely into the background, content to have their records speak for themselves and their successors left to run the country in peace.

Old fogey-at-heart that I am, I’ve long had a soft spot for the latter approach to elder statesmanship, admiring the discipline it must take not to gloat at everything the new guy is doing wrong.

In fact, Obama himself vowed to mostly adhere to the hands-off approach to ex-presidenting, telling reporters in January 2017 that, once Trump took office, he would refrain from open criticism except for “certain moments where I think our core values may be at stake.” In retrospect, considering the object of his prospective ire, perhaps that was Obama’s dry way of saying he had no intention of keeping his mouth shut and should not be expected to do so.

The real problem, in any case, is that Donald Trump is such a singularly appalling individual that remaining silent on his odious reign could reasonably be seen as a dereliction of duty for any self-respecting public figure—particularly one so devoted to appealing to the so-called “better angels of our nature.” In other words, the sheer awfulness of Trumpism—even compared to that of, say, George W. Bush—is sufficient to override the usual protocols of discretion among past presidents. These are not ordinary times, and it would be disingenuous to pretend otherwise.

But here’s the thing: Part of the job of statesmanship is to be disingenuous every now and again for the sake of preserving the national fabric. Whatever one might think about Donald Trump, he is the duly elected leader of our country for at least another eight months and maintains unshakable popularity among a not-insignificant chunk of our fellow citizens. As a head of state, he is entitled to a baseline deference that reflects the majesty of the office he holds, which transcends the character of whoever happens to hold it at a given moment in time.

When a retiring president passes the baton to his immediate successor, he is conferring legitimacy upon the most important public job in the United States—a hand-off in a constitutional relay race that has continued uninterrupted since George Washington peacefully ceded power to John Adams on March 4, 1797.

By then turning around and glibly musing to the nation’s schoolchildren that the sitting commander-in-chief has no earthly idea what he’s doing, he risks ever-so-slightly chipping away at that legitimacy, rhetorically lowering the presidency to just one more partisan player in a vulgar federal political food fight, rather than the figurehead of the greatest republic the world has ever seen.

I say this in the full knowledge that Obama’s characterization of the Trump White House as a raging dumpster fire of incompetence is objectively, obviously correct. Nor am I under any illusion that the courtesy I am asking of Obama for Trump was ever extended to Obama himself at any point during his eight-year stint in the Oval Office. In effect, I am demanding a double standard whereby when the Republicans go low, the Democrats go high—a strategy that never seems to bear much fruit in the long run, however noble it may sound.

The plain truth is that there will be no good answer to this question until we have a new commander-in-chief. That the catchphrase of erstwhile conservative Rick Wilson, “Everything Trump touches dies,” extends to the presidency itself. That Trump is the exception to every rule, but once he’s gone, maybe we can return to life as it used to be, almost as if he never existed in the first place. Maybe.

In the meantime, with a pandemic raging and an economy cratering, the nation must turn its lonely eyes to someone, and while Joe DiMaggio is no longer available, I can think of at least one other Joe who is.

Checkpoint Charlie

Charlie Baker, the governor of Massachusetts, has consistently ranked among the two or three most admired statewide leaders since he was first elected in 2014, with job approval ratings in the high-60s to low-70s. Not bad for a Republican in an extremely Democratic state. (Indeed, he has historically polled higher among Democrats than Republicans. But that’s another story.)

Since the novel coronavirus upended life as we thought we knew it, forcing all 50 states to place their economies in a state of suspended animation, Baker’s popularity has only grown. According to a Suffolk University poll released last week, some 84 percent of Massachusetts residents approve of Baker’s stewardship of the COVID-19 plague—a stratospheric figure even in the context of a national emergency that has seen virtually all governors’ popularities spike. (Overall, 71 percent of Americans approve of their own governor’s handling of the pandemic.)

While there are many possible explanations for the extraordinary goodwill toward Baker by his constituents, I’d offer two as the most self-evident: He is smart, and he is boring.

By smart, I don’t just mean that he has a bunch of fancy degrees from a bunch of swanky universities. Rather, I mean that when he is presented with a problem—be it a faulty public transit system or a contagious, deadly virus—he takes it upon himself to base all major decisions on data, experts and the proverbial facts on the ground. As a former healthcare CEO and state budget chief, he knows his way around a spreadsheet as well as anybody and will happily rattle off statistics until your eyes roll all the way into the back of your head.

In his daily COVID press conferences, Baker has only ever measured the state’s success in beating back the virus—and in planning for the future—in terms of raw numbers: tests, cases, hospitalizations, deaths. In the face of recent criticism that the state is moving too slowly in announcing which industries will be allowed to re-open—and how and when—Baker merely reiterates his longstanding view that the mechanics of returning to normal will be determined by the fickle course of the pandemic itself, and thus cannot be gamed out too far in advance. As he has put it on multiple occasions, “We have to respect the virus.”

So far as I can tell, Baker has not wavered from this basic operational and philosophical framework since this nightmare began in mid-March, which is perhaps why his televised daily updates have tended to blend into each other, consisting largely of Baker repeating his previous advisories concerning mask-wearing, social distancing and other best practices for the general public. While he will occasionally introduce critical new information into the mix—such as when he delayed the state’s tentative “re-opening” date from May 4 to May 18, or when he first enumerated the state’s plan for “contact tracing”—he otherwise seems perfectly content to produce as little drama as possible, almost as if he’s allergic to being the center of attention and making more news than is strictly necessary.

That brings us to his other main virtue: boringness. It has been theorized for years—specifically, since November 2016—that Massachusetts voters’ appreciation for Baker—a moderate, mild-mannered technocrat—is primarily a function of their smoldering antipathy toward Donald Trump, and their relief that not all Republicans are as craven, corrupt and creepy as the current commander-in-chief. That Baker’s evident lack of interest in toeing the national party line—not to mention his stated personal disdain for Trump himself—is reason enough to have him in the corner office on Beacon Hill.

To a large extent, this theory is correct. However maddening Baker’s incrementalism and deliberativeness might be—particularly with a fast-moving pathogen that is killing our friends and neighbors by the thousands—Bay State residents can at least rest assured he will never be drawn into a Twitter battle with a fellow governor, say, or shape policy based on what he saw last night on Fox News. Or pick fights with journalists he believes are treating him unfairly. Or take all the credit when things go well and none of the blame when things go haywire.

In short, unlike other politicians we could mention, Charlie Baker has never caused the average citizen to wake up in a cold sweat asking themselves, “What in God’s name is he going to do today?”

His is a steady hand in a shaky world, blessedly bereft of the deadly ineptitude of Donald Trump, the self-regarding bluster of Andrew Cuomo, the heavy-handedness of Gretchen Whitmer, or the suicidal recklessness of Ron DeSantis or Brian Kemp.

By no means does this make him perfect—or even the right man for this moment. If nothing else, the COVID-19 crisis has shown Charlie Baker to be exactly who we thought he was all along: An uber-rational, cool-headed nerd more concerned with the well-being of his constituents than his own prized place in the history books, knowing, as he must, that one naturally leads to the other—that prioritizing public health over short-term economic growth is both a noble and savvy means of teeing up a run for an unprecedented third consecutive term in office, which he has not yet ruled out.

For now, he is a dependable voice of sanity and reassurance in a society in dangerously short supply of both, and that’s good enough for me.

This Thing On My Face

I don’t generally quote from my own Twitter feed—I keep my account private for a reason—but I can’t help digging up a gem from March 3 of this year, when I asserted, “Given the choice, I’d much prefer having coronavirus for a month than wearing a face mask for a year.”

While the exact context of that tweet is lost to history, I was obviously reacting to the growing epidemiological menace of COVID-19, which—as the date of the tweet indicates—was roughly one week away from effectively shutting down the United States until further notice.

Now that we are some six-to-eight weeks into this national self-quarantine (depending how you count) and can take a somewhat panoramic view of the early trajectory of this extraordinary societal experiment, it is worth pausing to notice how fast things have changed, and—more interestingly—how fast we, the people, have changed with them.

Specifically, let’s talk about masks.

While my aforementioned tweet-tantrum about preferring the virus itself to strapping a prophylactic around my face for an extended period can now be dismissed—with some justification—as the whiny, simplistic rantings of a selfish, short-sighted nincompoop, I fully stand by the sentiment as an accurate and rational reflection of my mindset—and the mindset of nearly all of my countrymen—at that particular moment in time.

As it happens, it was on the very morning of March 3 that I casually sauntered into the public library downtown—which was open and fully functioning—to cast my vote in the Massachusetts presidential primary. It was that evening—“Super Tuesday,” as we called it—that Joe Biden took the stage in a very densely packed auditorium in California to declare victory—a speech briefly interrupted by a small gang of protesters whom Symone Sanders, a senior Biden aide, charged at like a heat-seeking missile and yanked forcibly offstage.

That was the universe in which we all operated in the first week of March: One with lots and lots of people freely moving about to their heart’s desire with nary a care in the world for their health or personal space. While we were all quite aware of the deadly pathogen that had ravaged the likes of China and Italy and had officially migrated into the United States, on the morning of Super Tuesday there was a grand total of 63 confirmed cases in a nation of 328 million, and terms like “social distancing” and “flattening the curve” had not remotely entered the national lexicon.

As such, the notion of large numbers of seemingly healthy Americans walking around in public with face coverings—voluntarily or by government decree—struck most of us as just a hair short of crazy for a good long while—a feeling aided, in no small part, by our country’s own leading health experts, who advised that such accoutrements were unnecessary and possibly counterproductive.  Lest we forget the now-infamous February 29 tweet by Surgeon General Jerome Adams, which began, “Seriously people- STOP BUYING MASKS! They are NOT effective in preventing general public from catching #Coronavirus.”  (As I said, best to keep your Twitter feed to yourself.)

Smash cut to tomorrow, May 6, when in my home state of Massachusetts—per an order by Governor Charlie Baker—all residents will be required to wear some kind of face covering whenever they are in a public place and unable to keep a safe distance from others. With temperatures in New England already inching into the 70s, that’ll be just about everywhere soon enough.

Life comes at you fast, doesn’t it? What was unthinkable yesterday may well become inevitable tomorrow, and it turns out that near-universal use of makeshift face masks is a signal example of this reality here in Coronaland in May 2020.

I don’t know about you, but I’ve been measuring the nature of this pandemic largely from my visits to the supermarket—i.e., the only commercial establishment I’ve patronized regularly in the last month-and-a-half—and I still can’t shake the fact that during a Stop & Shop run in early April, I observed maybe 20 percent of my fellow customers masked up as they carted from one aisle to the next, while on a subsequent trip less than a week later, the figure was probably closer to 80 percent. In a mere matter of days, the act of wearing a mask in shared spaces had swung from being an odd, conspicuous affectation to simple common sense and a public health necessity.

One day, the weirdos were the ones who concealed the lower half of their faces. The next day, the weirdos were the ones who didn’t.

Before you ask: No, I myself did not strap on a mask on that earlier jaunt through Stop & Shop—even though I had a perfectly good one in my pocket, ready for action—and yes, by the latter trip, I changed my tune entirely and all-too-willingly complied.

And why was that? Easy: Because, in both cases, I didn’t want to be the weirdo. Because I didn’t want to be judged and glared at by my neighbors for diverging from the social mores of the moment. Because I just wanted to get through the checkout line and back to my car without causing some kind of confrontation. Because for all my so-called independence and First Amendment absolutism, the truth is that my only real ambition in life is to not get into an argument with a stranger more than one or two times per decade.

It was in that spirit that I decided this past Sunday—the most deliciously summerlike day of the year so far—to pre-empt Governor Baker’s order by a few days and put on my mask every time I go for a bike ride. While I live in a suburban area where keeping a six-foot distance from anything is relatively easy to do, I realized there is no particular downside to modeling responsible behavior for others, and it turns out you get a lot more smiles from pedestrians and fellow bikers with a piece of fabric on your face, precisely because of the message it sends.

That message, roughly speaking, can be boiled down to, “Your life is more valuable than my comfort, and it’s worth the occasional itchiness to ensure I don’t accidentally murder my fellow human beings with an invisible bug that might spew forth from my big mouth.”

Even in a country as thoughtless and selfish as ours, that seems like a solid credo with which to ride out this wave of disruption and uncertainty until we arrive wherever it is that we’re headed.

Sloppy Joe

If a sexual assault allegation against Joe Biden falls in the New York Times and no one reads it, will it stop Biden from being elected president in November?

On April 12, while America was understandably preoccupied with other matters, the Times printed the account of a woman named Tara Reade, who claims that in 1993, while working as a staffer in Biden’s Senate office, the future vice president—and now-presumptive Democratic presidential nominee—“pinned [Reade] to a wall in a Senate building, reached under her clothing and penetrated her with his fingers.”

Reade first publicly accused Biden of untoward behavior last year, in anticipation of his entry into the 2020 Democratic primary, when more than a half-dozen women recounted a panoply of inappropriate touching, hugging and kissing Biden had engaged in over the course of his career—some of it right out in the open. Reade’s own accusation at the time entailed unwelcome physical contact such as neck-stroking and hair-grabbing, but not sexual assault. When asked why she waited until now to lay her most serious charge, Reade said she had held back out of fear, after “a wave of criticism and death threats” in response to her initial disclosures.

The Times reporting found that Reade mentioned the alleged assault to several people shortly after it occurred, but also that neither the Senate nor Biden’s office has any record of a formal complaint Reade claims to have filed at the time. Biden himself, through a spokesperson, has denied the incident ever took place.

And so here we are, forced to regard Joe Biden as we have previously regarded the likes of Brett Kavanaugh, Woody Allen, Donald Trump and every other public man whose alleged past sins (i.e., crimes) have been brought to light at a moment when the truth about what happened in the past has a singular power over what happens in the future.

As with Christine Blasey Ford during the Kavanaugh hearings, one of three things must be true. One, Reade is a liar. Two, she has a severely distorted memory. Or three, Joe Biden is a sex offender. And as with so many other chapters of the #MeToo story, with no definitive proof on either side, it’s up to each of us individually to decide which party to believe—him or her—and to act accordingly.

What makes the Biden case different—and arguably the most high-stakes iteration of the #MeToo era to date—is that how Americans judge Reade’s claim may well determine the outcome of the 2020 election—and, by extension, every action by the federal government through at least January 20, 2025. At this point, it would be political malpractice for the Democratic Party to blithely assume otherwise.

The potential trajectory of this electoral powder keg is not difficult to game out: Reade sticks to her story. Trump and/or his backers believe her loudly and unconditionally, seizing on the allegation as a 10,000-ton albatross to sling around Biden’s neck 24 hours a day. A not-insignificant number of left-leaning independents—and maybe even a few Democrats—decide they cannot in good conscience vote for someone credibly accused of sexual assault, and ultimately leave their ballots blank, bequeathing a second term to one Donald J. Trump.

Don’t tell me this can’t happen. Don’t tell me a presidential election cannot be swung by the 27-year-old recollections of a heretofore anonymous former Senate aide. Don’t tell me there isn’t a sizeable chunk of the electorate who might otherwise vote for Biden—despite his known flaws—but will think twice when presented with as explosive an accusation as Reade has now presented. Don’t tell me that, when faced with the ultimate hypothetical—If you knew, for a fact, that Biden had once committed sexual assault, would you vote for him anyway?—even the most loyal Democrats would not give themselves at least a moment or two of pause.

And whatever you do, don’t tell me that because Donald Trump has been accused—indeed, has admitted to—behavior that is demonstrably worse than anything ever said about Biden, there is no moral compromise to be made in choosing the latter over the former.

Sorry, folks. It turns out that, in 2020, life is not going to be that simple.

Barring a sudden confession from Reade that she made the whole thing up, every Biden supporter in America—most of whom, one presumes, have been cheering on the #MeToo movement for the last two-and-a-half years—will be forced to reckon with the fact that on November 3, they will be voting for a man who has been credibly accused of sexual assault, and that the only true rationalization for this decision—the alternative would be worse—is a rationalization all the same.

Liberals have spent the past four years excoriating conservatives for supporting a president whose very existence is an affront to nearly all of their so-called principles—honor, dignity, family values—but whose promises of tax cuts and a right-wing judiciary made the tradeoff both justified and unavoidable in their own minds.

Is that not the moral bargain that today’s liberals will now need to make about Joe Biden? Will the never-Trump crowd not be spending the next six months talking themselves into the idea that one sexual assault is a fair price to pay for universal healthcare and debt-free college education? And given the essentially binary nature of U.S. presidential elections, will they not, in some horrid sense, be correct?

St. Mark asked, “For what shall it profit a man, if he shall gain the whole world, and lose his own soul?”  I guess we’re about to find out.