This Thing On My Face

I don’t generally quote from my own Twitter feed—I keep my account private for a reason—but I can’t help digging up a gem from March 3 of this year, when I asserted, “Given the choice, I’d much prefer having coronavirus for a month than wearing a face mask for a year.”

While the exact context of that tweet is lost to history, I was obviously reacting to the growing epidemiological menace of COVID-19, which—as the date of the tweet indicates—was roughly one week away from effectively shutting down the United States until further notice.

Now that we are some six to eight weeks into this national self-quarantine (depending on how you count) and can take a somewhat panoramic view of the early trajectory of this extraordinary societal experiment, it is worth pausing to notice how fast things have changed, and—more interestingly—how fast we, the people, have changed with them.

Specifically, let’s talk about masks.

While my aforementioned tweet-tantrum about preferring the virus itself to strapping a prophylactic around my face for an extended period can now be dismissed—with some justification—as the whiny, simplistic rantings of a selfish, short-sighted nincompoop, I fully stand by the sentiment as an accurate and rational reflection of my mindset—and the mindset of nearly all of my countrymen—at that particular moment in time.

As it happens, it was on the very morning of March 3 that I casually sauntered into the public library downtown—which was open and fully functioning—to cast my vote in the Massachusetts presidential primary. It was that evening—“Super Tuesday,” as we called it—that Joe Biden took the stage in a very densely packed auditorium in California to declare victory—a speech briefly interrupted by a small gang of protesters whom Symone Sanders, a senior Biden aide, charged at like a heat-seeking missile and yanked forcibly offstage.

That was the universe in which we all operated in the first week of March: One with lots and lots of people freely moving about to their heart’s desire with nary a care in the world for their health or personal space. While we were all quite aware of the deadly pathogen that had ravaged the likes of China and Italy and had officially migrated into the United States, on the morning of Super Tuesday there was a grand total of 63 confirmed cases in a nation of 328 million, and terms like “social distancing” and “flattening the curve” had not remotely entered the national lexicon.

As such, the notion of large numbers of seemingly healthy Americans walking around in public with face coverings—voluntarily or by government decree—struck most of us as just a hair short of crazy for a good long while—a feeling aided, in no small part, by our country’s own leading health experts, who advised that such accoutrements were unnecessary and possibly counterproductive. Lest we forget the now-infamous February 29 tweet by Surgeon General Jerome Adams, which began, “Seriously people- STOP BUYING MASKS! They are NOT effective in preventing general public from catching #Coronavirus.” (As I said, best to keep your Twitter feed to yourself.)

Smash cut to tomorrow, May 6, when in my home state of Massachusetts—per an order by Governor Charlie Baker—all residents will be required to wear some kind of face covering whenever they are in a public place and unable to keep a safe distance from others. With temperatures in New England already inching into the 70s, that’ll be just about everywhere soon enough.

Life comes at you fast, doesn’t it? What was unthinkable yesterday may well become inevitable tomorrow, and it turns out that near-universal use of makeshift face masks is a signal example of this reality here in Coronaland in May 2020.

I don’t know about you, but I’ve been measuring the nature of this pandemic largely by my visits to the supermarket—i.e., the only commercial establishment I’ve patronized regularly in the last month and a half—and I still can’t get over the fact that during a Stop & Shop run in early April, I observed maybe 20 percent of my fellow customers masked up as they carted from one aisle to the next, while on a subsequent trip less than a week later, the figure was probably closer to 80 percent. In a mere matter of days, the act of wearing a mask in shared spaces had swung from being an odd, conspicuous affectation to simple common sense and a public health necessity.

One day, the weirdos were the ones who concealed the lower half of their faces. The next day, the weirdos were the ones who didn’t.

Before you ask: No, I myself did not strap on a mask on that earlier jaunt through Stop & Shop—even though I had a perfectly good one in my pocket, ready for action—and yes, by the latter trip, I changed my tune entirely and all-too-willingly complied.

And why was that? Easy: Because, in both cases, I didn’t want to be the weirdo. Because I didn’t want to be judged and glared at by my neighbors for diverging from the social mores of the moment. Because I just wanted to get through the checkout line and back to my car without causing some kind of confrontation. Because for all my so-called independence and First Amendment absolutism, the truth is that my only real ambition in life is to not get into an argument with a stranger more than one or two times per decade.

It was in that spirit that I decided this past Sunday—the most deliciously summerlike day of the year so far—to pre-empt Governor Baker’s order by a few days and put on my mask every time I go for a bike ride. While I live in a suburban area where keeping a six-foot distance from anything is relatively easy to do, I realized there is no particular downside to modeling responsible behavior for others, and it turns out you get a lot more smiles from pedestrians and fellow bikers with a piece of fabric on your face, precisely because of the message it sends.

That message, roughly speaking, can be boiled down to, “Your life is more valuable than my comfort, and it’s worth the occasional itchiness to ensure I don’t accidentally murder my fellow human beings with an invisible bug that might spew forth from my big mouth.”

Even in a country as thoughtless and selfish as ours, that seems like a solid credo with which to ride out this wave of disruption and uncertainty until we arrive wherever it is that we’re headed.

The Last Laugh

The trouble with being a free speech absolutist (as I am) is that you often find yourself having to defend awful people who say awful things. Even if you truly believe (as I do) that the First Amendment’s guarantee of free expression applies to all manner thereof—not just the pleasant, uncontroversial sort—there will inevitably be moments that test the limits of our most elemental constitutional right, leading reasonable people to wonder if some restrictions on public speech are in order.

This is not one of those moments.

In the first week of January—as you might recall—the United States very nearly started a war with Iran when President Trump ordered the killing of noted general/terrorist Qasem Soleimani. This in turn led the Islamic Republic to bomb U.S. bases in Iraq, at which point Trump threatened, via Twitter, to strike various locations inside Iran, potentially including ones “important to Iran [and] the Iranian culture.”

Ultimately, the administration ruled out the possibility of targeting culturally significant sites—presumably after being informed that such a move would constitute a war crime—but not before a gentleman named Asheen Phansey, the director of sustainability at Babson College in Wellesley, Massachusetts, mischievously posted on Facebook:

“In retaliation, Ayatollah Khomenei [sic] should tweet a list of 52 sites of beloved American cultural heritage that he would bomb. Um…Mall of America? Kardashian residence?”

It was a cheap, naughty little quip—something many of us undoubtedly thought and/or said ourselves—but it evidently rubbed someone at Babson (Phansey’s employer) the wrong way, because four days later, Phansey was fired.

In a statement, the school wrote, “Babson College condemns any type of threatening words and/or actions condoning violence and/or hate. This particular post from a staff member on his personal Facebook page clearly does not represent the values and culture of Babson College,” adding, “[W]e are cooperating with local, state and federal authorities.”

In the weeks since Phansey’s sacking, there has been considerable pushback against Babson by the likes of Salman Rushdie, Joyce Carol Oates and PEN America. Thus far, however, Phansey has not been rehired, nor has the college shown any interest in doing so.

Speaking as someone who lives nearby and has attended on-campus events every now and again, I would advise Babson to offer Phansey his old job back, and Phansey to reject it out of hand. An institution so idiotic as to fire an employee for making a joke is unworthy of Phansey’s talents, whatever they might be.

I wrote at the outset that, as a First Amendment issue, this one is not a close call. Viewed in its full context—or, I would argue, in any context whatsoever—Phansey’s Facebook post was very obviously written in jest—an ironic commentary about a serious, if absurd, world event. In categorizing a knock on the Kardashians and the Mall of America as “threatening words and/or actions,” Babson seems to imply that it can’t distinguish humor from sincerity, raising the question of how it ever achieved accreditation in the first place.

More likely, of course, is that Babson was simply intimidated by the bombardment of complaints it apparently received following Phansey’s original post (which he swiftly deleted) and decided it would be easier and more prudent to cave in to the demands of a mob and fire Phansey on the spot, rather than defend Phansey’s right—and, by extension, the right of any faculty member—to comment on news stories in his spare time.

It was a terrible, stupid decision for Babson to make, and its silence in the intervening days has only brought further dishonor upon an otherwise sterling institution of higher learning. While it is undeniably true that a private employer has the discretion to dismiss employees who have violated official policy—the First Amendment’s provisions are explicitly limited to the public sector—the notion that making a mildly off-color remark on one’s own Facebook page constitutes a fireable offense is a horrifying precedent for a college to set, and is among the most egregious examples yet of the general squeamishness on so many American campuses toward the very concept of free expression.

As cultural flashpoints go, I am reminded of the (much larger) brouhaha surrounding Kathy Griffin in 2017, when an Instagram photo of the comedienne hoisting what appeared to be the severed head of President Trump led to Griffin being treated as a national security threat by the Justice Department and effectively banished from society for the better part of a year.

As with Phansey, no honest person could look at Griffin’s gag and say it was anything other than gallows humor—albeit an exceptionally tasteless manifestation thereof. We might agree, on reflection, that Griffin should’ve thought twice before tweeting an image of a bloodied Trump mask out into the world—to paraphrase Rabbi Mendel, not everything that is thought should be instagrammed—but there is a giant chasm between being a low-rent comedian and being a threat to public safety, and I am very, very worried that our politically correct, nuance-free culture is increasingly unable and/or unwilling to separate one from the other.

In short, we are slowly but surely devolving into a society without a sense of humor, a tendency that—if I may allow myself a moment of hysterical overstatement—is a gateway drug to totalitarianism. A nation without wit is a nation without a soul, and a culture that doesn’t allow its citizens to make jokes without fear of losing their livelihoods is one that has no claim to moral superiority and no right to call itself a democracy.  What a shame that our universities, of all places, haven’t quite figured this out yet.

Twitter and Cheese

Perhaps you’ve heard the joke. A married construction worker on break opens his lunch box and says, “I swear to God, if it’s ham and cheese one more time, I’m gonna jump off this building.” Sure enough, the next day it’s ham and cheese yet again and, true to his word, the man leaps to his death. At the funeral, his wife remarks, “I don’t understand. He packed his own lunch.”

I recount this silly, if morbid, little yarn in light of Americans’ gradually escalating freak-out about our various modes of social media—in particular, the blue menace that is Twitter, whose darker, louder, more hateful corners seem to be sending all of us to the proverbial ledge.

Over the weekend, the New York Times published its latest exhaustive (read: exhausting) deep dive into some aspect of the Trump administration—in this case, the president’s use of Twitter as a blunt instrument against his political enemies, real and imagined, both within and without the executive branch.

While I would love to engage with the piece’s most compelling and insightful conclusions, I must confess that I haven’t actually read the damn thing and have no immediate plans to do so. Though I have no doubt the Times investigative team has produced valuable and enlightening analysis of how the 45th president’s Twitter habits have shaped the course of recent history—which they most assuredly have—the truth is I just can’t bring myself to care about what Donald Trump types into his phone while he’s sitting on the can.

Frankly, I don’t care what most people tweet most of the time, and I dare say the feeling is mutual. At present, I “follow” a grand total of 22 individuals and organizations on the platform, which is to say I don’t follow roughly 330 million others, including Donald Trump, yet—oddly enough—I have never once felt I’m missing out on anything of any consequence.

In the decade since I first joined, I have only ever used Twitter as a means of keeping up with the nooks and crannies of American culture that truly interest me, and blissfully ignoring everything else. What’s more, my account is private, meaning no one can read my own 280-character brain droppings without a formal request. My current group of loyal followers could fit comfortably inside a telephone booth, and that suits me fine. So far as I’m concerned, I’m the consumer and Twitter is the product—not the other way around.

Which returns us to the construction worker and the ham sandwich. When it comes to Twitter and its many analogues across the googlesphere, we all pack our own lunch. Online, our “friends” and “followers” are entirely of our own choosing and can be done away with through a single tap of the finger. If we don’t want ham and cheese appearing in our newsfeed 24 hours a day, we have the power to pick a different sandwich, or none at all. Apart from the advertising that pays our meal ticket, the menu is entirely within our power to curate.

It raises the question: Why, exactly, are so many of us threatening to jump off the ledge? Why is our entirely voluntary participation in this virtual town square causing us so much unnecessary agita? If each of us has near-total control over which social media personalities to invite onto our pages—and which ones to block—why all the bellyaching about how these platforms have become toxic dumpster fires of intemperate partisan hysteria?

Are we simply a nation of masochists? Do we secretly enjoy rolling around in the rhetorical muck, self-righteously claiming we don’t? Are we so addicted to the dopamine hit that outrageous online behavior provides that we are destined to be sucked into the vortex of hate no matter how hard we resist? Are we a species that just enjoys complaining any chance we get?

Here’s a thought: Stop complaining and start acting.  If the ugliness of social media genuinely repulses you, try removing the sources of that ugliness from your field of vision and see if conditions don’t improve.  Don’t blame others for that which you yourself brought forth.  Take responsibility for your own life.

And don’t forget to vote on November 3, 2020.

To Love a Country

In a 2007 Republican presidential primary debate, Mitt Romney was asked, “What do you dislike most about America?”

To the shock of nobody, Romney dodged the question completely, responding, “Gosh, I love America,” adding, “What makes America the greatest nation in the world is the heart of the American people—hard-working, innovative, risk-taking, God-loving, family-oriented American people.”

It was a lovely thought, perfectly in keeping with the public persona of the ex-governor, now-senator we have come to know and, um, not completely hate.

Really, with a dozen years of hindsight, the most remarkable thing about that moment was that the question was even asked—that someone angling to be America’s commander-in-chief was challenged in a public forum to critique the very country he hoped to lead.

Indeed, ahead of his second whack at the presidency in 2012, Romney released a memoir of sorts, No Apology, whose title more or less summed up the attitude of his campaign.  As far as he was concerned, America is an idyllic land of milk and honey that has only ever been a force for good in the world, for which it should feel nothing but unadulterated, chest-thumping pride.

As you’ll recall, President Obama’s greatest sin in office, according to Romney and others, was to have had the temerity to apologize for America’s various historical blunders—particularly on matters of race and foreign policy—thereby implying the nation is somehow less than perfect.  The nerve!

While Romney himself has since slunk off into complete obscurity—i.e., the Senate—his view of the United States as a moral dynamo on the world stage whose superiority must never be questioned has only hardened as Republican Party orthodoxy in the years since.

Or so we were informed last week by the current president, Donald Trump, who in a Twitter broadside against four congresswomen that managed to blend howling racism with wholesale incoherence, argued that anyone who is skeptical about how the United States is run—including those who have been elected to run it—has no business residing within the country’s borders and ought to “go back” to the far-flung lands “from which they came.”

“IF YOU ARE NOT HAPPY HERE,” the president tweeted, “YOU CAN LEAVE!”

Beyond the irony that three-fourths of the congresswomen in question were, in fact, born in the United States, it has been duly noted that few people in public life have been more openly scornful of U.S. foreign and domestic policy over the years than Trump himself.  Indeed, for all the money and privilege—untaxed and unchecked, respectively—that has spilled into his lap practically since birth, the president never seems to run out of grievances about the place that has handed him everything on a silver platter, up to and including its highest public office.

And yet.

Setting aside the singular, noxious bigotry that informs much of our Dear Leader’s enmity toward a republic founded on the principles of liberty, pluralism and equal justice under the law, Trump is absolutely entitled to express his misgivings about his homeland without fear of persecution or prejudice.  He is right to assert—as he so memorably did in a 2017 interview on Fox News—that America is not “so innocent” in its behavior toward its geopolitical adversaries and, by implication, shouldn’t be held up as the moral paragon that the Mitt Romneys of the world would have you believe it is.

In other words, if you want an ironclad rebuke to the tweets of Donald Trump, look no further than the actions of Donald Trump.

That said, the president’s personal hypocrisy on this matter needn’t obscure the deeper truth, which is that the greatness of America resides precisely in the right of every one of its citizens to criticize it, because criticism, in the right hands, is among the sincerest expressions of patriotism and love.

Surely, Frederick Douglass had a few choice words for his mother country throughout his life—words that, we can safely say, have redounded to America’s benefit in the long run.  Ditto for the likes of Martin Luther King and Susan B. Anthony and Rachel Carson and Ralph Nader and innumerable other restless rabble-rousers who found a glaring blemish in the national complexion and took it upon themselves to fix it.

Criticizing your country is the first step to perfecting it.  It’s how you keep your country honest, challenging it to live up to its loftiest ideals.

Why settle for anything less?

The Prettiest Sight

In The Philadelphia Story, a lowly reporter played by James Stewart scornfully intones, “The prettiest sight in this fine, pretty world is the privileged class enjoying its privileges.”  For one week on Martha’s Vineyard earlier this summer, that’s exactly what I was doing.  And oh, what a pretty sight it was.

Certainly, the Vineyard—regularly ranked among the priciest vacation spots in America—screams “privilege” in any season, from its private beaches and golf courses to its posh restaurants and hotels to its A-list clientele.

In my case, however, the fact that I was among New England’s most well-heeled (albeit in a budget-friendly rental unit with no room service) was ancillary to the real privilege I enjoyed for eight days and seven nights on this triangular island seven miles off Cape Cod:  The privilege to not care what was happening in the universe beyond the shore.  The privilege to disconnect from current events and suffer no consequences whatsoever.

See, in my normal, landlubber life, I’m plugged into the global newsfeed about as deeply as any good American should be, monitoring Twitter and the New York Times with freakish regularity to ensure I am always in the loop about whatever unholy nonsense the president has gotten himself into today (among other things).

But while on vacation, I made a deliberate effort to disengage from the minute-by-minute deluge of copy that otherwise scrolls across my transom, and just try to relax for a change.  By and large, I succeeded.

To be clear, this did not entail a total 24/7 news blackout.  Rather, it meant checking Facebook two or three times per day instead of the usual thirty.  It meant scanning Boston Globe headlines without necessarily reading the articles underneath them.  It meant not watching a single segment of cable or network television.

Most significantly, it meant absolute abstention from Twitter, and all the nauseating, petty political catfighting contained therein.

It meant, in effect, that I still had a vague, general sense of what was happening across the seven continents, but without the fuss of getting bogged down in the details.

What I took away from this experiment—this voluntary, temporary withdrawal from the media-industrial complex—was how precious little I was missing.  How trivial such seemingly earth-shaking stories really are when viewed in proper perspective.  How oddly pleasant it was not to be waist-deep in the muck of political tomfoolery at every hour of every day.  And how much I dreaded returning to my usual routine in the real world—which, of course, I did with all deliberate speed.

It raised the question:  What’s so great about the real world, anyway?  Why do I burden myself with the minutiae of global happenings when I could just as well spend my free time going for long walks and plowing through the collected works of Agatha Christie?

Keeping on top of the news may make me conscientious and informed, but does it really make me happy?  Would I be any worse off, as a person, were I to harness the laid-back habits I picked up on the Vineyard and maintain them until the end of my natural life?

In all likelihood I would not be, and that, in so many words, is the true meaning of privilege in 2019 America.  It’s not a question of wealth or fame (of which I have none).  Rather, it’s about the ability to tune out.  To be mentally on vacation for as long as one’s heart desires.  To ignore such unpleasantries as war, famine, global warming and the Trump administration and be affected by them not one whit.

Deep down, of course, this is just white privilege by another name, since to be white in America is to know that, however bad things may get, there will always be a spot for you on the lifeboat.  And to be a white man, all the better.

Naturally, as a bleeding heart liberal (or social justice warrior, or whatever we’re supposed to call ourselves now), I can hear the angel on my shoulder gently reminding me that the role of the Woke White Person in Trump’s America is to support and agitate on behalf of the downtrodden—immigrants, Muslims, and pretty much anyone else who isn’t Caucasian and/or male and doesn’t have the luxury to take a mental health break from reality—which requires paying close attention to what is being inflicted upon one’s fellow countrymen—and aspiring countrymen—on our watch, in our name.

On reflection, it seems like a fair price to pay for someone whose life is sufficiently charmed as to be able to spend a week of every June in a place like Martha’s Vineyard, watching the sun rise over Edgartown Harbor and guzzling beer and clam chowder without a care in the world.

After all, there is some happiness to be found in simply being involved—however meekly—in the national discourse, particularly when Election Day rolls around, as it is wont to do every now and again.  That’s to say nothing of the lowly blogger, who will sooner or later need to write about something other than lobster rolls and how to avoid being eaten by a shark.

All That Is Written

“All that is thought should not be said, all that is said should not be written, all that is written should not be published, and all that is published should not be read.”

Those words were coined by a Polish rabbi named Menachem Mendel of Kotzk in the middle of the 19th century.  Surely they have never been more urgently needed than in the United States in 2019.

Just the other day, for instance, the venerable Boston Globe published an online op-ed by Luke O’Neil, a freelance columnist, expressing his rather pointed thoughts about the recently sacked homeland security secretary, Kirstjen Nielsen.  Its throat-clearing opening line:  “One of the biggest regrets of my life is not pissing in Bill Kristol’s salmon.”  (Kristol, you’ll recall, was a leading cheerleader for the Iraq War.)

The rest of the column continued in the same vein, castigating Nielsen for her complicity in President Trump’s policy of separating children from their parents at the Mexican border, and advocating for a general shunning of Nielsen from polite society, up to and including doing unsavory things to her food whenever she turns up at a fancy restaurant.

Following a small uproar among its readers, the Globe subsequently re-wrote parts of O’Neil’s piece—cutting out the word “pissing,” among other things—before ultimately removing it from its webpages entirely.  (It never appeared in print in any form.)  All that currently remains of the thing is an editor’s note explaining that the column “did not receive sufficient editorial oversight and did not meet Globe standards,” adding, rather gratuitously, “O’Neil is not on staff.”

Locally, much has been said and written about the Globe’s (lack of) judgment in ever believing an op-ed about poisoning a public official’s dinner—however cheeky—was fit to publish in the first place.  For all of its obvious liberal biases, the Globe opinion page is a fundamentally grown-up, establishmentarian space, suggesting this episode was a bizarre, one-off aberration and nothing more.

The deeper question, however, is what brings an uncommonly thoughtful and clever writer to put such infantile thoughts to paper in the first place.

And I’m not just talking about Luke O’Neil.

Let’s not delude ourselves:  Ever since Secretary Nielsen was hounded from a Mexican restaurant last summer in response to her department’s repugnant immigration policies, every liberal in America has had a moment of silent contemplation about what he or she would do or say to Nielsen given the chance.  That’s to say nothing of her former boss, the president, and innumerable other members of this wretched administration.

Indeed, plumb the deepest, darkest thoughts of your average politically aware American consumer, and you’re bound to emerge so covered in sludge that you may spend the rest of your life trying to wash it off.

This is why we differentiate thoughts from actions—morally and legally—and why the concept of “thought crime” is so inherently problematic.  Outside of the confessional, no one truly cares what goes on inside your own head so long as it remains there, and most of us have the good sense to understand which thoughts are worth expressing and which are not.

Except when we don’t, and in the age of Trump—with a major assist from social media platforms whose names I needn’t mention—an increasing number of us don’t.

Because it is now possible for any of us to instantaneously broadcast our basest and most uninformed impressions on any subject to the entire world, we have collectively decided—however implicitly—that there needn’t be any filter between one’s mind and one’s keyboard, and that no opinion is more or less valid than any other.  In the Twitterverse, “Let’s expand health insurance coverage” and “Let’s defecate in Kirstjen Nielsen’s salad” carry equal intellectual weight.

As a free speech near-absolutist, I can’t deny the perverse appeal in having no meaningful restrictions on what one can say in the public square.  With political correctness exploding like a cannonball from America’s ideological extremes, it’s heartening to know that reports of the death of the First Amendment have been greatly exaggerated, indeed.

Or it would be—until, say, a newly elected congresswoman from Michigan tells a group of supporters, “We’re gonna go in there and we’re gonna impeach the motherfucker,” and suddenly discretion seems very much the better part of valor.

Among the many truisms that life under the Trump regime has clarified is the fact that just because something can be done, it doesn’t mean it should be done.  And the same is true—or ought to be—about how each of us expresses ourselves to the wider world.

I don’t mean to sound like a total prude.  After all, I’m the guy who wrote a column in mid-November 2016 calling the newly-elected president a selfish, narcissistic, vindictive prick, and who tried to cheer my readers up the day after the election by noting that Trump could drop dead on a moment’s notice.

With two-and-a-half years of hindsight, I’m not sure I should’ve written either of those things, not to mention a few other snide clauses and ironic asides here and there ever since.  They weren’t necessary to make my larger points, and like the opening quip in Luke O’Neil’s Globe column, their rank immaturity and meanness only served to cheapen whatever it was I was trying to say.

As someone who claims to be a writer, I try to choose my words carefully and with at least a small degree of charity.  With great power—in this case, the power of words—comes great responsibility.  And that includes leaving Kirstjen Nielsen’s salmon alone.

Unplugged

I recently returned from a week-long trip to paradise—Martha’s Vineyard, to be exact—and while I was there, I did something that, for me, was both unthinkable and unprecedented.

I kept away from social media and the news.

That’s right.  From the moment our ferry cast off from shore, I ceased all contact with my Twitter feed and didn’t reconnect until after returning to the mainland.  For good measure, I also generally avoided Facebook, the New York Times and cable news, opting to remain as ignorant as possible about what was going on in the parts of the universe not directly in front of my nose.  For perhaps the first time in my adult life, I just didn’t want to know.

Now, maybe tuning the world out is the sort of thing most normal people do to relax at their favorite summer getaways.  But as a prototypical millennial news junkie, I can scarcely imagine being walled off from current events for more than a few hours at a time, vacation or no vacation.  Since acquiring my first Droid in the summer of 2010, I’m not sure I’ve gone a single day without checking my social media apps at least once.  You know:  Just to make sure I’m not missing anything.

Having lived under the tyranny of Zuckerberg and Bezos for so long, I’ve realized with ever-growing acuity that I am every bit as addicted to the little computer in my pocket—and the bottomless information it contains—as the good-for-nothing Generation Z teenagers I’m supposed to feel superior to.  More and more, I recall Jean Twenge’s terrifying recent Atlantic story, “Have Smartphones Destroyed a Generation?” and I wonder whether any of us—of any age group—are going to emerge from this era better citizens and human beings than when we entered it.

So it was that, on the occasion of my annual sojourn to my favorite summer retreat—an island I’ve been visiting since before I was born—I decided I needed to find out whether I’m capable of cutting myself off from the GoogleTube cold turkey.  Whether—if only for a week—I can bring myself to live as I did for the first 23 years of my life:  Without constant, hysterical, up-to-the-second news flashes from every corner of the globe and, with them, the instantaneous expert (and non-expert) analysis of What It All Means and Where We Go From Here.

Mostly, of course, I just wanted a week without Donald Trump.

Did I succeed?

Kind of.

Yes, I still read the Boston Sunday Globe (mostly for the arts pages).  Yes, I still listened to my favorite NPR podcast while riding my bike.  Yes, I still posted pictures on Facebook before going to bed.  And yes, I still allowed my cable-obsessed bunkmate to watch a few minutes of Morning Joe before we headed out to breakfast each day.

All of that aside, I nonetheless fulfilled my core objective of not actively following world events closely—if at all—and believing, to my core, that nothing in life was of greater concern than which ice cream flavor to order at Mad Martha’s and whether to wear jeans or shorts while hiking at Menemsha Hills.  (The answers, respectively, were butter crunch and jeans.)

So I didn’t get the blow-by-blow of President Trump’s meeting in Singapore with Kim Jong-un.  I didn’t hear the early reports of children being snatched from their parents at the Mexican border.  And I didn’t see that raccoon scaling the UBS Tower in St. Paul, Minnesota.

What’s more, I noticed that as the week progressed, I grew less and less bothered by how out-of-the-loop I was in my little self-imposed cone of radio silence, and it got me wondering whether I couldn’t keep up this stunt indefinitely.  Whether, in effect, I could become a beta version of Erik Hagerman—the Ohio man, recently profiled in the New York Times, who severed all ties with society on November 9, 2016, and hasn’t looked back since.  Dubbing him “the most ignorant man in America,” the story left little doubt that Hagerman, in his calculated obliviousness, is probably a happier and more well-rounded individual than three-quarters of his fellow countrymen.

Of course, Hagerman is also extremely white—not to mention extremely male and extremely upper middle class—and there is no avoiding the uncomfortable fact that choosing to ignore the daily machinations of the Trump administration is a direct function of white privilege (as countless Times readers pointedly noted at the time).  To be white is to be insulated from Trump’s cruelest and most outrageous policies; thus, there is little-to-no risk in not keeping a close eye on them every now and again.

“The prettiest sight in this fine, pretty world is the privileged class enjoying its privileges,” said Jimmy Stewart, with great scorn, in The Philadelphia Story in 1940.  As a member of the privileged class—in my whiteness and maleness, if not my disposable income—I recognize the profound moral failing of even thinking of mentally tuning out an American society in which virtually every racial, ethnic and cultural minority finds itself under threat.  Silence is complicity, and I very much doubt I could live in happy ignorance knowing, deep down, that a great deal of preventable suffering is occurring just beyond my immediate line of sight.

But it sure was nice while it lasted.