The Ultimate Aphrodisiac

American liberals have caught a lot of flak this season—some of it deserved—for the rigid purity tests they’ve imposed on the men and women auditioning to be the next president of the United States.

As irritating as this moral posturing tends to be, please indulge me one small litmus test of my own:  In November 2020, I will not vote for any candidate who has been credibly accused of rape.

Admittedly, this doesn’t seem like an awful lot to ask of the would-be most powerful person on Earth—the man or woman who is supposed to be a role model for America’s children and grownups alike.

However, recent history would suggest otherwise.

If polls are to be believed, there is a certain chunk of the American electorate—somewhere north of 40 percent—that does not consider accusations of sexual assault to be a deal-breaker for a future (or sitting) commander-in-chief.  This was first demonstrated two decades ago by Bill Clinton’s continued sky-high approval ratings following the rape allegation leveled by Juanita Broaddrick in 1999, and later confirmed by the election of the current chief executive, Donald Trump, whose penchant for grabbing women’s nether regions uninvited was exposed by the candidate himself (via “Access Hollywood”) in October 2016 and by more than a dozen women at regular intervals ever since.

It’s worth noting—in case it wasn’t obvious—that this implicit condoning of felonious, predatory sexual behavior by America’s head of state is not a one-party problem.  Liberals and conservatives have both been complicit, and both are guilty of gross hypocrisy on the matter.  For most Americans, it would seem, the morality of sexual violence by politicians is largely a function of time:  When the opposing party is in power, rape is bad.  When one’s own party is in power, rape is negotiable.

At the moment, of course, it’s Republicans who have disgraced themselves on the question of whether sexual assault is a good idea, thanks—most recently—to the disturbing revelations by E. Jean Carroll in New York Magazine.

In case you missed it, Carroll has claimed that Trump forced himself on her in a Bergdorf Goodman dressing room in the mid-1990s, which she tried—unsuccessfully—to resist.  While Carroll herself insists the encounter did not amount to rape and does not want to be viewed as a helpless victim, it is extremely difficult to read the details of her account and reach any other conclusion.

This bombshell initially landed on June 21 and, following a weekend of radio silence, was picked up by a handful of news organizations, which gave it enough oxygen to force the president to deny the incident ever occurred, adding—as only he can—“[Carroll] is not my type.”

In the weeks since, the whole nasty business has all but evaporated from the public consciousness, replaced by newer, flashier headlines on other subjects.  As with so much else, the prospect that the president once committed a violent sexual assault ended up being a three-day story, at most; the public shrugged and moved on.

It raises the question:  Why?

Are our attention spans so short that serious allegations of rape simply don’t register like they used to?  Are we so fatigued and fatalistic about this president’s long history of indiscretions that we have given up differentiating one from another?  Nearly two years into #MeToo, do we simply not believe E. Jean Carroll, or do we assume her memory is faulty?

Or is it possible that we actually like the idea of a president who is effectively above the law?  Who can do whatever he wants and get off scot-free?  Who is exempt from all the usual rules of ethics and common decency?  Who can rape somebody on Fifth Avenue and not lose any votes?

We don’t admit this out loud, of course.  We use euphemisms like “He’s politically incorrect,” or “He tells it like it is,” or my personal favorite, “He’s not a politician.”

Whichever option is closest to the truth, the underlying rationalization is that any level of unscrupulousness and corruption by the Dear Leader is tolerable so long as he ultimately gives his constituents what they want. 

Trump, for his part, has long been described as a purely transactional figure—someone for whom the ends always justify the means and the notion of right and wrong is a foreign concept.  Less remarked upon—but no less important—is that the general public is transactional as well, and is prepared to forgive any number of shortcomings in service of a greater good.

Hence Trump’s consistently stratospheric approval ratings among Republicans.  After all, if you voted for him on the grounds that he would cut your taxes, appoint conservative judges and make refugees’ lives a living hell, why wouldn’t you be happy with the way this presidency has panned out thus far?

The left can crow all it wants about what a sordid ethical compromise Trump’s base has made, but Democrats’ moral superiority is only as good as the next president of their own party.  Liberals were perfectly happy to excuse every one of Bill Clinton’s sexual peccadillos while he was in power and carrying out their agenda (such as it was).  While they have had a radical change of heart in recent years, I cannot help but wonder if they would feel differently if The Man From Hope were still in the Oval Office today.

Henry Kissinger famously said, “Power is the ultimate aphrodisiac,” and it turns out that applies not only to those exercising power but also to its many millions of beneficiaries.  It’s a pretty ugly sight when roughly half the nation consciously accepts a credibly accused rapist as the instrument of its political ends, but then one reason we have elections is to correct course, as America stands to do on November 3, 2020.  While there’s more to the presidency than not being a sexual criminal, it’s a perfectly decent place to start.

Perhaps electing a woman would do the trick.

The Prettiest Sight

In The Philadelphia Story, a lowly reporter played by James Stewart scornfully intones, “The prettiest sight in this fine, pretty world is the privileged class enjoying its privileges.”  For one week on Martha’s Vineyard earlier this summer, that’s exactly what I was doing.  And oh, what a pretty sight it was.

Certainly, the Vineyard—regularly ranked among the priciest vacation spots in America—screams “privilege” in any season, from its private beaches and golf courses to its posh restaurants and hotels to its A-list clientele.

In my case, however, the fact that I was among New England’s most well-heeled (albeit in a budget-friendly rental unit with no room service) was ancillary to the real privilege I enjoyed for eight days and seven nights on this triangular island seven miles off Cape Cod:  The privilege to not care what was happening in the universe beyond the shore.  The privilege to disconnect from current events and suffer no consequences whatsoever.

See, in my normal, landlubber life, I’m plugged into the global newsfeed about as deeply as any good American should be, monitoring Twitter and the New York Times with freakish regularity to ensure I am always in the loop about whatever unholy nonsense the president has gotten himself into today (among other things).

But while on vacation, I made a deliberate effort to disengage from the minute-by-minute deluge of copy that otherwise scrolls across my transom, and just try to relax for a change.  By and large, I succeeded.

To be clear, this did not entail a total 24/7 news blackout.  Rather, it meant checking Facebook two or three times per day instead of the usual thirty.  It meant scanning Boston Globe headlines without necessarily reading the articles underneath them.  It meant not watching a single segment of cable or network television.

Most significantly, it meant absolute abstention from Twitter, and all the nauseating, petty political catfighting contained therein.

It meant, in effect, that I still had a vague, general sense of what was happening across the seven continents, but without the fuss of getting bogged down in the details.

What I took away from this experiment—this voluntary, temporary withdrawal from the media-industrial complex—was how precious little I was missing.  How trivial such seemingly earth-shaking stories really are when viewed in proper perspective.  How oddly pleasant it was not to be waist-deep in the muck of political tomfoolery at every hour of every day.  And how much I dreaded returning to my usual routine in the real world—which, of course, I did with all deliberate speed.

It raised the question:  What’s so great about the real world, anyway?  Why do I burden myself with the minutiae of global happenings when I could just as well spend my free time going for long walks and plowing through the collected works of Agatha Christie?

Keeping on top of the news may make me conscientious and informed, but does it really make me happy?  Would I be any worse off, as a person, were I to harness the laid-back habits I picked up on the Vineyard and maintain them until the end of my natural life?

In all likelihood I would not be, and that, in so many words, is the true meaning of privilege in 2019 America.  It’s not a question of wealth or fame (of which I have none).  Rather, it’s about the ability to tune out.  To be mentally on vacation for as long as one’s heart desires.  To ignore such unpleasantries as war, famine, global warming and the Trump administration and be affected by them not one whit.

Deep down, of course, this is just white privilege by another name, since to be white in America is to know that, however bad things may get, there will always be a spot for you on the lifeboat.  And to be a white man, all the better.

Naturally, as a bleeding-heart liberal (or social justice warrior, or whatever we’re supposed to call ourselves now), I can hear the angel on my shoulder gently reminding me that the role of the Woke White Person in Trump’s America is to support and agitate on behalf of the downtrodden—immigrants, Muslims, and pretty much anyone else who isn’t Caucasian and/or male and doesn’t have the luxury of taking a mental health break from reality.  Playing that role requires paying close attention to what is being inflicted upon one’s fellow countrymen—and aspiring countrymen—on our watch, in our name.

On reflection, it seems like a fair price to pay for someone whose life is sufficiently charmed as to be able to spend a week of every June in a place like Martha’s Vineyard, watching the sun rise over Edgartown Harbor and guzzling beer and clam chowder without a care in the world.

After all, there is some happiness to be found in simply being involved—however meekly—in the national discourse, particularly when Election Day rolls around, as it is wont to do every now and again.  That’s to say nothing of the lowly blogger, who will sooner or later need to write about something other than lobster rolls and how to avoid being eaten by a shark.

A Queer Notion

On this final day of Pride Month 2019, allow me to note, for the record, that although I am technically a member of the LGBTQIA community (increasingly the most unwieldy acronym in the English language), you’ll never see me marching in any pride parade.

Why not?  In short:  Because I’m not much into parades and I’m not much into pride.

As I’ve possibly written before, I do not think one’s sexual orientation or gender identity should be a point of personal pride.  Rather, I tend to agree with George Carlin, who posited in his final HBO special in 2008, “Pride should be reserved for something you achieve or attain on your own, not something that happens by accident of birth.”

If we are to accept—as we should—that homosexuality and gender dysphoria are naturally occurring phenomena that are totally beyond our control, what exactly is there to be proud of in acknowledging their existence?  Morally speaking, being attracted to the same sex is no different from having green eyes or brown hair, so why should one be celebrated while the others are taken for granted without comment?  What, pray tell, are we celebrating?

The question is worth asking during any Pride Month, but it has acquired extra resonance this year in my home state of Massachusetts in light of the so-called “Straight Pride Parade” scheduled to take place in Boston later this summer.

Conceived and organized by a rogues’ gallery of right-wingers calling themselves Super Happy Fun America, this prospective pro-hetero march is an unabashedly snarky, unserious and mean-spirited enterprise, intended primarily to protest and ridicule the means by which the queer community has seized cultural power in recent years, as one barrier to LGBT equality after another has fallen by the wayside.  (The odious—and highly non-straight—Milo Yiannopoulos will reportedly be the parade’s grand marshal.)

The gist of SHFA’s argument—which should hardly be dismissed out of hand—is that the LGBT contingent and its allies have become far too militant in enforcing the new rules on what can and cannot be said in public about the nature of various sexual identities, and far too unforgiving toward those who stray—either by accident or on purpose—from the official party orthodoxy on the matter.

Case in point:  When the idea of a “straight pride parade” was decried by the entire cast of The View, the group released an ever-so-tongue-in-cheek statement, calling the ABC program’s condemnation “an act of literal violence that has endangered the lives of heterosexuals everywhere,” adding, “Heterosexuals have languished in the shadows for decades, but we’re not taking it lying down.  Until an ‘S’ is added, LGBTQ pride will continue to be a system of oppression designed to systematically erase straight people from existence.”

The joke, in other words, is that the LGBT rights movement has been so wildly successful as of late—and has, indeed, so fully entered into mainstream culture as to be borderline uninteresting—that it has apparently left many heterosexuals feeling left out and marginalized.  As with men and women in the age of #MeToo, the victims have supposedly become the victimizers, and vice versa.  And so long as straight people see themselves as a disfavored minority—albeit one that comprises well over 90 percent of the population—why not release some of that pent-up anxiety with a good old-fashioned parade?

Yes, it’s manifestly ridiculous—but why is it any more ridiculous than a parade celebrating its opposite? 

Either we’re all equal or we’re not.  Having spent decades successfully convincing most of America that it’s wrong to judge people on the basis of sexual orientation or gender identity, don’t America’s queer folk have a special responsibility to allow heterosexuality to be given its proper due?  Since when did sexual identity become a zero-sum game?

In a Newsweek cover story in 2012 that half-jokingly referred to Barack Obama as “the first gay president,” Andrew Sullivan wrote, “The point of the gay rights movement […] is not about helping people be gay.  It is about creating the space for people to be themselves.”  This, in a way, was a re-stating of Sullivan’s 2010 proclamation, “The goal of the gay rights movement should be to cease to exist.”

So far as I’m concerned, that is the attitude the LGBT community should strike about itself in 2019:  We’re here.  We’re queer.  Let’s move on.

Racism at the Museum

This is why I don’t like being part of a club:  Because whenever one member of the club does something stupid, it somehow makes every other member look (and feel) like an idiot.

As reported last week in the Boston Globe, earlier this month a group of seventh graders from a predominantly African-American middle school in Boston went on a class trip to the Museum of Fine Arts, during which they were subjected to an unexpected torrent of racism from staff and fellow patrons alike.

According to witnesses, shortly after their arrival, the class was told by a museum docent, “No food, no drink and no watermelon.”  Later on, students reported being trailed throughout the galleries—to the point of extreme discomfort—by various security guards who seemed to have no interest in the white kids nearby.  Additionally, according to the Globe, one student was chided by an adult museumgoer “about paying attention in the MFA so she could avoid a career as a stripper,” while another visitor blurted out, “there’s [expletive] black kids in the way.”

And we wonder why Boston is still regarded as a less-than-hospitable place for black people to live and raise their families.

Reading the details of this bizarre field trip from hell, I found myself embarrassed and appalled at least three times over:  First, as a human being; second, as a longtime resident of greater Boston; and third, as a frequent visitor to—and member of—the Museum of Fine Arts itself.

See, it’s one thing when some casual act of Northern racism occurs on a subway platform or in the bleachers at Fenway Park—places that are loud, dense, messy and more or less open to the general public.

By contrast, an art museum—in this case, arguably the finest in all of New England and the mightiest weapon in Boston’s cultural arsenal—is supposed to be the sort of refined, enlightened and (it must be said) exclusive repository of human excellence where that sort of submental crap doesn’t happen.  I don’t know about you, but I can spend hours wandering through art galleries without uttering a word to anyone, let alone complaining about “[expletive] black kids” and giving unsolicited career advice to random 13-year-old girls I might encounter along the way.

As an MFA member—someone who, for a mere $75 per year, is free to explore the museum’s innumerable holdings and special exhibitions to my heart’s content—I take my privileges seriously enough to respect the institution and all the people in it, and I expect everyone else to do the same.

And so, when a pair of fellow patrons made a wretched spectacle of themselves in the very halls where the city’s most priceless treasures are displayed, I could not help but take it personally.  Much as the rotten behavior of one ballplayer reflects poorly on the entire team, so, too, did it feel as though the bald racism of two museum guests tainted the character of all the others.

I love the MFA dearly, and I don’t want to tell an out-of-towner about some terrific new exhibit there and be asked, “Oh, you mean the place that treats black people like garbage?”  Nor, I might add, do I want to be in the position of answering, “Yes, that’s the one.”

Following a formal inquiry, the MFA—to its credit—was able to identify the morons who made those ridiculous comments and has banned them from the premises.  (Both were, in fact, members.)  As for the “watermelon” comment from an unnamed staff member, officials could neither confirm nor refute that such a thing was said:  The person in question claimed to have told the class, “No food, no drink and no water bottles”—the official museum policy for all visitors—raising the possibility that he or she either severely misspoke or (less likely) was misheard.

Regarding the overbearing security guards, a museum spokesperson insisted—rather unconvincingly—that they were following normal protocol at all times, while nonetheless acknowledging, “[I]t is understandable that […] the students felt followed [and] it is unacceptable that they felt racially profiled, targeted, and harassed.”

As short-term damage control goes, the MFA’s response to this mess has been reasonably adequate, inasmuch as it has taken the students’ complaints seriously, has apologized multiple times and in multiple ways, and has pledged to reassess and tweak its policies to ensure this sort of horror show doesn’t happen again.

In the long term—as the Globe and others have loudly opined over the last week—the museum needs to figure out how to foster a clientele that is sufficiently diverse—racially and socioeconomically—that a group of black seventh-graders will feel just as welcome and at home there as I do.

This could certainly be achieved through an acceleration of the MFA’s pre-existing effort to spotlight more artists of color in its galleries, events and various other special programs—and to make those events free of charge, as many of them already are.

Or better yet:  Why not just make the whole damn place free of charge?  Rather than perpetuating the aura of exclusivity (read: exclusion) that its $25 admission fee engenders, why shouldn’t this most indispensable of civic institutions truly become a gathering place for all the people, à la the Boston Public Library or Faneuil Hall?  If New York’s Metropolitan Museum can offer free—or rather, voluntary—admission for all New York State residents without going bankrupt, why not the MFA?

It won’t solve all of Boston’s racism problems at once.  But it would at least be an acknowledgement that those problems still exist, and that access to art is no less essential to a flourishing and equitable society than access to education or healthcare.  Someday the world will understand that, and thereby become an immeasurably better place to live.

Making the Case

“You think a lot about people you encounter, and there are a number of them in our community who voted for Barack Obama and Donald Trump and Mike Pence and me.  And one thing you realize […] is that it means that voters are maybe not as neatly ideological as a lot of the commentary assumes.”

So said Pete Buttigieg—the mayor of South Bend, Ind., and one of the two dozen Democrats running for president in 2020—making arguably the most succinct possible case for electing a so-called “moderate” as the party’s standard-bearer against Donald Trump in the election next November.

Needless to say (but why not?), the question of what kind of Democrat ought to represent America’s loyal opposition in 2019 and beyond is the singular point of contention that primary voters will—and should—be debating over the next year and change.  Broadly speaking, the eventual nominee could come from three possible spots on the ideological spectrum—the center, the left, or the far left—and a great deal depends on whether the Democrats’ perception of the country’s overall political bent matches the reality thereof.

Before we go any further, allow me to disclose loudly and clearly that, barring highly unforeseen circumstances, I will be voting for the Democratic nominee on November 3, 2020, whoever he or she happens to be.  With Trump as the incumbent, I would happily and unreservedly support any of the possible alternatives without a shadow of a second thought.  Elections are about choices, and lesser-of-two-evils is the name of the game.

One presumes, of course, that a certain percentage of the electorate—somewhere between 40 and 45 percent, say—is on precisely the same wavelength as I am, and can be counted upon to reflexively line up behind the Democratic nominee, come hell or high water—a near-perfect reflection, ironically enough, of the #MAGA rubes who will stick with the president even if/when he murders somebody on Fifth Avenue in broad daylight.

In truth, when you add up every voter who, for all intents and purposes, has already made up his or her mind—i.e., will definitely vote for Trump or will definitely vote for his main challenger—you would be lucky to have more than 10 percent of the electorate left over.

And yet, as ever, that 10 percent (or whatever) is precisely where the whole damn thing will be decided.  Indeed, while it’s true that every presidential election in our lifetimes has come down to the comparatively minuscule slice of the public known as “swing voters,” the singularly polarizing nature of the Trump era has shrunk America’s protean middle to little more than a sliver, thereby increasing the power and influence of every member therein, for better and for worse.

All of which is to affirm Pete Buttigieg’s implicit argument about how to win the 2020 election:  By making yourself appealing to the widest possible cross-section of the public.  That begins with assuming that every genuinely undecided voter is persuadable, and acting accordingly.

Practically, this would certainly include venturing into enemy territory—Fox News—to make the case for why you’d be a leader for all Americans, not just those who watch MSNBC.  (Buttigieg and Bernie Sanders have smartly done this already, while Elizabeth Warren has foolishly, and loudly, refused.)  As well, it would require not smearing half the electorate as a bunch of freeloaders (à la Mitt Romney) or a “basket of deplorables” (à la Hillary Clinton).

In truth, it would entail little more than taking the American people seriously and treating them, more or less, like adults.

When Buttigieg reminds us about a certain, non-trivial chunk of our fellow citizens who voted for Obama in 2012 only to switch to Trump in 2016—and who, presumably, could swing back in the future—we are forced to reckon with the possibility that these folks’ political loyalties are a function of something other than racial resentment or any sort of coherent philosophy about the role of government in a free society.

Maybe, unlike us, they don’t spend 12 hours a day watching the news break on basic cable and Twitter, absorbing every last detail about life inside the beltway.  Maybe they lead busy, apolitical lives and haven’t given much thought lately to Robert Mueller or Roe v. Wade.

Maybe their tastes in presidents are more instinctual and elemental than weighing one set of policy proposals against another.  Maybe they voted for Obama because he promised them better healthcare, and for Trump because he promised them…better healthcare.

At the risk of reductionism and oversimplification, maybe the secret to winning an election is vowing to give people what they want and not calling them idiots more often than is strictly necessary.

Would this necessitate misrepresenting, watering down or otherwise compromising your core moral and political values?  Only if you believe those values aren’t worth defending to a possibly skeptical audience.  And if that’s the case, why in holy hell should anyone vote for you in the first place?

Notorious THC

I didn’t inhale on 4/20 this year.

However, I did ingest.

Specifically, I sucked on two pieces of watermelon-flavored hard candy infused with THC—the active ingredient in cannabis—until they completely dissolved on my tongue and entered my bloodstream.

To be clear, I didn’t pop both pieces into my mouth in rapid succession.  I’ve read Maureen Dowd’s infamous 2014 column about her ill-fated run-in with a hopped-up chocolate bar in Colorado (“I barely made it from the desk to the bed, where I lay curled up in a hallucinatory state for the next eight hours”), and I know better than to overindulge in an edible with so little experience under my belt.

No, I did exactly what the packaging told me to do:  “Start with one piece.  Wait two hours.  Be mindful.  Enjoy.”

In retrospect, I should’ve been a little less mindful.

Precisely 120 minutes after my first dose, I felt no physical or psychological effects whatsoever.  At that point, I rather restively administered dose number two, from which proceeded another hour-plus of nothing, followed, at long last, by a slight tingle of…something.  Not a high, per se, let alone a full-blown case of the giggles and/or the munchies.  Nope, just a passing wave of vague euphoria that ended almost as quickly as it began—five, ten minutes, tops.

And that, regrettably, was that.  An evening’s worth of buildup to a virtually non-existent payoff.  So much for the warning on the back of the box:  “The effects of this product may last for many hours.”

What made this 4/20 test run all the curiouser was how very different it was from the night before, Good Friday, when I introduced myself to the world of edibles for the very first time.  In that case, I consumed a single lozenge around 8 o’clock.  At 9:15, while sprawled on the couch watching TV, I found myself breaking into a spontaneous full-body sweat, my heart thumping 50 percent harder than it was a moment before, my mind unable to concentrate on anything beyond removing my socks so my feet wouldn’t suffocate.

While I wouldn’t describe this scene as Maureen Dowd-level paralysis—“hallucinatory” is too grand a word to apply here—I nonetheless kept more or less completely still as the weird and less-than-wonderful sensation swept over me, resigned to sitting quietly until the perspiration subsided and my heart tucked itself back into my chest, where it belongs.

When both of those things occurred—again, it didn’t take terribly long, although it sure felt like it—I had no particular urge to double my money with another hit of THC just then.  As a newbie, I figured it was better to quit while I was ahead, declare the experiment a success (of sorts) and spend the balance of my Friday night with a relaxing—and altogether predictable—bottle of merlot.

It’s a truism of the pot world that marijuana affects everyone differently.  As has now been proved to me, it is equally true that its effects on a given individual can vary from one day to the next.

Of course, none of the above would be of the slightest interest to anybody, except for one extraordinary fact:  It was all perfectly legal.

Through a voter referendum, the commonwealth of Massachusetts legalized the sale and consumption of marijuana for recreational purposes on November 8, 2016.  And last Thanksgiving—a mere 742 days after the fact—the state’s first two cannabis retail establishments officially opened for business.

Today, there are 15 pot shops (and counting) sprinkled across Massachusetts—including, as of last month, the first recreational dispensary in Greater Boston, New England Treatment Access (NETA) in Brookline, which is where I found myself last Friday morning.  When I arrived at 9:50, there were at least 30 people lined up outside the former Brookline Bank where NETA is housed, waiting to get in.  When the place opened 10 minutes later, at least as many cheery townsfolk were lined up behind me.  Apparently I wasn’t the only one who knew that April 20 was mere hours away.

Customers were escorted inside the ornate marble building five at a time—after getting their IDs checked and scanned, with a Brookline police officer stationed casually nearby—and were promptly handed an eight-page menu of the shop’s offerings while they waited for the next available register.  (As with the bank that used to occupy the same space, all valuables were concealed safely behind the counter.)

While tempted by the Belgian dark chocolate bar—Maureen Dowd’s experience notwithstanding—I finally opted for the 16-piece “D-Line Gems,” which the sales associate fetched and rang up for an even $30—$25 for the candy itself, plus a 20 percent sales tax that, per the law, is added to all cannabis-related purchases.  (Actually, it’s three different taxes in one—“local marijuana tax,” “sales tax (cannabis)” and “marijuana excise tax”—but who’s counting?)

Oddly, I wasn’t the slightest bit interested in purchasing an actual cannabis plant, nor the various accessories that go with it.  At my advanced age (31), I suppose I just don’t have the patience for the rituals that old-fashioned pot smoking entails.  As a working man who regularly interacts with the general public, I could certainly do without the smell.

In truth, I could probably do without marijuana altogether, whether smoked, sucked, swallowed or swilled.  Before last week, I hadn’t touched the stuff in nearly nine years, and only a handful of times before that.  Sometimes it’s been a blast; other times, a bust.  I expect I’ll be paying NETA another visit sooner or later, although I doubt it will become much of a habit.

In a country that still occasionally calls itself the land of the free, I’m just happy, at long last, to have the choice.

All That Is Written

“All that is thought should not be said, all that is said should not be written, all that is written should not be published, and all that is published should not be read.”

Those words were coined by a Polish rabbi named Menachem Mendel of Kotzk in the middle of the 19th century.  Surely they have never been more urgently needed than in the United States in 2019.

Just the other day, for instance, the venerable Boston Globe published an online op-ed by Luke O’Neil, a freelance columnist, expressing his rather pointed thoughts about the recently sacked homeland security secretary, Kirstjen Nielsen.  Its throat-clearing opening line:  “One of the biggest regrets of my life is not pissing in Bill Kristol’s salmon.”  (Kristol, you’ll recall, was a leading cheerleader for the Iraq War.)

The rest of the column continued in the same vein, castigating Nielsen for her complicity in President Trump’s policy of separating children from their parents at the Mexican border, and advocating for a general shunning of Nielsen from polite society, up to and including doing unsavory things to her food whenever she turns up at a fancy restaurant.

Following a small uproar among its readers, the Globe subsequently rewrote parts of O’Neil’s piece—cutting out the word “pissing,” among other things—before ultimately removing it from its webpages entirely.  (It never appeared in print in any form.)  All that currently remains of the thing is an editor’s note explaining that the column “did not receive sufficient editorial oversight and did not meet Globe standards,” adding, rather gratuitously, “O’Neil is not on staff.”

Locally, much has been said and written about the Globe’s (lack of) judgment in ever believing an op-ed about poisoning a public official’s dinner—however cheeky—was fit to publish in the first place.  For all of its obvious liberal biases, the Globe opinion page is a fundamentally grown-up, establishmentarian space, suggesting this episode was a bizarre, one-off aberration and nothing more.

The deeper question, however, is what brings an uncommonly thoughtful and clever writer to put such infantile thoughts to paper in the first place.

And I’m not just talking about Luke O’Neil.

Let’s not delude ourselves:  Ever since Secretary Nielsen was hounded from a Mexican restaurant last summer in response to her department’s repugnant immigration policies, every liberal in America has had a moment of silent contemplation about what he or she would do or say to Nielsen given the chance.  That’s to say nothing of her former boss, the president, and innumerable other members of this wretched administration.

Indeed, plumb the deepest, darkest thoughts of your average politically aware American consumer, and you’re bound to emerge so covered in sludge that you may spend the rest of your life trying to wash it off.

This is why we differentiate thoughts from actions—morally and legally—and why the concept of “thought crime” is so inherently problematic.  Outside of the confessional, no one truly cares what goes on inside your own head so long as it remains there, and most of us have the good sense to understand which thoughts are worth expressing and which are not.

Except when we don’t, and in the age of Trump—with a major assist from social media platforms whose names I needn’t mention—an increasing number of us don’t.

Because it is now possible for any of us to instantaneously broadcast our basest and most uninformed impressions on any subject to the entire world, we have collectively decided—however implicitly—that there needn’t be any filter between one’s mind and one’s keyboard, and that no opinion is more or less valid than any other.  In the Twitterverse, “Let’s expand health insurance coverage” and “Let’s defecate in Kirstjen Nielsen’s salad” carry equal intellectual weight.

As a free-speech near-absolutist, I can’t deny the perverse appeal of having no meaningful restrictions on what one can say in the public square.  With political correctness exploding like a cannonball from America’s ideological extremes, it’s heartening to know that reports of the death of the First Amendment have been greatly exaggerated, indeed.

Or it would be—until, say, a newly elected congresswoman from Minnesota tells a group of supporters, “We’re gonna go in there and we’re gonna impeach the motherfucker,” and suddenly discretion seems very much the better part of valor.

Among the many truisms that life under the Trump regime has clarified is that just because something can be done doesn’t mean it should be done.  And the same is true—or ought to be—of how we express ourselves to the wider world.

I don’t mean to sound like a total prude.  After all, I’m the guy who wrote a column in mid-November 2016 calling the newly elected president a selfish, narcissistic, vindictive prick, and who tried to cheer my readers up the day after the election by noting that Trump could drop dead on a moment’s notice.

With two-and-a-half years of hindsight, I’m not sure I should’ve written either of those things, not to mention a few other snide clauses and ironic asides here and there ever since.  They weren’t necessary to make my larger points, and like the opening quip in Luke O’Neil’s Globe column, their rank immaturity and meanness only served to cheapen whatever it was I was trying to say.

As someone who claims to be a writer, I try to choose my words carefully and with at least a small degree of charity.  With great power—in this case, the power of words—comes great responsibility.  And that includes leaving Kirstjen Nielsen’s salmon alone.