Racism at the Museum

This is why I don’t like being part of a club:  Because whenever one member of the club does something stupid, it somehow makes every other member look (and feel) like an idiot.

As reported last week in the Boston Globe, earlier this month a group of seventh graders from a predominantly African-American middle school in Boston went on a class trip to the Museum of Fine Arts, during which they were subjected to an unexpected torrent of racism from staff and fellow patrons alike.

According to witnesses, shortly after their arrival, the class was told by a museum docent, “No food, no drink and no watermelon.”  Later on, students reported being trailed throughout the galleries—to the point of extreme discomfort—by various security guards who seemed to have no interest in the white kids nearby.  Additionally, according to the Globe, one student was chided by an adult museumgoer “about paying attention in the MFA so she could avoid a career as a stripper,” while another visitor blurted out, “there’s [expletive] black kids in the way.”

And we wonder why Boston is still regarded as a less-than-hospitable place for black people to live and raise their families.

Reading the details of this bizarre field trip from hell, I found myself embarrassed and appalled at least three times over:  First, as a human being; second, as a longtime resident of greater Boston; and third, as a frequent visitor to—and member of—the Museum of Fine Arts itself.

See, it’s one thing when some casual act of Northern racism occurs on a subway platform or in the bleachers at Fenway Park—places that are loud, dense, messy and more-or-less open to the general public.

By contrast, an art museum—in this case, arguably the finest in all of New England and the silver bullet in Boston’s cultural arsenal—is supposed to be the sort of refined, enlightened and (it must be said) exclusive repository of human excellence where that sort of submental crap doesn’t happen.  I don’t know about you, but I can spend hours wandering through art galleries without uttering a word to anyone, let alone complaining about “[expletive] black kids” and giving unsolicited career advice to random 13-year-old girls I might encounter along the way.

As an MFA member—someone who, for a mere $75 per year, is free to explore the museum’s innumerable holdings and special exhibitions to my heart’s content—I take my privileges seriously enough to respect the institution and all the people in it, and I expect everyone else to do the same.

And so, when a pair of fellow patrons made a wretched spectacle of themselves in the very halls where the city’s most priceless treasures are displayed, I could not help but take it personally.  Much as the rotten behavior of one ballplayer reflects poorly on the entire team, so, too, did it feel as though the bald racism of two museum guests tainted the character of all the others.

I love the MFA dearly, and I don’t want to tell an out-of-towner about some terrific new exhibit there and be asked, “Oh, you mean the place that treats black people like garbage?”  Nor, I might add, do I want to be in the position of answering, “Yes, that’s the one.”

Following a formal inquiry, the MFA—to its credit—was able to identify the morons who made those ridiculous comments and has banned them from the premises.  (Both were, in fact, members.)  As for the “watermelon” comment from an unnamed staff member, officials could neither confirm nor refute that such a thing was said:  The person in question claimed to have told the class, “No food, no drink and no water bottles”—the official museum policy for all visitors—raising the possibility that he or she either severely misspoke or (less likely) was misheard.

Regarding the overbearing security guards, a museum spokesperson insisted—rather unconvincingly—that they were following normal protocol at all times, while nonetheless acknowledging, “[I]t is understandable that […] the students felt followed [and] it is unacceptable that they felt racially profiled, targeted, and harassed.”

As short-term damage control goes, the MFA’s response to this mess has been reasonably adequate, insofar as it has taken the students’ complaints seriously, has apologized multiple times and in multiple ways, and has pledged to reassess and tweak its policies to ensure this sort of horror show doesn’t happen again.

In the long term—as the Globe and others have loudly opined over the last week—the museum needs to figure out how to foster a clientele that is sufficiently diverse—racially, socioeconomically—that a group of black seventh-graders will feel just as welcome and at-home there as I do.

This could certainly be achieved through an acceleration of the MFA’s pre-existing effort to spotlight more artists of color in its galleries, events and various other special programs—and to make those events free of charge, as many of them already are.

Or better yet:  Why not just make the whole damn place free of charge?  Rather than perpetuating the aura of exclusivity (read: exclusion) that its $25 admission fee engenders, why shouldn’t this most indispensable of civic institutions truly become a gathering place for all the people, à la the Boston Public Library or Faneuil Hall?  If New York’s Metropolitan Museum can offer free—or rather, voluntary—admission for all New York State residents without going bankrupt, why not the MFA?

It won’t solve all of Boston’s racism problems at once.  But it would at least be an acknowledgement that those problems still exist, and that access to art is no less essential to a flourishing and equitable society than access to education or healthcare.  Someday the world will understand that, and thereby become an immeasurably better place to live.

Making the Case

“You think a lot about people you encounter, and there are a number of them in our community who voted for Barack Obama and Donald Trump and Mike Pence and me.  And one thing you realize […] is that it means that voters are maybe not as neatly ideological as a lot of the commentary assumes.”

So said Pete Buttigieg—the mayor of South Bend, Ind., and one of the two dozen Democrats running for president in 2020—making arguably the most succinct possible case for electing a so-called “moderate” as the party’s standard-bearer against Donald Trump in next year’s election.

Needless to say (but why not?), the question of what kind of Democrat ought to represent America’s loyal opposition in 2019 and beyond is the singular point of contention that primary voters will—and should—be debating over the next year and change.  Broadly speaking, the eventual nominee could come from three possible spots on the ideological spectrum—the center, the left, or the far left—and a great deal depends on whether the Democrats’ perception of the country’s overall political bent matches the reality thereof.

Before we go any further, allow me to disclose loudly and clearly that, barring highly unforeseen circumstances, I will be voting for the Democratic nominee on November 3, 2020, whoever he or she happens to be.  With Trump as the incumbent, I would happily and unreservedly support any of the possible alternatives without a shadow of a second thought.  Elections are about choices, and lesser-of-two-evils is the name of the game.

One presumes, of course, that a certain percentage of the electorate—somewhere between 40 and 45 percent, say—is on precisely the same wavelength as I am, and can be counted upon to reflexively line up behind the Democratic nominee, come hell or high water—a near-perfect reflection, ironically enough, of the #MAGA rubes who will stick with the president even if/when he murders somebody on Fifth Avenue in broad daylight.

In truth, when you add up every voter who, for all intents and purposes, has already made up his or her mind—i.e., will definitely vote for Trump or will definitely vote for his main challenger—you would be lucky to have more than 10 percent of the electorate left over.

And yet, as ever, that 10 percent (or whatever) is precisely where the whole damn thing will be decided.  Indeed, while it’s true that every presidential election in our lifetimes has come down to the comparatively minuscule slice of the public known as “swing voters,” the singularly polarizing nature of the Trump era has shrunk America’s protean middle to little more than a sliver, thereby increasing the power and influence of every member therein, for better and for worse.

All of which is to affirm Pete Buttigieg’s implicit argument about how to win the 2020 election:  By making yourself appealing to as wide a cross-section of the public as possible.  That begins with assuming that every genuinely undecided voter is persuadable, and acting accordingly.

Practically, this would certainly include venturing into enemy territory—Fox News—to make the case for why you’d be a leader for all Americans, not just those who watch MSNBC.  (Buttigieg and Bernie Sanders have smartly done this already, while Elizabeth Warren has foolishly, and loudly, refused.)  As well, it would require not smearing half the electorate as a bunch of freeloaders (à la Mitt Romney) or a “basket of deplorables” (à la Hillary Clinton).

In truth, it would entail little more than taking the American people seriously and treating them, more or less, like adults.

When Buttigieg reminds us about a certain, non-trivial chunk of our fellow citizens who voted for Obama in 2012 only to switch to Trump in 2016—and who, presumably, could swing back in the future—we are forced to reckon with the possibility that these folks’ political loyalties are a function of something other than racial resentment or any sort of coherent philosophy about the role of government in a free society.

Maybe, unlike us, they don’t spend 12 hours a day watching the news break on basic cable and Twitter, absorbing every last detail about life inside the beltway.  Maybe they lead busy, apolitical lives and haven’t given much thought lately to Robert Mueller or Roe v. Wade.

Maybe their tastes in presidents are more instinctual and elemental than weighing one set of policy proposals against another.  Maybe they voted for Obama because he promised them better healthcare, and for Trump because he promised them…better healthcare.

At the risk of reductionism and oversimplification, maybe the secret to winning an election is vowing to give people what they want and not calling them idiots more often than is strictly necessary.

Would this necessitate misrepresenting, watering down or otherwise compromising your core moral and political values?  Only if you believe those values aren’t worth defending to a possibly skeptical audience.  And if that’s the case, why in holy hell should anyone vote for you in the first place?

Notorious THC

I didn’t inhale on 4/20 this year.

However, I did ingest.

Specifically, I sucked on two pieces of watermelon-flavored hard candy infused with THC—the active ingredient in cannabis—until they completely dissolved on my tongue and entered my bloodstream.

To be clear, I didn’t pop two pieces into my mouth in rapid succession.  I’ve read Maureen Dowd’s infamous 2014 column about her ill-fated run-in with a hopped-up chocolate bar in Colorado (“I barely made it from the desk to the bed, where I lay curled up in a hallucinatory state for the next eight hours”) and I know better than to overindulge in an edible with so little experience under my belt.

No, I did exactly what the packaging told me to do:  “Start with one piece.  Wait two hours.  Be mindful.  Enjoy.”

In retrospect, I should’ve been a little less mindful.

Precisely 120 minutes after my first dose, I felt no physical or psychological effects whatsoever.  At that point, I rather restively administered dose number two, from which proceeded another hour-plus of nothing, followed, at long last, by a slight tingle of…something.  Not a high, per se, let alone a full-blown case of the giggles and/or the munchies.  Nope, just a passing wave of vague euphoria that ended almost as quickly as it began—five, ten minutes, tops.

And that, regrettably, was that.  An evening’s worth of buildup to a virtually non-existent payoff.  So much for the warning on the back of the box:  “The effects of this product may last for many hours.”

What made this 4/20 test run all the curiouser was how very different it was from the night before, Good Friday, when I introduced myself to the world of edibles for the very first time.  In that case, I consumed a single lozenge around 8 o’clock.  At 9:15, while sprawled on the couch watching TV, I found myself breaking into a spontaneous full-body sweat, my heart thumping 50 percent harder than it was a moment before, my mind unable to concentrate on anything beyond removing my socks so my feet wouldn’t suffocate.

While I wouldn’t describe this scene as Maureen Dowd-level paralysis—“hallucinatory” is too grand a word to apply here—I nonetheless kept more-or-less completely still as the weird and less-than-wonderful sensation swept over me, resigned to sit quietly until the perspiration subsided and my heart tucked itself back into my chest, where it belongs. 

When both of those things occurred—again, it didn’t take terribly long, although it sure felt like it—I had no particular urge to double my money with another hit of THC just then.  As a newbie, better to quit while I’m ahead, declare the experiment a success (of sorts) and spend the balance of my Friday night with a relaxing—and altogether predictable—bottle of merlot.

It’s a truism of the pot world that marijuana affects everyone differently.  As has now been proved to me, it is equally true that its effects on a given individual can vary from one day to the next.

Of course, none of the above would be of the slightest interest to anybody, except for one extraordinary fact:  It was all perfectly legal.

Through a voter referendum, the commonwealth of Massachusetts legalized the sale and consumption of marijuana for recreational purposes on November 8, 2016.  And last Thanksgiving—a mere 742 days after the fact—the state’s first two cannabis retail establishments officially opened for business.

Today, there are 15 pot shops (and counting) sprinkled across Massachusetts—including, as of last month, the first recreational dispensary in Greater Boston, New England Treatment Access (NETA) in Brookline, which is where I found myself last Friday morning.  When I arrived at 9:50, there were at least 30 people lined up outside the former Brookline Bank where NETA is housed, waiting to get in.  When the place opened 10 minutes later, at least as many cheery townsfolk were lined up behind me.  Apparently I wasn’t the only one who knew that April 20 was mere hours away.

Customers were escorted inside the ornate marble building five at a time—after getting their IDs checked and scanned, a Brookline police officer stationed casually nearby—and were promptly handed an eight-page menu of the shop’s litany of products, as they waited for the next available register.  (As with the bank that used to occupy the same space, all valuables were concealed safely behind the counter.) 

While tempted by the Belgian dark chocolate bar—Maureen Dowd’s experience notwithstanding—I finally opted for the 16-piece “D-Line Gems,” which the sales associate fetched and rang up for an even $30—$25 for the candy itself, plus a 20 percent sales tax that, per the law, is added to all cannabis-related purchases.  (Actually, it’s three different taxes in one—“local marijuana tax,” “sales tax (cannabis)” and “marijuana excise tax”—but who’s counting?)

Oddly, I wasn’t the slightest bit interested in purchasing an actual cannabis plant, nor the various accessories that go with it.  At my advanced age (31), I suppose I just don’t have the patience for the rituals that old-fashioned pot smoking entails.  As a working man who regularly interacts with the general public, I could certainly do without the smell.

In truth, I could probably do without marijuana altogether, whether smoked, sucked, swallowed or swilled.  Before last week, I hadn’t touched the stuff in nearly nine years, and only a handful of times before that.  Sometimes it’s been a blast; other times, a bust.  I expect I’ll be paying NETA another visit sooner or later, although I doubt it will become much of a habit.

In a country that still occasionally calls itself the land of the free, I’m just happy, at long last, to have the choice.

All That Is Written

“All that is thought should not be said, all that is said should not be written, all that is written should not be published, and all that is published should not be read.”

Those words were coined by a Polish rabbi named Menachem Mendel of Kotzk in the middle of the 19th century.  Surely they have never been more urgently needed than in the United States in 2019.

Just the other day, for instance, the venerable Boston Globe published an online op-ed by Luke O’Neil, a freelance columnist, expressing his rather pointed thoughts about the recently sacked homeland security secretary, Kirstjen Nielsen.  Its throat-clearing opening line:  “One of the biggest regrets of my life is not pissing in Bill Kristol’s salmon.”  (Kristol, you’ll recall, was a leading cheerleader for the Iraq War.)

The rest of the column continued in the same vein, castigating Nielsen for her complicity in President Trump’s policy of separating children from their parents at the Mexican border, and advocating for a general shunning of Nielsen from polite society, up to and including doing unsavory things to her food whenever she turns up at a fancy restaurant.

Following a small uproar among its readers, the Globe subsequently re-wrote parts of O’Neil’s piece—cutting out the word “pissing,” among other things—before ultimately removing it from its webpages entirely.  (It never appeared in print in any form.)  All that currently remains of the thing is an editor’s note explaining that the column “did not receive sufficient editorial oversight and did not meet Globe standards,” adding, rather gratuitously, “O’Neil is not on staff.”

Locally, much has been said and written about the Globe’s (lack of) judgment in ever believing an op-ed about poisoning a public official’s dinner—however cheeky—was fit to publish in the first place.  For all of its obvious liberal biases, the Globe opinion page is a fundamentally grown-up, establishmentarian space, suggesting this episode was a bizarre, one-off aberration and nothing more.

The deeper question, however, is what brings an uncommonly thoughtful and clever writer to put such infantile thoughts to paper in the first place.

And I’m not just talking about Luke O’Neil.

Let’s not delude ourselves:  Ever since Secretary Nielsen was hounded from a Mexican restaurant last summer in response to her department’s repugnant immigration policies, every liberal in America has had a moment of silent contemplation about what he or she would do or say to Nielsen given the chance.  That’s to say nothing of her former boss, the president, and innumerable other members of this wretched administration.

Indeed, plumb the deepest, darkest thoughts of your average politically aware American consumer, and you’re bound to emerge so covered in sludge that you may spend the rest of your life trying to wash it off.

This is why we differentiate thoughts from actions—morally and legally—and why the concept of “thought crime” is so inherently problematic.  Outside of the confessional, no one truly cares what goes on inside your own head so long as it remains there, and most of us have the good sense to understand which thoughts are worth expressing and which are not.

Except when we don’t, and in the age of Trump—with a major assist from social media platforms whose names I needn’t mention—an increasing number of us don’t.

Because it is now possible for any of us to instantaneously broadcast our basest and most uninformed impressions on any subject to the entire world, we have collectively decided—however implicitly—that there needn’t be any filter between one’s mind and one’s keyboard, and that no opinion is more or less valid than any other.  In the Twitterverse, “Let’s expand health insurance coverage” and “Let’s defecate in Kirstjen Nielsen’s salad” carry equal intellectual weight.

As a free speech near-absolutist, I can’t deny the perverse appeal in having no meaningful restrictions on what one can say in the public square.  With political correctness exploding like a cannonball from America’s ideological extremes, it’s heartening to know that reports of the death of the First Amendment have been greatly exaggerated, indeed.

Or it would be—until, say, a newly elected congresswoman from Minnesota tells a group of supporters, “We’re gonna go in there and we’re gonna impeach the motherfucker,” and suddenly discretion seems very much the better part of valor.

Among the many truisms that life under the Trump regime has clarified is that just because something can be done doesn’t mean it should be done.  And the same is true—or ought to be—of how we express ourselves to the wider world.

I don’t mean to sound like a total prude.  After all, I’m the guy who wrote a column in mid-November 2016 calling the newly elected president a selfish, narcissistic, vindictive prick, and who tried to cheer my readers up the day after the election by noting that Trump could drop dead on a moment’s notice.

With two-and-a-half years of hindsight, I’m not sure I should’ve written either of those things, not to mention a few other snide clauses and ironic asides here and there ever since.  They weren’t necessary to make my larger points, and like the opening quip in Luke O’Neil’s Globe column, their rank immaturity and meanness only served to cheapen whatever it was I was trying to say.

As someone who claims to be a writer, I try to choose my words carefully and with at least a small degree of charity.  With great power—in this case, the power of words—comes great responsibility.  And that includes leaving Kirstjen Nielsen’s salmon alone.

Biden His Time

Here’s a political question for us all:  Was the death of Beau Biden in May 2015 the most consequential event of the 2016 election?

Prior to being diagnosed with the brain cancer that would ultimately kill him, Beau Biden was a rising talent in the Democratic Party, serving as Delaware’s attorney general and generally assumed to be destined for higher office of one sort or another.

He was also the son of Joe Biden, then the sitting vice president and presumptive leading contender for the Oval Office in 2016.  By all accounts, the elder Biden was fully intent on a third run for president—following failed attempts in 1988 and 2008—and it was entirely due to the timing of his son’s illness and death that he decided to take a pass and effectively cede the Democratic nomination to Hillary Clinton.  And we know how well that went.

It’s the great political “What if?” of our time:  Would the 2016 election have ended differently had Joe Biden been in the mix?

With regards to the Democratic primaries, God only knows.  Maybe Hillary would’ve cleaned Biden’s clock—as both she and Barack Obama did in 2008.  Maybe he would’ve self-imploded through some embarrassing self-own, as he did in 1988 when it was found that he had plagiarized several of his campaign speeches.  Maybe he and Hillary would’ve fought to a protracted, bitter stalemate, allowing a third, outsider candidate (*cough* Bernie *cough*) to sneak past both of them.

But if Biden had somehow bested all his Democratic counterparts and emerged as the party’s nominee, could he have defeated Trump on November 8?

Answer:  Obviously yes.

Of course Biden could’ve defeated Trump in 2016.  Of course he could’ve flipped 80,000 votes in Michigan, Pennsylvania and Wisconsin—i.e., the three states that wound up swinging the whole damn election.  Of course he could’ve appealed to a not-insignificant chunk of white, semi-deplorable working-class folk who otherwise find Democrats acutely irritating and Hillary positively intolerable.

Yes, in an alternate universe, Joe Biden could’ve been sworn in as the 45th president on January 20, 2017.

I say “could’ve,” not “would’ve,” since any counterfactual involves an infinite number of variables we can’t even begin to imagine.  What’s more, given the historically low occurrence of one political party winning three presidential elections in a row, it’s hardly inconceivable that Trump could’ve defeated any number of Democratic opponents in that strange moment of populist rage—not least the one most closely associated with the outgoing administration.

That said, hindsight strongly suggests Biden would’ve navigated the 2016 campaign more adroitly than Clinton did—if only from a lack of questionable e-mails or a sexual predator spouse—and may well have made the biggest mistake of his life in choosing not to take the plunge when he had the chance.

The relevant follow-up, then, is whether Biden’s apparently imminent entry into the 2020 primaries—for real this time!—will follow through on the untested promise of 2016 and serve as the de facto Obama restoration half the country has craved for the last two-plus years.  Or, instead, whether Biden’s moment really has come and gone, and the best he could do would be to sail off into retirement as a beloved (albeit slightly pervy) elder statesman.

In other words:  Having become as respected and endearing as almost any public figure in America today, why would Biden risk becoming a loser and a laughingstock yet again for the sake of one last roll in the hay?

The short answer is that Biden just really, really wants to be president.  Always has, apparently always will.  How badly, you ask?  Well, badly enough to address multiple recent allegations of unwanted physical contact by insisting that he regrets none of it and isn’t sorry about a damn thing.

And what about it?  On the subject of #MeToo-era sensitivity about men behaving predatorily, let’s not kid ourselves:  In a society where “Grab ‘em by the pussy” won the support of 53 percent of white women, who’s to say “I enjoy smelling women’s hair but I’m also pro-choice” isn’t a winning route to 270 electoral votes?

The only certainty about the 2020 election is that no one has any idea how it will shake out—particularly those who claim they do.  Biden could defeat Trump in the sense that anyone could defeat Trump, although the converse is equally true.  Is he the most “electable” of all the Democrats in the field?  With 301 days until the first primary votes are cast, how much are you willing to wager that the word “electable” holds any meaning whatsoever?

I’ll leave you with this possibly interesting piece of trivia:  The last non-incumbent former vice president to be elected commander-in-chief in his own right was Richard Nixon in 1968.  Care to guess how many times it happened before that?

Answer:  Zero.

For Pete’s Sake

The first time I ever heard of Pete Buttigieg, the mayor of South Bend, Ind., was in a Frank Bruni column in the New York Times in June 2016, titled, “The First Gay President?”

Two weeks later, Bruni cited Mayor Buttigieg (pronounced “BOOT-edge-edge”) in another column, “14 Young Democrats to Watch”—a list that included such then-unknown figures as Stacey Abrams and Andrew Gillum—while Buttigieg himself grew increasingly visible on the national stage, interviewed by Charlie Rose (ahem) in July 2017 and by the cast of Wait Wait…Don’t Tell Me! in February 2018.

Buttigieg, 37, announced his candidacy for president on January 23, to extremely limited fanfare.  Now, however, he seems to be enjoying his 15 minutes in the limelight, thanks, in roughly equal measure, to generally glowing press coverage and surprisingly high poll numbers in early primary states.

While it is comically premature for anyone with any integrity to predict how the Democratic Party nominating contest will shake out (insert your own cable pundit joke here), Mayor Buttigieg—an Afghanistan War veteran and former Rhodes Scholar who speaks seven foreign languages, including Norwegian—is most certainly deserving of a long, hard look.

Indeed, in his initial column introducing Buttigieg to the world (or at least the world of New York Times readers), Bruni mused that, on paper, you could scarcely produce a more perfect future president if you built one, Frankenstein-like, in a laboratory.  Similarly, in a meet-the-candidate segment on a recent episode of The Daily Show, Trevor Noah went looking for even the trace of a skeleton in Buttigieg’s professional closet and came up empty.

By all appearances, Mayor Pete (as he is known in South Bend) is the real deal—someone one underestimates at one’s peril.

For that reason, Buttigieg offers us perhaps the single greatest opportunity we’ll ever have to ask:  Is America ready for an openly gay president?

The answer, I suspect, is the same as it was regarding a black candidate in 2008:  “No it’s not, except in this one case.”

I don’t mean to imply that Buttigieg will be crowned the Democratic nominee in the summer of 2020, let alone be elected on November 3.  In a field of a billion contenders, a thirtysomething mayor of the fourth-largest city in Indiana will be a longshot in any context.

However, if America is to have a homosexual commander-in-chief in my lifetime, it will almost surely be someone like Mayor Pete:  A man so smart, so accomplished and so…normal…that his sexual preference becomes both trivial and irrelevant to all but the most obsessive voters.

At the risk of putting too fine a point on it:  Other than being married to a guy named Chasten, there is absolutely nothing about Buttigieg that would lead the average citizen to assume he is gay—nor to think anything of it upon finding out.  In appearance, speech and overall countenance, Buttigieg comes across like any other plucky, overachieving public servant:  wonky, earnest, full of ideas and creative energy, and wholly unencumbered by any notion of personal or demographic limitations.

Buttigieg’s whole approach to the gay question—increasingly common among prominent LGBT officials, post-Obergefell—is to never even mention it, except as a casual aside or in response to a direct question from an unimaginative reporter.

Indeed, Buttigieg did not formally “come out” to the good people of South Bend until deep into his first term as mayor, in June 2015 (in a newspaper column very much worth reading).  And yet, when he ran for re-election that fall, he won with more than 80 percent of the vote.

This is the future of queerness in public life, and a major reason the gay rights movement has achieved so much in the past decade-and-a-half:  By drawing only as much attention to itself as is strictly necessary.  By assimilating to, rather than separating from, the society at large.  By embracing such bedrock American institutions as marriage and family, rather than running away from them.  By treating homo-skeptics with patience and respect rather than scorn and condescension, trusting that, in good time, they will come around.

By being the moderate, mild-mannered, monogamous mayor that he is—and an extraordinarily educated and well-spoken one to boot—Pete Buttigieg is essentially daring the public to give a damn about his personal life in any way, shape or form.

At this point in his political rise, it would appear that no one does.  Perhaps that will change should he miraculously capture his party’s presidential nomination next year, when the spotlight will become infinitely brighter and the public’s curiosity infinitely curiouser.

Then again, perhaps not.  Maybe the country really has gotten past its worst hang-ups about LGBT folk in the public square and is prepared to judge all candidates for higher office strictly on their ideas, experience and the content of their character.

Someday we’ll find out for sure.  Until then, we can dream.

Mueller Lite

Last Sunday at around 4 o’clock, millions of liberals across America were beside themselves—inconsolable!—upon learning that the president of the United States isn’t an agent of a foreign power.  Having invested nearly two years of their lives and all of their emotional bandwidth into the assumption that Donald Trump and his gang conspired with the Russian government to rig the 2016 presidential election—and that the Mueller investigation would eventually prove it beyond doubt—it was positively devastating to be informed by Robert Mueller himself—albeit through his boss, Attorney General William Barr—that this just isn’t so.

As a lifelong fan of Alfred Hitchcock, I couldn’t help thinking of Rear Window.  Specifically, the scene in which James Stewart and Grace Kelly—having spent days doggedly surmising that the salesman across the courtyard has murdered his wife and chopped her body into bite-size pieces—are provided with seemingly airtight evidence from an investigator that the neighbor has done no such thing.  That, in fact, Stewart and Kelly have let their imaginations get the better of them, and that it’s all a silly, if brutal, misunderstanding.

Cut to Stewart’s and Kelly’s crestfallen visages, each overcome with disappointment and just the slightest bit pissed off about the whole bloody affair.

It’s a priceless moment, written and acted to perfection, and encapsulated, a few beats later, by the future princess of Monaco herself:

“If anybody walked in here, I don’t think they’d believe what they see.  You and me with long faces, plunged into despair, because we find out that a man didn’t kill his wife.  We’re two of the most frightening ghouls I’ve ever known.”

The joke, of course, is that Stewart and Kelly had wrapped themselves so tightly in their paranoid theories about what sinister things the neighbors had been up to—and had so convinced themselves that their worst suspicions must be true—that they came to view any puncturing of their conspiratorial bubble as a personal insult and humiliation.  Their amateur sleuthing had morphed into a religious cult, and any outside information that challenged it amounted to blasphemy.

Hence the black comedy buried in Kelly’s quip:  In their idle, wild-eyed fervor, she and Stewart had come to believe that their neighbor being a murderer was preferable to their being proved foolish and irresponsible.  In that moment, being right was more important than the salesman’s wife being alive and in one piece.

Such is the dilemma now facing the American left, which must choose between two possible realities:  One in which new, unwelcome information takes precedence over comforting, unfounded speculation, or one in which the president is a traitor to his country and the MSNBC primetime lineup is a fount of divine truth.

Prior to last Sunday, liberals like me had been perfectly content to live in the latter universe, much as conservatives spent the balance of 2009-2016 in a Fox News echo chamber of rage wherein President Obama was a secret Muslim, Hillary Clinton was a secret murderer and Benghazi was the biggest scandal in the history of the human race.

But what about now?  With the news—however preliminary—that our darkest imaginings about Trump are, well, imaginary, are we not duty-bound to accept this most inconvenient of truths and move on to 2020?

I’ll say this much:  Throughout the 2016 election, I rarely went more than 24 hours without checking in on the Huffington Post, the addictive left-wing blogging platform that framed every utterance from Trump’s mouth as a Category 5 emergency and gave Hillary Clinton a 99 percent chance of victory in the days leading up to the big vote.

I haven’t been back to the Huffington Post once since November 9, 2016, and it’s for the exact reason you’d expect:  At long last, and with a great deal of reflection, I decided I no longer enjoyed the taste of Kool-Aid.

Don’t get me wrong:  Today I am still very much a liberal and still very much consider Donald Trump a cancer on the face of America, for reasons Robert Mueller had no need to investigate.

What I am not—or so I would like to think—is a mindless, obstinate rube who clings to demonstrable falsities simply because I want them to be true.  While I still watch MSNBC on a regular basis, I generally limit my consumption to one hour of programming per day, and always with the understanding that comfort food is not the same as nutrition and restless chatter is not the same as insight.

I suggest my fellow anti-Trumpers do the same, and put Collusiongate in the rear window, where it belongs.