Consent of the Governed, Part 2

This past Monday, the president nominated Judge Brett Kavanaugh to replace Anthony Kennedy on the U.S. Supreme Court.  The balance of power being what it is, unless Kavanaugh is found with a dead girl or a live boy (in the immortal words of Edwin Edwards), he will be confirmed by the Senate later this year and the nation’s highest court will be as ideologically conservative as it has ever been in our lifetimes.

From the moment Justice Kennedy announced his retirement last month, liberals have been running around the airwaves with their hair on fire, screaming that this development constitutes the end of the world as we know it.  That the replacement of Kennedy’s so-called moderation with the true blue right-wingery of his successor will usher in a generation of irreversibly destructive decisions on every issue the left holds sacred, from abortion rights to gun control to civil liberties to campaign finance reform.

While Democrats’ concerns about Kavanaugh are undoubtedly well-founded—after all, he comes pre-packaged and pre-approved by the conservative judge factory known as the Federalist Society—they are also misleading and incomplete, insofar as they overlook a much larger and more profound fact:

Ruth Bader Ginsburg is 85 years old.

Lament Kennedy’s departure if you wish, but the truth is that he was a fundamentally right-wing jurist whose flirtations with progressive causes, however crucial, were few and far between.  While he is rightly credited with preserving abortion rights in 1992 and effectuating same-sex marriage in 2015, he is equally responsible for the majority opinions in Bush v. Gore and Citizens United v. FEC—the two worst Supreme Court decisions since Plessy v. Ferguson, according to most liberals.  During the most recent term, he voted with the court’s conservative wing in every high-profile case that was decided by a 5-4 vote.  Every.  Single.  One.

Long story short:  Replacing Kennedy with a rock-ribbed conservative will not be the end of the world as we know it.  But replacing Ruth Bader Ginsburg with a rock-ribbed conservative?  That will be the end of the world as we know it.

Perhaps it is bad form to observe that most human beings do not live forever, but if the Democratic Party is truly freaked out about losing every major Supreme Court case for a generation or more, it must come to grips with the fact that its most beloved and indispensable justice—the Notorious RBG—is an octogenarian and two-time cancer patient who, for health reasons, might need to leave the bench before the next Democratic president takes office.  Ginsburg may intend to serve well beyond the current administration, but then again, so did Antonin Scalia on February 12, 2016.

If Nancy Pelosi and Chuck Schumer plan to make themselves useful in the coming months, they ought to emphasize, in no uncertain terms, that a Republican-majority Senate in 2018-2019 guarantees the appointment of Judge Kavanaugh—already a foregone conclusion, so far as I can tell—and that the re-election of Donald Trump in 2020 makes it exceedingly likely the court will contain only three—or perhaps only two—liberals by the end of Trump’s second term.  (Ginsburg’s like-minded colleague Stephen Breyer turns 80 next month.)

Elections have consequences, and one of them is a Supreme Court shaped in the image of the sitting commander-in-chief—an arrangement that has been in place continuously since 1787.

The left can whine all it wants about Russian shenanigans and Mitch McConnell’s dirty tricks vis-à-vis Merrick Garland, but the fact remains that people voted for president in November 2016 in the full knowledge that a) the winning candidate would be selecting the successor to the late Antonin Scalia, and that b) there would almost surely be additional openings on the court before his or her presidential tenure was up.  Candidate Trump made this point repeatedly on the campaign trail.  In retrospect, Hillary Clinton did not make it nearly enough—a mistake her party’s candidate in 2020 would be well-advised to avoid.

Lame as it may sound, Neil Gorsuch is on the Supreme Court today because Donald Trump received the most electoral votes in 2016 and there weren’t enough Democrats in the Senate to stop him.  Brett Kavanaugh will be on the Supreme Court this fall for precisely the same reason.

If you find this situation intolerable, you have two choices:  You can vote for Democratic senators on November 6, 2018, and for a Democratic presidential candidate on November 3, 2020.  Or you can assume John Roberts will magically evolve into a liberal overnight and that Ruth Bader Ginsburg will live to 120.

Personally, I’d recommend Option No. 1, however inconvenient it might be.  You’d be surprised what a democracy can accomplish when its citizens behave democratically.

Consent of the Governed

If you’re wondering about the state of civics education in America today, look no further than a recent episode of Jeopardy!  In the first round of questions and answers, the $400 clue in a category about government read, “This document ends, ‘We mutually pledge to each other our lives, our fortunes and our sacred honor.’”

Not a single contestant rang in.  On America’s flagship TV game show, none of the three players could recognize the climactic clause of the most famous document in the history of the United States, the Declaration of Independence.

While I understand that Jeopardy! is considerably more difficult in front of a live studio audience than from the comfort of one’s couch, I’d like to think there are certain sentences that are embedded in the soul of every man, woman and child in America, and that “our lives, our fortunes and our sacred honor” is chief among them.

However, as one survey after another has shown, this is increasingly not the case.  With each passing generation, we, the people, have become progressively less knowledgeable about the history of this country and our duties as citizens thereof.

Beyond our ignorance of the basic facts of America’s founding—like how, for example, we actually declared our independence from Britain on July 2, not July 4—we have demonstrated an alarming mixture of confusion about and indifference to our obligations as participants in a democratic republic, not the least of which is the act of informed voting.

Case in point:  Last week, the Democratic Party establishment was thrown for a loop when political neophyte (and self-proclaimed democratic socialist) Alexandria Ocasio-Cortez scored a surprise primary victory in New York’s 14th House district.  For all the talk about how the win by Ocasio-Cortez portends a definite leftward shift by her party’s base this fall—a base that is suddenly shot full of hope and adrenaline for the first time in two years—it was equally the case that a mere 13 percent of the district’s eligible voters bothered to cast a ballot in the first place.

In other words, the media spent a full week rethinking the narrative trajectory of the 2018 midterms based on a single race in which seven-eighths of the district did not even participate.  Is this really our idea of representative democracy in action?

Regrettably, yes.

This is to take nothing away from Ocasio-Cortez, a spirited and savvy campaigner who inspired her future constituents in a way her opponent, Joe Crowley, did not.  In truth, such an abysmally low turnout rate is utterly typical for a congressional primary held in the middle of the summer—indeed, it would barely be aberrational for an election held in September.

As a rule, Americans do not vote more than once every four years, and tens of millions never vote at all.  While there are numerous (and often complex) reasons for this—deliberate, systematic suppression being the most insidious—the simple fact is that the majority of these non-participants just plain don’t care who represents them in the public square—be it the legislature, town hall, state house or White House—and cannot be bothered to do the research necessary to know which candidate to choose when the designated day arrives.

Hence the fact that virtually no one (including me) seemed to have heard of Ocasio-Cortez until the day after her win—much like how, according to one survey, only 37 percent of us can name our own congressperson without looking it up.  Or how, according to another survey, a mere one in four can identify all three branches of government, while 31 percent cannot name a single one.

I could go on.  Oh, how I could go on.

In a letter to a friend in 1816, Thomas Jefferson famously wrote, “If a nation expects to be ignorant and free, in a state of civilization, it expects what never was and never will be.”  Less famous—but perhaps more important—was the subsequent clause:

“The functionaries of every government have propensities to command at will the liberty and property of their constituents.  There is no safe deposit for these but with the people themselves; nor can they be safe with them without information.  Where the press is free, and every man able to read, all is safe.”

On this Fourth of July—the 192nd anniversary of Jefferson’s death—might I humbly suggest that, if we truly wish to pull our country back from the abyss, we direct our righteous indignation not at our leaders, but at ourselves.  That we reflect that there isn’t a single official on Capitol Hill or in the White House who wasn’t democratically elected—or appointed by someone who was—and that if we want a fresh set of representatives in 2019—and, with them, a fresh set of policies and ideas—we have it in our power (as we always have) to sweep them into office and to throw the bums out.

Election Day is November 6.  I’ll be there.  Will you?

Let Them Eat Tacos

I have no idea why the secretary of Homeland Security would dine out at a Mexican restaurant on the very day she defended the use of internment camps at the Mexican border.  I don’t know why the White House press secretary would show her face anywhere while acting as a mouthpiece for the most dishonest chief executive to ever sit in the Oval Office.

(If you missed it:  Last Tuesday, protesters yelled “shame!” at Homeland Security Secretary Kirstjen Nielsen inside MXDC Cocina Mexicana in Washington, D.C.  Three days later, Press Secretary Sarah Huckabee Sanders was asked to leave the 26-seat Red Hen in Lexington, Va., by the restaurant’s owner after several employees were made uncomfortable by Sanders’s presence.)

I’m not the least bit surprised that both of those public officials would be confronted by angry constituents while attempting to enjoy a relaxing night on the town.  Given the tenor of public discourse in 21st century America, the miracle is that this sort of thing doesn’t happen more often—or more violently.

I understand instinctively why those concerned citizens feel the need to vent their outrage at these crooks and liars face-to-face when given the opportunity.

In the future, however, I wish they would resist the urge to do so.

Before we go any further, I should probably mention that I am about the least confrontational person on the East Coast.  I’m not sure I’ve ever started an argument with anyone in my adult life, and whenever someone attempts to start an argument with me, I make every effort to tactfully withdraw from the conversation and/or the room.  For all the self-righteous vitriol I’ve unfurled on this site over the years, the notion of telling an odious prominent figure, in person, what I really think of them fills me with bottomless anxiety and dread.

Admittedly, as a privileged, native-born white male, it is very easy for me to hang back on the sidelines and allow human events (however alarming) to run their course.  For someone like me, the actions of President Trump and his collaborators may be irritating—even horrifying—but they do not pose an existential threat to my way of life and probably never will.

I realize, in short, that spending one’s day avoiding conflict and social discomfort is a luxury that many of my fellow Americans cannot afford, and that sometimes verbally lashing out at those who oppress you can feel like a moral imperative—and possibly the only recourse that is available to you as an otherwise powerless individual.  If members of the Trump administration are deliberately and pointlessly making millions of Americans’ (and non-Americans’) lives difficult, the argument goes, why shouldn’t they get a taste of their own toxic medicine whenever they enter space occupied by the victims of their noxious acts?

The reason they shouldn’t—the reason all public servants should be left unmolested when they’re not on the clock—is that Michelle Obama said, “When they go low, we go high,” and every liberal in America cheered.

By its own rhetoric, if the Democratic Party stands for anything in the age of Trump, it’s moral superiority.  Whether stated outright or merely implied, the message from Democratic leaders and supporters in recent years is that, all things being equal, Democrats are the party of sanity, empathy and love for one’s fellow human beings, while Republicans are (to coin a phrase) deplorable.

Without question, Donald Trump’s own rotten character was the primary basis for voting for Hillary Clinton in 2016—“Love Trumps Hate” was arguably Clinton’s most successful and resonant slogan—and most liberals still regard Trump’s penchant for childish name-calling and general thuggery as an intolerable moral stain that must be repudiated at the polls in 2018 and 2020—namely, by voting for as many Democratic candidates as possible.

The question is:  If the left truly believes in the Judeo-Christian ethos of treating others as you would have others treat you—and that Trump and company constitute a monstrous perversion of this policy—do they not have a responsibility to exhibit such mature, noble behavior themselves?  To lead by example?  To understand that darkness cannot drive out darkness—only light can do that?  To be the change they want to see in the world?

I say yes, and this includes allowing Nielsen and Sanders to eat their dinner in peace, whether or not they deserve it.  Because in the end, this isn’t about them.  It’s about us.  And it’s not a good look for the so-called party of inclusion to start telling certain people they’re not welcome and they don’t belong.

Unplugged

I recently returned from a week-long trip to paradise—Martha’s Vineyard, to be exact—and while I was there, I did something that, for me, was both unthinkable and unprecedented.

I kept away from social media and the news.

That’s right.  From the moment our ferry cast off from shore, I ceased all contact with my Twitter feed and didn’t reconnect until after returning to the mainland.  For good measure, I also generally avoided Facebook, the New York Times and cable news, opting to remain as ignorant as possible about what was going on in the parts of the universe not directly in front of my nose.  For perhaps the first time in my adult life, I just didn’t want to know.

Now, maybe tuning the world out is the sort of thing most normal people do to relax at their favorite summer getaways.  But as a prototypical millennial news junkie, I can scarcely imagine being walled off from current events for more than a few hours at a time, vacation or no vacation.  Since acquiring my first Droid in the summer of 2010, I’m not sure I’ve gone a single day without checking my social media apps at least once.  You know:  Just to make sure I’m not missing anything.

Having lived under the tyranny of Zuckerberg and Bezos for so long, I’ve realized with ever-growing acuity that I am every bit as addicted to the little computer in my pocket—and the bottomless information it contains—as the good-for-nothing Generation Z teenagers I’m supposed to feel superior to.  More and more, I recall Jean Twenge’s terrifying recent Atlantic story, “Have Smartphones Destroyed a Generation?” and I wonder whether any of us—of any age group—are going to emerge from this era better citizens and human beings than when we entered it.

So it was that, on the occasion of my annual sojourn to my favorite summer retreat—an island my family has visited every year since before I was born—I decided I needed to find out whether I’m capable of cutting myself off from the GoogleTube cold turkey.  Whether—if only for a week—I can bring myself to live as I did for the first 23 years of my life:  Without constant, hysterical, up-to-the-second news flashes from every corner of the globe and, with them, the instantaneous expert (and non-expert) analysis of What It All Means and Where We Go From Here.

Mostly, of course, I just wanted a week without Donald Trump.

Did I succeed?

Kind of.

Yes, I still read the Boston Sunday Globe (mostly for the arts pages).  Yes, I still listened to my favorite NPR podcast while riding my bike.  Yes, I still posted pictures on Facebook before going to bed.  And yes, I still allowed my cable-obsessed bunkmate to watch a few minutes of Morning Joe before we headed out to breakfast each day.

All of that aside, I nonetheless fulfilled my core objective of not following world events closely—if at all—and of believing, for that one week, that nothing in life was of greater concern than which ice cream flavor to order at Mad Martha’s and whether to wear jeans or shorts while hiking at Menemsha Hills.  (The answers, respectively, were butter crunch and jeans.)

So I didn’t get the blow-by-blow of President Trump’s meeting in Singapore with Kim Jong-un.  I didn’t hear the early reports of children being snatched from their parents at the Mexican border.  And I didn’t see that raccoon scaling the UBS Tower in St. Paul, Minnesota.

What’s more, I noticed that as the week progressed, I grew less and less bothered by how out-of-the-loop I was in my little self-imposed cone of radio silence, and it got me wondering whether I couldn’t keep up this stunt indefinitely.  Whether, in effect, I could become a beta version of Erik Hagerman—the Ohio man, recently profiled in the New York Times, who severed all ties with society on November 9, 2016, and hasn’t looked back since.  Dubbing him “the most ignorant man in America,” the story left little doubt that Hagerman, in his calculated obliviousness, is probably a happier and more well-rounded individual than three-quarters of his fellow countrymen.

Of course, Hagerman is also extremely white—not to mention extremely male and extremely upper middle class—and there is no avoiding the uncomfortable fact that choosing to ignore the daily machinations of the Trump administration is a direct function of white privilege (as countless Times readers pointedly noted at the time).  To be white is to be insulated from Trump’s cruelest and most outrageous policies; thus, there is little-to-no risk in looking away from them every now and again.

“The prettiest sight in this fine, pretty world is the privileged class enjoying its privileges,” said Jimmy Stewart, with great scorn, in The Philadelphia Story in 1940.  As a member of the privileged class—in my whiteness and maleness, if not my disposable income—I recognize the profound moral failing of even thinking of mentally tuning out an American society in which virtually every racial, ethnic and cultural minority finds itself under threat.  Silence is complicity, and I very much doubt I could live in happy ignorance knowing, deep down, that a great deal of preventable suffering is occurring just beyond my immediate line of sight.

But it sure was nice while it lasted.

See You Next Tuesday

Almost everything I know about bad language I learned from George Carlin—the stand-up comedian who, in 1972, explained in a now-legendary monologue that there are seven words that cannot be said on television.

In case you forgot, those words were “shit,” “piss,” “fuck,” “cunt,” “cocksucker,” “motherfucker” and “tits.”

While Carlin revised and expanded his initial list of “dirty words” over the years—when I saw him perform live in 2007, I purchased a wall poster that listed 2,443 of them—the essence of the original bit has lost none of its power or relevance in the 46 years since it debuted.  We might quibble about which (and how many) words belong on such a list in 2018, but we agree—if only implicitly—that there are, in fact, certain words that cannot be said on television or in any other public space.

And last week, we had a spirited and rather unexpected argument about whether the list still includes the word “cunt.”

This argument, you’ll recall, sprang from an episode of Samantha Bee’s TBS program, Full Frontal, in which Bee called Ivanka Trump a “feckless cunt” for doing nothing to stop her father, the president, from separating immigrant mothers from their children after they crossed the Mexican border into the United States.  Whatever the merits of Bee’s gripe—valid and cutting as it was—the mere presence of the so-called “C word” drained the entire rant of its substance in the eyes and ears of the Twitterati and briefly threatened to derail Bee’s entire career.

As a regular Full Frontal viewer—and a fan of Bee’s since her days at The Daily Show—I was neither surprised nor offended that she would (and did) feel compelled to call Ivanka Trump the most disparaging name a woman can possibly be called.  Beyond the small fact that the word itself was bleeped during the initial broadcast—this is basic cable, after all—I have long taken the view that women can employ sexist invective to their hearts’ content in a way that men don’t (and shouldn’t) get away with, and a certified comic like Samantha Bee is hardly an exception to the rule.

Just as black people can say “nigger” and gay people can say “faggot”—while white and straight people, respectively, cannot—women, as an historically oppressed group, are entitled (and, as far as I’m concerned, encouraged) to assume ownership of the language that has been used by men to keep them down since time immemorial.  Apart from ironically blunting the negative impact such words often have on their targets, co-opting rhetorical slurs enables you to disarm and disorient your would-be oppressor by showing him how meaningless such words ultimately are.  It’s a double standard, but a necessary one—a means of ever-so-slightly rectifying a history of patriarchal misogyny that cannot possibly be repaired in full.

So if Samantha Bee wants to sling a see-you-next-Tuesday at Ivanka Trump, that’s her privilege.  And if Ivanka wants to respond in kind—or not at all—more power to her.  Why on Earth should it concern anyone other than the two of them?

It shouldn’t.  Yet, it does.

At the moment—and for many years now—American culture has found itself locked in a hysterical feedback loop of offense-taking, in which an objectionable comment by a member of one ideological clan publicly wounds the moral sensibilities of the other—until roughly 36 hours later, when the roles reverse following some fresh outrage by a representative of Team Number Two.  Samantha Bee today, Roseanne Barr tomorrow.

Broadly speaking, if we are to escape this corrosive, cynical war of political attrition—as we should, if only for our collective mental health—there are two types of societies America can choose to become:  One in which no one is allowed to say anything that might offend someone else (today’s de facto status quo), or one in which virtually anything may be said because everyone has decided not to be bothered by it.

From all I’ve written so far, you can probably guess which scenario I’d prefer.  Speaking as someone who hasn’t lost his temper over anything for at least 20 years, there is something to be said for not walking around all day with your hair on fire because a minor celebrity uttered a naughty word in public.

This isn’t to suggest that every controversial statement of recent vintage is of equal moral weight—or, for that matter, that we should abandon our sense of right and wrong in the interest of being a little more agreeable with each other on social media and in real life.  Some things really are worth being outraged by, and it’s up to each of us individually to decide where to draw the line.

All the same, my advice—informed in no small part by the wit and wisdom of George Carlin, one of the happiest and most well-rounded Americans who ever lived—is to allocate your indignation sparingly and judiciously, and—when at all possible—to err on the side of just letting things go.

The Imperial Calorie

Is it better to know, or not to know?  Are there certain pieces of information of which you’re happy to remain ignorant?  At what point does “knowledge is power” get subsumed by “ignorance is bliss”?  And what happens when all of these considerations involve the number of calories in your food?

Thanks to a new federal regulation that kicked in earlier this month, those sorts of questions have become slightly less theoretical than they were before.  In compliance with the Affordable Care Act—and following years of resistance by special interest groups—all food establishments in the U.S. with at least 20 outlets are now required to post calorie counts of all their products in all their stores.

While many chains have been doing this voluntarily for years, the requirement took effect on May 7, which means you can no longer order a muffin at Dunkin’ Donuts without learning that it contains nearly twice as many calories as a bagel, nor can you finish a meal at Olive Garden without willfully consuming more caloric energy than the average American burns in an entire day—with or without breadsticks.

Of course, maybe this new law means nothing to you.  Maybe you are a knowledgeable, health-conscious consumer who knows exactly what you’re putting into your body at all times.  Maybe you’ve long been aware of how deadly chain restaurant food tends to be for your waistline and cholesterol levels, and you tread carefully whenever you indulge—as you do when eating at home, at work or at Thanksgiving dinner.

However, this would hardly make you a prototypical American:  160 million of us are either overweight or obese—a jaw-dropping figure that suggests a majority of our fellow countrymen either don’t understand how their digestive systems work or simply don’t care, and that they pose an existential threat to our national healthcare system in any case.

As a matter of public health, then, requiring eating establishments to disclose nutrition information is a no-brainer and a win-win, and has largely been accepted as such in recent years.  By listing calorie counts on the menu, a restaurant provides valuable, potentially life-saving information to those who might need it, while still honoring every citizen’s God-given right to eat whatever they damn well please.

The problem here—as I suggested at the top—is that you cannot un-see what is written directly in front of you, and there’s a certain group of Americans who really, desperately wish they could.  Some people want to know how many calories they’re consuming and others are indifferent, but there is also a third category:  Those (sometimes including me) whose culinary pleasure depends on not knowing, chemically speaking, exactly what it is they’re eating, and once facts and figures enter into it, the whole experience turns sour.

I don’t know about you, but when I was younger and first started scanning the nutrition labels on every foodstuff in the kitchen, the whole point of dining out was to eat as much as humanly possible, because you had no earthly idea how many calories were involved and could therefore assume there were none at all.  As any corrupt politician will tell you, plausible deniability is a powerful thing.

Admittedly, one cannot responsibly live in such utter obliviousness forever—aforementioned 160 million Americans notwithstanding—and as I’ve grown older, I’ve become considerably more informed and mindful about the science of nutrition and human metabolism, which has enabled me to balance the books in my eating and exercise routines, as well as to perform ballpark calorie calculations in my head in almost any setting—a superpower that is both highly useful and profoundly irritating.

On the one hand, becoming educated about food has unlocked the secret to losing (or at least not gaining) weight and feeling generally in control of my destiny.  By turning meals into a math problem—or, more accurately, a budget—I am considerably less likely to stuff my face for the hell of it and then feel like crap for the rest of the day.

On the other hand, by being super-vigilant about what I deposit into my pie hole—say, by scarfing down three slices of pizza for lunch instead of six—I risk turning eating into a purely clinical and joyless act—something every diet fad in history has expressly tried to avoid, because why on Earth would you remove the pleasure from the most inherently pleasurable activity of your day?

It has taken me several years—and one rather dramatic period of weight loss—to reconcile those twin urges without driving myself completely crazy.  (As the maxim often attributed to Oscar Wilde goes, “Everything in moderation, including moderation.”)  While I don’t regret this strange journey to enlightenment (such as it was), I often wonder whether I’d be happier if I’d remained fat and ignorant instead of thin and neurotic—and whether America as a whole is feeling similarly now that it’s become virtually impossible to eat anything without the terrible knowledge of how much it’s costing us (in all senses of the word).  Whether our ability to live longer and healthier is necessarily making us live better.

There’s a saying amongst dieters:  “Nothing tastes as good as skinny feels.”  How wonderful life would be if such a thing were actually true.

The Man Who Wouldn’t Be King

It says a lot about America that John McCain was never elected president.  It says even more that, in retrospect, we sort of wish he had been.

Indeed, all the way back in 2001, during an interview with Charlie Rose (ahem), Bill Maher cited McCain—recently defeated in the GOP primaries by George W. Bush—as among his favorite Republican politicians.  “He’s everyone’s favorite,” said Rose, to which Maher dismissively retorted, “Then why doesn’t he win?”

It’s a damn good question, and a useful lens through which to view our entire political system.  As McCain clings ever-more-precariously to life—having spent the last 10 months ravaged by glioblastoma, an aggressive form of brain cancer—we might reflect on the strange way that our most accomplished and admired public officials tend not to rise all the way to the Oval Office—and why a great many more never bother to run in the first place.

On paper, McCain would seem exactly the sort of person the Founding Fathers had in mind as a national leader:  A scrappy rebel from a distinguished family who proves his mettle on the battlefield, then parlays that fame into a steady career in public service.  (He was first elected to Congress in 1982 and has never held another job.)

While McCain was hardly a first-class intellect—he famously graduated near the bottom of his class at Annapolis—his grit and endurance through five-and-a-half years of torture and deprivation in a Vietnamese prison forever burnished his reputation as among the most indefatigable men in American life—someone who would speak truth to bullshit and hold no loyalties except to his own conscience.  Having cheated death multiple times, here was a man with precious little to fear and even less to lose.

Against this noble backdrop, it would be the understatement of the year to say that, as a two-time presidential candidate, John McCain was a complicated and contradictory figure—perhaps even a tragic one.  In 2000, he established his political persona as a crusty, “straight-talking” “maverick,” only to be felled in South Carolina by a racist Bush-sanctioned push-polling operation that McCain was too gentlemanly to condemn.  (The push polls implied, among other things, that McCain’s adopted daughter from Bangladesh was an out-of-wedlock “love child.”)

Eight years later, having learned a thing or three about brass-knuckles campaigning, McCain scraped and clawed his way to the Republican nomination—besting no fewer than 11 competitors—only to throw it all away with the single most irresponsible decision of his life:  His selection of Alaska Governor Sarah Palin as his running mate.

With nearly a decade of hindsight, the science is in that choosing Palin—a world-class ignoramus and America’s gateway drug to Donald Trump—constituted the selling of McCain’s soul for the sake of political expediency.  Rather than running with his good friend (and non-Republican) Joe Lieberman and losing honorably, he opted to follow his advisers’ reckless gamble and try to win dishonorably.  That he managed to lose anyway—the final, unalterable proof that the universe has a sense of humor—was the perfect denouement to this most Sisyphean of presidential odysseys.  He was damned if he did and damned if he didn’t.

The truth is that McCain wouldn’t have won the 2008 election no matter what he did, and this had very little to do with him.  After eight years of George W. Bush—a member of McCain’s party, with approval ratings below 30 percent in his final months—the thrust of history was simply too strong for anyone but a Democrat to prevail that November.  (Since 1948, only once has the same party won three presidential elections in a row.)

If McCain was ever going to become president, it would’ve been in 2000.  Pre-9/11, pre-Iraq War and post-Bill Clinton, a colorful, self-righteous veteran could’ve wiped the floor with a stiff, boring policy wonk like Al Gore.

Why didn’t he get that chance?  The official explanation (as mentioned) is the reprehensible smear campaign Team Bush unloaded in the South Carolina primary.  However, the more complete answer is that Republican primary voters throughout the country simply didn’t view McCain as one of their own.  Compared to Bush—a born-again Christian with an unambiguously conservative record—McCain was a quasi-liberal apostate who called Jerry Falwell an “agent of intolerance” and seemed to hold a large chunk of the GOP base in bemused contempt.

McCain’s problem, in other words, was the primary system itself, in which only the most extreme and partisan among us actually participate, thereby disadvantaging candidates who—whether through their ideas or their character—might appeal to a wider, more ideologically diverse audience later on.  Recent casualties of this trend range from the likes of John Kasich and Jon Huntsman on the right to John Edwards and (arguably) Bernie Sanders on the left.

On the other hand, sometimes primary voters will do precisely the opposite by selecting nominees whom they perceive to be the most “electable”—a strategy that, in recent decades, has produced an almost perfect record of failure, from John Kerry to Mitt Romney to Hillary Clinton.

By being his best self in 2000 and his worst self in 2008, McCain managed to fall into both traps and end up nowhere.  Indeed, he may well have been a victim of bad timing more than anything else—as was, say, Chris Christie by not running in 2012 or Hillary Clinton by not running in 2004.

Then again, all of history is based on contingencies, and it is the job of the shrewd politician to calibrate his strengths to the tenor of the moment without sacrificing his core identity.  However appealing he may be in a vacuum, he must be the right man at the right time—the one thing Barack Obama and Donald Trump had in common.

As Brian Wilson would say, maybe John McCain just wasn’t made for these times.  Maybe he wasn’t elected president because America didn’t want him to be president.  Maybe his purpose in life was to be exactly what he was:  A fiery renegade senator who drove everybody a little crazy and loved every minute of it.  Maybe he wouldn’t have been any good as commander-in-chief anyhow—too impulsive, too hawkish—and maybe we’re better off not knowing for sure.

Will someone of McCain’s ilk ever rise to the nation’s highest office in the future?  Wouldn’t it be nice if they did?