The L Word

Over the weekend, Showtime aired a new documentary about Barney Frank, the now-retired congressman who represented Massachusetts’ 4th district for 32 years.  Among Frank’s many quips included in the film (he was known as much for his wit as for his legislative clout), the one that most stuck with me was his response to the question, “What is a progressive?”

“A progressive,” Frank said, “is a liberal who’s afraid to admit it.”

Sounds about right to me.

It is remarkable—and always worth pointing out—how completely the word “liberal” has been scrubbed from the Democratic Party in Washington.  Ask any major Democratic figure to tick off the adjectives that best describe him or her, and “liberal” will be nowhere on the list.

Indeed, it’s as if party leaders called a secret meeting some years back and proclaimed “liberal” the eighth entry on George Carlin’s list of words you can’t say on television, then went on to intone that “progressive” is the preferred euphemism that shall be employed from this point forward.

Everyone’s been marching in lockstep ever since.

At the first Democratic debate, Hillary Clinton was asked if she is a progressive or a moderate.  No one even thought to include “liberal” in the list of choices.

Bernie Sanders’ website boasts of “a progressive economic agenda,” but not a word about doing anything for liberalism.  And this from a guy who’s trying to resurrect “socialism” from the linguistic dustbin of history.

President Obama?  Same deal.  Democrats in Congress?  Ditto.  Far as they’re concerned, they’re a progressive party with a progressive agenda.  No bout a-doubt it:  “Liberal” has been stricken from the record and “progressive” is the new black.

This leaves us with two questions I have never quite shaken.  First:  Why?  And second:  Does it even matter?

Implicitly, I think Barney Frank’s pithy definition manages to answer both.  In short:  Democrats are scaredy-cats.

Taking my second question first:  Is there a functional difference between these two words, or are they interchangeable?

I’ve always naïvely assumed the latter, figuring that “progressive” is simply a nicer-sounding version of an otherwise identical concept.  Barney Frank evidently agrees.

But then I did a bit of Internet sleuthing and realized the issue isn’t as cut and dried as all that.  For instance, from a Huffington Post article by David Sirota, we have the following:

“It seems to me that traditional ‘liberals’ in our current parlance are those who focus on using taxpayer money to help better society.  ‘Progressives’ are those who focus on using government power to make large institutions play by a set of rules.”

If that sounds like splitting hairs, don’t worry, it gets worse.

Indeed, scour the web long enough—say, three or four minutes—and you’ll find “serious” arguments claiming that while Bernie Sanders may be a progressive and a socialist, he is most definitely not a liberal.  Or that Hillary Clinton, for all her posturing—along with three or four decades of advocacy for left-wing causes—is not the progressive-in-chief she presumes to become.

All this talk of who’s a real Democrat and who’s not is eerily reminiscent of the purity tests to which conservative voters have subjected Republicans over the past decade or so—particularly in the era of the Tea Party, in which there is no such thing as being too conservative.

The crucial difference, though, is that Republicans actually take the bait, tripping all over each other to be the most maniacally right-wing person on the stage.  That’s how you get Mitt Romney calling himself a “severely conservative” candidate in 2012, or Marco Rubio hardening his positions on things like immigration and abortion over the last few months.

On the whole, Democrats are the reverse, constantly assuring us that they are less liberal than they appear.  Hence Barack Obama pretending to be against same-sex marriage in 2008, or John Kerry embracing the Iraq War four years earlier.

The conventional wisdom is that candidates run toward the extremes for the primaries and toward the center for the general election.  Not really.  Republicans, maybe.  But Democrats pretty much cling to the center and stay there.  Recall how Bill Clinton in 1992 built his entire candidacy on being a “New Democrat,” suggesting there was something wrong with the old ones.

In fact, there was something wrong with them:  They kept losing elections.  Candidates like Michael Dukakis and Walter Mondale ran as across-the-board liberals and got walloped.  While those particular losses can be attributed to personality as much as policy (Dukakis’ in particular), it became clear that the party would need to change in one way or another, and Clinton’s solution amounted to “Let’s be more like Republicans.”

To be fair, the gambit worked, insofar as it produced electoral victories.  The Democratic candidate has won the popular vote in five of the last six presidential elections, and always on the implicit promise of being agreeably moderate on most things, whether through cutting taxes, maintaining a large military presence overseas or being as glacial as possible on civil liberties and civil rights.

If the benefit of this strategy has been having a liberal in the Oval Office, the cost has been liberalism itself.  The Democratic Party’s ideological center of gravity is smack in the middle of the American political spectrum, leaving left-wing voters with no one to defend their views on such matters as social justice or economic inequality.  When such true-blue figures appear—say, in the form of Elizabeth Warren or Bernie Sanders—left-wing voters treat them like the unicorns they are.

There are a billion national Republicans whom the GOP base can claim as its own.  Why does the Democratic base have almost none at all?

Easy:  Because the Democratic Party either doesn’t trust its own ideas, or it doesn’t trust its ability to sell those ideas to the public.

Considering public opinion, you’d think they would try every now and again.  After all, core liberal programs like Medicare and Social Security remain intensely popular from coast to coast, while subjects that Democrats didn’t even touch a decade ago—gay marriage, legal marijuana, prison reform—are becoming more accepted by the day.

The party worries that it’s too liberal for America.  Has it ever wondered if it isn’t the other way around?

Goodbye to Some of That

Here was a thoughtful Facebook post this week from George Takei:

“We have made progress.  Few even notice that the top contenders for the Oval Office are a woman senator, a black neurosurgeon, a Jewish socialist and a total douchebag.”

True enough.  And we could go even further than that:  It seems that equally few people have noticed that one of the five Democratic candidates is Catholic, as are not one, not two, but six of the Republicans.  This in a country that has elected only one Catholic president and nominated only two others.  So much for the fear of an American leader taking his cues directly from the Holy See.

Then there’s the fact that one GOP candidate is the son of Cuban immigrants, another is the son of Indian immigrants and a third wasn’t even born in the United States.

In previous elections, any of those pieces of trivia would have been major headline news, and we would have spent months ruminating, for instance, on how having foreign-born parents might affect a potential president’s foreign policy—as some people are still doing with our current president and his African father.

But in this election?  Not so much.

Not really at all, in fact.  I mentioned that one of the Democratic candidates is Catholic.  Quick:  Which one is it?  Did the thought even occur to you until now?  If so, has this person’s Catholicism had any impact on your interest (or lack thereof) in electing him leader of the free world?

For that matter, has Bernie Sanders’ Judaism had any measurable influence—in either direction—on his overall popularity?  Is there a statistically significant chunk of Republican voters who support (or oppose) Ben Carson because he is black, or does his unlikely success in the polls exist entirely outside the context of race?

Wouldn’t it be nice to think so?

To be sure, none of these people has won a damn thing at this point in the 2016 presidential contest.  Anybody can run for president, so it shouldn’t come as a shock that a minority or two would slip in every now and again.

The difference this time—as George Takei’s quip suggests—is that many of these demographic oddballs are being taken seriously and could actually win the nomination, if not the presidency, and hardly anybody seems to care about how “historic” these would-be presidencies would be.

A Cuban commander-in-chief?  Yawn.  An African-American in the Oval?  Been there, done that.  A woman as the most powerful human being on Earth?  Yeah, sure, why not?

Perhaps I’m being needlessly optimistic.  With 15 weeks to go before the first primaries, it’s still too early to take an accurate measure of what America really thinks about its options on the ballot next fall.

However, as gauged by media coverage—a not entirely useless barometer of cultural trends—the attention on this year’s contenders is focused less on identity and more on…well, not issues, per se, but definitely on the things coming out of these folks’ mouths.

With Ben Carson, for instance, the interest has been entirely on his muted speaking style, his career as a neurosurgeon and—for liberals, anyway—his galling opinions about Muslims and gun control.

And why not?  When a serious candidate walks around saying things like, “Obamacare is […] the worst thing that has happened in this nation since slavery,” there really isn’t time to notice the color of his skin or any other details not immediately relevant to what his policies might be.

(Admittedly, a non-black candidate probably wouldn’t have uttered that particular sentence, although Carson didn’t let his non-Jewishness prevent him from saying the Holocaust might have been avoided if the Jews had been packing heat.)

In other words, Carson’s outlandish worldview has transcended his race and everything else about him.  To the degree that his public comments can be considered “substance,” our public discourse on his candidacy has, in fact, been almost entirely substantive.

Likewise with Bernie Sanders.  The Vermont senator has become universally recognized for being a democratic socialist and not—so far as anyone can tell—for being Jewish.  His theories about making the United States a bit more like Scandinavia—galvanizing to liberals, infuriating to conservatives—have totally eclipsed any concerns (or thrills) that his faith might otherwise have caused.

In 2000, the nomination of Joe Lieberman for vice president created a minor national tizzy—particularly within the Jewish community.  This time, one could be forgiven for not knowing what Sanders’ religion is.  He’s certainly never brought it up himself.

Maybe this will change.  Should Sanders replicate Barack Obama’s 2008 miracle and defeat Hillary Clinton, it might only be a matter of time before we start wondering (for instance) whether having a Jewish commander-in-chief would be counterproductive in our ongoing negotiations with the Arab world, or whether a Sanders presidency would engender a new era of conspiracy-mongering amongst America’s anti-Semitic community, in the way that Obama’s presidency has seen a flowering of our country’s residual racism.

And should Clinton maintain her lead and fulfill her destiny, it probably won’t just be T.I. musing about how women can’t be trusted with power because their hormones might get the better of them.  Some 92 percent of Americans say they would vote for a qualified female presidential candidate, but the road to that destination might be a whole lot messier than we think.

But I’m willing to be surprised.  Maybe the United States really has moved beyond identity-based prejudices in choosing national leaders.  Maybe electing our first black president—twice!—had the effect of getting all of our demographic hang-ups out of our system, and now we are prepared to elect anybody with the smarts and the fortitude to take on the most difficult job on planet Earth.

Soon enough, we’ll know for sure.  In the meantime, we can give ourselves a soft pat on the back for ignoring or looking past the genetic characteristics of our presidential contestants and focusing, instead, on things that really matter.  Like those damn e-mails.

Boston Goes Dutch

For all the time I spend in art museums, I’ve never really settled on favorites.  Having no formal education in fine arts, my tastes tend to careen from one genre to another.  I’ll be swooning over Impressionism one moment, then gawking at bronze sculpture the next.  Then I’ll turn a corner and realize—no!—it’s Dalí and the Surrealists I love the most.  Or perhaps Sargent with his ravishing portraits, or Bierstadt and his majestic Western landscapes.

In short:  I may not know art, but I also don’t know what I like.

Yet whenever I’m wandering through some wide-ranging collection or other, I always manage, sooner or later, to find my way to the Netherlands.

For whatever reason, Dutch paintings have an aura and a gravitas completely unlike those from anywhere else.  You spot one from across a gallery and realize that your day is not complete until you’ve given it a good, hard look, because there’s bound to be something there that you’ve never seen before and probably won’t again.  (Spend a few hours with Hieronymus Bosch’s Garden of Earthly Delights and you’ll start to get the idea.)

While Dutch painting has been interesting in every epoch, it was during the 17th century that it achieved transcendence.  Like 18th-century Philadelphia or the 1927 Yankees, the Dutch Republic during the 1600s saw a confluence of human ingenuity and accomplishment—concentrated in one time, in one place—that, in its way, is without parallel in recorded history.

And now, at the finest art museum in New England, you have as great an opportunity as ever to experience what that explosion of creativity entailed.

The new show is called “Class Distinctions:  Dutch Art in the Age of Rembrandt and Vermeer.”  It is on view at the Museum of Fine Arts, Boston, from now until January 18.  If you have even the slightest interest in the Dutch Golden Age or in Dutch culture in general (there is absolutely no reason you shouldn’t), you’d best make your way to Huntington Avenue while the getting’s good.  I have it on good authority that the exhibit is truly one of a kind.

As it turns out, what made this so-called Golden Age so special was the vitality of its subject matter.  The 17th century marked the moment—albeit one that lasted 80 years—when, through a grinding war, the seven northern provinces of the Netherlands secured independence from Spain and became a thriving, independent republic.  In so doing, the people of this reborn nation (when they weren’t busy founding and settling New York City) were compelled to start their lives fresh—a bit like our own founding fathers a century and a half later.

The resulting society was at once a cultural and economic powerhouse and a nation sharply and cruelly divided among the rich, the poor and everyone in between.  Hence the show’s title, “Class Distinctions.”  With 75 paintings—culled from institutions all over the world—the exhibition presents the entirety of Dutch society through the prism of class:  Each person in each piece is defined, in one way or another, by what they do and where it places them in the social hierarchy.  From nobles and businessmen down to butchers, bakers and linen makers, everyone had their place and, for better or worse, was likely stuck there forever.

Sound familiar?

At this particular moment—when America is, itself, becoming an increasingly stratified country, and when our great cultural institutions are struggling to retain relevance by appealing to ever-younger audiences—the MFA’s Dutch show is a dazzling stroke of inspiration:  A collection of masterworks arranged in a manner that could not be timelier or more compelling.

The MFA has certainly wrestled with the issue of populism before.  In the spring of 2014, it put on a modestly sized exhibition, “Boston Loves Impressionism,” that was possibly the first major museum show to be crowd-sourced:  In the months prior to opening, the MFA allowed its visitors to vote for their favorite Impressionist works in the museum’s collection, and the 30 most popular pieces won a spot on the wall.  Neat, huh?

In truth, “Boston Loves Impressionism” was mostly just a way to kill time while the museum renovated its permanent Impressionist galleries—and, of course, to show off how magnificent its holdings are—but it nonetheless marked a minor milestone in the relationship between a large art institution and the public.  It posed the question:  What responsibility does the former have to the latter?  Should a museum care what its visitors think, or is its job purely to educate—to enhance the tastes of the rabble rather than indulge them without comment?

If the MFA’s Impressionism experiment was an act of pandering—however successful and aesthetically gratifying—“Class Distinctions” represents the pinnacle of what a great art museum can and should do with its mountains of buried treasure.  That is, to present them in a way that implicitly says to the public, “This is what great art looks like, this is why it’s important and this is how it might be useful in your daily life.”

With Dutch painting—equally in settings intimate and majestic—it’s all about the little details.

Details like the dead rooster lying at a customer’s feet in Isaak Koedijk’s The Barber-Surgeon (it was the customer’s means of payment).  Or the slightly bemused look on one of the cows in Gerard ter Borch’s A Maid Milking a Cow in a Barn (butter and cheese were among the most valuable commodities of the time).  Or the general prevalence, in nearly every depiction of the ruling class, of cheerful black servants and/or playful midsize dogs—both signifiers of great wealth, much as sports cars and personal trainers are today.

Overall, the paintings here reveal much about the era’s gender roles and the comparative benefits of one profession over another (e.g. working-class prostitutes were much better off than women in more “honorable” vocations).  Indeed, the sheer volume of information contained in this show’s four rooms is staggering, particularly if you—like me—knew practically none of it beforehand.

Then again, you could choose not to give a damn about What It All Means and simply enjoy these pretty pictures for their own sake.  “Class Distinctions” is a triumph because of its strong thematic momentum, but also because the paintings themselves—virtually without exception—are as rich and inventive as you could reasonably hope to see all at once.  And all without having to invent a time machine or buy a ticket to Amsterdam.

Unbearable

As the nation smarts from its most recent mass shooting—this one at a community college in Oregon—I wonder:  this time, can we skip the niceties and just repeal the Second Amendment?

In America’s perpetual cycle of gun violence, the predictable response by the left—but never by the right—is to advocate for stronger gun control laws, in the hope that these sorts of senseless large-scale massacres could be made a little less common in our culture.

While there is no doubt that background checks, assault weapons bans and the like would reduce the total number of lethal firearms in circulation—and, presumably, lower the odds that some lunatic will toddle into a public place and wreak unholy havoc—it has become increasingly clear that such “common sense solutions” are not enough.

To truly, comprehensively address our country’s problem of crazy people committing mass violence with guns, we need to change the culture itself.  And there is no more obvious place to start than to strike one-tenth of our beloved Bill of Rights.

We often ask why our country is so much more violent than other Western democracies, as measured by how many times we kill each other (and ourselves) with guns every year.

I think it’s rather simple:  It’s because we have conflated weaponry with freedom.

Americans value personal autonomy as a sacred birthright, and somewhere along the line we decided that having your own private arsenal is a necessary component of the compact between individuals and the state.  As far as we’re concerned, personal freedom and security are meaningless unless they include the ability to kill another human being with the greatest of ease.

And you can’t blame us for thinking that, because it’s written right there in the Constitution:  “A well regulated militia being necessary to the security of a free state, the right of the people to keep and bear arms shall not be infringed.”  Scholars have argued long and hard about the significance of the first half of that sentence, but the second half is inescapable.  So long as the Second Amendment remains on the books, we will continue to accept that purchasing and carrying a gun is—under most circumstances—a natural part of American life, along with all the carnage that comes with it.

Which means that if we want to change the latter, we will need to change the former.

So let’s do it.  Let’s strike the Second Amendment from the record and forget it was ever there in the first place.

This may seem like an extreme, radical measure.  In practice, that wouldn’t necessarily be the case.

After all, to abolish the Constitutional right to “keep and bear arms” would not make gun ownership illegal.  As with any activity not specifically mentioned in the Constitution, it would merely delegate the regulation of firearm sales to individual cities and states—a scenario that more or less already exists.  The difference, of course, is that those municipalities would be empowered to enact severe gun restrictions—if not outright prohibitions—they could not currently get away with.  Or, conversely, to opt for no limits whatsoever.

The result would be exactly what Republicans have always claimed they want:  A laboratory of democracy operating purely on the state and local level.  Guns would be prevalent in areas of the country that want them and scarce in areas that don’t, and the results would speak for themselves.

The real impact of getting rid of the Second Amendment—the reason it might be worth the trouble—is that it would force us to treat deadly weapons with the gravity they deserve, and to stop acting as if they were as American as football and pumpkin pie.  It would scale firearms down from a revered prop in the American tool belt to what they actually are:  A million tragedies waiting to happen.  They would be exotic commodities, instead of the red-white-and-blue freedom sticks that many of us take them for now.

This shift in attitude would be no small thing—not when you reflect on how much of a crutch the Second Amendment has always been for those who oppose any and all gun control legislation, and how much more challenging the pro-gun argument would be if the amendment didn’t exist.

Indeed, every time we experience some gun-related mass murder and our leaders muse about what might be done to rectify it, the National Rifle Association and its fellow travelers insist that the problem of sociopaths accessing weapons is ultimately unsolvable, then they proceed to hold up the text of the Second Amendment the way a vampire hunter holds up garlic.

The NRA apparently feels that the right to bear arms is absolute and—as Ben Carson recently argued—more important than ensuring that innocent children are not pumped full of holes, even when such horrors could very easily be prevented.

Well:  so long as the Constitution says what it says, what’s to stop them from thinking that?

Now, I’m not a complete idiot.  I realize that the prospect of repealing a Constitutional amendment—particularly this one—has roughly the same likelihood as electing Taylor Swift to be speaker of the House.  Not counting the Bill of Rights, America’s founding document has been altered only 17 times in its 228 years of existence.  What is more, only once has Congress replaced one amendment with another, and that was for the singular purpose of reintroducing alcohol into polite society (a noble cause if ever there was one).

Yes, of course attempting to knock down the Second Amendment would be an exercise in futility—political suicide, to say the least—and I don’t expect such an effort to take shape in my own lifetime, or anyone else’s.

In 2014, no less than former Supreme Court Justice John Paul Stevens offered the following compromise:  Rather than abolishing the Second Amendment outright, he wrote, let’s expand it by five words so that it reads, “A well regulated militia being necessary to the security of a free state, the right of the people to keep and bear arms when serving in the militia shall not be infringed.”

That would certainly do wonders in clarifying what the founding fathers likely meant in composing the original clause, and it might well have the effect of tempering our country’s cavalier attitude toward crooked metal objects that kill people.

Or not.

Maybe the image of Americans as armed and dangerous is just too strong for us to ever overcome.  Maybe the United States really is exceptional in this regard, for better and for worse.  Maybe our increasingly obstinate Congress has made concrete gun control laws impossible for many years to come, and maybe these insane school shootings will continue to occur at regular intervals for the rest of our natural lives.  That’s to say nothing of all the afterschool shootings, suicides and freak accidents that can only be caused by lethal weapons and would be far less prevalent if those weapons were far more difficult to obtain.

In other words, maybe we’re doomed.  Maybe a few dead children every now and again is the price we pay for freedom, and we’ll just need to get used to it.

And so we are left to ask ourselves whether a Constitutional right to bear arms is worth all of that.  Call me un-American, but I’m not entirely sure that it is.

Bern Notice

Could Bernie Sanders be elected president?  Could he even be nominated by the Democratic Party?  Would the latter be political suicide with respect to the former?

For the many despairing liberals who think they already have these questions figured out, indulge me in a brief history lesson that just might give you pause and—dare I say—hope.

At precisely this point in the 2008 election cycle—namely, early October 2007—Hillary Clinton led Barack Obama by anywhere between 16 and 21 points in opinion polls of Democratic primary voters.  What is more, out of the 133 national surveys conducted between January and October of that year, Clinton bested Obama on 131 occasions.

This trend continued throughout the fall, with Clinton more than 20 points up in the final days of the year.  In fact, the first time Obama prevailed in any poll in the latter stage of the campaign was in the first week of February 2008—a full month into the actual primary process.

From that moment on—that is, between early February and the first week of June—everything flipped.  Of the 52 polls taken during that time, Obama defeated Clinton by a score of 47-5.

Long story short (too late?), the man who’s been leader of the free world for the last six-and-a-half years did not even become leader of the Democratic Party until after many, many months of being firmly in second place.

While the thrust of history tends to wash away memories of what might have been, it is worth recalling how overwhelmingly likely it seemed that Hillary Clinton would be her party’s nominee for president in 2008.  How very few people—for a very long time—took seriously the notion that a first-term senator with mixed blood and an African name could defeat the partner-in-crime of the most popular Democratic president of modern times.

For all the early passion of Obama’s smitten supporters—paired with the understandable exhaustion with all things Clinton—most, if not all, political prognosticators pegged an Obama victory as little more than a pipe dream until very, very late in the process.  Clinton, it seemed, was unbeatable.

What changed?  People started voting, that’s what.

Once Obama won the Iowa caucuses—thanks, in large part, to his campaign’s superior organization and mastery of social media—the fantasy of his election suddenly became plausible, and all those who preferred him but feared Clinton was the only viable option were liberated to vote with their hearts, with the assurance that in doing so, they were also voting with their heads.

Which is all to suggest that this premise of “electability” is totally, utterly worthless.  In truth, you cannot anticipate how the country will vote until it actually does so.  History is inconceivable until it becomes inevitable, and someone who is unelectable today just might become president tomorrow.

It happened with Obama.  Why couldn’t it happen with Sanders?

To be sure, the analogy between the two men is not exact.  Their candidacies exist in different times and contexts—different political worlds, really—and the candidates themselves are hardly mirror images of each other.

All the same, I would maintain that their similarities are the key to understanding why and how Sanders is a more serious candidate than most people think, and that in several key areas, Sanders is actually better off than Obama was at this point in his presidential quest.

I mentioned Obama was polling at least 16 points behind Clinton in the fall of 2007.  As it happens, Sanders right now is in roughly the same position.  However, I would hasten to add that it was only two months ago that Clinton led Sanders by 30 points or more—sometimes a lot more—which suggests a Sanders tailwind that even Obama never enjoyed.

Then there are the polls that actually matter:  those in Iowa and New Hampshire, the first two primary states that, for better or worse, tend to determine whose name ends up on the ballot.  (The last person to secure the nomination without winning either state was Bill Clinton in 1992.)  In Iowa, Clinton currently leads Sanders by 6 points—the same margin she held over Obama in the fall of 2007—while in New Hampshire, Sanders is ahead by double digits.  (There, Clinton led Obama by more than 20 points.)

“Feel the Bern” is not just a cheeky branding device; it is an actual, tangible phenomenon every bit as real as the worshipful throngs at the Obama rallies who thundered on about being “fired up and ready to go.”  (Whatever the hell that meant.)

Just this past weekend, a Sanders gathering in Boston drew some 20,000 people—the largest crowd ever assembled in Massachusetts for a political primary event.  Sanders has seen similarly huge showings all over the country for months now, and his campaign recently announced fundraising totals nearly equal to Clinton’s and, by some metrics, well ahead of ’08 Obama at this juncture in the race.

What is more, the reasons for this jubilation—this visceral, manic enthusiasm that Clinton can only fantasize about—are not terribly different from those that led Obama to be anointed the second coming of Christ.

In the twilight of the Bush administration—when it was clear to liberals that their government didn’t give a damn about lower-class concerns and that Congress was increasingly derelict in its most basic duties as a governing body—Obama promised to change the system:  To squash the influence of plutocrats and other special interest groups, to bring a measure of fairness and justice to American finance and to redirect critical funding from futile overseas adventures back to the home front.  And, of course, to moderate the “tone” on Capitol Hill so that Democrats and Republicans might occasionally treat each other with respect.

In other words, Obama premised his candidacy on creating a country in which ordinary people were given a stake in their own destiny, and it was a message so intrinsically appealing and so well-delivered that people responded with a fervor that hadn’t existed in decades.

That, in so many words, is what is now happening with Sanders.  While much more strident than Obama in his indictment of America’s ruling class, he, too, is billing himself as a fighter for the working man.  Judging from his stump speech, his ultimate ambition is to create a society in which wealth and income do not determine how much (or how little) influence an individual exerts over the government, nor how much benefit he or she derives from it.

To America’s non-billionaires, this is a fairly irresistible platform, and given Sanders’ long and consistent history as a legislator, there isn’t a doubt in the world that he means it.  Whether his prescriptions are feasible is a separate question.  The more important point—as the sheer size of his campaign rallies attests—is that a significant chunk of the public apparently agrees with his diagnosis of the problem itself.

Bearing all of these considerations in mind—along with a few that we haven’t mentioned—the question isn’t, “How could Sanders possibly be elected?”  Rather, it is, “Why on Earth shouldn’t he be?”

If appealing to actual concerns of actual people means anything, what makes Sanders any less electable than anyone else?  To those queasy left-wingers who worry about Sanders having trouble in the general election, I wonder:  Why are you so afraid to vote for exactly what you want?  If you are convinced that his ideas about America are superior to everyone else’s, what exactly is stopping you from exerting every effort to put him in the Oval Office?

Is it simply that you don’t think enough of your fellow citizens are smart enough to see things the way you do?  Are you worried that by nominating a true-blue liberal, the Democratic Party will lose any chance of carrying Ohio and Florida?  Are the stakes just too darned high to risk a long shot when a much safer option is available?

That’s certainly how the party felt in 2004 when it passed up firebrands like Howard Dean and John Edwards in favor of the more “electable” John Kerry.  GOP voters behaved likewise in 2012 when they picked the “electable” Mitt Romney over much more conservative alternatives.  Conversely, the Dems were said to be crazy to opt for an unknown quantity like Obama in 2008, and Ronald Reagan in 1980 was widely derided as a joke by most liberals right up until he won 44 states against Jimmy Carter’s six.

In other words, you don’t know who’s electable until the final results are in.  If you want someone who shares your worldview to be president, try voting for that person.  You might be surprised how many of your fellow citizens do the same.