Nothing to See Here

The problem isn’t that Hillary’s burrito bowl doesn’t matter.  The problem is that it does.

Oh, it certainly shouldn’t be anybody’s concern that Hillary Clinton popped into a Chipotle somewhere along her magical mystery tour through the Midwest last week.  Contrary to popular belief, presidential candidates do occasionally eat lunch.  It’s not an inherently newsworthy event.

In fact, you’d need to be more or less clinically insane to be so invested in a potential president—19 months before the election—that you wonder where (and what) they ate this week.  Or ever.

Many words leap to mind.  One of them is “stalker.”

But, of course, that’s what our country’s venerable press corps spent its time doing in the opening days of the Hillary Clinton campaign.

From the moment the former secretary of state announced her candidacy in a YouTube video on April 12, a gaggle of reporters raced to her roving campaign headquarters—a Chevy Explorer named Scooby—and they’ve kept her road trip under a microscope ever since.

When word came that Clinton had patronized a Chipotle without anyone noticing, the media couldn’t let it go.  Over the next several days, no stone of Burritogate was left unturned:  What Clinton ordered, whether she left anything in the tip jar, why she was there incognito and didn’t mingle with other customers.

This is probably the moment for us to wryly observe that if the media had been as maniacally vigilant about the Iraq War as they are about a former senator’s dining habits, the last dozen-odd years of Middle East calamities might have been avoided.  But that’s a cliché for another day.

The fact, in any case, is that the press is treating this early phase of the 2016 election pretty much as you’d expect:  By seizing on every last microscopic detail about the two parties’ respective contenders and wringing as much meaning out of it as they can.

At bottom, this is the result of two simultaneous—and seemingly unavoidable—conditions.  First, the reporters in question apparently only exist in order to cover these sorts of things.  Because, you know, it’s not like there’s anything else happening in America that might provide a better use of their time.

And second, since the first primary ballots of the 2016 race won’t be cast for another nine months, they really have no choice but to cover literally anything the candidates do.  Thus far, Clinton is the only active campaigner on the Democratic side, so there you have it.

The logic of it, however depressing, seems airtight.

It’s not.  There is a choice involved here, both for journalists and for us:  The choice to look away.  To ignore everything to do with the 2016 election until—oh, I don’t know—the year 2016.  To wait patiently until something interesting happens, rather than trying to create interest out of nothing.

We could allow ourselves a scenario—if we so chose—in which presidential aspirants would go on their whistle-stop tours of Iowa and New Hampshire for years on end, but without reporters and cameras breathing down their necks 24 hours a day.  Grant the good residents of these early primary states the attention of the candidates, but not of the entire country.  Really, what do the rest of us care?

There are those—particularly on the interwebs—who will insist to the last that early nuggets from the campaign trail can serve as insights into a candidate’s character and managerial style, and are therefore worth covering and commenting upon.

As much as I would love to dismiss this theory outright as a load of hooey—political pop psychology run amok—I am in the unfortunate position of agreeing with it.  At least some of the time.

For instance, it became clear in the early days of John McCain’s first run that his scrappy, welcoming attitude toward the press would make him an uncommonly congenial and entertaining nominee (a fact that, admittedly, didn’t quite hold the second time).  Conversely, I think Rand Paul’s already lengthy history of arrogance and condescension toward reporters asking him simple questions should rightly give pause to anyone who thinks it’s a good idea to make this guy America’s chief diplomat to the world.

I’m not convinced, however, that it requires two full years of coverage for the truth about these people to come out.  Indeed, I am as certain as I can be that a person who completely tunes out all “news” about the 2016 election from now until, say, next February will be no less informed of its essentials than someone watching The Rachel Maddow Show every night between now and then.

I should add that, so long as reporters continue hounding candidates day and night, I have no particular problem with viewers at home following it as pure, disposable entertainment.

Just don’t pretend that it’s anything else.

Personally, I think it’s kind of hilarious that Hillary Clinton has named her campaign van Scooby.  It’s goofy, whimsical and endearing—and possibly a latent grab for the stoner vote, considering whom it’s named after.

But I did not need to know that.  It’s not important, and it reveals nothing relevant about Clinton that I won’t learn through debates, speeches and actual primaries.

More to the point, I did not need a professional journalist to tell me the van’s name, knowing what that journalist might have learned and written about instead.

The key in covering a round-the-clock event that goes on forever is knowing how to distinguish the things that matter from the things that don’t.  When reporters treat everything equally—as if where a candidate eats lunch is just as important as what he or she thinks about climate change—they license voters to do the same thing, leading to a campaign that is dangerously trivial.

The trouble, you see, is that talking about a trip to Chipotle is a lot more fun than talking about, say, ISIS.  Given the choice, there isn’t one of us who wouldn’t secretly (if not openly) prefer the former, even though we know the latter is infinitely more consequential and pertinent to being president of the United States.

Which means that we can’t be given the choice.

We can’t have our laziest instincts accommodated by being told that following the most inane details of a presidential campaign makes us informed citizens.  It doesn’t.  It just makes us voyeurs and turns our candidates into exhibitionists.  To elevate irrelevant pablum to a level of respectability is to enable both us and them to become our worst possible selves.

As we well know, the cultural erosion this practice creates does not end with the campaign.  Think about how many precious TV hours and newspaper columns have been expended on the exploits of the first family, or on the president’s March Madness bracket.  Or the fact that the White House Correspondents’ Dinner is still a thing.

The human need for trivia is plainly innate and inescapable—hence the proliferation of reality TV, People and the National Football League.

However, government and politics are supposed to exist outside the world of superficiality, not least because the future of the republic depends upon them.

Is it too much to ask that we take this one aspect of American life seriously?

If our press corps didn’t spend days on Hillary Clinton’s burrito runs and the like, would we really be unable to handle the stuff that really matters?

Don’t answer that.

A Frank Appraisal

I’d nearly forgotten how much I adore Barney Frank.

The Massachusetts lawmaker retired from Congress in January 2013 after 16 terms representing the state’s fourth House district.  He had kept relatively quiet in the two years since, but has suddenly been popping up in TV and radio interviews in conjunction with the release of his new memoir, Frank.

His reemergence into public life should function as a reminder of how unique, entertaining and indispensable he still is.

To many, Barney Frank may well be known simply as the co-author of the Dodd-Frank Wall Street Reform and Consumer Protection Act, which attempted to right the American economy amidst the Great Recession by dramatically shaking up the inner workings of the country’s regulatory agencies.

While Frank’s role as chair of the House Financial Services Committee will undoubtedly be a major component of his legacy as a public servant (for better or worse), his special place in my heart—and in the hearts of countless other government nerds—was secured through a lifetime of advocacy for causes and principles that precious few other congressmen have ever bothered to take seriously.

And—it must be said—for his being such a cranky, insufferable firewall against those who have stood in his way.

As a Massachusetts Democrat, Congressman Frank was, in some ways, completely predictable.  On matters of policy, he took an unambiguously liberal view on nearly every issue, from economics to foreign policy to climate change to abortion.

But it wasn’t just that he held clear political stances and stuck with them (rare as that is nowadays).  It’s that he defended his worldview with guns blazing, arguing for his side until his throat grew hoarse—often to the point of rudeness—never giving an inch and never entertaining any doubt that, in the end, he was right.

Specifically, Frank made himself a champion of two would-be lost causes:  Government and liberalism.  That is to say, on the former, he advocated not merely for his own particular government-led solutions to various national ills, but also for the notion that government should be in the business of helping people whenever it possibly can.  On the latter, he not only gave voice to left-wing ideas, but to liberalism itself as a noble means of seeing the world and running the country.

In short, he was (and still is) a big government Democrat and damned proud of it.

For any left-wing politician, this should go without saying.  But it doesn’t.

Unlike most Republicans in Washington, who fall all over each other claiming to be the most “conservative” person in the room, today’s Democrats do a fairly rotten job of sticking up for their own brand.  As Frank himself has disapprovingly observed, most Democrats attempt to have it both ways by championing government programs but then echoing the GOP mantra that government should be as small as humanly possible.

They do this out of fear—namely, fear that voters are too conservative to ever be sold liberalism as a governing philosophy.  They have effectively ceded the moral high ground that, in the Roosevelt and Johnson eras, liberalism so firmly held.

Instead, they have adopted non-ideological centrism as their M.O.—a tactical approach that, to be sure, helped to elect Bill Clinton and Barack Obama to four combined presidential terms, but which has also left the party vulnerable to the charge that it doesn’t believe in anything except winning elections.

Barney Frank had no truck with this lame political maneuvering, and instead took the gamble that he could convince people that his left-wing views were the right ones, not least by showing that he believed in them himself.

Indeed, when speaking on issues about which he was passionate, he was seemingly a man without fear.  Even when he knew his position was unpopular—and he certainly had a knack for skirting popularity—he went right ahead and made his opinion clear.  Morally speaking, he didn’t care if he was the only one stumping for this or that cause.  He was determined to say what he truly thought and shape America into what he dearly wanted it to be.

The results were mixed.  In his 32 years in Congress, Frank notched some glorious victories and some devastating defeats.  The real challenge—for him and for any intellectually honest public figure—was to emerge from a lifetime of political and ideological battles with his dignity intact.  On balance, he succeeded.

At this moment, it’s worth appreciating just how difficult it is for a lawmaker to remain true to his convictions while also logging some genuine legislative accomplishments along the way.  For most congressmen, it’s one or the other:  Either you hold firm to your principles and get nothing done—newly minted presidential candidate Ted Cruz is a sterling example—or you bend and compromise, effecting laws that are not quite what you had in mind but are, under the circumstances, good enough.

In fact, Frank spent a great deal of his tenure bowing to certain political realities, acknowledging that politics is always a mixture of idealism and pragmatism and that intractable opposition cannot simply be wished away.  When push came to shove, he would opt to cut a deal with Republicans to get half of what he wanted, rather than obstinately sticking to his guns and ending up with nothing.

He tried hard not to make the perfect the enemy of the good, and it resulted in an awful lot of good.

The key, through all of it, was that Frank almost always came clean to his constituents as to why he acted as he did.  This would often require an explanation similar to the one I just gave—that politics is the art of the possible—and if the voters didn’t accept that, it was just too bad.

Frank has always prided himself on intellectual honesty, and on the basis of his collected public statements over the years, there may be nothing he despises so much as disingenuousness and hypocrisy—character traits that he still takes enormous joy in calling out.

To wit:  Before his surname became synonymous with financial reform, there was such a thing as the “Frank Rule,” which stated that a congressman who was secretly gay could be “outed” by others if said congressman publicly opposed gay rights and/or supported anti-gay legislation.  As Frank put it, “The right to privacy does not include the right to hypocrisy.”

In a way, the Frank Rule is where all the elements of Barney Frank’s awesomeness converge.  It demonstrates his searing disdain for double standards—the practice, in this case, of a lawmaker privately engaging in behavior that he publicly condemns.  It underlines Frank’s penchant for loudly and consistently condemning such conduct when it occurs.

In addition, it alludes to Frank’s outsized concern for ordinary people—especially members of minority groups—who are left vulnerable by unprincipled politicians who consider themselves to be above the law.

And, of course, it concerns the most important cause of Frank’s life and career:  Legal equality for gays.

Frank was America’s second openly gay congressman.  When he came out in 1987, the most pressing civil rights issue was amending the Immigration Act of 1965, which had classified homosexuals as “sexual deviants” who could be denied entry to the United States.  Same-sex marriage was scarcely an idea, let alone a reality.

While Frank has not been personally responsible for every civil rights victory in the quarter-century since, his fingerprints are everywhere, and his public oratory in defense of legal equality for gay people is as arresting and passionate as that of any public figure.  In an interview shortly after retiring, he cited the repeal of Don’t Ask, Don’t Tell, in which he played a part, as possibly the finest moment of his career.

And his concern for fellow gays is really just one component of his work to secure civil rights for all oppressed groups, itself motivated by his most zealously held, and seemingly contradictory, belief:  That people should be left the hell alone by the government.

For all his true-blue liberalism, Frank is a social libertarian of the first degree, defending the right of individuals to engage in any activity they want, provided that it doesn’t directly harm anyone else.  For him, this includes the right not just to marriage but also to gambling, to drug use, to prostitution and, fittingly, to free speech.  When the Westboro Baptist Church came under fire for its anti-gay demonstrations at the funerals of soldiers, Frank was one of only three congressmen to side with the church, arguing that even rank homophobia is not a sufficient cause to stifle free expression.

This is precisely the sort of nerve and political boldness of which Congress has been deprived since Frank departed its storied halls, and of which it could not possibly have enough.

We need more public servants like Barney Frank to defend the lost causes that will always need a champion.  For the time being, we can be thankful that, even in retirement, we still have Barney Frank himself to fill the role.

Oscar Soapbox

Would it be considered a lost cause to complain about the mixing of politics and the Oscars?  Is it just too late in the game for us to do anything about it?

Probably.  But every losing issue needs somebody to argue it for the last time, and on this occasion, that person might as well be me.

Arguably the most admired moment of this year’s Academy Awards, broadcast a week ago Sunday, came from Patricia Arquette, the winner of Best Supporting Actress, who devoted the final chunk of her acceptance speech to a call for equal pay for women.  “We have fought for everybody else’s equal rights,” said Arquette.  “It’s our time to have wage equality once and for all and equal rights for women in the United States of America.”  The remarks yielded howls of approval inside the Dolby Theatre and wide support on the interwebs in the hours and days thereafter.

Indeed, I can’t say I have any quarrel with the substance of Arquette’s remarks.  While I think the specific issue of wage equity is slightly more complicated than it appears—not every case is a matter of out-and-out discrimination by an employer—it’s just about impossible to dispute the principle of equal pay for equal work.

Here’s my question:  What does any of this have to do with the Oscars?

In theory, the Academy Awards are nothing more than the recognition of the film industry’s best work in a given year, as determined by members of the industry itself.  Acceptance speeches by the winners are meant to be exactly that:  A show of gratitude for having been singled out by one’s peers.  And—as has become the practice—an opportunity to thank everyone who helped get them there in the first place (which, as we know, tends to be everyone the honoree has ever met).

As such, Oscar speeches, at their best, are exercises in humility—ironic as that sounds, considering that the speakers are effectively being crowned kings and queens of the universe, or at least of the American culture.

To that end, my own favorite moment from last Sunday was Eddie Redmayne winning Best Actor for his performance as Stephen Hawking in The Theory of Everything.  Although I thought Michael Keaton slightly more deserving of the honor for his work in Birdman, I sort of hoped Redmayne would win, anyway, because I figured (from his previous wins this year) that he would react exactly as he did:  By jumping up and down like a giddy schoolgirl, completely overwhelmed.

There’s a certain feigned modesty that many British actors have turned into a shtick, but with Redmayne—33 years old, with no major starring roles until now—you sense that the gratitude is real.  That he works hard and takes his job seriously, but never in a billion years expected to wind up on the Oscar stage, and knows precisely how lucky he is.  That in a Hollywood overstuffed with jerks and prima donnas, Redmayne is one of the good ones.

That’s what the Oscars are all about:  Giving a moment in the spotlight to stars whose very existence elevates show business to something pure, noble and joyous.

And joy, it must be said, was oddly hard to come by during the balance of the Oscar telecast.  We had Best Song winners Common and John Legend lamenting the continuing racial injustices in the American legal system (and elsewhere).  We had Dana Perry, producer of the documentary short Crisis Hotline: Veterans Press 1, invoking her son’s suicide in a plea for more public discussion of the subject.  We had Imitation Game screenwriter Graham Moore citing his own brush with suicide and begging today’s tortured young people not to give up hope.

Sheesh, what an unholy string of letdowns.

Surely, these are all deathly important issues that deserve a thorough public airing, as they all surely have in recent times—albeit some more visibly than others.

But is the Dolby Theatre on Oscar night really the proper setting for them?

Can’t the Oscars just be the whimsical, frivolous, bloated Hollywood orgy we all think we’re tuning in to on the last Sunday of every February—curled up, as we are, on the couch with a tub of microwave popcorn and a cosmo?

We deal with the discomforting horrors of real life at all other moments of the year.  Why can’t the Oscars, of all things, be a temporary respite?  Arguably the single central function of movies, after all, is escapism.  Shouldn’t the event that celebrates movies follow suit?

Movie stars can, and do, stake out public opinions on any issue that interests them.  But must they do so at the very moment when most of us would just as soon not be reminded of the fraught and complicated real world to which we must return in the morning?

I know this is a line of reasoning with holes large enough to drive a tank through.  I know movies are not only about escape.  I know the Oscars represent the largest audience that any artist will ever have.  I know that the Academy is, itself, a highly political organization and that Oscar voting is subject to the same cynical political maneuvering as any presidential election.  I know that the gripes about sexism and racism are as germane to the film industry as to any other.

And I know that, barring a totalitarian freak-out by future Oscar producers, winners are going to continue to say whatever the hell they want when they get up on that stage, even if it means talking over that infernal orchestra and harshing the buzz of everyone at home.

There is no escape from facing the hard facts of life—not even at silly award shows, which you’d think would be immune to them.  Apparently they’re not.

So instead, we are left with the second-best option:  Awarding trophies only to artists intelligent enough to climb on their political soapboxes in an articulate and entertaining fashion, as (it must be said) nearly all of them did last week.

Or we could just give everything to Eddie Redmayne.

Political Sniping

Warning:  The following contains spoilers about the movie American Sniper.  Proceed with caution.

Reading the various reactions to Clint Eastwood’s new movie American Sniper, two facts immediately leap out.

First:  No one can seem to agree on the movie’s point of view vis-à-vis the Iraq War.  Some say it’s neutral or apolitical, while others consider it a full-throated endorsement of the theory that American involvement in Mesopotamia was (and is) a good idea.

And second:  In nearly every analysis of what American Sniper is about—and whether it’s any good—the conclusion perfectly matches the politics of the person making the analysis.  Generally speaking, those who opposed the Iraq War also dislike the film, while those who considered the war necessary and just think the movie is great.  Those whose politics are ambivalent, private or nonexistent fall somewhere in between.

Joined with the debate about the movie’s version of Iraq is the depiction of its protagonist, Chris Kyle, the real-life Navy SEAL credited with more confirmed kills than any sniper in American military history, who, as a consequence of his service, spent the rest of his life struggling with posttraumatic stress disorder.  That is, until he was killed by a fellow veteran who was, himself, stricken with PTSD.

Does American Sniper portray Kyle as a pure All-American Hero and—far more interestingly—should it have?  Here, too, individual answers seem to track with whatever people happened to already think about these subjects.

What is clear, in any case, is that Kyle is given highly sympathetic treatment by Eastwood and is meant to be shown, in the end, as a Good Guy.  What is more, the movie is ultimately meant to be about Kyle and Kyle alone—his experience, his struggles—and is not necessarily interested in what is going on in the world around him.

Is this a valid approach to filmmaking?  Can a movie like this be truly apolitical, as so many critics say it is?

In a fiery column in Rolling Stone, journalist Matt Taibbi says no.  Referring to “the Rumsfelds and Cheneys and other officials up the chain” as “the real villains in this movie,” Taibbi argues, “Sometimes there’s no such thing as ‘just a human story.’  Sometimes a story is meaningless or worse without real context, and this is one of them.”

Taking this theory a step further, I think it’s worth considering whether any movie can lay claim to being completely removed from politics of one sort or another.  Or, indeed, whether there is any point in trying.  My inkling is that it can’t and there isn’t, and it’s just as well that this is so.

As it happens, this is not the first time that an ostensibly “personal” Clint Eastwood project has been attacked for having a secret political agenda.

In 2004, Eastwood made a movie called Million Dollar Baby, about a woman who believes her sole purpose in life is to be a professional boxer.  (Warning:  Massive spoilers ahead!)  When she is sucker-punched by an opponent and left paralyzed below the neck, she decides she has no further reason to live, and pleads with her trainer and only friend (played by Eastwood) to unplug her life support and allow her to die.  After a period of struggle and a talk with a local priest, he does just that.

Because both the boxer (played by Hilary Swank) and the trainer are shown as sympathetic characters, the movie was considered by some to be an “endorsement” of assisted suicide, leading to a brief but intense national debate about both the movie and the issue itself.  How irresponsible, many said, for a serious film to portray assisted suicide in a sort-of positive light.

Against this wave of condemnation, the critic Roger Ebert offered the following rebuke:  “Million Dollar Baby raises fundamental moral issues.  At a moment of crisis, the characters arrive at a decision.  I do not agree with their decision.  But here is the crucial point:  I do believe that these characters would do what they do in this film.  It is entirely consistent with who they are and everything we have come to know about them.”

In other words, movies are about individuals, not causes.  Directors are free to have their characters behave however is natural to them, and it is up to us, the audience, to make moral judgments.  In all cases, however, we should understand such behavior as being specific to those characters—Chris Kyle included—and not infer it to be representative of any larger philosophy of life.

The problem, though, is that we just can’t help ourselves.  As Taibbi points out vis-à-vis American Sniper, movies do not exist in a vacuum.  It’s silly and naïve to think otherwise.

The truth is that everything is political, whether we realize it or not.  Politics is life.  The word itself, in its original Greek form, means “relating to citizens,” meaning every one of us is on the hook.  So long as you’re alive and occasionally leave the house to interact with the rest of humanity (tiresome as it can often be), you are engaging in the art of politics.

As such, once a movie presumes to be about anything at all, it becomes a political document, and we are free—encouraged, even—to wade through any possible larger meanings it might hold, whatever the director’s intent.

In the current Oscar race, for instance, the big pre-Sniper controversy concerned whether Ava DuVernay’s Selma is unfair in its depiction of Lyndon Johnson circa 1965.  Because it’s about the Civil Rights Movement and its present-day parallels, Selma is political to its core.  (Its theme song, “Glory,” even includes a reference to Ferguson, Mo.)

However, take a deeper look and you’ll find politics intruding upon every last entry on the Oscar shortlist.

Richard Linklater’s Boyhood is ostensibly about nothing more than the experience of growing up in 21st century America.  But it’s also about single mothers, deadbeat fathers, alcoholic stepfathers, inspiring teachers, and the virtues of hard work.  Do you mean to tell me those are not political issues?  Open a newspaper:  When have they ever not been on the front page?

Alejandro González Iñárritu’s Birdman is a dark comedy about a washed-up movie star attempting to resurrect himself by putting on a Broadway show.  As such, it’s also about the nature of celebrity and fame, the integrity of art and (again) what it means to be a good father.  All political matters.

Damien Chazelle’s Whiplash is about an aspiring drummer and the terrifying music teacher who pushes him to within an inch of his life.  Which means it’s really about the costs of ambition and the lengths that some people will go to achieve greatness and immortality.  In America’s hyper-competitive culture, coupled with our meritocratic national work ethic, what could be more political than that?

And so forth.

So when people say that American Sniper is not a political movie, they are wrong twice.  First, in the view that the movie has no opinion about America’s adventures in Iraq (to wit:  could it really be a coincidence that the one soldier who questions the war’s purpose ends up getting shot in the face?).  And second, in the implication that a Navy sniper’s psychological profile has no political dimension.  As if killing Arabs for a living were a purely personal matter.

Indeed, if American Sniper were truly non-political, we would not be arguing about it at all.  We wouldn’t need to.  And what a boring, worthless movie it would be.

No, the film’s inherent relevance to our national conversation about foreign affairs is what makes it so valuable and compelling.

This doesn’t mean it isn’t a deeply personal story as well.  Of course it is.  That’s what the cliché “the personal is political” is all about.  Chris Kyle’s suffering is real, but it has a context that we must acknowledge in order for it to make any sense.  We can only heed Eastwood’s central message—America must take better care of its veterans—by understanding what makes their return to civilian life so difficult.  There’s no way to do that without returning, sooner or later, to the policies that sent people like Kyle to Iraq in the first place.

Eastwood has been rightly criticized for simplifying this narrative into an old-fashioned “good guys vs. bad guys” story (every Iraqi in the film serves no purpose except to be killed), but that doesn’t mean the rest of us should follow his lead.  Judging from the contentious response it has garnered thus far, we haven’t.  However unwittingly, American Sniper has re-ignited one of the most important debates in contemporary American life—namely, have the past dozen years of U.S. foreign policy been one giant dead end?

To that extent, the movie has served a useful purpose.  Through the profile of one person, however lionized, it has provoked people to think more seriously about a subject of global importance they would just as soon ignore.

Not bad for a movie that isn’t interested in politics.

Sex, Lies and Politics

In the life of an observer of politics and government, nothing is quite so gratifying as having long-held suspicions about a particular figure—repeatedly denied by that figure—be confirmed as 100 percent true.  In a world of shady wheeling and dealing, in which one is compelled to question all things at all times, it is reassuring to discover that, at least this once, you are not crazy.

The Obama administration had already been suffering considerable whiplash from this phenomenon thanks to the events of the past week—in particular, the disclosure that the NSA had tapped the German chancellor’s cell phone, and that President Obama’s longstanding promise, “If you like your health care plan, you can keep it,” was a flat-out lie.

But then on Friday came the icing on the revelatory cake:  Excerpts obtained by the New York Times from a forthcoming book, Double Down, which recounts the 2012 presidential race from behind the not-quite-iron curtain.

Written by Mark Halperin and John Heilemann, the dynamic duo behind the 2008 campaign chronicle Game Change, the new book reportedly contains all manner of juicy tidbits to intrigue and/or outrage anyone with even a passing interest in presidential politics.

Chief among these news flashes—that is, the one receiving the most copy—is the Obama team’s flirtation with replacing Joe Biden with Hillary Clinton on the 2012 Democratic ticket, of which there were indeed whisperings by reporters (and denials by the administration) at the time.

Far more compelling to this blogger, however, is the all-but-passing reference much deeper into the Times story to Obama’s annoyance with Biden for spontaneously declaring his support for gay marriage on NBC in the spring of 2012, thereby “pre-empting the president’s own poll-tested plans to announce what the book indicates was a position he had held as early as 2004.”

Say what?

You mean to suggest the president who, during his Senate run in 2004 and his first presidential run in 2008, unequivocally asserted that, to him, marriage is only between a man and a woman was, in fact, a secret supporter of same-sex unions all along?

Knock me over with a feather.

The truth is, there is nothing at all surprising about this apparent acknowledgment that the president’s so-called “evolution” on the question of marriage rights during his first term was complete nonsense.  He was in favor of it the whole time.

How do I know this?  Because I can read.

Never mind 2004.  It was in February 1996 that Obama, then an Illinois State Senate candidate, wrote in a questionnaire, “I favor legalizing same-sex marriages, and would fight efforts to prohibit such marriages.”

Doesn’t get much clearer than that, does it?

What this always meant, when combined with his later opposition to the same, is that either he was lying in 1996, he was lying in 2004 and thereafter, or he was the only person in America who became less accepting of gay marriage during that period.

Now that we know, thanks to Halperin and Heilemann’s reportage, that scenario No. 2 is the correct one, we can return to the well-trod set of questions on this subject with a newfound sense of urgency and, dare I say, anger.

To wit:  If he really did support the idea of gay marriage in 2004, why in holy heck did he drag his feet for a full eight years before finally saying so?

I know, I know:  Political pragmatism demanded it.  Advocating for gay marriage in 2004 was nowhere near as trendy as it is today—far less than half the country was in favor—and Obama, along with most other Democrats, was not prepared to gamble his entire career on this single issue.

Wouldn’t it be nice to live in a world where we didn’t need to keep telling ourselves this?  Where so-called “audacious” figures like Obama didn’t feel the need to suppress views that might be unpopular, and instead ran the risk of trying to make them popular through good old-fashioned politicking?

But of course we do have such principled gadflies in our system.  They have names like Ron Paul, Dennis Kucinich and Ralph Nader.  They run for president all the time, and those who share their views are given every possible opportunity to elevate them to higher office.

What we really need, then, is a collection of folks with the intellectual courage of a Kucinich combined with the persuasive bravado of a Clinton.  People with the nerve to lead on a supposedly radioactive issue without worrying that others won’t follow.

Perhaps they will.  You never know until you try.

Third Party Plight

Barack Obama and Mitt Romney had their second debate this past Tuesday, but they are not the only people running for president this year.

Gary Johnson, former New Mexico governor, is running on the Libertarian Party line.  Jill Stein, physician and former Massachusetts gubernatorial candidate, is this year’s nominee of the Green Party.  And Virgil Goode, former Virginia congressman, carries the banner for the Constitution Party.

But those are just the candidates who have managed to stencil their names onto the ballot in a majority of states.  There’s also the Objectivist Party, founded on the teachings of Ayn Rand.  There’s the Justice Party, the Prohibition Party, the Modern Whig Party, at least five different outfits with “Socialism” in the title, and also the Peace and Freedom Party, represented by none other than Roseanne Barr.

We could go on, but things might start to get silly.

In spite of the paragraph I just wrote—and in spite of recent history—so-called third parties have played a real and sometimes significant role in shaping American politics.  To voters under 30, this impact begins and ends with Ralph Nader and his alleged “spoiling” of the 2000 election for Al Gore in Florida—a tenuous claim, at best.  This is a shame, because it clouds a much more colorful history of various rogue candidates and their disruptions of our otherwise two-party system.

In 1992, for instance, independent candidate H. Ross Perot caused enough of a stir not only to garner nearly 20 million votes nationwide, but actually to land himself in all three presidential debates—even being declared the “winner” of them by the public and media alike.

Four score prior, Theodore Roosevelt invented a new party, the Progressives (known to history as the Bull Moose Party), to challenge Republican incumbent William Howard Taft and Democrat Woodrow Wilson.  Although Roosevelt lost, he did so by splitting the Republican Party—“spoiling” it for Taft, as it were—ceding Wilson a plurality of the vote that he might otherwise not have received.  We can hardly picture world history between 1912 and 1920 without a President Wilson, and it was a third party that made it happen.

Today, for those of us who do not identify with either the Democrats or Republicans, third parties—individually and collectively—represent one tragic, massive tease.

Contemporary third parties exist, after all, on the very assumption that the two Goliaths we have do not encompass the views and concerns of all citizens of these United States.  Gary Johnson speaks about ending the drug war, as Obama and Romney do not.  Jill Stein advocates cutting the defense budget by 50 percent, as Obama and Romney do not.  Ron Paul—perennially pushed, but ultimately resistant, to break from the GOP—would abolish the Education Department and the CIA, as Obama and Romney most definitely would not.

Conceivably—since the country is divided roughly three ways—an organized, independent third party could pull a TR or better, and perhaps even win a plurality of the vote, rather than simply diluting it amongst the powers that be.

To wit:  A 2010 Gallup poll found 31 percent of Americans identify as Democrats, 29 percent as Republicans, and 38 percent as independents.  That is an awfully large pool of proverbial men and women without a country.

The short answer to “Why don’t third party candidates win?” is easy enough:  We, the 38 percent, are no more in agreement about any particular issue than anyone else—except, I suppose, for the issue of not identifying as Democrats or Republicans.

People have justified figures such as Nader as vehicles for a “protest vote,” and this alludes to the tragedy of the whole business:  Independent voters who detest their two real choices are left with no practical alternative—just a symbolic opting out of the whole system.

What is more, there are institutional mechanisms currently in place that are designed to prevent a serious third party from taking hold in our system, and if you don’t work within the system, you exert no influence whatever.

Except when you do.

I do not expect a third party candidate to be elected president in my lifetime.  Those who run with that expectation are either delusional or pulling your leg.

But it is equally delusional to say third parties are a waste of our time.  At their best, they serve as lobbyists for the people, agitating for causes that never get aired by America’s two partisan wings, but are every bit as important, if not more so, than those that do.  These troublesome gadflies deserve all the support we can muster for them.  To reassert an old cliché:  Do not let the perfect become the enemy of the good.

A Case For Romney

The political media sphere can be such an echo chamber of cliché and conventional wisdom that it comes as a special treat whenever a piece of analysis escapes from it that actually makes one pause and think.

One such truffle from the 2008 presidential race holds particular interest for us today.  Matt Taibbi, the renegade scribbler for Rolling Stone, speaking with Keith Olbermann about John McCain’s many policy oscillations, offered the following perspective:

“The worst thing about George Bush was that he had convictions.  It was the things he actually believed in that got us into the most trouble.  John McCain is a guy […] who will change his mind at the drop of a hat.  He’s a cynic, as opposed to a true believer.  In these times, I’ll take the cynic.”

It is ironic, in retrospect, that McCain’s most formidable primary opponent was Mitt Romney, who today is rewriting the book on not letting conviction get in the way of winning the damn election.

At this late date, it is simply a fact that Romney is prepared to finesse, alter or outright negate his public views about virtually every issue in the electoral bloodstream, if doing so might increase his chance of being elected president of these United States.

So we are led, inevitably, to the $64,000 question:  What happens when he actually becomes president?  Will he finally stick to a set of “core beliefs”—if so, which ones?—or, rather, will his term be ideologically neutral, guided purely by practicalities?

Now that we are tasked to take the prospect of a Romney victory seriously, we are equally compelled to entertain the possibility that his nature as a no-looking-back flip-flopper is a good thing.  It just might be.

Taibbi’s point about President Bush was largely about Iraq:  If Bush had not been so ideologically hell-bent on “staying the course,” the reasoning goes, then he would have more clearly seen how badly the war was going and made smarter, more practical decisions to rectify it.  Bush’s certainty of the inherent goodness of the United States’ intervention in Iraq blinded him to the bloody, bloody consequences.

Mitt Romney, for all his lack of foreign policy experience, is a much smarter and more pragmatic man than Bush.  It is very difficult to picture Romney plowing ahead with a particular strategy if all the evidence shows it to be a failure.  Romney’s reputation in the business world suggests nothing so much as an utter lack of tolerance for inefficiency, particularly if it makes Romney, the boss, look bad.

On foreign affairs, then, we might welcome a leader whose views will likely be contingent on the facts on the ground—who has nothing in particular to prove, other than his own competence.

Not that Romney’s stated views on the subject could be described as timid.  Speaking earlier this month at the Virginia Military Institute, he asserted boldly, “[I]t is the responsibility of our president to use America’s great power to shape history—not to lead from behind.”  This would suggest a foreign policy much closer to the eventual Bush doctrine that promised “ending tyranny in our world,” rather than Bush’s initial promise in 2000 to preside over “a humble nation.”

Nonetheless, Romney speaks of American power in a more inward fashion.  Where Bush’s concern was ostensibly with oppressed citizens of foreign nations yearning to be free, Romney’s focus is more self-serving:  America should assert its military might for its own sake, and not necessarily to uphold some larger ideal.  Those are not the words of a man prepared to be bogged down in any particular foreign hellhole for a decade or more.  Where is the profit in that?

Naturally, this is all speculative.  One of the many lessons from George W. Bush was that a man’s world outlook can change rather dramatically between being a candidate and being leader of the free world.  Further, we have not broached the question of how a lack of ideological conviction might translate on the domestic front, which is no small concern.  Nor have we factored in the residual force of the Tea Party to create mischief against Republicans and Democrats alike.

What we know we have, in any case, is the latest in a long line of Oval Office suitors who believe a good business sense is just what America needs.  In Romney’s case, this would seem to require a degree of non-ideological thinking, which can be a very useful quality in a leader.  A ruthless eye for the bottom line knows no partisan loyalty, and if ruthless efficiency is indeed Romney’s true nature—if he can be said to have a true nature—then he would do the electorate and himself a great deal of good simply to admit it.