Continuity with Change

Out there in the über-liberal, anti-Hillary, Bernie Bro corner of the interwebs, the following challenge has been posed:

“Convince me to vote for Hillary Clinton without mentioning Donald Trump.”

As with so much else about the #NeverHillary crowd, it is unclear whether the above is a genuine, good-faith inquiry or just a snarky dig at Clinton’s supporters’ supposed moral bankruptcy.

It’s a rather bizarre question, in any case.  If it’s meant as pure rhetoric—a way of pointing out how the leading justification for Clinton’s presidency is that it would prevent a Trump presidency—then we can take the point while also acknowledging its childish assumption that competing candidates could ever be judged independently of each other—as if choosing one option didn’t also mean rejecting the other.

However, if the question is meant seriously, then it’s just a stupid question.

Can liberals identify reasons to elect Clinton that don’t involve her not being Donald Trump, you ask?  Are there really other liberals who think the answer is “no”?

There are dozens of ways to support Hillary’s candidacy without regard to her Republican opponent.  Many of them are identical to those that led millions of future Bernie Bros to support Barack Obama in 2008 and 2012—and, naturally, many of the same traits also applied to Bernie Sanders during this year’s primaries.  There are also reasons to endorse her that are sui generis, applicable to her and her alone.

Broadly speaking, Hillary is an enthusiastic subscriber to virtually the entire Democratic Party platform—thus, anyone in ideological agreement with Democratic principles is, by definition, in general alignment with Clinton on what we sometimes refer to as “the issues.”

For instance, she would clearly support and defend—and, if we’re lucky, expand and streamline—the Affordable Care Act, aka Obamacare, vetoing every last congressional attempt to kill it once and for all.

She would affirm the recently-established right of any two consenting adults to get married, have children and live happily ever after, while also ensuring those same people cannot be fired or otherwise discriminated against for unconstitutional reasons.

She would continue President Obama’s fight against global warming and his attempts to make the country more energy independent.

She would pledge solidarity with Muslims and other religious minorities against persecution by violent Christian extremists.

She would shape a Supreme Court that would rule favorably on a multitude of issues that liberals care passionately about—voting rights, women’s rights, transgender rights, you name it.

She would try to do something about gun control and—if the stars are aligned just right—maybe even succeed.

In addition to being the first female chief executive, she would appoint a record number of women to her cabinet, not to mention a boatload of ethnic and racial minorities spread throughout the executive branch, thereby inspiring countless young people to consider public service for the first time in their lives.

She would hold meetings and actually listen to what the other people have to say.

She would forge relationships with every last member of Congress, knowing that someday she might need their support for something important.

Long story short, she would essentially be a slightly more mature—but slightly less exciting—version of Barack Obama.  In effect, she would represent Obama’s third term in office, for better and for worse.  That’s the argument for electing her president.  Take it or leave it.

Now, it’s true enough that Clinton herself has never explicitly said, “Vote for me, Obama’s third term!”  However, it doesn’t require a great deal of reading between the lines to grasp the subtext of all of her major policy positions, which can be summed up as, “If you’ve enjoyed life under Obama, you’ll enjoy it under me.”

I realize this is an inherently uninspiring message—a tacit admission that things probably aren’t going to change very much over the next four-to-eight years—but it’s also refreshingly honest and realistic: a means of subtly lowering our expectations to a level at which we might actually want to re-elect her four years hence.

Every president in history has needed to confront the gap between what he thinks he can accomplish and what he can actually accomplish, and Hillary Clinton stands apart from most previous candidates in her deep understanding of this fact.  Among the many differences between her and Donald Trump—a man whom, you’ll note, I haven’t mentioned in quite some time—is that Trump apparently thinks a president can do literally anything he wants, while Clinton knows full well that the job is extraordinarily limiting and depends on a great deal of teamwork to get anything meaningful accomplished.

In 1961, John F. Kennedy intoned to the American people, “Let us begin.”  When Lyndon Johnson succeeded Kennedy in November 1963—albeit under unusual circumstances—he said, “Let us continue.”  That’s the dynamic between Obama and Clinton:  They are so compatible in their basic worldview and value systems that we can expect an exceptionally smooth transition from one to the other (this time without an assassination in between).

I don’t know about you, but I have quite enjoyed the Obama administration.  It has followed through on a plethora of progressive actions that were utterly lacking under George W. Bush, and I can say unequivocally that my own personal corner of America is infinitely better off now than it was eight years ago.  If Obama were eligible to run for a third term, I would vote for him a third time.

But he can’t, so I’ll settle for Hillary instead.

Many Republicans will be familiar with this sense of depleted enthusiasm, since they elected George H.W. Bush in 1988 by pretending he was Ronald Reagan, an incumbent who was term-limited after eight years of making many conservatives’ dreams come true.  In the end, Bush proved a capable but ultimately lackluster follow-up act, keeping some promises while breaking others, and is today admired as much by liberals as by conservatives.

History could easily be in the process of repeating itself on the other side of the ideological spectrum, and that is roughly what we should expect.  Hillary Clinton has drifted to the left on numerous issues as of late, but the intractability of Congress and Clinton’s own cautiousness will surely limit the reach of her administration’s most ambitious goals, resulting in exactly what her most clear-eyed advocates have promised:  Modest, gradual progress through compromise—a variation of Selina Meyer’s campaign slogan in Veep, “Continuity with Change.”

Sounds pretty good to me.

Mr. Know-It-All

Not to date myself, but I am old enough to remember when President Obama’s arrogance was annoying.  When I worried that his occasional lapses into glibness and condescension would diminish the high office he holds and prove counterproductive to his administrative goals.

Then I listened to his improbable interview with Marc Maron—the stand-up comic who hosts a weekly podcast from his garage—and was reminded how, on second thought, the president’s cheerful elitism is among his most endearing personal qualities.  I’ve never for a moment regretted that he was elected in both 2008 and 2012, and one reason is that he can be so gosh darned snarky.

When the Maron interview aired, the media were so blinded by Obama’s employment of the word “nigger” that they neglected to mention anything else that was discussed—not least the segment on race relations that, if actually listened to, would explain why the use of the N-word was entirely appropriate in this case.  (But that’s another story.)

In fact, what stood out in the podcast for me were the bits about good old politics, and the fact that, rhetorically speaking, Obama has officially given up treating his Republican adversaries as sane, rational people with whom he could ever forge a common bond.

Nope, in the twilight of his presidency, with no further elections except the one to choose his successor, he has finally accepted that the GOP leadership in Congress is obstinate, dumb and worthless, and he simply doesn’t have the faith or patience to expect that they’ll ever grow up.

There was the moment, for instance, when Maron shifted the subject to climate change and Obama ruefully recalled how James Inhofe, the Republican chairman of the Environment and Public Works Committee, recently “proved” that global warming is a hoax by bringing a snowball into the Senate chamber.  Or Obama’s more general assertion, “I believe in reason and I believe in facts,” dryly insinuating that his Republican counterparts do not.

Having followed the news over the past six-odd years, I find both statements incontrovertible, and I think Obama has every right to announce this point loud and clear.  That the GOP has functioned as an ideological stone wall since January 2009 is the plain, simple truth, and it’s the president’s duty to speak the truth from time to time.

Then again, this is all coming from an unabashed partisan of his.  I can imagine that, to those who do not share Obama’s worldview, his self-satisfied demeanor is utterly insufferable.  (Not that any imagination is necessary.)  I am reminded of that week when we found out the administration’s overriding foreign policy philosophy is, “Don’t do stupid shit.”  Clever, yes, but also extremely limited insomuch as the definition of “stupid” is not exactly settled science.

The trouble, you see, is that however appealing it is to implore your adversaries to just listen to reason, framing the argument as a clash between logic and illogic only serves to make you look like a jerk.  It alienates your sparring partners instead of engaging them, and the leader of the free world cannot afford to alienate anybody if he expects to get anything done.  History buffs love to reminisce about the good old days when Franklin Roosevelt or Lyndon Johnson could effect major legislation through sheer force of will, but both of those men enjoyed huge Democratic majorities in both houses of Congress—an advantage Obama lacks.

In other words, the present president does not have the luxury to get cocky, or to blow his own trumpet too loudly.  He has to play nice and exercise tact and restraint.  He has to treat his antagonists as smarter than they actually are.  He has to humor them with the prospect that he takes their silly ideas seriously, or at least respects how they think.

However, at this late date, it is clear beyond doubt that he doesn’t and he won’t.  It’s just not in his DNA to suppress his irritation with a Republican Party that values ideological purity over pragmatism, compromise or even basic arithmetic.

Considering all Obama has accomplished in the teeth of that resistance, paired with the basic validity of his public critiques of life on Capitol Hill, I humbly ask:  Are there instances in which arrogance and condescension are not only acceptable, but necessary?

As a rule, arrogance is among the lowest of all human qualities, particularly among public figures.  It is a form of pride—the gravest of the Deadly Sins—inevitably leading to overreach and alienation.  Recent political history suggests as much:  Cockiness did few favors for Mike Bloomberg—a mayor who often portrayed his opponents as not just wrong, but insane—and the one-two punch of Donald Trump and Chris Christie in the GOP primary scuffle speaks for itself (albeit the former more so than the latter).

It just might be that, as so often happens, Obama is the exception to the rule.

Recall that moment in January’s State of the Union when he noted, “I have no more campaigns to run.”  When this yielded a smattering of sarcastic applause, he couldn’t help but add, “I know because I won both of them.”

I can’t imagine any other president getting away with that—let alone trying to—but Obama, through the sheer force of his audacity, somehow made it work.

And what is the magical X factor that allows him to pull this off time and again?  Is it simply the sharpness of his wit?  The twinkle in his eye?  Has his status as America’s First Black President led us to subconsciously give him a pass on certain points of etiquette that—let’s face it—aren’t all that important in the first place?

Or maybe it’s just that, when you pay close attention to precisely what he is arrogant about, you realize that it’s not arrogance at all.  As a certain New Jersey governor would put it, he is simply telling it like it is.

Whose Revolution Is It, Anyway?

When will we know for sure that “Obamacare” is a success?

When Republicans stop calling it “Obamacare.”

I’ve been carting around that joke for a while now.  I’m pretty sure I didn’t come up with it, although I certainly wish I had.

It’s the perfect little joke, because it’s founded on a basic truth about human nature, and about the nature of partisan politics in particular.  No one needs to explain the joke, because everyone understands the dynamics of taking credit and assigning blame.  I refer you to the old proverb, “Success has many fathers, but failure is an orphan.”

Indeed, if we are to devise a general rule of thumb from both the joke and the proverb, it is that the success of a given policy or social movement is directly related to the number of people claiming credit for it.

That is why I am so delighted by the petty squabbling that has broken out in the last week over the legacy of gay marriage.

Here’s what happened.  This past Tuesday saw the release of a new book called Forcing the Spring: Inside the Fight for Marriage Equality.  Written by Jo Becker, a Pulitzer Prize-winning New York Times reporter, the tome purports to be the inside story of the gay marriage movement from 2008 to the present—in particular, the effort to overturn Proposition 8 in California and, in so doing, attempt to bring gay marriage to all 50 states.

The book is told through first-hand accounts of several key players, including now-president of the Human Rights Campaign Chad Griffin, screenwriter Dustin Lance Black, and the famed legal team of Ted Olson and David Boies.

Fair enough, except that according to a veritable avalanche of critics, Forcing the Spring presents these characters and their legal adventures not as simply the most recent (and most fruitful) phase of the struggle for gay marriage rights in America, but as the whole damn story.

(I have not read the book, apart from a few excerpts.)

According to Becker, Griffin et al. were bold revolutionaries—Griffin himself is compared to Rosa Parks on the very first page—who rebelled against a do-nothing gay “establishment” that had effectively driven the cause into a ditch.

As the book would have it (according to these naysayers), nothing that occurred in the struggle for same-sex marriage really, truly mattered until the moment in 2008 when Griffin and like-minded allies made wholesale changes in strategy—legally and rhetorically—that would lead directly to the domino of successes the country has experienced ever since.

Long story short (too late?), the charge against Forcing the Spring is that Becker allows her sources to claim nearly all the credit for the fact that gay marriage is now legal in 17 states and is endorsed by a clear majority of the American public, at the expense of countless others who deserve equal, if not greater, credit for carrying the fight as far as they did.

Among these unacknowledged factors are a legal showdown in Hawaii in the 1990s that set the template for all that would follow; people like Andrew Sullivan and Evan Wolfson, who articulated the now-mainstream arguments for gay marriage decades before they were taken seriously; and the mere fact that, before public support for gay marriage rose from 40 percent to 54 percent between 2008 and 2013, it rose from 27 percent to 46 percent between 1996 and 2007.  (Yes, apparently support dropped six points between 2007 and 2008.)

As the book’s dissenters make plain, to say the anti-Prop 8 crowd is singularly responsible for effecting same-sex marriage, as the book implies, is analogous to crediting the Civil Rights Act of 1964 entirely to Lyndon Johnson, while failing to even mention figures like Martin Luther King, A. Philip Randolph or, indeed, Rosa Parks.

For the most part, this contest over the history of the gay rights movement can be categorized as a family quarrel.  All sides wish to achieve the same ends; they disagree, if at all, only about the means.

What is most encouraging is that this argument is happening at all, because it means that the history of bringing same-sex marriage to America is one that its participants can be proud of.  Those on the struggle’s front lines are falling all over each other to claim responsibility because their efforts have proved successful after a long period of failure, and they feel they deserve their due.  I am positively thrilled that we, as a society, have come this far.

Of course, we still await the moment when we’ll know for sure, and beyond all doubt, that gay marriage is here to stay:  That is, when members of the GOP begin to claim that it was their idea all along.

To Solemnly Forswear

For all the havoc and misery the Kennedy assassination wrought upon the United States 50 years ago last week, it nonetheless yielded one slight, incidental benefit:  The most agreeable swearing-in ceremony in the history of the American presidency.

As you probably know (thanks to a famous photograph), the ceremonial presidential succession on November 22, 1963, occurred in a very crowded cabin aboard Air Force One as it flew the slain president’s corpse from Dallas to Washington, D.C.  Sarah Hughes, a Texas-based federal judge, administered the oath of office to Vice President Lyndon Johnson, who solemnly repeated it back to her and thus formally became the nation’s 36th chief executive.

That was it.  No pomp, no fancy ceremony, no parade, no inaugural balls.  Just a simple affirmation that, yes, the peaceful, orderly transfer of power enshrined in Article II of the U.S. Constitution is still in force, even in the most horrific, disorderly times.

I wonder:  Why can’t the simplicity, dignity and austerity of the Johnson swearing-in be the rule, not the exception?  How I wish that it were.

An item in the Boston Globe over the holiday weekend reported that Marty Walsh, the incoming mayor of the City of Beans, is seeking private contributions of up to $50,000 to fund his January 6 inauguration and its related activities.

According to the article, the event “could be the city’s priciest mayoral bash ever” and will reportedly include an “inaugural gala” and a “private appreciation” for its most generous donors, with the number of tickets allotted to each donor determined by the precise generosity of said donation.  Festivities will also include “events for children and for seniors, and a day of volunteer service.”

Because the full cost of this Walsh-a-palooza will be borne by private entities, be they corporations or individuals, the incoming administration has been made to answer all the usual questions about what these contributors might be getting for their money.

One can hardly be faulted for asking—this is politics, after all—and the situation is made dodgier still by the fact that, as the Globe notes, “Nonprofit inaugural committees are not governed by campaign finance rules and, thus, are not required to disclose donors or hew to limits, and are not required to file paperwork with state campaign finance officials.”

In other words, there is nothing unusual or legally suspect about any of this.  It’s business, and politics, as usual. Austerity be damned—we’re gonna celebrate and it’s gonna be big!

Is this the moral tradeoff for not financing an inauguration bash with public money?  On this Thanksgiving weekend, should we just be grateful our taxpayer dollars are off-limits and not trouble our pretty little heads about what might be going on in the proverbial smoke-filled rooms?

If you answered in the negative to either or both, then you must wonder, as I do, why we bother with these lavish inaugural exercises at all, both at the local level and in Washington, D.C.

Officially speaking, they are completely unnecessary:  Any given transfer of power takes place at the constitutionally-designated time regardless of what anyone does to mark the occasion.  To the extent that the swearing-in itself carries any legal (rather than ceremonial) significance—a subject of constitutional debate at the federal level—no other aspect of the start of one’s term does, and all the rest could be abandoned without any legal ramifications.

Marty Walsh got elected Boston’s mayor, in part, on the basis of his reputation as an honest and decent man.  But one need not be an inherently corrupt person to be corrupted by the political process.  Any large-scale fundraising operation is fraught with the possibility of ethical transgressions.  Why bother initiating such an operation if it serves no real public purpose?

Unfortunately, we know exactly why:  Because when it comes to amassing large sums of cash, the public interest is the first thing to go.

A public official may well profess to care more about the little man than the corporate behemoth, and he may well be telling the truth.  But it doesn’t change the fact that our system, at present, is designed for the opposite to be the case:  A politician has to follow the money whether he wants to or not, because if he doesn’t, he stands to lose the only power with which he could possibly help the little man in the first place.

Granting that such a state of affairs is now irreversible—an arguable point—could we at least make the effort to keep money out of inaugurations, thereby returning them to their more modest roots?  Or is that just too much to forswear?

The Meaning of ‘War’

Here is a trivia question for you:  When was the last time the United States officially declared itself to be at war?

Answer:  June 5, 1942, when Congress passed the last of its World War II declarations of war—against Bulgaria, Hungary and Romania—capping a series that began on December 8, 1941, the day after Japan attacked Pearl Harbor.

That was it.  So far as the official record is concerned, every American military engagement since 1945—Korea, Vietnam, Iraq, Afghanistan, Iraq again—has been strictly off-book.

This is not to say that those conflicts (and plenty more besides) did not really happen, or that the United States has officially been in a state of peace for some 68 years.

Rather, it calls into question what terms like “war” and “peace” mean in the first place.  In point of fact, such definitions have never been clear since the founding of the American republic.

This is no small matter, both in theory and in practice.  As the current scuttlebutt surrounding Syria has reminded us, a great deal hinges on how the United States involves itself in foreign entanglements and, in particular, on who has the final say on whether to do so.

The U.S. Constitution states, in Article I, Section 8, that “Congress shall have power to […] declare War,” but offers no opinion as to precisely what war is or, indeed, what form such a declaration should take.  The president, as commander-in-chief, has the authority to conduct hostilities once they have commenced, but has no explicit license to commence them himself.

As a consequence of the Constitution’s vagueness on this subject, it has been left to subsequent generations to fill in the blanks.

Two weeks ago, when President Barack Obama announced his desire to launch a series of strikes against the Syrian government, he said, “I will seek authorization for the use of force from the American people’s representatives in Congress,” but also that “I believe I have the authority to carry out this military action without specific congressional authorization.”

I could not have been the only one left slightly ill at ease by the contradiction between those two clauses.

The president said that he was taking his Syria case to Congress because he believes that in so doing, “the country will be stronger […] and our actions will be even more effective,” but would it not be more in the spirit of checks and balances if he were actually required to do so?

The fact is that, while the Constitution delegates the power to declare war to the Congress, the ambiguity with which the very notion of war is understood has allowed the executive branch extremely wide latitude on the actual employment of the U.S. Armed Forces.

In practice, chief executive after chief executive has managed to sneak America into a large-scale armed conflict simply by not calling it a war.  While the president himself may well intend a limited military action not to escalate into a full-blown commitment, history has demonstrated a clear pattern of the former giving way to the latter.

In 1964, for instance, Congress passed the Gulf of Tonkin Resolution—based on false information, it turned out—which authorized President Lyndon Johnson “to take all necessary measures to repel any armed attack against the forces of the United States” and “to take all necessary steps, including the use of armed force, to assist any member or protocol state of the Southeast Asia Collective Defense Treaty requesting assistance in defense of its freedom.”

The words “declaration” and “war” did not appear in the resolution, yet the document unmistakably gave the president permission to do whatever the heck he wanted vis-à-vis the conflict in Southeast Asia, which he and his successor, Richard Nixon, unmistakably did.  Their combined policies in and around Vietnam, licensed by the Tonkin Resolution, led to the deaths of some 58,000 American soldiers and several hundred thousand Vietnamese civilians.

If that isn’t war, what is?

And so I humbly ask:  With the prospect of a fresh new American-sponsored military thingamabob in the Middle East, should we not clarify America’s war-making laws once and for all?

Can the president send American troops into harm’s way of his own accord, or not?  If he can, does it not infringe upon Congress’s prerogative to declare war?  Can the president do whatever he wants so long as he does not call it war?  Or is any deployment of the U.S. Armed Forces axiomatically an act of war, period?

Or:  In today’s world, where the United States and others can inflict great damage without proverbial “boots on the ground” and in which violent conflicts are not nearly as linear as they used to be, have these sorts of questions become obsolete?  And if that is the case, of what use is our 226-year-old Constitution in the first place?