We Will Survive

Given the choice, which of the following would be the more unnerving prospect:  That Donald Trump becomes president and effectively destroys the entire world order, or that Trump becomes president and does a perfectly decent job?

Over the past year, we Trump skeptics have spent so much time imagining the catastrophic consequences of a theoretical Trump presidency that it has barely crossed our minds that he just might be up to the task—or, more precisely, that our fears of what his leadership style would mean for the future have been unduly exaggerated.  That upon becoming the most powerful person on Earth, Trump might finally come to his senses and behave in a more cautious, dignified manner on the world stage.

Admittedly, the reason none of us has entertained this notion is that everything Trump has ever said and done has indicated the exact opposite.  Whether through his infantile personality, his lack of basic knowledge about policy or his propensity for flying into a tizzy whenever anyone calls him out—especially if that person is a woman—Trump has made it impossible for any reasonable observer to give him the benefit of the doubt:  The preponderance of the evidence suggests a disaster in the making.

But what if we’re wrong?  What if Trump surprises us by proving himself a competent, solid leader who manages America’s foreign and domestic affairs with grace, fortitude and good humor?  What if he lays aside his rougher edges and characteristic bile and somehow wills himself into an able statesman?

Or—if that scenario seems too outlandish—suppose he abandons some of his baser instincts in the Oval Office and muddles through four years of minor accomplishments and periodic setbacks, amounting to a presidency that, while hardly great, is finally regarded as a respectable effort and a mere blip in the ongoing saga of republican governance?

Indeed, the prospect of a boring, so-so performance from this man seems to be the one eventuality that both Trump’s fans and haters neither want nor expect—perhaps because it simply doesn’t compute that such an explosive character could possibly be middling.  In the hysterical environment in which we live, today’s electorate is convinced that a President Trump would be either a towering success or a catastrophic failure.  (We should add that, given the differing values of these two camps, it’s possible that, four years hence, both will claim to have been correct.)

And yet, if history teaches us anything, it’s that the U.S. presidency is a fundamentally stable and moderating institution—strong enough to endure even the likes of one Donald Trump.

Taking a cursory view of all U.S. presidents to date, we find that a small handful were truly great, an equally small handful were truly terrible, while the remaining several dozen landed in the giant chasm in between.

What we find in all cases, however, is that not a single one of those 43 men has caused the American republic to collapse or the entire planet to explode—i.e. the two things that half the country more or less assumes will happen under a President Trump.

Whether the presiding administration engaged in open bribery (e.g. Grant and Harding), imperial overreach (Johnson and Bush), nuclear hot potato (Truman and Kennedy) or domestic genocide (Andrew effing Jackson), the country itself managed to endure—both while and after such dangerous men stood at the helm.  To date, no chief executive (try as they might) has succeeded in fully negating the principles of the Constitution.

(For our purposes, we’ll allow that the Civil War—the closest America ever came to disintegrating—was the culmination of a 73-year-old argument as to what those principles actually were, and was not the fault of a single leader.)

The short explanation for our system’s remarkable buoyancy is that the Founding Fathers hit the jackpot by dividing the federal government into three equal branches, with a bicameral legislature and a Supreme Court acting as checks on executive power.  This way, whenever the president does go too far, the remaining branches are empowered to rein him in and/or throw him out until Constitutional equilibrium is restored.  While this arrangement has never operated flawlessly and the power of the presidency has grown with each passing administration, it has worked just well enough to keep things chugging along.

Now, it’s possible that the United States has merely experienced 229 consecutive years of dumb luck and that Trump is now the right guy at the right time to give the Constitution that one final nudge over the cliff.  He certainly professes to care not a whit about the separation of powers, and we have every obligation to take him at his word.

Or rather, we don’t, because when has Trump’s word ever meant anything?

Don’t forget the one thing about Trump that we know for sure:  Whatever he says today has no bearing on what he might say tomorrow.  On matters related to policy and governing, he plainly doesn’t have a clue what he’s talking about and, when asked a direct question, he reflexively spits out the first thought that pops into his head, no matter how incompatible it might be with all his previous statements on the issue—including, in some cases, what he said just a sentence or two earlier.

Nope.  It’s like we’ve been saying for months now:  Trump is the world’s most transparent con man whose only instinct is to say and do whatever he thinks will induce others to bend to his will.  Like every avaricious, status-obsessed windbag before him, he cares nothing for the public good except for how it might enrich him personally.

But here’s the thing:  Trump is not the first presidential candidate driven almost exclusively by narcissism and greed, nor would he be the first commander-in-chief bereft of a basic sense of right and wrong.

These are hardly attractive qualities in a leader of the free world, but they are not—in and of themselves—a hindrance to a competent and fruitful presidency, and even failed presidents can do genuinely good things.  Consider, for instance, that although Richard Nixon gave the world Watergate and four decades of cynicism about public officials, he still found time to open China and establish the EPA.  Or that while George W. Bush was unwittingly fostering a terrorist breeding ground in the Middle East, he was simultaneously funneling billions of dollars to disease-ravaged countries in Africa, reportedly saving over one million lives and counting.

Long story short (too late?):  Just as Trump himself should quit being so inanely confident about his ability to foster a magical new American Eden, so should we dial back our own assumptions that, if given the chance, he would fail in a million different ways—or worse, that he would “succeed” in the most frightening possible sense.

It’s not that Trump has shown any real propensity for intellectual growth (he hasn’t), or that his whole candidacy has been an elaborate performance masking a much more serious and learned man (if so, he hides it well).

Rather, it’s that the presidency—that most peculiar of institutions—has a way of scrambling the expectations of every person who enters into it and every citizen who observes the machinations therein.  Like no other job on Earth, it has a way of turning great men into cowards and mediocrities into legends.

The truth is that we can’t know what kind of president someone will be until it’s too late to stop them.  With Trump—arguably the most erratic person to have sought the job in any of our lifetimes—this uncomfortable fact becomes all the more self-evident.  If we agree that he is inherently unpredictable, we must allow for the possibility that, once in office, he will do things that we have thus far failed to predict, and that we just might be pleasantly surprised by the results.

Could Be Worse

Is this the worst presidential election in history?  Only if you know nothing about history.

Would Donald Trump be the worst president of all time?  Maybe, but I certainly wouldn’t bet the house on it.

Over the last month or so—as the precise character of the 2016 election has taken form—there has been an endless parade of rhetoric in favor of one or both of the aforementioned claims.  An assumed general election match-up between Hillary Clinton and Donald Trump, we have been told, represents the most wretched choice the American public has faced in many moons—if not ever—and should the presumptive Republican nominee emerge victorious on November 8, it would likely portend the fall of Western civilization as we know it.

The latter, of course, is a matter of pure speculation until the unspeakable actually occurs.  As for the former:  Don’t be ridiculous.  The championship match of the 2016 campaign—uninspiring as it might be—hardly represents a nadir in the history of U.S. presidential elections.  That we have convinced ourselves otherwise is less a product of our increasingly lackluster candidates than of our unjustifiably heightened expectations thereof.

As with so much else, we in the present like to think we live in exceptional times when it comes to presidential politics—in this case, exceptionally bad—and that our generation of voters deserves a heap of pity that all previous generations managed to avoid.

What ignorant hogwash this is.

All the way back in 1905, in writing about politics in the 1870s, historian Henry Adams dryly observed, “The progress of evolution from President Washington to President Grant, was alone evidence enough to upset Darwin.”  Ninety-nine years later, surveying the 2004 contest between George W. Bush and John Kerry, comedian Lewis Black similarly quipped, “If this is evolution in terms of leadership, I think in 12 years we’re gonna be voting for plants.”

Whether or not Black’s assessment has proved correct (those dozen years are now officially up), it is clear enough that the perception of having a lackluster roster of potential presidents is the oldest, stalest gripe in the book.  Contrary to popular belief, Americans have always been disappointed by the people who’ve run to lead them, viewing every election as a search for—well, if not the lesser evil, then at least the person with a fighting chance of not making life any more miserable than it already is.

We elect a commander-in-chief for all sorts of reasons and under many different circumstances, but it’s always with the subtext—spoken or unspoken—that the available options are hardly the best specimens America has to offer.

We often compare our present-day leaders (unfavorably) to those of the founding generation, viewing the latter as the high water mark for intellectual and political brilliance in the history of this or any country.  Indeed, that’s exactly what they were, and the sad truth is that the Founding Fathers were simply the exception to the rule:  George Washington, for his part, didn’t even want to be president except to avert a possible civil war, while his next five successors got the job on the strength of having directly contributed to the American Revolution—a prerequisite that, by definition, could not be reproduced in any other era.

Following the populist upheaval fostered by one Andrew Jackson—a man of extraordinary physical courage who, if he ran today, would be roundly dismissed as a raging psychopath—Americans elected a series of chief executives of such immense mediocrity—often against equally humdrum opponents—that few citizens today could even recite their names.  Much the same was the case between the end of the Civil War and the rise of Teddy Roosevelt in 1901.  Indeed, one could argue that, except for Abraham Lincoln, the final two-thirds of the 19th century were one giant muddle of executive leadership to which no one would want to return.

And yet return we have, over and over again.  For all our insistence that only the best and the brightest seek and execute the most powerful post on planet Earth, we have spent most of American history dealing with the uncomfortable truth that, on the whole—and especially now—the most brilliant people in the country are not interested in running for elected office:  They’re busy curing diseases, inventing self-driving cars, building skyscrapers and writing Broadway musicals.

Even before presidential campaigns became the insane carnival acts they are today, the incentive for dedicating one’s talents to the public good had been in decline since practically the dawn of the republic, which means that the slack will inevitably be picked up by aspirants who are intellectually and morally inferior.

To be sure, this doesn’t mean that the occasional prodigy hasn’t slipped through.  We have, after all, just enjoyed two terms with a president who—imperfections notwithstanding—is an exceptionally deep thinker, capable of speaking and writing elegantly, clearly and at length, and (to paraphrase F. Scott Fitzgerald) adept at holding opposing ideas in his head without losing the ability to function.

If today’s voters—particularly those under 30—feel short-changed by this year’s options, it’s largely a consequence of having been spoiled by arguably the hippest man who has ever held this job.  Indeed, for every voting American born after 1986, this will be the first presidential election in which Barack Obama’s name will not be on the ballot.  (Were it not for the 22nd Amendment, one suspects 2016 would’ve gone very differently, indeed.)

And yet, for those very voters, it would be difficult—on paper, at least—to craft a more agreeable and logical successor than someone who spent eight years in the White House, eight years in the Senate and four years in the State Department and who, by the way, is openly campaigning for Obama’s third term and possesses a mastery of policy nuance light years ahead of Obama’s at this point in his 2008 campaign.

This is not to say that Hillary Clinton is perfect or that someone who is qualified on paper is also qualified in real life.  Having been a nationally known figure for nearly a quarter-century, Clinton carries clear shortcomings and asterisks, and her ascendancy would be a calculated risk on all of our parts.

In other words, Hillary is flawed.  But you know which other presidential candidates have been flawed?  All of them.

In truth, every person who has ever sought the Oval Office has been ill-qualified to one degree or another.  That’s the nature of the gig.  The fantastical notion of an Ideal Candidate—someone with the right skills at the right time—has rarely been borne out by history, and we have little reason to expect such a thing in the near future.

Contra Trump, Clinton is as solid a candidate as you would expect our system to produce:  Wily, intelligent, hardworking, compassionate, compromising, gritty and ruthless.  If you expect more than that in a modern-day commander-in-chief, you just may expect too much.

The Battle of Princeton

This is what happens when you name buildings after human beings.

In this week’s edition of White People Discover Their Heroes Were Racist Thugs, students at Princeton University have demanded that the school disassociate itself from Woodrow Wilson, a man who served as Princeton’s president for eight years before going into politics.

Specifically, today’s protesters want Wilson’s name and likeness removed from all campus buildings, including the Woodrow Wilson School of Public and International Affairs.

The basis of this demand is fairly straightforward:  An objective appraisal of the record makes clear that Woodrow Wilson was, in fact, an unreconstructed white supremacist.  A man who openly viewed black people as inferior to white people and who, upon becoming president, ran an executive branch that turned this view into official policy, most damningly through the re-segregation of various government offices and facilities.  (When Boston newspaperman Monroe Trotter brought a delegation to the White House to protest, Wilson informed them, “Segregation is not humiliating, but a benefit, and ought to be so regarded by you gentlemen.”)

Facts like these did not suddenly come to light last week.  Wilson’s bald racism has been well-documented for eons, available to anyone who cared to look it up.

The problem, then, has been twofold.  First, up until now, few people have cared to educate themselves on the more shameful aspects of our 28th president’s life, both personally and politically.  Second, and more disturbingly, many other folks have known the ugly truth about Wilson all this time but haven’t summoned the strength to be appalled by it.  They simply accept his racist tendencies as a function of the era to which he belonged, then promptly shrug and move on.

The real scandal is how we Americans have allowed ourselves to get away with this for so very long.  How our historical assessments of Wilson’s presidency have focused (appropriately) on his handling of World War I abroad and progressive politics at home, but rarely, if ever, on his handling of race relations a full half-century after Reconstruction began.  As one expert after another has said, Wilson didn’t merely sustain the principle of white supremacy:  he actively made it worse.

Not that I speak from a position of moral superiority.  While I’ve never particularly been a fan of Wilson’s—in college, I wrote a paper detailing his sinister power grabs during the war—my previous understanding of his racism had been limited to his open-arms embrace of D.W. Griffith’s The Birth of a Nation, the film that lionized the Ku Klux Klan and employed white actors in blackface.  Disturbing as that was, it never led me to dig deeper.  It didn’t occur to me that the president of the United States, five decades after the Civil War, would harbor such bigoted views toward black people and not even bother to hide them.

I’m not quite that naïve today, nor—thankfully—is an increasing chunk of the country as a whole.

Thanks most recently to the Black Lives Matter movement and its supporters, Americans are no longer permitted to sweep racial prejudice under the rug without one heck of a fight.  While BLM’s focus is on racism in the present, racism in the past has inevitably factored into the argument.  Further, while racial prejudice can often take subtle or even unintentional forms, Wilson’s was neither.  On the race issue, he was simply a scoundrel.  The only question is why it took us so long to acknowledge it.

However, this does not automatically mean we should strike him from the record entirely.  Or, in this case, scrub his name from the college he so ably led.

With the entirety of American history planted in the back of our minds, let us consider—if we may—the Slippery Slope.

Demoting Woodrow Wilson in the popular imagination and at Princeton would undoubtedly make us feel a little better about ourselves, because it would send the message that bigoted men should not be honored or immortalized after they die, regardless of whatever good they might have otherwise done.

Indeed, we have already broadcast this message elsewhere with regards to such controversial figures as Andrew Jackson, John C. Calhoun and Robert E. Lee—men who made their success by subjugating entire classes of Americans.  This year’s extended kerfuffle over the Confederate battle flag seemed to kill several birds with one stone.

But where, exactly, should we draw the line?

If we are to shun Wilson on the grounds that he discriminated against black people, what are we to make of George Washington?  I don’t know about you, but I think personally owning 123 black people is a pretty cut-and-dried example of valuing one race over another.  Although Washington privately spoke of his desire for emancipation and allowed for his own slaves’ freedom upon his death, he didn’t do a thing to advance the cause of racial equality while he was alive and the most powerful man in America.

So what’s the difference between him and Wilson?  If the latter doesn’t deserve to have even a school named for him, why should the former continue to be the namesake of our nation’s capital, one U.S. state and a main thoroughfare in every city and town in this country?

Is it because, although both men were white supremacists, Wilson was more of a jerk about it?  Is it because Washington’s accomplishments as a general and president are just too important to overlook, while Wilson’s leadership in World War I is apparently negligible?  Do we consider an 18th century slaveholder to be somehow more forgivable than a 20th century segregationist?  What’s the standard?

It’s safe to assume that we won’t be expunging the existence of George Washington any time soon, and this might help to clarify why the whole concept of moral cleansing can be so problematic.

The truth is that almost every American leader between 1776 and 1865 was complicit in the perpetuation of a racist society, either through direct action or through silence when action might have done some good.  (Abolitionists were the exception, and many of them paid a huge price for their courage.)  If we are to retroactively adopt this zero-tolerance policy toward race-based discrimination, why shouldn’t we be consistent and apply it to everyone who was responsible?

We know why not:  Because people are complicated and inspiring and compromised and good and evil, usually all at the same time.  Everyone has a little more of some traits than others, and we evaluate an individual’s overall character using some mysterious algorithm that depends a great deal on context—i.e. time and place—and our own biases.

So we tell ourselves that George Washington’s plantation, while unfortunate, is not a deal-breaker for his reputation because, hey, he treated his slaves decently enough and, by the way, he did single-handedly save the country from oblivion on more than one occasion.  It’s not that we ignore that he owned slaves; we just don’t condemn it as strongly as we would if he weren’t the father of his country.  Don’t make the perfect the enemy of the good.

Meanwhile, the Woodrow Wilsons of the world—essential as they are—do not carry the same aura of godliness and thus make easier targets for our derision.

But maybe I’m wrong.  Perhaps the posthumous inquisition of Washington is next.  Maybe our righteous zeal to reconcile our country’s shameful history is so strong that it will eventually reach all the way to the top, and we will soon achieve a society in which buildings and institutions are only named for people (or things) who never did anything wrong.

While I would personally have no problem with this result—what’s wrong with naming a school after a tree or a city or the president’s mom?—it would have one unintended consequence:  It would provide us with one fewer mechanism for arguing about the meaning of America.

Let’s be honest:  If a group of Princeton students hadn’t caused such a row about the Wilson School, how many millions of people would never have realized just what a wretched little puke Woodrow Wilson actually was?  However inadvertent on Princeton’s part, the controversy gave rise to education, argument and clarity.  That’s precisely what great universities are for.

As for the question immediately at hand—should Princeton capitulate to the students’ demands?—I defer to the wisdom of Ta-Nehisi Coates who, himself ambivalent about the whole thing, nonetheless imagines how, to a black student, “seeing one’s University celebrate the name of someone who plundered your ancestors—in a country that has yet to acknowledge that plunder—might be slightly disturbing.”

Slightly.

Still Whistling ‘Dixie’

As the United States approaches the 150th anniversary of the end of the Civil War, it has become increasingly common for relics from the old Confederacy to recede from public view.

While there are undoubtedly certain corners of America in which warm feelings toward the slave-owning Deep South still burn, as a general rule, a given locale or organization today has precious little to lose—and often much to gain—from abandoning whatever residual Confederate loyalties it might yet possess.  Particularly when it is under public pressure to do so.

But what happens when the entity in question is so deeply and inextricably tethered to a component of the Confederacy itself that to renounce such ties would be to hollow out its own soul?

It looks like we’re about to find out.

Down in the sleepy Virginia town of Lexington, there lies a small liberal arts college called Washington and Lee.  Founded in 1749, the school assumed George Washington’s name in 1796, following a hefty donation from the man himself.  When the Civil War ended in 1865, the school recruited Robert E. Lee, the former general of the Confederate Army, to be its president.  Lee accepted, and held the post until his death in 1870.

So mighty was Lee’s impact in transforming Washington College into a serious and respected institution of higher learning that the place was swiftly rechristened Washington and Lee University in his honor.

To this day, W&L defines itself by the general’s personal code of conduct from his days as chief administrator.  “We have but one rule here,” said Lee, “and it is that every student must be a gentleman.”  (The school has been co-ed since 1985; today, the women outnumber the men.)

From this credo, W&L maintains an honor system that most American students would find both odd and terrifying, and the result is a university that ranks in the top tier of just about every “best colleges” list and, according to at least one survey, boasts the strongest alumni network in all the United States.

(Full disclosure:  My younger brother is one such alumnus, and, in point of fact, has become as much of a gentleman as anyone I know.)

Against the clear benefits of a university adhering to the values of this particular man, there is at least one equally obvious drawback:  the fact that this same Robert E. Lee spent four years fighting in the defense of slavery in the United States.

Whatever his personal views might have been about America’s peculiar institution—they were complicated, to say the least—Lee functioned as the seceded states’ rebel-in-chief during the climactic months of the war, thereby endorsing the proposition that the holding of human beings as property was a principle worth fighting, dying and killing for.

If a university is prepared to assume the totality of a man’s strengths as part of its core identity, must it not also be prepared to answer for that man’s most unattractive faults—not least when they involve the trafficking and torture of people he would otherwise wish to be educated?  Can this wrinkle in Lee’s makeup really be so easily glossed over?

Forcing such a reckoning is, in so many words, the primary intent of an intriguing new list of demands, submitted last week to the board of trustees by a group of seven W&L law students calling themselves “The Committee.”

To be precise, these stipulations are for the school to remove the Confederate battle flags that adorn the inside of Lee Chapel, where the late general is buried; to prohibit pro-Confederacy groups from demonstrating on school grounds; to suspend classes on Martin Luther King Day; and, perhaps most dramatically, to “issue an official apology for the University’s participation in chattel slavery and a denunciation of Robert E. Lee’s participation in slavery.”

Doth the Committee protest too much?  Does W&L have a moral obligation to the whole story of Robert E. Lee, and not just the bits that serve its interests?

It is critical to note that, in its official policies and practices, the school today cannot credibly be accused of harboring neo-Confederate or anti-black biases.  (In its letter, the Committee refers to “racial discrimination found on our campus,” but does not cite specific examples.)

The town of Lexington, which has historical ties to Stonewall Jackson as well as Lee, naturally contains many citizens who hold such repugnant views, and who sometimes express them through marches or other forms of public demonstration.  However, this is not, as it were, Washington and Lee’s problem.

It is precisely because W&L makes no formal overtures toward the pre-war South’s view of civilization that it could seemingly afford to differentiate its latter-day founding father’s virtues from his vices.  The university’s president, Kenneth P. Ruscio, suggested as much in a magazine article in 2012, writing, “Blindly, superficially and reflexively rushing to [Lee’s] defense is no less an affront to history than blindly, superficially and reflexively attacking him.”

So why not put real muscle behind this plea for historical nuance by acceding to the Committee’s fourth and final demand (if not the first three)?  What does W&L stand to lose by looking reality in the eye and acknowledging a few unpleasant facts?

Wouldn’t that be the gentlemanly thing to do?

Self-Evident Truths

When Harriet Beecher Stowe published Uncle Tom’s Cabin in 1852, the French scribbler Gustave Flaubert critiqued the abolitionist novel by asking, “Is it necessary to utter one’s ideas about slavery?  Show it, that’s enough.”

In other words, the “peculiar institution” that America finally shook off in 1865 is so self-evidently wicked and profane in the eyes of any decent person that to actually say as much is an insult to one’s audience.  To wit:  Presented with a scene in which a man with a whip is standing alongside a man with lash marks across his back, do you really need to be told which one is the bad guy?

It’s the “show, don’t tell” principle at work:  If you have an idea to convey and the idea is true—factually and morally—then heavy-handed commentary is not required.  Provide the dots and trust that your audience is clever enough to connect them.

With Steve McQueen’s extraordinary new film 12 Years a Slave, we have a document about America’s original sin of which Flaubert would likely approve.

The movie, based on an 1853 memoir of the same name by Solomon Northup, does not preach to us about why we should disapprove of the practice of one group of human beings assuming ownership over another.  Nor does it drone on about how such a system ultimately destroys the whole society in which it occurs—slaves and slave masters alike.

It doesn’t need to.  Its depictions of what happened to Northup and others between 1841 and 1853 on a series of Louisiana plantations tell us all we need to know.

There is the sequence, for example, in which Northup is very nearly lynched for the crime of performing a given task with far more skill and efficiency than his master is comfortable with.  Although the lynch mob is called off at the last moment, Northup is left there to hang, his toes just barely reaching the ground, until at long last a compassionate hand arrives to remove the noose from his neck.

Or the moment when Northup’s master, Edwin Epps, nonchalantly rapes his most prized slave, Patsey, then immediately flies into a rage and beats her for—what?  Allowing herself to be raped?

Or the bookend scenes in which Patsey is spotted by her mistress making dolls from corn husks, for which she is later punished in the middle of a dinner party with a whiskey glass to the head.

There is nothing subtle about any of this.  Indeed, 12 Years a Slave fits neatly into the class of narratives that would be dismissed as over-the-top and preposterous were they not based on real events.

But 12 Years a Slave is real.  It’s based on truth, and it is truth.

Therein lies the tension:  If what is depicted by McQueen is so obviously abhorrent, how do we reconcile that it was nonetheless sanctioned by the laws of the United States until a grinding civil war finally brought it to an end?  How do we explain how such a self-evident truth was so violently and knowingly undercut for so long?

The short answer is that we don’t explain it at all, opting instead to comfort ourselves with lies.

Lest we forget, to this day there is a significant chunk of the American public that denies that slavery was the central question of the Civil War.  Particularly in many corners of the South, the conventional view is that it was all a matter of state sovereignty.  That each member of the Confederacy was merely defending its right to regulate itself, rather than be regulated by the federal government in Washington, D.C.

One can easily sympathize with such a sentiment on its face—what business is it of Congress to micromanage the economy of South Carolina?—but to view a film like 12 Years a Slave is to be reminded that, in 1860, the principle of “states’ rights” meant the right of white inhabitants of certain states to own, trade, control and torture black inhabitants of those same states.  Or, in the case of Northup, black inhabitants of free states who were kidnapped and had their freedom revoked.

To scoot all of this under the “states’ rights” umbrella, as if to suggest that federal encroachment on state matters is the moral equivalent of keeping men and women in bondage—well, you will excuse those of us who find the argument not terribly persuasive in the grand scheme of life on Earth.

Some truths are more self-evident than others.

Accidentally on Purpose

Racism.  It never goes out of style, does it?

The pop culture controversy of the week, if you are lucky enough to have missed it thus far, concerns a new ditty called “Accidental Racist,” by Brad Paisley and LL Cool J.

A song about the Deep South by a West Virginian country star and a rapper.  As a Northern Yankee marinated in 1970s rock ‘n’ roll, I of course could hardly resist weighing in on this particular cultural kerfuffle.

The gist of the track in question, which can easily be gleaned from its title, is that racial insensitivity can be inferred without necessarily being implied.  That a man wearing a t-shirt emblazoned with the Confederate flag, as in the song’s opening verse, may be signaling nothing more than his affinity for Lynyrd Skynyrd.  That many race-based arguments spring from simple misunderstandings rather than any deep-seated hatred or distrust.

Of course, this sentiment has not exactly been hailed by anyone as cutting edge.  Critics of the song have dismissed its lyrics as everything from hopelessly naïve to outright appalling—in one verse, it seems to imply that wearing a tacky gold necklace is morally equivalent to slavery—but in the interest of wringing lemonade from lemons, we might nonetheless do well to examine the questions that “Accidental Racist” oh-so-stumblingly broaches.

For one thing:  Is it even possible for a person to be racist by accident, or does this constitute a contradiction in terms?

So long as we are plunging into yet another national conversation on this subject, and so long as “racist” is a term, like “fascist” or “socialist,” that tends to mean whatever its speaker wants it to mean, we should probably agree on a single working definition.

From Merriam-Webster, we find “racism” as simply the “belief that race is the primary determinant of human traits and capacities and that racial differences produce an inherent superiority of a particular race.”

So, can such a phenomenon occur unwittingly?

In a manner of speaking, yes it can.

From time to time, for example, one reads about a house or an elementary school building being spray-painted with swastikas—a so-called “hate crime” if ever there was one—and then it turns out that the perpetrators know nothing about the Nazi Party or the Holocaust it unleashed.  What they know is that this four-armed symbol, so emblazoned in the public square, will create a great uproar, feeding the self-esteem of such hooligans, who care about nothing more than inflating their own egos.

They may well not have an anti-Semitic bone in their bodies.  They are playing the part without even knowing it.

The Confederate flag is a wee bit tougher to excuse as an accident.

The state of affairs is as follows:  Chattel slavery in the United States was conceived, perpetuated, rationalized and defended on the explicit presumption that black people are inherently inferior to white people.  Even Abraham Lincoln, while opposing slavery itself, surmised as much in his famed debates with Stephen Douglas in 1858.

Further, whatever “states’ rights” case one might make regarding the casus belli for the American Civil War, the primary “right” the seceding states were defending was that of continuing to hold human beings as property.

The earliest version of the Confederate flag (the first national flag, not yet the familiar battle flag) was first flown on March 4, 1861—the precise day when Lincoln took office as president, having explicitly vowed to prevent such secession from enduring and branching out.

In short:  The Confederate flag is not merely a “symbol” that is “representative” of an officially racist society.  Rather, it was commissioned as a direct consequence of, and for the express purpose of defending against, the threat to abolish its citizens’ officially racist practices.

To wear a t-shirt bearing such a flag today and not be labeled a racist?  Well, it would require a pronounced lack of familiarity with the information contained in the previous four paragraphs—forgivable in the case of an innocent schoolboy, perhaps, but less so regarding, say, Brad Paisley.

None of this is to say that those who fly the rebel flag from their Jeeps or wear Lynyrd Skynyrd t-shirts (or Lynyrd Skynyrd themselves) are interested in re-introducing chattel slavery and Jim Crow or truly think white people are superior to black people—except, of course, for those who do—but they are not blameless for their image in the minds of non-Southerners as profoundly insensitive to and, in some cases, flagrantly ignorant of the history of their beloved country.

All racism is not created equal, but that hardly excuses its perpetuation, in any form, in the modern world.

Liberty, Agony and Political Reality

The first of January signaled two momentous events in the history of freedom in America.

As roundly acknowledged, the day marked 150 years since President Abraham Lincoln issued the Emancipation Proclamation, freeing the 3.1 million men and women held as slaves in the ten states then considered to be “in rebellion” and not under Union control.

Additionally, this past January 1 was the first day in which same-sex couples in the state of Maryland could legally marry, as stipulated by that state’s Civil Marriage Protection Act, affirmed by Maryland voters on November 6.  Following similar legislation, same-sex marriage began in Maine on December 29 and in Washington on December 6.

Abraham Lincoln, the man said to have been written about more than any person except Jesus and Shakespeare, has long been used as a prism through which to view and debate the great American questions both past and present.

In light of the temporal coincidence I have just noted—the partial emancipation of two historically oppressed groups in the United States—allow me to suggest using Lincoln and his political maneuvering on slavery as a means of understanding where America stands on same-sex marriage in the age of Barack Obama.

Our present president formally endorsed legalization of gay marriage last May, following a period in which his views on the subject were said to be “evolving.”  Naturally, this presidential seal of approval was an exceedingly welcome development for the gay community (and much of the straight community as well), doubly so because of how long and how passionately it had been fought for.

As history will likely forget, within gay circles Obama was widely viewed as dragging his feet on same-sex marriage for the majority of his first term, his public “evolving” act seen as exactly that:  A too-clever-by-half evasion of his genuine views.

What made Obama’s resistance to publicly supporting marriage so painful was not that the reasons behind it were so mysterious, but rather that they were so obvious:  From the opinion polls, it was not yet clear that the country was “ready” for same-sex marriage, and Obama was unwilling to press the issue, expending precious political capital in the process, until he was certain the people would follow.

What we learn from Lincoln is that the Emancipation Proclamation was all about timing.

Lincoln was nothing if not a political animal—a fact that Doris Kearns Goodwin rightly argues “is only to his credit”—and he was not prepared to move on emancipation until he could use it to his political advantage.  In 1862, any and all executive action on slavery had to be seen in the context of winning the war, and the proclamation was nothing if not a savvy war tactic, invigorating the North and demoralizing the South.

That we must regard our most beloved of national heroes as a political strategist first and a moral exemplar second is unavoidable, and also a little sad.

Because Lincoln was fundamentally on the side of emancipation, and was unquestionably the deftest politician of his day—not to mention the most powerful—every abolitionist in the country was effectively tethered to Lincoln’s own timetable as to when emancipation would actually happen.  It was his way or the highway.

For an abolitionist—to say nothing of the enslaved themselves—it was undoubtedly heartening to know the president was on your side, but also positively maddening to contemplate the bind he was in.  Yes, freedom for all would have to wait longer than most were comfortable with, but what other choice did the country have?

Returning to Obama, then, we reflect that his reticence on marriage, and the timing of his eventual endorsement thereof, may well have been the best thing that could have happened to the gay rights movement from the executive branch.

The legacy of Bill Clinton on gay matters, after all, is that the 42nd president pushed the issue too fast, too early—allowing gays in the armed forces was the primary objective of the time—which turned the whole business into something of a calamity, arguably setting the gay movement back several years as a result.

That Obama, in being far more cautious, would become the best friend the gay community has ever had in the Oval Office must simply be accepted as an irony of history and of politics.  It is a state of affairs that is neither fair nor morally ideal, but that doesn’t prevent it from being true.