A Spot of Revolution

On June 23, the people of the United Kingdom voted narrowly to leave the European Union.  Although the decision occurred 11 days shy of July 4, many of those in favor of this so-called “Brexit” have interpreted it as Britain’s own declaration of independence.  Prominent English politicians Nigel Farage and Boris Johnson have framed it as such, as has noted historian Sarah Palin, who went so far as to draw a straight line from Britons’ rebellion against the EU to the American rebellion of 1776 that we will be duly celebrating six days hence.

While there may indeed be superficial similarities between these two politically seismic events—in this of all weeks, such comparisons are hard to resist—the truth is that we have far more to learn from what makes them different than from what ties them together.

Chief among these differences—or at least the most ironic—is that this new British separation from Europe came about through democratic means and reflects the unambiguous will of the people.  On both counts, the American Revolution most assuredly did not.

It’s an easy thing to forget, but the process by which a free and independent United States of America emerged from the tentacles of an overreaching, overtaxing British Empire was about as far from pure democracy as such an act could be, and was, by all accounts, both undesired and unpopular among the inhabitants of the 13 colonies at the time.

Although formal opinion polls did not exist at the end of the 18th century (too much work for the horses), no less than John Adams estimated that the American public in 1776 was probably divided evenly into three groups:  Patriots, loyalists and fence-sitters.  That’s to say that—annoying as taxation without representation undoubtedly was—only about one-third of ordinary colonists agreed that declaring independence was a good idea.

In other words, the momentous decision by a band of renegades to secede from the world’s mightiest empire—an audacious, treasonous and altogether cataclysmic move—was made in defiance of the wishes of a supermajority of the public at large—a fact made all the more glaring by the Declaration’s pretense of creating a democratic, self-governing society that derived its authority from “the consent of the governed.”

The delegates to the Continental Congress, for their part, were selected by the legislatures of their respective colonies—a vaguely republican system for the time—which then enabled said delegates to do whatever the hell they wanted once they got to Philadelphia.

And that—with very few exceptions—is exactly what they did.  By and large, those who voted for independence in July of 1776 did so from a mixture of personal conviction, horse trading with fellow delegates and a general sense of which way the wind was blowing.  In any case, the so-called “will of the people” never really entered into the equation since, for all intents and purposes, the Continental Congress was the people.  (We need hardly add that the Congress was 100 percent male and 100 percent white.)

It’s not that the Founding Fathers were indifferent to public opinion—as the war heated up, securing popular support became essential to sustaining the Continental Army—but they certainly didn’t consider it legally binding.  In the opening decade of the American republic, the word “democratic” was an epithet that conjured images of mobs and anarchists who reacted to leaders they didn’t like by burning them in effigy.  The aforementioned John Adams went to his grave believing his finest moment as president was to have averted war with France in 1800 despite overwhelming popular support for just such a war.  For Adams, defying the will of the people was the ultimate badge of honor, and hindsight has surely borne him out.

Of course, to marinate in the facts of America’s founding is to reach some extremely ambivalent conclusions about democracy, realizing, as we must, that those men in wigs and puffy shirts got along just fine without it.  In a way, the Founding Fathers ruined it for everyone by being so exceptional:  In the hands of anyone else, the plainly elitist nature of the Continental Congress—and later the Constitutional Convention—would’ve flatly negated the very principles it claimed to stand for and strangled our infant nation before it ever had a chance to breathe.

However, because the founders were so faithful to the cause of liberty and freedom—and not merely to their own self-interests—they somehow managed to negotiate the contradictions their experience in nation-building required and allow future generations to live up to the standards that they themselves did not.

In the Western world today, democracy through popular vote is taken more or less for granted, while major decisions made behind closed doors are looked upon with high skepticism, if not outright contempt.  Yet we cannot ignore the reality, in the U.S. and U.K. alike, that almost every political decision—major and minor—is enacted not by “the people” but rather by representatives of the people who, in the end, behave however they damn well please, assuming—often correctly—that they will never be held to account when and if things go wrong.

In some quarters, this week’s “Brexit” vote has been hailed as a heroic popular revolt against such elitism, while in others it has been seen as a cautionary tale against allowing direct democracy to carry the day.  (Not that these interpretations are necessarily mutually exclusive.)

The million-dollar question, in any case, is whether popular rule is the solution to all conflicts or whether, instead, there are some questions that are simply too important to be decided by the whim of the majority.  In a typically cutting op-ed in Rolling Stone, Matt Taibbi argues for the former, writing, “If you believe there’s such a thing as ‘too much democracy,’ you probably don’t believe in democracy at all.”  Taibbi was responding, in part, to a Boston Globe op-ed by Harvard professor Kenneth Rogoff, who observed, “Since ancient times, philosophers have tried to devise systems to try to balance the strengths of majority rule against the need to ensure that informed parties get a larger say in critical decisions.”

The natural follow-up, then, is who exactly are these “informed parties” and what qualifies them as such?  For that matter, how do we establish which decisions are “critical” and which are less so?

We might agree that some citizens are smarter and wiser than others and that direct democracy is too unwieldy to be exercised on a daily basis, but how do we reconcile these assumptions with the democratic ideal that no citizen’s voice is valued higher than any other?  The short answer—based on some 240 years of experience on this side of the Atlantic—is that we don’t reconcile at all.  We simply learn to live with the contradiction.

For now, we can occupy ourselves with the double irony that, on the question of declaring independence of one form or another, America employed elitism in the service of promoting democracy, while Britain employed democracy in reasserting its identity as a nation that is still technically ruled by a monarch.  Karl Marx famously said history repeats itself “the first time as tragedy, the second time as farce.”  Depending on how the next few months go, “Brexit” may unleash both at the same time.

It Still Doesn’t Fit

I was eight years old when the O.J. Simpson verdict was announced in the fall of 1995.  If I was cognizant enough to see that the trial was a big effing deal, I neither knew nor cared much about why and emerged with the impression merely that some famous ex-football player had been acquitted of a terrible crime that everyone in America knew he had committed.

Today—after more than two decades of human events and two excellent TV miniseries on this very subject—I sense that I have finally—finally!—caught up with the rest of America in understanding what the O.J. “not guilty” verdict truly meant:  Namely, that after 400 years of white people getting away with murder by taking the law into their own hands, it was long past due for black people to do the same—if only to prove, just this once, that they could.

If you grew up—as I did—in an affluent white suburb where racial tension was more urban legend than reality, you might have been forgiven in 1995 for not getting why race—or rather, racism—was the central drama underpinning the double murder trial of one of the most beloved celebrities in America.  Even now—with an additional 21 years of state-sanctioned white supremacy in action—it’s still an open question whether racism was even remotely relevant to the Simpson case and/or whether the “race card” should ever have been played.

Yet when you consider the O.J. fiasco in a broader cultural context—as both of these new TV programs have forced us to do—you begin to grasp the logic of both the defense team’s arguments and the jury’s final decision:  Each was a rebuttal to a criminal justice system designed not to give black (and other non-white) defendants a fair shot.  At some point, the case ceased being about Simpson’s guilt and became a referendum on whether any black man accused of any crime could be treated fairly in a white society patrolled by white policemen, white lawyers, white judges and white juries.

Indeed, in arguably the most electric moment in the new ESPN documentary O.J.: Made in America, we are told point-blank by one of the 12 original jurors—speaking for herself, if not the others—that the “not guilty” verdict was payback for the treatment of Rodney King in 1991.  Since white members of the Los Angeles Police Department had behaved disgracefully and dishonestly in that instance, who’s to say they hadn’t behaved similarly in this one?  Further, since the officers who kicked and clobbered King had gotten off scot-free—thanks, presumably, to an inherently racist system—didn’t the O.J. jury—a panel that was 75 percent black—have a moral imperative to ensure such a thing didn’t happen again?

As we can see, there are really two separate questions at play.  First, should the apparent systemic racism within the LAPD be taken as evidence, in and of itself, that O.J. Simpson might have been framed for murder?  And second, do the accumulated past examples of prejudicial behavior against black defendants justify acquitting one particular black defendant against whom, it would appear, no such prejudice existed?

This is no small distinction.  In practice, there’s a world of difference between treating the LAPD with appropriate skepticism versus proactively punishing it for sins it committed in the past but hasn’t necessarily committed in the present.  You might call it the difference between justice and vengeance, in which case the question becomes:  Can vengeance ever be a form of justice and (while we’re at it) are there scenarios—such as, say, the Simpson case—in which vengeance, in any form, is the correct response to a problem (e.g. institutional white supremacy) that demands a solution one way or another?

Put simply:  If the O.J. jury found Simpson not guilty strictly to avenge every previous defendant who’d gotten screwed by the LAPD—and not, mind you, because they truly thought Simpson was not guilty—could we say that justice had been served?

First things first.  With more than two decades of hindsight at our disposal, it remains as clear as ever that O.J. Simpson definitely killed his ex-wife, Nicole Brown Simpson, and her friend, Ron Goldman.  The preponderance of direct evidence—namely, the trail of blood containing DNA of all three people—paired with circumstantial evidence regarding Simpson’s long history of physical abuse and his want of an alibi on the night of the murders, would be enough for a contemporary jury to find him guilty at least 99 times out of 100.  Indeed, the strength of the DNA evidence alone might well be sufficient to avert a trial altogether and lead, instead, to some kind of plea deal whereby Simpson would claim either momentary insanity or—considering his privileged position in society—an acute case of “affluenza.”

The trouble in 1994-95 was that the general public did not understand the magic of DNA the way we do now.  The jurors, for their part, couldn’t make head or tail of what the pools of blood proved or didn’t prove, and into that vacuum—thanks to Johnnie Cochran and company—was placed the notion that a fanatically racist cop, Mark Fuhrman—the man who found the famous leather glove—had both the motivation and the wherewithal to plant evidence on the fly that made Simpson the only possible culprit.

As it happens, Fuhrman did no such thing.  The defense, for all its insinuations, never demonstrated how such tampering might have occurred—not least because it was physically impossible for Fuhrman to have gotten away with it.  As prosecutor Marcia Clark caustically says in the ESPN documentary, “The only reason I know [Fuhrman] didn’t plant the evidence is because [he] couldn’t have.  Otherwise, I’m with them.”  (“Them” in this case being the entire African-American community.)

Hence the breathtaking irony that defined the entire saga:  Here, a basically corrupt cop was being scapegoated for a case in which—maybe for the first time in his life—he had behaved more or less as he was supposed to.  Add to that the even greater irony that Simpson himself—a man who hated being defined by his blackness—would become a poster child for the tragedy of the black experience in America, and you have the perfect storm of conflict that this case was perhaps destined to become.

If there was a thousand-pound elephant somewhere in the courtroom—a subtext that dare not speak its name—it would’ve been the specter of reparations—the idea that black Americans, as a group, are owed something from white Americans that the latter have every obligation to pay and thus far have not.

While most white Americans think of reparations strictly in terms of slavery—an institution that conveniently ended before any of us could be born and assume responsibility for it—it has lately been definitively argued—perhaps most memorably by Ta-Nehisi Coates in The Atlantic—that any debts owed to African-Americans must also cover such comparatively recent, but no less unjust, phenomena as housing discrimination, voter discrimination, employment discrimination and, naturally, discrimination in the legal and criminal justice systems.

Despite our so-called best efforts, we all know that black and white citizens are not treated equally in all facets of our society—a cursory look at our prison population makes the point plain enough—and that even if we magically resolved all of those inequalities tomorrow, it would not absolve the white folks who have long benefited from this arrangement of responsibility for all the harm they have caused up until now.

Given how intractable the racial justice gap continues to be—how nothing seems to change no matter how much our leaders claim to try—what choice do we humble citizens have but to surreptitiously tip the scales whenever we get the chance?

The O.J. Simpson verdict might’ve been a miscarriage of justice in the strictest sense—after all, it allowed a wife-beater to literally get away with murder—but it was also a signal—a warning, in fact—that there would be real and richly deserved consequences for police departments that didn’t take the concept of “blind justice” seriously.

It was an assertion—however imperfect the circumstances—that black lives matter.

It’s On Us

Early last Sunday morning, a Muslim walked into a gay bar and murdered 49 people because the Christian and Jewish bibles commanded him to do so.

That’s not necessarily how the incident has been reported, but that doesn’t make it any less true.  As any half-literate scholar of the Old Testament knows, the book of Leviticus contains the following injunction:  “If a man has sexual relations with a man as one does with a woman, both of them have done what is detestable.  They are to be put to death; their blood will be on their own heads.”

In other words, according to the Old Testament—which, rumor has it, is the literal word of God—wherever active homosexuality exists, it is the duty of society to snuff it out.  As we know, the Old Testament constitutes the entirety of God’s revelation to Jews, one-half of the same to Christians, and is substantially the basis for the sacred text of Islam.

Accordingly, whenever an individual takes it upon himself to murder gay people because of their sexuality, he is only following orders from the one guy you’re not allowed to disobey.  In so doing, he is guilty merely of taking the Bible literally—as an enormous chunk of Jews, Christians and Muslims are clearly instructed to do, particularly with regard to prohibitions on certain personal behavior.  To this day, virtually every preacher on Earth intones that homosexuality is inherently a sin—and not for any greater reason than “the Bible says so”—and who’s going to argue with the infallible wisdom of God himself?

The man who massacred 49 men and women at a gay nightclub in Orlando certainly didn’t.  Like so many insecure young men before him, he became consumed with hatred for the gay community—inflamed, it has been suggested, by his own suppressed homosexuality—which he then justified and acted upon through the language of religious fundamentalism—language that (to repeat ourselves) is readily available for anyone to use without changing a single word.

In this respect, congressional Republicans are absolutely right that the shooting at Pulse was a function of religious extremism.  The big mystery, however, is why anyone would single out Islam at the expense of all other religions.  While the Quran undoubtedly looks upon homosexuality with contempt, it has merely borrowed ideas originally conceived by Christians and Jews.  As far as prescribed treatment of gay people is concerned, to condemn one monotheism is to condemn them all.

So why are we pretending that one religion is more guilty than the others on this subject?

Politically, the reason Christianity and Judaism are getting a free pass is so obvious we need hardly mention it.  For both demographic and cultural reasons, a U.S. public official cannot say an unkind word about either faith any more than he can boycott the NFL or burn an American flag.  For all the talk about the separation of church and state, we still regard ourselves as a faith-based people guided by so-called “Judeo-Christian values.”

On the whole, Americans view religion—at least their own—as a force for good in society, which becomes problematic when the very dictates of said religion produce unconscionable evil.  Since we cannot bear to think of ourselves as complicit in such behavior as we saw last weekend, we simply deflect blame onto some foreign entity that we can happily (and ludicrously) profess neither to understand nor to know anything about.  Hence the scapegoating of Islam for a disease—homophobia—that is still so prevalent in the country at large that most Republican congressmen can’t even bring themselves to speak its name in public.

The truth is that Jews and Christians who continue to stigmatize gay people are complicit when others take the logic of their arguments to their natural conclusion through acts of extreme violence.  While we non-Muslims comfort ourselves by insisting that our religious figureheads, however anti-gay, do not literally call for homosexuals to be executed—it does, after all, conflict with that business about “thou shalt not kill”—occasionally some self-appointed Christian spokesman will do exactly that, and sometimes major Republican presidential candidates will speak at that person’s conferences, thereby tacitly endorsing such views as legitimate.

So long as a large minority of Americans—enabled by their leaders—continues to treat homosexuals as perpetrators of social unrest, rather than as victims thereof, we cannot guarantee that crazy people won’t continue to go on killing sprees to eradicate what they have been told is an existential threat to civilization.

To be sure, we cannot guarantee such a thing in any event.  Not all hatreds are born of religion, and homophobia in some people is as ineradicable as racism or antisemitism is in others.  Plus—despite what virtually every professional and amateur opinionator has said—we do not know for sure where the Orlando killer got his own hateful ideas (not that we don’t have plenty of material from which to speculate).

Here’s what we do know for sure:  Human beings do not exist in vacuums.  While each of us is ultimately responsible for what we think and how we behave, our thoughts and actions are the product of our environment—the people and places that shape us during our adolescence, as well as those with which we choose to associate once we are old enough to chart our own course.  Just as America’s closet racists have been empowered into action through the rise of Donald Trump, so do closet homophobes find refuge in the rhetoric of anti-gay demagogues who may well not understand the carnage they are allowing to be inflicted on their watch.

As a society, our choice is as follows:  Either we foster an environment in which gay people—and particularly gay relationships—are so thoroughly integrated into mainstream society that even a lunatic will be unable to find a reason to harm them, or we keep our heads in the sand by pretending violence against the gay community is not America’s problem and being shocked—shocked!—whenever a natural-born American citizen proves our assumption wrong.

It may not be in our power to prevent all future atrocities against vulnerable citizens from happening.  What is in our power is to effect a society that—as George Washington famously wrote in 1790—“gives to bigotry no sanction, to persecution no assistance.”

In the meantime, a bit of gun control probably wouldn’t hurt.

Don’t Let the People Decide

In the first decade of the 19th century, the Federalists became the first major American political party to keel over and die.  Led by such luminaries as Alexander Hamilton and John Adams, the party was done in—or rather, it did itself in—largely through internal squabbling and managerial incompetence.

At the heart of this disintegration, however, was the Federalists’ increasingly unpopular theory about government, which argued—in a nutshell—that America ought to be run by a select group of intellectual elites—a “natural aristocracy,” as it were—who were smarter, wiser and nobler than the public at large.  They viewed ordinary citizens as an unsophisticated “mob” prone to irrational, violent outbursts, whose opinions, therefore, should be neither sought nor heeded in matters of great national importance.

In short, the Federalist Party didn’t really believe in democracy—not directly, anyway—and felt the country would function just fine without it.

In light of this year’s party nominating contests, I think this would be the perfect time to consider whether they were right all along.

A boatload of Republicans certainly seems to think so.  Having seen GOP primary voters anoint Donald Trump as the party’s presidential nominee, a great many officials are still entertaining the possibility—however remote—that the party will stage a coup at the upcoming Cleveland convention by somehow stripping Trump of the nomination and handing it to somebody—anybody!—else.

The immediate rationale for this would-be hostile takeover is that Trump could not possibly defeat Hillary Clinton in November, and since political parties have no greater duty than to win elections, this entitles the so-called Republican establishment to take matters into its own hands by overruling the will of the people and hoping all goes well.

The implication is clear:  Given the choice, it is better to win with a candidate whom primary voters did not choose than to lose with a candidate whom they did.  The democratic process may be all well and good, but when push comes to shove, all that really matters is victory.

It has been theorized that, had the GOP copied the Democrats and introduced “superdelegates” into the mix, Trump might well have been overtaken by some other candidate.  In truth, based on Trump’s lead in “pledged” delegates at the time his rivals dropped out, it’s unlikely that a superdelegate revolt would’ve been enough to produce its desired effect.

But let’s grant the premise, anyway, and suppose that a) the GOP elite succeeds in removing Trump from the race, and b) the replacement nominee actually defeats Hillary Clinton in the fall.  Would we consider that fair?  Would it signify that the system “works”?  Would it reflect the sort of country we want to be or, rather, would it suggest that democracy, as we know it, is a mere figment of our imagination?

The answers might seem obvious to us—namely, that the above would be a clear perversion of the principles of representative government and a big, fat middle finger to Republican voters from a party leadership that views them with patronizing contempt.

By today’s standards, yeah, that’s about the size of it.  By definition, if the party decides, the country does not.

However, to dismiss such tactics as brazenly undemocratic—and, by implication, blatantly un-American—is to ignore almost the entirety of American history and the U.S. Constitution along with it.

Although political parties have existed for almost as long as the country itself, our founding documents conspicuously omit mention of presidential primaries—possibly because they didn’t exist until the early 20th century.  For the first century of the American presidency, nominees were selected not by a state-by-state popular vote, but rather by—you guessed it!—a group of party elites, acting on nothing but their own superior wisdom and, presumably, a series of crooked backroom deals.  In this preliminary stage of presidential campaigns, the “will of the people” was not yet a thing.

What’s more, once primaries were formally introduced, it soon became clear that the results were not exactly binding:  However the rabble voted, delegates went right on choosing whomever their hearts desired—based, again, on which candidate might actually win the election.  Indeed, it was as recently as 1968 that the Democratic Party selected a nominee, Hubert Humphrey, who had not even competed in direct primaries, but who nonetheless secured enough delegates from states that held no primaries to jump the line past such candidates as Eugene McCarthy and Robert Kennedy, who had taken the trouble to actually campaign.

Was the 1968 Democratic nominating contest an electoral farce?  According to us in the present, yes, of course it was.  (Plenty of folks thought so then, too.)  However, when compared to all previous primaries up to that point, the shenanigans that produced Humphrey were essentially par for the course.  What mattered to the party was not how its voters felt at the time, but rather how the entire electorate might feel in the first week in November.

Until very, very recently, that is how American democracy functioned:  From the top down, with the public playing an exceedingly minor role in how our leaders are chosen.  Even today, the existence and idiosyncrasies of the Electoral College dictate that the country install the people’s choice for commander-in-chief only after all other options have been exhausted.

The rationale for this is rooted in an admirably straightforward assumption:  On the whole, the American people are a bunch of idiots and rubes whose ability to choose a leader is no more informed than a toddler’s ability to land a jetliner.

Now that the rise of Donald Trump has lent real credence to that theory, we are forced to confront whether unfettered democracy—that is, a direct primary that cannot be overturned by superdelegates or anyone else—is simply too dangerous for the continuing health of the republic and the world at large.

Our system has institutional checks for when our leaders lose their minds and put the entire country at risk.  Why shouldn’t we retain similar checks for when voters behave the same way?

We Will Survive

Given the choice, which of the following would be the more unnerving prospect:  That Donald Trump becomes president and effectively destroys the entire world order, or that Trump becomes president and does a perfectly decent job?

Over the past year, we Trump skeptics have spent so much time imagining the catastrophic consequences of a theoretical Trump presidency that it has barely crossed our minds that he just might be up to the task—or, more precisely, that our fears of what his leadership style would mean for the future have been unduly exaggerated.  That upon becoming the most powerful person on Earth, Trump might finally come to his senses and behave in a more cautious, dignified manner on the world stage.

Admittedly, the reason none of us has entertained this notion is that everything Trump has ever said and done has indicated the exact opposite.  Whether through his infantile personality, his lack of basic knowledge about policy or his propensity for flying into a tizzy whenever anyone calls him out—especially if that person is a woman—Trump has made it impossible for any reasonable observer to give him the benefit of the doubt:  The preponderance of the evidence suggests a disaster in the making.

But what if we’re wrong?  What if Trump surprises us by proving himself a competent, solid leader who manages America’s foreign and domestic affairs with grace, fortitude and good humor?  What if he lays aside his rougher edges and characteristic bile and somehow wills himself into an able statesman?

Or—if that scenario seems too outlandish—suppose he abandons some of his baser instincts in the Oval Office and muddles through four years of minor accomplishments and periodic setbacks, amounting to a presidency that, while hardly great, is finally regarded as a respectable effort and a mere blip in the ongoing saga of republican governance?

Indeed, the prospect of a boring, so-so performance from this man seems to be the one eventuality that both Trump’s fans and haters neither want nor expect—perhaps because it simply doesn’t compute that such an explosive character could possibly be middling.  In the hysterical environment in which we live, today’s electorate is convinced that a President Trump would be either a towering success or a catastrophic failure.  (We should add that, given the differing values of these two camps, it’s possible that, four years hence, both will claim to have been correct.)

And yet, if history teaches us anything, it’s that the U.S. presidency is a fundamentally stable and moderating institution—strong enough to endure even the likes of one Donald Trump.

Taking a cursory view of all U.S. presidents to date, we find that a small handful were truly great, an equally small handful were truly terrible, while the remaining several dozen landed in the giant chasm in between.

What we find in all cases, however, is that not a single one of those 43 men has caused the American republic to collapse or the entire planet to explode—i.e. the two things that half the country more or less assumes will happen under a President Trump.

Whether the presiding administration engaged in open bribery (e.g. Grant and Harding), imperial overreach (Johnson and Bush), nuclear hot potato (Truman and Kennedy) or domestic genocide (Andrew effing Jackson), the country itself managed to endure—both while and after such dangerous men stood at the helm.  To date, no chief executive (try as they might) has succeeded in fully negating the principles of the Constitution.

(For our purposes, we’ll allow that the Civil War—the closest America ever came to disintegrating—was the culmination of a 73-year-old argument as to what those principles actually were, and was not the fault of a single leader.)

The short explanation for our system’s remarkable buoyancy is that the Founding Fathers hit the jackpot by dividing the federal government into three equal branches, with a bicameral legislature and a Supreme Court acting as checks on executive power.  This way, whenever the president does go too far, the remaining branches are empowered to rein him in and/or throw him out until Constitutional equilibrium is restored.  While this arrangement has never operated flawlessly and the power of the presidency has grown with each passing administration, it has worked just well enough to keep things chugging along.

Now, it’s possible that the United States has merely experienced 229 consecutive years of dumb luck and that Trump is now the right guy at the right time to give the Constitution that one final nudge over the cliff.  He certainly professes to care not a whit about the separation of powers, and we have every obligation to take him at his word.

Or rather, we don’t, because when has Trump’s word ever meant anything?

Don’t forget the one thing about Trump that we know for sure:  Whatever he says today has no bearing on what he might say tomorrow.  On matters related to policy and governing, he plainly doesn’t have a clue what he’s talking about and, when asked a direct question, he reflexively spits out the first thought that pops into his head, no matter how incompatible it might be with all his previous statements on the issue—including, in some cases, what he said just a sentence or two earlier.

Nope.  It’s like we’ve been saying for months now:  Trump is the world’s most transparent con man whose only instinct is to say and do whatever he thinks will induce others to bend to his will.  Like every avaricious, status-obsessed windbag before him, he cares nothing for the public good except for how it might enrich him personally.

But here’s the thing:  Trump is not the first presidential candidate driven almost exclusively by narcissism and greed, nor would he be the first commander-in-chief bereft of a basic sense of right and wrong.

These are hardly attractive qualities in a leader of the free world, but they are not—in and of themselves—a hindrance to a competent and fruitful presidency, and even failed presidents can do genuinely good things.  Consider, for instance, that although Richard Nixon gave the world Watergate and four decades of cynicism about public officials, he still found time to open China and establish the EPA.  Or that while George W. Bush was unwittingly fostering a terrorist breeding ground in the Middle East, he was simultaneously funneling billions of dollars to disease-ravaged countries in Africa, reportedly saving over one million lives and counting.

Long story short (too late?):  Just as Trump himself should quit being so inanely confident about his ability to foster a magical new American Eden, so should we dial back our own assumptions that, if given the chance, he would fail in a million different ways—or worse, that he would “succeed” in the most frightening possible sense.

It’s not that Trump has shown any real propensity for intellectual growth (he hasn’t), or that his whole candidacy has been an elaborate performance masking a much more serious and learned man (if so, he hides it well).

Rather, it’s that the presidency—that most peculiar of institutions—has a way of scrambling the expectations of every person who enters into it and every citizen who observes the machinations therein.  Like no other job on Earth, it has a way of turning great men into cowards and mediocrities into legends.

The truth is that we can’t know what kind of president someone will be until it’s too late to stop them.  With Trump—arguably the most erratic person to have sought the job in any of our lifetimes—this uncomfortable fact becomes all the more self-evident.  If we agree that he is inherently unpredictable, we must allow for the possibility that, once in office, he will do things that we have thus far failed to predict, and that we just might be pleasantly surprised by the results.