The Battle of Princeton

This is what happens when you name buildings after human beings.

In this week’s edition of White People Discover Their Heroes Were Racist Thugs, students at Princeton University have demanded that the school disassociate itself from Woodrow Wilson, a man who served as Princeton’s president for eight years before going into politics.

Specifically, today’s protesters want Wilson’s name and likeness removed from all campus buildings, including the Woodrow Wilson School of Public Policy and International Affairs.

The basis of this demand is fairly straightforward:  An objective appraisal of the record makes clear that Woodrow Wilson was, in fact, an unreconstructed white supremacist.  A man who openly viewed black people as inferior to white people and who, upon becoming president, ran an executive branch that turned this view into official policy, most damningly through the re-segregation of various government offices and facilities.  (When Boston newspaperman Monroe Trotter brought a delegation to the White House to protest, Wilson informed them, “Segregation is not humiliating, but a benefit, and ought to be so regarded by you gentlemen.”)

Facts like these were not unearthed just last week.  Wilson’s bald racism has been well-documented for generations, available to anyone who cared to look it up.

The problem, then, has been twofold.  First, up until now, few people have cared to educate themselves on the more shameful aspects of our 28th president’s life, both personal and political.  Second, and more disturbingly, many other folks have known the ugly truth about Wilson all this time but haven’t summoned the strength to be appalled by it.  They simply accept his racist tendencies as a function of the era to which he belonged, then promptly shrug and move on.

The real scandal is how we Americans have allowed ourselves to get away with this for so very long.  How our historical assessments of Wilson’s presidency have focused (appropriately) on his handling of World War I abroad and progressive politics at home, but rarely, if ever, on his handling of race relations a full half-century after Reconstruction began.  As one expert after another has said, Wilson didn’t merely sustain the principle of white supremacy:  he actively made it worse.

Not that I speak from a position of moral superiority.  While I’ve never particularly been a fan of Wilson’s—in college, I wrote a paper detailing his sinister power grabs during the war—my previous understanding of his racism had been limited to his open-arms embrace of D.W. Griffith’s The Birth of a Nation, the film that lionized the Ku Klux Klan and employed white actors in blackface.  Disturbing as that was, it never led me to dig deeper.  It didn’t occur to me that the president of the United States, five decades after the Civil War, would harbor such bigoted views toward black people and not even bother to hide them.

I’m not quite that naïve today, nor—thankfully—is an increasing chunk of the country as a whole.

Thanks most recently to the Black Lives Matter movement and its supporters, Americans are no longer permitted to sweep racial prejudice under the rug without one heck of a fight.  While BLM’s focus is on racism in the present, racism in the past has inevitably factored into the argument.  Further, while racial prejudice can often take subtle or even unintentional forms, Wilson’s was neither.  On the race issue, he was simply a scoundrel.  The only question is why it took us so long to acknowledge it.

However, this does not automatically mean we should strike him from the record entirely.  Or, in this case, scrub his name from the college he so ably led.

With the entirety of American history planted in the back of our minds, let us consider—if we may—the Slippery Slope.

Demoting Woodrow Wilson in the popular imagination and at Princeton would undoubtedly make us feel a little better about ourselves, because it would send the message that bigoted men should not be honored or immortalized after they die, regardless of whatever good they might have otherwise done.

Indeed, we have already broadcast this message elsewhere with regard to such controversial figures as Andrew Jackson, John C. Calhoun and Robert E. Lee—men who built their success on the subjugation of entire classes of Americans.  This year’s extended kerfuffle over the Confederate battle flag seemed to kill several birds with one stone.

But where, exactly, should we draw the line?

If we are to shun Wilson on the grounds that he discriminated against black people, what are we to make of George Washington?  I don’t know about you, but I think personally owning 123 black people is a pretty cut-and-dried example of valuing one race over another.  Although Washington privately spoke of his desire for emancipation and provided for his own slaves’ freedom upon his death, he didn’t do a thing to advance the cause of racial equality while he was alive and the most powerful man in America.

So what’s the difference between him and Wilson?  If the latter doesn’t deserve to have even a school named for him, why should the former continue to be the namesake of our nation’s capital, one U.S. state and a main thoroughfare in every city and town in this country?

Is it because, although both men were white supremacists, Wilson was more of a jerk about it?  Is it because Washington’s accomplishments as a general and president are just too important to overlook, while Wilson’s leadership in World War I is apparently negligible?  Do we consider an 18th century slaveholder to be somehow more forgivable than a 20th century segregationist?  What’s the standard?

It’s safe to assume that we won’t be expunging the existence of George Washington any time soon, and this might help to clarify why the whole concept of moral cleansing can be so problematic.

The truth is that almost every American leader between 1776 and 1865 was complicit in the perpetuation of a racist society, either through direct action or through silence when action might have done some good.  (Abolitionists were the exception, and many of them paid a huge price for their courage.)  If we are to retroactively adopt this zero-tolerance policy toward race-based discrimination, why shouldn’t we be consistent and apply it to everyone who was responsible?

We know why not:  Because people are complicated and inspiring and compromised and good and evil, usually all at the same time.  Everyone has a little more of some traits than others, and we evaluate an individual’s overall character using some mysterious algorithm that depends a great deal on context—i.e. time and place—and our own biases.

So we tell ourselves that George Washington’s plantation, while unfortunate, is not a deal-breaker for his reputation because, hey, he treated his slaves decently enough and, by the way, he did single-handedly save the country from oblivion on more than one occasion.  It’s not that we ignore that he owned slaves; we just don’t condemn it as strongly as we would if he weren’t the father of his country.  Don’t make the perfect the enemy of the good.

Meanwhile, the Woodrow Wilsons of the world—essential as they are—do not carry the same aura of godliness and, thus, make easier targets for our derision.

But maybe I’m wrong.  Perhaps the posthumous inquisition of Washington is next.  Maybe our righteous zeal to reconcile our country’s shameful history is so strong that it will eventually reach all the way to the top, and we will soon achieve a society in which buildings and institutions are only named for people (or things) who never did anything wrong.

While I would personally have no problem with this result—what’s wrong with naming a school after a tree or a city or the president’s mom?—it would have one unintended consequence:  It would provide us with one fewer mechanism for arguing about the meaning of America.

Let’s be honest:  If a group of Princeton students hadn’t caused such a row about the Wilson School of Public Policy, how many millions of people would never have realized just what a wretched little puke Woodrow Wilson actually was?  However inadvertent on Princeton’s part, the controversy gave way to education, argument and clarity.  That’s precisely what great universities are for.

As for the question immediately at hand—should Princeton capitulate to the students’ demands?—I defer to the wisdom of Ta-Nehisi Coates who, himself ambivalent about the whole thing, nonetheless imagines how, to a black student, “seeing one’s University celebrate the name of someone who plundered your ancestors—in a country that has yet to acknowledge that plunder—might be slightly disturbing.”

Slightly.

Losers

Quick question:  Will a Republican ever be elected president again?

I don’t mean to be flippant in asking.  I’m completely serious, although, as a liberal, I can’t pretend to despair at the prospect that the answer might be “no.”

Historically speaking, the odds that the answer is “no” are just a hair north of zero.  Indeed, if the past several generations of elections have taught us anything, it’s that American voters can stand one party in the White House for only so long before swinging the other way and throwing the bums out.

In the last 63 years—that is, since the election of 1952—only once has the same party won three presidential elections in a row—namely, two by Ronald Reagan and one by George H.W. Bush.  On all other occasions, the executive branch has seen a transfer of power from one party to the other within either four or eight years.

Fundamentally, the country is split down the middle when it comes to political ideology, with the small group of folks in the middle ultimately determining which way the wind blows.  The last seven elections have been won by a margin of less than 10 percent, which is rather remarkable when you consider that five of the preceding nine were won by more than 10 percent.

So it stands to reason that—if only to satisfy statistical norms—a Republican will, in fact, win the presidency in 2016 or, at the absolute latest, 2020.

That’s before factoring in the legacy and current standing of the man whom our next president will succeed.  From a composite of recent polls, President Obama’s approval rating sits at 44 percent.  While by no means catastrophic—George W. Bush ended his presidency at 34 percent—it’s not exactly reassuring to a Democratic Party that might otherwise want to capitalize on Obama’s successes in anointing his heir apparent.

If Obama’s current levels of (un)popularity hold, he would be in roughly the same shape as George H.W. Bush, who couldn’t save himself in 1992, and in considerably worse shape than Bill Clinton, who was at 60 percent on Election Day 2000 and still couldn’t save Al Gore.

As if that weren’t bad enough, there was the media’s reminder earlier this month that, for all the Democrats’ dominance on the national level, the Obama era has seen sweeping victories for Republican candidates on the state and local levels.  There are ten more Republican governors today than in 2009 and, as reported in the New York Times, “Democratic losses in state legislatures under Mr. Obama rank among the worst in the last 115 years, with 816 Democratic lawmakers losing their jobs and Republican control of legislatures doubling since the president took office.”

In short, the 2016 race is the GOP’s to lose.  But they’re going to lose it, anyway.

Why?  Because Republican voters are determined to do so.

You don’t need me to tell you which GOP candidate is currently—and enduringly—ahead in the national polls.  Nor, for that matter, do I need to explain why this is such a spectacular moral farce.

However, in light of how close the Iowa caucuses have become and how little the polls have changed over the last several months, it is entirely worth spelling out this travesty in full, just in case the full force of it hasn’t yet sunk in.

Lest we forget, for all his popularity with GOP voters, Donald Trump remains the man who ridiculed John McCain for having been a prisoner of war.  The man who said a Black Lives Matter activist deserved to be “roughed up” at one of his campaign rallies and that a pair of supporters who assaulted a Hispanic homeless man were “very passionate” people who “love this country.”  The man who is so hilariously thin-skinned that he picks (and loses) Twitter fights with people whom most Americans haven’t even heard of—including, most recently, a reporter whose physical disability Trump gleefully mocked onstage.

It has gotten people asking:  Is there anyone left in America whom Trump has not tacitly (if not personally) offended?

Apparently there is, because (at the risk of repeating ourselves) he remains the top dog among his party’s base, with his numbers consistently in the mid-to-upper 20s in a 14-person contest.  Much can still happen before Iowa and New Hampshire (to be held on February 1 and 9, respectively), but for now GOP voters have made their views clear, and the rest of us have no choice but to acknowledge it.

Once we’ve done that, however, we can proceed directly to the next self-evident truth, which is that Donald Trump will never, ever, ever in a billion years be elected president of the United States.

It’s not just that he’d barely get a single vote from Hispanics, whom he has tarred—directly or by association—as rapists and drug dealers.  Or that he’d garner zero interest from African-Americans, whom he affectionately refers to as “the blacks.”

Nope, in the end, his downfall may well come at the hands of the whites.

Should he secure his party’s nomination—following a demolition derby of a primary season, no doubt—he will discover that there is a good chunk of moderate, independent white voters who, despite conservative or libertarian worldviews, just cannot bring themselves to support a man who behaves like a Real Housewife of Beverly Hills.  Who is so emotionally unstable that he throws a spontaneous fit whenever anyone says anything unflattering about him, and so intellectually insecure that he name-drops his alma mater almost as frequently as his net worth.

For all their fickleness and inscrutability, American voters are cognizant of the image they project to the world when they elect a commander-in-chief.  While we are certainly susceptible to leaders who project strength through swagger and machismo (see Bush, George W., 2004), we are not so weak and panicky that we will surrender the Oval Office to a fellow who would enshrine religious and ethnic discrimination (back) into law.  We don’t mind sacrificing some of our privacy in the interest of fighting terrorism, but we aren’t prepared to sacrifice all of it.  We appreciate a chief executive who indulges in social media, but not necessarily at 4 o’clock in the morning.

We could go on and on about what a child Donald Trump truly is, but that would unfairly let the rest of the GOP off the hook.  As anyone paying attention to national politics knows, Trump is not the only “serious” candidate with a knack for behaving like a petulant toddler.  On Friday, for instance, the New York Times ran an amusing story chronicling the off-the-charts use of profanity by candidates throughout the campaign season, noting that employing four-letter words is perhaps the most promising way to draw attention to oneself and hopefully experience a bump in the polls.

Is there anything more pathetic than that, or more childish and un-presidential?

More broadly, the GOP in Washington shows no particular interest in shaking its reputation for obstructing every last Obama proposal for no reason except that Obama proposed it.  As the recent struggle to find a new House speaker demonstrated, Republicans in Congress have long since transitioned from a governing body into a gang of hyperactive, nihilistic know-nothings whose ambitions are limited to negating every major piece of legislation the previous few Congresses have passed, while spending the rest of the time calling each other names and screaming about the end of the world.

With a legislative branch like that, are we really on the verge of anointing an executive branch that’s on the exact same page?  To paraphrase Trump, how stupid are we?

The silver lining here—for Republicans and the country alike—is the theory that primary voters will eventually come to their senses and nominate one of the alleged grownups in the field—someone like Marco Rubio or John Kasich, whose experience and relative sanity could plausibly give Hillary Clinton a run for her money.  Trump supporters are, after all, a slim minority of all eligible voters and would be hugely outnumbered if only Trump non-supporters could reach a consensus as to which non-Trump candidate they prefer.

It could happen.  The 2016 general election may well end up as a variation of 2012, with two flawed but serious contenders who both see the world more or less as it actually is.  It’s not too late.

But if that doesn’t happen—if the GOP goes insane and nominates someone who is manifestly unacceptable to 55-60 percent of the country—then the next four years will probably look an awful lot like the last eight, featuring an ideological civil war within the party, during which its two major factions will debate, yet again, about whether the GOP should retain its extremist Tea Party bent and remain ideologically “pure,” or whether it should entertain such heretical concepts as moderation and compromise, which might include recognition of climate change, same-sex marriage and the consequences of white supremacy and lax gun control laws.

Shortly after Obama was first inaugurated, blogger Andrew Sullivan predicted that, with respect to the GOP, “It will get worse before it gets better.”  The past six-and-a-half years have certainly vindicated that assessment, although we are still waiting for an answer to the natural follow-up:  Will it ever get better, or will the party ultimately disband and start over again from scratch?  It’s a crazy, outlandish scenario—one that hasn’t happened to a major political party since the death of the Whigs in 1856—but we may well have found the crazy, outlandish goons with the power to make it happen.

All Votes Matter

Why are Republicans so scared of democracy?  Why are they so hostile toward voting?

In most of his campaign speeches this year, Bernie Sanders has made the point that, in general, voter turnout is directly correlated with Democrats winning elections.  That is, when the maximal number of people cast ballots in a given year, Democratic candidates tend to do well, while Republicans fare better when turnout is relatively low.

The evidence for this is suggestive rather than conclusive, but the idea is that young people and poor people are the groups who vote the least, and both demographics tend to support liberal candidates.  Thus, if Democrats could simply inspire those bums to get off the couch on the first Tuesday of every November, the party would rack up solid victories from coast to coast, and probably never lose a presidential election again.

Certainly, there are counterarguments to this theory, beginning with the fact that less-dependable voters are also less ideological, and thus more likely to change their minds from year to year.

Then again, we don’t really need statistical proof that high turnout favors Democrats and disadvantages Republicans.  All we need to do is observe how the two parties behave whenever the issue of voting rights comes up.

At every juncture, Democrats do all they can to expand the voter pool and erase whatever barriers remain for citizens who are already eligible to vote.  Republicans, meanwhile, take the opposite approach, digging whatever sand traps they can to make the act of voting as difficult and unpleasant as possible.

Do I exaggerate?

Year after year, it is Republicans—and only Republicans—who advocate “voter ID” laws, which would necessarily disenfranchise a significant chunk of eligible American voters who, without having broken any laws, happen not to possess the sorts of identification such laws would require.  (In big cities, for instance, many residents don’t own a driver’s license because they have no need for a car.)

In Virginia in 2008, it was Republicans who sent flyers to Democratic neighborhoods telling residents to vote on the wrong day.  In 2012, five states—four of which had Republican governors—cut back on early voting, which allows those who can’t get out of work on Election Day to cast a ballot on a day that they can.  (“Too busy” is the number one reason registered voters don’t make it to the polls.)  During the Maryland gubernatorial race in 2010, a Republican consultant pulled back whatever was left of the curtain by saying, “the first and most desired outcome is voter suppression,” specifically by ensuring that “African-American voters stay home.”

Shenanigans like these—anecdotal as they are—help to erase any notion that Republicans’ real target is so-called “voter fraud”—the act of casting a ballot under false pretenses.  While it sounds reasonable to want to prevent that sort of thing, it becomes slightly less so when you learn that, according to one study, there has been a grand total of 31 instances of voter fraud in the United States since 2000—a period of time that saw roughly one billion ballots cast.

Percentage-wise, the likelihood of voter fraud affecting the outcome of an election is roughly equivalent to that of being eaten by a shark in the middle of the Mojave Desert.

No, the purpose of voter ID laws is exactly what it looks like:  To keep liberals away from the ballot box.

This being the case, we are now in the nascent stages of a major fight on this issue between representatives of our two political parties.  The fight concerns a simple but profound question:  Should all Americans be automatically registered to vote when they turn 18?

Presently, if you want to participate in the democratic process, the onus is on you to march down to City Hall and register to vote.  You must do this upon reaching the age of majority and any time you move to a different address.  God forbid you forget, don’t have time, don’t care or don’t get your application processed on time.

Now there is talk of streamlining the registration process by reversing it.  Under the new proposal—bills have been introduced in both houses of Congress—you would be automatically added to the voter rolls unless you specifically opt out.

What a splendid idea.  Indeed, it would be the most pro-voter federal policy shift since the Nineteenth Amendment, and we have the research to prove it.

Consider organ donation.  In many countries, you must give your affirmative consent to become an organ donor—typically upon renewing your driver’s license—while in others, you are made a potential donor automatically unless you actively refuse.  The outcome of these policies is striking:  In “opt-in” Germany, for instance, only 12 percent of citizens are organ donors.  Meanwhile, in “opt-out” Austria right next door, the number is 99.98 percent.

In general, you can use statistics to reach any conclusion you want, but this case seems pretty cut and dried.

Further, there is little reason to expect that automatic voter registration wouldn’t yield similar results:  At the proverbial end of the day, how many Americans are so hostile toward the democratic process that they would actively deny themselves the mere opportunity to cast a vote?  So long as they are afforded that right, what exactly is the problem?

The effect of such a system could be transformative.  According to the U.S. Census Bureau, there are now more than 70 million Americans who are eligible to vote but haven’t even bothered to register.  That’s 70 million potential ballots that are just floating around in the ether—legitimate would-be votes with the potential to swing every race in this country.

To be sure, just because those 70 million people would suddenly become registered does not mean they would actually exercise their newly acquired right.  Some people do not become voters for entirely deliberate reasons, and there are plenty of Americans whose grasp of the issues is such that their abstention from the process is probably for the best.  (Then again, we could say the same for many who do vote, but that’s another story.)

All things considered, automatic voter registration seems like a slam dunk—one of those simple, obvious ideas we’re embarrassed not to have thought up sooner.  Who could possibly object?

Chris Christie, for one.  Earlier this month, the New Jersey legislature passed the “Democracy Act,” which, in addition to automatic registration, would have allowed residents to register online and provided two weeks of early voting every election cycle.  However, as governor, Christie vetoed the legislation, calling it “thinly-veiled political gamesmanship” and arguing that such reforms would lead to increases in—you guessed it!—voter fraud.

We’ve already established how the latter claim is utter nonsense, but what about the former?  Are Democratic Party initiatives like early voting and automatic registration mere ploys to run up the score in favor of America’s left wing?

Sure they are.  If Democrats know—or at least assume—that high voter turnout redounds to their benefit, any maneuver to jack up turnout is axiomatically a political act.

But that’s not the point.  Everything is a political act.  The question is whether this particular political act is consistent with basic American principles and traditions.  If so, the politics behind it become irrelevant.

Call me crazy, but I would estimate that ensuring equal protection under the law is a more worthy American tradition than keeping poor people and minorities from participating in the democratic process.

Further, this shouldn’t be an especially difficult feat to pull off.  Over the past 150 years, we have amended our Constitution to clarify that the right of adult citizens to vote shall not be denied or abridged on account of race, sex and age—and then passed the Voting Rights Act to cover race again.

Having accomplished all of that—however haltingly—you’d think that barriers amounting to little more than laziness and busy schedules would be a breeze to overcome, even for the Congress we’re stuck with today.

As Churchill is alleged to have said, “You can always count on Americans to do the right thing, after they’ve tried everything else.”

Solving Islam

More than 14 years after the September 11 attacks, why are Americans still arguing about whether Muslims are people?

In 2001, the country suffered an act of terrorism carried out by 19 men, all of whom were Muslim and claimed to be acting on divine orders.  Even then, however, cooler heads occasionally prevailed when it came to assigning blame.

Consider the following statement from September 17 of that year:

“The face of terror is not the true faith of Islam.  […]  When we think of Islam we think of a faith that brings comfort to a billion people around the world.  […]  America counts millions of Muslims amongst our citizens, and Muslims make an incredibly valuable contribution to our country.  Muslims are doctors, lawyers, law professors, members of the military, entrepreneurs, shopkeepers, moms and dads.  And they need to be treated with respect.  In our anger and emotion, our fellow Americans must treat each other with respect.”

That was George W. Bush.  In a special address to Congress three days later, he added, “[T]hose who commit evil in the name of Allah blaspheme the name of Allah.  The terrorists are traitors to their own faith, trying, in effect, to hijack Islam itself.”

In other words, even President Bush in 2001 understood that a war against Islamic extremists was not the same as a war against Islam.  It’s a fairly simple concept to grasp, so why are we having so much trouble with it now?  Why can so many Americans still not distinguish a gang of murderers from the billion-plus peaceful folks whose religion they happen to share?  Why are we scapegoating all members of a particular faith for a problem caused by some of them?

The explanation for this can roughly be traced to three separate but occasionally interconnected sources:  Ignorance, bigotry and a few unfortunate facts.

The first two require little explanation.  Regrettably, a sizable minority of American citizens are just plain dumb when it comes to understanding people who are different from them.  Either because they don’t bother to educate themselves or because they reject the information that is staring them directly in the face, these people are impervious to reason, sensitivity and intellectual growth.

In the present context, this would include those who look at someone wearing a hijab and immediately think, “Terrorist!”  Or, more explicitly, those who see Muslims committing atrocities overseas and bellow, “Let’s not allow any Muslims to enter the United States!”

Not even Mark Rothko painted with a brush that broad, yet that is precisely the mainstream view among nearly all Republican presidential candidates and their supporters.  Donald Trump surprised no one this week by suggesting all American Muslims should be “registered.”  (Whatever that means.)  Ben Carson has said an observant Muslim should not be elected president.  Jeb Bush and Ted Cruz gave the game away by advocating preferential treatment to Christian refugees over those who, shall we say, pray to a slightly different god.

Against all of this xenophobic nonsense—betrayals of such foundational American values as multiculturalism and religious freedom—there remains a profoundly uncomfortable question:  Why, at this moment, are ISIS and its ideas so goddamned popular among certain members of the Islamic faith?

In 2013, Pew released results of a survey of Muslims around the world.  Among other things, the survey found that 72 percent of respondents agreed with the statement, “Suicide bombing in defense of Islam is never justified.”  That seems reassuring—until you realize that it means 28 percent of respondents didn’t agree with the same statement.

In fact, 11 percent of the world’s Muslims explicitly endorsed the view that suicide bombing in defense of Islam is either “often justified” or “sometimes justified.”  That leaves 17 percent who either refused to answer or didn’t have an opinion on the merits of murdering large numbers of civilians.

I don’t know about you, but I find these numbers slightly alarming.

If one out of every nine Christians were in favor of blowing themselves up in a crowded marketplace because someone said something disparaging about Jesus, would we not be correct in saying that Christianity had a problem?

We can bang on and on about how Islam is a religion of peace and how an overwhelming majority of Muslims reject violence in all its forms—the latter being an incontrovertibly true statement, particularly in the United States—but we are entitled to look at that minority and conclude that Islam itself might have something to do with it.

There’s a popular refrain that says the problem isn’t religion; it’s people.  That is, there’s nothing in religion to turn good people evil; rather, it’s that certain people are already evil and will cling to any philosophy to justify their actions.

It sounds convincing and is largely true—in the end, each individual is responsible for his own behavior—but it does not resolve the question of why a disproportionate number of these murderous psychopaths belong to one faith in particular.  If suicide bombing doesn’t have to do with religion, why do virtually all suicide bombers belong to the same religion?  If Islamic texts don’t instruct adherents to resort to violence in response to blasphemy, why is one in nine Muslims so convinced that they do?

These are the sorts of questions we ignore at our peril.  However, they are ultimately mere window dressing for the only question that matters:  What do we do with this information?

As we have found, there are two general approaches to addressing this issue.  One, we could decide that because 11 percent of Muslims are sympathetic to Islamic terrorism, we are therefore entitled to stigmatize and openly discriminate against the other 89 percent.  Or two, we could stop acting like children and recognize that two separate and seemingly contradictory facts can be true at the same time.  Namely, that Islamic holy books provide justification for holy violence and also that most Muslims have the decency and common sense to ignore what those books say.

We can all recite verses from the Christian and Jewish bibles that condemn certain people to death for all sorts of offenses, and we can equally recite the names of people—in America and elsewhere—who take those verses to heart.  Why, it was just a few weeks ago that several GOP presidential candidates spoke at an event hosted by a Colorado pastor who openly advocates the murder of all gay people on Earth—as explicitly recommended in Leviticus 20:13.  In many countries in Africa and the Middle East, of course, this commandment is actually carried out.

Yet somehow, the balance of the world’s Jews and Christians manage to overlook these prehistoric injunctions, living, instead, according to the laws of man and the good old Golden Rule.  If we Judeo-Christians can pat ourselves on the back for pulling this off, why can’t we extend the same courtesy to others who have done the same?

As ever, the tonic to religious fanaticism includes such concepts as secularism, pluralism, rule of law and—when all else fails—treating one’s fellow human beings with dignity and respect.  This necessitates seeing people as individuals rather than members of a group—even when they identify as both—since applying labels to each other tends to produce hatred and discord at the precise moment when common ground and reconciliation are in order.

We might agree that love, respect and empathy will not solve a problem like ISIS all by themselves.  On the other hand, there is no instance I know about in which they have ever made matters worse.

Speak No Evil

Thomas McCarthy’s new movie Spotlight has been likened, in both form and quality, to Alan J. Pakula’s All the President’s Men.  The comparison may at first seem like a cliché, but after seeing both movies this past weekend—the latter for the fourth or fifth time—I realize the connection is both unavoidable and entirely germane.

When you get right down to it, Spotlight isn’t a companion to All the President’s Men so much as a remake.  While the two films are by no means identical—they take place in different cities at different times and have completely different plots—their agendas are one and the same, and they succeed for exactly the same reasons.

The agenda, then, is to demonstrate how justice and democracy cannot exist anywhere without freedom of the press, and how investigative journalism itself is a long, difficult, boring process that—counterintuitively and against all common sense—makes for positively riveting cinema.

It’s easy enough to talk about the preeminence of the First Amendment and of speaking truth to power, but Spotlight goes a step further by showing us how near-impossible that task really is—even for the most well-equipped and widely circulated newspaper in town.

McCarthy’s movie—in case you’ve been kept out of the loop—is about how the Boston Globe in 2001 uncovered evidence of rampant sexual abuse of children within the Roman Catholic Archdiocese of Boston, which the church’s leadership—up to and including Archbishop Bernard Law himself—spent decades covering up.  All told, some 250 Boston-area priests were alleged to have sexually molested young boys and girls, with the “official” victim count at 552.  God knows what the true figure really is.

Like Pakula’s film, which famously recounted the Washington Post’s investigations into Watergate, Spotlight features a small group of unknown reporters undertaking a methodical, comprehensive, long-shot project to reveal that a gargantuan and revered American institution is rotten to the core.  In All the President’s Men, that institution was the federal government during the Nixon administration.  In Spotlight, it’s the Catholic Church.

The bottom line—the implied moral of the story—is that were it not for the Spotlight team’s exhaustive and heroic efforts, the Globe’s revelations about predatory priests and a corrupt hierarchy could very easily have remained a secret forever, denying justice and any kind of closure to an entire generation of molested kids, not to mention all the generations that came before.

To this conclusion, one might naturally ask, “How?”  How could that many children be raped—physically and emotionally—without a single one of them speaking up and being heard?

Spotlight’s answer:  Many of them did speak up, but nobody wanted to listen.

Fourteen years after the fact, we now know beyond doubt that the Catholic Church in Boston engaged in a conspiracy of silence on the epidemic of sexual abuse, moving problem priests from one parish to another while saying nothing publicly about what those priests were up to.  (It was as a direct consequence of the Boston revelations that similar scandals came to light in Catholic dioceses around the world.)

What we didn’t know—at least not as fully as we should have—is that the archdiocese was not the only player in this terrible drama.  Far from acting alone, Cardinal Law and his gang received a crucial assist from the people in the pews:  those in the Catholic community who loved and respected the Church and would never, for a moment, have entertained the possibility that a member of this institution could do something wrong—let alone something criminal and obscene.

For millennia, clergymen of all faiths have served (often rightly) as the most trustworthy members of society—men of education, wisdom and unimpeachable moral fiber.  It didn’t hurt that, in the case of Catholicism, these men also had God on their speed dial and could invoke divine punishment or reward to bend their parishioners to their will.

And so whenever there was a whisper about some priest doing this or that to the altar boys under his tutelage, most Catholics—including a few who worked at the Globe—dismissed the allegation before the thought could even settle into their minds.

Like a parent who hears that his son is dealing drugs or a Patriots fan who learns that Tom Brady was doing something fishy with those footballs—or, indeed, an idealistic citizen who views the government as a benevolent force for good—churchgoers could not bring themselves to see what was directly in front of their noses.

They didn’t want it to be true, so they convinced themselves it was false.

That, in so many words, is what journalism is for:  To tell you what you’d rather not hear.  Reporters have resources and privileges that ordinary citizens do not, which makes it inevitable that the press will disappoint your rosy views of humanity every now and again.

As such, it also means that news outlets will forever be on the front lines in the battle for truth, justice and accountability.  Reporters and editors have no professional obligation except to find out what the hell’s going on.  That’s why movies about newspapermen tend to be so entertaining:  Sometimes fact really is more compelling than fiction.  In a moral universe, it’s the only thing that matters.

As with all major institutional scandals, the details mean everything.  The triumph of Spotlight is that it allows the survivors of the Church rape epidemic to have their day in court—that is, to explain precisely what being held captive by a priest entailed—along with those who had everything to lose from the publication of this story, from the archbishop to the priests to the family members who chose to look the other way.

In so doing, the film shows how the process of newsgathering is inherently a dreary, depressing, often hostile endeavor in which powerful forces will try everything they can to prevent you from doing your job.  A job, we might add, that depends overwhelmingly on cooperation from the public, which includes those same institutions.  The term “Catch-22” leaps to mind.

To break the Watergate story, Carl Bernstein and Bob Woodward needed to draw connections between key White House figures—“follow the money,” they were sagely told—and they did so by extracting information from witnesses who had every reason to keep their insights to themselves.  Some of this involved misdirection and sleight of hand—asking a witness to “confirm” information you don’t actually know is always a neat trick—some involved dumb luck, and some required nothing except patience, asking the right questions and a near-pathological refusal to take “no” for an answer.

With Spotlight, it’s déjà vu all over again.  The Globe team, working separately and together, accumulates its information through a combustible mixture of instinct, legal wrangling, library basement research and good old-fashioned interviewing.  A late-inning confrontation between Michael Keaton and a representative for child victims is a virtual carbon copy of Dustin Hoffman’s run-in with a Florida lawyer with a cabinet full of crucial documents.  In both instances, the reporter explains that his paper is running the story with or without this person’s cooperation, so he might as well stand on the right side of justice.

Long story short (too late?):  Revelatory, in-depth reporting does not happen by accident.  It’s the result of thousands of man-hours of detective work—and all the court orders, dead ends, slammed doors and wrecked lives that go with it—and when it is done carefully and seen through to the end, it can change the world.

The Globe would eventually print more than 600 articles in connection with the Church abuse tragedy, for which the paper was awarded a Pulitzer Prize in 2003.  It’s worth noting that the award—the most prestigious in all of American journalism—was not in the category of “Investigative Reporting,” “Explanatory Reporting” or “Local Reporting,” although any of those would surely have fit the bill.

Rather, the Pulitzer Prize Board saved the Globe’s output for its most distinguished category of all:  “Public Service.”

Houston, You Have a Problem

Am I the only person in America who walks into a public restroom without giving a thought to whoever else might be in there?

No, really:  When I enter the men’s room, I have exactly one item on my agenda.  And once that mission has been accomplished, I wash my hands, make my way to the exit and return to my regularly scheduled life.

Seems like pretty basic etiquette to me:  Get in, get out, move on.  Public bathrooms may forever be an inherently awkward social phenomenon—particularly at halftime or intermission, when the whole town is there at once—but the weirdness can very easily be alleviated by, shall we say, minding your own damn business.  I’ve tried it for nearly three decades now.  Works like a charm.

In fact, I’m guessing that most people take this minimalist approach to bathroom behavior.  Indeed, we might agree that the matter of regulating multi-person lavatories is one of those issues that wouldn’t even exist if everyone would just act like a normal, decent human being.  After all, if each of us were capable of navigating a bathroom without eying our fellow patrons and making moral judgments, it wouldn’t even occur to us to draft legislation specifying who can (and cannot) use them.

Unfortunately, we aren’t all capable of going to the john without making a big, dumb stink about it.  As a result, we had that massive pile of nonsense last week in Houston, where voters resoundingly rejected an anti-discrimination ordinance out of fear that it would engender a predatory atmosphere in certain Texas restrooms.

Specifically, the Houston proposal would have banned discrimination in various public accommodations on the basis of sexual orientation or gender identity (among other things).  While such a proposal might seem fairly uncontroversial today—19 states and Houston’s own city council have passed bills along the same lines—city residents voted it down following a scare campaign by opponents who argued that the ordinance, if enacted, would enable male sexual predators to enter women’s restrooms and commit unthinkable crimes.

After all—the logic went—if the city of Houston allowed men who identify as women to use the ladies’ room instead of the men’s, what’s to stop actual men from claiming to identify as women in order to sneak into the ladies’ room and do something appalling?

It makes perfect sense—until you think about it for more than, say, 10 or 15 seconds.

Certainly, we can accept the premise that there are a handful of men in every city and town who are totally sexually depraved and would barge into a women’s bathroom if they could—say, the moment invisibility cloaks finally hit the shelves.  Indeed, we could even assume that some of these specimens would take advantage of a transgender rights law by intruding into a women-only setting and, upon getting caught, raise their arms and say, “Not to worry, I’m a woman, too!”

That’s apparently what Mike Huckabee had in mind when he joked earlier this year, “I wish that someone told me that when I was in high school that I could have felt like a woman when it came time to take showers in PE.  I’m pretty sure that I would have found my feminine side and said, ‘Coach, I think I’d rather shower with the girls today.’”  If a two-term governor and presidential candidate can speak about sexual predation in such a lighthearted manner, just imagine what’s in the minds of the people voting for him.

What is much harder to fathom, however, is that the realization of these sick fantasies is such an imminent threat that it outweighs any consideration for the rights and safety of America’s transgender community.

So far as I know, sexual harassment is still illegal in the city of Houston.  If Mike Huckabee or anyone else totters into the wrong locker room and starts peeping around, it won’t be long before the authorities get involved.

Meanwhile, there are roughly 700,000 people in the United States who genuinely identify as the opposite gender from the one they were born as, and when they step into a bathroom they have exactly one objective in mind, and it’s the least-sexy activity you could imagine.

That’s the real fallacy in this whole kerfuffle:  The idea that restrooms are ground zero for satisfying your deepest, darkest sexual desires.  Internet porn, but for real.

Except for those in the violent throes of puberty, are there any mentally and emotionally balanced people who actually think this way?

I mentioned how my own bathroom visits tend to be as quick and uneventful as possible.  I didn’t mention that I’m a guy who’s attracted to other guys.  For me—in theory, anyway—to be in a men’s locker room is precisely the fantasy that straight men can only dream of.  I beat the system simply by existing.

But you know what?  At no point in those situations have I ever thought, “Lucky me.”  Never has the act of entering a public restroom caused my spine to shiver or my heart to race.

Quite the opposite, in fact.  When you’re in the closet, the high school locker room is the most terrifying place on planet Earth, as you realize that one wrong look could result in you being unceremoniously outed and, as a consequence, teased, bullied or killed.  (I managed to avoid all of that, but not everyone does.)

Post-coming-out, this feeling of cautiousness never completely leaves you.  For all the recent breakthroughs in gay rights, homophobia has not yet been completely killed off, and the men’s room is the absolute last place to test it.

So you act natural, keep your head down, do your thing and get the hell out.

While I don’t know for sure, I suspect that the transgender experience is fairly similar on this front, albeit much worse.  When—through no fault of your own—you find yourself in an intimate social setting at the mercy of other people’s prejudices, your only concern is getting out of there alive and in one piece.

As far as politics go, you can always depend upon loud idiots to be unintentionally hilarious, and opponents of the Houston bill did not disappoint.  The signature placard at the anti-ordinance protests read, “No men in women’s bathrooms.”  Of course, that’s exactly what the transgender community wants:  For everyone to be able to use the restroom that corresponds with his or her true gender.

A person who was born female but identifies as male is a man, period.  Had it passed, Houston’s law would have ensured that that person was entitled to use the men’s room, thereby averting the spectacle of a man in a women’s bathroom.  By rejecting the ordinance, Houstonians have guaranteed the presence of men in women’s bathrooms (and vice versa), thereby prolonging the confusion and discomfort on all sides.

Sooner or later, this will have to be fixed.

America’s sexual outliers don’t choose to be the way they are—why would they?— but they do insist upon having their existence acknowledged by society and being treated as equal under the law.  Letting them go to the bathroom in peace is literally the least we can do.

Bigotry By Any Other Name

Is homophobia a choice, or are people just born that way?

Amongst all the silliness and bombast at the most recent Republican primary debate, there was the following statement from Ben Carson, who was asked to clarify his position on same-sex marriage:

“I believe that the Constitution protects everybody, regardless of their sexual orientation or any other aspect.  I also believe that marriage is between one man and one woman.  There is no reason that you can’t be perfectly fair to the gay community.  They shouldn’t automatically assume that because you believe that marriage is between one man and one woman that you are a homophobe.”

In a strong field, that might rank as the most incoherent thing that any candidate has said about any issue.  It would be easy enough to ignore or dismiss it—most media outlets have done just that—except that a) it came from the highest-polling candidate in the race (more or less), and b) it forces us to confront the issues of marriage and homophobia in a manner that is just too interesting to pass up.

Getting right to the point, then:  Is it possible to oppose same-sex marriage without being homophobic?  Can you believe that gay people are morally and legally equal to straight people while also believing that only the latter are entitled to marriage?

I’ll be honest:  I do not find these to be difficult questions.

No, you cannot oppose gay marriage—or any other gay right—without the disease of homophobia coursing through your veins.  Thinking that gays are beneath the institution of marriage is precisely to think that heterosexuals are a superior human species—a view otherwise known as homophobia.

“Defending” traditional marriage is homophobic by definition.  You can’t have one without the other.  To say that these two people can receive a marriage license but those two people cannot is axiomatically to think that the former are more deserving of the American dream than the latter.

Hence the absurdity of Carson’s statement.  He wants to have it both ways, but how could this be?  If you believe—as Carson apparently does—that gay people are entitled to equal protection under the law, how could that protection not include the right to get married?

Officially, marriage is nothing more than a legal contract between two consenting adults.  It’s a secular institution whose broader meaning is determined by those who enter into it.  Conservatives can bang on and on about what marriage is “for”—commitment, sacrifice, procreation, serving God—but the truth is that marriage is whatever each individual couple makes of it.  It is neither possible nor desirable for the government to make those decisions for them.

If you truly thought that all men and women are created equal, then the notion of withholding marriage from gay people wouldn’t even occur to you—just as prohibiting marriage between interracial couples wouldn’t make sense to anyone who believes in equality of black and white.

The reality is that most Americans are adept at holding utterly contradictory views in their heads, and most of the time they don’t even realize they’re doing it.  This has been true since the founding of the republic (see:  Jefferson and slavery) and we can expect it to continue until long after we’re all dead.

The far more interesting trend—and a welcome one at that—is the degree to which homophobia itself has fallen out of fashion.

Even as the country remains divided on same-sex marriage—the current split is 60 percent in favor, 37 percent opposed—very few people today are comfortable with being viewed as anti-gay.  Even those who espouse policies that are obviously and deliberately discriminatory toward gay folks are very careful to launch into a “some of my best friends are gay” routine, insisting that their opposition to gay rights should not be construed as opposition to gay people.

It’s a ridiculous and hypocritical stance—an insult to the intelligence of anyone who made it through kindergarten—but it’s also indicative of how thoroughly gay people have been integrated into polite society.

Remember:  It was as recently as the 1980s that gay people were so marginalized by their leaders that, when Ronald Reagan’s press secretary was asked if the administration was aware of a “gay plague” known as AIDS, the entire press room erupted in laughter.

In the 1990s, anti-gay animus was so strong that Bill Clinton—a Democrat!—was able to sign the Defense of Marriage Act and institute “Don’t ask, don’t tell” without experiencing any real pushback from the gay community because, hey, what other option did they have?

The difference between then and now is enough to give you whiplash.  Not only is same-sex marriage legal from coast to coast, but gay people are so visible in every walk of life—including positions of power—that the straight community has no choice but to treat them like human beings.

Or at least to give lip service to that effect.  A chunk of Americans remains opposed to affording gays equal protection under the law, but—as if taking a cue from Pope Francis—they are far less cavalier than they’ve ever been before, concealing their true feelings behind inclusive and compassionate rhetoric.

Today, you can’t even be a Democrat unless you offer full-throated support for every plank of the dreaded “gay agenda,” and you can’t run for president as a Republican without at least pretending to have a few gay acquaintances and acknowledging that homosexuality is, in fact, a real thing.  (I wish we could say the same for climate change.)

But let’s not be cute about it by letting opponents of gay equality off the hook.

Yes, I am aware of many good people who support “traditional” marriage and, by all outward appearances, harbor no prejudice toward their gay colleagues and treat everyone with respect.  They regard their views on marriage as an honest disagreement—invariably informed by their religious faith—and not, in any case, as an expression of bigotry, intolerance or blind hatred.

Well, of course that’s how they feel.  In any great debate about civil rights, everyone wants to view themselves as the hero—the person on the “right side of history.”  Being several generations removed from when, say, George Wallace could proudly stand at a podium and bellow, “Segregation now, segregation tomorrow, segregation forever,” we no longer allow ourselves to hold prejudicial views without performing an elaborate sleight of hand to convince ourselves and others that we are not the villain in this story.

Sorry, but it won’t wash.

In this era of equality, you can no longer get away with threading this particular needle without looking like a disingenuous nincompoop.

If you’re going to support anti-gay legislation, then you have to own the fact that—whether you realize it or not—you, yourself, are anti-gay.

If you don’t want to be tarred and feathered as an intolerant prude, then quit advocating for a society that withholds basic rights from an entire group of citizens on the basis of their emotional attractions.  That, after all, is exactly what an intolerant prude would do.

If you truly believe, à la Ben Carson, that “the Constitution protects everybody” and “there is no reason that you can’t be perfectly fair to the gay community,” then join the rest of us in effecting a system of laws that actually are perfectly fair to the gay community—namely, laws that don’t care whether your significant other is a man or a woman, because why on Earth should that make a difference?

Show, don’t tell.  Either you believe that we’re all equal before the law, or you don’t.  Sooner or later, you have to pick a side.