Playing It Straight

What ever happened to acting?

That’s what I wondered after reading Matt Damon’s controversial new interview this week in the Guardian.  Asked whether it’s still difficult to be openly gay in Hollywood, Damon—who is openly straight—responded in the affirmative, then offered the following advice:

I think you’re a better actor the less people know about you period.  And sexuality is a huge part of that.  Whether you’re straight or gay, people shouldn’t know anything about your sexuality because that’s one of the mysteries that you should be able to play.

Rarely has a beloved celebrity been so right while also being so very, very wrong.

Damon was asked the gay question because of his recent performance as Liberace’s boyfriend in HBO’s Behind the Candelabra.  The implication is that it’s much easier for a straight actor to play a gay character than the other way around.  That is, audiences are more willing to accept a straight person “acting” gay than a gay person “acting” straight.

Historically, this hypothesis has proved true beyond dispute.  Pick any moderately successful recent film with gay themes and/or prominent gay characters, and you’ll find they all have one thing in common:  heterosexuals.

Just in the last year or two, for instance, we have had Benedict Cumberbatch in The Imitation Game, Jared Leto in Dallas Buyers Club, John Lithgow and Alfred Molina in Love Is Strange and Mark Ruffalo in The Normal Heart.  (That last one was technically a TV movie, but who’s counting?)  Before that, of course, there were Heath Ledger and Jake Gyllenhaal in Brokeback Mountain and Sean Penn and James Franco in Milk.

Know how many same-sex relationships those actors have had in their collective lifetimes?  You could count them on the fingers of two fists.

Taken in isolation, this doesn’t necessarily strike me as a problem.  The names I just listed include some of the finest performers working today and I wouldn’t trade those performances for anything.  A straight person is allowed to be gay in a film.  As a wise man once said, there’s a reason they call it acting.

But that’s only one half of the issue.  The other, much more challenging part is the natural follow-up:  Where are all the gay movie stars?

Why is it that, in a supposedly liberal Hollywood in a supposedly gay-friendly epoch of American history, virtually all of the great gay and straight roles go to heterosexuals?  Is it because the major studios still treat gays the way they treat black people and women over 40—namely, as an inessential niche commodity?  Or is it simply that there are no bankable gay actors available to fill these roles?

In the Guardian interview, Damon cited the British thespian Rupert Everett as evidence that “coming out” can actually damage an actor’s career—that is, by precluding him from ever again being cast as a strong heterosexual lead, out of fear that audiences won’t buy such a character if they know the man playing him is a queer.  (“It’s tough to make the argument that [Everett] didn’t take a hit for being out,” Damon said.)

The implication is clear:  If you’re an aspiring gay actor interested in success above all else, you’re better off staying in the closet forever.  Just like in sports, high school and the Republican Party.

It’s worth noting—to use Damon’s own example—that Rupert Everett came out in 1989, which was an entirely different universe from the one we currently inhabit.  It would be ridiculous to suggest that a closeted actor’s fears of coming out today are identical to those of a quarter-century past.

That is, until you take a look at today’s Hollywood and realize how shockingly little has really changed.

Here’s a simple challenge:  Name any successful openly gay film star in, say, the last decade who has achieved his or her success in mainstream cinematic fare, post-coming out.

The list is achingly short and comes with several key caveats.  Almost without exception, the members of this elite club are either British, female and/or primarily involved in television or theater—artistic arenas that, for various reasons, are much more sexually equitable than film.  Even a certified A-lister like Neil Patrick Harris—who has proved, more or less single-handedly, that an “out” entertainer can conquer just about every artistic medium simultaneously—has yet to become anything resembling a cinematic leading man, and neither has anyone like him.

Which is all to say that Matt Damon has a point.  Even if being openly gay is not, strictly speaking, a hindrance to success in Hollywood, the evidence is pretty damning nonetheless.

That’s the bad news.  The question is whether this could ever change.  Should closeted actors continue to feign straightness to advance their careers, or are truth and self-respect more important?  It’s all well and good to value honesty and equality above all else, but when those values necessitate risking your very way of life—and a lucrative one at that—it is not irrational to hedge your bets.

And besides, for all the flak Damon has drawn for suggesting that actors should conceal their true selves from the public—up to and including their sexual preferences—the idea is not without real merit.

Personally, I think it’s kind of neat for a great actor to be utterly penetrating on the screen and a total mystery in real life.  I like the notion of actor-as-chameleon—someone, like Meryl Streep or Daniel Day-Lewis, with a superhuman ability to assume the character of anyone else but whose own character remains largely, if not purposefully, unknown.

As a rule, I frankly don’t care what my favorite movie stars do in their spare time, just as I’m not much interested in what my favorite politicians or athletes do in theirs.  While this is hardly a prevailing view in our hyper-voyeuristic culture, it’s one I would recommend all the same.

However, to advocate, as Damon did, that America’s entertainers actively withhold basic information about their personal lives in the interest of objectivity is completely insane in the context of today’s world.  While I don’t for a moment think Damon meant to come off as homophobic, the logic of his theory leads us to no other conclusion.

To wit:  When, in the entire history of forever, has a well-known heterosexual person been compelled to hide the existence of an opposite-sex spouse for the purpose of appearances?  Under what possible circumstances would this be seen as a reasonable request?  Is it not utterly demeaning to both parties to carry on their relationship behind closed doors because, hey, audiences might get the wrong idea when the next big movie is released?

It’s completely idiotic and unworkable, and a huge insult to the intelligence of American moviegoers, most of whom—I dare say—are capable of holding opposing ideas in their heads at the same time.  You know, ideas such as “Matt Damon normally makes love to a woman, but for two hours on HBO, he will make love to Michael Douglas, because that’s what actors do.”

In fact, while straight couples are never expected to keep their private affairs under wraps, gay couples are frequently under pressure to do exactly that.  Whether the pressure is external—say, having to conceal a relationship to get into the army in the “Don’t Ask, Don’t Tell” days—or internal—such as wanting to avoid an international incident with relatives at Thanksgiving—the practice of hiding major components of your day-to-day life in the interest of self-preservation has been a part of the gay experience since time immemorial, and one that most members of the gay community would be happy to put behind them once and for all.

As of late, this has certainly begun to happen in the entertainment industry, as it has in most other walks of life.  Closeted actors are coming out in greater numbers than ever before, and audiences have taken it in stride, recognizing that actors (for the most part) are human beings who are entitled to personal happiness like everyone else.

If Matt Damon wants to implore his colleagues to stop revealing so much about themselves to the press and online, he is welcome to try.  For all we know, it might restore a degree of majesty and class to this great art form, creating icons instead of mere personalities.

But let’s not kid ourselves that there is a straight line (so to speak) between being a great actor and being unknowable in real life.  Many of the greatest stars of all time had private lives every bit as lurid and public as those of today, yet audiences could somehow tune them out once the lights dimmed and the picture started.

The way Damon talks, you’d almost think he was from Mars.

Republican Holy War

The problem isn’t that Ben Carson wouldn’t vote for a Muslim president.

The problem is that few other Republicans would, either.

The problem isn’t that Donald Trump dignified the insane anti-Islam rants of some random crank.

The problem is that a massive chunk of all GOP voters share those same toxic views.

It would be bad enough if the men representing one of America’s two major political parties happened to be a bunch of xenophobic cretins.  But it’s worse than that because, as it turns out, a plurality of their fans are, too.

In other words, the GOP primary’s rank bigotry isn’t a bug.  It’s a feature.

Nor is the party’s contempt for certain Americans limited to Muslims.  At various junctures, Republican candidates have demonstrated robust, unchained hostility toward immigrants, women, homosexuals and unbelievers, among others.  And their supporters have followed them every step of the way.

Not all of them, of course.  Perhaps not even a majority.

But if there is any measurable difference between Democrats and Republicans, it is that the latter are significantly more likely to harbor open suspicion and disapproval of minorities—individually and collectively—on the basis of their minority status.

In a recent Gallup poll, we find that while 73 percent of Democratic respondents would vote for a qualified presidential candidate who happened to be Muslim, only 45 percent of Republican respondents would do the same.  Similarly, 85 percent of Democrats would vote for a gay candidate, compared with just 61 percent of Republicans.  For an atheist candidate, the split was 64 percent to 45 percent.

While those numbers are nothing for either faction to brag about, the gulf between the two is unmistakable, and it leads us to a fairly obvious conclusion:  As it currently stands, the Republican Party is a one-stop shop for paranoia, hatred and prejudice toward anyone who seems even slightly foreign to some preconceived, mythical idea of what makes someone a “real American.”

Yes, many self-identified Republicans are sane, decent folks.  Yes, there are many components of GOP dogma that have nothing to do with shunning minorities and other undesirables.  Yes, conservatism itself is still a perfectly legitimate means of thinking about the world.

And yet I wonder:  Why are there any “moderate Republicans” left?  At this point, isn’t that phrase a contradiction in terms?

Case in point:  If you happen to think that all Muslims are terrorists and all gays are perverts, then it makes perfect sense that you would align with today’s GOP.  Their values are your values.

But if you don’t think those things—if you find the denigration of entire classes of people to be juvenile, unattractive and dangerous—then why would you throw in with a political party that loudly and proudly does?

Notwithstanding whatever else you might believe—say, about taxes or foreign policy—why would you join arms with an organization that—at least in its presidential candidates—has adopted enmity and ignorance as its defining characteristics?  What’s the appeal in belonging to a gang so fundamentally unappealing?  After all, you can always vote for Republicans without being one yourself.

The explanation, I suppose, is roughly the same as why so many Catholics remain committed to their church, in spite of its history of raping innocent children and using every means necessary to cover it up.

That is:  Many people are quite skilled at keeping utterly contradictory ideas in their heads and somehow still getting through the day.  They compartmentalize, embracing virtue while ignoring or overlooking vice.

And in the end, it is religion where the Republican Party performs its most breathtaking feats of hypocrisy and self-deception.

In fact, Ben Carson’s infamous rumination on Meet the Press about the dangers of electing a Muslim president contained the most telling statement any candidate has yet made on the subject of mixing religion and politics.

To the question, “Should a president’s faith matter?” Carson responded, “I guess it depends on what that faith is.”  As far as most Republican candidates are concerned, that’s exactly right.

The GOP fashions itself as the champion of religious freedom—defender of the clause in the First Amendment that says, “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof.”

Don’t believe it for a minute.  The GOP would love Congress to make a law respecting the establishment of religion, and the only religion its leaders are interested in exercising freely is their own.

When that ridiculous Kentucky clerk refused to issue marriage licenses to gay couples because she is personally opposed to same-sex marriage, she informed the media that “God’s law” takes precedence over man’s law, and when certain Republicans defended her willful disregard of the latter, they defined her “struggle” precisely in terms of a religious war.

How often we have heard—from nearly every major and minor candidate—that Christianity is “under attack” and being “criminalized” because those who don’t believe in gay marriage—ostensibly for Biblical reasons—now have to grin and bear the fact that the Supreme Court has ruled against those beliefs.  Mike Huckabee, the self-appointed leader of the cause, said, “No man […] has the right to redefine the laws of nature or of nature’s God.”

I wonder:  What exactly is the difference between that statement and Sharia law?  The latter, of course, is the idea—popular in the Middle East—of running a legal system based on teachings in the Quran and other Islamic holy works, rather than on any precepts devised by man.

In principle, there is no difference at all.  Huckabee and the king of Saudi Arabia apparently agree that the word of God is more important than the rule of law, and that an individual’s own religious convictions can and should override any law that comes into conflict with them.

And yet—amazingly—it is these same cultural conservatives who attack and condemn Sharia law at every opportunity, insisting that some nefarious Islamic cabal is secretly plotting to bring Sharia to the United States and is this close to succeeding and—my God!—what a horrible world it would be if America became an oppressive, Bronze Age theocracy.

Read those last few paragraphs again and tell me this isn’t the most spectacular double standard in recent American politics.  Taking them at their word, GOP leaders evidently think that religion in the public square is both good and bad, that holy books are simultaneously more and less authoritative than the Constitution, and that Christians—who represent 70 percent of the U.S. population—are under threat, while Muslims—who are less than 1 percent—are on the verge of taking over the whole damn country.

The logical cartwheels in this reasoning are enough to give you whiplash.  The term “Schrödinger’s cat” springs curiously to mind.

In reality, though, the thinking is straightforward and simple, and it’s exactly like Ben Carson said:  Christianity good, Islam bad.  God is great, except when his name is Allah.

Once you convince yourself—as Carson and company have—that Islam is fundamentally incompatible with living in a free society like ours and that no individual Muslim could possibly adopt America’s values as his or her own—a self-evidently absurd idea—then it becomes quite easy to make comically hypocritical statements like the above and somehow think you’re being principled and consistent.

But these guys aren’t.  They believe in religious freedom when the religion is Christianity and when the “freedom” involves preventing gay people from leading fulfilling lives.  I’m sure the irony of the latter will sink in sooner or later, although we probably shouldn’t hold our breath.

In the meantime, we would all do well to remind ourselves that freedom means nothing if it only applies to certain people and that the United States, for all its religious citizens, does not have an official state religion and does not take sides in religious fights.

This did not happen by accident.  In the fall of 1801, a group of Connecticut Baptists sent an urgent letter to the new president, Thomas Jefferson, pleading for protection against religious tyranny by a rival sect.  Jefferson’s famous response, which guaranteed such protection, intoned that “religion is a matter which lies solely between man and his God” and that the Establishment Clause of the First Amendment amounted to “a wall of separation between Church and State.”

As Christopher Hitchens used to say:  Mr. Jefferson, build up that wall.

Stirring the Pudding Pot

Is it finally time to let boys be girls and girls be boys?

That question has been wafting across the culture for a while now.  Last week, it made it all the way to Harvard.

Or, more precisely, to a beloved Harvard institution called Hasty Pudding Theatricals.  In this case, the question is:  Can women perform in a comedy show whose entire appeal depends on its total lack of women?

To explain:  Hasty Pudding Theatricals—Hasty Pudding for short, “The Pudding” for shorter—is a 220-year-old troupe of Harvard undergrads who every year write, direct and perform an original musical farce, complete with dazzling costumes and sets, knockout song-and-dance numbers and the sorts of juvenile puns and double entendres that one can expect from America’s greatest university.  The show runs in Cambridge six days a week from early February through early March, followed by a brief tour in New York and Bermuda.

The catch—or should I say, the draw—is that, although roughly half the characters in the show are women, all of the performers are men.  Basically, it’s a drag show with a storyline and a full orchestra, and however the plot unfolds, it always ends with the entire cast in matching dresses and heels, flawlessly pulling off elaborate dance formations and Rockette-style high kicks.

In short, it’s a grand old time at the theater.  When I first experienced it in 2003, I thought it was just about the funniest damn thing I’d ever seen on the stage.  A dozen years later, my enthusiasm has waned barely at all.  For sheer cheeky ridiculousness, it’s still one of the best shows in town.

Today, however, change is in the air.  In our 21st century culture of gender equality and limitless opportunity for all, the elephant in the theater has finally been addressed:  Shouldn’t we get a few women up there on the Hasty Pudding stage?

That’s what 17 women in particular wondered this past week when they turned up to audition for roles in this year’s production.  However serious some of them might have been, several made it plain that their presence at the auditions was largely symbolic—a means of forcing the issue as to how much longer Hasty Pudding can remain a male-only clique before modernity catches up with it.

To be clear:  Hasty Pudding Theatricals is not actually an all-male organization.  For decades, female students have been equal partners in the show’s writing, music, costumes, set design and every other component of the technical and creative process.  It is only in the casting that the “no girls allowed” rule takes effect, and the reasons for this—up until now, at least—have made absolutely perfect sense.

To wit:  If the whole joke is that men are pretending to be women, how can you toss actual women into the mix while still making the joke work?  If Hasty Pudding is to welcome women into its cast, won’t it require changing the very nature of Hasty Pudding itself?

Neither of those is any great mystery.  In reverse order, the answers are “sort of” and “fairly easily.”

I mentioned how, in a typical Pudding production, half the characters are male and the other half are female.  Well, then:  Why not cast men to play the women, as usual, and then cast women to play the men?  If cross-dressing is the show’s core competence, why not take it to the max?  Why subvert the conventions of one sex when you could just as easily subvert both?

Through the prism of today’s sensibilities, it becomes evident that, by restricting its cast to only men, Hasty Pudding may well have deprived itself of a great deal of priceless comedic material during the 167 productions it has created thus far.  After all, is there any reason to think that a woman dressed as a man would be any less funny than a man dressed as a woman?

To the extent that we don’t know this already—i.e. we don’t have too many examples to draw from—we can blame several millennia of sexism that allowed men to do whatever the heck they wanted and women to do very little at all.  (In Shakespeare’s time, for instance, it was perfectly normal for male actors to play Juliet and Lady Macbeth, while female actors weren’t even a thing.)

And sure, expanding the Pudding cast would alter the club’s identity a bit and cause a certain chunk of Harvard traditionalists to bow their heads in mourning over the death of a tradition that has existed since the Van Buren administration.

But once that happens, it will almost surely give way—as every other ceiling-breaking moment has—to the collective realization that we should’ve taken care of this years ago and there’s really no excuse for why it took so damned long.

The more difficult issue, though, is whether there are instances—even today—in which discrimination based on gender is justified.  Even if Hasty Pudding doesn’t qualify, its continued existence demonstrates how the imperative of gender equality is not always as black and white as it seems.

In the 1970s, there was a prolonged, highly contentious showdown on this subject in the form of the Equal Rights Amendment.  This proposal—first introduced in 1923, shortly after women secured voting rights—stipulated, “Equality of rights under the law shall not be denied or abridged by the United States or by any State on account of sex.”

It sounds innocuous enough—a no-brainer if there ever was one—but opponents of the ERA quickly alerted the public to various unintended consequences that such a law would, or might, create.  Among these were the end of certain special protections traditionally afforded women, such as alimony and child custody, as well as the integration of all the country’s women-only colleges and universities.  After all, what are Wellesley and Bryn Mawr if not institutions that deny admission to male applicants on account of sex?  In 2015, does this really make any more sense than a college that denies admission to girls?

Then there are sports.  Professional leagues like the NFL and Major League Baseball currently contain no female players, but we all know it’s only a matter of time before they do.  Now that Kristen Griest and Shaye Haver have become the first women to survive Army Ranger School—a feat that will likely accelerate the integration of the Armed Forces themselves—systematically excluding female athletes from traditionally male sports leagues will increasingly be seen as quaint, pointless and unacceptable, as it already is in the fields of business, politics, science and the arts.

Or perhaps not.  In the end, it all depends on whether we think men and women are more different than they are similar, and whether the differences are significant enough that they need to continue to be enshrined in law.

Or, in this case, the Harvard student handbook.

Ted Talk

Over Labor Day weekend, I paid my first visit to the newest civic attraction in Boston:  The Edward M. Kennedy Institute for the United States Senate.

It was awesome.

This institution, which opened in March but was first conceived in 2003, is a great gift to the public and a noble step in the direction of fostering a more informed electorate.  And wouldn’t you know it:  In the way it has been planned and realized, it even stands a fair chance of being appreciated by the sorts of Americans it most vitally needs to reach.  Namely, young ones.

Let us begin with the center’s name, which is perhaps more revealing than it sounds.  It’s the Edward M. Kennedy Institute for the United States Senate.  That is to say, it is a museum dedicated to the history and mystique of the Senate itself—our country’s most exclusive and powerful deliberative body—but is also, simultaneously, a loving and pointed tribute to a single member of that club—namely, the youngest brother of John F. Kennedy, whose presidential library sits just across the parking lot, overlooking Boston Harbor.

This means that, like a presidential library, this new civic tourist trap is a history museum with a point of view.  In this case, the view that the U.S. Senate has, in the end, been a force for good for the American republic.  That, for all its shortcomings, the Senate is the most indispensable and effective means of making life better for ordinary citizens.  And, finally, that Ted Kennedy—who represented Massachusetts from 1962 until his death in 2009—embodied all the best traditions of Congress’ upper house and is a sterling example of what can be achieved therein.

Considering Massachusetts politics and the extraordinary influence of the Kennedy family within the commonwealth and without, it is entirely predictable that such a place would exist.

However, I am sort of curious what Republicans and other conservatives might think about this temple to progressivism and its particular spin on the meaning of America.

On the one hand, right-wingers will presumably gag at the sight of “Lion of the Senate,” a roomful of Senator Kennedy’s greatest hits, from his support for the rights of women, gays and the disabled to his opposition to the Vietnam War to his lifelong pursuit of free healthcare for every man, woman and child.

For liberals, these issues and more are the essence of good government and what it means to forge a more perfect union.

But for conservatives, Ted Kennedy has always personified what they most fear and detest—namely, the idea that the central purpose of the Senate—and of the federal government as a whole—is to create a fair and equitable society and to spend lots and lots of money doing so.  Au contraire:  Republicans, as a rule, believe government should be as unobtrusive as possible and avoid meddling in issues that should be considered on a more local level.

There’s also the fact that, in the summer of 1969, Kennedy killed a 28-year-old woman, Mary Jo Kopechne, by accidentally driving his Oldsmobile off a bridge (Kopechne was in the passenger seat), then waited nearly 10 hours before notifying the authorities.  And yet, because of his illustrious last name, Kennedy was never arrested or charged with manslaughter.  He pleaded guilty only to “leaving the scene of a crash after causing injury” and didn’t spend a single night in jail.

For skeptics, this episode is Exhibit A in the charge that the Kennedys have always thought of themselves—and been treated by others—as being above the law.  The family remains “American royalty” not just because its members are so widely admired, but also because they seem to get away with things that mere mortals do not, from stealing elections to cheating on their wives to allowing young women to drown without calling the police.

If Ted Kennedy was able to redeem himself through 47 years of genuine hard work and accomplishment on Capitol Hill—a debatable but widespread view—then the Ted Kennedy Institute justifies its existence through its loving and meticulous treatment of the house in which Kennedy served.

However you might feel about Congress today, this museum takes great pains to explain how the Senate actually functions—pulling away the curtain to reveal what the day-to-day job of a senator entails.  For a certain chunk of wonky, nerdy right-wingers—the ones who quote Alexander Hamilton by heart and carry a miniature copy of the Constitution in their pockets—this joint may just win the day.

The literal and figurative centerpiece of the Institute is a life-sized reproduction of the Senate chamber itself, complete with 100 wooden desks and a visitor gallery above, and every so often, the museum’s PA system will summon everyone in to debate a bill.  Not a pretend bill, mind you, but an actual piece of legislation that has recently been introduced in Washington, D.C., and is being marked up and argued about as we speak.  (The day I was there, the issue was whether to require vaccines for all children enrolled in Head Start programs.)

Here, as there, a suited-up museum employee reads a summary of the bill, after which two of his colleagues recite arguments for and against.  At this point, the mike is passed around the room for anyone else to add their two cents (upon being formally recognized by the presiding officer, that is).  Once that is done, everyone present whips out their tablets and clicks “Yea” or “Nay,” with the final tally emblazoned on a giant screen behind the podium.  (Admittedly, such a screen does not yet exist in the real Congress.)

The idea—simple but powerful—is that there is no more effective way to teach people the inner workings of government than to actually bring them along for the ride.  Legislating, like all activities, is best learned through active participation—rather than, say, reading a book or hearing a lecture—and democracy, of all things, should not be a spectator sport.

The people behind the Ted Kennedy Institute instinctively understand this, and their efforts at making the legislative process comprehensible to young (and old) audiences are commendable, not least because they are so rare and so desperately needed in our selfish, ignorant culture.

We are all well aware of the abysmal attention spans of today’s youth, and I cannot imagine that a conventional museum about government—with long paragraphs of prose plastered all over the walls—would quite do the trick.

To be sure, the Ted Kennedy Institute contains a great many of those paragraphs—indeed, the sheer volume of information in this building is staggering—but you feel neither bored nor bombarded by them, thanks to the inspired idea of making everything electronic.

I mentioned that the mock Senate votes are done on tablets.  In fact, the entire experience is done on tablets, because every visitor is given one immediately upon checking in.  Like audio guides at more traditional museums, these devices allow you to wander the halls at your own pace and summon the sorts of information you are most interested to learn.

Yes, there is writing on the walls—with such predictable subject headings as “What is the Senate?” “Senate Milestones” and, of course, “How a Bill Becomes a Law”—but the words dance around and are interspersed with video and graphics so that your mind doesn’t wander too far before something new pops up.  The Institute’s designers realize that quick facts and choice anecdotes are the secret weapon for getting the average American interested in otherwise boring subjects, and they have been diligent in peppering these exhibits with plenty of both.

Will it work?  Will the hordes of field-tripping schoolchildren who visit the Edward M. Kennedy Institute for the United States Senate leave with an appreciably greater knowledge of how their country runs than when they arrived?

It is hard to conceive that they won’t.  If they don’t, it’ll be their own fault, because this place tries just about everything.

To be sure, it’s a crying shame that teaching civics in classrooms is such a dying art form that it needs to be exported to cultural institutions.  But at least those institutions exist, and this one in particular seems cognizant of how urgently its services are needed.  The opening of the Ted Kennedy Institute is not going to revolutionize the level of civic engagement in America, but it’s a start.

The Beginning of an Era

Tuesday night at 11:30 is the first episode of The Late Show with Stephen Colbert on CBS.  You might as well tune in.  It’s going to become the best talk show on television—if not the best show, period—and you’ll want to be able to brag that you were present at the creation.

That may sound like a slightly cavalier prediction for a program that hasn’t even aired yet.  It’s not.  At least not for anyone who watched nearly all 1,447 installments of The Colbert Report on Comedy Central between 2005 and 2014 and has some idea of what to expect from David Letterman’s heir apparent.

Really, though, you only need to sample a few random segments of Colbert’s Report to understand what the rest of us have known for years:  Stephen Colbert is one of the great TV entertainers of our time and the most versatile comedian in the English-speaking world.

If his selection as the new Late Show host seemed odd at first, he will very quickly assuage any concerns that skeptical American audiences might have about his capacity to attract a national following.  Indeed, those in the heartland might be pleasantly surprised by how irresistible he is.

Of course, there has been an overabundance of feature stories about Colbert’s network debut in recent weeks, nearly all of which have focused on the supposed mysteriousness of who the “real” Stephen Colbert is.  Considering how his Report persona was an over-the-top caricature—a fictional character, more or less—it is a reasonable question to ask.

Or rather, it would be…if the Internet didn’t exist.

In fact, the answer to, “What is Stephen Colbert like in real life?” is—like 99 percent of all the questions ever asked by humanity—readily available online to anyone with enough energy to run a Google search.

And that is a research project I would recommend, because while seeing Colbert in character—as a willfully ignorant right-wing blowhard—is a singularly pleasurable experience, seeing him out of character—whip smart, witty and warm—is gratifying in an entirely different way.

Taking these two personalities together, it gradually dawns on you that the secret to Colbert’s success can largely be reduced to two distinguishing, interconnected and indispensable characteristics.

First—and most obviously—Colbert is a master of the art of improvisation, otherwise known as thinking on one’s feet.

And second, he is a polymath—a term most succinctly defined as “someone interested in everything, and nothing else.”

If you require concrete evidence of both and have an hour to kill, I might suggest the conversation between him and Neil deGrasse Tyson at New Jersey’s Montclair Kimberley Academy in January 2010.

Tyson—as you may know—is an astrophysicist and director of the Hayden Planetarium at the American Museum of Natural History in New York.  He is also—as you may not know—the most frequent guest in the history of The Colbert Report.

Yup:  On a show that was ostensibly about politics and other current events, it was a man of science with whom Colbert jousted more than anyone else.

Why?  Because the “real” Stephen Colbert is genuinely intrigued by the mysteries of the cosmos—and by science in general—and he quickly decided there is absolutely no reason his TV alter-ego shouldn’t engage with them as well.  True, the “fake” Colbert always approached such snooty, intellectual pursuits with disdain, but—as with everything else in that topsy-turvy world he created—the cheekiness of the act was mere window dressing for some truly enlightening exchanges.

For a classic example, consider his 2008 encounter with Stanford professor Philip Zimbardo.  When Zimbardo, promoting his book The Lucifer Effect, remarks that hell and evil were both created by God, Colbert interjects, “Evil exists because of the disobedience of Satan. […] Hell was created by Satan’s disobedience to God, and his purposeful removal from God’s love […] You send yourself to hell.  God does not send you there.”  To this, Zimbardo sheepishly responds, “Obviously you’ve learned well in Sunday school.”  With a grin, Colbert fires back, “I teach Sunday school, motherf—er!”

He does, indeed.  After a busy week of lampooning conservative ideologues before an audience of godless liberals, he retires to his hometown church to talk to children about Jesus.  What a country.

In his wide-ranging hour-long Q&A with Neil deGrasse Tyson—in which both men are free to speak as their true selves—Colbert opens with the philosophical query, “Is it always better to know than not to know?”  Tyson, the scientist, immediately responds, “Yes,” before adding, “But someone else might have a different answer.”  Then Colbert:  “For instance, Oedipus might have a different answer.”

That’s just a snippet, but it gives you an impression of the breadth of Colbert’s talents as an entertainer.  Specifically, his ability to blend genuine curiosity and intellectual depth with wholesale silliness and innuendo, all the while trusting his audience to follow along with him because, hey, if you’re savvy enough to tune in, then you’re probably also savvy enough to a) get the Oedipus joke, and b) understand why it’s not entirely a joke.

Historically, when it comes to late-night network talk shows, American audiences expect one of two things from their host:  Either that he’s strait-laced and well-informed (e.g. Charlie Rose, Jim Lehrer), or that he’s goofy, entertaining and can maybe-sorta hold his own in the face of a more serious guest (e.g. pretty much everyone else).

In this particular TV universe, Colbert will be the first to combine the best of both, and it will be a glorious sight to behold.

Best of Enemies

It’s almost too obvious to mention, but when it comes to religious liberty in America, we are in the midst of a veritable golden age.

The First Amendment to our Constitution begins, “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof,” and damned if we haven’t nailed it in the last many years.  The right to live according to the dictates of one’s faith has never been stronger, and there is little indication that this will change in our lifetimes.  As ever, we don’t realize how lucky we are.

Whether you are a Christian, a Sikh or a Seventh Day Adventist, you can travel to your place of worship on Sunday (or whenever) totally unmolested by your government or, with rare exceptions, your fellow citizens.  Observant Jews can wear kipot and refrain from eating pork, while Muslims can pray five times a day and…refrain from eating pork.

While being a member of the “wrong” religion can get you shunned, maimed or murdered in many other countries of the world, America is truly a land of pluralism—a nation that, at least on paper, protects its most vulnerable citizens just as robustly as its most populous.

Indeed, the inclination toward granting each other religious freedom is so forceful—such a prevailing view—that we are now having a semi-serious debate about whether the right to one’s faith-based opinions actually entitles an individual to break the law and deny the civil rights of other individuals.  Yes, even if that particular individual happens to work for the government.

Of course, I am referring to the one-woman crusade currently being waged by a Kentucky county clerk named Kim Davis.  As an observant Christian, Davis has refused to issue marriage licenses to same-sex couples, because doing so would violate her religious beliefs.  This in spite of the fact that, since June 26, gay marriage has been the law of the land in all 50 states.

In effect, the issue is whether the First Amendment’s “free exercise” clause can ever supersede the rule of law.  In other words, can the word of God take legal precedence over the word of Congress or the Supreme Court?

As we have seen, this question has precisely one correct answer.  By refusing to issue marriage licenses to couples who have every right to obtain one—even after the nation’s highest court explicitly ordered her to do so—Davis has been held in contempt and carted off to jail.  While, as an elected official, she cannot technically be “fired,” it doesn’t look terribly likely that she will remain in this job much longer.  And rightly so:  Why should Kentucky taxpayers be compelled to pay a clerk for not doing her job?

Much has been made of the disclosure that Davis herself has been married four times and divorced thrice.  Personally, I’m still reeling from the fact that, five months after divorcing Husband No. 1, she gave birth to twins who were adopted by Husband No. 2 but were, in fact, fathered by Husband No. 3.  (Feel free to read that sentence again.)

Of course, all of that is perfectly legal and we should never judge or make assumptions about anyone’s marital history.  Relationships are complicated, and marriage is messy even under the most ideal circumstances.

On the other hand, marital infidelity is clearly and definitively condemned in the Bible and, in Deuteronomy, is punishable by death.

Kim Davis has said she performs her official duties in accordance with the Biblical definition of marriage.  Which raises the question:  If she really means that, then why hasn’t she hired someone to kill her?

Happily for everyone, she plainly doesn’t mean it.  She is against homosexuality for reasons all her own and, like every Christian, she handpicks the Biblical passages that align with her views and ignores the ones that don’t.

This is not to suggest that her beliefs are not sincerely held.  It just means they are not held for the reasons she claims and that she is a massive glittering hypocrite when it comes to enforcing holy writ.

Of course, as an American, she is fully entitled to be the horrible person that she is and to believe whatever the hell she wants.  That’s the very definition of religious liberty and no one would dare force her to think differently.  If we all agreed about everything, we wouldn’t need a First Amendment in the first place.

However, we are nonetheless a society in which laws reign supreme over religion, and it’s precisely because we have so many different religions that can each be interpreted in a billion different ways.  While it might be amusing to imagine a culture in which everyone can ignore any rule they disagree with, the idea of actually doing it doesn’t even pass the laugh test.

Put simply:  To say the First Amendment includes the right to deny someone else a marriage license makes no more sense than saying the Second Amendment includes the right to commit murder.

Certainly, there are countries in which “the authority of God” (as Davis called it) has final say over who gets to live or die, let alone who can get married or not.  Of course, these countries tend to be predominantly Muslim and their system, known as “sharia,” is universally condemned—particularly by American conservatives—as medieval and antithetical to everything that Americans hold sacred.

How curious, then, that many of these same conservatives (read: half the GOP presidential candidates) are now defending this very same principle when the God in question is a Christian one.  How peculiar that defying settled law through Islam is repulsive, but doing the same through Christianity is just fine.  I’m sure there’s a non-racist, non-homophobic explanation for this somewhere.  As an atheist, I regret I’m not the best person to find it.

In any case, I didn’t come here to talk about Kim Davis, as such.  Really, I would just like to take a moment to underline how unbelievably lucky the gay community has been lately with respect to its would-be antagonists.

It would have been one thing if the self-appointed poster child for upholding “traditional marriage” were someone who actually engaged in the practice herself.  Someone who could credibly claim to be holier than thou.

That this particular mascot for following “God’s will” happens to be a raging phony is not merely hilarious; it also demonstrates just how phony her entire argument is.

To be clear:  Davis’ personal morality has absolutely no bearing on the legal arguments vis-à-vis her behavior as the Rowan County clerk.  Her actions would be contemptible and absurd regardless of how many husbands she has had.

That, in so many words, is the point:  The law does not care about morality.  The law exists whether you agree with it or not, and applies to all citizens equally.  Further, if you happen to be a public official whose one and only job is to carry out the law, then your opinion of the law does not matter.  Either you do your job or you resign.

But of course, this doesn’t negate the role that ethics play in our day-to-day lives, and this is where Davis has become the gay rights movement’s new best friend.

Now that same-sex marriage is legal in all 50 states—and will almost certainly remain that way forever—there is nothing left to concern ourselves with except for the proverbial “changing hearts and minds.”

And where persuading people of gays’ inherent humanity is concerned, what finer image could there be than a thrice-divorced heterosexual turning her back on a homosexual couple attempting to get married just once?  In what possible universe does the person who has cheated her way through three marriages assume the moral high ground over couples who are embracing this sacred institution afresh?  What possible threat do those couples pose to society or morality, other than the possibility that, in time, they may turn into people like Kim Davis?

The Reckoning, Part 2

In general, life is complicated.  So is politics.  And so, especially, is politics as it relates to race and class.

However, every so often a big public controversy erupts that would lead any honest person to wonder, “Is there anything here that cannot be explained by good old-fashioned racism?”

That question popped into my head multiple times during the new HBO drama Show Me a Hero, whose final two-hour segment aired this past Sunday.

This spellbinding series—the latest from David Simon, creator of The Wire—recounts the racial powder keg that exploded in the city of Yonkers, New York in the late 1980s—a socioeconomic showdown over desegregation and public housing that might well have stayed buried in the past were it not for its obvious parallels to events in the present.

Certainly, the circumstances that led the good people of Yonkers to very nearly lose their minds sprang from legitimate and complex concerns about the well-being of their neighborhoods.  But they were also—on the basis of this show, at least—born of the fact that a bunch of rich white people really, really didn’t want to live on the same block as a bunch of poor black people.

They insisted it wasn’t about race.  Of course it was about race.

Here’s the deal.  In 1985, a federal judge ordered Yonkers—a city of 190,000 immediately north of the Bronx—to build 200 units of low-income housing in and around its most affluent neighborhoods.  This was essentially a means of desegregating a community in which most of the white folks lived in the nice part of town while most of the black and Hispanic folks lived in slums.

If the city council failed to approve such a plan, the judge continued, then the city would be held in contempt and fined exorbitant sums of money until either a) the council came to its senses, or b) the city went bankrupt.

You’ll never guess what happened.

That’s right (spoiler alert!):  Egged on by their raucous, angry constituents, the Yonkers City Council voted to defy the court’s order to build public housing, thereby incurring daily penalties that soon totaled in the millions, resulting in the suspension of basic city services and the closing of several public institutions.  While the ensuing outrage ultimately forced the council’s holdouts to change their minds, the damage was done and the point was made.

In short:  The white residents of Yonkers were prepared to destroy their own city rather than have a handful of black people living nearby.

It’s almost not enough to call this racism.  It’s a psychosis that exists in a realm beyond racism—a pathology that has convinced itself that segregation is the natural order of the universe and must be defended at all costs.  And all based on the notion that one group of human beings is superior to all the others.

To be sure, there were other forces at work in this struggle.  The fourth-largest city in New York did not almost bring about its own demise solely because of abnormally high levels of white supremacy inside City Hall.  Allocating public housing in a big city is a messy and contentious business under any circumstances.  Not everyone is going to be treated fairly.

Indeed, the “official” argument against desegregation in Yonkers was economic:  If you move a bunch of lower-class families into an upper and middle-class neighborhood, the overall desirability of that neighborhood will decline, and property values will slide right along with it.  If you’re a homeowner who plans to sell one day, of course you want to prevent a precipitous decline in your home’s value in whatever way you can.

But in watching Show Me a Hero, you cannot help but suspect that racism is always, finally, at the root of the problem.  That if people viewed each other as equal human beings, rather than as members of alien tribes, then most of the other conflicts would either cease to exist or become infinitely easier to resolve.

The most compelling evidence for this is the character of Mary Dorman, played with great subtlety by Catherine Keener.  As one such homeowner, Dorman begins as a vehement opponent of the low-income housing plan, publicly carping about property values and the like, while privately confiding to her husband, “These people, they don’t live the way we do.  They don’t want what we want.”

But then something unexpected happens:  She starts spending time with “these people” as a member of the transition committee—a group that essentially handpicks which families will get to move into the new townhouses—and she discovers that, lo and behold, poor black people do want what “we” want and do live the way “we” do, to the extent that their circumstances allow it.

Now, about those circumstances.

We take it as a statistical truth that poor neighborhoods in big cities are disproportionately non-white and contain disproportionately high levels of crime.  That’s to say nothing of how this affects incarceration rates and the chances of success in higher education and employment many years down the trail.

The $64,000 question is:  Why might this be?  How did it happen that folks with darker skin are—by a huge margin—more likely to find themselves impoverished, unemployed or in jail?  Are black and Hispanic people inherently lazier and more violent than white people, or is there something more institutional at work?

Following many decades of study and a little bit of common sense, we find the answer staring us directly in the face.  While there are multiple layers, it can essentially be explained in two words:  housing discrimination.

As Ta-Nehisi Coates definitively showed in his devastating Atlantic cover story, “The Case for Reparations,” white people and the U.S. government spent a great deal of the 20th century actively preventing black people from ever owning a home—and, consequently, from accumulating real wealth.

Through the process of “redlining,” black house hunters were shut out of entire neighborhoods in most major U.S. cities, and in the places they were allowed to live, they could not obtain regular mortgages and had to depend on loans that were neither guaranteed nor honestly granted.  In an interview, Coates described this system as having combined “all the problems of renting with all the problems of buying and none of the rewards of either.”

In other words, housing segregation occurred by design, not by accident.  It had nothing to do with the personal behavior of the black folks who were being victimized, and everything to do with an effectively white supremacist government that made it very nearly impossible for African-Americans to achieve the American dream.

After nearly a century of this madness, to turn around and blame it all on black people who wear their pants too low is to betray a spectacular historical ignorance that, in our culture, is more or less par for the course.

Indeed, here is a classic example of where basic knowledge of the past can yield intelligent decisions in the present and future.

Most critically, to know that housing segregation was a plot intended to keep black people out of polite society is to understand that desegregation is a national moral imperative—one small step in our collective reconciliation with America’s broken soul.

Once you grasp that our country’s appalling wealth gap is a direct consequence of that racist system and that narrowing the gap will improve the quality of life for everyone, then it becomes perfectly sensible to expand affluent neighborhoods to include residents who, in an equal society, would have gotten there anyway.

In the process, both groups will get to know each other on a one-to-one basis, which is the surest means, in any society, of reducing prejudice and fear.  It was no coincidence that support for same-sex marriage skyrocketed at the same time that gay people made themselves visible to straight people in record numbers, thereby implanting this crazy idea that we are all equally human.

Prejudice is a function of ignorance, which in turn is a function of physical separation among different groups of people.  Really, it’s all just a variation on fear of the unknown, and the way to eradicate that is to make the unknown known.

This doesn’t mean we’re not still going to hate each other from time to time.  It just makes it far more likely that we’ll hate each other for the right reasons—namely, for the content of our character, rather than the color of our skin.

The people of Yonkers learned this the hard way, but they learned it nonetheless.  While housing desegregation might not have solved all of that city’s problems, it nonetheless fostered a more open and integrated community in which a greater number of people had a fair shot at making a better life for themselves.

Call me naïve, but I consider that progress.