Sorry Seems to Be the Hardest Word

Why can’t men ever admit when they’re wrong?

I do not consider myself a particularly quick-tempered individual, but I do experience acute irritation at the sight of somebody vigorously denying a fact that is sitting directly in front of his nose, refusing to accept responsibility for even the most obvious indiscretions.  Indeed, this being the case, I’m amazed I’ve followed politics for as long as I have.

America is a land of far too much certainty and not nearly enough doubt.  In our leaders we look for unwavering strength, which we have somehow conflated with a lack of introspection.  We don’t mind a president who occasionally commits an error, but he had better be prepared to deny it at every turn.  God forbid he acknowledge he is human and learn from his mistakes.

Politics means never having to say you’re sorry.

Or perhaps not.  Toward the end of Oliver Stone’s Nixon, as the walls of Watergate are closing in and Richard Nixon finds himself shackled by the weight of his own stubbornness, his chief of staff H.R. Haldeman muses to John Ehrlichman, “Eight words in ’72:  ‘I covered up.  I was wrong.  I’m sorry.’  The American public would have forgiven him.”

A tantalizing thought, is it not?  At the moment Watergate broke, possibly they would have.  After all, Nixon had just won the election of 1972 by a score of 49 states to one; sentiment was on his side.  That old maxim, “It’s not the crime; it’s the cover-up,” suggests nothing so much as nipping a problem in the bud, before it takes on a life of its own.  Instead, knowing Nixon’s character as well as anyone, Ehrlichman can only respond, “Dick Nixon apologize?  That’ll be the day.”

If Nixon’s fatal flaw was pride, for many pols today it is fear—namely, fear of the people’s wrath.  Fear of being seen as weak and easily cowed.  Fear of shame and disgrace and consequences unknown.  And all predicated, of course, on the fear of losing one’s job.

What these representatives of ours do not consider is the possibility that this public they so dread can smell it on them at first whiff, and is not terribly impressed.  Maybe—just maybe—the American infatuation with strength-through-certainty has been vastly overstated over time, and today is but a figment of lawmakers’ imaginations.  They project onto us their own inability to handle the truth.

Long stretches of the George W. Bush administration seemed rooted in the assumption that it was preferable to be viewed as wrong rather than reflective, particularly regarding the Iraq War, leading to denial after denial of basic facts about WMDs long after the original assertions had been proved false.  Bill Clinton, with his well-known rolodex of extramarital conquests, opted in 1998 to stonewall and proclaim that Monica Lewinsky was not one of them, rather than cop to a fact that any American could have easily guessed.

With this behavior, did Clinton and various Bush officials not look like unholy fools?  Further, there is little evidence their anti-apology wagers did them any good in the first place:  Upon finally apologizing for the whole messy affair, Clinton’s famously high approval ratings barely nudged, while Bush’s began a steady decline, in step with ever-worsening news from Iraq, from which they never recovered.

Mitt Romney, nebulous on most other topics, has clearly and tirelessly made a point of “not apologizing for America.”  Indeed, he wrote a whole book titled, No Apology.  While apologizing on behalf of the United States is not quite equivalent to apologizing on behalf of yourself, the underlying attitude is the same.

Saying that the U.S. government should not apologize under any circumstances, Romney can mean only one of two things:  Either that the United States never does anything wrong, or that it is simply not responsible for acknowledging when it does—justified, in Romney’s mind, by the presumption that America errs only with the best intentions.

In fact, President Reagan in 1988 apologized on behalf of the U.S. for Japanese internment during World War II.  Ten years later, President Clinton offered remorse for America’s lack of intervention in the genocide in Rwanda.  In June 2009, the U.S. Senate passed a resolution that apologized for slavery and Jim Crow.  Further examples abound.

In other words, contrition in government has its precedents and its benefits.  It is downright silly to act as if it does not.  I say, then, that men ought to show a bit more respect for their fellow primates by not concealing or embellishing when they err, but rather fessing up and acknowledging reality when it creeps up and bites them in the patoot.

Team Players

“Take one for the team.”

Anyone who ever played Little League baseball understands the phrase, along with pretty much everyone who didn’t.

We have heard it quite frequently during the 2012 campaign.

Paul Ryan and his defenders, faced with Ryan’s uncomfortable history of voting for various debt-inducing spending sprees during the George W. Bush administration, have explained that although Ryan is personally against runaway government spending, he cast such votes in the spirit of Republican solidarity.  He was taking one for the team.

Todd Akin, the Missouri congressman who recently suggested women’s bodies can somehow ward off pregnancy in cases of “legitimate rape,” has faced near-universal pleading from fellow Republicans to cede his Senate contest to a slightly less radioactive candidate, to ensure the seat is won by a Republican.  He should take one for the team.

This adage—succinctly defined by Urban Dictionary as “willingly making a sacrifice for the benefit of others”—carries undeniable appeal when viewed from a certain angle.  The image of self-sacrificial nobility, of Rick Blaine on the runway insisting, “The problems of three little people don’t amount to a hill of beans in this crazy world,” serves as a healthy rebuke to our culture of ruthless ambition and loose morality.  We as a society would probably be poorer without it.

It also, unfortunately, demonstrates much that is wrong with government today and why those of us who abstain from partisan affiliation find the state of our two-party system so repulsive.

First, we must explain what we mean by “team.”  In baseball, “taking one” usually refers to being hit by a pitch or tagged out in such a way as to advance the base runners and win the game, at the expense of one’s own stats or even physical safety.  We can argue about the morality of this (particularly when it involves 9-year-olds), but the logic is sound, as everyone on the team has the same goal, namely winning.

A team of lawmakers does not operate in quite the same way, despite the media’s tendency to cover politics as if it were a sporting event.

Who are the politicians we say we most esteem?  Nearly every voter, if asked, claims to hold special admiration and respect for those known for being “independent,” who will “stand up for what they believe” and “put country ahead of party.”  That is to say, for men and women who are not “team players.”

Members of Congress are individuals, each with his or her own views and each representing a specific piece of our absurdly diverse country.  They, like the president, swear an oath of loyalty to exactly one entity:  The U.S. Constitution.  Beyond that, it is left entirely to the individual to decide whether to err on the side of one’s party, one’s constituents or one’s own conscience, should any of these conflict.  Very rarely do they not.

Senator Scott Brown (R-MA) reacted to the Todd Akin brouhaha by issuing a letter to RNC Chairman Reince Priebus that read, “Even while I am pro-choice, I respect those who have a different opinion on this very difficult and sensitive issue […] Our party platform should make the same concession to those of us who believe in a woman’s right to choose.”

As a Massachusetts Republican up for re-election, Senator Brown has the unenviable task of appealing to an exceptionally liberal constituency without incurring the ire of his own exceptionally conservative party.  The abortion issue crystallizes his dilemma:  Whatever his personal views, he would very likely never have been elected in the first place had he toed the party line; meanwhile, his failure to adopt this most elementary of Republican doctrines effectively tars him as a RINO (Republican in Name Only) in the minds of party elders, ensuring he will never be fully trusted as a national figure within the party.

By contrast, the Next Vice President of the United States is very much a head honcho in the GOP, first by virtue of his across-the-board party-line views on social issues, and second for his stated opposition to all things big government.

Interestingly, in one such instance of Paul Ryan betraying his apparent anti-spending principles—voting “aye” for an auto bailout in December 2008, against a heavy majority of fellow Republicans—he explicitly explained that his concerns were local:  “At the forefront of my mind are jobs in Southern Wisconsin,” he said, understanding that “taking one for the team” would have come at the expense of a whole lot of regular Joes who don’t necessarily sympathize with Ryan’s usual doctrinaire approach to government spending.

Was this not the correct answer?  Should not a U.S. representative vote in the interests of those he or she represents, regardless of party affiliation and whatever damage it might do to the “team”?  And if that’s the case, what use is the team in the first place?

Romney and the ‘Image’ Thing

On the occasion of next week’s Republican National Convention in Tampa, which opens Monday and concludes Thursday, the New York Times this week ran a behind-the-scenes feature detailing the convention planners’ foremost challenge:  “Selling” Mitt Romney to the public, in the hopes of “paint[ing] a full and revealing portrait of who Mitt Romney is.”

As the Times duly notes, this will prove a rather delicate task, as Romney has made something of a second career of attempting, like so many poor souls before him, to be all things to all men.  Much has been written, for the better part of the half-decade that Romney’s been running for president, about his tendency to change his public views on a given subject at a moment’s notice, implying that he has no core and is a pure opportunist.

To then make him appear “genuine,” as it were, is not so much a herculean task as an impossible one.

Taking Romney’s political shape-shifting as a whole—that is, as a man who generally portrayed himself as a so-called “moderate” while governor of Massachusetts and now generally portrays himself as “severely conservative”—one of four things has to be true.  One, he really was a moderate who experienced a series of conservative epiphanies in a very short time; two, he is a moderate pretending to be conservative; three, he is a conservative who formerly pretended to be moderate; or four, he has no political convictions of any kind.

Of course, I have just addressed two separate and distinct matters:  First, whether Romney is a genuine person; second, whether Romney appears to be a genuine person.  The convention is responsible for the latter; only Romney himself can do anything about the former.  Conceivably—since 21st century politics is so very much dependent on perception rather than reality—if a candidate succeeds in appearing authentic, he therefore is authentic.

It’s the essence of cynicism:  Purposefully deceiving voters based on the wager that they’re not informed or savvy enough to realize it.

A textbook example:  Barack Obama on same-sex marriage.  You will recall how the president announced his support for marriage equality this past May, following nearly an entire term of “evolving” on the issue.  Intriguingly, as a state senate candidate in 1996, Obama wrote in a questionnaire, “I favor legalizing same-sex marriages, and would fight efforts to prohibit such marriages,” before publicly opposing them during his runs for the U.S. Senate and the White House.

The president would have us believe that his history with the subject is on the level—that he is, effectively, the only person in America to have been for gay marriage before being against it (before being for it again)—and how dare anyone infer political calculation along the way!

Yet Obama more or less got away with this, largely because so many supporters acted as co-conspirators at every juncture, figuring that a leader who publicly opposed (but privately supported) marriage rights was preferable to one who opposed them both publicly and privately—the good old “lesser evil” theory at work.  The president’s flagrant cynicism was a big, fat open secret to friend and foe alike.

That’s how Romney now seems to me.  All but the lowest of the “low-information voters” (the current shorthand for the dumb and the apathetic) seem to have gotten the memo that the former governor plays fast and loose with his political convictions and that, if elected, his policies as president are therefore very difficult to anticipate.

So why deny it?  Isn’t this always the case, anyway?  Has a president ever carried out policy the way he vowed to do as a candidate?  Like a major league slugger, a president who succeeds one-third of the time (as Obama currently has) can reasonably be considered an effective commander-in-chief.

The reason for this, as we so often forget, is that the power of the presidency is far more limited than the campaign season would suggest.  As President Obama quickly learned, even the most disorganized and unpopular Congress in the modern era can turn any major initiative into a crumbling bridge to nowhere.

Faced with this particular reality, it would be my fondest wish—my Aaron Sorkin/Will McAvoy fantasy—for Mitt Romney to come clean and present himself exactly as he is:  A man who wants to be president badly enough that he will bend his views to suit the circumstances of the time and the people whom he will serve.  To embrace his perceived weakness and argue—as he very plausibly could—that it is actually a strength and a necessity.  To have the courage of his convictions to say that he does not have the courage of his convictions.

Now there’s an image for you.

A Webless World

I was slightly delayed in posting my most recent column to this page because, due to a minor snafu with my apartment’s cable provider, I was denied access to the Internet for a full 48 hours in the middle of last week.

When I first switched on my computer and discovered this lack of a connection, not knowing the cause or how long the outage might persist, I proceeded to act as any grounded, technologically attuned twentysomething would in such a circumstance.

I freaked out.

Eventually I calmed down, knowing the blackout would not last forever, and took to my traditional rotation of break-glass-in-case-of-emergency time-fillers, such as going for a walk and (I shudder at the thought) reading a book.  When I couldn’t take it anymore, I whipped out my trusty Android and, through its cellular data connection, surfed the web.

The question asks itself:  What would we ever do with ourselves without the Internet?  Not for the piddling two-day window I experienced—rather, forever.  My peers, of the so-called Millennial Generation, have the distinction of remembering what life was like before the ubiquity of the web, yet we are utterly unable to picture—and function in—such a world today.  We are fond of asking our parents, “How did you survive without the Internet?”  Well, how did we?

At the risk of generalizing:  Americans have proved themselves impressively adept at adapting to new ways of life, particularly in the field of technology.  Baby boomers have hopscotched from one means of listening to music and watching movies to another with admirable grace.  Of course, the young plow through ever-snazzier iterations of telephone, computer and music player at dizzying speeds.  And while old folks might resist the clarion call of newfangled gadgetry more than the population as a whole, there are nonetheless far more grandmas and grandpas with Kindles and Facebook accounts than any of us could have foreseen, say, five years ago.

But adapting backward?  Maybe not so easy.

It was George Carlin who theorized that, should the entire world suddenly be denied electricity, it would take all of two years for our planet to plunge back into a barbaric caveman hellhole, so ill-prepared to cope would we be.  Never mind that Homo sapiens existed in so-called “behavioral modernity” form for 50,000 years prior to the industrial age.  Once we have effectively outsourced to computers all human activity that used to be done manually, to then revert to our old ways…well, as the lady said, “How ya gonna keep ’em down on the farm after they’ve seen Paree?”

The truly frightening thought, then, is that this is such a truly frightening thought.

You don’t need me to count the ways in which, and the degree to which, the Internet now controls every one of our lives—in both a micro and macro sense, in activities ranging from the essential (buying 12 jars of peanut butter from Amazon.com) to the more mundane (regulation of the global economy).  As conventional wisdom has it, the primary function of the Googletube has not been to introduce mankind to wholly new activities and concepts, but rather to make everything we were already doing exponentially quicker, easier and (on a good day) cheaper.

Has it been worth it so far?

Nicholas Kristof wrote one of the more enjoyable New York Times columns of recent weeks, titled, “Blissfully Lost in the Woods,” in which he recounted a recent 200-mile hike he took with his daughter, exalting the great wilderness as “an antidote to our postindustrial self-absorption.  It’s a place to be deflated, humbled and awed all at once.”  He mourns the declining interest in activities such as fishing, hunting and backpacking, suggesting implicitly there is more happiness to be found out in the wild than in the uber-connected virtual metropolis we inhabit most of the time.

Certainly the notion of Getting Away From It All is nothing new, but the need to embrace it seems to grow stronger with each successive generation.  After all, when Henry David Thoreau wrote Walden, Thomas Edison was seven years old and tweeting was strictly for the birds.  Compared to our own time, how different from the bustle of downtown Concord in 1854 could the woodlands immediately outside downtown Concord possibly have been?

Today, precisely because our society has so dramatically advanced in so little time, living in any kind of serene technological austerity is an exponentially more radical departure from the norm than it ever was in the past.  For that reason, engaging in exactly that behavior—if only for short bursts every so often—is all the more desirable, if not outright essential, for our own well-being and perspective.

What I learned, in short, is not to wait for the next severance in the Internet-time continuum to appreciate an existence without a web hookup sitting so reliably on my desk or in my back pocket.  Our species plugged along without either for an impressive span of time; if compelled, we could probably do it again.  It is ingrained in our DNA:  Sooner or later, you have to answer Nature’s call.

The Unbearable Whiteness of Being

The jokes came fast and furious from all quarters, but leave it to the Onion to have the last word.

“Mitt Romney, Paul Ryan To Awkwardly Hug, High Five For Next Three Months,” read the headline on the satirical news website, above a photograph of the Republican candidates for president and vice president more or less doing exactly that.

This came a mere week after another Onion zinger at Mitt Romney’s expense:  An article titled, “Romney Stuck In Endless Loop of Uncomfortable Chuckling,” also with an accompanying image so pertinent and amusing you almost can’t believe it’s real.

You can’t say satire does not imitate life:  It took only until Monday for an actual piece of reporting in the New York Times to include the sentence, “[Romney and Ryan] rarely left each other’s sight, exchanging hugs, backslaps and knowing smiles, as if they knew their time together might be short-lived.”  Is that not the most precious thing you ever did read?

Amidst the arguments that have begun in earnest about issues of great national consequence—nearly all of them concerning our highly tenuous economy—the newly chiseled fact of a Romney-Ryan GOP ticket additionally sears back into the collective American unconscious the Republican Party’s most enduring image:  Gawky, middle-aged white male dorks.

For quite a while now, there has existed a state of affairs in the Caucasian community (if I may employ such a term) whereby nearly all of its members have tacitly agreed that black people as a group are in every way hipper than white people as a group.  In a bit from the 1990s, comic George Carlin put it simply:  “White guys, let me tell you something.  You’re never going to be as cool as black guys.  You’re white, and you’re lame.”

I have often wondered whether a favorable characterization of a racial or ethnic group (e.g. “black people are cool”) is still, technically, a form of racism.  A noteworthy distinction between the two major political parties is that Democrats tend to go out of their way to single out minority groups for praise, while Republicans—in line with their stance against affirmative action—try to act as if racial and ethnic categories do not exist, or at least should not determine how individuals are judged.

As we, as a culture, continue to sort this out, we can probably agree on the slightly less controversial proposition, widely practiced, that members of a group are allowed to make negative characterizations about themselves (e.g. “white people are lame”), especially if that group wields a hefty majority in the population.

To my knowledge, no one has ever accused Mitt Romney of being cool.  You might recall the moment during the 2008 campaign when Romney, standing for a photo-op with a group of black teens, spontaneously burst into a non-melodic rendition of “Who Let the Dogs Out?”  Priceless example of white dorkitude that it was, it illustrated how being merely a nerd (which can be adorable) is not half as nauseating as being a nerd who lacks basic self-awareness.

Romney’s new running mate, Rep. Paul Ryan of Wisconsin, is a tougher case to crack, making me wonder whether there is a certain calculation behind the GOP team’s uber-vanilla image.

Accepting his appointment, Ryan took the stage last Saturday proclaiming, “I am deeply honored and excited,” in that disarming, matter-of-fact way that only a man who has spent a lifetime running for class president can.  To see him discuss economic policy with the likes of Charlie Rose or Chris Matthews is to witness an acrobat at work:  He did his homework and, gosh darn it, he is going to let you know it.  He is truly a policy wonk, as both friends and adversaries will attest, but then you read the latest column by Maureen Dowd and reflect that his frequent calls for non-ideological bipartisanship are just the slightest bit disingenuous.

Ryan, not unlike President Obama, depends for much of his appeal on his own personal charm—appearing to care more about policy than politics, while arguably being more adept at the latter.

That is to say that, when it comes to himself, Ryan is an expert at image-making, and that is why I pause to entertain the notion that his and Romney’s friendly-neighborhood-white-guys motif is not quite as organic as it might appear.  That their quasi-unthreatening demeanors are a performance.

After all, they are running against the black guy this November.  Gambling that the stereotypes are true, they have perhaps determined that since out-hipping Obama is not a feasible option, their surest bet is instead to present America with a choice:  Elect us, and you can be certain the executive mansion at 1600 Pennsylvania Avenue will be white as snow, both within and without.

Hitch-22

It’s the old story.  Boy meets girl, boy casts girl, boy sexually harasses girl, boy is hailed as a master of his craft and is lionized from coast to coast.

On October 20, HBO will premiere a film called The Girl, which reportedly chronicles the relationship between director Alfred Hitchcock and actress Tippi Hedren during filming of Hitchcock’s The Birds in 1963.  Hitchcock has long been known for his tough, even callous treatment of his actors—particularly his legendary “blondes”—but The Girl and Hedren herself charge him with behavior far worse than previously understood (at least by me).

A scene in the film reportedly has Hitchcock (played by Toby Jones) coming on to Hedren (Sienna Miller), demanding she “make yourself available to me sexually.”  When she resists, Hitchcock retaliates by essentially torturing her with the many live and prop birds used in filming The Birds, at one point having one hurled at her, without warning, through the glass of a telephone booth.

“We are dealing with a brain here that was an unusual genius, and evil, and deviant, almost to the point of dangerous,” says Hedren, adding, “If this had happened today I would be a very rich woman.”

For us onlookers, sitting in judgment, it’s a tough but regularly-occurring conundrum:  When an artist has done despicable things, are we still allowed to admire him for his art?  Is it possible to separate the two and, if so, how?

The dilemma of appraising the flawed artist springs up in many different contexts, and it is essential to treat each case on its own, specific terms.  Details matter.

A major consideration, to be sure, concerns the nature and severity of the indiscretions themselves.  For instance, we can watch Mean Girls without caring one whit about Lindsay Lohan’s troubles with alcohol, or listen to Bruno Mars despite his brief history with cocaine.  We understand the usual corrupting influences of show business and recognize we’re all human.  Not to mention that the substance of their art is not thematically connected to the substances in their digestive tracts.

Far more problematic, say, is the case of Roman Polanski, the renowned film director who in 1977 was charged with drugging and sexually assaulting a 13-year-old girl.  Following a plea bargain in which the rape charges were dropped and he pleaded guilty to “unlawful sexual intercourse with a minor,” Polanski fled the United States for France, which does not extradite its own citizens, where he remains to this day.

Polanski’s crime was hideous—unforgivable to most.  All the same, it was no more directly related to his work than pop stars’ drinking and drug habits are related to theirs.  So it is still possible to view and appreciate (and award Oscars to) films such as The Pianist and Chinatown without becoming completely overwhelmed by the atrocities of their creator.

With Hitchcock, the circumstances are more complicated still and, for an admirer of his such as myself, more painful to consider.  You see, it is part of the legend—in now-classic films such as Vertigo, Psycho and indeed The Birds—that Hitch wrenched great performances from his leading ladies essentially by treating them like dirt on set.  Accused of having quipped, “All actors are cattle,” he dryly clarified, “What I said was all actors should be treated like cattle.”

In light of Hedren’s recollections, the legend is somehow not as charming as it once was.  Can we really now watch The Birds in forced ignorance of the real-life torment to which Hedren was subjected by her boss as soon as the scene cut?

What is especially disturbing, now that details of the ordeal have been planted directly in front of our noses, is how, in a way, we knew about this the whole time and chose to overlook it—to convince ourselves Hitchcock’s behavior could not have been as inappropriate as it evidently was.  As Hedren suggests, the ways in which men publicly humiliated women in the Mad Men days of the early 1960s, like other mass sins we might cite, were simply accepted as normal in their time, but were no less painful for those upon whom they were inflicted.

But here is a tougher question:  Setting aside the charges of outright physical abuse, how far do we permit our artists to go in the interests of creating great art?  What do we make of the prospect that Hedren’s performance in The Birds or Kim Novak’s in Vertigo (a film recently voted the greatest of all time) is so affecting as a direct consequence of the ways Hitchcock mistreated them during filming?  Are we altruistic enough to say we’d rather not have the films at all, for the sake of their stars’ well-being?  To what extent do the ends justify the means?

It is a shame, in a way, that Hitchcock died in 1980 and cannot be made to answer for these alleged acts.  All the same, it seems as fitting as ever that, when asked what he might want inscribed on his headstone when the time came, he volunteered, “This is what we do with naughty boys.”

Stranger in Your Own Land

I could never be a practicing Sikh.  I do not have the courage.

Following last Sunday’s shooting at a Sikh temple in Wisconsin, I stumbled upon an excellent short documentary from 2004 called, “Dastaar: Defending Sikh Identity.”  Directed by Kevin Lee, the film encapsulates the uncommon struggle of the India-based religion here in the States, particularly in the aftermath of 9/11, emphasizing the significance of the dastaar, or turban, as a symbol of the faith.

The essence of the issue is that Sikhism in America has fallen victim to a cruel and ironic case of mistaken identity:  Because its adherents wear turbans—as they are commanded to do—certain armed geniuses assume them to be members of the Taliban, al Qaeda or some other manner of Muslim.  In point of fact, Sikhism has no relationship whatever to any Islamic sects, except (ironically again) for having fought the occasional war against them.

While violence against American Sikhs was not unheard of prior to September 11, 2001, the attacks and ensuing paranoia about foreign-looking men with headgear engendered this confusion and produced a spike in high-profile “hate crimes” that continues today.  The Sikh Coalition, founded in New York on the night of 9/11, says it has received “more than 1,000 complaints of violence or discrimination against Sikhs since September 11, 2001.”  Prabhjot Singh, the organization’s operations director, very eloquently laments, “We’ve been attacked twice.  Once by the terrorists, and then by fellow Americans.”

“I can’t hide who I am,” Prabhjot Singh says in the film, underlining his faith’s present dilemma.  “I am a Sikh because I wear a turban.  I am identified as a Sikh.”  Amardeep Singh, the Sikh Coalition’s legal director, is even more direct:  “Our articles of faith require us to stand out.”

This is the component of the whole mess that I find most admirable, if not outright heroic.

The United States, for all its official freedoms, is a culturally conformist place most of the time.  To dress differently—in whatever way and for whatever reason—rarely endears one to the community.  When it doesn’t lead to violence, it leads to ridicule and suspicion or, at the very least, a general air of discomfort and unease from one’s fellow travelers.

In light of so many Americans’ misplaced and hostile perceptions about those with turbans, beards and long, flowing robes, it requires genuine nerve to assume the outfit nonetheless, risking safety for piety.  It is a tradeoff no law-abiding religious or ethnic group should be forced to make, so we must bow our heads in reverence to those who are so compelled and choose, with the utmost integrity, to err on the side of danger.

I have never had this problem.  I count myself a member of two religious minorities—I am a Jew and an atheist—but neither one enjoins me to proclaim it on a daily basis; I can keep either a secret if I want, and often do.  Observant Sikhs, with a great deal more to fear from exposure, are not afforded such a privilege.

For perspective:  Any discussion of religious persecution should include the disclaimer that, as a country in which to freely exercise one’s faith, the United States ranks higher than most.  We can be thankful that crimes against members of a religion are, in fact, looked upon as crimes (not all societies can make such a claim).  We are plagued by dumb people, rather than dumb laws.

Nonetheless, within our borders the Sikh community faces an unceasing and singular conundrum that it did nothing to deserve and, on balance, has handled with charity and grace.  It merits all the support it can get, and this is as good a week as any to say so.

It is sometimes the duty of the coward to sing the praises of the brave.