A Year at the Museum

I’ve been an active museumgoer for many years, but in 2015 I hit critical mass.  If my ticket stubs can be trusted, I have visited some 30 museums and galleries since the first of the year, eight of which I had never set foot in before.

What singular pleasures did these institutions bring?  I’m glad you asked.

In no particular order, and with apologies to those who reside well outside the Northeast Corridor:

The Worcester Art Museum—50 miles west of Boston—recently acquired a warehouse full of armor and armaments—including a line of samurai swords—from the defunct Higgins Armory across town; the collection is now on view in an ongoing exhibit simply titled “Knights!”

In the town of Lincoln, the deCordova Sculpture Park and Museum—just down the road from Walden Pond—put on a group show, “Walden:  Revisited,” with local artists’ interpretations of Henry David Thoreau’s 1854 memoir about living for two years in a small, remote wooden cabin “to front only the essential facts of life.”

In Thoreau’s hometown of Concord, the Concord Museum organized “The Art of Baseball,” guest-curated by Doris Kearns Goodwin, featuring everything from baseball-themed paintings and sculpture to Carlton Fisk’s catcher’s mask and a set of World Series rings.

Boston’s Museum of Science mounted a spring exhibition, “Maya:  Hidden Worlds Revealed,” with a bustling collection of artifacts and reproductions from the storied Mayan civilization that thrived across Central America throughout the first millennium of the Common Era.

Also in Boston, the Massachusetts Historical Society presented “God Save the People!  From the Stamp Act to Bunker Hill,” which used primary documents to chart how the idea of the American Revolution came to be, long before independence was formally secured on the field of battle.

In the same vein, the Boston Public Library hosted “We Are One:  Mapping America’s Road from Revolution to Independence.”  Meanwhile, in the library’s Leventhal Map Center, there was “Literary Landscapes:  Maps from Fiction,” which included blueprints of such places as Oz, Narnia, Hogwarts and the Hundred Acre Wood.

A few miles north in Cambridge, the newly reopened Harvard Art Museums is in the final days of “Corita Kent and the Language of Pop,” showcasing a lifetime’s worth of output from the nun-turned-artist most famous for sprucing up a giant gas tank along Route 93 in Dorchester.

Up in Salem—site of the witch trials and the Halloween capital of the world—the Peabody Essex Museum gave us “American Epics:  Thomas Hart Benton and Hollywood,” covering multiple facets of the prolific artist’s oeuvre, including children’s book illustrations, Hollywood movie posters and large-scale murals depicting scenes from American history.

Boston’s Museum of Fine Arts, which commissions dozens of special exhibits every year, hit the jackpot with “Class Distinctions:  Dutch Painting in the Age of Rembrandt and Vermeer,” whose 75 pieces—most on loan from other institutions—are as majestic and absorbing as any of their kind that you’ll see in one place.  The show runs through January 18.

Until January 10, at Phillips Academy in Andover—alma mater of both Presidents Bush—the Addison Gallery of American Art has “Converging Lines:  Eva Hesse and Sol LeWitt,” demonstrating the friendship and creative influence—in multiple media—between those two innovators throughout their respective careers.

Boston College’s McMullen Museum presented “John La Farge and the Recovery of the Sacred,” implicitly arguing that the late 19th century painter and stained glass designer has been criminally underappreciated in the American art canon.

In Framingham, the Danforth Museum held the “New England Photography Biennial,” featuring the most arresting work—of every genre—by photographers all over the region.  Around the same time, the Griffin Museum of Photography in Winchester crowned the winners of an annual juried competition of its own.

Out in the Berkshires, the Clark Art Institute organized “Van Gogh and Nature,” a sumptuous collection of landscapes spanning the entire (albeit tragically short) career of America’s favorite Dutch Post-Impressionist.

In the nearby town of North Adams, the Massachusetts Museum of Contemporary Art—Mass MoCA to you—granted considerable space for “Jim Shaw:  Entertaining Doubts,” a multimedia extravaganza that riffs on everything from the death of Superman to the life of Dan Quayle.  It continues through January 31.

Outside of New England, the Museum of Jewish Heritage near Manhattan’s Battery Park chronicles “Nazi Persecution of Homosexuals, 1933-1945,” pointing out that Jews were not the only group that suffered systematic discrimination and mass murder under the Third Reich.  The presentation was set to close in January but has been extended to February 29 due to popular demand.

Farther south, until Saturday, the Library of Congress in Washington, D.C., presents “The Civil Rights Act of 1964:  A Long Struggle for Freedom,” a frightfully timely journey through the whole history of the fight for racial justice, which, as we are reminded, began more or less exactly when the words “all men are created equal” were put to paper in 1776.

A few blocks west, the National Museum of American History just opened a new wing devoted to American innovation, covering everyone from Benjamin Franklin to Steve Jobs.

And through January 24 in the middle of nowhere (actually, Hartford, Connecticut), the recently renovated Wadsworth Atheneum—the oldest continuously operating public art museum in America—has “Warhol & Mapplethorpe:  Guise & Dolls,” considering the relationship and thematic overlap between those two gender-bending renegades of photography and pop art.

That’s a sample of the delights and curiosities I encountered over the past 12 months, but it is by no means comprehensive.  Indeed, with all the temporary exhibits, I haven’t even mentioned the colossal works that these institutions house year-round—the permanent collections that can take a full day to see and a lifetime to appreciate.

For me, going to museums is a meandering search for transcendence—a reminder that the universe existed before I was born and that society extends well beyond the boundaries of my own hometown.  In this way, museums are humbling and inspiring at the same time, showing you that your own greatest accomplishments are peanuts compared to those of centuries past, but also affirming that, when all is said and done, the world is a pretty beautiful place.

Night and Day

If there is one thing I have learned for sure about Hillary Clinton, it’s that she is both better and worse than everyone seems to think.

Worse because of her ongoing paranoia, deceit and iron-fistedness vis-à-vis her quest for the Oval Office.

Better because of her wit, intelligence, compassion and jaw-dropping stamina as they relate to the exact same goal.

In the spring of 2008, I wrote an op-ed for my college newspaper in which I petulantly griped about how Hillary Clinton has a way of getting under your skin even as you find yourself agreeing with most of what she stands for.  How her single-mindedness and love-hate relationship with rules and facts tend to overshadow her finer qualities, even for those who are otherwise prepared to accept her as the standard-bearer for the Democratic Party.

Re-reading that article seven-and-a-half years later, I am somewhat alarmed by how well it holds up.  While my writing has matured (arguably), my hang-ups about a potential President Clinton Part II were pretty much exactly the same then as they are now.  They include:  Her penchant for making up stories when the truth is readily available for all to see; her brazen disregard for the rules whenever they are inconvenient; and her tendency, in any case, to exacerbate the little scandals that pop up whenever she is in power, invariably by blaming the whole thing on her would-be enemies, be they Republicans, foreign governments or a White House intern.

All of those quirks still apply, and must forever be held in consideration when one endorses Clinton for president or any other office.  As ever, a vote for Hillary is a vote for all the baggage that comes with her.  And that’s before we get into the issues that involve actual substance.  As the enduring success of Bernie Sanders demonstrates, there remains a great minority of Democratic primary voters who consider Clinton the wrong candidate at the wrong time and who, should she become the party’s nominee, might even stay home on Election Day rather than pull the lever for her.

Against all of that, however, I come bearing news:  Politics has changed a lot over the last two election cycles and we no longer have the luxury to vote only for candidates we like.  When and if we make it to November 8, 2016, most of us will be faced with two people whom we don’t particularly want to be president, but we’ll need to choose one of them all the same, because that’s how elections work.

I know:  This sounds like a “lesser of two evils” lecture.  It’s not, because presidential campaigns are not a choice between two evils.  Deciding to ally with Stalin against Hitler—that was a choice between two evils.  When we vote for a commander-in-chief, the decision is between not just individuals, but two opposing philosophies of how to run the government of the most important republic in the world.  There’s nothing evil about it, but the choice is stark nonetheless—now more than ever before.

If you think there is no meaningful difference between Republicans and Democrats, you’re not paying close enough attention.  If you’re unwilling to vote for either because their candidates just aren’t perfect enough, you’re a child and a fool.

Last Saturday’s Democratic debate drew only a fraction of the audience of any GOP contest this year.  That’s a real shame, because, if nothing else, it affirmed Bill Maher’s observation in 2008 that to see both parties talk, it’s as if they’re running for president of two completely different countries.

Case in point:  At the most recent Republican forum, you would be forgiven for thinking that 9/11 happened yesterday and that terrorism is the only thing worth caring about when it comes to the welfare of the United States and its citizens.  It was practically the only subject that came up, while such things as the economy, health care, infrastructure and even immigration received little more than a passing shout-out from any of the nine candidates.

The Dems spent plenty of time on terrorism, too—the San Bernardino massacre made it unavoidable—but they placed equal, if not greater, emphasis on subjects that are—let’s be honest—considerably more urgent and germane to all of us at this moment in time.  Along with the issues I just mentioned, these included gun control, race relations, income inequality, college affordability and the fact that America’s prisons are overstuffed with people whose only “crime” was getting high and having a good time.

This isn’t your ordinary, run-of-the-mill disagreement over national priorities.  This is a dramatic, monumental clash over whether the only thing we have to fear is fear itself.  The whole GOP platform has been reduced to, “Be afraid all the time, because you could die at any moment,” while the Democrats act as if tomorrow might actually come and we might as well live and govern accordingly.

Is this the lowest bar we’ve ever set in the history of presidential elections?  Possibly.  Indeed, it’s downright depressing that the very act of governing is no longer seen as a given for anyone in public office.

What is far more depressing, however, is that so many citizens seem to think it doesn’t matter which party is in charge, or that both parties are equally at fault for all of the preventable problems that have occurred throughout the Obama era.  Neither of those assumptions is true, and there are tangible consequences to thinking otherwise.

Care for some examples?  Listen to the GOP’s own rhetoric:  If a Republican is elected president next year, it means the Affordable Care Act is in danger of actual repeal, as is the nuclear agreement with Iran.  It means reversing climate change is no longer a priority, along with the rights of black people, gay people, poor people, women, immigrants, Muslims and refugees.  It means the Supreme Court will net at least one conservative justice, which could easily lead to decisions adversely affecting all of the above and more.  It means our “war” against ISIS will almost certainly escalate to include actual boots in the sand, and God knows what impact that’ll have on our national debt (to the degree that anyone cares).

I realize, of course, that America’s conservatives would be thrilled by such results, but that’s not really who I’m talking to right now.

No, I would mostly just like to remind my fellow leftists that there is a limit to what your disgust with “establishment” Democrats like Hillary Clinton can accomplish.  Clinton is most certainly a flawed candidate, and a flawed messenger for the liberal view of good governance.  She is plainly compromised by her close relationship with the financial industry and remains insufficiently skeptical of large-scale military interventions in the Middle East.  She hasn’t yet mastered the art of damage control and offers little assurance that she won’t create more damage in the future.  A second Clinton presidency would guarantee a fair share of political nonsense from the day she arrives to the day she leaves.

Know what else it would guarantee?  Health insurance for tens of millions of people.  Funding for Planned Parenthood.  Increased protections for the LGBT contingent.  A more liberal Supreme Court.

And it would guarantee our first female commander-in-chief.  Sure, I know we’re supposed to be a meritocratic society that doesn’t care about race, sex, etc., but let’s not pretend that following our First Black President with our First Woman President wouldn’t be unimpeachably gratifying.  We already know beyond doubt that a woman can manage a country at least as well as a man—perhaps you noticed that, for the last 10 years, one such woman has been more or less running all of Europe—but wouldn’t it be great to have it actually happen here?

Of course, none of this matters during the primary phase of the campaign, where we are now.  So long as Democratic voters still have a legitimate choice between Clinton and Bernie Sanders (and, I suppose, Martin O’Malley), they have every obligation to argue about which option makes the most sense for where the party ought to be, and that choice is always a balance between ideological purity and perceived electability.  If you want Sanders as your nominee, you’d best make your case now, before it’s too late.  (I’ve already made mine.)

But should time run out and your preferred candidate lose, realize that our whole electoral system operates on the principle that the party is ultimately more important than any individual within it, which means a great number of people will be forced to compromise some of their deepest-held beliefs in the interest of party unity—because it’s better to support someone with whom you agree 60, 70 or 80 percent of the time than to ensure victory for someone with whom you agree not at all.

If total ideological alignment leads to total electoral defeat, then what good did those principles do you in the first place?  Republicans have been learning this lesson continuously since the moment President Obama was elected.  Are Democrats on the verge of making the same stupid mistake?

Little Miss Perfect

Right now, there are two types of people in America.

  1. People who think Adele is terrific.
  2. Jackasses.

I know, that sounds a bit harsh.  But deep down, y’all know it’s true.

To be sure, you don’t need to love Adele Adkins’ music in order to be accepted into polite society.  You don’t need to listen to it at all, nor should you feel compelled to openly celebrate the mere existence of America’s British sweetheart the way the rest of the country so deliriously has.

All the same, we are experiencing a moment in which this one singer has sucked all the oxygen from the rest of the music industry, and it’s hard to fathom that any of us hasn’t taken a minute to decide what we think about it, and about her.  In our increasingly decentralized culture, Adele’s level of saturation on TV and radio over the past two months is the sort of phenomenon that doesn’t happen terribly often; who could resist weighing in?

If you’re on the fence, I’ll make it easy for you:  She’s great.  Everything about her—the voice, the look, the personality, the marketing strategy, everything.  She’s better than the hype and, frankly, more interesting, too.  Indeed, she is the sort of person who is just about impossible to dislike on the merits, which means that anyone who does dislike her probably has something else going on.

Admittedly, this may be a bit of a straw man situation.  To date, I have yet to locate any specific person who isn’t smitten by Adele to one degree or another (not that I’ve made any effort to look).  As Saturday Night Live so amusingly demonstrated last month, her music has a way of bridging divides between people who otherwise have nothing in common.  In extolling her virtues, I may be preaching to the world’s largest choir.

But that brings us to our main point, which is that Adele is essentially a purple unicorn:  Too good to be true, but true nonetheless.

In general, this sort of thing never happens, because Americans never come together on anything—not music, not politics, not sports, not nothing.  We are a tribal people.  Whatever the issue, we retreat into our corners and duel to the death.  We live to argue with one another, and in the rare instance when a large group of us does coalesce around a common idea, it usually springs out of hatred rather than love:  Hatred of racism, hatred of Donald Trump (or do I repeat myself?), hatred of hurricanes and blizzards, hatred of that jerk who shot Cecil the lion, and so forth.

So we’re good at mutual contempt.  But when was the last time we united in mutual joy?  Who was the last public figure to command universal, sustained approval and respect from virtually the entire American public?  Harder still, who was the last person who deserved it?  J.K. Rowling, for sure; otherwise, my mind is a blank.

I bring this up because I know there are certain folks (including me, sometimes) who get rather annoyed, if not alarmed, when some pop culture figure or other is hoisted onto a pedestal and crowned He (or She) Who Must Be Worshipped By All.  In a free society, it’s a wee bit sinister for our American (or English) idols to be chosen before the voting even starts.

“I’m tired of being told who to admire in this country,” said George Carlin in his 2008 HBO special It’s Bad For Ya, adding, “I’ll choose my own heroes, thank you very much.”

Of course he’s right.  The media has no business deciding which people are worthy of praise and which are not—particularly in the realm of popular music, where no two people share the exact same tastes.  Even professional critics, for all their wisdom and expertise, are limited by their own personal biases; some are decent enough to admit it.

In fact, the real problem lately has been the gulf between fame and actual talent.  Through such would-be icons as the Kardashian family, Justin Bieber and the aforementioned Trump, we are still a culture in which people tend to become famous for all the wrong reasons (or for no reason at all), which inevitably inhibits genuinely worthy individuals from ever breaking through.

As such, we jaded consumers—conditioned to expect this quantity-quality divide—grow suspicious and cynical whenever some new pop wunderkind rises to the top.  Based on past experience, why shouldn’t we?

Adele is the exception that proves the rule.  She is famous for the exact reasons she ought to be.  That she has been so successful is a credit both to her and us.

Here, after all, is a woman who does not bother with social media.  Who has a wicked sense of humor and cusses like a sailor.  Who enjoys her privacy but relishes every moment in public.  Who looks positively regal in concert formal wear, reminding the world that “plus-sized” is not a dirty word after all.  Who indulges in booze and cigarettes but shows no signs of being controlled by either.  Whose voice can alternately break your heart and send a shiver down your spine, and whose poise is unrivaled by any mainstream performer of her generation.

She is the complete package.  Practically perfect in every way.

Some people (I imagine) find this boring.  Some are simply not interested in buying what she’s selling—namely, old-fashioned, weepy ballads.  (It’s a shame her new album has no “Rolling in the Deep”-style anthems.)  And others—as I implied earlier—make a point of hating what everybody else loves for the sake of contrarianism.  Personally, I find that boring, but I suppose the Adele backlash is coming.  It has to, right?  No one can sweep up this much positive attention without being tarred as “overrated” by somebody.

Or can they?

Time will tell, as it always does.  But at this moment—after 750 million views of “Hello” on YouTube, 15 million purchases of the new album, a ratings-crushing concert special, and a forthcoming world tour that sold out in mere minutes—Adele is sitting pretty in the eyes of the public in a way that few, if any, pop stars ever have.  It’s one thing to have millions of admirers, but it’s quite another to have no detractors.  Haters gonna hate, but in this case, apparently not.

Is this an unqualified good thing for our society, having one person to whom we all pledge undying adoration because of the joy she brings to our ears?  Or is it somehow unhealthy—an example of group mentality run amok, in which the greatness of a performer is established as an objective fact, and anyone who is skeptical is pressured to keep his or her mouth shut?

Nah.  It’s the first one.

Hitchcock Goes to Church

I thought I knew everything about Alfred Hitchcock, probably my favorite director of all time.  As it turns out, I didn’t even know what I didn’t know.

Playing in select theaters right now is a crackerjack documentary called Hitchcock/Truffaut, which recounts the time in 1962 when up-and-coming French director François Truffaut conducted an interview with the Master of Suspense that was so long and so deep that the resulting material, published as a book in 1966, runs some 368 pages and covers virtually every frame of every Hitchcock film.

Truffaut’s interview is considered a landmark in the history of cinema, because it marks the moment when Hitchcock began to be taken seriously by his peers.  Before Hitchcock/Truffaut, he was regarded strictly as an entertainer.  After the book was published, he became an artist and a renegade.  Today, he is considered arguably the most influential director who ever lived.

More noteworthy still is how much Hitchcock revealed about himself and his work.  Despite his reputation for being tight-lipped and (it must be said) a bit of a tyrant on the set, in his chat with Truffaut he pretty much gave the game away.

As such, perhaps the most tantalizing moment in the new documentary, which includes audio clips from the original interview, is the moment when Truffaut asks Hitch about the influence of his Catholicism in many of his most compelling works.  Hitchcock’s response:  “Go off-record.”  We hear a click, and everything goes black.

It was David McCullough who mused that you can learn an awful lot about a person from what he chooses not to say in public—particularly when he is perfectly willing to say so much else.  So perhaps if there is a “rosebud” to Hitchcock’s career, it can be found in his Catholic youth.

I must admit, I had no idea Hitchcock was Catholic.  Indeed, I had never given a thought to what religion he identified with, nor did it occur to me that such a thing might be relevant.

For some great directors, religion is inescapable—be it Catholicism for Martin Scorsese or Judaism for Woody Allen or Joel and Ethan Coen.  It’s not that their movies are necessarily about their faith so much as they are informed by the values and sensibilities that their faith espouses.  Taxi Driver could not possibly have been made by a non-Catholic and Annie Hall could not possibly have been made by a non-Jew.

You don’t get that sense with Hitchcock, whose movies are intended as mass entertainment above all else and possess no particular sensibility beyond wanting to give their audience a good old-fashioned thrill.

Or don’t they?

What changed my mind about this—what made me view Hitchcock’s work through a more theological lens—was seeing (for the first time) his 1953 film I Confess.  Based on an old French play, the story involves a priest who learns that a man has committed a murder, but because he hears this in the sanctity of the confessional, he cannot divulge any information to the police in their investigation of said murder.

This being a Hitchcock movie, the priest himself will eventually become implicated in the crime, thereby raising the stakes in his professional and spiritual obligation to “clergy-penitent privilege”—the notion that what happens in the confessional stays in the confessional.  By honoring his theological duty, he risks sacrificing his own freedom.  But by breaking his oath of confidentiality, he may well lose his job and, with it, his whole reason for being.

It’s a devilishly clever conceit—yet another variation on Hitchcock’s long-running theme of a man ensnared in a legal bind from which there is no escape.

More than that, however, I Confess stands as one of the most singularly Catholic movies ever made by a major (and otherwise nondenominational) filmmaker.  The priest is played by Montgomery Clift—that most mysterious and charismatic of Hollywood stars—as a man undergoing a deep internal struggle over whether doing the “right” thing might involve turning his back on God.

It’s a performance of towering complexity—subtle, delicate and wrenching—in a movie that is brave and dignified enough to treat Catholic tradition with the gravity it deserves—in this case, the tradition of the confessional as a sacred space, even when that sanctity might allow a man to get away with murder.  Theological dilemmas don’t get much thornier than that.

It’s a measure of the movie’s nerve that audiences were not crazy about it when it was first released.  As recounted by Truffaut in his book, “[T]he public was irritated with the plot because they kept on hoping that Montgomery Clift would speak up.”  Hitchcock agreed, saying, “We Catholics know that a priest cannot disclose the secret of the confessional, but the Protestants, the atheists, and the agnostics all say, ‘Ridiculous!  No man would remain silent and sacrifice his life for such a thing.’”  When Truffaut asked if this disconnect served to weaken the film as a film, Hitchcock nodded, saying, with remarkable candor, “[W]e shouldn’t have made the picture.”

Here, in other words, was a movie more concerned with spiritual truth than with satisfying popular tastes.  That Hitch himself apparently disapproved of the final product only goes to show how personal the whole thing was, as if it was the one time he indulged whatever remained of his strict Jesuit upbringing, if only to get it out of his system once and for all.

However, even if I Confess is an outlier in the Hitchcock canon, it helps us to recognize the latent Catholic themes that run through virtually all of his great works—most prominently, the sin of guilt.  Janet Leigh’s guilt over stealing $40,000 in Psycho.  Kim Novak’s guilt over masquerading as James Stewart’s dream girl in Vertigo (and Stewart’s guilt in thinking he contributed to her death).  Eva Marie Saint’s guilt over deceiving Cary Grant in North by Northwest.  Farley Granger’s guilt over murdering a classmate for sport in Rope.  And on and on and on.

These are not Catholic movies, per se.  However, they are all haunted by the aura of divine justice and the fear of God’s eternal wrath that only a Catholic could fully appreciate.  While most of Hitchcock’s heroes probably fear the police and/or each other more than the man upstairs (this was certainly the case with the director himself), they are nonetheless aware that their actions have consequences.  That sooner or later, one way or another, they’re going to get what’s coming to them.

And unlike in, say, the films of Woody Allen—a writer-director who has very little faith in God or justice—these sinners generally do pay a price for their crimes, thereby allowing moral order to be restored to the universe just in time for the end credits to roll.

While Catholicism certainly doesn’t have a monopoly on guilt, sin, justice or anything else, Catholic filmmakers have long been uncommonly adept at portraying how the teachings of their ancient holy books manifest themselves in the contemporary world.  They’re the ones who take God seriously, for better and for worse.

I note this, in part, because there is a large cadre of nonbelievers who sincerely think that religion has nothing positive to offer civilization.  Or, at the least, that whatever good might come from religion could just as easily come from secularism and, in any case, is dramatically outweighed by the evil that could not come from anywhere else.

I used to agree with this assessment.  Most of the time, I still do.  But in the process of extricating myself from the world of the faithful, I have come to better appreciate the monumental role of religion in the lives of others.  I don’t think either God or religion is necessary to lead a fulfilling life, but roughly three in four Americans do, and their faith has sometimes inspired them to craft works of art that could not have emerged in any other way.

I can live without God.  I’m not sure I could live without Raging Bull.  I don’t generally resort to prayer to help solve my biggest problems, but I’m pleased that it worked for George Bailey.  Religion does little for me, but in the end that doesn’t matter so long as it does something for everyone else.  And if no religion meant no Alfred Hitchcock—well, I’m not sure that’s a trade-off I’d be prepared to make.

Freedom From Fear

When it comes to terrorism, how did we suddenly become such a nation of scaredy cats?

Sure, each of us has our own private set of fears—things that add unwelcome tension to our day and maybe even keep us up at night.  Some of these are perfectly rational, while others seem to have been invented from whole cloth.

I don’t know about you, but I certainly know a few things that frighten me.  Failure.  Poverty.  Writer’s block.  Cancer.  Bugs.

But you know one thing that doesn’t scare me at all?  Being killed in a terrorist attack.

On any given day, I am far more concerned about a beetle wandering into my bed than a suicide bomber wandering onto my subway car.  Why?  Because I’m a reasonably logical human being who realizes that the former is infinitely more likely than the latter, and I’m not about to waste my time fretting about every last terrible thing that could possibly happen to me.

Could I find myself in some kind of active shooter/bomber/hostage situation?  Sure, why not?  Bad guys exist and somebody has to be their victim.  I lived in New York on September 11, 2001, and in Boston on April 15, 2013, so I’m not entirely naïve about the horrors that Islamic (and non-Islamic) extremists can unleash upon unwitting bystanders.

All the same, there is something to which I am equally attuned:  statistics.

You’ve read the actuarial tables.  All things equal, each of us is roughly 35,000 times more likely to die from heart disease than from a terrorist attack.  Heck, we are 350 times likelier to die from gravity (read:  falling off a roof) and four times likelier to be struck by lightning.  According to at least one study, the average American’s lifetime odds of being killed as the result of terrorism are approximately 1 in 20 million.

On one level, these numbers serve as amusing, if abstract, pieces of trivia.  On a deeper level, they reflect what a colossal waste of time it is to actively fear being caught up in an act of mass violence.  The probability of such a thing is so remote, you might as well get worked up over being eaten by Bigfoot.

And yet, according to a new poll, a record-high number of Americans claim to be more fearful of terrorism now than at any time since September 11, 2001.  Thanks to the atrocities in Paris and San Bernardino—and the increasing reach of ISIS in general—the super-low risk of being the victim of a similar attack now strikes many of us as entirely plausible, if not outright imminent.

It’s not, and it never will be.  Get it together, people.  Don’t be such drama queens.  Keep calm and…well, you know.

Look:  I watch Woody Allen movies.  I understand that if someone is determined to freak out about an imaginary bogeyman, there’s nothing you can do to stop them.  Then there’s the fact that this particular bogeyman is not completely a figment of our collective imagination.  In Syria and Iraq, it’s a lot worse than that.

But realize that, here in America, by being afraid of a hypothetical attack by a gang of faceless, radical Muslims, you are—by definition—letting the terrorists win.

Not to get too cute or cliché, but the object of terrorism is to generate terror.  For the jihadist, committing random mass murder is the means, not the end.  Whenever a follower of ISIS or al Qaeda opens fire in a crowded marketplace or plants a bomb on a city bus, the point isn’t merely to kill a bunch of people; rather, it’s to make everyone else nervous about entering a marketplace or boarding a bus, because, hey, they might be next.

George W. Bush was absolutely right to say that the best way to fight back is to continue going about our lives as if nothing has changed.  In the most fundamental sense, nothing has:  America remains an exceptionally open society in which all citizens can come and go as they please.  Our economy and armed forces continue to be the envy of the world.  The First Amendment is in such strong shape that a private business denying service to gay people is now considered a form of free expression.  And—sorry to be so repetitive—the likelihood of being personally affected by terrorism is all but microscopic.

To be sure, the government does not have the same luxury as individuals to adopt such a blasé attitude toward the global struggle against violent extremism (or whatever you want to call it).  Having the means to actually disrupt organized crime originating in the Middle East, our military and intelligence agencies are obligated to take the ISIS threat seriously, thereby giving us private citizens the freedom to leave our houses every morning with the confidence that we will return in one piece.

But here’s the main point:  There’s absolutely no reason why we shouldn’t adopt this optimistic attitude anyway.  There is much our government can do to keep us safe, but there is just as much that it can’t.  Islamic terrorism—like Christian terrorism—cannot be eliminated completely.  More perpetrators will fall through the cracks and more innocent people will be killed.

But so what?  There’s very little we civilians can contribute to this struggle—other than the whole “see something, say something” initiative, which has produced mixed results—so what good does being terrified do?  Death itself is unavoidable, and death by terrorism is on roughly the same plane of probability as death by asteroid—and nearly as futile to prevent in advance.

What we should do, then, is take a cue from Franklin Roosevelt, who in January 1941 outlined the “four freedoms” to which all inhabitants of the Earth should be entitled.  While he merely plagiarized from the First Amendment for two of them—“freedom of speech” and “freedom of worship”—and paraphrased the Constitution’s preamble for the third—“freedom from want”—the fourth was an invention all his own:  “freedom from fear.”

Whatever such a concept meant at the outset of World War II—a reduction in global arms, mostly—today we can accept it as a right we grant to ourselves:  The freedom to go about our lives as if they were actually controlled by us.

Springtime For Donald (and the GOP)

I don’t know why I didn’t see it before—perhaps it took a Hitler comparison to really hammer the point home—but I’ve found the perfect reference point for the bizarro performance art that is the Trump presidential campaign.  Indeed, it’s so obvious there’s really no way around it.

Donald Trump is The Producers come to life.

Y’all know The Producers.  A 1968 film and a 2001 musical, Mel Brooks’ masterpiece of lunacy is the story of a washed-up Broadway kingpin, Max Bialystock, who schemes to put on the most unwatchable, offensive Broadway musical ever produced—a show guaranteed to close in one night, enabling Bialystock to pocket his investors’ money without ever needing to pay it back.

As an elaborate act of fraud, this teeters on the edge between ingenious and completely nuts.  In any case, it shows real gumption on Bialystock’s part—a level of greed and hunger, at once spectacular and pathetic, of which we can only stand in awe.

You can probably see where I’m going with this.

Whenever any prominent public figure runs for high office, we more or less take it as read that he really means it—that he genuinely (if misguidedly) thinks he could win and is prepared to assume the awesome responsibilities of the office should he succeed.

We do not generally presume, for instance, that a quasi-serious presidential candidate would run for purely mercenary reasons—a drawn-out charade to make an extra few (million) bucks.  True, virtually all candidates tend to release a book upon entering the race—in America, there is always a profit to be made somewhere—but we nonetheless grant them their sincerity.  After all, considering what an epic headache the whole electoral process is, what kind of lunatic would dive in just for the hell of it?

A lunatic named Trump, that’s who.

Look:  None of us can prove that Donald Trump doesn’t take his own candidacy seriously and that his play for the White House is nothing more than a means of feeding his planet-sized ego before he ultimately tiptoes out the back door—say, a few hours prior to the Iowa caucuses.  Nor can we prove that he doesn’t actually give a damn about the wellbeing of the Republican Party or, for that matter, the country as a whole.  Or that he is, in fact, a secret Democratic Party mole who is actively sabotaging the GOP’s chances of ever winning another presidential election.

We don’t know any of these things for sure.  All we can say—and we might as well—is that if Donald Trump were a Democratic double agent sent in to destroy the GOP from within, the resulting blast would look almost exactly like what’s going on right now.

After all, this was supposed to be the year the Republican Party would make nice with various racial and ethnic minority groups.  The year the party’s mythical “big tent” would expand to include enough non-white voters to actually carry a national election in our increasingly non-white society.

This being the case, what better result could the Democrats hope for than a GOP standard-bearer who is so fanatically hostile towards those very folks—Hispanics and Muslims most of all—that he has undertaken a one-man crusade to literally banish them from the country?  A guy who has effectively taken one look at these potential electoral converts and said, “Go screw yourselves.”

It would all make perfect sense if Trump were a fictional character dreamed up in a laboratory at Democratic National Committee headquarters.  Or—more plausibly—if, like Max Bialystock, he were deliberately self-sabotaging as part of a ruse to reap maximum benefits while assuming minimal responsibility—that is, enjoying the perks of running for president without the complications of actually being president.

In any case, Trump is plainly a slow-motion catastrophe for the GOP, which brings us to the most Producers-like component of this whole ridiculous story:  The fact that Trump’s methods have managed to backfire in every conceivable way.  No matter how insane his candidacy becomes, he just can’t seem to lose.

In the Mel Brooks film, of course, the show that Bialystock and his accountant, Leo Bloom, decide to produce is a neo-Nazi valentine to the Third Reich by the name of Springtime for Hitler.  In New York City of all places—an oasis of liberalism, Judaism and highbrow artistic tastes—nothing could be more toxic than an unironic paean to the good old days of the SS and Aryan supremacy.

The punch line, then, is that Bialystock’s audience members—more jaded and sophisticated than he gives them credit for—take Springtime for Hitler as a big, bold farce and laugh themselves halfway into next week.  As a result, the show is a smashing success and Bialystock finds himself on the precipice of financial ruin.

Candidate Trump is certainly a farce in his own right—a galling, topsy-turvy perversion of reality with bottomless comedic potential—except that the foundation of his surprising success is precisely the opposite of Bialystock’s:  Trump is winning because his audience can’t see through the façade.  Even as his whole shtick is essentially an Onion article that’s gotten out of hand, his supporters take him deadly seriously and think his ideas about mass deportation and religious persecution are just swell.  The more outrageous his public statements become, the higher he rises in the polls.

It raises the question:  Is there not a limit to Trumpism, after all?  If his slurs against Mexicans, women, prisoners of war, the disabled and now Muslims have failed to do him in, is there anything that will?  What is left for him to say that could feasibly erode his evidently bulletproof base of support?

The Springtime for Hitler connection is apt:  If you behave vaguely like a fascist dictator and still can’t get your fans to hate you—all the while being explicitly compared to the Führer in the press and apparently not minding it—then the crazy train can no longer be routed back to the station.  It’s going over the bridge and into the ravine, and that’s all there is to it.

Back in July, the actual Onion ran a story titled, “Admit It:  You People Want To See How Far This Goes, Don’t You?”  At that point, Trump was still a novelty item whose popularity, however surprising, was nothing to get too alarmed about, because we knew that somebody in that field would put him in his place.

Now that all of those assurances about Trump’s eventual collapse have proved false—or at least supremely premature—we onlookers have little choice but to peek morbidly through our fingers until this horror show finally plays itself out.

While we can sleep easy knowing that both history and statistics show that a Trump nomination—let alone a Trump presidency—is the longest of long shots, we can plunge ourselves right back into panic and despair over the likelihood that, should Trump manage to shame and disgrace himself all the way to the White House, he, like us, won’t have the slightest idea how he got there.

Christmas Cruelty

The month of December is chock full of Christmas TV specials.  Jewish atheist that I am, I plan on catching just about all of them.

While the sheer volume of holiday programming ensures a great diversity of subject matter, it seems fair to say that if we could only save two of them to carry into the next century and beyond, they would have to be A Charlie Brown Christmas and Rudolph the Red-Nosed Reindeer.  And as singular as those Christmas classics are, they have one big thing in common:  They are two of the most depressing programs ever inflicted upon American families.

Yes, they have happy endings.  (Sort of.)  But the trials their protagonists undergo are not merely challenging:  they are borderline sadistic.  It gets you wondering:  why should Christmas, of all things, be so bloody painful?

Admittedly, in Charlie Brown’s case, the abuse is more or less politics as usual.  A Charlie Brown Christmas begins with the ceremonial missed field goal (courtesy of Lucy) and proceeds with Charlie Brown assuming the role of Christmas play director for no apparent reason except to give all the other kids the chance to ridicule every decision he makes—including, most memorably, his choice of an actual sapling (rather than a shiny aluminum one) to use as the gang’s official Christmas tree.  “Boy, are you stupid, Charlie Brown,” says Violet, telling the group, “He isn’t the kind you can depend on to do anything right.”

We laugh because it’s a cartoon, but we also realize that when this happens in real life it’s called bullying, which has a way of burning emotional scars that can take years to heal.  (If you manage to live long enough, that is.)

Then again, at least the torture that Charlie Brown experiences is strictly at the hands of his peers.  While children can be very cruel indeed, there is a particular and arguably worse trauma that comes from being bullied by grownups.

Enter Rudolph the Red-Nosed Reindeer.

It’s easy to overlook—what with the lighthearted theme song and the charming stop-motion animation—but the early scenes of the 1964 classic include behavior toward the titular character—by adults, mind you—that is jaw-droppingly callous.  When we say “all of the other reindeer used to laugh and call him names,” that includes his own father, Donner, who forces Rudolph to conceal his peculiar proboscis and, when Rudolph objects, barks that “there are more important things than comfort:  self-respect!”  The flying coach, Comet, is the one who incites all the name-calling after the fake nose falls off, and it’s Santa Claus—Santa Claus!—who sees Rudolph’s shiny red bulb and tells Donner, “You should be ashamed of yourself!”

Piled on top of this is a parallel story involving an elf, Hermey, who would rather be a dentist than one of Santa’s slaves, but is told that an elf’s lot in life is to make toys and follow orders.  When Hermey pleads that he just wants to “fit in,” the boss coldly retorts, “You’ll never fit in!”

Jesus, Mary and Joseph, what in the heck is wrong with these people?

Admittedly, this program first aired 51 years ago, back when child abuse was an accepted form of parenting and schoolyard hazing was a fun way to make friends.

Ultimately, Rudolph is a story about empowerment, individualism and acceptance—hence the happy ending, in which Rudolph and company defeat the Abominable Snow Monster and save Christmas from that meddlesome fog—but until we reach that point, it’s essentially a story about being heartlessly exiled from society because of the ignorance of others.  The program’s finest quality is that it pulls no punches, and we cherish it, in part, because we suspect that it could never get made today.

The whole plot screams “allegory!”  Personally, I’ve long seen Rudolph as a metaphor for homosexuality and coming out, and I was delighted to conduct some quick research and find that the rest of the Internet has the exact same theory.  (One blogger noted that “Island of Misfit Toys” would be a fantastic name for a gay bar.)

Indeed, for any closeted young person, it’s nearly impossible to see Rudolph and Hermey rejected for who they are and not be overcome by waves of fear, shame and guilt over the emotional tsunami going on in their own head.  While my own childhood was not nearly as traumatic, that doesn’t make watching Rudolph any less poignant.

I’m sure the show’s creators had none of this in mind in 1964.  The genius of the script is that it can be adopted by anyone who feels like a misfit toy and wishes the rest of the world would cut them a little slack.

If Rudolph has a weakness, it’s how, when the folks at the North Pole finally do accept Rudolph, it’s for the dumbest possible reason:  utility.

Apart from having saved the town from the monster, Rudolph is made a hero because Santa realizes his glowing appendage has an immediate practical function—namely, guiding Santa’s sleigh through the storm—and not because having an odd facial feature is an incredibly stupid reason for banishing someone from his own hometown.  Santa and company welcome Rudolph because they realize they need him—not necessarily because they want him.

That’s a rather ambivalent lesson, to say the least, suggesting one’s personal quirks are fair game for ridicule and condemnation unless other people happen to find a specific use for them.  I am reminded of the title of an old essay by gay rights pioneer Andrew Sullivan, “What Are Homosexuals For?”

In a way, the conclusion to A Charlie Brown Christmas is the more honest of the two.  After Linus’s famous soliloquy quoting from the Gospel of Luke, the whole Peanuts troupe wanders into the snow, steals all the fancy decorations from Snoopy’s doghouse and arranges them on Charlie Brown’s feeble sapling.  They have actually learned something:  Beauty is not always apparent at first glance, but you can always find it if you look closely enough.  “Charlie Brown is a blockhead,” Lucy concedes, “but he did get a nice tree.”

As we well know, that’s about as close as any Peanuts kid gets to genuine human affection, so this counts as an unqualified triumph for good old Charlie Brown:  He stubbornly resists the commercialization of Christmas, and in time, everyone else realizes that he is right.

It’s a warm payoff to a very cold setup, and like Rudolph, it shows how Christmas has a way of bringing out people’s better angels.

But the real test—as both of these great shows understand—is whether this yuletide kindness can survive all the way to December 26.