Summer Reading

Throughout my high school English classes, I made certain never to seek the aid of SparkNotes—those best-selling laminated cheat sheets that explain classic works of literature in language teenagers can understand. 

Even as virtually all my classmates supplemented their reading assignments with this handy dandy resource—often with the encouragement of the teacher—I took it as a point of pride to always go into a book cold, knowing as little as possible about its contents beforehand and plodding through its prose—however impenetrable—fully on my own, in order to achieve a truly organic reading experience.

It was one of the stupidest decisions I’ve ever made.

In point of fact, I succeeded in never relying on SparkNotes to understand classic literature.  The result?  I graduated from high school not understanding classic literature.

I opted to treat the act of reading as an ordeal rather than any kind of pleasure, and—wouldn’t you know it?—this had the effect of retarding my ability to read for pleasure.

Of course, I exaggerate a bit.  The Catcher in the Rye was relatively straightforward even in 11th grade, and John Irving’s A Prayer for Owen Meany was a joy from start to finish.  (Granted, the latter, published in 1989, does not exactly qualify as a “classic.”)

But then there were the likes of Moby-Dick, one of several novels assigned the summer before my senior year, which I struggled all the way through for the sole purpose of being able to say that I had done so.  To this day, I still occasionally make that boast, but I don’t remember a thing about the book itself, and I hardly understood a word of it at the time.  I was interested merely in finishing the damn thing, and cared not one whit as to why it was (or was not) worth opening in the first place.

I make these rather self-pitying recollections now—lo these many years later—to draw a handful of possibly useful life lessons here in the opening week of a new school year.

First, I note what an obviously terrible attitude this was in any case.  Indeed, if you insist on making things needlessly difficult for yourself, whatever the subject at hand, you are all but fated to succeed in doing just that.

Second, I must take exception to the general feeling among most former students that English teachers—or, more specifically, English curricula—have expended every effort to make literature as unpleasant and dull as humanly possible, and are therefore singularly responsible for any and all hostility that young people develop toward the idea of reading for fun.

Yes, it’s true that many such courses reduce even the most irresistible tomes to various mind-numbing “exercises,” be they lists of arcane vocabulary words or long-winded discussions of symbols and metaphors.  And, perhaps more to the point, the roster of selected readings itself is always something of a head-scratcher.  (To wit:  Why, in a world that contains Edith Wharton’s The Age of Innocence, do so many school districts opt to waste everyone’s time with Wharton’s Ethan Frome?)

As such, blaming one’s teachers for one’s aversion to reading would be convenient, cathartic and partially correct.  But I would just as soon blame myself, and I would advise others to adopt a like view.  J.K. Rowling once quipped, “There is an expiry date on blaming your parents for steering you in the wrong direction; the moment you are old enough to take the wheel, responsibility lies with you.”  I dare say the same is true for educators.  Sooner or later, a resistance to reading is not their fault.

Third, and most crucially, is the question of pride.

The truth of the matter is that there are plenty of teenagers who can handle (and enjoy) sophisticated books without outside assistance, but I was simply not one of them.  I was not clever enough in high school to digest Dickens and Fitzgerald sans SparkNotes.  In thinking I was, I deprived myself of the intellectual delights that come with truly exploring and understanding a great work of literature.

I thought that reading a classic novel without any context or cheat sheet was the right and honorable way to go.  I was mistaken.

In the intervening years, I have attempted to rectify this grievous sin by catching up on many of the works I had subconsciously avoided since leaving high school, and I am infinitely glad to have (finally) done so. 

The difference between then and now—that is, the secret to genuinely enjoying the sorts of books that are forced upon you when you are young—is that I have succumbed to the temptation to cheat.  I have pocketed my pride, having found so little use for it in the first place.

Earlier this summer, for instance, I finally encountered Hamlet.  I’d somehow never read it before, and decided that enough was enough.  In preparation, I went to Wikipedia and learned the entire plot, from start to finish, as well as all the major themes and characterizations. 

As such, I was able to properly digest and appreciate Shakespeare’s most complex play in one go.  This appreciation came through basic comprehension, and that in turn came through knowing what the heck was going on.  Even today, I find Shakespearean prose rather daunting.  The key was admitting this to myself in advance, and not trying to be a hero by plowing through it on my own.  By acting otherwise for so long, what was I trying to prove, anyway?

They say pride is the deadliest of the deadly sins.  I’m sure there are some practical uses for it, but destroying the experience of reading a great book is not one of them.  I just wish it had taken me less than a decade to figure that out.

Challenge Accepted

The ice bucket will not kick the bucket.

It first took social media by storm sometime in mid-July, and now here in the final week of August, it is still going strong.  It has been called the “Harlem Shake” of 2014, although even that meme did not boast quite as much staying power as this one.

What’s the secret?  That’s an easy one.  It’s the way that it seamlessly combines two of Americans’ favorite pastimes in the Internet age:  Doing good deeds, and patting ourselves on the back for doing them.

I speak, of course, of the “Ice Bucket Challenge.”  At this point, I dare say I needn’t explain what the challenge is about, since anyone with access to this blog presumably has access to the rest of the Internet as well and therefore knows perfectly well what the challenge is about.

Then again, perhaps not.  One critique that has popped up is how, for all the celebrities who have lent their fame and dignity to the act of filming themselves being doused with ice-cold water, hardly anyone has bothered to mention the real point of the exercise, which is to raise money for the ALS Association (or, in theory, any other charitable organization).  The world has gotten a kick out of the increasingly ingenious ways people have found to transfer a pool of liquid from a bucket to their own face, but one could be forgiven for arriving late to the party and assuming it’s all being done just for the fun of it.

Probably the best rebuttal to this criticism is the fact that the ALS Association has received nearly $90 million in donations over the past four weeks, which is more than 30 times what it raised during the same period in 2013.  If the concern is that people are not aware of the connection between the Ice Bucket Challenge and ALS—or, more to the point, that they are not any more aware of ALS itself—the numbers suggest otherwise.

Is it possible that even more money would have been raised—and still could be raised—if participants in the challenge made their objective clearer?  It’s certainly conceivable, but it would be awfully hard to prove either way.

In my view, the far more salient point is as follows:  Less than two months ago, amyotrophic lateral sclerosis was a mysterious, deadly and horrifying disease known to most Americans as having taken the life of Lou Gehrig, but was otherwise rarely on the mind of anyone not faced with it directly.

Today, ALS is a mysterious, deadly and horrifying disease that people are looking up on Wikipedia and giving money to fight at unprecedented rates, thereby generating the funds and publicity required to make it a higher priority within the medical community than it might otherwise have been.

This happened for precisely one reason:  Because a whole bunch of people stood in front of a camera and dumped a bucket of ice water onto their heads.  It made no particular sense that one thing would lead to the other, but there you have it.

In other words, if we insist on clinging to hypotheticals, then the most pertinent one is the alternate universe in which the Ice Bucket Challenge did not exist.  That is, the scenario in which there hadn’t been some fun, goofy gimmick to force people—if only fleetingly—to think about a terrible disease that demands our attention, and to give lots of money to help get rid of it.

Ice Bucket critics implore people to donate to ALS research straight-up, without all the bells and whistles.  (“Do not film yourself or post anything on social media,” writes Will Oremus of Slate. “Just donate the damn money.”)  But this ignores the fact that these same critics wouldn’t have even thought to make this suggestion if not for the viral and self-serving nature of the campaign they supposedly detest.

As they say, don’t make the perfect the enemy of the good.  Not everything requires your curmudgeonly, self-righteous disapproval.  Sometimes a good cause is just a good cause.  The Ice Bucket Challenge, for all its silliness, is a good cause.

What is more, the magnitude of its goodness owes almost entirely to the magnitude of its silliness.  It would be nice if people could be made to regularly donate to charity en masse for its own sake, but in reality, coaxing people to part with their hard-earned cash is a bit like getting a toddler to eat his vegetables:  It has to be turned into a game.

This fact, insomuch as it is a fact, leads us to a far more interesting question:  Should we consider it a vice or a virtue that so many Americans—particularly Millennials—are willing to give to charity, but only in the most narcissistic possible manner?  That nothing motivates us to do good as much as the opportunity to advertise what wonderful, caring people we are?

Nobody seems to argue that ego-stroking is honorable, but if it can be harnessed to effect honorable ends, what cause have we to complain?  If giving money to a worthy cause is inherently moral, does the motivation behind it really matter?  I dare say the charitable organizations that receive this money are not terribly picky on this point.  (Not that they have much of a choice.)

It is a singular irony that the most respectable form of charity—giving anonymously—is also the one that, by definition, cannot be recognized.  Indeed, that is what makes it so commendable in the first place.

But modesty and anonymity don’t quite work on social networks, which generally encourage users to be as loud and obnoxious as possible, and a central lesson of the Ice Bucket Challenge is that if you really want to get something done, social networks are the place to do it.

In any case, the key to amassing a sizable hill of funds has always been to get the word out.  To send e-mails and letters.  To be annoyingly persistent.  To not keep quiet and assume everyone will give out of the goodness of their hearts, unprompted.

You can afford to shun publicity if you happen to be a billionaire whose donation could single-handedly fund the treatment of a dozen ALS sufferers or more.  But for us mere mortals, silence is deadly.  It’s not enough for you to give—you have to make everyone else give, too.

And so I say:  May the ice buckets never stop.

The Happiness Factor

The federal government might be spending less money on anti-tobacco campaigns in the near future.

Why is that?

Because, according to the FDA, smoking is just too much fun.

As reported recently in the New York Times, the Food and Drug Administration released a study in April regarding the regulation of tobacco products in the United States, and it included the claim that the total economic gain from reduced tobacco use must be cut by 70 percent to account for (in the Times’ words) the “loss in pleasure that smokers suffer when they give up their habit.”

I will repeat that.

For every unit of benefit that comes from the effort to induce people not to smoke—lower rates of cancer and heart disease, less crowded hospitals, whiter teeth, fresher breath—seven-tenths should be shaved off, due to the reduction in overall happiness that quitting smoking effects in smokers.

The objective of this FDA report—84 pages in length, including a 43-word title—is to present a cost-benefit analysis of federal anti-smoking policies in order to determine whether such endeavors are worth the trouble. This is something the government is required to do for any set of proposals that costs more than $100 million.

With this newfound “happiness quotient,” the FDA has set a fairly high bar for what might constitute “worthwhile.” As you can imagine, anti-smoking activists—and many economists—are slightly less than pleased.

The main objection of these critics is not the existence of this lost pleasure metric, as such, but rather that it should be so gosh dern high. It would be one thing if the drawbacks of not smoking accounted for, say, 10 percent of all relevant cost-benefit calculations. But 70 percent? Surely pumping several thousand toxins into one’s lungs is not as enjoyable as all that.

But then I am hardly one to say, as the entirety of my own tobacco-inhaling experience consists of a single pack of vanilla-flavored Djarum cloves, consumed in the course of a single month in the summer of 2010, and then never again. While I can affirm that those evenings were abundantly satisfying while they lasted—the balmy weather and bottles of Corona Extra probably helped—I cannot say what might have resulted from, say, smoking a few hundred packs more, exhausting my entire savings account in the process.

Except I hardly need to wonder, because I am a regular viewer of The Late Late Show with Craig Ferguson, whose host is both an ex-smoker and a recovering alcoholic, and has explained on multiple occasions that while he stopped drinking because it was increasingly impairing his ability to function, his decision to stop smoking was more intellectual, and thus more irritating. He knew cigarettes would eventually kill him if he didn’t knock it off, but it was one heck of a lifestyle adjustment, and rarely a cheerful one.

As to precisely what tobacco’s benefits are, Christopher Hitchens was characteristically succinct:

If you aren’t hungry, it will give you an appetite. If you are hungry and there isn’t any food in the immediate future, you can dull your hunger by smoking. It wakes you up if you’re tired. It makes you sleepy if you’re not tired. It’s the perfect self-administered micro drug. It’s the little glowing friend that never lets you down.

(In saying this, Hitchens caustically added, “Come to think of it, I can’t think why I gave the shit up.”)

What I admire about the inclusion of a “happiness quotient” in this FDA paper—however mathematically problematic it might be—is how it admits that there are reasons we consume toxic substances in the first place. That in deciding to indulge, we are making a cost-benefit calculation of our own. That we are knowingly running a risk that, for a while at least, yields real rewards—even if they are ultimately outmatched by heavy, and often fatal, consequences. (Hitchens, for one, was brought down by a sudden and agonizing case of esophageal cancer not long after making the above observations.)

Elementary and high school health classes certainly make little room for this kind of nuance. From all the usual propaganda in textbooks and PSAs, you’d think tobacco and other drugs serve no purpose except as expensive forms of passive suicide.

Certainly, much of the time they do exactly that. At this point in mankind’s scientific evolution, only a deluded fool would say they do not.

However, to say that they do only that—that recreational drugs are an abject waste of one’s life and are to be avoided in all circumstances—is to commit a sin of omission and to insult the intelligence of even the most mildly clever person.

I am reminded of a recent Onion headline, “Study Links Meat, Sugar Consumption To Early Death Among Those Who Choose To Be Happy In Life.” Or, as Bill Maher phrased it, “Sometimes fun costs ya.”

The obvious rebuttal to this—that is, apart from the extremely obvious one—is that because substances like tobacco are inherently (and deliberately) addictive, the choice to use them is not really a choice at all, particularly among teenagers. As such, any supposed pleasure one derives from them is, to some degree, illusory. (To wit: Have you ever, in a fit of ecstasy, scarfed six or seven extra servings of chocolate lava cake and not felt absolutely awful the next morning?)

Even so, should the FDA care whether this is true, let alone assume that it always is? Is it finally the government’s role to calibrate itself to how people ought to behave, or to how they actually do?

If the right to the pursuit of happiness is truly a founding American value, what business does the government have to limit our capacity to do what makes us happy, provided that it is otherwise legal? Or does this pursuit only encompass what the government thinks is good for us, while everything else is fair game for restrictions?

In the opening line of our Constitution, the government is tasked with “promot[ing] the general welfare.”  To be sure, maintaining a generally healthy populace is a component of such a commission. But isn’t ensuring that we, the people, are personally satisfied another? And isn’t it up to us, not it, to decide what personal satisfaction means? And to the extent that such a thing is objective, aren’t we entitled to be wrong?

Now there’s a happy thought.

Teachable Tragedy

On the American home front, there were two big events last week.

First, a 63-year-old man killed himself for no good reason.  And second, a police officer killed an 18-year-old kid for no good reason.

The former is newsworthy because the man was beloved actor and comedian Robin Williams.  The latter is newsworthy because the officer was white and the kid was black (and unarmed), and because of the subsequent uproar in the town of Ferguson, Missouri, where the killing occurred.

If the official narratives are to be believed, both deaths came about through mental illness.  Williams was a victim of depression, while the kid, Michael Brown, was a victim of racism.

In fact, we don’t know for sure whether either of those assertions is true.  Williams apparently did not leave a suicide note, and there are crucial details about the shooting of Brown of which we remain ignorant.

But that’s not the point.  These two incidents were tragedies—both incalculably unjust and unnecessary and preventable—and we, the human race, have made it our duty to make sense of them, regardless of the facts.  To explain things that are inexplicable.  To transform a tragedy into a “teachable moment.”  To shape individual deaths into symbols of broader crises in our society, in order that we might prevent such misery in the future.

There is scarcely anything wrong with this impulse, as such.  While it would be nice for us—particularly our representatives in Congress—to address all the injustices in the world all the time without any prodding, certain practical considerations prevent it.  There just aren’t enough hours in the day.

Accordingly, we often depend on specific, isolated moments to remind us of the issues that especially deserve our attention, and which had perhaps been neglected up until then.  Hence the emphasis on gun control legislation following a school shooting (or three), or on climate change policy in the aftermath of a destructive hurricane.

So it is understandable that the suicide of an admired celebrity with a history of depression and drug abuse would lead to an outpouring of public interest in suicide, depression and drug abuse.  They are real and serious problems—as are the stigmas attached to them—and if it takes the loss of Robin Williams to examine them closely, so be it.

But the situation in Ferguson is exceptional, owing to the sheer number of “national conversations” that have arisen in its wake, some of which do not necessarily have much to do with one another.

There is, for starters, the question of whether outfitting local police forces with military-style tanks and weapons might carry unintended consequences.  And whether dispersing non-violent protesters with tear gas and rubber bullets ultimately does more harm than good—both in terms of maintaining order and establishing trust.

As well, there are the matters of suppressing freedom of the press and the right to peaceably assemble that have come into question amidst the public response to the Brown shooting, along with the media’s tendency to perpetuate clichés and prejudices as to who the “heroes” and “villains” are, long before all the facts are known.

But this is all mere window dressing around the central concern of black and white. 

First is the assumption that Michael Brown is dead because he was black and the officer who shot him, Darren Wilson, is white—in other words, that racism itself, be it latent or blatant, is the primary culprit. 

Second, that the near-uniform whiteness of the Ferguson police force in a town that is two-thirds black is at least partly to blame for all the mayhem that has occurred there in the past week and a half.

Third is the long history of racial tensions in the greater St. Louis vicinity, illustrated and exacerbated by the way that black people there tend to be overrepresented in number but underrepresented in power—a fact partly, but by no means entirely, explained by politics.

This is but a partial list of the topics that have suddenly sprung to the nation’s lips, and they are all due to a single incident that—at the risk of repeating myself—we know practically nothing about.

This year marks the centenary of World War I, whose very existence still baffles us 100 years hence.  To this day, much of the world is still trying to fathom how a single, seemingly random incident—namely, the assassination of the Archduke of Austria by a 19-year-old Serb—could possibly throw every great empire on Earth into conflict.  How could so much come from so little?

In light of the events in Ferguson, I am beginning better to understand.

The answer, in both cases, is that the commencement of hostilities did not, in fact, come from nowhere.  Rather, such tensions had been simmering, lying in wait for many years, until some triggering event forced them to the surface, allowing the aggrieved parties to have it out once and for all.

This at least explains the readiness of virtually every person on Twitter to attribute the Brown shooting itself to racial prejudice.

In point of fact, we do not know what was inside Darren Wilson’s head when he decided that shooting an unarmed 18-year-old six times was a good idea, just as we do not know what was inside Robin Williams’.  Wilson hasn’t yet appeared in public, and thus hasn’t uttered a word in his own defense.  We have been provided several eyewitness accounts, and they do not agree on all points.

The shooting of Brown might well have been motivated by racism in one form or another; perhaps one day we will know for sure, although we shouldn’t hold our breath.

The broader point, though, is how convenient it would be for our national narrative about race relations if it were.  If Darren Wilson considered Michael Brown threatening purely (or even partly) because he was black, it would confirm all our suspicions about racial bias in our police forces.  And if Wilson is ultimately exonerated, it would confirm similar biases in our justice system.

It’s not as if we require any such confirmation at this point in the game.  As no less than Senator Rand Paul put it, “Anyone who thinks that race does not still, even if inadvertently, skew the application of criminal justice in this country is just not paying close enough attention.”  The statistics speak for themselves.

Nonetheless, it would greatly serve the purpose of noticing and ultimately rectifying the problem of racial prejudice in America if the shooting of Michael Brown could, indeed, be categorized as just such an incident.  What is more, it would save us the discomfort of considering that the shooting had no basis at all.  That it was a senseless act from which nothing meaningful can be learned.

No, it is much better always to have a moral to the story.  To not let the facts get in the way of the truth.

The Unhappy Anarchist

I had never seen Dead Poets Society before, so I figured the death of Robin Williams was as good an occasion as any to catch up. Better 25 years late than never, as they say.

As most people already know, the movie is about an English teacher at a prestigious and extremely conservative prep school who causes all hell to break loose by introducing such heretical concepts as thinking for oneself and pursuing one’s own happiness, which he does through such exercises as reciting poetry and standing on his desk.

I must say I was slightly underwhelmed by the film, owing largely to the fact that every adult character other than the teacher is a one-dimensional scoundrel, from the headmaster who extols “tradition” at all costs to the father who thunderously forbids his son from pursuing his dream of being an actor. On the question of whether unbridled individuality is a sin or a virtue, you might say Dead Poets Society stacks the deck.

Nonetheless, it is quite easy to understand why John Keating, the teacher, is among Williams’ most beloved movie creations, and why the principles he espouses are still so widely quoted today.

In an imperfect movie, Keating is perhaps the quintessential Williams character, insomuch as he reflects the credo by which Williams himself conducted his public life. In keeping with the film’s signature proverb, he was a man who, in every conceivable manner, seized the day.

Well, that’s a bit of an understatement. He didn’t seize the day so much as grab it by the scruff of the throat and throttle it to within an inch of its life.

Robin Williams was a comedic anarchist, and oftentimes the world didn’t quite know what to do with him. That he was so widely admired all through his career is a credit first to his singular abilities as a performer, and second to the sensibilities of his audience.

The secret to Williams’ appeal is the same as that of Groucho Marx, Mel Brooks, George Carlin and Zach Galifianakis. It’s the ability and the willingness to purposefully break the rules, and to not be afraid of authority figures who might stand in your way. To violate every taboo in the book, if only for its own sake, knowing that no joke is funnier than the one that is not supposed to be told.

All comedy is subversive, but Williams’ comedy had the added virtue of being utterly uninhibited. Once his mind started churning and his lips started flapping, there was no way to stop him. He was in his own world. A natural force.

He became most widely known through his movie career, but stand-up was always and forever his natural habitat. It was the place where he could let loose with absolutely no restraints. On an empty stage, a comic has no particular limits on time, subject matter or taste. As much as anyone can, he can say and do whatever the hell he wants. In Williams’ case, the results were often sublime.

In his movies, not so much.

For all the joy his best comedic film performances brought—Mrs. Doubtfire, The Birdcage and Aladdin must be included on any such list—there was ultimately something incompatible between Williams’ act and the film medium itself.

With exceptions (I’ll come to the biggest one in a moment), Williams’ singular wit did not explode into full metal funny on screen the way it did on stage.

The most succinct explanation for this, as I suggested at the top, is that few writers and directors were able to keep up with him. He was far cleverer than they were, and in practice this meant one of two things. First, that he was saddled with mediocre scripts that he was forced to plod his way through; or second, that he was given free rein to improvise and do his own thing, often resulting in an implausible or disjointed narrative. (To wit: How terribly convenient that he always managed to play someone who had a gift for impersonation, regardless of whether it had anything to do with the plot.)

Movies are ultimately about story and character, and no-holds-barred stand-up comedy does not naturally lend itself to either. Williams was the most enjoyable when he was totally unrestrained, and yet movies, by their nature, require restraint at least some of the time. (Even a handful of Marx Brothers movies were polluted by irrelevant romantic subplots.)

The one time Williams managed to square the circle—that is to say, the one time his talents were put on full display without being compromised—was in Good Morning, Vietnam. Directed by Barry Levinson in 1987, the movie is about an American radio DJ in Saigon who dares to introduce irreverence and rock ‘n’ roll onto military airwaves. Naturally, this leads to his being regularly harangued by his superiors, who would love nothing more than to yank him off the air, except that he is just too bloody popular.

If that sounds like the perfect Robin Williams role, that’s because it was. It gave him carte blanche in his choice of stand-up material—a radio show is pretty darned close to an empty stage—and it provided the authority figures for him to push back against.

However, Good Morning, Vietnam went even deeper than that, by following Williams’ character, Adrian Cronauer, beyond the radio booth and into the war itself, suggesting in the process that, for all his confidence and bluster on the air, he is actually a far sadder and more compassionate person than he would ever wish to let on. In a key scene deep into the film, he finds himself yukking it up with a group of soldiers, addressing them one-by-one as if they’re guests on his program, and we realize that his mighty grin is a mask. That the tears welling in his eyes are not necessarily tears of joy.

Cronauer was, in the end, probably the closest Williams ever came to playing himself.  It’s a tragedy that this should be so, but it sure was fun while it lasted.

A Performance From Beyond the Grave

In the six months since Philip Seymour Hoffman died, I have insisted to myself and others that he will never truly be “gone.”  Like a band that has broken up or a novelist from a bygone era, the Finest Actor of His Generation deeded us a body of work that will allow him to continue to entertain us for as long as we possess the means (and the interest) to indulge.

Sure, Hoffman’s hideously untimely demise at age 46 meant that his movie oeuvre had reached an abrupt endpoint, but what a collection of performances he gave!  From his pathetic fanboy in Boogie Nights to his appealing but possibly pedophilic priest in Doubt, from his scruffy, boisterous rock critic in Almost Famous to Truman Capote himself, Hoffman appeared not only capable of doing it all, but seemed, in his 23 years on screen, to have actually done so.

Another 23 years of Hoffman, had they existed, would have yielded countless more excellent roles, but probably not anything we hadn’t seen before, broadly speaking.  At this point in his career, I figured, he had retained his power to impress, but had all but exhausted his power to surprise.

Then I saw Hoffman’s performance in A Most Wanted Man, and all of that thinking went out the window.  I am now somehow compelled to mourn all over again.

This movie, directed by Anton Corbijn from a novel by John le Carré, was filmed in the fall of 2012, but released only last month.  It features Hoffman in its leading role, essayed when he was very much alive and kicking, and thereby has the distinction—much like The Dark Knight in 2008—of showcasing a virtuoso performer to an audience that cannot help but view him in the past tense.

And like Heath Ledger, whose mad, manic Joker revealed an exciting, promising and altogether unexpected side of an actor everyone in the audience knew was dead, Hoffman in A Most Wanted Man carries the painful irony of introducing a whole new depth of talent that we will never get the chance to see explored any further.

What exactly am I referring to, you ask?  What is it about his final major film appearance that so differentiates it from all that came before?

A German accent, as it turns out.

In A Most Wanted Man, Hoffman is Günther Bachmann, a German intelligence agent based in Hamburg.  He had been responsible for a major intelligence failure in the past and is now attempting to redeem himself in the present by heading off a terrorist attack in the future.  (Hamburg had played host to several key planners of the 9/11 attacks.  The film takes place shortly thereafter.)

It’s not that anyone doubted Hoffman could credibly play a spy.  Indeed, he did exactly that in 2007’s Charlie Wilson’s War, for which he received an Academy Award nomination.  Nor should anyone be taken aback by his exceptional capacity to brood, arresting our attention with little more than his mere presence and a few puffs from a cigarette.

Nope, the revelation is in the accent.  Hoffman plays a German man speaking English, and unlike virtually every other American actor to ever attempt such a stunt—including two other actors in this movie, I might add—he makes you believe he is, in fact, a native-born German.  Those who are seeing Hoffman for the first time will have no reason to assume he is actually American, just as when I first saw Titanic, I had no idea that Kate Winslet is actually British.

Obviously, that’s not all there is to the performance, nor is Hoffman all there is to the film, which is engaging and politically astute even when Hoffman is nowhere to be found.

But it’s worth underlining all the same, because Hoffman in his movies—unlike, say, Meryl Streep in hers—was not known for speaking any way except as he actually did (Capote was an exception).  That he could pass so persuasively as a European, while unsurprising in retrospect, was not something to which we had been subjected while he was alive.  Now that he’s dead, we will forever be tormented by the gazillion additional turns his career might have taken.  The infinite possibilities.  The prospect that he was an even better actor than we thought.

Of course, Hoffman is not the first great actor to shuffle off at a point when, by all outward appearances, he had plenty of life still in him.  Indeed, it was in the middle of writing the previous paragraph that I learned that Robin Williams, one of the great comic chameleons of the age, had gone off to the big genie retirement home in the sky at the frightfully young age of 63.  Who’s to say he didn’t have a secret second (or third) act in his back pocket that would have blindsided us all?

In Hoffman’s case, the loss is felt with particular intensity due, in large part, to the intensity of the man himself.  And to the paradoxical notion that, for all he had accomplished as an actor—an output so vast in both size and scope for someone only in his mid-40s—he was really just getting started.

Petty Crimes and Misdemeanors

Now that most of America’s grown-ups seem to have realized that impeaching President Obama would be an exceedingly stupid idea, we can more clearly reflect on the 40th anniversary of when hounding the commander-in-chief from office made absolutely perfect sense.

It was indeed on August 9, 1974 that President Richard Nixon ever-so-reluctantly bade farewell to the American public, following some two-plus years of high-level shenanigans all piled under the heading of “Watergate.”

The whole saga, from the break-in to the resignation, has been rehashed so many times in the last four decades—in books, films, TV programs, newspaper articles and the ever-expanding collection of Oval Office tape recordings—that it has become increasingly difficult to wrench any new or interesting insights from one of the more embarrassing episodes in U.S. politics.  We have acquired new facts, but no new truths.

But of course we continue yapping about it all the same, the Nixon era remaining the most potent of narcotics for political junkies—perhaps because it contains so much junk.

Watergate deeded the baby boom generation a whole dictionary of political clichés—uttered today without a smidgen of hesitation—and the event itself has become a cliché.  Having nothing fresh to teach us, but apparently incapable of dislodging itself from the country’s collective subconscious, the drama that crippled and ultimately destroyed the Nixon presidency and forever poisoned the public’s relationship with its leaders has, rather amazingly, evolved into a nagging bore.

The dirty little secret—the fact that our wall-to-wall nostalgia-fests tend to obscure—is that Watergate was not the worst crime ever committed by an American president.  Not by a long shot.  Alongside other executive malfeasance down the years, Watergate might not have been nothing, but it was a fairly minor transgression, all things considered.  It is not worth the extraordinary attention it still garners, and the numbing effects of constantly reliving it do not make matters any better.

I’m not just talking about the break-in itself—namely, the bungled attempt by Team Nixon to get a leg up on the Democrats in anticipation of the 1972 election.  (Against George McGovern, Nixon won the contest by a score of 49 states to one.)  Most people agree that, while sleazy and illegal, the burglary was a silly little farce that hardly threatened the integrity of the republic or constituted a grave breach of White House power.

Considered in today’s environment, where everyone is secretly recording everyone else and everybody knows it, the Watergate scheme seems positively quaint.

Indeed, in the usual narrative, the whole point of the adage, “It’s not the crime; it’s the cover-up,” is that the Nixon administration’s single-minded obsession with suppressing any and all incriminating evidence about the burglary was, itself, the raison d’être for punishing Nixon in the first place.  Had Nixon simply allowed the investigation to take its course, some heads would still have rolled, but Nixon’s would not have, and the country would have moved on.

The real offense, as it were, was thinking that because he was president, he could control the dissemination of facts and avoid being held to account.

It was the principle of the thing, as high school principals like to say.  It’s not that President Nixon and his underlings did anything major.  Rather, it’s that they went to such elaborate lengths to evade responsibility for something minor.  In other words, they demonstrated that they were inherently untrustworthy.

In this way, we could establish that, in practice, there are two forms of presidential crimes:  actual crimes and suggestive crimes.

The former are those that directly and plainly harm the republic.  Historically, these would include the Harding administration’s exchange of no-bid contracts for bribes with oil companies, or the Reagan administration’s exchange of weapons for money and hostages with Iran.

The latter, meanwhile, are the indiscretions that are not inherently destructive, but which indicate that far worse shenanigans are on the way.  Or at least that they bloody well might be, and you’d be well-advised to prevent them while you still can.

To wit:  When President Bill Clinton was found to have committed perjury regarding whether Monica Lewinsky was more than a mere pizza delivery girl, no serious person asserted that an affair between the president and an intern was, itself, a cause for serious concern as to the well-being of the United States.

No, the refrain was always something along the lines of, “If Clinton will lie under oath about an affair, then what won’t he lie about?”  Clinton’s brush with impeachment was, in effect, an indictment of his character more than his actions.

The question with Clinton—and also with Nixon—is this:  If the act itself is not an impeachable offense, then why is lying about the act any worse?  We might agree that dishonesty is inherently bad—and that perjury is inherently very bad, indeed—but let us not suggest for a moment that all lies are created equal, or that all abuses of executive power are equally harmful to the country or the office.

Does this mean Richard Nixon should not have been subject to articles of impeachment?  Not in the least.  The bases for impeachment are deliberately broad, and Nixon’s actions regarding Watergate all but demanded the three charges he faced—namely, “obstruction of justice,” “abuse of power” and “contempt of Congress.”  We can argue about whether the rules are just, but Nixon most certainly broke them.

What I would argue, however, is that the Watergate affair is far overrated in our collective consciousness of the last half-century of American history.  As with the Kennedy assassination and the September 11 attacks, we have come to regard the investigation and its findings as a “loss of national innocence,” whatever that means.

What Watergate really did was confirm a few things that we already knew but apparently were not prepared to admit out loud.

Power corrupts.  Richard Nixon was a paranoid scoundrel who surrounded himself with other paranoid scoundrels.  Ambitious men, once in power, will go to extraordinary lengths to stay in power.  Follow the money.

Were any of these things actually revelations in 1974, or were they merely the end of a happy self-delusion on the part of the entire country?  Albeit with four decades for us to think it over, the answer today would seem to be self-evident.

More to the point, so long as the darker side of government is, and has always been, a simple fact of life, what exactly was so tragic and violent about being made aware of it once and for all?  Isn’t it in our best interests to know what our elected officials are up to, rather than remaining ignorant and assuming everything will turn out fine?

Indeed, Watergate may well have been one of the best things ever to happen to us.