A Boy’s Life

Every great movie is a little different each time you watch it.  If there is any clear divide between good cinema and bad cinema, it’s that the former contains depth and subtlety that the latter lacks—much of which remains hidden until you’ve digested it many times over.

Watching Boyhood for, let’s say, the fifth time was, for me, distinguishable from the fourth for a very particular reason:  I was, for the first time, viewing it in the presence of an actual boy.

Over the weekend, Richard Linklater’s 2014 film debuted on cable TV, right around the time a big family get-together of ours was winding down.  A handful of us tuned in, many for the first time.  Among these was my 13-year-old cousin, who was skeptical about why a director would take 12 years to make a single movie, let alone why anyone would watch it—especially with its nearly three-hour running time.  (No doubt many grown-ups feel this way, too.)

Then the movie got underway, and he grew mildly engaged—not least by the friction between Mason, the protagonist, and his slightly older sister, Samantha.  (He has a sister, too.)

As Boyhood approached its halfway point and Mason’s age aligned with my cousin’s—leading to such vignettes as getting verbally accosted by schoolyard bullies and discovering the wonders and mysteries of women—he sat up in his chair and remarked, “That’s exactly how it is!”

I don’t think Linklater could’ve asked for higher praise than that.

If Boyhood is about anything, it is all the little joys and horrors of being a kid from first to twelfth grade in today’s America, particularly if you’re a guy.  Naturally, this makes those within that age range the movie’s target audience—or, at the very least, the people who can best judge whether what it portrays rings true.

As a 20-something American male, I lived my own adolescence only a few years removed from the movie’s time frame.  And so I have felt reasonably confident, up until now, that Boyhood is as accurate and insightful as most people say, hence its magnetic effect on my psyche.

But memory is unreliable, and there are innumerable films about childhood that reflect their directors’ assumptions about the experience that, in fact, are either romanticized or traumatized beyond any sense of realism.  (Sometimes this is deliberate, but often not.)

Now I know, much more confidently than before, that Linklater nailed it.  I know because someone with much more authority on the subject than I has said so.

That means a lot to me, because it helps to calm one of my greatest cinematic fears:  That my deepest and most memorable film-going experiences somehow weren’t real.  That they were manipulated, mistaken or emotionally fraudulent.  That it was all in my head.

This fear is especially acute when it involves movies that are universally acclaimed, heaped with critical praise bordering on hysterical.  Boyhood is a classic example, with Manohla Dargis in the New York Times calling it “profound” and a “masterpiece”—terms used very sparingly in the paper of record—and the Times’ other film critic, A.O. Scott, writing, “In my 15 years of professional movie reviewing, I can’t think of any film that has affected me the way Boyhood did.”

I know how he feels, but it can be dangerous to employ such gushing appraisals about works of art, since they inevitably raise viewers’ expectations to impossible levels, leading to an equally inevitable backlash featuring contrarian critiques and the barking of words like “overrated.”  While there hasn’t yet been a ton of this with Boyhood, there has been just enough to make me nervous.

To be clear:  I don’t fear opposing views about my favorite movies.  I only fear persuasive ones.  I fear that someone will point out some fundamental flaw that I hadn’t noticed before and that I won’t be able to shake it when I see the movie again.

In general, I know better than to read things that will do nothing but upset me, and I am endlessly thankful that the overwhelming majority of Internet-based analysis is complete rubbish and not worth anyone’s time.

But then there are folks like Ross Douthat, who recently posted a New York Times blog entry titled, “The Trouble With ‘Boyhood.’”  Although Douthat is best-known as a Times op-ed columnist, he also reviews movies for National Review and, more to the point, is one of the most thoughtful and intellectually honest writers in the biz.  So when he is compelled to puncture the idea that Boyhood is perfect, I can’t just dismiss it.

As it happens, I did not ultimately find his gripes about the movie compelling.  I understand how he reached the conclusions he did, but my recent re-viewing rendered his critiques immaterial.  For instance, he says (and quotes others as saying) that by the end, Mason does not appear sufficiently affected by the various family dysfunctions in his upbringing, and that there is not nearly enough drama and conflict to get us across the finish line.

In one sense, Douthat and his co-contrarians are right:  Overall, Boyhood does not examine the long-term consequences of divorce and other familial unrest on children as thoroughly as it might have.  Nor is Mason himself an exceptionally assertive or colorful character, and he has a definite knack for deflecting would-be hardships instead of absorbing them head-on and having to nurse the resulting emotional wounds.

On the other hand, what does that have to do with anything?

Life only happens once, and we all handle it differently.  If Mason emerges from an adolescence of constant domestic turbulence with a general air of serenity, maybe it’s because that’s just the kind of person he is.  On what basis should we expect him to act any other way?  If the years of fighting and bitterness between his estranged parents give way to comity and near-reconciliation, perhaps it just demonstrates that adults, like their kids, are sometimes capable of change and personal growth.

It’s absolutely true that the players in Boyhood do not follow the conventions of similar characters in other films, nor does the film itself adhere to anything resembling a traditional plot.

Who ever said that it should?  I don’t know about you, but I prefer movies that approach their subjects differently than movies that came before.  If I wanted to watch the same thing over and over again, I would watch the same thing over and over again.

Indeed, that’s what I seem to be doing lately with Linklater’s little experiment.  Not only is it so very different from everything else that’s turning up in movie houses today, but it is also—if I may end where I began—a novel experience from one viewing to the next.  As Mason grows from a six-year-old into a college freshman, so does the movie itself assume a more confident and fully formed identity.

I can’t explain this.  (Nor do I care to.)  All I know is that I’m still very much in the rapturous, love-at-first-sight stage in my relationship with this movie.  And like all such relationships, it contains a modicum of stone-cold dread for the moment when it all comes crashing down to Earth and I find out that Boyhood is not the greatest thing since gluten-free bread after all.

That’s the trouble with love:  It’s completely irrational, and therefore fragile—especially when reason suddenly enters into it.

I would love to think that my visceral adulation of great films is impervious to logic and to the criticisms of others.  But I am a logical being, too, and cannot depend on sheer faith to ensure that such adoration burns brightly forever.

That’s what makes it so heartening to find other people who feel that burn, too.  Or simply, in this case, someone who sees a portrayal of a young boy’s life and says, yup, that’s how it is.


The G Word

Today in Germany, it’s against the law to deny the existence of the Holocaust.

Today in Turkey, it’s against the law to affirm the existence of the Holocaust.

We’re talking here about two different Holocausts, but the point is the same:  Some countries have the courage to fess up to past atrocities, while others are abject cowards.

For us Americans, the responsibility to acknowledge other countries’ grievous sins would seemingly be straightforward.  And yet, in practice, it has become so fraught and complicated that you’d think we’d committed the crimes ourselves.

I’m speaking, of course, of the annual disgrace that is the American president’s failure to call the Armenian genocide by its rightful name.

Beginning on April 24, 1915—exactly a century ago—the Ottoman Empire in present-day Turkey began a process of premeditated, systematic murder against Christian Armenians living within its borders.  Generally, this was done either through outright slaughter or through prolonged “death marches,” whereby victims would ultimately starve.

At the start of World War I, Armenians numbered roughly two million within the empire itself.  By 1922, about 400,000 were left.

While there remains a debate about the exact numbers, a broad historical consensus has emerged that what happened to Armenians under the Ottoman Turks was, in fact, genocide.  That is, it was a deliberate attempt to annihilate an entire people on the basis of their ethnicity.

(An interesting linguistic footnote:  The word “genocide” did not exist until 1943.  In 1915, U.S. Ambassador Henry Morgenthau referred to the Ottomans’ treatment of Armenians as “race extermination”—a term that, as Christopher Hitchens observed, is “more electrifying” than the one we now use.)

A century on, the legacy of the Armenian Holocaust is as contentious as ever.  However, the basic facts are only “controversial” in the sense that the basic facts about climate change are “controversial.”  Politicians continue to argue, but among the folks who actually know what they’re talking about—in this case, historians—the science is resoundingly settled.

Which brings us to the unnervingly Orwellian chapter of this story:  The careful refusal by every American president to utter the word “genocide” whenever the subject comes up.

It’s weird and frightening that this is the case, and in more ways than one—even when just considering the present occupant of the Oval Office.

You see, it’s not as if Barack Obama avoids the issue altogether.  Thanks to the efforts of the Armenian community in America and elsewhere, he doesn’t have a choice.

During this centennial week, Obama aides have met with several Armenian-American groups, and Treasury Secretary Jacob Lew is in Armenia’s capital to mark the anniversary.  National Security Advisor Susan Rice, meeting with Turkish officials, called for “an open and frank dialogue in Turkey about the atrocities of 1915.”

Nor—while we’re at it—does Obama himself deny the truth that is staring him directly in the face.  In January 2008, as a presidential candidate, he said, “The Armenian genocide is not an allegation, a personal opinion or a point of view, but rather a widely documented fact.”

And yet, in the six-plus years of the Obama administration, the word “genocide” has never passed the lips of any American official.

The explanation for this is depressingly straightforward:  Turkey, a strategic U.S. ally, denies that such a genocide ever took place, and the U.S. is terrified that if we declare otherwise, our relationship with Turkey will suffer irreparable harm.

That’s right:  Our government, in our name, is publicly maintaining a major historical lie in order to placate a foreign country that murdered a million and a half of its own citizens and, a hundred years later, still pretends that it didn’t.

By comparison, just imagine a world in which it was official U.S. policy not to formally recognize an organized plot by Hitler’s Germany to eradicate the Jewish population of Eastern Europe.  (To say nothing of the continent’s gays, Gypsies, Poles and others.)  Imagine if Germany today claimed that the six million Jewish casualties were essentially a fog-of-war coincidence.  Imagine if Angela Merkel arrested and jailed anyone who implied otherwise and the U.S. did nothing meaningful to stop her.

We don’t need to imagine it.  Replace “Germany” with “Turkey” and “Jews” with “Armenians,” and you’re left, more or less, with the world we have.

The Turkish government acknowledges that a great many Armenians were killed in the First World War, but denies that it was the Ottomans’ fault.  Further, thanks to Article 301 of the Turkish Penal Code, anyone who argues to the contrary can be imprisoned for the crime of “denigrating the Turkish Nation.”  By not going all the way in our condemnation, we Americans—the people who are supposed to be leading the world in justice and freedom—allow the practice to continue.

It’s a moral disgrace by all involved—an insult to Armenians, to history and to truth itself.  And everybody knows it.

That’s the creepiest part:  It’s not just that so many officials are saying something untrue.  They’re saying something untrue that everybody knows is untrue.

It’s the very essence of totalitarianism:  Create your own reality and exert no effort in making anyone believe it.

In actual dictatorships, this strategy works because the leaders wield absolute control over their citizens.  (To wit:  If you’re being starved, tortured, raped, etc., the fact that your government is also duplicitous is not a particularly high concern.)

On the other hand, such transparent dishonesty never works in democracies like ours, because our system is designed to make it impossible.  So long as we retain the freedom of expression, the separation of powers and a reasonably competent press corps, the truth will (eventually) rise to the surface.

So the president will eventually come around on this issue, and the Republic of Turkey will just have to deal with it.

Until that happens, however, Obama’s ongoing squeamishness will continue to validate the pessimism of many voters that the promise of “change” in Washington is an illusion.  That campaign pledges, however sincere at the time, will always ultimately be overruled by entrenched interests at home and abroad.  That insurgents who vow to “shake things up” are no match for the status quo.

To be sure, there’s no point in being naïve about these things.  If you’re the leader of the free world, you can’t just go insulting other countries willy-nilly and expect nothing bad to happen in return.  You have to accept the world as it is, politics is the art of the possible, blah blah blah.

But does the bar for political pragmatism really have to be set this low?  By acceding to other nations’ fantasies about the facts of history, aren’t we diminishing not just history but ourselves?  Are we not paying a ransom that any other wrongheaded country could demand as well?

Why would we do this?  Why should the bad guys win?

It’s certainly not inevitable.  Just look at Germany.

A mere seven decades after committing the most horrible crime against humanity in modern times, the Federal Republic of Germany stands not just as a stable, functioning, open society, but as Europe’s premier economic power and—crucially—just about as un-anti-Semitic as it’s possible for such a country to be.

Of course, in a nation so large, pockets of anti-Jewish sentiment still percolate, some of which manifest themselves through violence.  However, the overall prevalence of German anti-Semitism today is no greater than that of most other nations in Western Europe, and is considerably smaller than some (looking at you, France).

More to the point:  Since completely reinventing itself during and after the Cold War, Germany, in its official acts, has never stopped apologizing for its wretched past, even going so far (as I noted earlier) as to punish anyone who “approves of, denies or belittles an act committed under the rule of National Socialism,” along with anyone who “assaults the human dignity of others by insulting, maliciously maligning, or defaming segments of the population.”  This might explain why the country’s Jewish population doubled in the first five years after reunification, and then doubled again over the next decade and a half.

In America, of course, those sorts of laws would be completely unconstitutional, as the First Amendment guarantees the right to insult whoever you want.  However, as both a Jew and a defender of human dignity, I appreciate the sentiment.  Better to outlaw lies than truth.

This is all to say that Turkey will ultimately come to terms with the darkest period in its history, and all the reconciliation that it entails.  We can’t be sure how long it will take for such a proud nation to own up to its past cruelties.  But there is one thing of which we can be sure:  It will have no reason to take that leap until it stops being enabled into complacency by superpowers like us.

Nothing to See Here

The problem isn’t that Hillary’s burrito bowl doesn’t matter.  The problem is that it does.

Oh, it certainly shouldn’t be anybody’s concern that Hillary Clinton popped into a Chipotle somewhere along her magical mystery tour through the Midwest last week.  Contrary to popular belief, presidential candidates do occasionally eat lunch.  It’s not an inherently newsworthy event.

In fact, you’d need to be more or less clinically insane to be so invested in a potential president—19 months before the election—that you wonder where (and what) they ate this week.  Or ever.

Many words leap to mind.  One of them is “stalker.”

But, of course, that’s what our country’s venerable press corps spent its time doing in the opening days of the Hillary Clinton campaign.

From the moment the former secretary of state announced her candidacy in a YouTube video on April 12, a gaggle of reporters raced to her roving campaign headquarters—a Chevy Explorer named Scooby—and they’ve been holding her road trip under a microscope ever since.

When word came that Clinton had patronized a Chipotle without anyone noticing, the media couldn’t let it go.  Over the next several days, no stone of Burritogate was left unturned:  What Clinton ordered, whether she left anything in the tip jar, why she was there incognito and didn’t mingle with other customers.

This is probably the moment for us to wryly observe that if the media had been as maniacally vigilant about the Iraq War as they are about a former senator’s dining habits, the last dozen-odd years of Middle East calamities might have been avoided.  But that’s a cliché for another day.

The fact, in any case, is that the press is treating this early phase of the 2016 election pretty much as you’d expect:  By clinging onto every last microscopic detail of the two parties’ respective contenders and wringing as much meaning out of them as they can.

At bottom, this is the result of two simultaneous—and seemingly unavoidable—conditions.  First, the reporters in question apparently only exist in order to cover these sorts of things.  Because, you know, it’s not like there’s anything else happening in America that might provide a better use of their time.

And second, since the first primary ballots of the 2016 race won’t be cast for another nine months, they really have no choice but to cover literally anything the candidates do.  Thus far, Clinton is the only active campaigner on the Democratic side, so there you have it.

The logic of it, however depressing, seems airtight.

It’s not.  There is a choice involved here, both for journalists and for us:  The choice to look away.  To ignore everything to do with the 2016 election until—oh, I don’t know—the year 2016.  To wait patiently until something interesting happens, rather than trying to create interest out of nothing.

We could allow ourselves a scenario—if we so chose—in which presidential aspirants would go on their whistle-stop tours of Iowa and New Hampshire for years on end, but without reporters and cameras breathing down their necks 24 hours a day.  Grant the good residents of these early primary states the attention of the candidates, but not of the entire country.  Really, what do the rest of us care?

There are those—particularly on the interwebs—who will insist to the last that early nuggets from the campaign trail can serve as insights into a candidate’s character and managerial style, and are therefore worth covering and commenting upon.

As much as I would love to dismiss this theory outright as a load of hooey—political pop psychology run amok—I am in the unfortunate position of agreeing with it.  At least some of the time.

For instance, it became clear in the early days of John McCain’s first run that his scrappy, welcoming attitude toward the press would make him an uncommonly congenial and entertaining nominee (a fact that, admittedly, didn’t quite hold the second time).  Conversely, I think Rand Paul’s already lengthy history of arrogance and condescension toward reporters asking him simple questions should rightly give pause to anyone who thinks it’s a good idea to make this guy America’s chief diplomat to the world.

I’m not convinced, however, that it requires two full years of coverage for the truth about these people to come out.  Indeed, I am as certain as I can be that a person who completely tunes out all “news” about the 2016 election from now until, say, next February will be no less informed of its essentials than someone watching The Rachel Maddow Show every night between now and then.

I should add that, so long as reporters continue hounding candidates day and night, I have no particular problem with viewers at home following it as pure, disposable entertainment.

Just don’t pretend that it’s anything else.

Personally, I think it’s kind of hilarious that Hillary Clinton has named her campaign van Scooby.  It’s goofy, whimsical and endearing—and possibly a latent grab for the stoner vote, considering whom it’s named after.

But I did not need to know that.  It’s not important, and it reveals nothing relevant about Clinton that I won’t learn through debates, speeches and actual primaries.

More to the point, I did not need a professional journalist to tell me the van’s name, knowing what that journalist might have learned and written about instead.

The key in covering a round-the-clock event that goes on forever is knowing how to distinguish the things that matter from the things that don’t.  When reporters treat everything equally—as if where a candidate eats lunch is just as important as what he or she thinks about climate change—they license voters to do the same thing, leading to a campaign that is dangerously trivial.

The trouble, you see, is that talking about a trip to Chipotle is a lot more fun than talking about, say, ISIS.  Given the choice, there isn’t one of us who wouldn’t secretly (if not openly) prefer the former, even though we know the latter is infinitely more consequential and pertinent to being president of the United States.

Which means that we can’t be given the choice.

We can’t have our laziest instincts accommodated by being told that following the most inane details of a presidential campaign makes us informed citizens.  It doesn’t.  It just makes us voyeurs and turns our candidates into exhibitionists.  To elevate irrelevant pablum to a level of respectability is to enable both us and them into being our worst possible selves.

As we well know, the cultural erosion this practice creates does not end with the campaign.  Think about how many precious TV hours and newspaper columns have been expended on the exploits of the first family, or on the president’s March Madness bracket.  Or the fact that the White House Correspondents’ Dinner is still a thing.

The human need for trivia is plainly innate and inescapable—hence the proliferation of reality TV, People and the National Football League.

However, government and politics are supposed to exist outside the world of superficiality, not least because the future of the republic depends upon them.

Is it too much to ask that we take this one aspect of American life seriously?

If our press corps didn’t spend days on Hillary Clinton’s burrito runs and the like, would we really be unable to handle the stuff that actually matters?

Don’t answer that.

The Sanctity of Life

If a convicted murderer wants to die and the family of his victim wants him to live, should he be executed anyway?

That question joins an already crowded field of considerations about what to do with Dzhokhar Tsarnaev, the Boston Marathon bomber whose trial enters its “penalty phase” on Tuesday.  Tsarnaev was found guilty of all 30 counts against him; now, the same jury will decide whether to sentence him to death.

My own view is that execution would be a mistake, and that the case for life imprisonment has grown stronger by the day.

Before we go any further, however, both parts of my opening question must be qualified.

First, we don’t know for sure that Tsarnaev desires his own death, although we can be forgiven for having that impression.

During the fateful day—exactly two years ago today—in which Tsarnaev eluded authorities by hiding in a shrink-wrapped boat in Watertown, he took the time to write out a message for whoever happened to find him.  It read, in part, “I ask Allah to make me a shahied (iA) to allow me to return to him and be among all the righteous people in the highest levels of heaven.”

“Shahid” is the Arabic word for “martyr.”

This may well have been bluster, and the defense will likely argue that he was not in the proper state of mind for his note to be taken seriously.  Maybe he’s actually terrified of death and just wanted to put up a tough front.

However, given the nature of his crimes and his family’s flirtations with jihadism—a movement defined by an eager willingness to enter the kingdom of heaven through violence—it’s reasonable to assume he means what he says, and that being killed by the state is a result that he welcomes.

That’s the first qualifier.  As for the second:  It’s not that the Richard family wants Tsarnaev’s life to be spared, as such.  They just want this whole terrible ordeal to end, and a life sentence is the only practical means of doing so.

Martin Richard was the eight-year-old boy who, with his parents and two siblings, was standing mere inches from the pressure cooker bomb Tsarnaev dropped on the sidewalk on Marathon Monday.  The resulting explosion killed Martin almost instantly and seriously wounded his mother, father and younger sister, Jane.  (Miraculously, their older brother, Henry, was unhurt.)

On Friday, the parents, Bill and Denise, issued an open letter on the front page of The Boston Globe, asking the Department of Justice to abandon its case for execution, writing that “the continued pursuit of that punishment could bring years of appeals and prolong reliving the most painful day of our lives.”

“As long as the defendant is in the spotlight, we have no choice but to live a story told on his terms, not ours,” the letter continued.  “The minute the defendant fades from our newspapers and TV screens is the minute we begin the process of rebuilding our lives and our family.”

That the Richard clan has taken this public stance is compelling for at least two reasons.  First, as the family most deeply and directly affected by the actions of the man on trial (as opposed to those of his brother), they wield absolute moral authority on what constitutes justice in this case.

And second:  Up until now, the fact and details of Martin Richard’s murder have served as the prosecution’s strongest argument in favor of putting Tsarnaev to death.  (What could possibly be more irredeemable than deliberately killing a child?)  For Martin’s survivors to publicly oppose it puts the government in a slightly awkward position.  How can prosecutors continue to press their case in the knowledge that it’s only inflicting further torment upon those whom they wish to protect?

To be sure, the Richards were not Tsarnaev’s only victims, and they certainly don’t speak for all of them (nor do they presume to).

Further, it is an unavoidable fact of our judicial system that criminal trials are not, strictly speaking, about the victims.  Heinous crimes are committed not just against individuals, but against civilized society as a whole, and punishments should be handed down accordingly.  This is especially true with respect to the Boston bombing, which was purposefully an assault against the entire American culture.

However, this doesn’t mean Bill and Denise Richard don’t have a point.  Taken together with all the other arguments against capital punishment—in general and in this particular instance—their plea makes the case for keeping Tsarnaev alive almost overwhelming.

To begin with, legal experts have affirmed that the concern about a prolonged appeals process is 100 percent merited.  Should the jury choose death, it would likely take many years—and God knows how many millions of dollars—before the execution would actually occur.  Timothy McVeigh, the most infamous recent federal inmate, was killed four years after being convicted for the bombing in Oklahoma City.  However, factoring in all federal and state-level offenders, the average waiting time on death row is nearly a decade and a half.

Who’s looking forward to that?

Then again, nobody ever said justice comes swiftly.  The bigger, deeper question is whether capital punishment would be justice at all.

If the assumption is that Tsarnaev committed the worst of all crimes and deserves the worst of all punishments, the science is not settled that death by lethal injection is that punishment.  Not even close.

As The Boston Globe recently explained, a lifetime prison term for Tsarnaev would probably be served at ADX, a “supermax” prison outside Florence, Colorado.  There, he would spend 23 hours a day locked in an 87-square-foot cell with concrete furniture, never interact with other inmates, never have visitors, be force-fed if necessary, and possibly go years without seeing natural light or so much as touching another human being.

Last year, Amnesty International released a report saying the conditions at ADX “amount to cruel, inhuman or degrading treatment or punishment in violation of international law.”  A 2012 BBC story explained, “The design of the cells and the architecture of the prison conspire to render the inmates docile and drive them mad.”

No wonder the government doesn’t want Tsarnaev to end up there.  It might lead us to feel sorry for him.

It’s quite an achievement for America to have made life imprisonment a fate worse than death—a fact that, in the present debate, seems vaguely important.

You see, the trouble with death is that we know so little about it.  We know it involves the rotting of the body, but we have no idea how it affects the soul.  (If souls exist, that is.)

We don’t know, for instance, that every suicide bomber doesn’t gain entrance to paradise when he blows himself up in a crowd of civilians.  We don’t know that the good ascend to heaven while the wicked burn in hell.  We don’t know whether a fresh corpse is immediately reborn, and whether the resulting entity is an improvement over the previous one.

In short, we sentence people to death on the basis of a gamble we cannot possibly have any confidence about.

By contrast, when we send someone to prison, we know exactly what it will entail.  There’s not much guesswork involved.  We don’t need to use our imaginations about whether and how the assailant will be punished—although many of us seem to enjoy doing so.

With Tsarnaev, there is one consequence of his execution of which we can be confident:  It would succeed in making him a martyr in the eyes of other jihadists.  By being killed by the United States, he would become yet another icon in the insane war against Western civilization by radical Muslims.  Untold numbers of young Islamic extremists would very predictably be inspired to take up arms as a result, giving us that many more enemies to fend off in the future.

This doesn’t mean we shouldn’t snuff the little bastard anyway.  It just means we need to be aware of what the act would do, and decide whether it’s worth it.

Well:  Is it?

Remember:  When the U.S. government gets into the business of killing people, it is no longer only a matter of seeking justice for a terrible crime.  It’s about national values.  It’s about what it means to be Americans.

I understand the theory that a person can commit an act so evil that he forfeits his right to live.  I appreciate the notion that America values the lives of its citizens so highly that it is prepared to exact the most profound retribution against those who callously extinguish them.  I think there are some people who deserve to die and that Dzhokhar Tsarnaev might be one such person.

However, I think that all of the above are superseded by the highest American value of all:  Charity.

The United States, as I understand it, is a country that treats people better than they deserve.  A country that doesn’t inflict cruel and unusual punishment, even when it probably should.  A country that believes in the sanctity of life so strongly that it grants it to everyone, including those who are practically begging for death.  A country that behaves with restraint when it could easily exert overwhelming force—just to prove that mercy is a nobler impulse than vengeance.

I am perfectly aware that, in practice, none of those things is actually true—at least, not too often.  We arrest and jail people who have committed no crime.  We torture prisoners with happy abandon.  And, of course, we wage wars that claim innumerable innocent lives, which we rationalize away at dizzying speeds.

Indeed, America has never succeeded in living up to its own ideals.  That’s what makes them ideals.

I just don’t see how executing Dzhokhar Tsarnaev would do any goddamn good.

We are supposedly in an ideological conflict against a creed that worships at the altar of death—people who believe that the killing of one of theirs should be met with the killing of one of ours.

I think we ought to try a little harder not to prove them right.


The coolest kid in the room is the one who makes absolutely no effort to be cool.

This is a fact we all learn sooner or later, although it always seems to be long after we’ve graduated high school—the period of adolescence when it would do the most good.

But no matter.  Better to know a crucial fact of life late than never.  And make no mistake:  Grade school is not the only place in which coolness—and perceived coolness—plays a major role in shaping the American culture.  For better or worse, it’s a factor that stays with us until the bitter end.

But that is what makes my opening observation such good news.  It’s a shame we’ve gone to such lengths to keep it a secret.

When you’re young, you want nothing more than to blend in with the crowd.  In practice, this often means suppressing or altering your personality—and with it, your true thoughts and feelings about how you see the world—lest everyone else think you’re a weirdo.

Sociologically speaking, this is perfectly rational behavior, especially if being different means getting stuffed into a locker, or worse.

However, once you escape from that 12-year prison sentence and spend a bit of time in The Real World, you realize the people you truly admire are those who refuse to fit in:  The folks who think differently and do not worry about how it might affect their social standing—either because they don’t care about their social standing, or simply because it never occurred to them to be anything other than their true selves.

In a land of self-consciousness and insecurity, the confident man is king.

All of which is prelude to a simple but important fact:  Last Week Tonight with John Oliver is the coolest show on TV.

John Oliver, of course, first became known to America as a correspondent on The Daily Show with Jon Stewart, itself regarded as perhaps the hippest half-hour in all of basic cable, owing to its penchant for exposing hypocrisy in politics and the media and—it must be said—for toeing a reliably liberal, millennial-friendly line on most issues.

Then in the summer of 2013, Stewart took a three-month sabbatical to direct Rosewater, during which time Oliver assumed the anchor chair and proved he was talk show host material.

Last Week Tonight premiered on HBO on April 27, 2014.  Broadcast for a half-hour every Sunday night, it proved an immediate creative success.  After nearly a year on the air, it has become indispensable—for reasons both obvious and unexpected.

To be sure, Last Week Tonight is appealing for many of the reasons The Daily Show is appealing.  They both skewer political disingenuousness and stupidity wherever they occur.  They both traffic in cheap puns and ironic cultural references.  And they’re both fundamentally more honest and trustworthy than most actual TV news programs.

But Oliver’s shtick is no carbon copy of Stewart’s.  Airing on HBO, his show is not burdened by strict time limits, commercial interruptions or—crucially—censorship.

Also—and perhaps paradoxically—because Last Week Tonight airs only once a week, Oliver is able to go into far greater depth than a show that runs every night.

Indeed, a typical episode of Last Week Tonight contains only a cursory recap of the past week’s news.  Oliver will briskly tick off the highlights—joking all the way, of course—before proceeding to the main event:  A lengthy, deeply researched, fully fleshed-out monologue about a topic of his choosing.

There are no parameters for what the issue can be, and they have varied wildly from week to week.  Some are of clear relevance to recent events (e.g. government surveillance, drone warfare and income inequality), while others seem to fall randomly out of the clear blue sky (e.g. exposés of nutritional supplements, FIFA and the Miss America Pageant).

The show’s genius—the quality, above all, that makes it essential viewing—is to introduce subjects that are boring, complicated or obscure and force us to care about them—first by making them comprehensible, and then by making them funny.

To achieve this week in and week out is not difficult.  It’s impossible.

But Oliver does it, and the service he renders is nothing short of heroic.

His premise—unspoken but unmistakable—is that most Americans’ priorities are completely out of whack, and that the issues we should care most about are the ones we most ardently ignore—often quite willfully.

In broaching matters that most of us would rather not broach, Oliver’s greatest weapon is empathy.  He will often begin a segment with an apology, acknowledging up front that, deep down, none of us really wants to deal with, say, the minutiae of civil forfeiture laws or the compromised relationship between doctors and pharmaceutical companies.  Why would we?  The very sound of those words causes our eyes to roll up into the back of our heads.

Oliver’s response, in effect, is to say, “Bear with me.”  Sometimes he will reel in his audience with a promise of a reward at the end of the segment (for instance, a YouTube video of a hamster eating a tiny burrito).  Other times, he will plow right ahead, trusting us to follow along in the understanding that whatever’s coming is important and worthy of our attention.  After all, why would a budding TV star risk his career on something that isn’t guaranteed to spark viewers’ interest?

In point of fact, that’s exactly what Oliver does.  As for why, perhaps it’s that raising awareness of heretofore overlooked concerns is more important to him than his standing in the Nielsen ratings.  Maybe he’s more interested in telling us what we should know than what we want to know.

And that is why Last Week Tonight is the coolest show on television.

For a textbook example, look no further than last Sunday’s episode, during which Oliver marked Tax Day with a defense of the IRS.

Yup:  In the very week that hatred of the Internal Revenue Service by every man, woman and child in America reached critical mass, Oliver went out of his way to put in a good word for the government agency whose sole purpose is to get between you and your money.

After acknowledging—in typically thorough fashion—that the IRS has often proved itself horrendously inefficient at providing basic customer service and at correcting its own mistakes, Oliver proceeded to illustrate that to direct all of your contempt about taxes toward the IRS is to fundamentally misunderstand how our government works.

“Blaming the IRS because you hate paying your taxes is a bit like slapping your checkout clerk because the price of eggs has gone up,” Oliver explained.  “It’s not her fault. She’s just trying to help you get out of the store.”

He’s right.  To the extent that the U.S. tax code is confusing, unfair and ridiculous, it is entirely the responsibility of Congress, whose esteemed members wrote the damn thing in the first place.

The Internal Revenue Service has absolutely no say in how America’s tax structure works.  All it does—all that it’s meant to do—is enforce the policies that Congress lays out, and do so in the fairest possible way.

And yet—as Oliver went on to show—IRS offices are regularly targeted by the very taxpayers they mean to assist—often in the form of questionable substances affixed to senders’ returns.  In a clip from a 2007 documentary, we see the director of an IRS processing center explain, with almost comical detachment, how employees will simply wipe the offending substance from the check and send it on its merry way.

In portraying America’s annual Tax Day mania from the IRS’s point of view, Oliver’s implicit message becomes clear:  By making IRS workers the bad guys, we taxpayers are behaving like a bunch of whiny, self-righteous idiots.

The truth is that an IRS employee is like any other low-level office drone:  He spends eight hours a day moving paper around before returning home, emptying a pint of Jameson and passing out on the couch.  Projecting all of your frustration at him accomplishes nothing except proving that you are a colossal, inconsiderate jerk.

This was a point that absolutely needed to be made, yet one that hardly anyone wanted to hear.  For as long as paying taxes has been everyone’s least favorite springtime activity, the Internal Revenue Service has been the perfect villain:  An entity that we can all get together to detest.

Why ruin our fun with reality?

It’s a thankless job, indeed, to confront people with facts that make them feel guilty or foolish.  Perhaps not as thankless as performing an audit on someone who wants to squish you like a bug, but close enough.

The person who does it will never be popular, because why would he be?  They say a true friend is someone who will always tell you the truth, but when was the last time the truth made you feel better?

Truth-tellers are essential to a society that so stubbornly insists upon living in a fantasy world.  However, because the very concept of unattractive facts is anathema to the American way of thinking, the bearers of bad news will only ever be those with enough nerve to resist the peer pressure of groupthink and to shrug off the assumption that distinguishing yourself from the crowd has no social benefits.

No surprise, then, that the coolest man on American television is an Englishman.

The End of Comedy

Should today’s comedians tailor their material for people with no sense of humor?

Obviously the answer is no.  But you’d never know it from the past few weeks, in which far too many humorless rubes have had far too much say—and sway—over what cheeky, intelligent comics are allowed to say.

Increasingly, we are becoming a society in which every public statement—be it serious or in jest—must be understood by the dumbest, most literal-minded person in the room, and in which irony and sophistication are punished and looked upon with scorn.

It’s a form of cultural suicide.  Shame on us for doing so little to stop it.

We could look just about anywhere for examples, but at this moment, we might as well begin with Trevor Noah.

A stand-up comedian by trade, Noah was unknown to most Americans until the fateful moment two weeks ago when he was given the job of a lifetime:  Successor to Jon Stewart as host of The Daily Show on Comedy Central.

Naturally, this announcement led Daily Show viewers to plumb the Internet for clues about who the heck Trevor Noah is.  As it turns out, he is an uncommonly deft and sneakily subversive 31-year-old from South Africa who found great success in his country of birth—in radio, television and onstage—before wafting over to the United States in 2011.

He is also an extremely active presence on Twitter.  Since joining in 2009, he has issued nearly 9,000 tweets in all.  (That’s roughly four per day, in case you didn’t want to do the math.)

Like the rest of us, Noah tweets pretty much every half-clever thought that pops into his head, and because he tells jokes for a living, the entirety of his Twitter output covers an awful lot of ground.

By itself, this fact is not especially interesting—and certainly not “newsworthy”—but then the world made a horrifying discovery from which it has not yet recovered:  Some of those 9,000 tweets were politically incorrect.

The horror.

I confess that I have not personally read all six years’ worth of brain droppings from an entertainer who’s been culturally relevant for 15 days.  However, many people apparently have, because within hours of Noah’s hire, they produced the aforementioned damning tweets, about which two facts stand out:  First, none of them is less than three years old.  And second, you can count them on the fingers of one hand.

What is their content, you ask?  Which 140-character quips are so horrible—so appallingly beyond the pale—that their existence is germane to us several years after the fact, and possibly grounds for dismissal for the man who quipped them?

They were, in no particular order:  A putdown of Nazi Germany.  A mild critique of Israel.  An observation about the scarcity of white women with curves.  And a musing about the value of alcohol for women with a few too many curves.

And.  That’s.  About.  It.

At this juncture, we could dig deeper, if we were so inclined.  We could follow the lead of Noah’s critics, attempting to connect a handful of disparate tweets to the inner workings of his soul.

Or we could choose option B:  Grow up, get a life and stop throwing a tantrum every time someone says something that makes us uncomfortable.

I’ll keep it simple:  If a biracial comedian’s cracks about white women are too much for you to handle, then you have no business watching Comedy Central.  If you cannot stomach the notion of an émigré from South Africa having a critical view of Israel—a country that tacitly supported South Africa’s apartheid government until the bitter end—then you’d better steer clear of any newspaper or magazine that crosses your desk, because it just might give you a heart attack.

Sorry to break the news, but one of the consequences of living in a country with freedom of speech is that people will occasionally speak freely, and you might not agree with all of them.

Or, in this case, even understand what they’re saying.

My fear, you see, is not just that free expression itself is under attack, but that a great deal of this offense-taking is based on misapprehensions.  That smart people cannot say anything in public without worrying how their words might be interpreted by idiots.

Case in point:  Note the stupidity surrounding Bill Maher’s recent throwaway gag about how Zayn Malik, the now-ex-member of One Direction, bears a passing resemblance to Boston Marathon bomber Dzhokhar Tsarnaev.

Juvenile, yes.  But the logic could not have been more obvious:  Person A looks like Person B, end of joke.  It’s funny (or not) because one is evil while the other is an innocuous pop star, and that’s what irony is all about.

No one could possibly have understood the joke in any other way.  And so, of course, everyone did.

OK, not everyone.  But there were enough complaints about Maher “comparing” Malik to Tsarnaev—paired with the fact that Malik is Muslim, which no one outside the One Direction fan club would have known—for this to become a news story in many major publications.  For a solid few days, an HBO talk show host was compelled to explain the comedic concept of implying that one famous person looks a little bit like another famous person.

Has America really become that intellectually infantile?  Is this the level to which our public discourse has plunged?  How long will our best and brightest continue to shoulder this burden before everyone else finally wises up?

Certainly, it’s not a new phenomenon that an entire culture can get dragged down by its lowest-hanging fruit—our so-called “bad apples.”  Just look at how a handful of corrupt, racist cops have tarnished the image of their entire profession, even as 90-something percent of their colleagues are doing their jobs exactly as they should.

But it’s even trickier when it comes to the militant enforcement of political correctness, because unlike killing unarmed black people, being offended by a joke as a result of your own ignorance is not against the law.  As my eighth grade history teacher said, “In this country, you’re allowed to be stupid.”

And it’s not just about jokes.  The tendency to lazily misinterpret a sophisticated public statement has consequences for our political leaders, too.  And, indeed, for the very language we speak.

I am reminded, for instance, of candidate Mitt Romney touting his family’s support for civil rights by saying, “I saw my father march with Martin Luther King.”  George Romney was, indeed, a strong ally of the Civil Rights Movement, consistently supporting Dr. King’s efforts and even leading a Michigan march (as the state’s governor) to protest the police brutality in Selma, Alabama in 1965. However, according to newspaper reports, Romney and Dr. King never literally appeared at the same event on the same day.  This led the media to tar Mitt Romney as a liar for implying that they had.

In one sense, the media were right to call Romney out for saying something that was technically untrue.  However, considering the full context of Romney’s statement—namely, the fact that his father was a champion of black civil rights, despite being a white Republican—we can accept the words “march with” as a rhetorical device in service to a broader truth, rather than as a bald-faced fabrication.

Except that we don’t accept such things anymore, because we’re too busy setting mousetraps for our public servants to get caught in.  Thanks to the wonders of the interwebs, we live in an age in which every statement is maniacally fact-checked and a politician can’t get away with anything.

For the most part, this is a good thing, because it means that true deceptions get exposed within minutes of being uttered and our leaders are kept relatively honest.

However, this instinct toward righteous, ruthless truth-seeking can be taken too far, leading us to take down politicians for transcendently silly reasons, and possibly dissuading future leaders from ever entering the arena.

So long as our public figures have reason to worry that everything they say will be taken literally—including words and phrases that are self-evidently figurative—they will have no choice but to dumb down their oratory and rhetoric until all the poetic flourishes are gone—and, with them, any hint of inspiration or linguistic flair.

That’s how our future is looking, so you’d better prepare yourself.  At long last, we are fulfilling the prophecy of Vanity Fair editor Graydon Carter, who remarked one week after 9/11, “It’s the end of the age of irony.”

It took 13 years, but we’ve finally achieved a culture in which no one is allowed to be funny.

That is, unless one of two things happens:  Either the dolts who can’t take a joke suddenly acquire the powers of subtlety, or the rest of us stop giving them the time of day.  I don’t know about you, but I have a pretty good idea about which of those scenarios is more likely to occur in our lifetime.

If history has taught us anything, it’s that stupidity cannot be eradicated.  It can only be marginalized, ridiculed and ultimately ignored.

At Peace With Passover

Growing up, it never occurred to me that Passover could be enjoyable.

To be fair, it’s not really supposed to be.  And in my childhood, no one made much effort to change that.

For an awfully long time, Passover meant exactly two things:  Enduring a mind-numbingly boring Seder two nights in a row, and eating nothing but matzo for a week.  (Matzo, of course, is the large rectangular cracker that is often said to taste like cardboard—a claim that, as every Jew knows, gives short shrift to cardboard.)

As with every other Jewish holiday, the observance of Passover is awash in symbolism about an event in the Biblical past in which Jews were treated horribly—in this case, the Israelites’ enslavement in Egypt.  The traditional Seder, as spelled out in the Haggadah, contains no detail that isn’t a specific reference to some element of the Exodus narrative and its implications.  For instance, we eat horseradish to remind us of the “bitterness” of slavery, and we remove ten drops of wine from our glasses to mark each of the Ten Plagues that wiped out the Egyptians.

When you’re, say, five years old, going through this routine is every bit as much fun as it sounds.  In my family, it didn’t help that we read from a Haggadah written in a form of English that Shakespeare would’ve found arcane, or that we stage-frightened kids were tasked with reciting the “Four Questions” in front of everybody—in Hebrew!

So that was Passover for quite a while.  Not torture, per se, but certainly one of the more dreaded nights on the calendar.  (Not to mention the eight days of not being able to eat bread, cereal or pasta.  The horror.)

Then something funny happened:  I grew up.

Today, I have managed to get over my selfish adolescent hang-ups and appreciate Passover for what it really is:  An opportunity for Jews to enjoy each other’s company and consume ungodly amounts of food.

Essentially, Passover is Thanksgiving dinner preceded by two hours of saying grace and four glasses of wine.  No wonder grown-ups like it more than kids.

As I have discovered in recent years, you do not need to be an especially observant Jew to get something out of this holiday.  Actually, you don’t need to be Jewish at all.

All you need—if we’re gonna get right down to it—is a good host and a good crowd.  This year—not to the exclusion of other years—our family had both.

At both Seders we attended last weekend, there were very few attendees who would describe themselves as devoutly religious on a day-to-day basis.  In addition, we had a number of non-Jews in attendance—folks who either had a Jewish spouse or were simply good friends with the other guests, happy to be included.  Not to mention people, like me, who think organized religion is generally a bad idea but have nonetheless retained a small piece of their Jewish identity, if only on special occasions like this one.

But you would not necessarily have assumed any of that from our gatherings, which followed the basic structure of the Haggadah from start to finish, albeit with a fair amount of condensing and modernizing.  We covered every facet of the Exodus story and ruminated on why it’s worth retelling, and in a way that even the gentiles could appreciate.

In effect, we split the difference between Passover’s inherent solemnity and our modern, slightly irreverent sensibilities, crafting ceremonies that were simultaneously traditional and accessible.  What with the lively atmosphere and the regularly scheduled wine-drinking, it didn’t seem like much of a wait before the food came.

And boy, did it ever.

It’s true that Passover tradition prohibits the consumption of chametz, meaning anything containing wheat, rye, barley, oats or spelt (whatever that is).  While this certainly eliminates a significant chunk of the Great American Diet, we so easily forget how much culinary goodness is left.

At our Friday Seder, with a crowd practically spilling out into the hallway, dinner was a cornucopia of roast turkey, beef stew, fried eggplant, marinated beets and an exceptionally fragrant matzo ball soup (according to legend, Marilyn Monroe was fed this dish so often by her Jewish husband, Arthur Miller, that she was compelled to ask, “Isn’t there any other part of the matzo you can eat?”).

On Saturday, we were hosted by a family composed (mostly) of vegetarians, resulting in a meal that included French lentil soup, roasted potatoes, grilled salmon and a tofu stir fry that almost made me forget how much I love meat.

If those all sound like unimaginably delicious entrees that could be served at any old time of the year, it’s because they are.  And they are all perfectly acceptable on Passover.

My point here is that those who complain about the dearth of decent Passover food are either grossly misinformed or simply enjoy complaining about things.  (Not that Jews have ever been known for kvetching.)

There’s a widely accepted truism that non-Jews enjoy matzo much more than Jews do, owing to the fact that non-Jews are never forced to eat it.  However, this isn’t quite correct:  Except at the Seder itself, where matzo is introduced as one of the evening’s many symbols, Jews are not compelled to consume the crumbly, unleavened atrocity in order to fulfill the commandment about avoiding chametz.  You can’t eat bread, but you’re perfectly free to avoid matzo as well.  There is more than one aisle in the supermarket.

In fact, the reality is even better than that.  Thanks to the miracle known as matzo meal—a powdery substance that behaves like flour without actually being flour—it is possible to cook and consume various baked goods without technically disobeying God’s dictates.  Admittedly, some of these confections are appalling—bland to the point of offensiveness.  However, others manage to be striking approximations of the real thing.  In our kitchen this week, for instance, we stumbled upon a recipe for Passover apple cake, and I’ll be damned if it doesn’t taste almost exactly like real, honest-to-goodness apple cake.  If you like, I’ll send you the recipe.

Is this cheating?  A cheap loophole through which to violate the holiday’s spirit without quite violating its word?

You bet it is.  And if there’s one thing we Jews are good at, it’s finding cheap loopholes.

Except that we aren’t breaking the spirit of Passover when we bake cake substitutes and the like, because doing so requires altering our behavior just enough to reflect on how this week is different from all other weeks (to paraphrase from the Four Questions).  As with Starbucks’ supposedly failed recent campaign to foster conversations about racial inequality, the point is to get our attention—to elevate our consciousness about a subject we might otherwise ignore.

Certainly, for many Jews, the above is hardly a sufficient level of observance in the eyes of God.  Among the more conservative of the tribe, any diversion from the original script is an abomination, and if anybody is enjoying themselves at the Seder table, they’re doing it wrong.

All I can do is point out that I, a resolute nonbeliever, have been compelled to keep one of Judaism’s holiest festivals, without any external pressure, for no reason except that it gives me pleasure.  After a period of divorcing myself from all expressions of religious faith and observance, I have partially reintegrated myself into the Jewish community, finding it to be not nearly as incompatible with my own values as I thought.

I cannot really account for this, and I don’t doubt it’s a function of being able to pick and choose which parts of Judaism to accept while ignoring the rest of it—including the idea of the Torah being literally true.

Then again, that’s how everybody approaches their religion of choice:  They pluck out the bits they like and pretend the others don’t exist.  There’s nothing dishonorable in this.  Considering the many ways most religions contradict themselves, it would be impossible to do it any other way.

As such, I don’t see why a non-religious person shouldn’t go along with the values and rituals that religions get right—much in the way that believers have adopted secular ideas when it has served their purposes.

That I have made peace with Passover may well indicate, as some like to claim, that believers and non-believers have a lot more in common than they think.

Or maybe it just indicates that the Jewish hankering for gefilte fish is impossible to shake.