Act Naturally

What’s the sign of truly great acting?  It’s when you don’t realize it’s acting.

“The very best actors,” Roger Ebert once said, “are the ones who do the least.”  Given a decent script and adequate preparation, he explained, a player in a film need only perform the physical actions required of his or her character, and everything else will fall naturally into place.

Watching Boyhood, Richard Linklater’s most peculiar new movie, I spent a great deal of time regarding its protagonist, Mason, as a largely passive character.  In scene after scene, everybody else is talking and carrying on and being dramatic, and Mason just sits quietly—serenely, even—offering little more than a raised eyebrow or a subtle grin.  Things happen around and to him—indeed, other people seem all-too-eager to tell him how to live his life—and he just runs with it, content not to generate any sort of drama himself.

Most movies don’t allow themselves a passive hero, perhaps from fear of boring their audience.  Screenwriting professors forever caution against it, and Kurt Vonnegut neatly instructed budding writers that a character in a story “should want something, even if it is only a glass of water.”

As Boyhood progressed, and the full measure of Mason’s personality gradually took shape, I realized I was mistaken.  Those early moments of introversion and pensiveness were not signs of a dull or lazily written character.  Rather, they were the opening stages of an ongoing process that we might hazard to call a life.

Ten-year-old Mason’s silent wince when his dad talks to him about birth control is explained not by a timid screenplay.  It is explained by the fact that Mason is a prepubescent kid who has been cornered into an awkward sex chat with his father.  (Did I mention that it occurs in a bowling alley and that Mason’s sister is there, too?)  His reaction is exactly what you would expect of anyone his age, albeit rarely in a movie character of any age.

A few years later, when he joins a group of shady schoolmates on a camping trip, quietly sipping cheap beer and allowing the others to dominate the conversation, it’s not that he has no personality.  Rather, it’s that he is a naturally reserved and easygoing person, and is perfectly fine not being the center of attention.

In high school, where he shows every sign of becoming a highly gifted photographer, a teacher gives him a tongue-lashing for spending too much time in the darkroom when he could be out there doing something lucrative and practical.  His response—shrugging the teacher off and sheepishly defending his work ethic against a charge of lackadaisicalness—might be dismissed by many script coaches as insufficiently confrontational.  In fact, what the scene underlines is Mason’s continued and determined effort to avoid confrontations.  It’s just who he is.

The cumulative effect of all these vignettes—the payoff, as it were—is a character not quite like any other in mainstream movies.  As Mason, Ellar Coltrane gives an exemplary performance, because it never for a moment seems like a performance.  From the opening frame onward, Mason is simply a person we have the unusual opportunity of seeing evolve and grow.  (The movie was shot, little by little, over the course of 12 years.)  Yes, he is a fictional character, and yes, his lines were written down on a piece of paper.  But by the end of the story, we feel like we know him as intimately as many people in our own lives.  He has, through the magic of cinema, been transformed from a puppet into a real boy.

In the film vernacular, the word for this is “naturalism.”  There were points during Boyhood when I felt like I could go on watching it forever—not because its characters are exceptional (they’re not), but because they don’t feel like characters.  As with Linklater’s 2013 film Before Midnight (and its predecessors, Before Sunrise and Before Sunset), the movie plays like a real-time documentary, but with all the exciting drama and wit that actual documentaries rarely contain.

I don’t have children of my own, but Boyhood instilled in me the paternal instinct that comes naturally with the job of parenthood.  Like his mom, played by Patricia Arquette, I became protective of Mason—anxious when he went off on his own; horrified when his drunk, abusive stepfather took him and his stepsiblings hostage; proud when he placed second in a high school photography competition.  I felt those things as I rarely do for a fictional person, but in this case they were earned because the person in question was made to seem real and worth rooting for.

As it turns out, the secret to garnering affection in the movies is the same as in real life:  Don’t put on an act.  Just be yourself.

This Holy Land is Your Land

What if there were no Jewish state?

What if Hamas got its wish and Israel, as we know it, fell into the sea?

It’s the one scenario that the Israeli government considers absolutely unthinkable. So let’s think about it.

Not that Israel would disappear in the manner Hamas would prefer—namely, by taking all its people with it—but that it would simply cease to be a Jewish state, transitioning into a nonsectarian country like any other.

Indeed, the notion is “unthinkable” in both senses of the word. For Israeli Jews—and most non-Israeli Jews, too—the loss of the Promised Land as a birthright is unthinkable in the sense of being too horrible to contemplate.

As well—and perhaps most essentially—the death of a Jewish Israel is unthinkable in the way that the death of a democratic United States or a Catholic Vatican is unthinkable: Present circumstances render such events literally impossible.

In Israel’s case, the reasons are twofold. First, there is the fact of Israel’s nuclear arsenal, whose capabilities represent both an undeniable lifeline and the state’s primary contribution of Orwellianism to the world. (To wit: Everyone knows Israel possesses nuclear weapons, but no one is allowed to say so.)

And then there is the fact of America, which will defend Israel’s right to exist until the last lobbyist dies.

Together, these two forces ensure that, for better or worse, the Jewish state will endure. That even if the supposed nightmare scenario occurs—namely, the literal wiping out of Israel’s population by a nuclear rogue state—the deed to the land itself would not become null and void. It would merely pass down to the next generation of Jews to seize, with the wind of world opinion at their backs.

After all, that is, in a manner of speaking, how modern Israel was born. In the twilight of British control over Palestine, the basis for “The Jews’ State”—as articulated by Theodor Herzl in 1896 and outlined in the Balfour Declaration of 1917—was, above all, to rectify the Jewish people’s long history of persecution and exile by providing them a solid, permanent living space in which such injustices would not occur. In 1948, the recently completed systematic murder of six million of the ancient tribe—that is to say, roughly two-thirds of all the Jews then living in Europe—was the final straw and effectively the end of the argument. (And also, of course, only the beginning of the fighting.)

Bearing this long but necessary prelude in mind, let us now ask: Given the world as it is today, does this argument still hold? Are the Jewish people really as vulnerable as they were at the end of World War II, thereby deserving a state of their own?

If not, what is the justification for an explicitly Jewish state—one that automatically grants citizenship to any interested Jewish person, but does not extend the same courtesy to others?

And if so—that is, if anti-Semitism and the threat of violence and extinction are as great as many allege and fear—what could we reasonably expect to actually happen in a world without a Jewish state? Would Jews become yet more susceptible to others’ ire, or has the world moved on from treating the Jew as a perpetual victim, thereby rendering special consideration obsolete?

Does Israel, as we have known it for the past 66 years, any longer have a purpose?

The United States is explicitly and deliberately a nation of no people in particular. Ours is not a “Christian country,” nor an “English-speaking country,” nor a “right-handed country.” Statistically speaking, all of those characteristics are accurate, but none is required either for citizenship or for special treatment under the law.

By design, America is pluralistic and secular, and it is precisely for that reason that absolutely anyone can coexist with anyone else. It is not even slightly a coincidence that, in a society that does not value one religion over another, the United States has never played host to a large-scale religious conflict within its borders. It is awfully hard to feel repressed when no one is repressing you, even if you’re a Muslim or a Jew.

When faced with this challenge about Israel’s role in the world, many have underlined the country’s status as the only stable, functioning democracy in the Middle East, a region otherwise populated by dictatorships and general political chaos of one sort or another.

With its regular elections and three separate branches of government, Israel is indeed exceptional, given its location and history. Americans and others are right to be thankful to have at least one dependable ally in the neighborhood.

Isn’t that good enough? Shouldn’t Israel’s primary objective in today’s world be to shore up its democratic traditions, not its religious favoritism? Isn’t the latter axiomatically a hindrance to the former? Isn’t the only real effect of Israel being a country for Jews, rather than a country for everybody, the continued uncertainty among all its residents that they can practice their particular faith in peace?

Has Zionism finally played itself out?

Rest assured: I am prepared to entertain the possibility that I am an idiot. That I know even less about Israel than I think I do. That my atheism blinds me to the concerns of the faithful, and that my experience as an American inhibits my understanding of life in the Middle East (or anywhere else).

Perhaps it really is necessary to have a “Jewish state”—not just a “state for Jews”—somewhere in the world, so that those under oppression (real or imagined) can be secure in their persons and their faith. And perhaps the only real concern in the present conflict in Gaza is how the Israeli government should manage itself, not the nature of the government itself.

All of those things might be true. However, I nonetheless suspect that they might not, and I worry that the continued assumption to the contrary will do all of us far more harm than we know.

This Holy Land is My Land

As a freelancer, I like to think that I can write about anything.  And yet, I somehow have a problem writing about Israel.

It’s not because I have a dog in the fight.  It’s because I don’t.  I too easily see both sides of the issue and sympathize with them, and that makes it awfully hard to take a stand in one direction or the other.  And if you don’t have a strong point of view, what’s the point of opening your mouth?

Except that I suspect most Americans—nay, most citizens of the world—also see the nuances in a contest that has brewed for several thousand years, and have likewise made the decision to keep their views to themselves.

The result, as with so much else, is a debate that has existed mostly at the extremes—namely, between those who think Israel is always and forever morally in the right, and those who think the opposite.

And in the rare moment when someone does introduce complexity into the mix—say, when Secretary of State John Kerry makes the obviously true statement, “Today’s status quo absolutely […] cannot be maintained”—that person is roundly condemned as a stooge for either the pro-Israel or pro-Palestine lobby and, for good measure, tarred as anti-Semitic, Islamophobic, or whatever else would effectively disqualify him from the discussion.  (President Obama is regularly called all of those things at once.)

Further—and here is the main cause of my reluctance to engage—once one does attempt to take a long and wide view of the clash between Israelis and Palestinians—or, if you prefer, between Jews and Muslims—one cannot help but be overcome with a sense of Sisyphean futility.  Here seemingly is a conflict that, almost by design, can never be solved.  It just keeps going around and around, accumulating fresh hatreds and injustices (and corpses) with each turn.

Certainly, the present battle in Gaza between Israel and Hamas is insoluble by definition, since the one condition that each side considers non-negotiable is also the one condition to which the opposing side will never, ever agree.  Namely, Israel’s demand to be formally recognized and Hamas’s demand that Israel cease to exist.  Since Israel isn’t going anywhere and Hamas is not about to alter its charter, this particular skirmish can only be resolved through the disempowerment of Hamas, a group that, while designated a terrorist organization by five countries (including the U.S.) and the European Union, is a democratically elected body.

And that only concerns the events of the past three weeks.  Even if Hamas disbanded tomorrow, we’d still have a few thousand years of unfinished business to tend to.

On the $64,000 question—“How can Israelis and Palestinians coexist peacefully?”—we are met with one of the great paradoxes of the age:  In this challenge, with its bottomless well of complexities, the answer is both simple and obvious.

“The first thing to strike the eye about the Israeli-Palestinian dispute,” the late Israeli diplomat Abba Eban is supposed to have said, “[is] the ease of its solubility.”  With two groups of people credibly laying claim to the same piece of real estate, Eban continued, the only possible course of action is to divide the land equally, with one state for each group, so that everyone will have a place to call “home.”  Easy peasy.

Why didn’t this happen, say, 50 years ago?  Why, despite the world’s best efforts, has it continued not to happen in all the time since?  Why does it appear so unlikely to happen today or, indeed, ever?

It is always around this point in the conversation when the blame game begins.  The point at which everything is reduced to “It’s the Jews’ fault” or “It’s the Arabs’ fault.”  And of course, both of those statements are true.

It is true, for instance, that several former Palestinian leaders have passed up perfectly reasonable and mutually beneficial peace deals, essentially out of a mixture of spite and stubbornness about the “Israel’s right to exist” canard.

It is also true, for instance, that the Israeli government has spent the last many years building up and expanding its occupation of the West Bank—against constant protests from all corners of the globe—for no practical purpose except to poke its critics in the eye at precisely the moment when it ought to be shoring up goodwill and trust from both within and without.

We could go on like this all night, deeding yet another generation a lifetime of existing in a state of hatred and perpetual insecurity.  (Amidst the present carnage, it is encouraging to see the occasional appearance of ordinary Israelis and Palestinians who are sick of the whole thing, and who actually do coexist peacefully.)

We could also note (possibly in vain) the clear culpability of religion itself in the matter—a force that leads one to view a common land dispute not merely as a legal or cultural issue, but as a matter of divine imperative.  After all, once you think you have God’s permission to seize and control a chunk of property from now until the end of time, why should some paltry earthly law get in your way?

We could sit back and allow the squabble to play itself out, hoping that, as in places like Northern Ireland and Yugoslavia, the two parties will essentially exhaust themselves into a reconciliation.  In the meanwhile, we Americans could wash our hands and breathe easy at the fact that it is not, finally, our problem to solve.

And, in any case, we can look forward to the glorious day when the stars perfectly align, and both sides of the dispute realize that sometimes human life is more valuable than land, and perhaps even more valuable than justice.

Will that day ever come?  It sure would be nice to think so.

Popularity Fallacy

Jeez, can we knock it off about Bill Clinton’s amazing popularity, already?

You see the talk everywhere these days, including most recently in a column by Maureen Dowd in Sunday’s New York Times.

“As Hillary stumbles and President Obama slumps,” Dowd writes, “Bill Clinton keeps getting more popular.”  As evidence, Dowd cites a Wall Street Journal poll from June ranking the “most admired” presidents of the last 25 years (Clinton won by a mile); a YouGov survey measuring the perceived “intelligence” of the last eight commanders-in-chief (again, Clinton finished first); and a May Washington Post poll putting Clinton’s overall “favorable” rating at a 21-year high.

Indeed, strictly to the question, “Do most people today like Bill Clinton?” the answer is an indisputable “Yes,” and it hardly depends on the meaning of the word “like.”

However, I would argue the question itself is a silly and fairly useless one, as it is with regard to every living (or recently dead) ex-president.

Of course Bill Clinton is more popular today than he was, say, during the “Gingrich revolution” in 1994 or the Lewinsky fiasco in 1998.  Of course he enjoys more general goodwill than President Obama or possibly-future-President Hillary Clinton.

Bill Clinton left the White House on January 20, 2001.  Know what he’s been doing in the 13-and-a-half years since?

Not being president, that’s what.

George W. Bush, for his part, ended his presidency with an approval rating of 34 percent.  Today, that number is 53 percent.  What has Bush been doing these past five years to merit such a rise in stature?

Not being president and painting.

Bush’s father, George H.W. Bush, also clocked approval numbers in the mid-30s during his final months on the job.  Today, he is nearly as admired as Clinton.  What’s he been up to?

Jumping out of airplanes, fishing, and (all together now) not being president.

Of course, I am being a tad unfair and simplistic.  America’s modern-day ex-presidents have, to varying degrees, done a great deal of good work after leaving office, for which they deserve kudos and a second look.  (Jimmy Carter has probably accomplished more in “retirement” than half our presidents did while in power.)

What is more, my “not being president” theory doesn’t even begin to address the large variance in overall perception among the many former presidents under examination (e.g., Clinton ranks considerably higher than Carter), and the myriad possible explanations for it.

But the fact remains that nearly every president in modern history has become more admired in retirement than he was for most, if not all, of his time in office.  To this extent, I think my reductionist hypothesis holds, and I’m sticking to it.

Consider:  To assume the presidency is to become the servant of each and every citizen of these United States, and to be personally responsible for their well-being (as far as they’re concerned, at least) and that of the country as a whole.  To be president is to be constantly photographed and broadcast, to be forever seen, heard and discussed, and to be drenched in a bottomless well of gripes and crises from every corner of the known universe.

However, the moment your term expires, all of that goes away.  To become an ex-president is to be freed not only from the duties and burdens of the office, but also from any expectations of leadership.  You can disappear into the woods, and no one will go looking for you.  You can play golf and eat junk food and no one will give you a second thought.  Constitutionally speaking, a former president doesn’t have to do a damn thing for the rest of his life, and many have been quite happy to do exactly that.

Long story short (too late?), we Americans approve of our former chief executives because we have no immediate or compelling reason not to.  Because they no longer wield supreme influence over our daily lives.  Because they are no longer on every TV screen every hour of every day.  Because they have transitioned from celebrities with power to mere celebrities.  Because their every move and every word are no longer of any relevance to our own existence, and maybe—at least in some cases—because we have forgotten the days when they did.

Today, Bill Clinton’s long-windedness and snark are adorable.  Would we feel the same way if he were employing them back in the Oval Office on the public dime?

George W. Bush has garnered near-universal praise for his marked lack of interest in the nuances of foreign policy in his time away from Washington, even though this same quality yielded a decidedly different response when he was squarely in the middle of the action.

Time may not heal all wounds, but it can certainly numb them and render them moot.  As Paul McCartney said, reflecting on his years with the Beatles, “You always forget the bad bits.”

As we now consider the supposed “inevitability” of Clinton’s leading lady in her possible campaign for president, let us bear in mind that Hillary Clinton’s own popularity—not as high as her husband’s, but certainly an improvement over President Obama’s—is largely the product of her nearly six-year absence from the rough-and-tumble world of retail politics.  If and when she returns to the arena, are the Democratic primary voters who so loathed her in 2008 going to be able to forgive and forget this time around?  Or is the thawing of their icy hatred contingent on her present status as an above-the-fray figure?

I think it is all-too-obvious that our views of one famous person or other are shaped by that person’s role in our own lives, and that the more benign and unobtrusive such a person is, the more popular he or she tends to be.

So stop talking about Bill Clinton’s enduring popularity as if it’s some sort of anomaly or in any way newsworthy.  It’s not and it’s not.  Rather, it is exactly what you would expect, particularly for a guy who wants nothing more than to be liked and who will go to extraordinary lengths to make it so.

A world leader being relieved of his power and becoming less admired as a result?  Now that would be news.

Blue Sunday

Why should the government tell me when I ought to buy booze?  Why should the government dictate when businesses are permitted to sell booze?

Alcohol is a legal commodity.  Why should the government be involved at all?

I’m not talking about age limits, which are probably necessary and which the state can be said to have a “compelling interest” in enforcing (to use the legal jargon).

No, I speak of the tradition whereby a state or local government can restrict the hours during which local businesses—namely, liquor stores—can sell alcoholic beverages to their customers, necessarily abridging people’s purchasing power and, to a degree, cutting into those businesses’ profits.

Specifically, I refer to the famous “blue laws” in the commonwealth of Massachusetts, which most concern me because, well, that’s where I live.

Until this past week, no Massachusetts package store (“packies,” we call them) could sell alcohol before noon on Sunday.  This regulation was established in 2003, amending a previous bylaw that prohibited Sunday alcohol sales outright.

On Tuesday, however, both houses of the state legislature approved a further loosening of this statute, moving up the opening liquor bell on Sunday from noon to 10 a.m., the hour at which bars and restaurants are already able to serve alcoholic beverages to their guests (mostly in the form of Mimosas and Bloody Marys, one assumes).  Whether or not Governor Deval Patrick ultimately signs this bill, state law will continue to stipulate that off-premises liquor sales not occur between 11 p.m. and 8 a.m. on the six remaining days of the week.

By no means is Massachusetts the only corner of America where alcohol cannot be bought and sold at all hours of the day and night.  Nearly all 50 states have such restrictions of one kind or another, be they statewide or on a town-by-town basis.  (A notable exception is Nevada, where it’s very nearly illegal to be sober.)

But the Sunday issue is a singular phenomenon, slightly separate from (and more interesting than) all other liquor laws in these United States, and also more specifically tied to the history and sensibilities of old-fashioned New England.

While no one can quite agree on the origin or exact meaning of the term “blue laws,” the concept arose for unambiguously religious reasons.  Blue laws were, in the first instance, a means of enforcing the Fourth Commandment, “Remember the Sabbath day, to keep it holy.”

Accordingly, the term encompasses proscriptions on all manner of formal activity performed on Sunday, when everyone is supposed to be at church.  These include prohibitions on selling cars, opening grocery and department stores, hunting game and, for a short time in one part of New Jersey, “singing vain songs or tunes.”

While a great many of these ordinances have since been relaxed or abolished, some are still on the books, particularly regarding booze.

Crucially, the rationale for them has evolved from its Biblical roots, since any attempt to regulate business practices based on a few lines from Exodus would be seen today as flatly unconstitutional.

Instead, we have the Supreme Court asserting, in the 1961 case McGowan v. Maryland, “The present purpose and effect of most of our Sunday Closing Laws is to provide a uniform day of rest for all citizens […] [T]he fact that this day is Sunday, a day of particular significance for the dominant Christian sects, does not bar the State from achieving its secular goals.”

In other words, the government is within its rights to mandate that certain businesses take a day off, on the grounds that it is in the best interests of everyone—believers and nonbelievers alike—for them to do so.

And so the question becomes:  Is this rationale good enough to merit the forced halting of free enterprise during certain designated hours?  What “secular goals” are we talking about, anyway?

Is it to protect the lowly employees of these establishments from being overworked?  Tell that to the minimum-wage laborers at 24-hour Walmarts and IHOPs, which somehow manage to evade such regulation of their business hours.

Is it to stop people from drinking too much, with the added assumption (to borrow an adage from How I Met Your Mother) that no good purchasing decisions are ever made after 2 o’clock in the morning?  The argument seems reasonable and desirable on its face, until you begin to apply the lessons of Prohibition, which include the fact that the most surefire way of getting people to drink is by making it difficult for them to do so.  That alcohol today is so very available, indeed, only reinforces the peculiarity of thinking its effects can be reined in by locking the cash register for a few hours every week.

Which returns us, in a way, to my original question:  What’s the point?

So long as booze remains a legal product, and so long as individuals continue to enjoy it, you cannot physically prevent them from doing so, and you probably shouldn’t try.

Those who want to take a “day of rest”—at church, at home or anywhere else—are free to do so.  No one is stopping them.

As for people who would love to take Sunday off but can’t because they have to work:  I’m afraid the shuttering of package stores will not be of much help in this regard (except, of course, for those who work in one).

Meanwhile, come September there will be a significant chunk of the American public for whom the prospect of Sunday-as-Sabbath means precisely one thing:  NFL football.  And you know what really complements a nice, relaxing day of game-watching—say, something cold and refreshing that you could pick up on the drive over to your friend’s place?

Here in Massachusetts, let’s say it’s probably a member of the Adams family.  And I don’t mean the presidents.

Don’t Make Friends

Life Itself, a new documentary about the life and death of film critic Roger Ebert, features many talking-head interviews with friends and admirers from throughout his life, who expound on his virtues—and a few of his vices—in order to contextualize his significance in the twin realms of movies and film criticism.

An obvious take-home message of the movie is that Ebert was truly one-of-a-kind—the rare critic who became as well-known (and as beloved) as many of the people he wrote about.  Indeed, perhaps the most peculiar sight in Life Itself is the number of filmmakers who reminisce about ol’ Roger as if he had been a close, personal friend.  By all outward appearances, he was.

You’ve got Martin Scorsese crediting Ebert’s unerring support with saving his own career at a point when cocaine and despair could very easily have killed it (and him).

Ramin Bahrani, a moderately successful independent director, credits Ebert with effectively putting him on the map, which he did not merely with glowing four-star reviews but through tireless advocacy and unofficial patronage.  He used his own high status to build up that of someone whose films he thought deserved to be seen, serving as a friend and mentor along the way.

(We might as well also mention that the director of Life Itself, Steve James, made a documentary in 1994, Hoop Dreams, which Ebert proclaimed the best movie of the 1990s.)

On a personal level, these and other examples of Ebert’s huge heart and sense of moral justice are admirable and a wonder to behold.  Would that more powerful people used their influence for good, not evil; we’d be living in a far more pleasant society.  Who could possibly object?

And yet, we are left with the inconvenient fact that Ebert was, after all, a critic, and a critic is supposed to be objective—or at least aspire (and, for Pete’s sake, appear) to be so.  For a movie reviewer to be forming bonds of friendship with movie makers—well, a term like “fraternizing with the enemy” leaps to mind, not to mention “conflict of interest.”

It’s an issue first of ethics, and second of judgment, and we are obligated to consider both.

I offer it as a three-part question:  Should any beat writer become close with those about whom he is writing?  If so, is he then bound either not to write about them at all, or to issue a clear “full disclosure” notice to readers upon doing so?  And if not, has he not then surrendered any notion of objectivity, even if he makes a strong effort to separate personal feelings from professional responsibility?

In the case of Ebert, one could argue that he never attempted to disentangle his emotions from his intellect in the first place.  Ebert himself argued as much, saying that he worked according to the sentiment expressed by fellow critic Robert Warshow: “A man goes to the movies.  The critic must be honest enough to admit that he is that man.”  In other words:  Impartiality be damned.

As for the possibility of being corrupted by friendships, Life Itself argues that Ebert did not lose his perspective when appraising films by people he knew and liked, and shows (albeit with only one example) that he was capable of filing negative reviews even when, for personal reasons, he had every incentive not to.  In the end, according to the documentary, he resisted the urge to become a professional hack.  It is left to each of us to ascertain whether this assessment is true.

But what if you’re a member of a profession for which impartiality is not merely recommended and preferred, but is outright mandatory?

For instance:  What if you’re a journalist?

We assume—nay, we hope and pray—that the reporters and columnists on whom we depend to tell us what is happening in the world and to keep our leaders honest are not cavorting around with the very figures they are meant to skewer and critique.

After all, in the world of political journalism, the question isn’t whether becoming friendly with politicians might distort a journalist’s work.  The question, rather, is how could it not?

And yet we have spectacles like the White House Correspondents’ Dinner, where journalists and lawmakers drink and intermingle on live television, suggesting all-too-convincingly that this is not the only night of the year on which the Fourth Estate enjoys a social relationship with its would-be targets.  The scribblers of America might think they are not being unduly influenced by this, but we have only their word on which to rely.  (Most of the time, we don’t even have that.)

America’s judicial system addresses the subject of impartiality through recusals, whereby a judge abstains from presiding over any case in which, because of the people involved, he or she has (or might have) a rooting interest.

The process of jury selection functions in the same way.  The first question all prospective jurors are asked, as they sit before the judge of a pending trial, is whether they know the plaintiff, defendant or any of the witnesses personally.  Of course, there are many reasons a juror might be led to favor one side over the other, but it all begins with personal connections, be they friendly or hostile.

Would you want your vindictive ex-girlfriend in the jury box at your own trial?  Would you want the best friend of alleged Boston Marathon bomber Dzhokhar Tsarnaev among the jurors at his?

I didn’t think so.  It would be a travesty and a miscarriage of justice, and our system is all the more laudable for taking such pains to ensure it doesn’t happen.

Why shouldn’t the press be held to this same high standard?  Why are reporters allowed to so casually exchange pleasantries with the movers and shakers of government one day, and then be expected to disinterestedly grill them on TV and in newspapers the next?  Since when do any of these folks merit our benefit of the doubt?

The fact is that personal relationships are inherently corrosive to our ability to assess a person’s character and actions fairly.  That’s why friendships are so wonderful and hatreds so toxic.  The former allow us to fool ourselves into thinking certain people are more perfect than they actually are, while the latter do precisely the opposite.

Because this is how human nature works, and because we cannot pretend otherwise, we must make every effort to prevent such bonds from taking root in areas of professional life in which they do not belong.  And when they do take root, those involved should have the decency to call a spade a spade, lest they make a mockery of themselves, of us and of the eternal search for justice and truth.

Anything else would be unseemly.  Ebert would not approve.

Turn Off, Tune In, Log Out

How often we hear from those—including many Millennials—who want nothing more than to turn away from the technological toys they have been given, and return to a calmer, simpler time.

Specifically, I speak of the incessant complaints by many 20-somethings (and everyone else) about the increasing ubiquity of Facebook, Twitter, and so forth in the daily life of virtually every person in the industrialized world, and how much they wish it were otherwise.

We’ve all heard it.  No doubt, many of us have said it ourselves.  It’s not that we wish the social networking apps that have so dominated and defined the generation now coming of age would altogether cease to be.  It is, rather, that we would have them play a much smaller role in our comings and goings.

Indeed, it is the fondest wish of (some of) these skeptics to detach from these omnipresent websites completely, if only to prove that Mark Zuckerberg and his ilk do not completely control our lives.

To these folks I have but one question:  What the heck are you waiting for?

If you really want to leave Facebook, then leave Facebook.  If you no longer wish to tweet, then don’t tweet.

If the addictiveness and lack of privacy inherent in these resources bother you that much, then, as the English say, get on with it.

Or, alternatively, you could cop to the truth of the matter, which is that you simply cannot imagine your life without Facebook et al., that you wouldn’t know how to function in the supposedly idyllic pre-Internet universe for which you pine, and that, in the end, you’d much rather complain about the drawbacks to these technological marvels than actually part ways with them.

To be sure, there’s nothing unusual about this conundrum.  We all feel your pain.  No one particularly wants his or her privacy compromised, and few have any great confidence that the CEOs of these companies hold their customers’ privacy as their primary concern.

The question is whether we, today, have the choice to opt out of this system.

Literally, we do.  But practically?  Well, that may be the issue on which our entire culture hinges.

By all outward appearances, my own grandparents have managed to get away with it.  They have never owned a personal computer, let alone smart phones or Flickr accounts, and are able to operate happily on a day-to-day basis more or less as they always have.  If they need to be informed of anything important, someone gives them a phone call.  When a bill comes due, they send a check in the mail.  And if they absolutely must look up something online—well, that’s what libraries and grandchildren are for.

Of course, there is a giant asterisk to this story, which is that virtually everyone they know is well aware of their technological limitations, and has learned to act accordingly.  What is more, they spend half the year in a Florida retirement community, which means that nearly all the people they regularly interact with are, themselves, roughly on the same page.  So long as enough members of this generation exist, the rest of the world will be required to accommodate them.

Those in my parents’ generation, meanwhile, seem to lie right on the cusp of old and new ways of living.  For that reason, their experience is perhaps the most instructive of all.

My mother, after years of resistance, now has an active Facebook account.  My father does not, and possibly never will.

Like the rest of us, my mom joined out of a sense that she would otherwise be “out of the loop” regarding what certain friends and family members were up to—a concern validated by, say, learning about an engagement or pregnancy several months after the fact, since the announcement was made exclusively via Facebook, and everyone who received it assumed that everyone else had, too.

My dad, who is as interested in other people’s lives as anyone, does not seem to worry that his absence from online social networks has abridged his access to such information in any meaningful way.

Maybe this is because he knows he can rely on my mom to catch him up, if need be.  Maybe it’s that, like my grandparents, he is confident that the members of his real-life social network will keep him abreast of all important developments through other, more old-fashioned means.  And maybe the day will yet come when he determines he is not sufficiently plugged in to the world around him, and has no choice but to sign the “user agreement” the rest of us have found impossible to resist.

And if that happens, should we regard it more as an act of free will or latent cultural coercion?

We are advised, from an early age, not to bow to “peer pressure” and never to automatically run with the crowd, particularly when we object to the direction in which the crowd is headed.

And yet today’s social networks are a sparkling illustration of how nearly all of us do precisely those awful things, and how prevailing cultural trends have all but forced it upon us.

There is a playful paradox at work here, as we puzzle over how the collective world population has managed to levy peer pressure upon itself.  (I am somehow reminded of Yogi Berra’s quip about some restaurant, “Nobody goes there anymore; it’s too crowded.”)

In any case, the Facebook phenomenon is nothing if not self-perpetuating:  As more join, more are compelled to follow suit.  This means that the only way this form of human connectedness could possibly abate is if a whole mess of people suddenly and forcefully abstain, casting society back to the bucolic, antiquated days of, say, the 1990s, when you learned what was going on by reading a newspaper or talking on the telephone.

Is that really what we want?  From the way many bitch about the Internet’s imperial, invasive designs, the answer is a definite “maybe.”

If so, these naysayers can take some comfort from the reminder that millions in America and elsewhere still live precisely that way, with no desire to change course.

For those who have already yielded to the tide but now nurse second thoughts, the decision to withdraw from online social life is fraught with difficulties that the blissfully ignorant probably cannot appreciate.

But I maintain, nonetheless, that it is not impossible, and I would urge such dissidents to give it the old college try, lest they show themselves to be full of nothing but hot air.

Live as you truly wish to live.  What, apart from everything you’ve ever known, do you have to lose?

Dietary Restrictions

Would you ever eat a person?

What if it turns out that we humans are delicious?

Suppose the person in question, when he was still alive, gave you full permission to do so?  (For our purposes, we’ll assume the specimen is, in fact, dead.)

What if, by means of preparation, this slab of man meat were properly cooked and presented so that, from looks alone, you’d never even know what you were putting in your mouth?

Or, finally, suppose you were just really, really hungry?

Being the wealthy, industrious, resource-rich Westerners that we are, we seldom spend much time pondering these sorts of questions, if only because, under normal circumstances, we don’t have to.  (Particularly that last one.)

As a general rule, the notion of cannibalism strikes us as a repulsive one—the mark of a savage, immoral (or at least amoral) species.  While the history of the practice is intriguingly rich and diverse—among other things, it has been employed as a form of punishment, revenge, bereavement and strategic “predator control”—by far the most common motivation for the consumption of humans by other humans has always been sheer, terrified desperation.

Be they crewmen lost at sea, systematically starved prisoners or famished citizens of some Third World hellhole, most of those who feast on human remains do so not because they want to, but as a simple (or not-so-simple) matter of life and death.  They are profoundly hungry, and all other sources of nourishment have either been exhausted or were never there at all.

In the dystopian new movie Snowpiercer, in which a sudden ice age has forced the Earth’s survivors onto a sprawling locomotive divided into two sections—one for the rich, one for the poor—we find that the ragged slaves in the rear have, indeed, resorted to hacking and eating each other in order to survive.  In a chilling monologue, the film’s “hero,” played by Chris Evans (Captain America to you), despairs, “I hate that I know what people taste like.”  (The line that follows, which I won’t reveal, is even creepier.)

We assume and hope that the desperate, extreme circumstances that would force us to confront the cannibalism question literally (not just theoretically) will never happen in our culture, in our lifetimes.  Among the many subjects raised by Snowpiercer—a brutal but highly intelligent movie—is the prospect that it might.

The broader question, in any case, is how and where we draw the line for our own behavior with respect to the whole animal kingdom.  In extremis, how far are we willing to go?  Is there anything we will not do, no matter what?  Why is one act acceptable while another is not?  And who gets to decide, anyway?

Never mind cannibalism, and never mind extraordinary circumstances.  Let’s stick with the creatures we eat on a near-daily basis, and those we very well could.

For instance:  Would you, like the citizens of numerous Eastern countries, ever consider eating dog meat—say, if it were on the menu at some swanky restaurant and were, it turned out, every bit as yummy as roasted lamb?

Well, why not?  Why do we Americans eat lambs and chickens and cows but not dogs and cats?  Is it only because dogs and cats are adorable?  Would you be more likely to chow down on a boxer than a yellow lab?  We don’t seem to differentiate based on cuteness when it comes to fish or fowl.  Should we?  And if sentimentality is not the primary basis for what separates a pet from an entrée, then what is?

Vegetarians, to their credit, live by a clear and consistent moral principle:  If it’s a sentient being, it should not be eaten, period.

On this point, it is strictly we carnivores who have some ’splaining to do.

One can choose to approach the act of meat eating with no particular ethical qualms, as most of us effectively do when we bite into a hamburger without any concern for the dignity of its source.

But once you make that choice, consciously or unconsciously, you are left to contemplate whether you do, in fact, possess a moral limit for what you will willingly deposit into your stomach, and how you calculated where that limit is.  Since we each must do this individually, we necessarily invite the possibility that all such valuations (and the laws governing them) are ultimately arbitrary, and that there is no objective basis for what constitutes fair game (so to speak) and what does not.

And if you really, truly do not apply ethics to eating in any way, shape or form, then we end where we began:  If we’ll eat anything, why not each other?

Happy (Almost) Fourth!

Boston need not feel bad about not celebrating the Fourth of July on the actual fourth of July.

After all, even the original Fourth of July did not occur on the actual fourth of July.

Under normal conditions, the Hub of New England hosts a free concert every Independence Day around sunset, attended by north of half a million spectators. The show, performed by the Boston Pops Orchestra and various “special guests,” takes place at the Hatch Shell along the Esplanade, and culminates in a fabulous fireworks display over the Charles River.

This year, however, saw abnormal conditions in the form of a tropical storm called Arthur, which was forecast to (and did) dump bucketsful of rain on our fair city at precisely the wrong moment, forcing the Pops concert to be moved.

While Arthur’s projected storm track might have made today, July 5th, the most logical Plan B, the powers that be opted instead to throw the party on Independence Day Eve, July 3rd, reportedly to avoid a scheduling conflict with the featured act, the Beach Boys. (Indeed, I think we can all agree that the comfort and safety of 600,000 revelers should never take precedence over the possibility of the city losing its deposit on the entertainment.)

In any case, the weather cooperated and the night-before-July 4th show went off practically without a hitch. (I say “practically” because the 1812 Overture had to be scrapped to avert a wind-and-hail-laden fireworks show, but let’s not quibble.)

While no one can decently fault state officials for not wanting to have a massive outdoor concert in the middle of a tropical storm, it nonetheless feels somehow wrong that we would commemorate July 4th on any day other than July 4th.

It’s one thing to “observe” occasions like Memorial Day or Presidents Day on days other than when they were originally intended to be held, but with the Fourth of July—well, for Pete’s sake, the date is right there in the name! To shoot off fireworks on the 3rd or 5th instead might not be as radical as, say, hanging up Christmas lights in August, but it’s still a bit odd, and something we’d like to avoid whenever we can. It’s a betrayal of tradition, if nothing else.

And yet the tradition in question is, itself, grounded not in fact but in myth.

As all students of American history well know, the sacred date of July 4, 1776 is something of an historical misnomer. In point of fact, nothing of any great consequence happened on that day, in Philadelphia or anywhere else.

The celebrated final vote for independence took place, after a great debate, on July 2nd. The signing of the document—contrary to John Trumbull’s famous painting—occurred with no particular pomp or fuss on August 2nd, as each delegate dropped in to the Pennsylvania State House (now known as Independence Hall) at his convenience before returning to personal business.

Yes, the date “July 4, 1776” appears at the top of the parchment, and it was the day that the final wording of the declaration was approved. But surely such a trifling formality is not what we think we’re honoring when we ring bells and eat hot dogs on the sacred day itself.

Nope, the collective decision to mark the birth of our country on the fourth is one of those things that just sort of happened, and now we’re stuck with it.

As such, it is my vain hope for the future that we un-stick ourselves and celebrate our independence whenever we damn well please.

As I’ve already noted, to tweak the date of America’s birthday each year would not be without precedent on the American calendar. To the contrary, most of our national holidays, whatever their origins, are “observed” on a date or day of the week that is most convenient for the most people—typically a Friday or Monday, to give ourselves a three-day weekend.

What is more, the Reform temple to which my parents belong will occasionally observe some Jewish holidays the week before or after they actually fall, in order to accommodate the highest possible number of congregants. (In the event, say, that a particular festival occurs during school vacation, when nobody is around to attend services.)

Similarly, I once fasted on Yom Kippur, the Jewish day of atonement, one day early, so as not to interfere with my work schedule. I figured that God, if he existed and was fair-minded, would prefer such a compromise to my not fasting at all. And anyway, it was to no one’s advantage for me to be hustling around at work on a profoundly empty stomach.

Strictly on the matter of birthdays, who amongst us has not attended or played host to a birthday party held not on the actual day, but rather on the nearest Friday or Saturday night? That’s just basic good planning. What’s the point of being precise and official if it means fewer people eating cake and having a great time?

So let us not worry ourselves one whit if circumstances force us to move a national observance in one direction or another. Ultimately, the date doesn’t matter—all that matters is that we take the time to observe at all.

As a wise person (or two) once said, it’s the thought that counts.

A Fresh Start

There may be no more alluring delusion than the idea of a clean slate.

Whether in our personal lives or as a whole people, we stubbornly and constantly cling to the possibility that all our sins and silliness can be wiped away and that, armed with the wisdom and experience we now possess, we can simply begin again, in a sort of cosmic Take Two.

Such is the ethos underpinning New Year’s resolutions, weight-loss diets, cross-country moves, career shifts, rehab clinics, second marriages, third marriages, having children and emigrating to a new country.

On a national scale, we think of elections in much the same way.  They are our singular opportunity to throw out one set of bums in favor of a different set, on the assumption that the mess made by the incumbents can be expunged through the mere opening and closing of a voting booth curtain.

Globally, things really get interesting, as we see attempts by citizens of a country to overthrow and flatten the leadership and, in some cases, the very system of government under which they had heretofore lived.

The Iraq War, as it was described by its practitioners in the Bush administration, was nothing less than a means of shaking up centuries of authoritarian rule in the Middle East and remaking the region more or less from scratch.

And the revolution they used as a blueprint is the one we are celebrating this week.

I began with the word “delusion,” which I strongly suspect this whole “begin again” theory to be, in every one of its forms.  Why?  Because while circumstances can change, human beings do not.

Embarking upon a new relationship, you can pledge not to repeat the missteps that may have ended the last one—and learning from mistakes is a valuable human trait—but the essence of one’s character does not magically disappear just because you’re sleeping with someone else.

A recovering alcoholic can vow not to touch the evil elixirs ever again, but the demons that lead one to indulge in the first place never completely go away.

The United States military can try its level best to effect pluralism in the Middle East, but all the hatreds between Sunnis and Shiites somehow always find a way to boil right back to the surface.

Clean slates do not exist.  They are a fantasy we fashion for ourselves as an E-ZPass lane to happiness.  However pure the future might seem, we can never completely escape the past.

Which brings us squarely to the American Revolution.

Of all the inspiring and improbable elements of the rebellion that formally began on July 4, 1776, arguably the most essential—the one that allows the story of American independence to retain its power—is the fact that, in the minds of those who led it, it really was an instance of a brand new country being created from whole cloth.  An opportunity to break from centuries of tradition and begin anew, with a constitution and system of government that had never quite existed before.

It was, you could say, the closest any group of people has come to starting the world over again.  And surely there is something to be said for the fact that, 238 years later, the United States of America still stands.

And yet the image of a so-called “American Eden” is as much a myth as every other example I’ve cited.

Never mind that key phrases and ideas in the Declaration of Independence were stolen from John Locke.  Never mind the horrible things our ancestors did to the Native Americans who had, after all, gotten here first.  Never mind the enormous debt the nascent American republic owed to France, itself a hereditary monarchy, without whose support independence could never have been achieved.

Then there’s the small matter of America’s “original sin”—the necessary corollary to the Garden of Eden metaphor—which, all by itself, destroys any pretense of purity in the formation of a new and great nation.

Yet even in the absence of these giant asterisks to the beloved story of our country’s birth, there was always the fact that this new republic would exist in the same time and place as all the old empires, and would have to deal with them sooner or later.  Even if we were perfect, we lived in an imperfect world that we could not simply wish away.

What is more, the people who designed this new system were, themselves, products of the old system, and therefore subject to certain values and assumptions that proved more durable than they (and we) might have preferred.  The ownership of their fellow human beings was one, but so, too, was the inclination toward petty factionalism, traceable to practically the moment George Washington died, as well as involvement in “entangling alliances” overseas, which our first president so elegantly warned against.  (The term itself, often attributed to Washington, was actually coined by Thomas Jefferson.)

None of this is to suggest that, on balance, the American Revolution wasn’t among the finest things ever to occur in the history of life on Earth, nor that the U.S. Constitution is not arguably the greatest user’s manual for how to run a decent country that has ever been written and put into practice.

But there was nothing “clean” about any of this, and I only ask that we take a moment this Independence Day to recognize it.

History is messy, just as life itself is messy.  We can always strive to improve—to make ourselves “more perfect”—but we deceive ourselves when we think that the past doesn’t matter and won’t have some influence over the future.

All we can reasonably hope is to do our level best here in the present.