Spinning Wheels of War

The Office ended its run on NBC last May after nine seasons on the air.  However, many viewers felt in their hearts that the sitcom was never quite the same after the departure of its star, Steve Carell, two seasons earlier.

The moment Carell’s character, Michael Scott, chose Holly over Dunder Mifflin and flew off into the sunset seemed like the perfect, natural conclusion to the series.  In its final years, The Office was just “spinning its wheels,” as the TV parlance goes.  Sans Michael Scott, the show might have retained its charm, but it had lost its purpose.

While the real world is not generally as neat as the world of television sitcoms, Americans often view reality through the prism of their favorite fiction.  While this is not always helpful or terribly intelligent behavior, it can nonetheless help us to understand why we feel the way we do regarding our country’s place in the geopolitical universe.

Not all of these feelings are wrong.

To wit:  If one thing has been made abundantly clear amidst the United States’ almost-war against Syria, it is that the American people have had it up to here with military interventions in the Middle East, carried out under the banner of “the war on terror.”

Every last public opinion poll reflects the same general trend:  Americans prefer less involvement in foreign affairs, not more, and preferably none at all.  We have neither the time nor the cash to fix all the problems here on the home front, let alone to right every atrocity committed by others (and by us) overseas.

It is sensible enough to surmise, as we have, that the leading cause of this sentiment is the experience of watching the Afghanistan and Iraq Wars become a bit messier and more complicated than we initially thought, and learning that not every crisis can be solved with brute force—even when that force is exerted by the U.S. military.

But allow me to introduce an additional (but not necessarily alternative) explanation for our collective antipathy toward foreign entanglements:

So far as the American public is concerned, the war on terror is over, and has been for quite some time.

The war began in New York on a crisp Tuesday in September, and it ended in a lavish fortress in Abbottabad, Pakistan, some nine-and-a-half years later.

Ever since a team of Navy SEALs killed Osama bin Laden and deposited his corpse into its watery grave, the American war machine has just been spinning its wheels when it comes to conducting the war on terror.

The fact is, we Americans like our wars to be black and white, with easily identified heroes and villains and, more important still, beginning and ending dates that are clear and unambiguous.

We understood the September 11 attacks to be one bookend of a great worldwide struggle between civilization and fanaticism, and with the demise of Public Enemy No. 1 on May 1, 2011, we were provided the other one.

Deep down, of course, we know the world is not that simple.  We know, for instance, that al-Qaeda is not the sort of organism that ceases to exist once you cut off its head—if the group can be said to possess a head at all.

What is more, we were explicitly warned in the earliest days after 9/11 that our country was engaged in “a different type of war” that may well prove more open-ended than past conflicts, with an enemy that claims no particular home base and does not abide by the same rules of engagement as we do.

Yet we have nonetheless clung to the idea that war is finite.  That we embark upon a given conflict with a particular, concrete objective, and that once that objective is either accomplished or proved impossible, we can pack up and go home.

Simplistic and antiquated as this assumption might be, we have every right to continue holding it.

However noble the objective of massacring large numbers of al-Qaeda members might be, it has proved limitless by definition:  For every jihadist we kill, another sprouts up in his place, and the cost of the bombs and bullets required to destroy them only seems to increase.  (Our current drone program attempts to rectify this, but drones have proved problematic in their own right.)

And so the question must continue to be asked:  How much longer will this go on?  Will it ever end?  Has the so-called war on terror eclipsed its natural lifespan, or does it simply not have one?

By continuing to actively repel the forces of fundamentalism around the world, does the United States fight a war with a real and worthwhile purpose, or is it just spinning its wheels?

Don’t Encourage Them

Will Hillary Clinton run for president in 2016?  Will Ted Cruz?  Will Chris Christie?  Will Rand Paul?

Yes.  So long as we keep telling them to.

There are so many excellent reasons for us not to talk about the 2016 presidential election—at least not for another year or two—that it seems nearly pointless to pick out just one.

Nonetheless, there is a particular drawback to early electoral pontificating that is worth underlining, if only because it can be so easily (and perilously) overlooked.

The conventional view is that speculating about an election that is more than three years away is silly, because anything we say now will prove to be utterly irrelevant to the way the campaign actually shakes out.

This is certainly true in some respects.  Three years before the 2012 election, for instance, Mike Huckabee was the leading candidate for the GOP nomination in most opinion polls, and he ended up not running at all.  In the preceding cycle, America spent the entirety of 2005, 2006 and 2007 convinced that the 2008 race would be between Hillary Clinton and Rudolph Giuliani.  Three years out from that election, Barack Obama’s name was nowhere to be found.

The greater concern, however, is exactly the reverse:  That our idle musings about who might make a decent president actually will have an effect on who takes the plunge—and in all the worst ways.

I put it to you like this:  Would you be more or less likely to apply for a particular job if a trusted friend told you that you should?  How about if the idea were floated by not one, but all of your closest confidants, and what if this prodding continued for several years and was amplified, in due course, by the entire American news industry?

At that point, would you not consider yourself worthy of the position, however skeptical you might have felt at the outset?  These people can’t all be wrong, can they?  And think of how disappointed they would be if you declined.

What we are seeing now are the birthing pangs of presidential hubris:  The period in which the media anoints a group of public figures whom it deems potentially interesting candidate material, thereby planting the thought in those people’s heads, after which it becomes only a matter of time before the prophecy is self-fulfilled.

The trouble is that the American presidency is a job for which no one in America is truly and fully qualified, and so the gig tends to attract two types of people:  Those who have always been told they are special and view the Oval Office as their destiny and birthright, and those who have been told they will never amount to anything and seek to prove everyone wrong.

In both cases, when push comes to shove, every serious candidate for the nation’s highest office has been made to marinate in all of his or her best headlines—the inevitable consequence of being surrounded by folks who have dutifully drunk the Kool-Aid and think you’re the greatest thing since sliced avocado.

Today’s young people, the Millennials, are rather famously known as the generation with too much self-esteem.  The kids who were assured, almost from birth, that if they would just follow their dreams, they could be anything their little hearts desired, and whose self-confidence was reinforced all along the way with ribbons and rewards and the guarantee that nothing is impossible.

(I, for one, spent the better part of high school and college being told I might make a decent writer, and that’s why you’re stuck with me here today.)

The fact is that this behavior is not limited to one generation, and it does not evaporate at the end of adolescence.

Politicians feed on positive reinforcement more than most other mammals.  The majority of them are so needy and insecure, even while harboring unconscionable vanity and ambition, that every last stray comment about their potential for higher office serves as validation and a license to abandon any shred of self-doubt they might have had left.

Electoral politics is a personality cult.  This may ultimately be unavoidable.

What remains in our grasp, however, is the degree to which this continues to be the case.

If we truly believe that the cycle of campaigning has spun completely out of control and needs to be brought back to its proper proportions, all we need to do is ignore it until we reach a saner point on the political calendar.

Don’t watch cable news.  Don’t read the results of the latest Iowa straw poll.  Don’t be an enabler of a plainly corrosive process by thinking that your actions have no effect on it.

They do.

As surely as a politician will view public interest as an invitation to start his or her campaign, TV executives will view a lack of interest (and a drop in ratings) as a desperate plea to stop.

Raspberry Picking

We sold the house.

It wasn’t our house.  Not directly, that is.  Or perhaps I should say, not anymore.

You see, it belonged to my grandparents, Eve and Jack, forever known to us as Bubi and Zady.  They lived at 17 Buckley Ave in Whitman, Massachusetts, from long before I was born until each of them died—Bubi in 2005, Zady in 2009.  Since then, legal ownership has fallen to their three children—namely, my mom and her brothers, Al and Phil.

While no members of the immediate family have resided at the rickety, two-story shack for the past four years, we have in the meantime leased it to semi-distant cousins, who have treated it well and provided some sense of continuity following the initial jolt of no longer having either Bubi or Zady to visit.

What visits they were.

Until I was eight and my brother six, we lived less than an hour’s drive away, in Framingham, which allowed for regular day trips to see the grandfolks, along with annual get-togethers like Thanksgiving, Passover and Labor Day, attended by the whole mishpucha—cousins, aunts, uncles.  The whole lot.

Then came the traumatic experience of moving to a foreign land called New York, and suddenly the periodic family drop-in required a bit more planning and a bit more accommodation on Bubi and Zady’s part.  With us no longer having an in-state abode of our own, theirs became our home away from home—a bed and breakfast whose check-out time was determined solely by Dad’s work schedule and Zady’s blood pressure.

(Each time we arrived, like clockwork, Zady would glance at our small mountain of luggage and quip, “Staying for a month, then?”)

Apart from the addition of a first-floor bedroom in the early 1960s and a second-floor bathroom in the late 1990s—the latter was considered a veritable revolution—the house looks exactly as it did when it was built at the dawn of the 20th century.  The kitchen has never contained a dishwasher or proper cabinets.  The second floor never acquired air conditioning.  The basement was never finished.

At one point—nearly the entire 1960s, actually—six people shared its four beds and one bath.  The telephone booth that passed for my mom’s childhood bedroom would, in later years, function as the dive bar (and source of the only television set) where all the men would pile in, like the Marx Brothers in A Night at the Opera, to watch the Sox game or to ring in the new year.

Compact and uncomfortable as the house’s innards were, the main attraction was the yard, which blanketed three of the property’s four corners.  While modest by most standards—less than a half-acre in all—it was positively teeming with life, much of it vegetative.  In its heyday, the grounds boasted three apple trees, two grape vines, a pear tree, a vegetable garden of tomatoes and cucumbers, and the pièce de résistance, twin raspberry bushes.

Every July and October, when the time was ripe, we grandkids were let loose to strip the bushes bare, snatching up as many raspberries as we could pile into our buckets—dodging bumblebees all the while—and delivering them to our grateful hostess, who would proceed to whip up several jars of her famous raspberry jam, permitting us to pocket our fair share of the bounty in its original, juicy form.

The yard also provided the main stage for a century’s worth of birthdays, anniversaries, wiffle ball games, wrestling matches and barbecues.  Last summer, our temporary tenants fashioned it into the chapel for their daughter’s wedding, proving that life does, indeed, move forward.

We do not know what the house’s new owner plans to do with it.  He might lease it to someone else, or perhaps convert it into a duplex or a Dunkin’ Donuts.  He may well drive a steamroller over everything and start over again.  The land will belong to him, and he can do with it what he wants.

In a way, it doesn’t matter what happens to 17 Buckley Ave once the deed leaves our hands, since our memories of the place will endure regardless of whether the house itself follows suit.

Yes, the possibility that the site where virtually our entire family grew up could vanish from the Earth fills us all with unquenchable sadness.  But it’s a sadness that has been a long time coming and is, ultimately, inevitable.

The real tragedy, after all, is not that we have picked our last-ever Buckley Ave raspberries—although that is a tragedy in its own right—but rather that there is no longer a lady there to turn them into jam.

Blurred Meaning

Robin Thicke’s “Blurred Lines,” featuring T.I. and Pharrell Williams, has officially been crowned the song of the summer, with its catchy beat and breezy vocals wafting across FM radio dials from coast to coast.

Having been taken in by the tune’s easy charms myself, I was rather alarmed to realize—belatedly, I admit—that the lyrics of “Blurred Lines” would seem to apologize for, if not outright glorify, sexual assault.

The charge against the song in general—we’ll consider specifics in a moment—is that it amounts to a man declaring his intention to have his way with a woman without pausing to consider whether she is truly on board.  That this man views this woman as little more than the object of his uncommonly animalistic sexual faculties, of which he holds such a high opinion that, in his mind, no woman could possibly object to being given a closer look.  Even if she doesn’t ask for one.

In short, critics argue that “Blurred Lines” perpetuates the “rape culture” that has so frighteningly poisoned the American landscape for the last many years—a milieu that asserts (more or less) that men cannot be held responsible for the natural hormonal instincts that lead them to penetrate a woman without her permission.

Having now given “Blurred Lines” a more careful reading, I do not see why this must be so.

The basis of the outrage is the assumption that the woman in question has not given her consent.  That when the male narrator utters the stubborn refrain, “I know you want it,” the implication is that, in fact, she might not—rather, he is projecting his own desires onto her.  That she has implied “no,” but he has inferred “yes.”

But we don’t know that such a scenario has occurred in “Blurred Lines.”  To borrow an old SAT phrase, the meaning is not clear from the text.

We are told early on, “OK now he was close / tried to domesticate you / but you’re an animal / baby it’s in your nature.”  And later:  “The way you grab me / must wanna get nasty.”  To this, the narrator vows, “Just let me liberate you / you don’t need no papers / that man is not your maker.”  And:  “Nothing like your last guy / he too square for you / he don’t smack that ass and pull your hair like that.”

Now then.

Is it not possible that the reason for the man’s impression of this woman as “an animal” who “must wanna get nasty” is that, at some point during their courtship, she told him exactly that?  Might we entertain the notion that she ended her previous relationship precisely because her partner was “too square” for her and that she wants someone who will pull her hair and so forth?

In other words, rather than projecting things he doesn’t actually know, might the narrator of this song merely be reporting the facts and setting the scene?  Could it be that the woman is not only consenting, but insisting?

While my own experience on this front is regrettably limited, I have read my fair share of Dan Savage’s advice columns and have become sufficiently persuaded by the proposition that certain women, like certain men, are quite keen on rough sex.

What is more, if 21st-century feminism means anything, it means the freedom for women to assert and express their sexual selves as abundantly as their partners and the laws of physics allow.  Could not “Blurred Lines” be an endorsement of this modern, egalitarian sensibility from the viewpoint of a man who sees it as a win-win?

Against this admittedly optimistic interpretation, there are works such as “Project Unbreakable,” a chilling public awareness campaign, begun in 2011, featuring a collection of photographs of women who have been raped, each holding a sign with a direct quotation from the man who raped her.  In a blog post titled, “From the Mouths of Rapists,” novelist Sezin Koehler compiles a sampling of these images whose quotations are identical to (or nearly so) lines from Thicke’s song.

And so the point is made that, whether intended or accidental, “Blurred Lines” promulgates a strikingly casual attitude toward sex that, viewed through the prism of today’s rape culture, is careless at best and reprehensible at worst.

While I maintain that the precise nature of the song’s relationship is ambiguous, perhaps that is the strongest argument against it:  In real-life scenarios, the nature of consent cannot be ambiguous under any circumstances, and a pop song has no business making light of this fact.

That such a song is the most commercially successful track of 2013 so far?  Well, the moral of that story is very ambiguous, indeed.

Oh My Gourd

This year, I think I am going to pass on pumpkin.

Sunday marks the official start of autumn, that most fertile of seasons for commercial exploitation.  Fall has been made to mean a million different things for any interested party, not least as the opening round of Christmas.

In recent years, arguably the most ubiquitous autumnal tentpole of all has been that most alluring of vegetables, the pumpkin—or, to be precise, the myriad uses thereof.

It has become the great national challenge:  Is there anything we cannot create from a pumpkin?  While the answer is most assuredly “yes,” we have made it our mission to turn that “yes” into a “no.”  We get closer with each passing year.

I need not expend all that much time to explain what I mean, as anyone who has ever left his or her apartment during the months of September, October or November surely already knows.

Nothing more than a passing glance at Dunkin’ Donuts’ current window art will give one a fair impression of just how deep this harvest-time fruit cuts into American culinary culture.  You have your pumpkin-flavored coffee, muffins, lattes, donuts, bagels, cream cheese.  You want it, they’ve got it.

It gets worse in the supermarket aisles, where one can now find pumpkin Pringles, pumpkin Pop-Tarts and pumpkin M&Ms, among a billion other items whose identities have been co-opted by seasonal considerations in ways their creators could not possibly have foreseen.

Pumpkin beer?  Let’s not even start.

For a time, I was completely on board with this mass gourd worship, sampling every cinnamon- and nutmeg-infused delicacy I could get my hands on.  I have yet to be convinced there is any confection more impossibly delicious than pumpkin pie, and so I figured it couldn’t hurt to transplant the sugar and spices from that classic treat into every other product on God’s green (and orange) earth.

As it turns out, it could.

While I have not yet had a pumpkin-centric experience that was wholly and irretrievably unpleasant, I have nonetheless been stricken by the disheartening epiphany that not everything can be improved through pumpkinization.  We do it because we can, but that does not mean that we should.

One test for the worthiness of any annual tradition is to ask yourself whether you would partake in said custom at any other time of the year.

For instance, I never miss NBC’s annual Christmas Eve broadcast of It’s a Wonderful Life.  Like so many Americans, I find the ritual of watching Frank Capra’s classic film at 8 o’clock on December 24 to be among the most enchanting in all of moviedom.

However, I am equally content to view the film on any of the 364 other days on the calendar as well.  No, the effect is not quite as magical in the middle of summer as in the middle of winter, but it’s bloody good enough.  A great movie transcends the environment in which one watches it.

On the other hand, I cannot quite say the same for A Christmas Story, the 1983 comedy that TBS broadcasts on a 24-hour loop throughout Christmas Day.  As entertaining as that movie is, at no other point in the year does it occur to me to pop it into the old VCR.  The film is particular to its season, and eternally tethered to it.

One reason Thanksgiving, not Christmas, is the greatest American holiday is that nearly every one of its defining characteristics is not confined to the fourth Thursday of each November.  Turkey, football, apple pie, family quarrels, indiscriminate drinking—is there ever a bad time for any of these?

With pumpkin products, this is simply not the case.  Some are excellent, while others are merely the result of festive capitalism run amok.  However much we might enjoy them in the heat of the moment—albeit a moment that lasts for one-quarter of the year—they are not of the inherent quality that would enable them to become year-round staples, as evidenced by the fact that they aren’t.

What I belatedly realized is how easily and enduringly I fell for it.  How I managed to deceive myself into thinking, as America’s PR department hoped I would, that a pumpkiny presence axiomatically makes everything better.  That these patently mediocre products I kept returning to every fall were somehow compulsory indulgences and, what is more, that they were worth returning to in the first place.

This year I am determined to resist and to scale back, discriminating between the trinkets I truly enjoy and the unnatural pretenders the culture is attempting to jam, ever-so-temptingly, down my throat.

Intemperance Movement

Following Miley Cyrus’s controversial performance during last month’s MTV Video Music Awards, the Federal Communications Commission received some 160 formal complaints from unsuspecting viewers, who expressed varying levels of umbrage at the former child star’s sexually suggestive set.

Last Friday, the FCC publicly disclosed the content of those grievances.  Having sifted through a representative sample of this document dump, I can affirm the complaints are more entertaining than Cyrus was.

Not one of these protests is grammatically correct.  Many seem wholly unaware of both the FCC’s function and the nature of its authority over cable television.  Some make pricelessly awkward attempts to describe precisely which sexual acts Cyrus was suggesting and why they crossed the line.  Still others are so spectacularly over-the-top in their disgust that one hopes they are tongue-in-cheek but suspects they are not.

And this from parents who worry that their children are growing up with inadequate role models.

It can very easily be argued that Cyrus’s VMA act was indeed beyond the pale for broadcast television.  The debate about the line between what is provocative and what is profane is nearly as old as television itself, as the envelope continues to be pushed and people’s tolerance for smut struggles to keep pace.

Accordingly, I frame the conflict within this conflict as the following question:  Are you made more or less comfortable by the fact that, in this debate, the gang that has grabbed the microphone on the pro-“family values” side is a gaggle of hysterical, borderline illiterate wackadoodles?

Certainly, the brashness and immaturity of those who publicly object to arguably inappropriate content on the airwaves do not make their central argument wrong.

For that reason, I would think their more temperate and dignified fellow travelers would be rather alarmed that such crazed loons have fashioned themselves the spokespeople for the cause of upholding a basic standard of decency on cable television.

So long as the public face of this argument assumes a ridiculous, farcical form, the pro-regulation movement will continue to face undue hardships along the way to any possible eventual victory.

For purposes of illumination, permit me an historical analogy.

In 1968, as all hell was breaking loose amidst the escalating Vietnam War, the assassinations of Martin Luther King and Robert Kennedy, and the riots outside the Democratic convention in Chicago, presidential candidate Richard Nixon vowed to restore “law and order” to the United States, appealing to what he would soon famously term “the silent majority” of Americans.

The idea, according to Nixon, was that the agitators protesting in the streets might be loud, passionate and able to command enormous attention in the national press, but that they did not reflect the views of most of the public and should be marginalized as much as possible in polite society.

The election returns that November seemed to vindicate Nixon’s assessment, even though history has shown many of those protesters’ concerns to have been entirely legitimate.

By 1968, opposition to the Vietnam War and the policies that unleashed it was a perfectly respectable position to hold.  However, as he assumed the presidency and further escalated the war, Nixon successfully conflated the obnoxiousness and extremism of many antiwar demonstrators with the antiwar arguments themselves, thereby delegitimizing the very idea that the Vietnam War was a less-than-noble cause for America.

In more recent times, similar fates have befallen environmentalists, whose effective leader for a time, Al Gore, did not always prove the most helpful figurehead for the cause.  Likewise, the anti-abortion movement has not been terribly well-served by folks who roam the countryside with jars filled with aborted fetuses.

The complication is that, from time to time, various forms of extremism have actually worked.

Today’s opponents of gun control have made no effort to appear cool-headed, but their determination and political savvy have enabled them largely to get their way.  A century ago, America’s various temperance societies successfully lobbied to enact Prohibition, even as their most famous cheerleaders were known for marching into taverns and demolishing the merchandise with hatchets.

But these are exceptions to the rule.  Most of the time, crazy does not effect results, and that’s good.  It’s bad enough that so many Americans are under the infantile impression that bad behavior will ultimately be rewarded.  Imagine how much worse it would be if the rest of us managed to prove them right.

Mr. Speaker

What a voice!

Bill de Blasio, the man likely to be the next mayor of New York City, has swiftly become known for all sorts of disparate personal and political qualities.  Among these are his strident denunciations of much of Mayor Mike Bloomberg’s 12-year rule, his characterization of New York as a “tale of two cities,” his towering height and his multiracial family.

For me, at this relatively early stage of the campaign, as striking a feature as anything else has emanated directly from de Blasio’s larynx.  While I cannot explain the science behind it, I find an uncommon tonal clarity in de Blasio’s speaking voice whenever he appears on the TV screen.  His is a voice that commands one’s attention seemingly without much effort, which could prove useful should he prevail in November.

Christopher Hitchens, a gifted orator in his own right, used to teach writing to college students with the formulation, “If you can talk, you can write.”  Finding that his pupils would perk up at this ostensibly encouraging news, he hastily and devilishly inquired, “But how many of your classmates do you truly enjoy listening to?”  The answer would invariably come back:  Very few, indeed.

To be a great speaker requires two overriding skills.  The first, as implied by Hitchens, is lucidity—the capacity to form clear, compelling ideas and express them in an organized and engaging manner.

The other is an arresting voice with which to do the job.

To be sure, the former is largely dependent upon substance, while the latter is merely a function of style.  Accordingly, one can reasonably conclude the first to be of considerably greater importance than the second.

Nonetheless, the power of one’s voice must not be overlooked, for it can often be the determining factor as to whether, and to what degree, one’s message is received by its intended audience.  If the substance of one’s rhetoric is worth saying in the first place, then it might as well be truly and properly heard.

As much as we might gripe to the contrary, the United States is hardly bereft of new, useful and interesting ideas on all the great concerns of the day.  If you wish to explore any and all possibilities for how we might resolve the problems that confront us, you are no farther than a Google search away from the information you require.

Our real national deficit is in spellbinding spokespeople to push these ideas forward.

In 2013, public intellectualism is not among America’s leading industries.  Most public talkers tend to possess one of the aforementioned gifts or the other (or neither), but very rarely is someone blessed with both.

We have our great American thinkers, and most of them are spectacularly boring in their oral exhortations.  (I defy anyone to take in a lecture by Noam Chomsky without drifting off to sleep.)  And we have our great American rhetoricians, and most of what they say is banal twaddle.  (See Philip Seymour Hoffman in The Master and substitute any real life counterpart you wish.)

But someone with a professor’s depth combined with a preacher’s gravitas?  Now you’re talking.

We recently celebrated the 50th anniversary of the “I Have a Dream” speech by Dr. Martin Luther King, Jr., who proved that such a phenomenon is not entirely beyond the reach of man.  The trick to Dr. King’s finest oratory—if “trick” is the word for it—is not just the moral clarity of his ideas, but also (and equally) the sheer aural pleasure in receiving them.

This is not to suggest that a speech requires thunderous, God-like intonations in order to be effective.  Assuming the content is solid, the delivery mechanism need only exude an even mixture of confidence, sincerity, and the authority that comes with knowing what the heck you’re talking about.

Ultimately, the test of whether a speaker’s voice passes muster is reductive:  Ask yourself, in the spirit of Christopher Hitchens to his students, “Do I enjoy hearing this person speak for its own sake?”

It is often said of our finest singers that they could “sing the phone book” and we would still applaud.  When movie critic Roger Ebert died in April, many admirers recounted how they so enjoyed his writing that they would happily read his reviews of movies they had no interest in seeing.

Such should be the aim of anyone who talks for a living:  To regard speaking itself as a form of art, rather than a mere instrument for conveying one’s thoughts to the wider world.

Nothing better assures that your audience will be receptive to a particular proposal than the knowledge that they are prepared to listen to anything you have to say.

The Meaning of ‘War’

Here is a trivia question for you:  When was the last time the United States officially declared itself to be at war?

Answer:  December 8, 1941, the day after Japan attacked Pearl Harbor, inducing the United States to formally enter World War II.

That was it.  So far as the official record is concerned, every American military engagement since 1945—Korea, Vietnam, Iraq, Afghanistan, Iraq again—has been strictly off-book.

This is not to say that those conflicts (and plenty more besides) did not really happen, or that the United States has officially been in a state of peace for some 68 years.

Rather, it calls into question what terms like “war” and “peace” mean in the first place.  In point of fact, such definitions have never been clear, not since the founding of the American republic.

This is no small matter, both in theory and in practice.  As the current scuttlebutt surrounding Syria has reminded us, a great deal hinges on how the United States involves itself in foreign entanglements and, in particular, on who has the final say on whether to do so.

The U.S. Constitution states, in Article I, Section 8, that “Congress shall have power to […] declare War,” but offers no opinion as to precisely what war is or, indeed, what form such a declaration should take.  The president, as commander-in-chief, has the authority to conduct hostilities once they have commenced, but has no explicit license to commence them himself.

As a consequence of such constitutional vagueness on this subject, it has been left to subsequent generations to fill in the blanks.

Two weeks ago, when President Barack Obama announced his desire to launch a series of strikes against the Syrian government, he said, “I will seek authorization for the use of force from the American people’s representatives in Congress,” but also that “I believe I have the authority to carry out this military action without specific congressional authorization.”

I could not have been the only one made slightly ill at ease by the contradiction between those two clauses.

The president said that he was taking his Syria case to Congress because he believes that in so doing, “the country will be stronger […] and our actions will be even more effective,” but would it not be more in the spirit of checks and balances if he were actually required to do so?

The fact is that, while the Constitution delegates the power to declare war to the Congress, the ambiguity with which the very notion of war is understood has allowed the executive branch extremely wide latitude on the actual employment of the U.S. Armed Forces.

In practice, chief executive after chief executive has managed to sneak America into a large-scale armed conflict simply by not calling it a war.  While the president himself may well intend a limited military action not to escalate into a full-blown commitment, history has demonstrated a clear pattern of the former giving way to the latter.

In 1964, for instance, Congress passed the Gulf of Tonkin Resolution—based on false information, it turned out—which authorized President Lyndon Johnson “to take all necessary measures to repel any armed attack against the forces of the United States” and “to take all necessary steps, including the use of armed force, to assist any member or protocol state of the Southeast Asia Collective Defense Treaty requesting assistance in defense of its freedom.”

The words “declaration” and “war” did not appear in the resolution, yet the document unmistakably gave the president permission to do whatever the heck he wanted vis-à-vis the conflict in Southeast Asia, which he and his successor, Richard Nixon, unmistakably did.  Their combined policies in and around Vietnam, licensed by the Tonkin Resolution, led to the deaths of some 58,000 American soldiers and several hundred thousand Vietnamese civilians.

If that isn’t war, what is?

And so I humbly ask:  With the prospect of a fresh new American-sponsored military thingamabob in the Middle East, should we not clarify America’s war-making laws once and for all?

Can the president send American troops into harm’s way of his own accord, or not?  If he can, does it not infringe upon Congress’s prerogative to declare war?  Can the president do whatever he wants so long as he does not call it war?  Or is any deployment of the U.S. Armed Forces axiomatically an act of war, period?

Or:  In today’s world, where the United States and others can inflict great damage without proverbial “boots on the ground” and in which violent conflicts are not nearly as linear as they used to be, have these sorts of questions become obsolete?  And if that is the case, of what use is our 226-year-old Constitution in the first place?

Not About You

I am an uncommonly selfish human being.  My interest in history is, in part, an antidote to this.

To study the past is to understand that the world does not revolve around you.  That American and international culture existed before you came onto the scene and will do just fine after you have gone.  That while you can certainly make an impact on your physical and social environment in the brief time you have on Earth, you are nonetheless a mere blip in the broader space-time continuum.

Today, September 11, we observe the 12th anniversary of the event that swiftly became this generation’s “I remember where I was” moment.

If asked, anyone over the age of 16 or 17 can, without a moment’s hesitation, tell you precisely what he or she was doing when word came that two airliners had plowed into the World Trade Center in Lower Manhattan, and how many seconds it took to sprint to the nearest television set and realize the situation was even worse than it sounded.

On one level, we engage in this act of collective memory-mongering as a coping mechanism—a means of making an unfathomable event fathomable by framing it in more personal, human terms.  To this extent, such a practice is acceptable and perhaps even necessary in order to keep ourselves sane.

At the same time, there is something distinctly unattractive in reducing a great tragedy like the September 11 attacks to a personal anecdote.  The way many people recount it, the point becomes not the fact of 9/11 itself, but rather the excitement of having watched it unfold, as if there were something unique or courageous about passively observing something from which one could not conceivably have averted one’s attention.

We have fashioned ourselves as having been active participants in an event that, in point of fact, we had absolutely nothing to do with.

As I rant, you can rest assured that I know whereof I speak.

For the past 12 years, I have regularly regaled others with the tale of how, late in the afternoon on September 11, 2001, I hiked to the peak of Turkey Mountain, the highest point in my hometown, and faintly witnessed the massive plume of smoke hovering over Lower Manhattan some 40 miles south.

As memorable as this little adventure might have been for yours truly, what I have slowly come to realize is that, to everyone else, it has not a shred of interest or anything in the way of a point.  It illustrates nothing except the cosmic accident that I happened to live in the New York metro area on the day the city was savagely attacked by a gang of terrorists.  Why should anyone else care?

My advice, to myself and those in like circumstances, is to shut up about it.  To recognize that 9/11, like past historical flashpoints to which I alluded at the start, is not about you, and you shouldn’t try to make it so.

Obviously, this plea does not apply to everyone.  For the 3,000 who died, their friends and families, and the scores of rescuers and bystanders caught in the eye of the storm, 9/11 was very much a personal trauma that altered their lives in fairly profound ways.  They can bang on forever about what the attacks mean to them.  No one has any cause to stop them, and we might even learn something along the way.

For the rest of us, however, this is simply not the case.  For us, who lost no loved ones in the attacks and whose lives were not immediately and violently disrupted by them, 9/11 is nothing more than an historical occurrence that we happened to see on live TV.

For whatever reason, a great number of us have trouble accepting this.  We figure that if we conflate our memory of a significant event with the event itself, we can make ourselves feel significant as well.  We see this in sports fans who refer to their favorite team as “we” rather than “they,” or in baby boomers who talk about the thrill of Woodstock and Monterey Pop as if they were there when, in fact, they were not.

We should all knock it off.

We should have the humility to distinguish between being a participant in a great drama and being a mere witness to one.  Who, after all, are we trying to impress?

One Nation Under God?

Perhaps this is the wrong moment to bring up the Pledge of Allegiance.

It was the comic curmudgeon Lewis Black who dryly suggested that the perennial debate about whether the words “under God” violate the separation of church and state is the sort of intellectual exercise that ought to be reserved strictly for times when not much else is going on in the news—that is, when there is nothing more compelling for us to argue about.

At present, the prospect of a sort-of war in Syria has more or less taken care of that.

Yet I must insist that one direct one’s attention towards the Pledge all the same, for the issue of constitutionality presented itself this past weekend in a manner that, against all probability, allows us to think about it in a fresh and interesting way.

Up until now, the general thrust of the Pledge of Allegiance debate has been simple enough.  In one corner is the view that by declaring the United States “one nation under God,” the U.S. government is necessarily taking a position on a religious matter, which the First Amendment plainly forbids it from doing in the stipulation that “Congress shall make no law respecting an establishment of religion.”

From the other corner comes the rebuke that “under God” is a purposefully vague and general assertion that favors no particular God over any other, and therefore no religion over any other, and thus does not infringe upon any individual’s right to believe whatever he or she likes.

What is more, while the Pledge is recited daily in most public schools across America, the practice today is optional rather than compulsory, and thus no one can decently claim that he or she is being “forced” to declare religious adherence.  The tradition exists for those who wish to participate; those who don’t may simply avert their eyes, ears and throat.

My own view has been that “under God” does indeed constitute state-sponsored religion—atheists, who do not believe in a God for America to be under, are excluded by definition—but also that the non-compulsory nature of the Pledge renders the whole issue fairly trivial.

The new approach to this question, which stems from a lawsuit by a family in Massachusetts, insists that the above is not quite good enough.

The plaintiffs in the case, who wish to strike the words “under God” from the Pledge, allege the phrase to be a violation of the Equal Rights Amendment of the Massachusetts Constitution—a noteworthy legal position to take, as most anti-Pledge cases hinge instead on the separation of church and state.

As currently worded, said the plaintiffs’ representative David Niose, the Pledge “validates believers as good patriots and it invalidates atheists as non-believers at best and unpatriotic at worst.”

In other words, the point isn’t whether reciting the Pledge in school is optional.  The point is that it exists, period, and with the official blessing of the U.S. government.  Floating about in its present and exclusive form, the Pledge predisposes people to accept it and assumes that they will.  It may not explicitly hold those who object to its content in contempt, but it very much leads the witness in a manner that is rather suspect in a free and open society.

Imagine, if you will, that instead of the Pledge, every American public school began the day with an eloquently-worded, government-sponsored salute to the National Football League.

After all, football is unquestionably America’s most popular sport; it pumps billions of dollars into the U.S. economy every year; and its main event, the Super Bowl, brings Americans together in excitement and revelry like few other occasions on the cultural calendar do.

What could be the harm in making recognition of such a wholly American tradition official?  Provided that such a gesture did not favor one NFL team over another and that students who wished to abstain could do so, who could possibly object?

To this airtight logic, one might nonetheless inquire:  Why exactly is it the business of government to express an opinion about the worthiness of a particular sport, or about sports in general?  What is the practical effect of such an action, other than to make those heretics who do not agree that the NFL is an admirable enterprise feel as if their own opinion is morally inferior to everyone else’s?

If and once one concludes that the state has neither the duty nor the competence to issue proclamations on matters of sport, then one has no choice but to apply even greater scrutiny to instances of the state doing the same on matters of God.