Two For One

Today is Thanksgiving, and also the first day of Chanukah.

This is the first time such an odd phenomenon has occurred since 1888, and it won’t happen again until the year 81056.

Owing to the “man bites dog” nature of this cosmic convergence, the American media have been steadily covering it since before the turning of the leaves.

Unfortunately, nearly all of the copy on the subject has centered upon the unbearably stupid and lazy neologism “Thanksgivukkah” and essentially left it at that.  We have merged the two events linguistically, but given nary a thought to what they have in common thematically.  That’s a shame, because the answer is, “More than you might think.”

Typically, of course, the Jewish festival of lights is tethered to Christmas on the cultural calendar, a tradition that has been the bane of Jewish children’s existence since time immemorial.

The Christmas-Chanukah conflation has always been problematic, insomuch as the two holidays are related in no way beyond their temporal proximity.  That the latter would come to be nearly as commercially visible in America as the former is entirely a function of culture:  Jewish kids would see their Christian friends getting presents and chocolate at the end of every December and wonder to their parents, “Why not us?”  And so we established Chanukah as the “Jewish Christmas” and that was that.

The quandary in assuming cultural parity between the two, as we have, lies in their relative significance to their respective faiths.  In point of fact, Chanukah is not half as important to Judaism as Christmas is to Christianity, and was never intended to be observed as such.

(In Israel, where Christmas is no big deal, Chanukah is no big deal, either.)

On its own terms, Chanukah makes for a perfectly lovely and agreeable time, but its small charms have never really been given a chance to breathe amidst our society’s outsized seasonal hubbub.  Alongside Christmas, it’s a whimsical pink bicycle leaning up against a Sherman tank.

But connecting Chanukah to Thanksgiving?  Now you’ve got something.

As much as anything else, what both events signify is the paramount importance of religious liberty in the history of mankind.  Both celebrate a small group of renegades who succeeded in securing such freedoms for themselves, each in the face of overwhelming adversity.

The basis of the Chanukah story, as any Rugrats viewer well knows, is the successful rebellion by a Jewish rabble called the Maccabees against the Seleucid Empire between 167 and 160 BC.  Under the reign of Antiochus IV, Judaism and its practices suddenly became outlawed in the kingdom after a long period of tolerance.

Not prepared to take this repressive state of affairs sitting down, the Maccabees proceeded to launch a brutal guerilla war against the empire that, despite their small numbers, they ultimately won:  At the struggle’s end, Jewish rituals were again allowed to be performed and the Second Temple in Jerusalem, which had fallen into disrepair, was rededicated as a symbol of the resilience of the Jewish people.

Thanksgiving, in its traditional telling here in the States, similarly concerns the exploits of a put-upon minority that desired to worship its own god in its own way and went to enormous lengths to ensure that it could.  The men and women we call the Pilgrims arrived in Plymouth, Massachusetts, from Europe in 1620 in order to practice their particular brand of Separatism, which was frowned upon in their native land.

As with the Maccabees and their descendants, it is the Pilgrims’ religious and cultural (not to mention literal) survival that we commemorate on the fourth Thursday of every November.  Their fight for freedom served as a forerunner for all those that followed.

By no means is the analogizing between today’s twin occurrences airtight.  There is much that differentiates Thanksgiving from Chanukah—far more, indeed, than that which joins them together—and there is more that characterizes each than what I outlined above.

But the United States does not have an official day of recognition for the First Amendment to our Constitution, as it arguably should, considering that the amendment’s stipulations for free expression—including the freedom to worship unmolested—are so fundamental to our way of life.

Among our secular holidays, Thanksgiving probably comes closest to essaying this most noble role.  And of the major non-secular festivals that occupy the American calendar, Chanukah fills the bill as fittingly as any other.

That the two should fall (finally) on precisely the same day is a nice coincidence, and a marked improvement over the usual way of things during the holiday season.  It’s a shame it won’t happen again for another 79,043 years.

Gobble Away

In recent days, the Boston Globe’s website, Boston.com, posted its annual slide show of an advice column, “How to Cut 1,000 Calories from Thanksgiving.”

The Globe has included this feature in its virtual Turkey Day section for many years, hoping to assure those watching their weight that fully enjoying the fourth Thursday of November and blowing a giant hole in one’s diet are not mutually dependent phenomena.

Allow me to save you a few precious seconds and reveal this magical waistline-preserving secret right here and now:  If you wish to eat less food on Thanksgiving, eat less food on Thanksgiving.

I jest not.  To quote directly from one of the slides:  “Instead of piling on a full cup of mashed potatoes on your plate, consider scooping only half as much.”  From another:  “Instead of covering your plate with 6 ounces of a combo of white and dark [turkey] meat with skin, consider taking only 3 ounces of meat and leaving the fatty skin in the roasting pan with the rest of the grease.”

Smaller portions?  Less fat?  Genius!  Why didn’t I think of that?

In fairness, the Globe also offers slightly more sophisticated tips for reining yourself in, such as stir-frying the veggie casserole instead of dousing it with fried onion rings.  But the takeaway message is the same:  The trick to eating well is eating well.

If this insight comes as breaking news to a significant portion of America’s weight loss community, then it’s no wonder our country is so irretrievably fat.

However, I suspect this is not the case.  The truth is that all who are serious about scaling themselves down know exactly how to do it:  Eat less, exercise more.  Period, full stop.  It works every time and never lets you down.

The only mystery involves summoning the willpower to do so, and then to keep it up for the rest of your life.

Accordingly, Thanksgiving indeed presents a singular conundrum.  Apart from its more noble components, the whole point of this most American of holidays is to gorge ourselves into a blissful stupor simply because we can.

Yes, pretty much all of our annual national festivals involve an unholy assortment of culinary treats of one kind or another.  But Thanksgiving is unique in its insistence on gobbling up every last bit of it and licking the plate when you’re done.  The feast isn’t a mere sideshow; it’s the main event.

And that makes a real difference for those who make a point of avoiding exactly that.

It’s bad enough for a dieter to be overwhelmed by a bottomless buffet of hearty holiday helpings.  But to be all but ordered by one’s culture—and by relatives across the table—to dive in until it’s all gone?  Well, the psychological odds are not in your favor.

Your humble servant is certainly no exception.  I walked into last year’s family gathering determined, as ever, to keep my cravings under control.  Then out came the chips, the ale, the stuffing, the casseroles, the fruit salads—each new dish more impossibly sumptuous than the last—and all my defenses vaporized on contact.  At dessert, one whiff of my cousin’s homemade sweet potato pie and all hell broke loose.

In short, the effort at moderation was futile.  The fact is, Thanksgiving is not the time for restraint or self-control.  Thanksgiving is about gluttony and excess and that’s just the way it is.

If, like me, you are simultaneously preoccupied with maintaining a slim figure yet utterly powerless in the face of fragrant culinary temptations, my Thanksgiving Day prescription is to give up.  To abandon any possibility of awakening on Black Friday without a rounded tummy and a splitting headache.  To relax and roll with the tide.  Some traditions simply cannot be fought.

And if, like the Globe’s target audience, you truly wish to deduct 1,000 calories from your Thanksgiving budget, might I suggest plucking out five of the days surrounding November 28 and consuming 200 fewer calories on each.

Then on Thanksgiving itself, you may proceed exactly as you were going to all along, without a moment of hesitation or guilt.

That’s what the holidays are all about.  You can have your turkey and eat it, too.

Not Just a Theory

One must never let the facts get in the way of a perfectly good conspiracy theory.

Yet I must confess that, on the matter of the Kennedy assassination, I have done exactly that.

My experience with the notion that President John F. Kennedy was not killed by a single person acting on his own began (boringly enough) with Oliver Stone.  Viewing his 1991 film JFK for the first time (and then a second and a third), I was mesmerized by the web of intrigue that surrounded the late president’s death.

At the very least, the movie suggested that, whether or not Kennedy really had been killed as part of a grand plot, there is a trove of information to illustrate why the idea exists.

Mind you, in the many years during which I counted myself among JFK conspiracy kooks, I never clung to any particular narrative.  Whether the president had been done in by the mob, the CIA, Fidel Castro, extreme right-wingers, extreme left-wingers, or all of the above—that was beside the point.

For me, the case was a simple matter of forensics:  Early analysis of Abraham Zapruder’s film of the assassination concluded the shooting took place in a span of 5.6 seconds, which is simply not enough time for a single person to fire three separate shots with the rifle Lee Harvey Oswald allegedly used.  By definition, that means there were at least two shooters and that the killing was therefore a conspiracy of one kind or another.

Then some time later, I came upon a documentary, “Beyond Conspiracy,” aired on ABC in 2003, which noted that subsequent and more sophisticated examinations of the Zapruder film have established that—oops!—the actual time frame of the three shots is 8.4 seconds—more than enough for someone with Oswald’s background and training.

Since I had based my conspiratorial musings entirely on this one statistic, and since the statistic had now been proved incorrect, I saw no compelling reason to carry on with my investigations and I have suited up with Team Lone Gunman ever since.

Neat, huh?

On this 50th anniversary of that dark day in Dallas, I wish to contest a commonly held perception about conspiracy buffs—namely, that they are stubbornly irrational creatures who are impervious to facts and data that might disprove their darkest convictions about how the world really works.

Historically speaking, this assumption is entirely correct, except when it’s not.

For instance:  When a wave of paranoia about President Barack Obama’s place of birth crested a few years back, the basis of the claim that Obama was born in Kenya, not Hawaii, was the lack of a birth certificate to prove otherwise.

When the president produced such a document, the controversy should have ended right there.  Yet the howls of protest continued from some corners of the Internet, with “birther” holdouts proceeding to concoct ever more elaborate explanations for how the objective truth was neither objective nor truthful.

However, this was not universally the case.  For every person who did not listen to reason, there were many more who did.

In a Gallup poll conducted in the first week of May 2011—several days after Obama’s “long form” birth certificate was made public—13 percent of respondents asserted the president was “definitely” or “probably” born in a foreign country.  In an identical survey two weeks prior—that is, when the birth certificate had yet to be seen—the number was 24 percent.

In other words, the size of the “birther” pool was cut nearly in half by a simple disclosure of fact.  For a sizable minority of the public, the conviction that the president was not born in the United States was, it turned out, susceptible to basic logic:  They asked for proof, they received proof, they accepted it and they moved on.  Presto.

I wish the size of this minority were bigger, and that there weren’t such a large gang of reliable idiots whose paranoia overwhelms all their other mental capacities.  The latter makes the former look bad, and that’s a shame.  We need honest skeptics in this society, because sometimes their instincts are right.

The JFK conspiracy theories might be hooey, but some conspiracies are real.  (The Lincoln assassination is one.)  We must take care to recognize this, and to differentiate between the two.

To assume nothing is a conspiracy is no less reckless than to assume everything is a conspiracy.  One generalizes at one’s peril.

The key, as with so much else, is to be all the time led by the facts and the evidence, and not by the lack thereof.

What Might Have Been

Over at the John F. Kennedy Presidential Library in Boston, there is a special exhibit, “To the Brink,” all about the Cuban Missile Crisis of October 1962.

Among the featured documents on display, perhaps the most arresting is the original typed draft of a speech that President Kennedy never delivered—that is, the announcement that the United States was about to launch an all-out assault on Cuba to destroy the missiles secretly installed there by the Soviet Union.

“I, as president, had the greatest reluctance to order the action that is now being carried out,” Kennedy was to have said.  “I made every effort to clarify my position.  But the Cuban authorities would not listen.  In the face of their open defiance action became inevitable.”

“There should be no doubt on the part of anyone,” he was to add, “that, in carrying out this commitment, the U.S. will be prepared to use all the forces at its disposal including nuclear.”

The American people never heard such an address because such a decision was never made (the president opted for a blockade instead).  But it jolly well could have been:  Several key members of the secret White House EXCOMM meetings recommended such a move, and Kennedy considered it seriously enough to prepare a speech just in case.

In this week of reminiscences of the Kennedy administration—Friday is the 50th anniversary of the assassination in Dallas—the question has predictably resurfaced, “What if Kennedy had lived?”

Minus those three shots fired from the Texas School Book Depository in Dealey Plaza, how might the arc of history have differed from the one we have?

Would the United States have doubled down in Vietnam?  Would the Civil Rights Movement have progressed faster (or slower)?  Would the American public have been spared its disillusionment with government spurred by the presidencies of Lyndon Johnson and Richard Nixon?

The counterfactual history industry has long flourished in America, and it’s easy to understand why.  After all, the creative possibilities are endless and, by their nature, cannot be positively disproved.

In the case of Kennedy, the allure of crafting “what if” scenarios is especially potent, given the presidency’s oversized promise and undersized length.  It ended on a series of cliffhangers, and it has been left to survivors to second-guess how it might have played out.

What events like the missile standoff bring so sharply into focus, however, is the fact that the world does not require such horror shows as assassinations for the thrust of human events to change course.

As we know but sometimes forget, our leaders are all the time faced with decisions that could (and often do) prove enormously consequential in the longer term—decisions that were all but arbitrary at the time but are seen as inevitable in retrospect.

Such is one of the central insights of history and of life itself:  Nothing is inevitable.  Events unfold in only one way, but there are a billion other ways they could unfold, with only the mildest shuffling of the cards.

Never mind the decisions Kennedy might have made had he not died.  We cannot possibly sort through all the decisions he could have made while he lived.

Further, by no means is this principle of unknowable-ness exclusive to government and politics.  It also applies to each and every one of us.

Back to the Future was all about how Marty McFly’s parents, George and Lorraine, met and fell in love because George unwittingly stepped into the path of Lorraine’s father’s green Chevy.  As the movie makes plain, had George simply watched where he was going, the marriage would never have occurred and Marty would never have been born.

How many of us owe our own place in the universe to events that could very easily have gone the other way?  Is the alternative even possible?

And so when we talk about how different the world might have been had President Kennedy survived, let us acknowledge the limits of such theorizing by recognizing that the future is far more unpredictable than we give it credit for, that nothing is “destined” to happen until it does, and that we are all the time hostage to the playful randomness of the universe in ways that even a president cannot fully control.

Get On With It

Well, somebody had to test whether it was too soon to ask if the “Boston Strong” movement had run its course and was becoming just a little bit silly.

As it turned out, that somebody was Bill Maher and the answer was, “Yes, it is too soon.”

On the November 8 episode of his HBO program Real Time, Maher noted how the Red Sox championship parade observed a moment of reflection when the procession reached the finish line of the Boston Marathon, the site of the April 15 bombings.  Said Maher:

It was a bad day.  Three people died, that’s terrible.  More were maimed, that’s horrible.  But unfortunately that happens every day, in car accidents and everything else.  I mean, your city was not leveled by Godzilla.

In response, Thomas Menino, the outgoing mayor of Boston, called the comments “very irresponsible” and said that Maher “should be taken to task” for making them.

“He doesn’t know what he’s talking about,” Menino added.  “Come to Boston, visit Boston and see what a strong city we are.”

Views of many other Bostonians followed in the same spirit, and can be roughly summed up as, “What a jerk.”

Maher’s broader point can be gleaned from what he said moments earlier on his show about the New York metro area regarding Hurricane Sandy, whose one-year anniversary led, among other things, to a postponement of that city’s final mayoral debate:

It was a storm, it was a bad storm.  But it was an act of nature.  Do we always need to wallow and re-grieve over every bad thing that’s ever happened in this country?

Well, how ‘bout it, folks?  Conceding, as we might, that the person who asks such questions is an insensitive brute, let us soldier on and consider whether the insensitive brute has a point.

To wit:  What is the appropriate amount of time an entire city is allowed to publicly mourn a tragedy?  Is the length of the former determined by the scale of the latter?

Had the marathon attack killed three hundred people, rather than three, would and/or should shows of citywide solidarity, like those at the Red Sox parade, endure for a hundred times longer than they already have?

Conversely, had something like the 9/11 attacks inflicted far less damage than they did—suppose the buildings hadn’t collapsed—would the city of New York be justified in holding the sorts of massive annual commemorations it has held on every September 11 since?

In publicly grappling with acts of man-made and natural horror, should we not discriminate between truly seismic events and (relatively) small-scale traumas?  Or does every high-profile calamity necessitate an equal—and equally open-ended—outpouring of public concern?

I wish to stress the use of the word “public” in all of these queries, since no one—including Bill Maher—would presume to tell a stricken individual that the time has come to “move on.”  In one’s private life, there is no right or wrong way to grieve; everyone reacts to death and suffering differently.

But taking an assault on individuals to also be an assault on an entire city is a wholly separate matter and is fair game for scrutiny.

The essence of “Boston Strong”—the phrase itself and the attitude it represents—is that the people of Boston, like the people of New York before them, will not allow punks with bombs to bring the city to its knees.  That we will carry on—proud and undaunted—and prove to evildoers everywhere that our values are not easily abandoned or destroyed.

Is this not, in so many words, exactly what the mean old man on HBO was suggesting?  That we not abandon all sense of perspective and completely lose our marbles whenever something terrible happens?  That the effects of what occurred on Boylston Street were challenging, but by no means insurmountable?

A central precept of all fiction writing intones, “Show, don’t tell.”  If the city of Boston genuinely insists upon the doctrine of carrying on, might we demonstrate it by actually carrying on?  By returning to our normal lives and not throwing a hissy fit whenever our pride is questioned?

If you want to prove that you’re strong, be strong.  Don’t say it—just do it.  Acknowledge your loss, comfort those who need comforting, and then resume business as usual.

For heaven’s sake, our city was not leveled by Godzilla.

The Less You Know

Here’s a cheerful thought for you to ponder.

Suppose there was a document, hidden somewhere in the bowels of the National Archives, that proved beyond a reasonable doubt that the assassination of President John F. Kennedy was conceived, planned and executed by some group within the Central Intelligence Agency.

And then further suppose that such a document, having been successfully withheld for some five decades, were somehow obtained, in a WikiLeaks-style coup, and released into the public domain on Monday.

What, then, would happen on Tuesday?

Among the many conspiracy theories surrounding the death of President Kennedy, the 50th anniversary of which we will observe next week, the one involving the CIA is arguably the most plausible.  Not, mind you, because there is any particularly persuasive evidence to suggest such an event actually happened—there isn’t—but simply because it is in the agency’s nature to commit the most unthinkable crimes without detection or any measure of accountability.

Programs such as Homeland might not be accurate in every last particular, but the known history of the real CIA shows assassination to be something of a hobby for our esteemed spy network, be it directly or through snafus known as “blowback.”  Is it really that much of a stretch to imagine its nefarious practices committed on its own commander-in-chief?

This year, thanks to one Edward Snowden, we have experienced a veritable waterfall of disclosures about the heretofore secret and unchecked high jinks of the National Security Agency, which has been found to have tapped the phones and e-mail accounts of pretty much everyone on planet Earth, including the leaders of countries with whom we are supposedly friends.

We disagree about whether—and to what extent—the NSA should engage in this behavior, but tell me:  Now that you know it does, do you wish that you didn’t?  In possession of this information, do you as an American feel morally soiled, or do you rather feel cheated to have so long been kept in the dark?

Never mind the rest of the world and never mind “national security.”  Broadly speaking, is there any information about the U.S. government that, if true, you would simply not want to know?  Something so ghastly—so antithetical to the highest ideals of the American republic—that you would just as soon remain ignorant of it for the balance of your natural life?

With November 22 upon us, I return to my original query:  What would it mean to learn President Kennedy was assassinated by the CIA?

For starters, it would mean we live in a country whose government murdered its own head of state—a practice we like to think is reserved for third-world dictatorships in the most backward corners of Africa and the Middle East.  And from a wing of that government, we might add, that has been in continuous operation in the half-century since, carrying on more or less as it always has—in secrecy and very nearly immune from legal recrimination.

In the event of such a revelation, what would the outfit’s current director possibly have to say in his agency’s defense?  “Sorry about that—won’t happen again”?  “Hey, it was a long time ago, let’s just move on”?

We don’t need the Kennedy-killed-by-CIA theory to be true in order to face these grave questions.

In the past decade alone, we have been made to grapple with the fact of our government, in our name, having tortured suspected terrorists—in clear violation of the Geneva Conventions—as well as having used drones to target and kill American citizens at the whim of the executive branch, uninhibited by such annoyances as due process and trial by jury.

The scandal here is not only that the U.S. does these things, but that the public has essentially shrugged them off as necessary and unavoidable byproducts of the so-called war on terror.  “Yeah, it’s unfortunate—but hey, what can you do?”

Is it possible we would regard a hypothetical plot to kill Kennedy in the same way?  With a resigned “meh”?  With a brief series of protests and howls of outrage, followed by obedient silence?

Could it be that the real problem is not that there are certain things we could not bear to know, but rather that we are no longer capable of being shocked by what our government may or may not be doing behind our backs?

I’m not sure that’s something I want to know.

Junk Food

I doubt that I will ever actually read Double Down, the new chronicle of the 2012 presidential race by Mark Halperin and John Heilemann.

The book, released last week, has garnered enormous press coverage in recent days, swiftly becoming the “official” account of the proverbial horse race between Barack Obama and Mitt Romney, thanks to a peanut gallery of firsthand sources who dish on the personalities involved in a presidential election that, if not our country’s most exciting, was nonetheless (as the joke goes) certainly the most recent.

Political junkie that I am, I have naturally skimmed the excerpts from the tome published in the New York Times and TIME and the various analyses that have followed about the meaning of the 2012 race in the context of U.S. history and, of course, the eventual campaign of 2016.

But that is as far as I wish to go.

I do not require every last detail about what Obama and Romney were doing and thinking at every moment of their contest for the Oval Office.

I did not need to know—as I now do—that Romney’s vice presidential vetting committee codenamed their operation “Goldfish” and referred to New Jersey Governor Chris Christie as “Pufferfish” and Florida Senator Marco Rubio as “Pescado.”

Nor, frankly, do I much care about the minutiae of the Obama administration’s “poll testing” of whether to replace Vice President Joe Biden with Hillary Clinton.  For Pete’s sake, any self-respecting incumbent is going to explore every available avenue for success, and Obama is no different.  There is nothing especially groundbreaking about this so-called scoop, and none of these disclosures means much in the broad sweep of history.

Yet I marinate in this piffle all the same, and will probably continue so long as it keeps popping up on my news feed.

I don’t want to, but I do.  I just can’t help myself.  I’m a junkie, and this is my junk food.

Mind you, political junk food should not be confused with political guilty pleasures.  There is a distinction between the two that, while narrow, is worth pointing out.

A guilty pleasure is something that is dismissed in polite society as vulgar and trivial, but might nonetheless contain some redeeming value.  Take, for instance, my occasional pastime of eating an entire jar of peanut butter with a spoon at 2 o’clock in the morning:  No, it’s not something I could get away with in public, but hey, think of all the protein!

Per contra, junk food is, well, junk.  It’s pure sugar and fat, it does nothing for you in the long run, you consume it in a moment of weakness and feel disgusted with yourself a few minutes later.  Rather than peanut butter, picture killing a whole jar of Nutella.  (You know who you are.)

So how might we differentiate these two concepts in the worlds of government and politics?

I am extremely tempted to argue that the entire experience of following a political campaign constitutes a guilty pleasure, with most of its constituent parts nothing but pure junk—sparkly distractions yielding heat but no light.  The boring legislative sausage-making that (sometimes) occurs between elections—that’s the part that matters.

Halperin and Heilemann have defended Double Down against the inevitable charges of gossip-mongering by asserting that all of its goodies were subject to rigorous cross-checking, and that any assertions that could not be verified, no matter how titillating, remained on the cutting room floor.  In short, they have committed journalism, not tabloid hackery.

Here’s a thought:  They have actually committed both simultaneously, and there’s your problem.  In today’s environment, where personality not only trumps substance but is considered substantive itself, any stray piece of dirt about a political figure, no matter how inconsequential, is considered axiomatically newsworthy, provided that it is obtained in a journalistically valid manner.

My plea to the peddlers of this troubling tendency:  Knock it off.

Don’t elevate the status of disposable schoolyard chatter into the realm of respectability.  Don’t conflate valuable information with pure muck.  Learn to discriminate between the two, for many of your readers cannot, but they jolly well should.

By no means do I advocate an end to all frivolity in political reporting, just as I wouldn’t dream of purging the supermarket shelves of all candy and chocolate.  Such trifles will always have a place in our society—namely, to provide a mental release from the weight of the serious business of life.

All I ask is that we recognize our depraved desserts for what they are, and not pretend they are wholesome and nutritious.

We can allow ourselves the occasional indulgence, but let’s not make every day Halloween.

The War Gulf

Christopher Hitchens once published a collection of essays titled Love, Poverty and War.  In the volume’s introduction, Hitchens explained, “An antique saying has it that a man’s life is incomplete unless or until he has tasted love, poverty, and war.”

By that standard, I expect I will die an incomplete person.  On most days, I hope as much.

While I have indeed tasted love and have done my level best to dip my toe into the wonders of extreme scarcity, war to me remains a complete and abject mystery.

Sure, I have watched Saving Private Ryan and Apocalypse Now, and enrolled in a college course, “History of War,” that broadened my understanding of the peculiar institution in ways for which I will forever be grateful.

But I also realize that nothing quite substitutes for actual experience on this matter, and being the moral and physical coward that I am, I would be perfectly content to maintain my basic ignorance for the rest of my natural life.

As we recognize our veterans on this eleventh day of the eleventh month, we ought to keep at the forefront of our minds the enormous and inevitable gulf between those who have served in the United States Armed Forces and those who have not, and to honor and appreciate our uniformed comrades accordingly.

As the above quotation suggests, there is something about the fact of having engaged in real live combat, in its many forms, that shapes and alters one’s outlook on the world in a manner that is singularly strange and ultimately inconceivable to nonparticipants like yours truly.  It makes the duty of celebrating the work of veterans that much more imperative, but fraught with a certain awkwardness and dissonance as well.

To wit:  I can instinctively perceive great writing when I see it, because I have made innumerable attempts, however unsuccessful, to produce it myself.  Likewise, I can respect the talents of a great ballplayer from all the miserable, mediocre seasons I spent in Little League.

Call it the Amadeus effect:  In any creative endeavor, it takes one to know one.  And as demonstrated by Salieri with respect to Mozart, no one appreciates success more than a failure.  The analogy involves a conjecture of sorts, but a reasonably coherent one.

Per contra, civilians’ respect for soldiers comes not through experience, but rather through lack of experience, and through outright awe for someone with the nerve to join the armed forces at all.

I defend the rights and values that define America by writing about them in the comfort of my own living room, remote and free from harm.  For those who do the same by strapping on a uniform and willfully parachuting into enemy territory to face physical threats to their persons, known and unknown—well, there is no means of comparison.

It is a natural human impulse to try to relate to another’s experience—particularly a difficult one—by drawing an analogy from one’s own life, as a means of assuring, “I know exactly how you feel.”  Sometimes this is a perfectly healthy and useful means of exercising basic empathy.  At other times, however, it takes on false or outright ludicrous forms.

Many couples preparing for their first child, for instance, like to comfort themselves with the notion that they will make fine parents because, to date, they have made fine pet owners.  A woman with her newborn will sooner or later be told by a friend, “I don’t have a baby, but I have a dog, so I know just what you’re going through.”  Late Late Show host Craig Ferguson, periodically informed as much by certain guests, aptly responds with something to the effect of, “Yeah, try leaving a baby alone in the backyard.”

In like spirit, let us use today to reflect upon the myriad ways in which war is sui generis in the human experience.  It’s not “like” anything else that might happen to us in our daily lives.  It is a monster unto itself.

Probably the only solution—the means of bridging this gulf between the fighters and the bystanders—is to shut up and listen.  To tone down our more mindless expressions of reverence and simply allow our soldiers to tell their stories, if they so choose.

It won’t enable us to escape our “incomplete” lives, insulated from and naïve about the big, bad world around us.  But it will provide us a tiny window into the psyches of those who have, and with it, a more proper respect for the enormity of the sacrifices they have made on our highly unworthy behalf.

Pop Goes the Weasel

Everyone had a good laugh this week at the expense of Rob Ford, the boorish, drink-sodden mayor of the great city of Toronto who, after months of stalling, finally admitted to having smoked crack cocaine at some point in the past—a disclosure facilitated by an apparent videotape, obtained by police, showing him doing precisely that.

Ford’s explanation was positively and pricelessly Pythonian.  “Yes, I have smoked crack cocaine,” he said.  “But, am I an addict?  No.  Have I tried it?  Probably in one of my drunken stupors, probably approximately about a year ago.”

The logic is charming, is it not?  Yes, I was strung out on coke.  But it was only on account of the booze.  So don’t get any ideas that I’m some sort of junkie.

To be fair, Mayor Ford did not completely evade responsibility for his unholy act, saying, “To the residents of Toronto, I know I have let you down and I can’t do anything else but apologize.”  He did not, however, take the presumably inevitable step of surrendering City Hall.  To the contrary, he took the opportunity to announce his plans to run for a second term.

In Kentucky, meanwhile, Senator Rand Paul spent the week batting away accusations of plagiarism, having been found to have lifted passages of his speeches and printed works from other authors and from Wikipedia.

While admitting that he and his aides had been “sloppy” by failing to attribute the excerpts in question, Senator Paul hastened to add, “I’m being unfairly targeted by a bunch of hacks and haters.”

Again, the logic is airtight:  Yes, I stole other people’s words and passed them off as my own.  But let’s not get distracted from the main point, which is that I have become the target of a witch hunt.

Then there is President Barack Obama, faced with twin indictments relating to the Affordable Care Act—first, that the rollout of the website through which Obamacare is distributed has been an unmitigated disaster, and second, that the president plainly lied in saying, “If you like your current healthcare plan, you can keep it.”

Yes, the website sucks and the promise about keeping your old plan was false.  But never mind all of that.  Let’s just focus on the positives:  The website will eventually work, and the policies you thought you could keep will ultimately be replaced by superior ones.

For all that differentiates the crimes of which these officials stand accused, they are tied together by the weaselly manner in which said officials have responded to said accusations.

In all three cases, you will note, the men have attempted simultaneously to admit guilt and deny full culpability.  Their explanations all follow the formulation, “Yes, I did it, but…”  So far as they are concerned, they are as much the victims as the perpetrators.

Leave it to a politician to not accept responsibility in the process of accepting responsibility.

What is more, the excuse for each transgression is a complete non sequitur, as it relates to the transgression itself.

Mayor Ford would have the good people of Toronto forgive his felonious crack smoking on the grounds that it was brought about by the effects of alcohol.  Good luck explaining that to law enforcement.

Senator Paul is surely correct that his current high profile has made him uncommonly vulnerable to dirt-digging by political adversaries.  But what on Earth does that have to do with whether the information they have uncovered is true?

As for the commander-in-chief:  He can rationalize all he wants about the wonders of the new healthcare exchanges, but it doesn’t make his infinite assurances that one can opt out of them any less of a lie.  (He has attempted to atone for this in recent days, but words like “too little” and “too late” nonetheless spring to mind.)

At issue in all of these verbal acrobatics is the principle commonly referred to as “owning it.”

When you have been caught with your hand in the cookie jar, just admit that you were hungry and didn’t think anyone else was in the kitchen.  Don’t change the subject.  Don’t conjure a list of extenuating circumstances around which your piggy actions can be sorta-kinda justified.

Own it.  Take responsibility and take all of it, without the qualifications and childish whining.  As a public official, don’t insult the intelligence of your constituents by acting like they can’t see the crumbs on your mustache.

Don’t be a weasel.  Be a grownup.

Turnout Bum Out

Marty Walsh has been elected the next mayor of Boston.  On January 6, he will become the first new chief executive of New England’s largest city in more than 20 years.

And how many of the residents Walsh will rule actually filled the bubble next to his name on a ballot this past Tuesday?

Eleven percent.

Breaking it down:  According to the most recent estimate, Boston’s population stands at roughly 636,000 people.  Of those, 141,000 voted in Tuesday’s election, with 51.5 percent opting for Walsh.

Admittedly, in arriving at this 11 percent figure, I am being a wee bit cute, as a considerable chunk of any city’s populace could not vote even if it wanted to, non-citizens and those younger than 18 chief among them.

Bearing this in mind, Boston’s “true” turnout in the mayoral contest, from a registered voter pool of some 372,000 souls, rang in at 38 percent—slightly higher than expected—meaning that 19.5 percent of everyone who either did or could have cast a vote on Tuesday went with the man who will be the Big Cheese for the next four years.

Feeling better?
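For anyone who cares to check the arithmetic, here is a minimal sketch in Python that reproduces the figures above, using the essay’s own rounded estimates for population, registration, and ballots cast:

```python
# Back-of-the-envelope check of the turnout figures cited above.
# All inputs are the essay's rounded estimates, not official tallies.
population = 636_000    # approximate residents of Boston
registered = 372_000    # approximate registered voters
ballots_cast = 141_000  # votes in the 2013 mayoral election
walsh_share = 0.515     # Walsh's share of ballots cast

walsh_votes = ballots_cast * walsh_share

print(f"Votes for Walsh:              {walsh_votes:,.0f}")              # ~72,600
print(f"Walsh votes / all residents:  {walsh_votes / population:.1%}")  # ~11%
print(f"Turnout of registered voters: {ballots_cast / registered:.1%}") # ~38%
print(f"Walsh votes / registered:     {walsh_votes / registered:.1%}")  # ~19.5%
```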

The conundrum of low voter turnout is not a new phenomenon, and is certainly not limited to the home of the World Series champions of 2013.  (In New York City on Tuesday, the figure was a dismal 24 percent.)  In point of fact, elections in odd-numbered years are dependably poorly-attended affairs all across these United States, and the even-numbered events aren’t much better.

That’s the problem:  The aforementioned statistics are the rule, not the exception.

It is a crucial fact of American government, often misunderstood, that we live in a republic, not a democracy.  In practice, our annual elections are the only expression of direct democracy we offer.  They are our one chance to say, simultaneously and en masse, exactly what we think about how the United States should operate.  At all other times, our interests are served indirectly through our elected representatives.

The depressing truth that our piddling participation rates suggest, however, is that most of the time, we are barely even a republic.  That our elected officials can hardly be said to represent “we the people” in any meaningful sense.

Indeed, even in a lopsided race in a well-attended election, a victorious candidate would be extremely lucky to win the endorsement of more than half of all eligible voters.  In some presidential elections (thanks to third-party also-rans), the winner has even failed to secure a majority of the people who actually voted.

This is just the way it is, now and throughout history.  When we say that our leaders are chosen by and represent “the will of the people,” we are using an extremely broad definition of the word “people.”  Very rarely indeed has the sacred principle of “majority rules” involved an actual majority of the American public.

So what, then, do we do with this disheartening information?  Unfortunately, there is no single, silver bullet solution.

In any given election, we might agree that a “perfect world” scenario would entail a turnout of 100 percent of eligible voters, each of whom is sober and well-informed.

But this is where our agreement ends.

Suppose, in some hypothetical race, there is the option of either a 50 percent turnout of citizens who are universally well-informed, or a 100 percent turnout of complete ignoramuses.  Which of these would you prefer?  Which would better serve the interests of the republic?

How about a choice between that well-informed 50 percent versus a 100 percent that is evenly divided between those in the know and the know-nothings?  How much are we willing to dilute the share and power of our most diligent citizens for the sake of maximizing total participation in the democratic process?

The answers to such questions are not at all self-evident, yet they concern the very nature of democracy in the modern world.

My own view (in case you wondered) is that voter turnout is an overrated component of the democratic process, provided that those who abstain from voting do so out of a genuine lack of interest in public affairs.  I wish everyone cared as much about the minutiae of government as I, but so long as they don’t, I would prefer they maintained the courtesy of steering clear of the ballot box.

In other words, the ideal within our imperfect system is to maximize voter quality, rather than voter quantity.

If a mayor is not chosen by a genuine majority of his constituents, he might as well be chosen by the particular minority that actually knows his name.