Two For One

Today is Thanksgiving, and also the first day of Chanukah.

This is the first time such an odd phenomenon has occurred since 1888, and it won’t happen again until the year 81056.

Owing to the “man bites dog” nature of this cosmic convergence, the American media have been steadily covering it since before the turning of the leaves.

Unfortunately, nearly all of the copy on the subject has centered upon the unbearably stupid and lazy neologism “Thanksgivukkah” and essentially left it at that.  We have merged the two events linguistically, but given nary a thought to what they have in common thematically.  That’s a shame, because the answer is, “More than you might think.”

Typically, of course, the Jewish festival of lights is tethered to Christmas on the cultural calendar, a tradition that has been the bane of Jewish children’s existence since time immemorial.

The Christmas-Chanukah conflation has always been problematic, inasmuch as the two holidays are related in no way beyond their temporal proximity.  That the latter would come to be nearly as commercially visible in America as the former is entirely a function of culture:  Jewish kids would see their Christian friends getting presents and chocolate at the end of every December and wonder to their parents, “Why not us?”  And so we established Chanukah as the “Jewish Christmas,” and that was that.

The quandary in assuming cultural parity between the two, as we have, lies in their relative significance to their respective faiths.  In point of fact, Chanukah is not half as important to Judaism as Christmas is to Christianity, and was never intended to be observed as such.

(In Israel, where Christmas is no big deal, Chanukah is no big deal, either.)

On its own terms, Chanukah makes for a perfectly lovely and agreeable time, but its small charms have never really been given a chance to breathe amidst our society’s outsized seasonal hubbub.  Alongside Christmas, it’s a whimsical pink bicycle leaning up against a Sherman tank.

But connecting Chanukah to Thanksgiving?  Now you’ve got something.

As much as anything else, what both events signify is the paramount importance of religious liberty in the history of mankind.  Both celebrate a small group of renegades who succeeded in securing such freedoms for themselves, each in the face of overwhelming adversity.

The basis of the Chanukah story, as any Rugrats viewer well knows, is the successful rebellion by a Jewish rabble called the Maccabees against the Seleucid Empire between 167 and 160 BC.  Under the reign of Antiochus IV, Judaism and its practices were suddenly outlawed in the kingdom after a long period of tolerance.

Not prepared to take this repressive state of affairs lying down, the Maccabees proceeded to launch a brutal guerrilla war against the empire that, despite their small numbers, they ultimately won:  At the struggle’s end, Jewish rituals were again allowed to be performed, and the Second Temple in Jerusalem, which had been defiled during the occupation, was rededicated as a symbol of the resilience of the Jewish people.

Thanksgiving, in its traditional telling here in the States, similarly concerns the exploits of a put-upon minority that desired to worship its own god in its own way and went to enormous lengths to ensure that it could.  The men and women we call the Pilgrims arrived in Plymouth, Massachusetts, from Europe in 1620 in order to practice their particular form of Separatism, which was frowned upon in their native land.

As with the Maccabees and their descendants, it is the Pilgrims’ religious and cultural (not to mention literal) survival that we commemorate on the fourth Thursday of every November.  Their fight for freedom served as a forerunner for all those that followed.

By no means is the analogizing between today’s twin occurrences airtight.  There is much that differentiates Thanksgiving from Chanukah—far more, indeed, than that which joins them together—and there is more that characterizes each than what I outlined above.

But the United States does not have an official day of recognition for the First Amendment to our Constitution, as it arguably should, considering that the amendment’s stipulations for free expression—including the freedom to worship unmolested—are so fundamental to our way of life.

Among our secular holidays, Thanksgiving probably comes closest to essaying this most noble role.  And of the major non-secular festivals that occupy the American calendar, Chanukah fits the bill as well as any other.

That the two should fall (finally) on precisely the same day is a nice coincidence, and a marked improvement over the usual way of things during the holiday season.  It’s a shame it won’t happen again for another 79,043 years.

Gobble Away

In recent days, the Boston Globe’s website, Boston.com, posted its annual slide-show advice column, “How to Cut 1,000 Calories from Thanksgiving.”

The Globe has included this feature in its virtual Turkey Day section for many years, hoping to assure those watching their weight that fully enjoying the fourth Thursday of November and blowing a giant hole in one’s diet need not go hand in hand.

Allow me to save you a few precious seconds and reveal this magical waistline-preserving secret right here and now:  If you wish to eat less food on Thanksgiving, eat less food on Thanksgiving.

I jest not.  To quote directly from one of the slides:  “Instead of piling on a full cup of mashed potatoes on your plate, consider scooping only half as much.”  From another:  “Instead of covering your plate with 6 ounces of a combo of white and dark [turkey] meat with skin, consider taking only 3 ounces of meat and leaving the fatty skin in the roasting pan with the rest of the grease.”

Smaller portions?  Less fat?  Genius!  Why didn’t I think of that?

In fairness, the Globe also offers slightly more sophisticated tips for reining yourself in, such as stir-frying the veggie casserole instead of dousing it with fried onion rings.  But the takeaway message is the same:  The trick to eating well is eating well.

If this insight comes as breaking news to a significant portion of America’s weight loss community, then it’s no wonder our country is so irretrievably fat.

However, I suspect this is not the case.  The truth is that all who are serious about scaling themselves down know exactly how to do it:  Eat less, exercise more.  Period, full stop.  It works every time and never lets you down.

The only mystery involves summoning the willpower to do so, and then keeping it up for the rest of your life.

Accordingly, Thanksgiving indeed presents a singular conundrum.  Apart from its more noble components, the whole point of this most American of holidays is to gorge ourselves into a blissful stupor simply because we can.

Yes, pretty much all of our annual national festivals involve an unholy assortment of culinary treats of one kind or another.  But Thanksgiving is unique in its insistence on gobbling up every last bit of it and licking the plate when you’re done.  The feast isn’t a mere sideshow; it’s the main event.

And that makes a real difference for those who make a point of avoiding exactly that.

It’s bad enough for a dieter to be overwhelmed by a bottomless buffet of hearty holiday helpings.  But to be all but ordered by one’s culture—and by relatives across the table—to dive in until it’s all gone?  Well, the psychological odds are not in your favor.

Your humble servant is certainly no exception.  I walked into last year’s family gathering determined, as ever, to keep my cravings under control.  Then out came the chips, the ale, the stuffing, the casseroles, the fruit salads—each new dish more impossibly sumptuous than the last—and all my defenses vaporized on contact.  At dessert, one whiff of my cousin’s homemade sweet potato pie and all hell broke loose.

In short, the effort at moderation was futile.  The fact is, Thanksgiving is not the time for restraint or self-control.  Thanksgiving is about gluttony and excess, and that’s just the way it is.

If, like me, you are simultaneously preoccupied with maintaining a slim figure yet utterly powerless in the face of fragrant culinary temptations, my Thanksgiving Day prescription is to give up.  To abandon any possibility of awakening on Black Friday without a rounded tummy and a splitting headache.  To relax and roll with the tide.  Some traditions simply cannot be fought.

And if, like the Globe’s target audience, you truly wish to deduct 1,000 calories from your Thanksgiving budget, might I suggest plucking out the five days on either side of November 28 and consuming 100 fewer calories on each.  Ten days of trivial sacrifice, and the full thousand is accounted for.

Then on Thanksgiving itself, you may proceed exactly as you were going to all along, without a moment of hesitation or guilt.

That’s what the holidays are all about.  You can have your turkey and eat it, too.

Not Just a Theory

One must never let the facts get in the way of a perfectly good conspiracy theory.

Yet I must confess that, on the matter of the Kennedy assassination, I have done exactly that.

My experience with the notion that President John F. Kennedy was not killed by a single person acting on his own began (boringly enough) with Oliver Stone.  Viewing his 1991 film JFK for the first time (and then a second and a third), I was mesmerized by the web of intrigue that surrounded the late president’s death.

At the very least, the movie suggested that, whether or not Kennedy really had been killed as part of a grand plot, there is a trove of information to illustrate why the idea exists.

Mind you, in the many years during which I counted myself among JFK conspiracy kooks, I never clung to any particular narrative.  Whether the president had been done in by the mob, the CIA, Fidel Castro, extreme right-wingers, extreme left-wingers, or all of the above—that was beside the point.

For me, the case was a simple matter of forensics:  Early analysis of Abraham Zapruder’s film of the assassination concluded the shooting took place in a span of 5.6 seconds, which is simply not enough time for a single person to work the bolt and fire three separate shots from the rifle Lee Harvey Oswald allegedly used.  If so, there were at least two shooters, and more than one shooter means, by definition, a conspiracy of one kind or another.

Then some time later, I came upon a documentary, “Beyond Conspiracy,” aired on ABC in 2003, which noted that subsequent and more sophisticated examinations of the Zapruder film have established that—oops!—the actual time frame of the three shots is 8.4 seconds—more than enough for someone with Oswald’s background and training.

Since I had based my conspiratorial musings entirely on this one statistic, and since the statistic had now been proved incorrect, I saw no compelling reason to carry on with my investigations and I have suited up with Team Lone Gunman ever since.

Neat, huh?

On this 50th anniversary of that dark day in Dallas, I wish to contest a commonly held perception about conspiracy buffs—namely, that they are stubbornly irrational creatures who are impervious to facts and data that might disprove their darkest convictions about how the world really works.

Historically speaking, this assumption is entirely correct, except when it’s not.

For instance:  When a wave of paranoia about President Barack Obama’s place of birth crested a few years back, the basis of the claim that Obama was born in Kenya, not Hawaii, was the lack of a birth certificate to prove otherwise.

When the president produced such a document, the controversy should have ended right there.  Yet the howls of protest continued from some corners of the Internet, with “birther” holdouts proceeding to concoct ever more elaborate explanations for how the objective truth was neither objective nor truthful.

However, this was not universally the case.  For every person who did not listen to reason, there were many more who did.

In a Gallup poll conducted in the first week of May 2011—several days after Obama’s “long form” birth certificate was made public—13 percent of respondents asserted the president was “definitely” or “probably” born in a foreign country.  In an identical survey two weeks prior—that is, when the birth certificate had yet to be seen—the number was 24 percent.

In other words, the size of the “birther” pool was cut nearly in half by a simple disclosure of fact.  For a sizable minority of the public, the conviction that the president was not born in the United States was, it turned out, susceptible to basic logic:  They asked for proof, they received proof, they accepted it and they moved on.  Presto.

I wish the size of this minority were bigger, and that there weren’t such a large gang of reliable idiots whose paranoia overwhelms all their other mental capacities.  The latter makes the former look bad, and that’s a shame.  We need honest skeptics in this society, because sometimes their instincts are right.

The JFK conspiracy theories might be hooey, but some conspiracies are real.  (The Lincoln assassination is one.)  We must take care to recognize this, and to differentiate between the two.

To assume nothing is a conspiracy is no less reckless than to assume everything is a conspiracy.  One generalizes at one’s peril.

The key, as with so much else, is to be led at all times by the facts and the evidence, and not by the lack thereof.

What Might Have Been

Over at the John F. Kennedy Presidential Library in Boston, there is a special exhibit, “To the Brink,” all about the Cuban Missile Crisis of October 1962.

Among the featured documents on display, perhaps the most arresting is the original typed draft of a speech that President Kennedy never delivered—that is, the announcement that the United States was about to launch an all-out assault on Cuba to destroy the missiles secretly installed there by the Soviet Union.

“I, as president, had the greatest reluctance to order the action that is now being carried out,” Kennedy was to have said.  “I made every effort to clarify my position.  But the Cuban authorities would not listen.  In the face of their open defiance action became inevitable.”

“There should be no doubt on the part of anyone,” he was to add, “that, in carrying out this commitment, the U.S. will be prepared to use all the forces at its disposal including nuclear.”

The American people never heard such an address because such a decision was never made (the president opted for a blockade instead).  But it jolly well could have been:  Several key members of the secret White House EXCOMM meetings recommended such a move, and Kennedy considered it seriously enough to prepare a speech just in case.

In this week of reminiscences of the Kennedy administration—Friday is the 50th anniversary of the assassination in Dallas—the question has predictably resurfaced, “What if Kennedy had lived?”

Minus those three shots fired from the Texas School Book Depository in Dealey Plaza, how might the arc of history have differed from the one we have?

Would the United States have doubled down in Vietnam?  Would the Civil Rights Movement have progressed faster (or slower)?  Would the American public have been spared its disillusionment with government spurred by the presidencies of Lyndon Johnson and Richard Nixon?

The counterfactual history industry has long flourished in America, and it’s easy to understand why.  After all, the creative possibilities are endless and, by their nature, cannot be positively disproved.

In the case of Kennedy, the allure of crafting “what if” scenarios is especially potent, given the presidency’s oversized promise and undersized length.  It ended on a series of cliffhangers, and it has been left to survivors to second-guess how it might have played out.

What events like the missile standoff bring so sharply into focus, however, is the fact that the world does not require such horror shows as assassinations for the thrust of human events to change course.

As we know but sometimes forget, our leaders are at all times faced with decisions that could (and often do) prove enormously consequential in the longer term—decisions that were all but arbitrary at the time but are seen as inevitable in retrospect.

Such is one of the central insights of history and of life itself:  Nothing is inevitable.  Events unfold in only one way, but there are a billion other ways they could unfold, with only the mildest shuffling of the cards.

Never mind the decisions Kennedy might have made had he not died.  We cannot possibly sort through all the decisions he could have made while he lived.

Further, by no means is this principle of unknowability exclusive to government and politics.  It also applies to each and every one of us.

Back to the Future was all about how Marty McFly’s parents, George and Lorraine, met and fell in love because George unwittingly stepped into the path of Lorraine’s father’s green Chevy.  As the movie makes plain, had George simply watched where he was going, the marriage would never have occurred and Marty would never have been born.

How many of us owe our own place in the universe to events that could very easily have gone the other way?  Is there anyone who doesn’t?

And so when we talk about how different the world might have been had President Kennedy survived, let us acknowledge the limits of such theorizing by recognizing that the future is far more unpredictable than we give it credit for, that nothing is “destined” to happen until it does, and that we are at all times hostage to the playful randomness of the universe in ways that even a president cannot fully control.

Get On With It

Well, somebody had to test whether it was too soon to ask if the “Boston Strong” movement had run its course and was becoming just a little bit silly.

As it turned out, that somebody was Bill Maher and the answer was, “Yes, it is too soon.”

On the November 8 episode of his HBO program Real Time, Maher noted how the Red Sox championship parade observed a moment of reflection when the procession reached the finish line of the Boston Marathon, the site of the April 15 bombings.  Said Maher:

It was a bad day.  Three people died, that’s terrible.  More were maimed, that’s horrible.  But unfortunately that happens every day, in car accidents and everything else.  I mean, your city was not leveled by Godzilla.

In response, Thomas Menino, the outgoing mayor of Boston, called the comments “very irresponsible” and said that Maher “should be taken to task” for making them.

“He doesn’t know what he’s talking about,” Menino added.  “Come to Boston, visit Boston and see what a strong city we are.”

Views of many other Bostonians followed in the same spirit, and can be roughly summed up as, “What a jerk.”

Maher’s broader point can be gleaned from what he said moments earlier on his show about the New York metro area and Hurricane Sandy, whose first anniversary led, among other things, to the postponement of that city’s final mayoral debate:

It was a storm, it was a bad storm.  But it was an act of nature.  Do we always need to wallow and re-grieve over every bad thing that’s ever happened in this country?

Well, how ‘bout it, folks?  Conceding, as we might, that the person who asks such questions is an insensitive brute, let us soldier on and consider whether the insensitive brute has a point.

To wit:  What is the appropriate amount of time an entire city is allowed to publicly mourn a tragedy?  Should the length of the mourning be determined by the scale of the tragedy?

Had the marathon attack killed three hundred people, rather than three, would and/or should shows of citywide solidarity, like those at the Red Sox parade, endure for a hundred times longer than they already have?

Conversely, had something like the 9/11 attacks inflicted far less damage than they did—suppose the buildings hadn’t collapsed—would the city of New York be justified in holding the sorts of massive annual commemorations it has held on every September 11 since?

In publicly grappling with acts of man-made and natural horror, should we not discriminate between truly seismic events and (relatively) small-scale traumas?  Or does every high-profile calamity necessitate an equal—and equally open-ended—outpouring of public concern?

I wish to stress the use of the word “public” in all of these queries, since no one—including Bill Maher—would presume to tell a stricken individual that the time has come to “move on.”  In one’s private life, there is no right or wrong way to grieve; everyone reacts to death and suffering differently.

But taking an assault on individuals to also be an assault on an entire city is a wholly separate matter and is fair game for scrutiny.

The essence of “Boston Strong”—the phrase itself and the attitude it represents—is that the people of Boston, like the people of New York before them, will not allow punks with bombs to bring the city to its knees.  That we will carry on—proud and undaunted—and prove to evildoers everywhere that our values are not easily abandoned or destroyed.

Is this not, in so many words, exactly what the mean old man on HBO was suggesting?  That we not abandon all sense of perspective and completely lose our marbles whenever something terrible happens?  That the effects of what occurred on Boylston Street were challenging, but by no means insurmountable?

A central precept of all fiction writing intones, “Show, don’t tell.”  If the city of Boston genuinely insists upon the doctrine of carrying on, might we demonstrate it by actually carrying on?  By returning to our normal lives and not throwing a hissy fit whenever our pride is questioned?

If you want to prove that you’re strong, be strong.  Don’t say it—just do it.  Acknowledge your loss, comfort those who need comforting, and then resume business as usual.

For heaven’s sake, our city was not leveled by Godzilla.

The Less You Know

Here’s a cheerful thought for you to ponder.

Suppose there was a document, hidden somewhere in the bowels of the National Archives, that proved beyond a reasonable doubt that the assassination of President John F. Kennedy was conceived, planned and executed by some group within the Central Intelligence Agency.

And then further suppose that such a document, having been successfully withheld for some five decades, were somehow obtained, in a WikiLeaks-style coup, and released into the public domain on Monday.

What, then, would happen on Tuesday?

Among the many conspiracy theories surrounding the death of President Kennedy, the 50th anniversary of which we will observe next week, the one involving the CIA is arguably the most plausible.  Not, mind you, because there is any particularly persuasive evidence to suggest such an event actually happened—there isn’t—but simply because it is in the agency’s nature to commit the most unthinkable crimes without detection or any measure of accountability.

Programs such as Homeland might not be accurate in every last particular, but the known history of the real CIA shows assassination to be something of a hobby for our esteemed spy network, be it directly or through snafus known as “blowback.”  Is it really that much of a stretch to imagine its nefarious practices turned against its own commander-in-chief?

This year, thanks to one Edward Snowden, we have experienced a veritable waterfall of disclosures about the heretofore secret and unchecked high jinks of the National Security Agency, which has been found to have tapped the phones and e-mail accounts of pretty much everyone on planet Earth, including the leaders of countries with whom we are supposedly friends.

We disagree about whether—and to what extent—the NSA should engage in this behavior, but tell me:  Now that you know it does, do you wish that you didn’t?  In possession of this information, do you as an American feel morally soiled, or do you rather feel cheated to have so long been kept in the dark?

Never mind the rest of the world and never mind “national security.”  Broadly speaking, is there any information about the U.S. government that, if true, you would simply not want to know?  Something so ghastly—so antithetical to the highest ideals of the American republic—that you would just as soon remain ignorant of it for the balance of your natural life?

With November 22 upon us, I return to my original query:  What would it mean to learn President Kennedy was assassinated by the CIA?

For starters, it would mean we live in a country whose government murdered its own head of state—a practice we like to think is reserved for third-world dictatorships in the most backward corners of Africa and the Middle East.  And from a wing of that government, we might add, that has been in continuous operation in the half-century since, carrying on more or less as it always has—in secrecy and very nearly immune from legal recrimination.

In the event of such a revelation, what would the outfit’s current director possibly have to say in his agency’s defense?  “Sorry about that—won’t happen again”?  “Hey, it was a long time ago, let’s just move on”?

We don’t need the Kennedy-killed-by-CIA theory to be true in order to face these grave questions.

In the past decade alone, we have been made to grapple with the fact of our government, in our name, having tortured suspected terrorists—in clear violation of the Geneva Conventions—as well as having used drones to target and kill American citizens at the whim of the executive branch, uninhibited by such annoyances as due process and trial by jury.

The scandal here is not only that the U.S. does these things, but that the public has essentially shrugged them off as necessary and unavoidable byproducts of the so-called war on terror.  “Yeah, it’s unfortunate—but hey, what can you do?”

Is it possible we would regard a hypothetical plot to kill Kennedy in the same way?  With a resigned “meh”?  With a brief series of protests and howls of outrage, followed by obedient silence?

Could it be that the real problem is not that there are certain things we could not bear to know, but rather that we are no longer capable of being shocked by what our government may or may not be doing behind our backs?

I’m not sure that’s something I want to know.

Junk Food

I doubt that I will ever actually read Double Down, the new chronicle of the 2012 presidential race by Mark Halperin and John Heilemann.

The book, released last week, has attracted enormous press coverage in recent days, swiftly becoming the “official” account of the proverbial horse race between Barack Obama and Mitt Romney, thanks to a peanut gallery of firsthand sources who dish on the personalities involved in a presidential election that, if not our country’s most exciting, was nonetheless (as the joke goes) certainly the most recent.

Political junkie that I am, I have naturally skimmed the excerpts from the tome published in the New York Times and TIME and the various analyses that have followed about the meaning of the 2012 race in the context of U.S. history and, of course, the eventual campaign of 2016.

But that is as far as I wish to go.

I do not require every last detail about what Obama and Romney were doing and thinking at every moment of their contest for the Oval Office.

I did not need to know—as I now do—that Romney’s vice presidential vetting committee codenamed their operation “Goldfish” and referred to New Jersey Governor Chris Christie as “Pufferfish” and Florida Senator Marco Rubio as “Pescado.”

Nor, frankly, do I much care about the minutiae of the Obama administration’s “poll testing” about whether to replace Vice President Joe Biden with Hillary Clinton.  For Pete’s sake, any self-respecting incumbent is going to explore every available avenue for success, and Obama is no different.  There is nothing especially groundbreaking about this so-called scoop, and none of these disclosures means much in the broad sweep of history.

Yet I marinate in this piffle all the same, and will probably continue to do so for as long as it keeps popping up on my news feed.

I don’t want to, but I do.  I just can’t help myself.  I’m a junkie, and this is my junk food.

Mind you, political junk food should not be confused with political guilty pleasures.  There is a distinction between the two that, while narrow, is worth pointing out.

A guilty pleasure is something that is dismissed in polite society as vulgar and trivial, but might nonetheless contain some redeeming value.  Take, for instance, my occasional pastime of eating an entire jar of peanut butter with a spoon at 2 o’clock in the morning:  No, it’s not something I could get away with in public, but hey, think of all the protein!

Per contra, junk food is, well, junk.  It’s pure sugar and fat, it does nothing for you in the long run, you consume it in a moment of weakness and feel disgusted with yourself a few minutes later.  Rather than peanut butter, picture killing a whole jar of Nutella.  (You know who you are.)

So how might we differentiate these two concepts in the worlds of government and politics?

I am extremely tempted to argue that the entire experience of following a political campaign constitutes a guilty pleasure, with most of its constituent parts nothing but pure junk—sparkly distractions yielding heat but no light.  The boring legislative sausage-making that (sometimes) occurs between elections—that’s the part that matters.

Halperin and Heilemann have defended Double Down against the inevitable charges of gossip-mongering by asserting that all of its goodies were subject to rigorous cross-checking, and that any assertions that could not be verified, no matter how titillating, were left on the cutting-room floor.  In short, they have committed journalism, not tabloid hackery.

Here’s a thought:  They have actually committed both simultaneously, and there’s your problem.  In today’s environment, where personality not only trumps substance but is considered substantive itself, any stray piece of dirt about a political figure, no matter how inconsequential, is considered axiomatically newsworthy, provided that it is obtained in a journalistically valid manner.

My plea to the peddlers of this troubling tendency:  Knock it off.

Don’t elevate the status of disposable schoolyard chatter into the realm of respectability.  Don’t conflate valuable information with pure muck.  Learn to discriminate between the two, for many of your readers cannot, but they jolly well should.

By no means do I advocate an end to all frivolity in political reporting, just as I wouldn’t dream of purging the supermarket shelves of all candy and chocolate.  Such trifles will always have a place in our society—namely, to provide a mental release from the weight of the serious business of life.

All I ask is that we recognize our depraved desserts for what they are, and not pretend they are wholesome and nutritious.

We can allow ourselves the occasional indulgence, but let’s not make every day Halloween.