Jeepers Creepers

If Halloween didn’t exist, we wouldn’t even think of inventing it.

On any other day of the year, would any responsible parent allow his or her child to go door-to-door, ringing doorbells and demanding candy from total strangers?

On any other day, would those same parents license that child to consume all that candy after collecting it?

On any other day, would American moms and dads tacitly encourage their kids to hoard eggs and toilet paper and commit low-level vandalism across the neighborhood, or to enter a dark space and be scared out of their wits by fearsome, bloody ghouls leaping out at them?

I dare say they would not.  I certainly hope as much.

Yet on the final evening of October each year, these practices and others become very nearly mandatory in all American homes.  Modern Halloween traditions are as hardwired into our culture as those of any other annual observance.

Rarely do we stop to appreciate how very weird and wonderful this is.

As far as our national holidays go, Halloween is probably the most politically incorrect.

More than ever, today’s culture preaches child safety above all else.  Halloween preaches danger.

Today’s culture promotes nutrition and warns against the evils of junk food in spawning childhood (and adult) obesity.  Halloween promotes the idea that he who accumulates the most candy wins.

Today’s culture says, “Be who you really are.”  Halloween says, “Dress up as someone else.”

On a good day, the culture values science and reason.  Halloween values witchcraft.

In the political realm, Halloween would seem to have something to offend everyone.  Liberals must surely take umbrage at the glorification of sugary treats and the placing of young children in emotionally fraught environments.  Conservatives, meanwhile, doubtless view the practice of doling out candy to other people’s children as the creeping hand of socialism at work.  How very frightful, indeed.

Halloween possesses a real edge, however tempered and commercialized, that other popular festivities lack.  It is an edge that probably would not withstand the scrutiny of the average school board, say, were it to be proposed as a wholly new concept today.  For goodness’ sake:  Innocent children subjecting themselves to the idiosyncrasies of their grown-up neighbors?  No, that’s far too risky.

As practiced, Halloween is an occasion to get a lot of our basest and most unscrupulous impulses out of our system.  To engage in the sort of tomfoolery we otherwise avoid.  It is our national guilty pleasure, and that is what makes it so essential.

We need our guilty pleasures, because they help to keep us in line the rest of the time.

It’s the tacit agreement we make with our children and ourselves:  Eat your vegetables the rest of the week, and tonight you can gorge on all the Butterfingers and Milky Ways you want.  Be a model citizen tomorrow, but tonight, may all hell break loose.  Go ahead.  We insist.

We should insist.  Children need the opportunity to let loose and break the rules every once in a while.  If we never give them that chance, the urge to do so will bubble beneath the surface until finally exploding into the open, likely at the wrong moment and in a decidedly unattractive way.

And so Halloween presents as a moderating influence on society, rather than a corrupting one.  Who knew?

Naturally, this theory contains holes large enough to fly a broomstick through.  For one, not all children (or adults) can be depended upon to eat their vegetables between November 1 and October 30.  As well, a great many young folks can hardly be bothered to wait for one pre-arranged moment to misbehave.

In this way and others, the real problem with Halloween is not that it encourages uncommonly bad behavior, but rather that it encourages behavior that is completely typical.  That it reflects Americans not on their worst days, but on their average days.

The ideal, then, is to ensure a tension persists.  To live such that Halloween remains the exception to the rule, both in theory and in practice.  To make certain its traditions of mischief and sugar highs remain an affront to the enlightened world, not its standard operating procedure.

Like Austin, Texas, let us keep Halloween weird.

Still Searching For Sanity

Three years ago tomorrow, some 200,000 viewers of Comedy Central assembled on the National Mall in Washington, D.C., to witness Daily Show host Jon Stewart demand a mellowing out of American politics.

The event, co-hosted by Stewart and his counterpart Stephen Colbert, was christened the “Rally to Restore Sanity and/or Fear.”  It was originally scheduled as two separate, dueling demonstrations—merged into one for logistical reasons—to weigh the relative merits of conducting a civil discussion about the public concerns of the day versus behaving like crazed goobers in the same pursuit.

Viewed now, from a distance of three years, we can see with depressing clarity which side has won.

In the event’s valedictory address, Stewart said the following:

This was not a rally to ridicule people of faith or people of activism or to look down our noses at the heartland or passionate argument or to suggest that times are not difficult and that we have nothing to fear.  They are and we do.  But we live now in hard times, not end times.  And we can have animus and not be enemies.

In truth, the rally, like Stewart’s program, was as much a critique of the American media as of American public officials.  “The country’s 24-hour political pundit perpetual panic conflictinator did not cause our problems,” Stewart said, “but its existence makes solving them that much harder.”

Stewart’s central charge was, and is, that cable TV news networks’ portrayal of American life is false.  That most ordinary people are not as mindlessly partisan or as confrontational as political pundits would have us think and, by implication, that lawmakers in Washington, D.C., do not represent the real values of real Americans.

The context of the “Sanity” gathering was the ascendancy of the Tea Party, which by October 2010 had established itself as a forceful ideological movement and would prove politically viable for the first time in the midterm elections three days after the rally.

The idea, according to Stewart and company, was that Tea Party activists were neither as bad nor as good as the respective partisan wings of the media claimed:  Liberals were wrong to paint them all as bigots, while conservatives were wrong to claim theirs as the prevailing sentiments of most people.

Then and now, there is a distinction we must draw—narrow but deep—between having extreme views and expressing one’s views in an extreme fashion.

The “Sanity” rally seemed to imply the former is fine, provided the latter does not intrude.  That it is possible for people with wildly divergent opinions to reach, if not common ground, then at least an honest understanding of their differences and, faced with practical considerations, some sort of middle-of-the-road compromise.  That the real conflict in American politics is not between Democrats and Republicans or even liberals and conservatives, but rather between temperance and intemperance.

As we clear out the debris from this month’s government shutdown, let us ask:  In today’s environment, does this prognosis hold?

Yup.  Almost perfectly, in fact.

To the question, “Why did the government shut down?” the answer comes back:  Because the dominant faction in Congress made demands that, as it well knew, the president was never going to accept.

That the shutdown would go on unless President Obama kneecapped his own proudest domestic policy initiative, the Affordable Care Act, was an insane proposition on the part of the Tea Party wing of the GOP.  It was an impossible condition for compromise, destined to fail, and therefore an entirely theatrical exercise whose costs have run well into the billions.  And this from a party that presumes to value fiscal responsibility.

That Harry Reid, the Senate majority leader, took every opportunity to hurl gratuitous, infantile insults at his political adversaries when he should have been building bridges was, if not as directly destructive to the process, an audacious demonstration of bad form that only served to poison a well that was already waist-deep in arsenic.

Absent such absurdities, negotiating the minutiae of the federal budget would have been a wholly manageable task.  Boring, difficult and protracted, but doable nonetheless.

It should not have required the government to grind to a halt for Congress to figure out how to allocate funds for the incoming fiscal year.  That it did, and the way that it did, effectively proves Jon Stewart’s main points.

We can take it on faith that Barack Obama and Ted Cruz will never see eye-to-eye on anything.  The magic of our system of government is that the world can keep right on spinning even when they don’t.

All we ask of them and their fellow public servants is to meet each other halfway—not ideologically, mind you, but temperamentally.  To calm themselves down and exhibit the shows of good faith that we, their constituents, are owed, if not always deserve.

Unusual Weather We’re Having

For the past few days, we in the Northeast have experienced our first real brush with fall weather.  Just in time for the World Series—baseball’s “fall classic”—winds have picked up and temperatures have dropped into the 50s—and to near-freezing at night—bringing the region’s extended summer to a swift conclusion.

Under the circumstances, we should be very grateful, indeed.

It was this very weekend a year ago when a hurricane called Sandy unleashed holy heck upon much of the Jersey Shore and elsewhere, inflicting damage from which some areas are still recovering and from which others never will.

A year prior—again, the final weekend of October—the region was hammered by a massive and unexpected blizzard, which knocked out electricity and heat in innumerable homes for a wee bit longer than we would have liked.  (In my apartment, things got so bad that we needed to call the fire department to clear out a buildup of carbon monoxide from the furnace.)

That two gargantuan, unprecedented meteorological calamities occurred exactly one year apart is a coincidence.  But it provides us a thoroughly germane entry point into what has become a full-time job:  Reminding ourselves that crazy weather is something we need to accept as inevitable in our world for the rest of our lives.

It would have been alarming enough for the northeastern seaboard to be struck by an uncommonly strong hurricane the same weekend two years in a row, or to be blanketed by an early snowfall in like manner.

But no, it was one type of ridiculous natural disaster followed by an entirely different type of ridiculous natural disaster.

In other words, the quandary for all mankind is not merely the increase in strength of active weather systems under the banner of “climate change,” but the increase in uncertainty and variability of the very nature of the systems themselves.

As a Bostonian, I have long appreciated comic Lewis Black’s quip that during one trip to Boston, “In four days, I experienced five seasons.”

Indeed, to live in a coastal environment such as the City of Beans has long meant having to prepare for a dramatic change in conditions on a moment’s notice—changes often not detected by even the most sophisticated computers.

To plan a trip to such a locale entails packing twice as many clothes as one will actually wear, as one cannot know in advance how many layers one might require on a given day.

One of the central facts of climate change, as its consequences become increasingly impossible to halt or reverse, is that the entire world is becoming Boston in this respect.

With each passing year, one will not be able to go anywhere on Earth with any confidence as to what might fall from the sky when one arrives.  We will need to “prepare for the worst” without knowing what “the worst” actually is, because it could conceivably be anything.

The original error in the early PR campaigns to raise awareness of global warming was to call it “global warming.”  First, as Bill Maher once observed, it makes the whole phenomenon sound rather appealing, particularly to those in climates that could use a little more warmth.

Second, and more crucially, “global warming” is the wrong term because it is too limited in scope, and does not accurately characterize what the crisis is and why it is a crisis.

In the north, 70 degrees in October sounds positively heavenly.  But that followed by a monsoon, followed by a blizzard, followed by a hurricane?  Not so much.

But that is what climate change hath wrought, and what makes it so frightening.

Feasibly, any city could secure itself against a particular type of extreme weather situation, given the funds and the organizational savvy.  But can every city really secure itself against everything?

The continuing challenge, in spite of the above, is not to completely lose hope by throwing up our hands and assuming there are no further meaningful steps we can take to minimize the damage of future catastrophes.  As the great cliché intones:  Don’t make the best the enemy of the good.

Extreme weather is inevitable, as is the untold suffering it will unleash.  The natural environment really is changing, and almost entirely for the worse.

But doing something about it is, as ever, still preferable to doing nothing at all, just as losing 90 percent of the Jersey Shore is preferable to losing the whole bloody thing.

We might wish for a more appealing choice than that, but then one must always face the world with which one is presented, particularly when one’s own lifestyle was partly responsible for bringing that world about.

Step Away From the Oreo

This week we observed Food Day, described by its founding organization, the Center for Science in the Public Interest, as “a nationwide celebration of healthy, affordable, and sustainably produced food and a grassroots campaign for better food policies.”

Implicit in this goal of consuming food that is good, I hazard to say, is avoiding food that is bad, and this is where your humble servant possesses a fair amount of expertise.

Like anyone who has ever gone on a diet and been faced with obstacles around every corner—all of them delicious—I have over time subjected myself to every trick and strategy in the book to establish a sane, healthy eating regimen that stands even the slimmest chance of long-term success.

Through this experience, I have arrived at one definite conclusion above all others:  If you wish to avoid eating crap, keep said crap out of your kitchen.

If your objective is to resist temptation, then stop tempting yourself.

This sounds simple and obvious, until you realize how few Americans, based on the obesity figures, seem to have taken it to heart.

Consider, if you will, a study from the late 1960s and early 1970s known as the “Stanford marshmallow experiment.”

In this test, a child would be placed in a room containing nothing but a chair, a table and a marshmallow.  The child would be informed by the tester that he or she could eat the marshmallow at any time, but that if he or she abstained for 15 minutes, the tester would return with a second marshmallow, and the child could eat both.

Among the many interesting findings of this study was the particular set of strategies employed by those who managed to resist the marshmallow’s sugary allure.

“Instead of getting obsessed with the marshmallow,” wrote Jonah Lehrer in a New Yorker profile, “the patient children distracted themselves by covering their eyes, pretending to play hide-and-seek underneath the desk, or singing songs from ‘Sesame Street.’  Their desire wasn’t defeated—it was merely forgotten.”

“If you’re thinking about the marshmallow and how delicious it is, then you’re going to eat it,” Walter Mischel, the experiment’s architect, explained.  “The key is to avoid thinking about it in the first place.”

While this may sound far easier said than done, consider a related psychological rule of thumb, as articulated by Late Late Show host and recovering alcoholic Craig Ferguson:  “The desire to have a cigarette or the desire to have a drink will go away whether or not you have that drink or that cigarette.”

Exactly so, and the same is true with food.

Struck, as we all are, with a seemingly overwhelming hankering for some sugary treat or other, we often assume the only way to make the hankering go away is to seek out and eat said treat.

As it turns out, this assumption is false.  If you simply wait long enough, the urge to indulge will disappear of its own accord, without your having to blow a giant hole in your diet in order to return to some sort of culinary equilibrium.

The connection between food and drugs is useful in other ways, as well.

Just last week we learned that, according to an undergraduate lab rat study at Connecticut College, Oreo cookies might be as addictive as cocaine, if not more so.

Whether or not this finding holds true for humans, it suggests a worthy strategy for how to think about one’s diet:  Treat the foods that get you into trouble as if they were addictive substances.

If Oreos are your Achilles’ heel, if you cannot stop at just one, and if Oreo binges leave you feeling powerless and discouraged, then the solution is not to stare down a stack and shout, “I am not going to eat you!”  Nor should you buy a whole package and assume that this time, finally, you’ll keep your urges under control.

The solution, rather, is simply to quit, cold turkey.  Like the children in the marshmallow study, to turn your back and forget they’re there.  And, whenever possible, to make it so they actually aren’t there in the first place.

In a society plagued by a dearth of healthy eating habits, where one’s poor dietary choices affect everyone else in any number of ways—not least regarding the cost of health insurance—some indulgences just aren’t worth it.

When to Hold, When to Fold

That’s more like it.

New Jersey has become the 14th state to legalize same-sex marriage, thanks in large part to there being at least one Republican official in the United States who knows when to throw in the towel.

Here’s what happened.  The Garden State sanctioned gay civil unions in late 2006.  In February 2012, both houses of the state’s legislature voted to legalize gay marriage outright, but the bill was vetoed by Governor Chris Christie, who personally opposes gay marriage and said he would prefer the issue be resolved by the people of New Jersey through a ballot referendum.

Last month, however, a state superior court judge ruled that New Jersey’s civil unions policy failed to “provide same sex couples with equal access to the rights and benefits enjoyed by married heterosexual couples,” and that the state was therefore obligated to allow same-sex marriage posthaste.

Governor Christie initially sought to appeal the ruling, but was blocked by the State Supreme Court, which denied Christie’s request to halt same-sex weddings until the appeal ran its course.

On Monday, as marriage ceremonies began as scheduled, a Christie spokesperson released the following statement:

Although the governor strongly disagrees with the court substituting its judgment for the constitutional process of the elected branches or a vote of the people, the court has now spoken clearly as to their view of the New Jersey Constitution and, therefore, same-sex marriage is the law.  The governor will do his constitutional duty and ensure his administration enforces the law as dictated by the New Jersey Supreme Court.

In light of the government shutdown that wreaked havoc on the country during the first half of this month, one cannot help but point to the above and ask Republicans in Congress, “Is that so hard?”

The whole casus belli for the shutdown, you will recall, was the adamant refusal by certain members of the House and Senate to recognize the legitimacy of the Affordable Care Act, and the vow to keep Washington, D.C., closed until the act was dismantled.

As this gang of obstructionists had to be reminded, this was a piece of legislation that had been approved by both houses of Congress, signed into law by the president and upheld as constitutional by the U.S. Supreme Court.

And yet, the GOP was having none of it.  In their minds, Obamacare simply could not, and cannot, be accepted as the law of the land, even though, according to the stipulations and processes laid out in the U.S. Constitution, it is the law of the land.

The House has been duly ridiculed for voting to abolish the Affordable Care Act on more than 40 separate occasions—all of them knowingly in vain—but at least in doing so, its members have followed standard operating procedure regarding how laws are passed, albeit to a farcical extent.  After all, you can’t vote to overturn a law without acknowledging that the law exists.

What the shutdown debacle made plain—in case it wasn’t obvious already—is that antipathy toward Obamacare has so overwhelmed the presently dominant “Tea Party” wing of the GOP that its leaders have simply ceased to care about process.  To them, the ends justify the means, and the hell with everything else.

Governor Christie, in his handling of the gay marriage question, has presented himself as counterprogramming to this mentality:  He pushed as far as he could to prevent gay marriage from coming to New Jersey, and when his efforts proved legally futile, he gave up and moved on, conceding that not all political battles can be won, particularly when the tide of history is not in your favor.

We shouldn’t need to heap special praise upon Christie for his actions.  It is a measure of the petulance and ridiculousness of Republicans in Congress that Christie can be viewed as a beacon of statesmanship by comparison for simply doing his job.

But we cannot deny the context of the environment in which we live.  Having one high-profile Republican who respects the legitimacy of established law is preferable to having none at all.

If this week’s yielding on gay marriage is part of Christie’s presumed grand scheme to eventually run for president, he has executed it in a rather ingenious fashion.  The political right can rest assured that his heart is with them, while everyone else can breathe a sigh of relief in the knowledge that, on matters of actual governance, he also possesses a brain.

Guilty

One month from today, we will observe the 50th anniversary of the day when, according to tradition, America lost its innocence.

You know, the innocence we retained as we slaughtered several million fascists during World War II, seeing some 400,000 of our own men go down in the process.

The innocence that got us through a couple centuries of chattel slavery and the Civil War that finally ended it.

The innocence we carried as we plundered our way through the wilderness throughout the 17th and 18th centuries, collecting the scalps of Natives as we marched.

Those whimsical adventures were enjoyable enough, but everything was ruined the moment we learned that when a bullet enters the president’s head, candy doesn’t come out the other side.

But I guess, in spite of that trauma, we managed to reclaim our purity sometime in the subsequent 38 years, since we lost it all over again on September 11, 2001.

Apparently national virtuousness is like the car keys.  You think it’s gone forever, and then it suddenly turns up in the couch cushions.

Today, it is taken as read that the assassination of President John F. Kennedy on November 22, 1963, was one of those coming-of-age moments for an entire generation of Americans:  A hinge event that marked the end of one era of history and the start of a newer, scarier one in its place.

As we spend the next month ruminating on the meaning of the Boomer generation’s “I remember where I was” moment, let us devote at least a part of this conversation to the possibility that we have overstated the case, both then and now.

Viewed from a temporal distance and in a wider historical context, the Kennedy assassination is not particularly interesting.

In the century preceding Kennedy’s election in 1960, five of the 18 men who occupied the Oval Office did not get out alive:  Three were assassinated, and two more died of natural causes.

As well, the same period saw some half-dozen assassination attempts that failed, whether on a sitting president (Harry Truman), a president-elect (Herbert Hoover, Franklin Roosevelt, Kennedy himself) or a former president seeking a return to office (Theodore Roosevelt).

Further, on the international scene in 1963, bopping off a world leader had become something of a habit amidst the intertwining tensions of the Cold War.  Kennedy’s own Central Intelligence Agency had supported the successful coup against South Vietnamese President Ngo Dinh Diem, who was killed in the ensuing chaos just three weeks before Kennedy’s own death.

What is more, the 1950s had been positively littered with similar CIA-backed shenanigans all over the globe—some successful, some not—and while the American public was not aware of most of these activities at the time, it would have required extraordinary obliviousness for one to assert that the world was not a dangerous place—particularly after October 1962, when the Cuban missile standoff very nearly destroyed the whole bloody planet.

Nor did the Kennedy assassination itself come out of nowhere.  Hostility toward the president for myriad perceived crimes (most of these involved capitulating to Communists) had long boiled over among various extremist groups in various pockets of the United States, not least in Texas.  If such hatred was not as overt as, say, that of the Tea Party for President Barack Obama, it was hardly a well-kept secret.

So what is this piffle about a sudden loss of national innocence?  What could we possibly be talking about?

The Kennedy assassination was a disturbing, tragic episode in a long line of similar calamities throughout the life of the American republic.  It is unique because it is the only killing of a commander-in-chief to be reported on live television and, thanks to a bystander named Abraham Zapruder, to be captured on film.  And, of course, the only such event remembered by people still alive today.

That’s what it was, and that’s all that it was.  Let’s not get carried away.

It is silly and historically ignorant to suggest the murder of the 35th president was somehow the moment everything changed—the biting of the apple that instigated the banishment from Eden and the moral soiling of all mankind.  As if the entire history of the world had been rainbows and gumdrops until a leader with great hair and a charming family found himself on the wrong end of a Carcano bolt-action rifle.

In the fall of 1963, were we really that naïve?  Were we really that dumb?  Are we so solipsistic that we can only comprehend the significance of events that we, ourselves, were around to see?  Do we truly think that the world stops spinning the moment we close our eyes?

We Americans are renowned the world over for our short-term memories regarding even the most basic facts of history.  Must we reinforce this view by tacitly demonstrating that it’s true?

Is there nothing more noble that we can do for our country?

Congressional Dissonance

The most recent public opinion poll has the U.S. Congress’s approval rating at 7 percent.

A separate poll, taken last week amidst the government shutdown, found 60 percent of respondents affirming that every last member of Congress should be fired, including their own.

Beating up on the ineffectiveness of America’s national legislature is such a cliché—along with the painfully unfunny series of jokes about “things that are more popular than Congress”—that I hesitate to bring it up in any context whatever.

Nonetheless, there is one angle from which the problem of the House and Senate’s perennial unpopularity needs to be considered and understood, and it comes in the form of yet another statistic:  90 percent.

That is the proportion of sitting members of the House who were re-elected in 2012.  The number was 85 percent in 2010 and 94 percent in 2008.  In point of fact, the last time the incumbency rate dropped below 80 percent was 1948.

In 2014, will a sizeable chunk of Congress’s incumbent class be thrown out on their unholy patoots?

Don’t.  Make.  Me.  Laugh.

If there is any cliché even more putrid than the notion that Congress is rotten to the core, it is John Q. Public’s practice of ensuring that as many members as possible are safely returned to their seats every two years.

The alchemy that allows this to happen is probably too complex to tackle all at once, although a great deal of blame has lately been placed on the phenomenon known as the “gerrymander.”  That’s the practice of carving the boundaries of congressional districts so they are disproportionately Democratic or Republican, thereby guaranteeing that the district’s sitting representative will be re-elected for the rest of his or her natural life.

This explanation is valid as far as it goes, but is somehow not quite good enough.

If we are to understand the dramatic disconnect between our contempt for Congress and our penchant for re-electing it, we must plunge deeper than mere political shenaniganery.

Politics is personal.  We elect the people we elect because, in one way or another, we like them.  Whether it’s because we agree with their philosophies about healthcare or immigration, or simply because they seem like folks with whom we could do shots during happy hour, we apply a test of basic decency and identification to everyone to whom we give our vote.

Expressing disapproval and even hatred for the entire U.S. Congress is easy, because it comes across as one giant blob of uselessness—a conglomerate of mostly anonymous individuals whom, with one exception, you played no role in selecting.  Who do these goobers think they are, and why are they spending my hard-earned money on things I don’t care about?

But when your own congressperson returns home with a suitcase full of cash for that shiny new bridge you’ve been asking for?  Now we’re talking.

Even apart from the pork, we picture our hometown representatives as human beings in a manner that is simply not feasible when applied to the House as a whole.  Try as you might, you cannot empathize with 435 lawmakers as you can with one.

There is a reason so many public officials still go out in the streets to shake hands and talk one-on-one with their constituents:  For all that the Internet has done to streamline the act of communicating, there is still nothing that quite equals the personal connection of good old-fashioned eye contact.

To wit:  Thomas Menino, the outgoing mayor of Boston, has enjoyed stratospheric approval ratings for much of his tenure—at last count, it was at 82 percent.  In a 2009 Boston Globe survey, 57 percent of Bostonians said they had met Menino personally.  Do you suppose these two facts are related?

It is my suspicion—long story short—that eliminating gerrymandering will not eliminate the problem of retaining people the public professes to detest, because even if overt partisanship were brought into proper proportion, the impulse to take care of your own would remain.

Historically, people tend to vote for their congresspersons based on local concerns, saving their national gripes for senators and presidents.  That, in part, is the point of having a bicameral legislature, with one house more vulgar than the other.

There is an inherent tension here, and it will probably never go away.  We will continue to bitch about the House’s collective intransigence, and we will continue to enable it by sending its members back to Washington.

That is, unless we don’t.

The ball is in our court.  If we don’t follow through, we will have no one to blame but ourselves.