A Nation of Gamblers

Following last week’s calamitous tornado that flattened a significant chunk of Moore, Oklahoma, killing 24 people, we learned that one reason for the high fatality count was a lack of storm shelters in some of the elementary schools in the twister’s path.

Of the 24 dead, ten were children, seven of whom had been huddling on the ground floor of Plaza Towers Elementary, which did not have a basement.  Some Oklahoma schools have emergency shelters for such severe weather events, but because the state does not mandate them and they can be quite expensive to build—sometimes as much as $1 million apiece—many more have chosen not to bother.

Helen Grant, a resident of Moore, is rather annoyed by her state’s shelter-optional policy, saying, “I don’t think you can put a price on human life.”

In point of fact:  Yes, you jolly well can, and we Americans do it all the time.  Be it through our healthcare system or the ways we allocate funds for police and fire departments, we reveal what we truly value in our lives.  Very often, ensuring the preservation of those lives is not terribly high on the list.

Perhaps an even more germane fact of life that the situation in Moore underlines, however, is the collective gambler mentality that leads us to expend neither enough cash nor enough attention on the sorts of projects that would do all of us some good.

Another unfortunate event of last week was the collapse of a bridge over the Skagit River near Mount Vernon, Washington, just in time for Memorial Day weekend.  As the high-traffic holiday approached, we were informed (as if we couldn’t have guessed) that roughly 11 percent of all U.S. bridges are currently categorized as “structurally deficient.”

Why does the Greatest Country in the World allow this to happen?

Well, of course, there are many reasons, most of them involving either money or the nature of bureaucracies, or both.

However, I would postulate the “real” reason—the one undergirding all the others—is precisely the reason New Orleans did not have decent levees in August 2005, and why certain neighborhoods in the Bay Area could not withstand the 1989 earthquake.

It’s because Americans don’t think anything bad will ever happen to them—individually and collectively.

No catastrophic twister had struck the greater Moore area for quite some time, so why not assume that one never would again?

New Orleans had not completely drowned as the result of a hurricane in any of our lifetimes, so surely the city was immune.

I have never become seriously ill or broken any bones, so why should I bother buying health insurance?

My car made it over the bridge today.  Why shouldn’t the same be true tomorrow?

We are a nation of happy delusions, a people with a little too much faith in ourselves and the components that physically hold our country together.  Whenever possible, we avoid making investments that might not pay off, assuming in the meanwhile that everything will turn out for the best.

We might not necessarily fashion ourselves invincible, but we put tremendous stock in the notion that we are lucky.

The problem here is that, historically speaking, we are.  And probably more so than we realize.

Every violent storm that causes great damage is a reminder of all the storms that do not, just as the stray meal that gives you acute indigestion makes you more fully aware (and appreciative) of how, most of the time, the human body really does run like a well-oiled machine—even when it really shouldn’t, considering all the unholy junk we shovel into it.

If we are ever to take seriously the daunting national project of upgrading our infrastructure—“investing in the future,” as politicians often put it—we will need to alter, in fairly dramatic ways, how we think about the world around us, abandoning the wishful thought that what works today will still be working tomorrow and forever more.

The unfortunate fact, I fear, is that we will need a lot more disasters to befall us before we truly get the message.  What is more, the disasters will need to be more evenly distributed across this great republic, because the dirty little secret—that is, the one everyone knows—is that most people don’t care about preventing bad things from happening until they happen to them.

Democracy of Death

Few experiences in life are as enjoyable as wandering through a cemetery.  Nothing steels one’s appreciation for life like being surrounded by death.

The attitude a cemetery engenders begins as a form of schadenfreude—“Look at all the people I have managed to outlast!”—but gradually shifts and mellows into an understanding of death’s role as the Great Equalizer, as you are swiftly reminded that it is a mere matter of time and fortune before you, too, are returned to the earth from whence you came.

If this thought does not lift your spirits to new heights, I don’t know what would.

Well, it so affected me, anyway, as I ambled through a pair of local graveyards this past Memorial Day weekend, visiting family members past with a team of family members present.

While our self-guided holiday tour of the dear departed was ostensibly to pay our respects to our own kin, straying from the family plot and exploring the headstones of strangers was no less compelling.  Indeed, it was more so, as we did not know what we might uncover.

What we found—what anyone finds in any cemetery, if one takes the time to look—is the extraordinarily democratic nature of the grounds.  How fantastically interesting it would be (scrambling the space-time continuum accordingly) were the good folks buried within any random tract of land to be occupying the same space at the same time while they were still alive.

Think of it:  Just a few precious feet from a gentleman whose tenure on Earth stretched 104 years lies a poor boy who lived not beyond the age of seven.

A World War I flying ace lounging in eternal rest near a lifelong housewife who perhaps never traveled outside her own home town.

A husband and wife so economically well-endowed that they had a rendering of their beloved yacht etched onto the back of their family stone, and then, in the very same row, a lowly duo whose own rock was so cheap it had broken in two and was very clumsily glued back together.

Not all depositories of human remains are so egalitarian, of course.  Our national cemetery in Arlington, Virginia, for instance, only accepts those who served in the U.S. Armed Forces (and their families).  Others only stow adherents to a particular religion.  Still others no longer conduct any burials at all, having exhausted all available real estate or simply desiring to maintain an aura of antiquity.

Yet I prefer those that do not discriminate, because of the way they serve as a desegregating force—an eternal counterbalance to all the ephemeral self-segregation we conduct in our waking lives—providing for a messy mishmash of company with no business intermingling except for a series of temporal and geographic coincidences.

Is this not what we envision heaven to be like?  An infinite series of odd couplings at a celestial cocktail party, with unlimited time to reach common ground and achieve some manner of redemption and peace?

Studying the headstones at such a place, one recognizes the primacy of chance in the trajectories of our lives, and how cosmic accidents such as what century one happens to be born into can determine one’s fate far more than we realize most of the time.

For instance:  You see the alarming number of graves of people who died in their thirties and forties, with dates of death fairly close together, and wonder how many of those lives might have been drastically different had they encompassed the eradication of polio or smallpox.

How many of our own arcs will be significantly extended (and enhanced) by scientific and medical innovations that did not even exist when we were born?

And so, for all the ways cemeteries illustrate the commonalities across all of humanity—namely, our common fate as decomposing corpses and piles of ash—they simultaneously remind us of the shackling nature of time and other circumstances beyond our control, which doom us to live very distinct lives, indeed.

We are left to regard these facts in any manner we wish, and I suppose that trudging through a graveyard is not everyone’s conception of a gay old time.

But I do not see why this should be so, as the experience can be every bit as engaging as plunging into any great work of nonfiction or touring an exhibit of historical artifacts.  In essence, that is what a cemetery is, for there is no more primary source of history than the people who made and lived it.  We, the people.

Moral Vanity

One of the great challenges in running for office is the necessity to sell one’s virtues to the voting public while also, paradoxically, maintaining an aura of humility.  People tend not to admire political figures (or non-political figures) who come off as a trifle too self-regarding, and yet it is the nature of campaigning to explain to everyone how wonderful you are.

Gabriel Gomez, the Republican candidate in the special election for U.S. Senate in Massachusetts, is finding it especially difficult to square this particular circle.

For a while, any great interest in the race to replace John Kerry in the Senate appeared to be one more casualty of the Boston Marathon bombing.  The party primaries were held two weeks after the attack, and attention was severely limited.  The candidates, for their part, made themselves relatively scarce, with the myriad angles of the Marathon’s aftermath sucking all the oxygen from the room.

Now, with the nominees chosen and the general election scheduled for June 25, the campaign has proceeded full steam ahead, and any fears that this would turn into a sober, issues-based affair have been duly squashed over the past couple of days.

The particular spat that has gotten the nastiness rolling—uninteresting except for what it reveals about the players involved—began with an advertisement by the Democratic candidate, Ed Markey, which assailed his opponent, Gomez, for involving himself with a group that accuses President Obama of politicizing the killing of Osama bin Laden.  For several seconds of this ad, an image of Gomez sits on the left side of the screen while one of bin Laden floats in from the right.

Team Gomez, testing the general gullibility of Massachusetts voters, ran a TV spot in response saying Markey’s ad “compared [Gomez] to bin Laden.”  In an interview, Gomez himself continued the thought by postulating that, in doing so, Markey was “pond scum.”

And the tone of the race was set.

What lends this silly campaign flash point an added level of intrigue is Gomez’s distinction as a retired Navy SEAL.  He graduated from the U.S. Naval Academy and served as an aircraft carrier pilot before joining the SEALs, where he rose to be a platoon commander.  His military career totaled 13 years before he moved on to his current vocation as a businessman.

It is a highly impressive background, by any standard.  As a Senate candidate, Gomez would be crazy not to underline it as a demonstration of his physical fortitude and dedication to his beloved country.

The question, then, is when to stop.  To recognize the point at which promoting one’s history of service begins actually to hamper, rather than help, one’s campaign.

In reacting to Markey’s supposed “comparison” of bin Laden to him, Gomez phrased his disgust thusly:  “To put me next to bin Laden?  A former SEAL—maybe he doesn’t realize who actually killed bin Laden.  The SEALs did.”

We have seen this rhetorical sleight-of-hand before:  I served in the U.S. Armed Forces; therefore, anything I do or say relating to the military is axiomatically beyond reproach, and any related criticism by my opponent is beyond the pale.

During the 2008 presidential race, columnist George Will coined the term “moral vanity” to describe this attitude as it applied to Senator John McCain—the idea that one’s particular background on a particular issue cannot possibly be questioned, and especially not by those who lack the same experience themselves.

This is a decidedly unattractive quality to possess, as it would seem to rule out any honest debate on a given subject right at the outset.  After all, if one candidate has such moral superiority about this or that issue, why trouble ourselves arguing over it?  Why can’t Candidate B just accept Candidate A’s inherent rightness and move on?

Further—to my initial point—the person who commits such transgressions against intellectual openness tends ultimately only to inflict political harm upon himself, by creating the impression of having drunk his own Kool-Aid, and thus lacking the modesty and self-doubt that are essential to good character and good leadership.

Gabriel Gomez served an honorable Naval career, of which everyone ought to be made aware and no one has any cause to put down.  Of the rightness of his views on the issues—military and otherwise—well, let us be the judges of that.

Secular Sabbath

This weekend marks four years since my grandfather, Jack, died.  As he had served as an Army clerk in World War II, it seemed fitting enough that he shuffled off on Memorial Day.  If nothing else, it means that we, his survivors, get to remember him twice.

For Zady (Yiddish for “Grandpa,” as he was known), the solemn day of remembrance we observe on Monday was an integral part of his life long before becoming an integral part of his death.

Every year we accompanied him to the local Memorial Day parade (before a rendezvous back to the house to grill hot dogs and toast marshmallows), and as we plopped down in his stuffy living room in front of the TV, he would bemoan, with abject disgust, the unholy proliferation of advertisements for Memorial Day sales.  “Get a great deal on a used car!”  “Fifty percent off all mattresses!”  “Three days only!”  “Hurry hurry hurry!”

For a member of the Greatest Generation, as Zady was, to exploit the day on which we remember those who died in order that the rest of us could live was not merely annoying; it was downright profane.

In recent years, of course, the practice of conducting all manner of commerce on national holidays, by businesses large and small, has grown by leaps and bounds.  Today, there are very few product-peddlers in the United States that do not promise amazing holiday deals, no matter how somber the holiday.

Any notion of tasteful restraint on this front was ceremoniously laid to waste last November, when Walmart stores took the “Black Friday” madness to new heights (or is it lows?) by opening on Thanksgiving itself, rather than waiting until the traditional, rational hour of 12 o’clock midnight to fling open their doors and allow the throngs of shameless, thrifty sociopaths to barrel through.

With this state of affairs now firmly accepted in our culture as the not-so-new normal, it is all the more essential to ask and to wonder, in the spirit of Grandpa Jack:  Is nothing sacred?

As a general principle, I try to resist flights of nostalgia for our country’s supposed “good old days,” when things were simpler and everyone had a white picket fence, and America’s youth respected its elders and had not yet been corrupted by birth control and MTV.

As we know, very little that is said about the so-called wholesomeness and moral superiority of the mid-20th century is actually true.  To study the past is to be extraordinarily relieved to reside in the present.

And yet I wonder whether, by all but abandoning the ritual of taking a day off to reflect upon the nobler aspects of the American story, we have indeed managed to hollow out a chunk of our national soul.

My charge, I suppose, is that the primacy of commerce in our daily lives has ballooned so violently, and irreversibly, out of proportion that it is inflicting genuine harm upon the American civic character.

To wit:  It really used to be true that you would walk into town on a Sunday morning and everything would be closed.  There was no question about it.  Of course none of the shops were open—the owners and employees were all at church, along with everyone else.  If you weren’t at church, you were sleeping in or traveling or doing whatever your heart desired.

In any case, the point was that you were effectively forced to spend the day removed from your usual routine to relax, recharge and reflect.  To remember the Sabbath day and keep it holy.

For all the economic and Constitutional objections one might raise against any effort to resurrect such traditions in today’s world (outside the towns that have never quite abandoned them, that is), we might nonetheless regard an occasion such as Memorial Day as a kind of secular Sabbath, treating it with the reverence and undivided attention that it deserves.

Of course, this could all be a sentimental overreaction.  After all, we still have our parades and memorial services, and those who wish to participate still do.  (Unless they are called in to work.)  In a free country, why should we compel those who are busy or uninterested to tag along?

Because every so often, it is worth reminding ourselves that the United States is still one big community, with a shared history and shared values.  That for every vacuous, artificial holiday we have cooked up over the years, there are also those with real meat, meaning and purpose.

That some things are more important than shopping.

The Limits of Rolling Heads

When the news reached saturation levels that the Internal Revenue Service had treated right-wing groups unfairly in their applications for tax-exempt status, one demand by those outraged by the story rang as loudly and clearly as any other.

“Somebody had better get fired for this!”

It didn’t take long for that to happen.  Last Wednesday, with Congressional and media attention on the IRS affair hitting critical mass, the agency’s acting commissioner, Steven T. Miller, submitted his resignation.  He has been bouncing in and out of Congressional hearings ever since.

The controversy continues, and calls for others to resign or be sacked have been an integral part of the conversation.  The removal of Miller, the big cheese, was seen less as a central corrective to the scandal and more as a necessary first step.  There was little (if anything) in the way of protest against his forced departure from the IRS; it was accepted as inevitable, with a sigh and a shrug.

The comedian Lewis Black refers to this phenomenon as “The Tough Shit Rule”:  When an organization is found to have engaged in egregious behavior, responsibility has to begin at the top, and the boss has got to go.  “It’s not a question of blame,” Black clarified.  “It’s just the way it is.”

Indeed it is.  But should it be?  Is accountability for its own sake really worth the effort?  More to the point, is it fair?

The IRS situation is as follows:  Before his unceremonious bouncing this week, Miller had served as the agency’s acting commissioner since November 2012, when then-commissioner Douglas Shulman stepped down as his tenure drew to a close.  (Commissioners serve a five-year term, following a nomination by the president and confirmation by the Senate.)

As the chronology of the present scandal makes clear, none of the relevant wrongdoing occurred on Miller’s watch.  To the extent that he was aware of such shenanigans during his brief reign, he cannot be blamed for either ordering or perpetuating them, and comes awfully close to seeming like an old-fashioned fall guy—a placeholder with no real purpose except to absorb the blows intended for others.

In instances such as this, we find ourselves in a tight spot:  We want to get to the bottom of things, but doing so can take a dreadfully long time and, doggone it, somebody’s gotta pay in the meanwhile.  Heads must roll.

The impulse is a good one, and often borne out by history.  Costa-Gavras’ excellent 1969 film Z, based on real events, is all about how the low-level goons who carried out the assassination of a political opposition leader were swiftly rounded up and locked away, while the upper-level government officials who actually ordered the hit made off scot-free, literally getting away with murder.

My fear, as we consider this subject beyond the IRS, is that our desire for justice and accountability can run so deep that, in our haste, we end up firing all the wrong people, generating scapegoats while the true evildoers slip out the back door.  That we settle on targets that are convenient rather than truly guilty.

In the case of Watergate—a genuine abuse of power to which some have ludicrously compared the IRS debacle in recent days—the burglars who got the whole mess started and the president who covered it up were, at long last, exposed and brought to account.  However, more than two years elapsed from the break-in to the resignation of President Richard Nixon.  The gears of justice grind slowly.

A further complication—as suggested by the IRS case—is that the labyrinthine, often disorganized nature of the executive branch that allowed Nixon to survive for so long can, conversely, allow wrongdoing to fester underfoot without the man at the top ever knowing about it at all.  Indeed, President Ronald Reagan managed to skirt culpability in the Iran-Contra caper by (falsely, but successfully) pleading ignorance about the whole business.

In short:  As we tend not to uncover the truth about governmental malfeasance until long after the fact, we should temper our natural desire for justice, in order that we do not inadvertently levy punishment upon those who do not actually require it.

Sometimes the head of an offending organization is responsible for the offenses that occurred therein; however, sometimes he is not.  Before we roam about waving pitchforks all willy-nilly, we ought first to establish which of these two possible scenarios is true and direct our indignation accordingly.

If we insist on being a bloodthirsty, unruly mob, let us at least be an intelligent and judicious one.

Dim Bulbs

The light in my bedroom blew out the other day, and so I went to the supermarket for a new one.  Perusing the fine print on the packaging of the newfangled, twirly energy-efficient bulb I plucked from the shelf, it was with an even mixture of awe and alarm that I considered the very real possibility that this light bulb could end up lasting longer than I will.

A slightly more high-profile bulb-related news item of recent days came in the form of a study that sought to measure the effect of one’s political ideology on one’s spending decisions.

In one test, each in a group of 210 subjects was given the choice of purchasing either an energy-efficient compact fluorescent light (CFL) bulb or an old-fashioned incandescent bulb.  The CFL bulb would cost them $1.50 while the incandescent one went for 50 cents; however, the CFL bulb would last about six times longer and yield enormous savings in energy costs along the way.
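The arithmetic behind the “economically rational” choice can be sketched quickly.  The purchase prices below come from the study as described; the wattages, bulb lifetimes, and electricity rate are illustrative assumptions of my own, not figures from the study:

```python
import math

KWH_PRICE = 0.12  # assumed electricity rate, in dollars per kWh


def lifetime_cost(bulb_price, watts, bulb_life_hours, total_hours=6000):
    """Total cost of keeping a socket lit for total_hours:
    replacement bulbs purchased plus electricity consumed."""
    bulbs_needed = math.ceil(total_hours / bulb_life_hours)
    energy_kwh = watts * total_hours / 1000
    return bulbs_needed * bulb_price + energy_kwh * KWH_PRICE


# Study's prices; assumed 14 W CFL lasting 6,000 hours
cfl = lifetime_cost(1.50, 14, 6000)
# Study's prices; assumed 60 W incandescent lasting 1,000 hours
inc = lifetime_cost(0.50, 60, 1000)

print(f"CFL: ${cfl:.2f}  Incandescent: ${inc:.2f}")
```

Under these assumptions, the CFL works out to roughly a quarter of the incandescent’s total cost over the same span of use, and nearly all of the gap comes from energy consumption rather than the bulbs themselves—the dollar saved at the register is swamped many times over at the meter.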

Among the test subjects who were simply given this information, researchers found that most people made the economically rational decision of selecting the CFL bulb, and that those who self-identified as liberal and conservative did so at roughly the same rate.

However, in a separate test in which a sticker reading “protect the environment” was slapped onto the CFL bulb—with all other factors held constant—self-identified conservatives were less likely than liberals to choose the energy-efficient bulb, opting for the incandescent bulb instead, despite the higher long-term cost.

Long story short:  A disproportionate number of conservatives avoided making an environmentally friendly purchasing decision precisely because it was environmentally friendly, even to their own monetary detriment.

The implication of these results—that one’s politics can lead one to behave in irrational, sometimes destructive ways—is handsomely illustrative of a great deal of what always seems to be amiss in Washington, D.C., and what has especially plagued the federal seat of government for much of the Obama era.

Senate Minority Leader Mitch McConnell famously intoned in an interview in the fall of 2010 that “the single most important thing we [Republicans] want to achieve is for President Obama to be a one-term president.”  Given how swimmingly that worked out, one cannot help but wonder if the party might have better served the country had it focused its energies elsewhere.

Let us deconstruct McConnell’s comment.  What would it mean to ensure the president—any president—is not re-elected?  Presumably, it would necessitate making him look corrupt, incompetent, or simply ineffective.

As the first two are fairly difficult to accomplish externally—they more or less require unforced errors on the president’s part—we are left with rendering the commander-in-chief impotent.  And how does one effect this, if not by standing squarely in the path of every last piece of legislation he proposes, if for no other reason than the fact that he was the one who proposed it?

It is the very definition of acting in bad faith, and to announce it as one’s primary intention in advance is to invite a like response from the opposing team, thereby poisoning the well before the bucket has even been lowered.

This is no way to run a country, and it is no way to conduct one’s daily life.

When Al Gore’s documentary An Inconvenient Truth landed in movie theaters in 2006, many within the green movement bemoaned the fact that the former vice president had made himself the unofficial spokesperson for the cause of sounding the alarm about climate change.  The fear was that by having a political figure at the forefront, this most dire of global problems would devolve into an asinine political catfight.

So it did, and the price we pay only grows with each passing year.  Even as a fresh report this week showed that 97 percent of global warming studies in the last two decades have concluded that climate change is real and that humans are culpable for it (among the studies that expressed an opinion, that is), recent polling has less than half of the American public sharing this view.

There might be multiple explanations as to why this is the case, but the way that so many right-wingers view the global warming question as purely a left-wing concern, and ipso facto something to be opposed (or denied) at all costs, is probably one of them.

We call these attitudes “knee jerk” reactions, because those who exhibit them are almost always jerks.

Legal Drunk Driving

The American government thinks its citizens drink too much.  The Irish government thinks its citizens don’t drink enough.

Last week the National Transportation Safety Board recommended that our 50 states united lower the maximum blood alcohol concentration at which Americans are legally allowed to operate a motor vehicle.  The current definition of “drunk driving” entails a BAC of 0.08 percent or above; the NTSB advocates reducing it to 0.05 percent.

In Ireland, meanwhile, the Kerry County Council recently approved a motion to allow certain people to be issued “permits” to drive home after drinking what is otherwise too much liquid courage to legally blunder behind the wheel of a car.

In examining these twin test cases, what we see is not only a difference in laws, but also a difference in values and priorities.  I dare say these two proud nations can learn a thing or two from each other.

To begin, we should duly note that the new legalized drunk driving policy in Kerry County could hardly be said to be the work of the entire Irish government.  Per contra, this “permit” idea is the brainchild of a single elected official seeking to address a specific concern.  Most other Irish politicians are rather amazed—and, in many cases, appalled—that such a conceit passed muster with the Kerry County Council.

The concern is over a chunk of elderly Irish folk living way out in the country, whose social lives depend upon driving into town for a chat and a pint.  Danny Healy-Rae, the councilor (and pub owner) who proposed the permit idea, frames it as a means of ensuring such people are not isolated from the rest of the world by deciding to stay put, from fear of being pulled over by the authorities on the drive home.

“These are not the ones causing accidents,” Healy-Rae explained, defending the initiative.  “What is the alternative for them where no public or other transport is available?  Staying at home lonely, staring at the four walls?”

Well, when you put it that way…

To be sure, the issue of old-age isolation that Healy-Rae addresses is a real one, and not just in Ireland (nor, for that matter, just among the elderly).  Healy-Rae’s opponents acknowledge as much, disagreeing only with his proposed solution.

The real cultural distinction between Ireland and the United States is that, in our country, discussions of this sort never see the light of day.

It is an old story that Ireland has a more casual attitude toward liquor than America, and many in the Emerald Isle complain that people such as Healy-Rae are unhelpfully perpetuating this image by suggesting that more or less any problem can be solved by drinking.

Officially, the United States has been running as far away from this kind of thinking as possible for the last several decades.  For us, imbibing can only ever be the problem, not the solution.  The NTSB’s new recommendation to tighten the definition of legal drunkenness is only the latest illustration.

With the Kerry County story in mind, here is what I wonder:  Would America not be better off if the regulation of alcohol consumption were more localized, rather than centrally dictated from Washington, D.C.?

(Per legislation, the setting of a BAC limit is, in fact, a state issue; however, those who set theirs higher than the federal recommendation are denied significant highway funding as a result.  This can hardly be considered state sovereignty.)

To wit:  In compact, densely-populated cities with preposterously narrow streets, on which the slightest driver error can yield the most disastrous consequences, it is entirely within reason to impose strict standards as to how many beers one can consume before one becomes a public safety hazard behind the wheel.

But out in the desert Southwest, with its wide-open spaces on which one can drive for miles without encountering another human being?  Should the folks there really face the same scrutiny vis-à-vis drinking and driving as the folks in Boston or New York City?

We have decided, after all, that the regulation of speed limits can be outsourced from the federal government to the states, understanding that people driving across Montana or Texas would never reach their destinations if they were subject to the lower limits that make perfect sense in states of smaller scale.

Why not treat boozing in a similar fashion?  While it is true that the effects of alcohol on one’s faculties are not dependent upon geography, why not leave it to local jurisdictions to determine the legal consequences of those effects, in the context of their particular milieu?

I’d drink to that.