The Big Bad ‘Wolf’

Martin Scorsese’s The Wolf of Wall Street is an exhilarating acid trip into the mind of one of the most amoral characters one can imagine.  Jordan Belfort, as portrayed by Leonardo DiCaprio, is a stockbroker with a bottomless appetite for sex, drugs and cold hard cash, and there is no level of fraud and manipulation to which he is not prepared to stoop in order to procure them.

Because the movie is so bloody entertaining, many have suggested that it effectively endorses the reprehensible behavior it depicts.  That Scorsese is glorifying Belfort’s ravenous lifestyle as much as he is attacking it.  That The Wolf of Wall Street is a celebration of the greediest corners of contemporary American culture, not a condemnation thereof.

It doesn’t, he isn’t, and it’s not.

In point of fact, The Wolf of Wall Street is as devastatingly honest a portrait of the toxins of Wall Street gluttony as one could hope to find, and Scorsese ought to be lauded for how (deservedly) hard his film is on its wayward protagonist.

The charge of glorification, which has come even from some of the movie’s admirers, is easy enough to understand.  After all, what Belfort makes clear above all else—particularly through DiCaprio’s voice-over narration—is that his years of scamming, coke-snorting, pill-popping and hooker-grinding were an epoch of orgiastic glee, and he savored every minute of it.

The slick cars, the potent powders, the gorgeous women—he truly could not get enough.  What is more, not only does he not regret the dirty dealings that brought these pleasures about—he positively boasts about them, as if expecting a pat on the back and a gold watch for his sheer chutzpah.

He was not simply a kid in the candy store.  He was a kid who broke into the candy store, trashed it beyond repair and made off with all the jelly beans before the cops finally turned up.

Except that when the authorities dusted for fingerprints, Belfort’s were all over the place, and boy was there hell to pay.

You see, the point is not what Belfort got away with.  The point is what happened to him when his luck ran out.

Anyone who views The Wolf of Wall Street as a paean to unfettered greed has overlooked nearly the entire second half of the film, during which (spoiler alert!) Belfort’s shenanigans lead to the collapse of his marriage, a prolonged FBI investigation into his business practices and a prison term of some 22 months.

Like Henry Hill, the would-be hero of Scorsese’s GoodFellas, Belfort realizes his dream of untold riches only to have the rug pulled from beneath his feet, as the universe’s arc of justice finally catches up with him to deliver well-deserved retribution (albeit not nearly enough).

I submit that no reasonable person would view this film in its entirety and conclude that Belfort’s life is one worth emulating.  In an admittedly roundabout way, The Wolf of Wall Street demonstrates that carrying on such a morally decrepit existence, however ephemerally enjoyable, is ultimately not worth the trouble.

Of course the good times were a blast.  That’s what makes them the good times.  Do you think Belfort would’ve made the effort if it were all a drag?  Please.  The movie wouldn’t be credible any other way.

The Wolf of Wall Street succeeds for the same reason most anti-smoking campaigns fail.  It presents the whole story of Jordan Belfort—the good, the bad and the smarmy—and trusts its audience to conclude that it amounted to a wasted life.

Kid-targeted anti-drug ads tend not to work because they skip right to the nasty effects of dodgy substances without bothering to explore why one is driven to use them in the first place.  To suggest that drugs are 100 percent bad—that they carry no benefits whatever—is to risk losing one’s intended audience in the first round.  Even children know the difference between propaganda and truth.

Scorsese’s movie treats its viewers as adults.  It portrays Jordan Belfort’s illegal hanky panky as a grand old time because, for him, that’s exactly what it was.

That does not mean the movie condones everything, or anything, that he does.  If you watch The Wolf of Wall Street and come away with a net positive impression of its protagonist and his way of life, the problem is not with Scorsese.  The problem is with you.

As Utah Goes

I can’t say I saw that coming.

Over the past year or two, during which it has become more or less inevitable that gay marriage will eventually be the law of the land, I have wondered which of America’s 50 states will be the final holdout against the sanctioning of same-sex matrimony, and in what year that final domino will fall.

While Alabama and South Carolina are arguably the most surefire bets on this front—they were the last two states to remove miscegenation bans from their state constitutions—the state of Utah could easily have stood at or near the top of anyone’s list.

After all, Utah is unquestionably one of the most conservative members of the union.  The Mormon Church, which more or less runs the place, contributed millions in favor of California’s Proposition 8, which outlawed gay marriage in that state in 2008.  (The measure has since been overturned.)  Four years prior, Utah voters endorsed a similar ban for themselves by a 2:1 margin.  A 2011 survey found that only 27 percent of Utahns think gay marriage is a good idea.

And yet gay marriage is now legal in the Beehive State, following last Friday’s ruling by the U.S. District Court, which declared the 2004 ban unconstitutional on the grounds that “it denies the Plaintiffs their rights to due process and equal protection under the Fourteenth Amendment of the United States Constitution”—language borrowed from the U.S. Supreme Court decision in June that struck down Section 3 of the Defense of Marriage Act.

While the ruling will undoubtedly be subject to lengthy legal pushback, initial appeals have been rejected.  Thus, for the time being, gay marriage in Utah is here to stay.  Who’d a-thunk?

To be sure, this was a decision made in clear opposition to the will of the public and most of their elected representatives.  Were the marriage question left up to them, last Friday’s surprise would not have occurred and marriage would likely have remained a strictly heterosexual affair for decades to come.

And so we must revisit the ancient, yet indispensable, question:  Is this how things ought to be?  Is it right that, on a major issue like the right to marry, the opinion of a supermajority of the people can be overruled by a single judge?

Courts have ruled on same-sex marriage before—not least in Massachusetts, the commonwealth where it all began.  In the intervening decade, state supreme courts have issued similar rulings in California, Connecticut, Iowa, New Jersey and New Mexico.

The difference with the Utah case is twofold.  First, unlike the above, Utah was made to issue marriage licenses to same-sex couples by a federal court, rather than a state one.  As such, those who oppose gay marriage have suffered a perfect storm of all they detest.  In their view, the Utah decision marks not only an encroachment by the judiciary on what ought to be left to the legislature, but also an encroachment by the federal government on what ought to be left to the individual states.  What could possibly be more repulsive than that?

Perhaps even more noteworthy, however, is the aforementioned fact of Utah’s exceptional conservatism standing alongside the ostensible liberalism of this ruling.

While locales like Massachusetts and Connecticut also became gay-friendly through judicial means, those rulings were only slightly divergent from—if not in concert with—the sentiments of the public at the time.  In Utah, this was plainly not the case.

Accordingly, one can understand the annoyance of those who view this as a gross injustice and a perversion of the democratic process.  Why should Utah be saddled with a policy that most of its citizens heartily do not want?

But then I am reminded of the scene in Gus Van Sant’s Milk in which the movie’s hero, Harvey Milk, receives a phone call from a semi-closeted young man in Minnesota whose life has been made a living hell by his backward parents in his backward home town.  Milk’s advice for this kid is to get the hell out of Minnesota as fast as he possibly can.  “I can’t,” the boy responds, “I can’t walk, sir.”  And the camera pulls back to reveal the wheelchair in which he sits.

Gay people exist everywhere, you see, and not all of them have the desire or ability to relocate to areas of the country in which they are most welcome.

Sometimes the role of government is to protect minorities from majority rule, when the latter has been judged to be trampling upon the rights of the former.

Sometimes the principle of state sovereignty is not as important as the principle of ensuring equal protection for every individual living in a particular state, regardless of whether anyone else in that state thinks they deserve such protection in the first place.

Some things are not for us, the people, to decide.

An Atheist’s Christmas Dilemma

This being December, the Christian world has been duly saturated with references to, and reverence for, a certain omniscient, mystical father figure.  An immortal, bearded bloke who sees you when you’re sleeping and knows when you’re awake, and who has the power to reward the good among us and punish the bad.

It’s also the season of Santa Claus.

Some years back, the great conservative wit P.J. O’Rourke penned a short essay, “Why God is a Republican and Santa Claus is a Democrat.”  It pitted the Almighty’s strict, no-nonsense wrath against St. Nick’s jolly, gregarious do-goodness, and concluded with the punch line, “Santa Claus is preferable to God in every way but one:  There is no such thing as Santa Claus.”

The bit is amusing as far as it goes—at least for the politically inclined—yet it doesn’t quite work for an atheist such as myself, for reasons I probably don’t need to explain.

In point of fact, I have long wondered what, fundamentally, the difference between the two aforementioned magical entities is supposed to be.  Why is it preposterous to believe Santa is real but perfectly acceptable to believe the same about God?  Were you to describe both to someone with no previous knowledge of either, would he or she have any particular reason to affirm the existence of one over the other?

If we have decided, as a society, that it is reasonable to think a singular intelligent force is capable of simultaneously peering into the homes and the minds of every last person on Earth, what stops us from saying the same about a guy who does it one house at a time and with the aid of a flying sleigh?  If anything, isn’t the latter more plausible than the former?

In short:  If there is no Santa Claus, there is probably also no God.  The end.

These are my views on the subject.  My question is this:  Is Christmas the wrong time of year to express them?

Never mind the First Amendment, which will always fall on the side of him who won’t shut up.

Purely as a matter of courtesy and taste—of keeping with the holiday spirit, as it were—should non-believers scale back their public antipathy toward religion and God on what is perhaps the only day that most of America’s Christians take their church’s foundational story seriously?  Or is Jesus’s birthday the perfect occasion for a sincere questioning of his divinity?

Should we heretics remain always on the attack, or are the dawning of the winter solstice and the hanging of the mistletoe a sign that it’s time for us to lighten up for a change?

In the public square, the United States’ atheist community is rarely known for either discretion or tact.  During the holiday season, non-believers (and secular believers) are dismissed by much of the faithful as humorless party-poopers who spend each December storming the countryside, stomping on nativity scenes and kicking over Christmas trees everywhere they go—some literally, the rest figuratively.

Like all stereotypes, this one contains at least a few grains of truth.  Indeed, it is perhaps inevitable that those who cannot bring themselves to believe in miracles will cast an equally critical eye at the spectacle of a Macy’s Santa stand-in assuring wide-eyed children that come Christmas morning, all their dreams will come true.

I don’t particularly want to be the Grinch who brings misery unto all in Whoville by suggesting their beliefs and traditions are a sham.  I really don’t.

I want people to enjoy themselves during the holiday season, as I generally do myself.  I would feel dreadful if my antitheist nitpicking (or anyone else’s) somehow got in their way.

Atheists are not “all alike” any more than are Christians or Muslims or members of the NRA.  However, they are oftentimes taken as such, and so I want to make extra certain not to reinforce my clique’s most unattractive perceived characteristics by embodying them myself.

Besides, if I spend day and night complaining about all things Christmas, when will I ever find the time to sip eggnog and bake cookies?

It’s Not Me, It’s You

It looks like nearly everyone in America is against the use of cell phones on airplanes.

And that’s why it’s inevitable.

Last week, the Federal Communications Commission voted to continue its efforts to remove the nationwide prohibition on in-flight calls by airline passengers.  The ban has long been in effect for technical reasons—namely, to prevent a cell phone from jumbling the pilot’s communications with the control tower.  However, the FCC now assures us that recent technological advances have rendered such concerns “obsolete.”

(Should the FCC lift the ban, it would fall to the FAA and individual airlines to set their own rules.)

Accordingly, the phones-on-planes debate now hinges entirely on the question of courtesy:  Just because we can make phone calls during our flight, does that mean that we should?

A solid majority of the American public says no.  A recent Quinnipiac poll found 59 percent of respondents opposed the proposition of making in-flight calls permissible, with 30 percent in favor.  Similar surveys have yielded similar results.  Notably, even a majority of those aged 18-29—that is, the folks most passionately tethered to their technological toys—thinks yapping at 30,000 feet is a bad idea.

That being the case, let us examine precisely what we mean when we say we would like our airplane flights to be phone-free.

Most of all, we desperately wish to avoid a long plane ride beside a fellow passenger who doesn’t know when to shut up—the person who is completely indifferent to the comfort of others and comprehends no world beyond his own.  (See John Candy in Planes, Trains and Automobiles.)

We have all encountered such cretins at one point or another, be it on the bus, in a movie theater or across the table at Thanksgiving.  The last thing we want to do is encourage them.

The complication is as follows:  If these social pariahs are best characterized by their lack of self-awareness, how can we be so certain that we, ourselves, are not among them?

“Have you ever noticed that anyone who’s driving slower than you is an idiot?” George Carlin once asked.  “And anyone driving faster than you is a maniac?”

Ain’t it the truth?  All the trouble in the world—it’s never our own doing, because we’re perfect little angels.  It’s those wackaloons on the other side of the room who are mucking everything up and making our lives a living heck.

In point of fact, we aren’t against using cell phones on airplanes.  We’re against everyone else using cell phones on airplanes.

Rest assured that I feel your pain as keenly as anyone.  Like most of today’s young whippersnappers, I avoid old-fashioned phone conversations whenever humanly possible, and doubly so in public.

And when the regrettable moment arises in which I must conduct a verbal exchange through a machine welded to the side of my face, I make an honest attempt to be audible only to the person on the other end of my Android, rather than to everyone within a 200-yard radius.

Why can’t all of my fellow primates be so considerate and cognizant of their surroundings?  Why isn’t the rest of society as wonderful as I am?

But we must be straight with ourselves:  We want to maintain cell phone bans in confined spaces like airplanes to rein in our society’s most irritating citizens, but never to rein in ourselves, since it never occurs to us that we require a reining-in in the first place.

Yet the moment will inevitably come when you realize there’s a very, very important call you forgot to make before takeoff and, doggone it, it just can’t wait until landing!  What do you say to the flight attendant who reminds you that it’s against the rules?  “Don’t worry, I’ll be really quick”?  “I promise I’ll keep my voice down—I’m not one of those people”?  Are you certain about that, or should we take a poll?

Bear in mind, in other words, that the regulations you so earnestly endorse, in order to keep everyone else in line, will not be suspended in your own case.  That when you remove your neighbor’s privilege to yak his way from one end of the continent to the other, you are also removing your own.

Is our culture mature enough to willingly make that trade-off, to sacrifice our own inalienable right to chat for the sake of the greater, silent good?

I am not yet prepared to make that call.

Split Identities

Depending on the rumors one believes, it appears somewhere between possible and likely that Scott Brown, the onetime Massachusetts senator who lost his reelection bid to Elizabeth Warren in 2012, will attempt to re-claim his seat in the midterms next year.

From New Hampshire.

Yup.  The very same Scott Brown who in 2010 famously and improbably became the first Republican to represent the Bay State in Congress’s upper house in more than three decades—and who ceased doing so less than 12 months ago—has rather abruptly decided that the Granite State to Massachusetts’s north is his true home after all.

A state that, conveniently enough, is looking to field a challenger to its possibly vulnerable Democratic incumbent, Jeanne Shaheen, less than 12 months from now.  And a state whose reputation for political independence, unlike its reliably liberal neighbor’s, would give a Republican roughly even odds of prevailing in almost any electoral contest.

The basis of Brown’s home state switcheroo is that he and his wife own a house in New Hampshire, which indeed is legally sufficient for him to take the aforementioned political plunge.

Nonetheless, it is clear that the Brown family’s primary residence has long been in the Boston suburb of Wrentham (they own three additional pieces of Massachusetts real estate), and so a great many of his former constituents were surprised to learn of his emotional migration northward.

Should he proceed with a New Hampshire Senate run, Brown would effectively be behaving as a “carpetbagger”—the Reconstruction-era smear for those who change home states for purely opportunistic reasons.

Both the word and the concept have been in continuous use since their coinage a century and a half ago, and lately have been cast in the direction of Liz Cheney as well, the former vice president’s elder daughter, who is running for Senate in Wyoming despite having lived in Virginia for most of the last two decades.

The presumption, seemingly endorsed by everyone, is that such a practice—like countless other forms of cold political strategery—is intrinsically suspect and dishonorable, and that those who have succeeded in being elected in seemingly random locales have done so in spite of their pretensions of belonging, rather than because such overtures necessarily worked.

But this presumption is wrong, and I wish to rescue so-called carpetbaggery from its lowly reputation and position it somewhere within the realm of respectability.  For you see, dear reader, I have had considerable experience with conflicted hometown loyalty myself and understand how nebulous the notion of one’s “true” geographical center can be.

I was born in Massachusetts and lived there for eight years—just long enough, say, to acquire the soul of a bitter, tormented Red Sox fan—but then relocated to Westchester County, New York, where I remained for the duration of my adolescence and forged my strongest friendships, before ultimately returning to my state of birth upon entering college.  I still reside in Boston today, but periodically sojourn back to New York, where a sizeable chunk of my heart is firmly stowed.

Which place is my real home?  The answer is not entirely self-evident, and my geographical background is far simpler than that of countless people I know, some of whom have hopscotched from one end of the continent to the other and back, never pitching their tents in any one spot for long.

Indeed, in our highly mobile society, how many are left among us who, by choice or happenstance, have managed to stay put in the same place for their entire lives?

And more to the point, who cares if they have?

For all that distinguishes America’s many regions and geographical subcultures from each other, the United States is nonetheless fundamentally a single unit through which one is free and welcome to travel at one’s leisure throughout one’s life.

The supposed demand that one must “choose” one patch of it over the others, while necessary in a practical sense, is a highly overrated facet of our national character, and can lead to some highly unattractive (albeit sometimes amusing) jingoism in the process.

What, after all, does (or could) it mean to be a “true” New Yorker?  What set of characteristics—personal, cultural, political—might give someone a “Midwestern sensibility”?  Could someone be a “Californian at heart” without ever having actually lived in California?  To the subject at hand:  How do these questions shake out with respect to our representatives in Congress?

It seems to me that, whatever his political calculus might be, if Scott Brown opts to run in New Hampshire and the good folks there decide that he would make a fine ambassador of New Hampshire values—whatever those might be—then the subject is closed.

Carpetbagger or not, in politics, geographical identity is in the eye of the voter.

Wisdom From Sandy Hook

In four years of high school, no day was more memorable or enjoyable than the one in which someone called in a bomb threat.

As any number of my childhood comrades would affirm, there was something perversely exciting yet whimsical about being hauled off en masse from the classrooms to the football field for a few hours on a sunny spring afternoon while administrators combed the school building for anything that might go “boom!”

Maybe it was the perfect weather, or the opportunity to break in the field’s brand new artificial turf with a game of “Duck, duck, goose.” Maybe it was simply the prospect of having an extra day to study for a big physics test or to catch up on some long-neglected sleep. (One group of geniuses decided to pass around a joint under the bleachers. Took them nearly five minutes to get caught.)

Whatever the reason, the sudden and imminent prospect of the school blowing up really made our day.

Of course, these happy memories are predicated on the not-unimportant fact that (as you probably guessed) the school didn’t blow up after all. Following a thorough search, nobody found anything suspicious and the whole episode ended up being a ridiculous hoax, as nearly everyone on the field assumed it was from the start.

We got lucky.

Reflecting on the silliness now, on this very somber anniversary of the shooting at Sandy Hook Elementary School in Newtown, Connecticut, I can only shudder at how very unlucky we could have been, and in any number of ways.

At that time—namely, the mid-2000s—my high school had no security apparatus to speak of. Backpacks and lockers were rarely searched, nor were the identities of those entering the building at the morning bell subject to any particular scrutiny. We had no metal detectors and, until my senior year, no security cameras to keep us young’uns in line.

As with myriad other public (and private) spaces across these United States, if someone truly wished to plant an explosive or strap himself with firearms and unleash holy hell, precious little would have stood in his way.

In the end, whether any particular school—or church or federal building or marathon or whatever—falls victim to some vile atrocity mostly depends upon the bad luck of having a crazy person in one’s midst with the will to carry it out.

A tragedy like Sandy Hook occurs and people ask, “Why did it happen here?” Given the realities of living in a free and open society, one could just as well ask, “Why shouldn’t it?”

To a degree, the nature of American life guarantees that, sooner or later, something horrible is going to happen somewhere and there is little, if anything, we can do about it.

Now then.

In the national conversation about gun control, this inevitability factor is conventionally used as an argument for less regulation, not more. It goes like this: Since violent, homicidal people will always exist, and since (some) such people will always find a way to acquire the weaponry they need, any form of gun control is futile and counterproductive.  It would only weaken the defenses of the good guys.

Per contra, if there is any single lesson I have learned from the many gun-related debates of the past year, it is that inevitability is as much an argument in favor of gun control as it is the reverse, and that those on the pro-regulation side might want to cite it more often than they currently do.

I put it to you in the form of two questions.

First: Were federal and/or state laws to make it impossible for an individual to legally purchase a so-called “assault weapon”—that is, a gun capable of killing the maximal number of people in the minimal amount of time (and with the minimal amount of effort)—is it not reasonable to surmise that the total number of individuals who possess such trinkets would, over time, decrease?  Even if only by a little?

And second: With an assault weapons ban on the books, would not the total number of innocent people killed in mass shootings also go down?

If and when you conclude that an assault weapons ban (or something similar) would, in fact, reduce the total number of assault weapons in circulation across the United States, and that such an occurrence would result in fewer total people being killed or wounded by them—and, therefore, by firearms in general—then the onus is very much on Team Second Amendment to demonstrate why such legislation is nonetheless a bad idea.

Why, in other words, is the death (or likely death) of a maximal number of innocent children a necessary price to pay for this apparent freedom to own an instrument that is designed, among other things, to cause those very deaths?

Yes, there is evil in the world. Yes, there will always be villains who get a hold of deadly, illegal weapons. Violence committed upon innocent bystanders is inevitable.

However, to then say that we cannot and should not attempt to limit the extent of these inevitabilities is an absurd leap of logic that demands the highest and closest of scrutiny in the days and years ahead.

New Tricks

Alexander Payne’s new film Nebraska is ostensibly about a half-senile old coot who falls prey to a hokey marketing ploy that promises him a pile of riches he will never actually collect.

Yet the movie’s melancholy air is haunted throughout by the possibility that this man, Woody Grant, is as much the perpetrator of a playful con as he is the victim of one.

The story begins when Woody receives a letter claiming he has “won” a million bucks, and that he only needs to travel to company headquarters in Lincoln, Nebraska, to claim it.  The plot, such as it is, concerns the ridiculous journey that ensues, as Woody is chauffeured by his extremely reluctant son, David, from their hometown of Billings, Montana, to Lincoln, with an extended stopover in Hawthorne, the small (fictional) Nebraska town where Woody grew up and where much of his family and many old friends remain.

It becomes evident early on that the million-dollar letter is merely an excuse for a road trip to allow the audience and David to get to know Woody better.  As played by Bruce Dern, he proves a rather compelling study, inasmuch as we come to learn a great deal about him and his past, yet never quite reach a full understanding of what makes him tick.

Among the mysteries that arise is whether Woody is really as loopy as he appears, or whether he is putting on something of an act.

From the opening credits onward, Woody is shown incontrovertibly to be in a declining mental state.  We see this in the way he drifts in and out of conversations, through his apparent lapses in memory—a lifetime of drinking probably didn’t help—and, not least of all, through his stubborn determination to make the 900-mile trek from Billings to Lincoln by any means necessary—even if on foot.

However, this is not to say he has parted ways with his entire bag of marbles.  A teeth-finding mission along the railroad tracks shows that Woody retains a definite and cutting wit, and several barroom episodes demonstrate that he can carry a spirited argument as well as anyone.  In short:  If his default disposition is one of confusion, he can nonetheless summon perfect lucidity when it suits him.

And so I wonder:  Is he taking advantage of his family’s assumption of his senility by taking them for a ride, if only for his own amusement?  Is he more self-aware than he is letting on, and keeping up the act as a means of enjoying his twilight years as much as he possibly can?

In his final HBO stand-up special—aired just four months before his death—George Carlin mused about the small, often unacknowledged benefits of advanced age, which for him largely involved the mischief you can get away with at age 70 or 80 that you can’t at 30 or 40.  These included slipping out of boring social events by claiming to be “tired,” guilt-tripping young people into carrying your luggage and (to repeat ourselves) exporting your memory to your surrounding kinfolk.

“Don’t be afraid to get old,” said Carlin.  “It’s a great time of life.  You get to take advantage of people and you’re not responsible for anything.”

Carlin was being (mostly) facetious, but I must say I rather fancy the notion that old folks would feign, or exaggerate, the effects of old age just for the fun of it.  That subversive practical joking does not end the moment you become eligible for the senior discount and the early bird special.  After all, why should it?

We young people tend to treat old people in a coddling, patronizing manner.  We speak to them in that artificially high tone of voice otherwise reserved for infants.  We refer to their age as “young” rather than “old” (as in, “he’s 90 years young!”).  We perform tasks they didn’t ask us to perform, because we assume they couldn’t possibly manage on their own.

We do these things with the noblest of intentions, but I cannot help but picture a great proportion of our elders rolling their eyes, thinking, “I’m old; I ain’t dead.”

Where this is indeed the case, old folks cheekily capitalizing on the kindness of young folks seems like the perfect revenge.  If we, as a culture, have decided that being relieved of all personal responsibility is a reward for living a good, long life, then I suppose milking this perk for all it’s worth is another one.