The Right to Hate

I have no evidence that the Westboro Baptist Church is secretly a pro-gay rights organization masquerading as a gang of religious extremists in order to make anti-gay groups look ridiculous.

However, if such a cheeky cabal were formed, I suspect it wouldn’t look a heck of a lot different.

For the past many years, the Westboro Baptist Church has served two essential purposes in American public life.  First, to be arguably the most universally detested organization in our 50 states united.  And second, to ensure, beyond all doubt, that the First Amendment to the U.S. Constitution is as healthy and muscular now as ever it has been.

To review:  The WBC are the folks who shuttle from place to place wielding signs with such heart-warming messages as “God Hates Fags,” “God Hates America” and “Thank God For Dead Soldiers.”  Most of its members are related, either by blood or marriage, to its founder and patriarch, Fred Phelps, who died on March 19, at age 84.

The group is perhaps most notorious for its practice of picketing the funerals of U.S. soldiers, who, it claims, were killed as a consequence of America’s tolerance for homosexuality, among other things.  In 2010, this ritual led to a Supreme Court case, Snyder v. Phelps, in which the Court ruled in favor of the church, holding that protesting a funeral is a form of free expression protected by the First Amendment.

While the death of Fred Phelps does not necessarily mark the demise of the Westboro Baptist Church itself, it may well hasten the group’s fade from the public eye.  As such, we might entertain the notion of referring to the WBC in the past tense, if only for the catharsis of doing so.

On this subject, I have but one question:  On balance, has the Phelps family been good for America?

My answer:  Yes, but it’s complicated.

I say the WBC is the most hated organization in America—a fairly uncontroversial sentiment—but we might also say it has come by this distinction rather lazily, as far as generating mass hatred goes.

After all, what could be more of a “slam dunk” in the quest for amassing public scorn than to spit on the graves of fallen soldiers and to craft placards with the sort of radioactive language that leads even those who otherwise agree with you to recoil in disgust?

The WBC can be accused of being any number of things, but subtle is not one of them.

Quite to the contrary, they are cartoon characters—hysterical, childish, simplistic, ideologically absolutist to an extent previously not thought possible, and—surprise, surprise—completely convinced of their moral rightness on all fronts.

Indeed, the more time one spends reading the WBC’s various statements on matters of public import, the more one feels the weight of precious seconds of one’s life being irretrievably wasted away.

In other words, the WBC seems to incite the world’s rage and indignation for their own sake, as if it were all one big piece of performance art.  As such, the church can hardly be taken seriously in the first place.  To borrow a phrase:  Its antics are not worth dignifying with a response.

Yet we have done exactly that, be it through satire and counter-protests, or in the case of people like Albert Snyder, through lawsuits alleging the infliction of deep emotional distress.

And we cannot blame some folks for taking WBC at face value, since its views do not exactly come from nowhere.  In point of fact, the church’s basic beliefs about homosexuality are drawn directly from the Old Testament, and its musing that God kills Americans as punishment for homosexuality is an almost word-for-word plagiarism of Jerry Falwell’s infamous explanation for the attacks of September 11, 2001.

In any case, the church’s flagrant ridiculousness has proved exceedingly useful in reminding us that enforcement of the First Amendment can be a very nasty business, since the right to free expression must be extended even to those whose views no one else on planet Earth wishes to hear.

In this way, the Phelps family’s victory at the Supreme Court was a great relief, because it demonstrated that—at least in this case—our federal institutions still take the Bill of Rights seriously.  That our most sacred liberties apply even to those who probably don’t deserve them.  Yes, even organizations like the Westboro Baptist Church, which expresses nothing but scorn toward the very country in which these liberties are practiced.

For better and for worse, that is what America is all about.

Malaysian Mystery

It’s hard to ignore a story about a commercial airliner that went missing nearly three weeks ago and still has not conclusively been found.

But I’ve been trying my best to do exactly that.

It’s not that I’m incurious about how a large pile of metal carrying some 239 human beings could vanish into thin air.  With all the technology now at the world’s disposal, it seems downright impossible for a loaded airplane to disappear without a trace.

And yet I have opted to mostly tune out the ongoing saga of Malaysia Airlines Flight 370, keeping the coverage of its possible whereabouts at arm’s length.  The reason for this concerns the narrow but crucial difference between what we’d call a “search for answers” and a “search for meaning.”

To wit:  Since the Malaysia Airlines plane first lost radio contact with officials on the ground, it has become increasingly clear that we have no earthly idea what the hell happened.  America’s news networks have conjured one hypothetical scenario after another, but these seem to be based on a combination of shoddy information and pure speculation.  Actual, verifiable facts have come in fits and starts, leaving jittery would-be analysts to fill in the blanks.

Indeed, the media’s imagination in this story is commendable for its comprehensiveness:  It was terrorism!  No, it was suicide!  No, it was an asteroid!  No, it was a government conspiracy!  And so forth.

For most onlookers, the Malaysian jet disappearance serves as a good old-fashioned mystery, sparking the natural human tendency to gather clues and eventually arrive at a resolution.  In other words, a search for answers.  The who, the what, the where and the how.

What’s missing is the why—the element that can only be assessed once all the other questions have been answered, and the element without which everything else is worthless.  The why is what tells us what is so interesting about all these disparate facts in the first place, paving the way for a greater understanding about the world around us.  In other words, the search for meaning.

On the matter of the missing plane, I would very much be interested to find out whether Flight 370 had been taken captive by a hijacker, or instead had merely fallen victim to bad weather or faulty equipment.  I would be intrigued by the prospect of a government cover-up in the investigation of the plane’s trajectory and final resting place, and the ways the passengers’ family and friends have been left in the dark.

But here’s the thing:  Thus far, we have been presented limited, if any, evidence that any of the above is actually the case.  And we can’t interpret the facts until we know what the facts are.

For instance, if it turns out Flight 370 was hijacked by terrorists, we can have an international conversation about terrorism.

If it was a simple (or not-so-simple) mechanical malfunction, we can talk about mechanics.

If the Malaysian and/or Chinese governments have behaved improperly in the course of this ordeal, let us hold the culpable officials to account.

And if we never find out what really happened somewhere over the Indian Ocean on March 8, let us accept that not all mysteries can be solved and not every tragedy yields a take-home message.

In short:  Let us calm the heck down.

Admittedly, this is not an easy thing for us humans to do.  For mysteries big and small, we want answers and we want them now.  We want to know that everything happens for a reason, and that even the most horrific events can be redeemed, however slightly, by the dissemination of the Truth.  Nothing irritates us quite so much as a cold case, and the possibility that we may never find the answers we seek.

What makes this compulsion problematic is our national media, which prefers to report false information rather than no information at all, effectively jerking us around from one conspiracy theory to another for no good reason.

While America’s cable news networks may have jumped the shark permanently when it comes to hysterical coverage, we still have the option to ignore them.  To resist the urge to draw meaning where there is none, and to glean answers from sources who don’t even ask the right questions.

Between Two Ferns and a Hard Place

I must confess that I had never heard of “Between Two Ferns” until the moment when the leader of the free world asked Zach Galifianakis if he dislikes the annual White House turkey pardon because it means there’s one fewer bird for him to eat.

And so when that moment occurred last Tuesday, I had to double check to make sure I wasn’t dreaming.

Upon establishing that I was not—that the tall, thin man on the left was, in fact, the president of the United States and the heavyset furball on the right was, well, Zach Galifianakis—I took a moment to reflect that yet another frontier in the continued convergence of politics and pop culture had been crossed.

For those who are still not in the know:  “Between Two Ferns” is a recurring online spoof, viewable on Funny or Die, in which the aforementioned Hangover star mockingly interviews various celebrities with a mixture of detachment and passive-aggressive contempt.  This results in deliriously awkward encounters whose entertainment value is largely determined by how well the interview subjects play along.  According to Galifianakis, the segments are neither scripted nor rehearsed.

When this silly little program began six years ago, its creators probably did not foresee that a sitting American president would one day grace its highly austere set with his presence.  Now that one has—ostensibly as yet another attempt to pitch Obamacare to young people—we are left to muse on whether it was a good idea for him to do so.

Whatever Obama’s purpose and his particular comedic skill, is it not beneath the office of the presidency to engage in this sort of vulgar, lowbrow enterprise that doesn’t even take itself seriously?

What makes the question difficult to resolve is precisely the particular comedic skill of this particular commander-in-chief.  In point of fact, the Obama “Between Two Ferns” bit was actually, genuinely funny, and not just from the shock value of Obama’s mere presence.

As he has proved many times throughout his tenure, Obama has a gift for dry, deadpan wit that is unprecedented in the modern era of presidents.

Sure, men like George W. Bush and Bill Clinton were perfectly capable of cracking the odd joke while in office—speaking at a black tie fundraiser, say, or appearing with David Letterman or Jay Leno.

But could you imagine either of them being able to keep up with a comedic wild man like Zach Galifianakis?  I must say that I cannot, although I would sure love to see them try.

Obama is different.  In the “Ferns” sketch, for instance, when Galifianakis begins a rant about not owning a cell phone because “I don’t want you people looking at my texts,” the president interrupts, “Nobody’s interested in your texts.”  As with other exchanges during the interview—not least the business about the turkey—there is a cutting meanness to this retort that we are not accustomed to hearing from a political leader, even in jest.

Normally, politicians are allowed to be funny, but their humor is not allowed to have any real edge.  In the interests of avoiding controversy, our public officials tailor their sense of humor so it can be understood by the dumbest person in the room, and so that even the person looking to be outraged can find nothing about which to gripe.

The reason we should find this status quo objectionable is the low opinion of the public that it implies.  Because expressing a pedestrian sense of humor is just one more pathetic form of pandering.

By occasionally defying convention on this front—by appearing on “Between Two Ferns” in the knowledge that not everyone would “get” it—President Obama deserves a small degree of praise.  By deploying wit that is subtle and sophisticated, he shows a respect for his audience and the nerve to push the envelope regarding what a holder of his office can get away with.  He is not afraid to engage with American popular culture, and thus does not consider himself above or separate from it.

To our original question:  Does this undermine the dignity of the presidency itself?  I think it’s all a matter of context.  To be the president, as with any other position of leadership, is to know when to be serious and when to be silly, and there is little reason he cannot be both.

Bill O’Reilly has argued otherwise, positing on his Fox News program last week that “Lincoln would not have done” a program like “Between Two Ferns.”  To this, Alex Pareene of Salon unearthed priceless (and fairly conclusive) evidence to the contrary, in the form of primary documents from the 1860s showing that there was little that Honest Abe enjoyed more than a good old-fashioned fart joke.

Revenge of the Gays

I must confess that, in my capacity as a ranking member of the gay community, I did not expect to be called a “bully” by a member of the U.S. Congress at this early date.

Yet there was Michele Bachmann, the fourth-term representative from Minnesota, categorizing last month’s veto of Arizona Senate Bill 1062 as the result of coercion by a gay cabal against those with “sincerely-held religious beliefs.”

“The gay community thinks that they’ve so bullied the American people and they’ve so intimidated politicians that politicians fear them,” said Bachmann during a conservative talk radio program last week.  “And so they think that they get to dictate the agenda everywhere.”

The bill in question, you will recall, would have empowered any Arizona business owner to withhold services from anyone, if providing such a service would violate the dictates of the business owner’s faith.

Following a national uproar, the bill was ultimately vetoed by Arizona’s governor, Jan Brewer, who concluded, “Senate Bill 1062 does not address a specific and present concern related to religious liberty in Arizona.  I have not heard of one example in Arizona where a business owner’s religious liberty has been violated.”

Nonetheless, as Bachmann and company would have it, the death of Senate Bill 1062 came at the hands of some nefarious gay mafia that cares not one whit about the freedom of expression and seeks to suppress the right to practice one’s religion untrammeled, and to further the proverbial “gay agenda” in the process.

Never mind that Senate Bill 1062 was conceived and written specifically with same-sex relationships in mind—and along with them, the desire by some Christians to act as if such relationships don’t exist, or at least don’t deserve to be treated as legitimate.

Never mind that the final, fatal blows to the Arizona bill came almost exclusively from Republicans—John McCain, Mitt Romney and a handful of the bill’s original supporters in the State Senate, to name a few.

And never mind that, even without this bill, gay people in Arizona are subject to no legal protections whatever regarding employment.

Nope.  The true “victims” in this drama are not the gays being denied the right to be treated as equals.  Rather, they are the Christians being denied the right to treat others as inferiors.

This is not to say that the right to regard others as morally reprehensible is not real and not worth defending.  To the contrary, the First Amendment’s guarantee of free expression means exactly that.

The blogger Andrew Sullivan—himself a devout Catholic who is also gay—has written to great effect about the need to respect those who object to homosexuality on theological grounds, even while decrying the tendency by some to play the victim, as if the present-day acceptance of homosexuality is, itself, a form of persecution against those who think differently.

What I find most intriguing is the cultural role reversal implied by Bachmann’s and others’ use of the word “bully” with regard to gay rights activists.

Surely it cannot be lost on them—or perhaps it can?—that no single issue has been of more pressing concern to gay young people in recent years than being bullied—be it the outright physical abuse that has robbed innumerable high school students of life and limb, or the psychological torture that has led its targets to take their own lives or simply spend the balance of their adolescent years in abject misery and fear.  See the “It Gets Better Project” for examples.

With this reality in mind, for the gay community to then have the word “bully” deflected back at it strikes one as just the slightest bit insensitive and strange, and not a little ironic as well.

To be sure, gays are not the first minority group to face a charge that had long defined its tormentors.  To wit, the State of Israel is regularly accused of employing Nazi-like tactics against its Palestinian inhabitants, while African Americans are sometimes tarred as “reverse racists” for their support of affirmative action and like programs.

The real question, then, is whether these labels are deserved, and what it means for our culture if they are.

It is undeniably true that the gay rights movement has so successfully executed its “agenda” of achieving legal equality in America that it has become a real political and cultural force—a lobby as powerful as most others.

But do this group’s tactics constitute bullying or simple, good old political pressure?  I would argue the gay community has agitated no more aggressively or unfairly for its interests than the NRA or anti-abortion groups have for theirs.  That’s what lobbying is all about.

In the absence of any strong evidence to the contrary, I would suggest that any long-repressed group that so metamorphoses into a social success story not sweat the “oppressor” label too severely, and instead take it as the backhanded compliment that it is.

Can You Keep a Secret?

Sometimes everything works out exactly as it should.  At my folks’ house on Sunday, everything did.

You see, my dear old dad had a big birthday this week, so the rest of us threw him a surprise party over the weekend.  The idea was hatched sometime around Thanksgiving, plans were finalized in the first week of February, and my mom and I acted as principal planners and co-conspirators all along the way.

This being the first surprise party in which I played a significant role, it proved a novel and enlightening experience in the fine art of duplicity.

All told, somewhere north of 60 people were invited, then promptly commanded to zip their mouths shut.  We took care to meticulously choreograph the birthday boy’s schedule without him realizing it, while also devising a strategy for handling the many things that could go wrong.  Not to mention the dizzying work of ordering, preparing and hiding all the food and decorations, and then having only an hour or so to set everything up.

With all the moving parts that were involved, it was quite impressive that our diabolical plot went off without a hitch.  Nobody spilled the beans, and everyone arrived on time.  Even Mother Nature cooperated, for a change.  The guest of honor was surprised and delighted, and a good time was had by all.

As a group, we proved wholly up to the task of carrying on an open deception for an extended period of time and executing the final “reveal” with clockwork efficiency.

Indeed, so good were we at pulling off this playful con, I wonder if we didn’t miss our calling to work for the feds in Washington, D.C.

As everyone knows, whenever disgruntled American citizens are not condemning their government for being lazy, incompetent and generally feckless, they are accusing it of conducting secret, evil grand plots of near-superhuman ingenuity.

The Kennedy assassination.  The Moon landing.  The September 11 attacks.  President Obama’s birth.  Conspiracy theorists contend that none of these events occurred as the official record says.  Rather, they were somehow staged, altered or otherwise effected by elements of the American government for one nefarious purpose or other, and done in the utmost secrecy so that no one, to this day, has any smoking gun evidence to prove any of them.

While not all government-related conspiracy theories are created equal, and some have even proved correct—what else would you call Watergate?—there is an inherently low probability that any such plot is real, precisely because of how unlikely it would be for that many people to be entrusted to such a titanic secret, and then for all of them to keep quiet after all these years.

No, what actually happens is exactly what you would expect.  Whenever some governmental entity attempts to pull something over on the American people—particularly with a high number of agents involved—not all of the holes get plugged, and eventually, something or somebody cracks.  Watergate is a classic illustration, but so, too, is the ridiculous plan by the Christie administration and the Port Authority to inflict gridlock on the George Washington Bridge.  Sure, the truth of these schemes was kept under wraps for a certain amount of time.  But then one day, it wasn’t.

The thing about a surprise party is that the period of secret-keeping is finite:  You only need to clam up until the actual party occurs.  After that, you can relax and congratulate yourself on a job well done.  Moreover, revelers are given only so much advance notice, lowering the probability that someone’s guard will drop.

To wit:  It’s entirely possible for 60 people to stay tight-lipped for a month, as my family proved last weekend.  But what if we gave our guests a full six months’ or a year’s warning?  Would the surprise still have succeeded?  We certainly weren’t prepared to take that risk.

If we might reduce all of this to a general formula, it would be that the probability of a conspiracy remaining a secret is inversely proportional to the number of people involved, as well as to the amount of time elapsed since the conspiracy formally commenced.
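The formula above can be made concrete with a toy probability model.  In this sketch, the per-person, per-month leak probability and the time scales are pure assumptions for illustration, not data:

```python
# Toy model of the essay's "general formula": if each of N conspirators
# independently leaks with a small probability p in any given month, the
# chance the secret survives t months is (1 - p)**(N * t) -- which shrinks
# as either the head count or the elapsed time grows.
# All parameter values here are illustrative assumptions.

def prob_secret_kept(n_people: int, months: int, p_leak: float = 0.01) -> float:
    """Probability that nobody leaks over the whole period."""
    return (1.0 - p_leak) ** (n_people * months)

# 60 party guests keeping quiet for one month: better-than-even odds.
party = prob_secret_kept(60, 1)

# 60 insiders keeping a government plot quiet for ten years: vanishingly unlikely.
conspiracy = prob_secret_kept(60, 120)
```

Under these assumed numbers, the surprise party plausibly holds together, while the decade-long conspiracy is all but guaranteed to spring a leak.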

If this seems all too obvious, it is nonetheless an essential insight into why conspiracy theories at the highest levels of government tend to be so idiotic, and why they should be taken with multiple grains of salt.

In the long run, human beings in large numbers are just not that great at keeping secrets.  Sooner or later, somebody blows the whistle or sends an incriminating e-mail or tweet.  Whether by accident or by design, some people just can’t help themselves.

Not every conspiracy can be as top secret as a birthday party for your dad.

Scholastic Aptitude Torment

Upon hearing about the College Board’s forthcoming revamp of the Scholastic Aptitude Test—otherwise known as the SAT—I was all ready to let loose a vitriolic broadside against the standardized college preparatory exam and the ways it ruins the final two years of your high school career.  That is, when it doesn’t also ruin the rest of your life along with them.

But then I stumbled upon an op-ed piece in the New York Times, titled, “Save Us From the SAT,” whose author, Jennifer Finney Boylan, essentially makes all my best points for me (and more interestingly, at that).

“The SAT is a mind-numbing, stress-inducing ritual of torture,” Boylan writes, with only the mildest of exaggeration.  “The College Board can change the test all it likes, but no single exam, given on a single day, should determine anyone’s fate.  The fact that we have been using this test to perform exactly this function for generations now is a national scandal.”

Boylan goes on to recount her own horror story of taking the infernal test when she was a high school junior—a drama that has her coming this close to lodging a fatally low score as a result of accidentally skipping one line of bubbles on the answer sheet.  (She realized her mistake with precious seconds to spare.)

On this, actually, I can do Boylan one better:  The year my own graduating class was subjected to the SAT, the College Board’s grading machines threw a fit and wound up miscalculating thousands of students’ scores, some by several hundred points in either direction.

The College Board ultimately spotted, acknowledged and corrected its error, but it took five months to do so, and not before untold scores of college hopefuls (including several I knew personally) were put under the false impression that their dream school was suddenly out of reach, leading some to forgo even applying to institutions they otherwise should’ve had every reason to think would accept them.

And so you had the fortunes of multitudes of students greatly disrupted or utterly destroyed, all because of some inane automated clerical error.

To Boylan’s point:  Even when scored correctly, why should one test carry so much weight and be able to inflict so much academic carnage along the way?  And if it’s true that actual college admissions officials do not take SAT scores as seriously as students are made to think, then why are students made to think it?

In the memorable phrasing of Group Captain Lionel Mandrake in Stanley Kubrick’s Dr. Strangelove, “I would say, sir, that there was something dreadfully wrong somewhere.”

As outlined extensively by Todd Balf in the New York Times Magazine, the point of the SAT is to predict a high school student’s potential in college-level courses, and in a manner that can be fairly applied to every student in America.  The proposed changes, which are more dramatic than those of the most recent face-lift, in 2005, are a response to the College Board’s conclusion that the exam, in its current form, is neither fair nor a particularly accurate gauge of collegiate success.  The SAT disadvantages poor students, who cannot afford private tutors, and has been shown not to correlate with one’s college GPA nearly as strongly as one’s high school grades do.

In point of fact—as a friend tartly put it years ago—the SAT is little more than a test of one’s ability to take the SAT.

I know this to be true because I had a private tutor of my own, and I can infer with high certainty that those sessions inflated my final scores to levels I would not otherwise have attained.  Whether my “natural” SAT scores would have shut me out of my eventual school of choice, I have no idea.  That the mere fact of my parents’ relative wealth enhanced my academic and career prospects before I even picked up a pencil—well, the words “national scandal” could hardly be more germane.

To be sure, the fact that poor folks must often work exponentially harder than rich folks merely to keep pace in America is not solely a problem for the College Board.  It’s a problem for everyone, and has been since the dawn of the republic itself.

One of the finer moments of the new Mitt Romney documentary Mitt comes when the titular candidate acknowledges how his father, George, took a lifetime to reach the economic standing at which Mitt himself began.  It reminds us that any so-called “level playing field” in the United States is an aspiration, not a reality.

The proposed SAT tweaks—particularly the increased emphasis on logic and argument in justifying one’s answers—will probably leave us with a better product than the one we currently have.

But they are still no match for the most sensible and just of the College Board’s options, which is to just get rid of the bloody thing once and for all.

Pizza Pizzazz

Love brings people together.  But not as much as pizza.

There are several reasons why Ellen DeGeneres’ pizza delivery stunt at Sunday’s Academy Awards was such a big hit.

(If you missed it:  Halfway into the telecast, the host asked if anyone was hungry.  A few segments later, a pizza guy turned up and everyone in the front row chowed down.)

For starters, as widely noted, it playfully underlined the way that Hollywood stars effectively starve themselves in the days leading to big awards shows—and, in many cases, on every other day of the year as well.

Further, it served to puncture the Oscars’ infamously hoity-toity self-importance and provide a moment of cheeky irreverence of which the annual TV ritual is always so desperately in need.

And perhaps most noteworthy of all—thanks to how gamely most of the would-be targets of this bit played along—it brought Hollywood’s most glamorous kings and queens back down to Earth, allowing them a flash of authenticity that came across as endearingly, well, authentic.

Sure, the regal show business life of champagne and caviar may be every bit what it’s cracked up to be, and that old chestnut about how celebrities “are just like you and me” is mostly nonsense.  But when you’re locked in a theater for four hours and the rumbling of your stomach threatens to drown out the orchestra, nothing hits the spot quite like a greasy, gooey pile of tomato sauce and melted cheese.  Vanity be damned!

Such is the democratizing effect of America’s favorite food—a food that, according to a study released this week, is consumed by some 13 percent of Americans on any given day.

We are a country and a culture divided as severely as ever between haves and have-nots—a world in which the lives of the rich and the poor have less and less in common and seemingly exist in separate, alternate universes from each other.

But pretty much everybody loves pizza.  No amount of money or prestige can efface its bubbling, artery-clogging allure.

And why should it?  What is there about pizza to which one could possibly object?  It’s simple to make; it’s extremely cheap (although it can also be expensive, if you prefer); it’s available for takeout or delivery in every town in America; it comes in all shapes, sizes and varieties; it can be eaten elegantly in a fancy restaurant or scarfed on a street corner while you wait for the lights to change; and it can be served hot or cold, for breakfast, lunch or dinner.

Because pizza is so manifestly irresistible—so universally beloved and consumed—we become deeply suspicious of those in public view who seem resistant to its natural charms and, when pressed, try just a little too hard to blend in with the crowd.  Can you ever truly trust someone who has to pander on pizza?

Case in point:  Bill de Blasio.  In January, the newly inaugurated mayor of New York City was caught at Goodfellas Pizzeria in Staten Island eating a slice with a knife and fork—a mortal sin within the five boroughs.

Interrogated by reporters on his way out, de Blasio offered an explanation so labored and elaborate (and clearly untrue) that it only made things worse.  As New York Times columnist Maureen Dowd wrote at the time, de Blasio “sounded like a parody of the self-serious New York liberal, convinced he’s right about everything.”

The point isn’t the silverware.  People should be free to transfer their food from their plates to their mouths however they see fit.

The question is:  What kind of a weirdo manages to make eating pizza sound complicated?

In the name of all that is good and holy, if you have to B.S. your way through a pizza party, as though you’ve never attended one before—well, either you’re an alien or a liar.  And if a man is prepared to lie about pizza, what won’t he lie about?

Is this not how our minds work when it comes to our public officials?  We want to know that they are men and women “of the people,” and for whatever crimes they might commit while in office, we are prepared to forgive and forget so long as they prove themselves human and, therefore, relatable.

Back in 1998 during “Monicagate,” for instance, America confronted a troubling proposition about its commander-in-chief:  “If he’ll lie about sex, he’ll lie about anything.”  Yet the public ultimately gave Bill Clinton a pass, and today he is as popular as ever.  After all, the only reason Monica Lewinsky was in the Oval Office in the first place was to deliver a pizza.

Prejudice on Parade

What is it about being gay that is incompatible with being Irish?

Or perhaps I should say, what characteristic of those with Irish blood would prohibit them from being openly gay?

Surely it can’t be the affinity for whiskey and beer.  No gay bar worth its salt is complete without a full line of taps.  Indeed, a great many gay folks—particularly closeted ones—would not have the nerve to enter such an establishment without knocking back a half-dozen shots beforehand.

Nor could it be the Irish struggle against persecution, some of it exceedingly violent and much of it perpetrated by groups acting on religious precepts.  Discrimination through America’s immigration system and in the workplace?  Yup, gays know a thing or two about that.

Nor, for that matter, could it be the intense sense of pride the clovered community has accrued in persevering through such hatred and oppression, as it slowly earns both legal and cultural legitimacy from the rest of the world—pride that regularly manifests in the form of a parade.  Here, too, the homosexually-inclined can relate.

Indeed, on reflection, there would seem to be far more that binds the Irish community and the gay community together than sets them apart.

So we must ask:  Why is the former so adamant about shunning the latter?

I speak of the twin dramas playing out in Boston and New York City regarding those cities’ respective St. Patrick’s Day parades.  In each case, the municipality’s newly-installed chief executive has refused to march so long as the event’s organizers prohibit gay groups from joining in.

In Boston, the mayor’s personal parade boycott is, itself, a tradition of sorts.  The newly-retired Mayor Thomas Menino declined to march every year beginning in 1995, on the grounds that the Allied War Veterans Council, which sponsors the parade, excludes gay organizations of all sorts.

Menino’s successor, Martin Walsh—himself the son of Irish immigrants—was prepared to do the same, in keeping with his uncommonly pro-gay record as a state representative.  In recent days, Walsh has appeared to broker a compromise, whereby any gay organization will henceforth be permitted to participate in the parade, provided “they do not wear shirts or hold signs bearing the word ‘gay’.”  Negotiations are ongoing.

In Gotham, meanwhile, Bill de Blasio has become the city’s first mayor in some two decades to abstain from marching in its main St. Patrick’s Day parade—also a gay-free zone.  The scuttlebutt there concerns the minutiae of whether de Blasio should instruct other public officials to follow his lead or allow them to make their own decisions.  (Thus far, he has done the latter.)  In any event, there is no immediate possibility of the parade’s gay embargo being lifted.

As the two sides of this debate hash out the logistics of the forthcoming celebrations, I simply stand here and wonder:  Why does the debate exist at all?  Why do two minority groups that would seemingly make such natural allies instead find themselves engaged in a prolonged and bitter standoff?

Some Irish folk attempt to resolve this question by pointing skyward and to the Bible:  They say (or imply) that being Irish is really just an extension of being Christian, and that open homosexuality is an act of defiance against God’s design and therefore an affront to any expression of Christian (and particularly Catholic) identity.

Of course, the religious component of Irish identity is inescapable, not least owing to the seemingly eternal strife between Catholics and Protestants on, and near, the Emerald Isle itself.  For many Irish—in Ireland, America and everywhere else—one’s genealogy and one’s faith are one and the same.

On the other hand, roughly five percent of Irish-Americans are gay (if we assume sexuality is consistent across different ethnic groups), reminding us that any Irish organization discriminating against gays is necessarily discriminating against itself.

In any case, while one may conflate one’s ethnicity with one’s religion if one chooses, not all members of any such group do.  In pluralistic America, you are free to define and express your heritage however you see fit, and we Americans do exactly that.

Yes, many Irish-Americans are observant Catholics.  However, many others are not:  Instead, they are observant Protestants, observant Jews, or perhaps they are not observant at all.  Some of them—gasp!—might even be atheists.  And, again, some of them are gay.

Does this make them any less Irish?  Are the organizers of a St. Patrick’s Day parade prepared to shun every member of their tribe who does not conform to a rigid, pre-approved set of cultural characteristics?

I fear that they are, and that such an attitude serves no useful purpose—not for themselves, nor for anyone else.

Serving Size Matters

Suppose you’re the person responsible for designing the nutrition labels on the side of food packaging in the United States.  And suppose you can include only one category on the label—that is, the single category that would best enable the average consumer to decide whether or not to purchase a particular product.

What if you could choose three nutritional metrics, or five?  On what basis do you weigh one measure of a food’s nutritional composition against another?  On a piece of packaging with only so much space, how do you ascertain which stats are essential health information and which are disposable?

Luckily for us, we have the Food and Drug Administration to do that job, and this past week it did precisely that in uncommonly sweeping fashion.

For the first time in some two decades, the FDA has proposed tweaking both the appearance and substance of those black and white rectangles that tell us, more or less, what it is that we’re about to eat.

The most dramatic of these recommendations include the physical enlargement of a given food’s calorie count, the addition of a line for the item’s “added sugars” and, perhaps most reflective of the realities of contemporary American diets, the alteration (in some cases) of what constitutes one “serving” of the culinary trinket in question.

Of these three actions, the first underlines the fundamental importance of the calorie as a basic building block in the science of food; the second affirms the (relatively) newly-understood role of sugar in many Americans’ ill health.

And the third?  Well, it just goes to show that when it comes to shoveling grub, we gentle giants simply don’t know when to stop.

The question, it seems to me, is whether we treat our country’s well-established portion problem as inevitable and, in any case, whether (and how) to meaningfully correct course.

“Things like the size of a muffin have changed so dramatically,” said FDA commissioner Margaret A. Hamburg.  “It is important that the information on the nutrition fact labels reflect the realities in the world today.”

Well, yes and no.

Sure, the person has not yet been found who is not willing to down a full 20-ounce bottle of Dr. Pepper in the course of a single meal (or gulp), rather than capping it after eight ounces, as the bottle advises.  And yes, the half-cup of ice cream that has long constituted an official “serving” is hardly enough to reach the top of the cone.

Hell, plop a massive, steaming basin of linguini with marinara in front of me, shut the door and return an hour later, and you’ll find an empty pot and an extremely satisfied blogger slumped over blissfully in his chair.  Seven servings per package?  My ass.

So we eat more than the FDA has long presumed.  However, that doesn’t mean we should, and it doesn’t mean our gluttonous habits should be tacitly endorsed through a readjustment of the nutrition tables.

We are told that food labels are mandated to reflect how much Americans “actually eat.”  But if the amount of food Americans actually eat is the whole problem in the first place, what exactly do we expect to accomplish by re-calibrating the labels to bow to this “reality”?  Does it not amount to effectively moving the goal posts for the worse—of turning “too much” into “the correct amount”?  Might this not produce the opposite of its intended result—namely, that people will regard their already outsized portion sizes as a baseline and opt to supersize even more?

Yes, muffins have gotten bigger, as the FDA commissioner put it.  Why throw up our hands and shrug, “Well, that’s now the accepted size of a muffin”?  Instead, why not stand firm and say, “That muffin is too goddamned big and could easily be cut in half”?

Human beings do not need to consume as much as Americans do in order to go about their daily lives—as demonstrated by pretty much every other country on Earth.

This being the case, and so long as “official” serving sizes of most items are arbitrary, anyway, why not err on the side of scaling them down?  Isn’t the point of being a nanny state to nudge people to behave better, not to coddle them?

This week we happily learned that the obesity rate of children aged 2 to 5 fell 43 percent over the past decade.  Apart from anything else, this demonstrates that America’s fatness epidemic is not insurmountable.  Course corrections are possible, after all.

If we can nearly halve the prevalence of obesity in young kids in just ten years, imagine what we could accomplish for those of us old enough to glance at a nutrition label and understand what it says.