Eye of the Beholder

Can a piece of art ever exist entirely on its own, or is it always tethered to the context of its creation?

For instance, is it possible to listen to the Ring Cycle without remembering that Richard Wagner was an anti-Semitic prick whose music inspired the rise of Hitler?

Can one watch Manhattan—the story of a 42-year-old man’s love affair with a 17-year-old girl—and not be distracted and/or repulsed by the personal life of its writer, director and star, Woody Allen?

As a society, we’ve had a version of this argument many times before, trying to figure out how to separate the art from the artist, while also debating whether such a thing is even desirable in the first place.  (The answer to both:  “It depends.”)

Lately, however, this perennial question has assumed a racial dimension, compelling us to re-litigate it anew—this time with considerably higher stakes.

Here’s what happened.  Over at New York’s Whitney Museum of American Art, the curators of the institution’s 78th biennial—an exhibition of hundreds of contemporary works by dozens of artists—chose to include Open Casket, a semi-abstract painting that depicts the mutilated corpse of Emmett Till, the 14-year-old African-American boy who was tortured and lynched in Mississippi in 1955 for allegedly whistling at a white girl.  (The woman in question later admitted she made the whole thing up, but that’s another story.)

As a painting, Open Casket is arresting, with the oils so thickly layered that Till’s mangled face literally protrudes from the canvas, as if calling out to us from beyond the grave.  As a political statement, it fits comfortably into our uncomfortable era of police brutality and racial unease—a natural, even obvious, choice for any socially conscious art show in 2017.

There was just one little problem:  The creator of Open Casket is white.  Specifically, a Midwestern white woman living in Brooklyn named Dana Schutz.

Upon hearing that a Caucasian had dared to tackle Emmett Till as the subject for a painting, many patrons demanded the Whitney remove Open Casket from its walls, while condemning Schutz for attempting to profit off of black pain—a practice, they argued, that has defined—and defiled—white culture since before the founding of the republic, and should be stopped at all costs.  The message, in effect, was that white people should stick to their own history and allow black people to deal with theirs.

In response to this brouhaha, the Whitney defended its inclusion of Schutz’s work without directly addressing the race question, while Schutz herself issued a statement that read, in part, “I don’t know what it is like to be black in America.  But I do know what it is like to be a mother.  Emmett was Mamie Till’s only son.  I thought about the possibility of painting it only after listening to interviews with her.  In her sorrow and rage she wanted her son’s death not just to be her pain but America’s pain.”

In other words:  Far from being exploitative or opportunistic, Open Casket is meant as an act of compassion and empathy toward black America from an artist who views Emmett Till’s death as a tragedy for all Americans—not just black ones.

Of course, that is merely Dana Schutz’s own interpretation of her work, and if history teaches us anything, it’s that the meaning of a given cultural artifact is never limited to what its creator might have intended at the time.  The artist Hannah Black, one of Schutz’s critics, is quite right in observing, “[I]f black people are telling her that the painting has caused unnecessary hurt, she […] must accept the truth of this.”

The real question, then, is whether offensiveness—inadvertent or not—is enough to justify removing a piece of art from public view, as Black and others have advocated in this case.

If, like me, you believe the First Amendment is more or less absolute—that all forms of honest expression are inherently useful in a free society—then the question answers itself.  Short of inciting a riot (and possibly not even then), no art museum should be compelled to censor itself so as not to hurt the feelings of its most sensitive patrons, however justified those feelings might be.  Au contraire:  If a museum isn’t offending somebody—thereby sparking a fruitful conversation—it probably isn’t worth visiting in the first place.

Unfortunately, in the Age of Trump, the American left has decided the First Amendment is negotiable—that its guarantee of free speech can, and should, be suspended whenever the dignity of a vulnerable group is threatened.  That so-called “hate speech” is so inherently destructive—so wounding, so cruel—that it needn’t be protected by the Constitution at all.  As everyone knows, if there was one thing the Founding Fathers could not abide, it was controversy.

What is most disturbing about this liberal drift toward total political correctness is the creative slippery slope it has opened up—and the abnegation of all nuance and moral perspective that goes with it—of which the Whitney kerfuffle is but the latest example.

See, it’s one thing if Open Casket had been painted by David Duke—that is, if it had been an openly racist provocation by a callous, genocidal lunatic.  But it wasn’t:  It was painted by a mildly entitled white lady from Brooklyn who has a genuine concern for black suffering and wants more Americans to know what happened to Emmett Till.

And yet, in today’s liberal bubble factory, even that is considered too unseemly for public consumption and must be stamped out with all deliberate speed.  Here in 2017, the line of acceptable artistic practice has been moved so far downfield that an artist can only explore the meaning of life within his or her own racial, ethnic or socioeconomic group, because apparently it’s impossible, or at least counterproductive, to creatively empathize with anyone whose background differs from your own.

By this standard, Kathryn Bigelow should not have directed The Hurt Locker, since, as a woman, she could not possibly appreciate the experience of being a male combat soldier in Iraq.  Nor, for that matter, should Ang Lee have tackled Brokeback Mountain, because what on Earth does a straight Taiwanese man like him know about surreptitious homosexual relationships in the remote hills of Wyoming?  Likewise, light-skinned David Simon evidently had no business creating Treme or The Wire, while Bob Dylan should’ve steered clear of Hattie Carroll and Rubin Carter as characters in two of his most politically charged songs.

Undoubtedly there are some people who agree with all of the above, and would bar any non-minority from using minorities as raw material for his or her creative outlet (and vice versa).

However, if one insists on full-bore racial and ethnic purity when it comes to the arts, one must also reckon with its consequences—namely, the utter negation of most of the greatest art ever created by man (and woman).  As I hope those few recent examples illustrate, this whole theory that only the members of a particular group are qualified to tell the story of that group is a lie.  An attractive, romantic and sensible lie, to be sure—but a lie nonetheless.

The truth—for those with the nerve to face it—is that although America’s many “communities” are ultimately defined by the qualities that separate them from each other—certainly, no one would mistake the black experience for the Jewish experience, or the Chinese experience for the Puerto Rican experience—human nature itself remains remarkably consistent across all known cultural subgroups.  As such, even if an outsider to a particular sect cannot know what it is like to be of that group, the power of empathy is (or can be) strong enough to allow one to know—or at least estimate—how such a thing feels.

As a final example, consider Moonlight—the best movie of 2016, according to me and the Academy (in that order).  A coming-of-age saga told in three parts, Moonlight has been universally lauded as one of the great cinematic depictions of black life in America—and no wonder, since its director, Barry Jenkins, grew up in the same neighborhood as the film’s hero, Chiron, and is, himself, black.

Slightly less commented on—but no less noteworthy—is Moonlight’s masterful meditation on what it’s like to be gay—specifically, to be a gay, male teenager in an environment where heterosexuality and masculinity are one and the same, and where being different—i.e., soft-spoken, sensitive and unsure—can turn you into a marked man overnight, and the only way to save yourself is to pretend—for years on end—to be someone else.

Now, my own gay adolescence was nowhere near as traumatic as Chiron’s—it wasn’t traumatic at all, really—yet I found myself overwhelmed by the horrible verisimilitude of every detail of Chiron’s reckoning with his emerging self.  Here was a portrait of nascent homosexuality that felt more authentic than real life—something that cannot possibly be achieved in film unless the men on both sides of the camera have a deep and intimate understanding of the character they’re developing.

Well, guess what:  They didn’t.  For all the insights Moonlight possesses on this subject, neither Barry Jenkins, the director, nor a single one of the leading actors is gay.  While they may well have drawn from their own brushes with adversity to determine precisely who this young man is—while also receiving a major assist from the film’s (gay) screenwriter, Tarell Alvin McCraney—the finished product is essentially a bold leap of faith as to what the gay experience is actually like.

Jenkins and his actors had no reason—no right, according to some—to pull this off flawlessly, and yet they did.  How?  Could it be that the condition of being black in this country—of feeling perpetually ill at ease, guarded and slightly out of place in one’s cultural milieu—has a clear, if imprecise, parallel to the condition of being gay, such that a deep appreciation of one gives you a pretty darned good idea of the other?  And, by extension, that to be one form of human being is to be empowered to understand—or attempt to understand—the point of view of another?  And that this just might be a good thing after all?

The Battle of New York

Back in January, Ted Cruz floated a novel, but pointed, line of attack against Republican frontrunner Donald Trump:  The latter shouldn’t be his party’s standard bearer, Cruz argued, on the grounds that he represents “New York values.”

Now that it appears Trump will, in fact, be the GOP nominee and will likely square off against fellow New Yorker Hillary Clinton in the fall, we might as well take a moment to glance at Cruz’s diagnosis and say, “Well, so much for that.”

If things continue on their current trajectory—an admittedly dubious assumption—the 2016 election will not merely be a showcase for so-called New York values:  It will be an outright endorsement of and/or surrender to the same.

That may seem like an unlikely and counterintuitive conclusion to draw at this particular moment in history, but there you have it.  Donald Trump was born in Queens in 1946 and has never identified with any other metropolis, while Hillary Clinton moved her family to nearby Chappaqua in the fall of 1999 and has held court in and around there ever since.

For all intents and purposes—for better and for worse—a Trump-Clinton race would be a Subway Series for the soul of America, during which the very notion of “New York values” would be fairly up for grabs, demonstrating yet again that the five boroughs do not make up the Greatest City in the World by accident and that if you want to truly understand America, you can’t do much better than waking up in the city that never sleeps.

There’s certainly no great mystery as to why New York, of all places, has produced such a disproportionate stock of serious presidential contenders through the years.  (Since 1904, New Yorkers have run against each other in three different presidential elections.)  The city, forever and always, is such a crowded, competitive, high-stakes environment for anyone with high ambitions—be they political, financial or cultural—that it’s only natural for someone who finds any measure of success there to think he or she has the mettle to conquer the rest of the universe as well.

In this respect, Ted Cruz is absolutely right about Donald Trump embodying the city whence he came.  After all, what could be more singularly New York than buying up zillions of dollars of precious Manhattan real estate, slapping your name on every last inch of it, and then sitting in a room thinking, “You know, it’s about time that I really made something of my life”?

To be sure, not every inhabitant of this town harbors such an absurd, colossal level of self-regard—such a hunger to expand their brand and rule the world in every way they know how.  And even among those who do, few have such a comically inflated ego or speak in such horrifyingly crude, prejudicial tones.  For every arrogant blowhard like Trump or Michael Bloomberg, the city also produces such luminous national treasures as Lin-Manuel Miranda, whose seismic new musical Hamilton reflects the city at its most noble:  A beacon of opportunity, welcoming to immigrants, artists, thinkers and revolutionaries.

Indeed, New York City is nothing if not a million different things at once, attracting a million different types of people, each finding his or her own way in the world.  That’s the beauty and the madness of the place and the primary reason that folks from all over the world have been flocking there since the Dutch first landed in 1624.

If Trump represents one strand of what New York symbolizes, Hillary Clinton represents another strand entirely—a strand, oddly enough, that comes pretty close to the definition offered by Ted Cruz.

Said Cruz during a Republican debate, “Everyone understands that the values in New York City are socially liberal or pro-abortion or pro-gay marriage [and] focus around money and the media.”

Cruz was attacking Trump, not Clinton, but when it comes to the latter, I’d say Cruz was pretty much on the money.

While Hillary Clinton took a bit longer to defend the rights and dignity of gay people than many in her party would have liked, she is now in perfect harmony with the supermajority of New Yorkers on that issue.  Meanwhile, her support for abortion rights has been unwavering and unquestioned, as have her views on most other socially liberal causes.

As for the presence of “money and the media”:  You bet your sweet bippy.

Since her national debut as First Lady-in-waiting in 1992, Hillary has been as much of a media character as any other political figure.  Throughout the myriad phases of her public life, newspapers, TV shows and the interwebs have built her up every bit as much as they have torn her down.  As with Trump now, Clinton’s relationship with the press has always been mutually beneficial:  She gives them endless material; in return, they give her endless coverage and the occasional benefit of the doubt.

Then there’s the money, which is arguably the most essential component of Hillary’s candidacy and career.  At this moment, if there is anything that could feasibly lose her the nomination to Bernie Sanders, it’s her unnervingly close relationship to Wall Street and other financial giants in a year when most Democratic voters are prepared to burn the leaders of those institutions in effigy.  Clinton herself assures us that she is equally concerned about the outsize power of Big Money in American life and will make every effort to rectify this imbalance once in office.

The problem—as everyone now knows—is that Clinton has collected nearly $2 million from various big banks over the last several years.  Officially, these payments were mere “speaking fees.”  In the minds of millions of Democratic primary voters, they were a down payment.

Here is where the business culture of New York comes into play.  If you are the sort of well-connected, highly respected insider that both Clintons have become since moving to the Empire State, you would regard giving high-priced speeches to major companies as an obvious and uncontroversial part of your job (not to mention an easy and painless way to make a buck).

However, for someone outside of that uber-capitalist milieu, it looks awfully shady for a supposed big-bank antagonist to accept millions of dollars from big banks and then claim that the money will have no effect on how she treats those corporations as president.

I am reminded—unavoidably—of the moment in 2013 when John Oliver, pinch-hitting for Jon Stewart on The Daily Show, confronted Senator Kirsten Gillibrand about her own six-figure income from companies like Goldman Sachs and JPMorgan Chase.  “What I deeply want to know,” said Oliver, “is what do you have to do for that?  What is required of you for that money?”  That Gillibrand didn’t even attempt to answer Oliver’s query is, in a way, more damning than any explanation she might have given.

Need I mention which state Senator Gillibrand represents?

That Hillary Clinton apparently doesn’t understand how anyone could find fault with her particular financial arrangement is, itself, her biggest problem of all.  She has become so insulated in the universe of pay-for-play that she either a) doesn’t recognize open bribery when she sees it, or b) doesn’t think the voters are clever enough to recognize it themselves.  They say no one ever went broke underestimating the intelligence of the American people; I guess soon enough we’ll find out for sure.

In any case, this year it looks like it’s really gonna happen:  New York vs. New York, and the rest of the country will just have to deal with it.  No doubt those who share Ted Cruz’s worldview will find this situation intolerable.  As someone who lived in the New York metro area for 10 years and still visits from time to time, I consider this geographic quirk among the saving graces of this ridiculous campaign.

Donald Trump, if nominated, would be far and away the most inappropriate presidential candidate in my lifetime, for reasons I have outlined over and over again.  If elected, the damage he would inflict upon the United States is almost too horrific to contemplate.  However, taking all of that as a given and knowing that I would never abandon my country for such paltry reasons as those, I’d much prefer a pigheaded Republican president from New York to, say, someone like Ted Cruz.

There’s that old adage, “He’s an idiot, but he’s our idiot,” and that is my feeling about Trump.  If the GOP insists upon nominating a maniac for the highest office in the land, at least the maniac in question will have spent virtually his whole life marinating in one of the most vital, cosmopolitan, enlightened cities on planet Earth—and is damned proud of having done so.  I don’t see eye to eye with Trump about much, but the conviction that New York is the true capital of the United States—the city that most fully captures America in all of its glory, beauty and absurdity—well, that’s one value about which we are in total agreement, and that is slightly better than nothing.

The Reckoning, Part 2

In general, life is complicated.  So is politics.  And so, especially, is politics as it relates to race and class.

However, every so often a big public controversy erupts that would lead any honest person to wonder, “Is there anything here that cannot be explained by good old-fashioned racism?”

That question popped into my head multiple times during the new HBO drama Show Me a Hero, whose final two-hour segment aired this past Sunday.

This spellbinding series—the latest from David Simon, creator of The Wire—recounts the racial powder keg that exploded in the city of Yonkers, New York in the late 1980s—a socioeconomic showdown over desegregation and public housing that might well have stayed buried in the past were it not for its obvious parallels to events in the present.

Certainly, the circumstances that led the good people of Yonkers to very nearly lose their minds sprang from legitimate and complex concerns about the well-being of their neighborhoods.  But they were also—on the basis of this show, at least—born of the fact that a bunch of rich white people really, really didn’t want to live on the same block as a bunch of poor black people.

They insisted it wasn’t about race.  Of course it was about race.

Here’s the deal.  In 1985, a federal judge ordered Yonkers—a city of 190,000 immediately north of the Bronx—to build 200 units of low-income housing in and around its most affluent neighborhoods.  This was essentially a means of desegregating a community in which most of the white folks lived in the nice part of town while most of the black and Hispanic folks lived in slums.

If the city council failed to approve such a plan, the judge continued, then the city would be held in contempt and fined exorbitant sums of money until either a) the council came to its senses, or b) the city went bankrupt.

You’ll never guess what happened.

That’s right (spoiler alert!):  Egged on by their raucous, angry constituents, the Yonkers City Council voted to defy the court’s order to build public housing, thereby incurring daily penalties that soon totaled in the millions, resulting in the suspension of basic city services and the closing of several public institutions.  While the ensuing outrage ultimately forced the council’s holdouts to change their minds, the damage was done and the point was made.

In short:  The white residents of Yonkers were prepared to destroy their own city rather than have a handful of black people living nearby.

It’s almost not enough to call this racism.  It’s a psychosis that exists in a realm beyond racism—a pathology that has convinced itself that segregation is the natural order of the universe and must be defended at all costs.  And all based on the notion that one group of human beings is superior to all the others.

To be sure, there were other forces at work in this struggle.  The fourth-largest city in New York did not almost bring about its own demise solely because of abnormally high levels of white supremacy inside City Hall.  Allocating public housing in a big city is a messy and contentious business under any circumstances.  Not everyone is going to be treated fairly.

Indeed, the “official” argument against desegregation in Yonkers was economic:  If you move a bunch of lower-class families into an upper- and middle-class neighborhood, the overall desirability of that neighborhood will decline, and property values will slide right along with it.  If you’re a homeowner who plans to sell one day, of course you want to prevent a precipitous decline in your home’s value in whatever way you can.

But in watching Show Me a Hero, you cannot help but suspect that racism is always, finally, at the root of the problem.  That if people viewed each other as equal human beings, rather than as members of alien tribes, then most of the other conflicts would either cease to exist or become infinitely easier to resolve.

The most compelling evidence for this is the character of Mary Dorman, played with great subtlety by Catherine Keener.  As one such homeowner, Dorman begins as a vehement opponent of the low-income housing plan, publicly carping about property values and the like, while privately confiding to her husband, “These people, they don’t live the way we do.  They don’t want what we want.”

But then something unexpected happens:  She starts spending time with “these people” as a member of the transition committee—a group that essentially handpicks which families will get to move into the new townhouses—and she discovers that, lo and behold, poor black people do want what “we” want and do live the way “we” do, to the extent that their circumstances allow it.

Now, about those circumstances.

We take it as a statistical truth that poor neighborhoods in big cities are disproportionately non-white and contain disproportionately high levels of crime.  That’s to say nothing of how this affects incarceration rates and the chances of success in higher education and employment many years down the road.

The $64,000 question is:  Why might this be?  How did it happen that folks with darker skin are—by a huge margin—more likely to find themselves impoverished, unemployed or in jail?  Are black and Hispanic people inherently lazier and more violent than white people, or is there something more institutional at work?

Following many decades of study and a little bit of common sense, we find the answer staring us directly in the face.  While there are multiple layers, it can essentially be explained in two words:  housing discrimination.

As Ta-Nehisi Coates definitively showed in his devastating Atlantic cover story, “The Case for Reparations,” white people and the U.S. government spent a great deal of the 20th century actively preventing black people from ever owning a home—and, consequently, from accumulating real wealth.

Through the process of “redlining,” black house hunters were shut out of entire neighborhoods in most major U.S. cities, and in the places they were allowed to live, they could not obtain regular mortgages and had to depend on loans that were neither guaranteed nor honestly granted.  In an interview, Coates described this system as having combined “all the problems of renting with all the problems of buying and none of the rewards of either.”

In other words, housing segregation occurred by design, not by accident.  It had nothing to do with the personal behavior of the black folks who were being victimized, and everything to do with an effectively white supremacist government that made it very nearly impossible for African-Americans to achieve the American dream.

After nearly a century of this madness, to turn around and blame it all on black people who wear their pants too low is to display a spectacular historical ignorance that, in our culture, is more or less par for the course.

Indeed, here is a classic example of where basic knowledge of the past can yield intelligent decisions in the present and future.

Most critically, to know that housing segregation was a plot intended to keep black people out of polite society is to understand that desegregation is a national moral imperative—one small step in our collective reconciliation with America’s broken soul.

Once you grasp that our country’s appalling wealth gap is a direct consequence of that racist system and that narrowing the gap will improve the quality of life for everyone, then it becomes perfectly sensible to expand affluent neighborhoods to include residents who, in an equal society, would have gotten there anyway.

In the process, both groups will get to know each other on a one-to-one basis, which is the surest means, in any society, of reducing prejudice and fear.  It was no coincidence that support for same-sex marriage skyrocketed at the same time that gay people made themselves visible to straight people in record numbers, thereby implanting this crazy idea that we are all equally human.

Prejudice is a function of ignorance, which in turn is a function of physical separation among different groups of people.  Really, it’s all just a variation on fear of the unknown, and the way to eradicate that is to make the unknown known.

This doesn’t mean we’re not still going to hate each other from time to time.  It just makes it far more likely that we’ll hate each other for the right reasons—namely, for the content of our character, rather than the color of our skin.

The people of Yonkers learned this the hard way, but they learned it nonetheless.  While housing desegregation might not have solved all of that city’s problems, it nonetheless fostered a more open and integrated community in which a greater number of people had a fair shot at making a better life for themselves.

Call me naïve, but I consider that progress.

Prejudice on Parade

What is it about being gay that is incompatible with being Irish?

Or perhaps I should say, what characteristics of those with Irish blood would preclude one from being openly gay?

Surely it can’t be the affinity for whiskey and beer.  No gay bar worth its salt is complete without a full line of taps.  Indeed, a great many gay folks—particularly closeted ones—would not have the nerve to enter such an establishment without knocking back a half-dozen shots beforehand.

Nor could it be the Irish struggle against persecution, much of it exceedingly violent and typically carried out by groups acting on religious precepts.  Discrimination through America’s immigration system and in the workplace?  Yup, gays know a thing or two about that.

Nor, for that matter, could it be the intense sense of pride the clovered community has accrued in persevering through such hatred and oppression, as it slowly earns both legal and cultural legitimacy from the rest of the world—pride that regularly manifests in the form of a parade.  Here, too, the homosexually inclined can relate.

Indeed, on reflection, there would seem to be far more that binds the Irish community and the gay community together than sets them apart.

So we must ask:  Why is the former so adamant about shunning the latter?

I speak of the twin dramas playing out in Boston and New York City regarding those cities’ respective St. Patrick’s Day parades.  In each case, the municipality’s newly installed chief executive has refused to march so long as the event’s organizers prohibit gay groups from joining in.

In Boston, the mayor’s personal parade boycott is, itself, a tradition of sorts.  The newly retired Mayor Thomas Menino declined to march every year beginning in 1995, on the grounds that the Allied War Veterans Council, which sponsors the parade, excludes gay organizations of all sorts.

Menino’s successor, Martin Walsh—himself the son of Irish immigrants—was prepared to do the same, in keeping with his uncommonly pro-gay record as a state representative.  In recent days, however, Walsh appears to have brokered a compromise whereby any gay organization will henceforth be permitted to participate in the parade, provided “they do not wear shirts or hold signs bearing the word ‘gay’.”  Negotiations are ongoing.

In Gotham, meanwhile, Bill de Blasio has become the first mayor in some two decades to abstain from marching in the city’s main St. Patrick’s parade—also a gay-free zone.  The scuttlebutt there concerns whether de Blasio should instruct other public officials to follow his lead or allow them to make their own decisions.  (Thus far, he has done the latter.)  In any event, there is no immediate prospect of the parade’s gay embargo being lifted.

As the two sides of this debate hash out the logistics of the forthcoming celebrations, I simply stand here and wonder:  Why does the debate exist at all?  Why do two minority groups that would seemingly make such natural allies instead find themselves engaged in a prolonged and bitter standoff?

Some Irish folk attempt to resolve this question by pointing skyward and to the Bible:  They say (or imply) that being Irish is really just an extension of being Christian, and that open homosexuality is an act of defiance against God’s design and therefore an affront to any expression of Christian (and particularly Catholic) identity.

Of course, the religious component of Irish identity is inescapable, not least owing to the seemingly eternal strife between Catholics and Protestants on, and near, the Emerald Isle itself.  For many Irish—in Ireland, America and everywhere else—one’s genealogy and one’s faith are one and the same.

On the other hand, roughly five percent of Irish-Americans are gay (if we assume sexuality is consistent across different ethnic groups), reminding us that any Irish organization discriminating against gays is necessarily discriminating against itself.

In any case, while one may conflate one’s ethnicity with one’s religion if one chooses, not all members of any such group do.  In pluralistic America, you are free to define and express your heritage however you see fit, and we Americans do exactly that.

Yes, many Irish-Americans are observant Catholics.  However, many others are not:  Instead, they are observant Protestants, observant Jews, or perhaps they are not observant at all.  Some of them—gasp!—might even be atheists.  And, again, some of them are gay.

Does this make them any less Irish?  Are the organizers of a St. Patrick’s Day parade prepared to shun every member of their tribe who does not conform to a rigid, pre-approved set of cultural characteristics?

I fear that they are, and that such an attitude serves no useful purpose—not for themselves, nor for anyone else.

Cold Clichés

There may be no greater cliché than talking about the weather.

If there is, it’s complaining about the weather.

And if there’s an even more ubiquitous cliché than that—at least here in the frigid Northeast—it’s bitching about the evident lack of the global warming we were promised.

We’re all familiar with the script.  The calendar turns, the wind blows, the mercury drops, the snow falls and everyone from Washington, D.C., to the Canadian border shouts in unison, “What the hell is going on here?”

That is, except on weekends like the one we just had, in which temperatures rose a solid 10-15 degrees above normal and we were treated to a brief, tantalizing preview of spring.

During which, of course, everyone turned to the unseasonably sunny heavens and shouted, “What the hell is going on here?”

That’s the thing about clichés.  They require no thinking at all.  Indeed, it is in the absence of critical analysis and rational deduction that they fester and thrive.

It would not seem a terribly arduous undertaking to grasp that some days are warmer or colder than others, that winter equals snow and summer equals heat, and that whatever Mother Nature happens to deliver on one day in one neighborhood is not necessarily representative of the entirety of planet Earth.

And yet we talk about inclement weather as if it’s as mysterious as in the days preceding all modern meteorology and climate science, because, well, what else are we gonna talk about?  Rain and snow bring us together as a people—they are concepts we can all relate to, because they affect each of us in one way or another.

The point at which this becomes a real problem, however, is when this meteorological griping reaches epidemic levels and is shown to be in direct conflict with the actual truth of the matter, leading large groups of people to believe something that just ain’t so.  Such as the belief, in this case, that climate change isn’t real.

From a crucial recent story in the New York Times, titled “Freezing January for Easterners Was Not Felt Round the World,” we learn that for all the snow and cold spells the Acela Corridor has experienced, this winter isn’t even close to the Biblical anomaly most people assume it to be.

In my hometown of Boston, for instance, last month registered as the 29th coldest January in the past 95 years.  In New York City, it was the 23rd coldest in the same period.

Nationwide, according to the Times, the mean temperature last month was, in fact, below the historical average over the past century.  By one-tenth of one degree Fahrenheit.

Meanwhile, a far more germane (and alarming) statistic concerns the Earth as a whole, for which this January was the fourth-warmest—yes, warmest—on record.

While this might surprise those in the Eastern U.S., folks in the West likely feel the opposite, since states there have faced temperatures that really are extreme—that is, extremely warm.  Parts of California have suffered a crippling drought stretching back several years, while otherwise tundra-like Alaska has been outright balmy, with temperatures regularly besting those of locales several thousand miles to the south.

What’s really going on here—that was our original question, wasn’t it?—is explained in an equally crucial New York Times piece, “Freezing Out the Bigger Picture,” by science writer Justin Gillis, who tersely notes, “Scientists refer to global warming because it is about, well, the globe.  It is also about the long run.  It is really not about what happened yesterday in Poughkeepsie.”

As Gillis goes on to write, we amateur meteorologists tend to refer to “weather” and “climate” as if they are the same thing, which they most decidedly are not.  The effects and the consequences of climate change can only be properly assessed and appreciated by examining the Big Picture, which necessitates tempering the narcissism and ignorance that come with viewing your own local habitat as a representative sample for a few hundred years’ worth of observation and research.

Once you do that, you realize the term “global warming” has always been a misnomer, since the ecological mayhem to which we humans have subjected our home planet has taken far more complicated forms than merely making everything a little bit hotter.

Climate change, or whatever you wish to call it, is a problem of extreme conditions of every imaginable sort—not simply extreme heat or extreme cold.

As such, if we insist on carrying on about local weather patterns being not quite what we had in mind, let us cease acting as if they bear any immediate relevance to the broader trends of the wider world, lest we make ourselves look like complete idiots.

Snow Daze

When you hire the cavalry, you expect it to show up.

In the wintertime, that means the men who drive the snow plows and the anti-ice machines, so that the city does not grind to a complete halt and prevent everyone from going about their day.

When the cavalry doesn’t show up—on time or at all—that’s when you start asking questions.

This, in so many words, is the whole basis of government.  We agree, as a people, that we want our government to provide us with certain services upon which we can depend, and in return for these services the government sends us a bill at the end of the year.

Like any such transaction, it’s all based on trust:  I will pay X dollars provided that you deliver service Y.  However, if you don’t, then I won’t.  You can’t get a squarer deal than that.

However, when it comes to the government, there’s a hitch:  Even when the authorities fail to deliver the services in question, we still have to pay our taxes, giving us the distinct impression that we’ve been robbed.  And that’s where much of our collective distrust in government begins.

The trick, and the challenge, is determining when such anti-government gripes are warranted and when they are not.

The citizens of the City of New York experienced a slight but irritating abridgment of their right to properly plowed roads last month, when the city’s anti-snow brigades failed to clear certain sections of Manhattan’s Upper East Side in a timely fashion following a particularly nasty storm.

Some residents of those neighborhoods felt deliberately slighted—an instance of a pro-working-class mayor, Bill de Blasio, sticking it to the uber rich.  While such suspicions have hardly proved well-founded—de Blasio staunchly denied any political or socioeconomic motives for the city’s selective snow plowing—the broader point is perfectly legitimate.

After all, snow storms are an entirely predictable phenomenon in the Northeast.  Why shouldn’t New York be prepared for every possible contingency?  New Yorkers certainly pay their taxes on the assumption that it is.

A year ago this week, the city of Boston faced a more comprehensive and evenly distributed lack of snow removal in the wake of Winter Storm Nemo, which dumped more than two feet of white powder on New England’s largest city.  Although the actual precipitation from this nor’easter tapered off by Saturday afternoon, Boston city schools had to be closed Monday and Tuesday, because the roads still had not been adequately cleared.

We in the Town of Beans were right to ask, “What the hell happened?”

The apparent answer, in both Boston and New York, was a boring but calamitous combination of poor preparation and poor coordination.  The cavalry was there, but those responsible for calling it into action were completely out to lunch.  There was no good excuse, and officials hardly troubled themselves to offer one.

Then there’s the case of Atlanta, which early last week was crippled by one of the most spectacular traffic jams in modern times, following a seemingly tame snowfall of no more than three inches.

Initially, we callous Northerners found the whole episode hilarious—the perfect illustration of Southerners’ ineptitude when it comes to facing winter weather.

Upon further review, what it actually illustrated was the toxic convergence of poor management with poor voting decisions.

As explained in an enormously helpful primer by Rebecca Burns in Politico, the Atlanta metro area is an unwieldy labyrinth of dozens of local municipalities, each with its own set of executives making decisions independent of all the others.  More so than most other major U.S. hubs, the government of the greater Atlanta region is decentralized to the point of making nightmares like last week’s traffic jam more or less inevitable.

But that’s not the whole story.

In 2012, the people of Atlanta were given the opportunity to approve a “Transportation Special-Purpose Local-Option Sales Tax,” which would have allotted $7.2 billion over ten years for a series of projects that, at least in theory, would have greatly improved and streamlined the city’s complex web of public arteries, and may well have lowered the risk of large-scale breakdowns like the one that just occurred.

The people of Atlanta rejected the initiative by a nearly 2:1 margin, suspecting the city would not spend the money wisely.

However well-founded such fears might have been, it is equally fair to say that last week’s snowy traffic meltdown was not entirely the fault of government.  It was also, in part, a consequence of the public deciding that a less gridlock-prone highway system was not worth their precious tax dollars.

As they say, sometimes you get what you pay for.  Or in this case, what you choose not to pay for in the first place.