National Shame

Should national pride still be a thing?  Is it good for us to eternally sing America’s praises, or should we finally give it a rest?

For no particular reason, this question has risen to prominence in recent weeks, in the form of a handful of disparate events across our great (or not great) land.  We’ve covered this territory before, but it would appear the issue has not yet been resolved.

It began at the University of California, Irvine, where the student-led Legislative Council voted to ban the display of the American flag—or any other flag—from the lobby of its building.

In a lengthy resolution explaining its decision, the group noted that because the U.S. flag “has been flown in instances of colonialism and imperialism” and that “flags construct paradigms of conformity and set homogenized standards,” it is necessary to remove said flag in order to foster “a culturally inclusive space.”

In other words, the American flag is inherently nationalistic and exclusionary, representing America’s sins while pretending to extol its virtues.  As such, to display it is to tacitly condone the entirety of American history—slavery, genocide and all.  As far as the UC-Irvine Legislative Council is concerned, we’d be better off without it.

As it happens, the anti-flag resolution was vetoed by the student government’s Executive Cabinet two days later, following an uproar that led the school’s administration to condemn the original motion as “misguided” and “not endorsed or supported in any way by the campus leadership, the University of California, or the broader student body.”

So the controversy is over at Irvine, but it certainly isn’t over everywhere else.  And it shouldn’t be, because several of the Legislative Council’s assertions about the flag, and patriotism in general, were absolutely correct, and we might as well fess up to them.

Like it or not, national flags are symbols of a particular set of ideals that, by definition, do not necessarily encompass the values and experiences of every last individual.

Like it or not, the Stars and Stripes do represent the totality of the United States as a nation and an idea, dating back at least to 1776, if not 1607 or even 1492.

And like it or not, the story of America is an ugly one—a veritable horror show of racism, religious intolerance, ethnic hatred, extermination of Natives, subjugation of women and—as the Irvine group noted—the practice of imposing our way of life on foreign populations that did not ask for it.

Are we sure we’re proud of this?  Is it worth even implying that we are?

Sure, most people do not include any of the above when enumerating the reasons America is a great country.  When we talk (or sing) about being “proud to be an American,” we’re just thinking about the good things:  The First Amendment, free enterprise, due process, trial by jury, the Super Bowl, apple pie and so forth.

When it comes to the bad things that make America America, we compartmentalize and rationalize—two of our finest national traits—by insisting that while the United States has committed plenty of sins, they occurred a long time ago and we have learned our lesson and corrected course.  Done and done.

It is certainly appealing to think that the story of the United States is one of constant positive evolution—a breaking away from all our old habits into the actual beacon of liberty we have always claimed to be.  In this way, we regard our country like we do a child who does something wrong but then realizes his mistake and gradually becomes a better person.

The difference, however, is that children generally do not commit genocide against almost the entire Native American population, or systematically prevent all women and black people from voting.

It’s easy enough to be proud of America at its regal, idealistic best.  But it’s awfully hard to shrug away everything else without making yourself look like a damned fool or a mindless jingoist.

Don’t get me wrong:  I think it’s terrific and commendable that the United States has (mostly) abandoned its racist, sectarian past.  On matters of equality and civil rights, America has never been better.  That’s to say nothing of our superior technology, economy and armed forces.  We might not be the Greatest Country in the World in every category, but we’re well above average, and that’s something to be thankful for.

But as we are reminded every time a white police officer shoots an unarmed black civilian—or when our government unlawfully records our phone calls and e-mails, or systematically tortures prisoners—our country and our culture are not half as perfect as our constant displays of patriotism would suggest.  Many of our national successes are little more than the clearing of a very low bar.

While we have a right to be satisfied with clearing any bar—especially when so many other countries are content not to—we should more readily acknowledge our limitations and residual imperfections, and the fact that we’re not nearly as superior to the rest of the world as we think.

Perhaps this is what inspired the other recent micro controversy on this subject:  The debate at Lexington High School in Lexington, Massachusetts, about whether the theme of an upcoming school dance should be changed from “American Pride” to “National Pride,” so as to accommodate students whose families hail from other countries.

As with Irvine, Lexington’s irrepressible patriotism prevailed and the “American Pride” dance will go on as scheduled.  However, the very fact that there was a scuffle about it—in the town where the Revolutionary War began, no less—suggests that the notion of a more introspective and humble America is alive and well.

But we are left with the problem of national pride itself—regardless of which nation we’re talking about—and whether it should still exist.

I think the concept is silly and absurd, and that George Carlin was onto something in saying, “Pride should be reserved for something you achieve or attain on your own, not something that happens by accident of birth.”

In other words, even if America really were perfect, to be “proud” of it would imply that you, personally, had something to do with creating that perfection.  Since neither of those things is the case, all you’re really saying is that hundreds of years of trial-and-error living and governing by hundreds of millions of people has made America a really nice place to live, and you’re extremely happy that you happen to live here, too.

So why not just say that?  Why not be grateful for having the unbelievable luck of being born into a free, multicultural, pluralistic society and leave it at that?  Why get all uppity and arrogant about it, as if it’s necessary to assert something over and over again in order for it to be true?

To get a sense of how unappealing this sort of mindset can appear to outsiders, look no farther than Texas, where the Supreme Court is about to decide whether the Confederate flag can be stamped onto state license plates.  The case is being brought by the group Sons of Confederate Veterans, which claims that its free speech rights were violated when Texas refused to issue specialty license plates bearing the controversial Southern emblem.

As in past squabbles over whether the symbol should appear in public, the Sons of Confederate Veterans argues the flag represents “sacrifice, independence, and Southern heritage,” while opponents say it represents slavery and racism.

And of course, both sides are correct.  As an emblem of the Confederacy, it embodies a group of states that held millions of black people in bondage, right up until a terrible war put an end to it once and for all.

How is this any different from the American flag and what it represents?

We Northerners like to claim moral superiority on the grounds that we have always opposed slavery while so many Southerners seemingly still don’t.  On the other hand, groups like Sons of Confederate Veterans are adamant (however dubiously) that their continued “pride” has nothing to do with slavery, and meanwhile, although slavery had largely vanished outside the South well before 1865, Northern states profited greatly from the slave trade and have hardly been immune to racial tension ever since.

We shouldn’t let ourselves off the hook so easily.  It’s unseemly and it’s unwarranted.  There’s a reason that pride is one of the seven deadly sins, while shame is not.

A Frank Appraisal

I’d nearly forgotten how much I adore Barney Frank.

The Massachusetts lawmaker retired from Congress in January 2013 after 16 terms representing the state’s fourth House district.  He had kept relatively quiet in the two years since, but has suddenly been popping up in TV and radio interviews in conjunction with the release of his new memoir, Frank.

His reemergence into public life should function as a reminder of how unique, entertaining and indispensable he still is.

To many, Barney Frank may well be known simply as the co-author of the Dodd-Frank Wall Street Reform and Consumer Protection Act, which attempted to right the American economy amidst the Great Recession by dramatically shaking up the inner workings of the country’s regulatory agencies.

While Frank’s role as chair of the House Financial Services Committee will undoubtedly be a major component of his legacy as a public servant (for better or worse), his special place in my heart—and in the hearts of countless other government nerds—was secured through a lifetime of advocacy for causes and principles that precious few other congressmen have ever bothered to take seriously.

And—it must be said—for his being such a cranky, insufferable firewall against those who have stood in his way.

As a Massachusetts Democrat, Congressman Frank was, in some ways, completely predictable.  On matters of policy, he took an unambiguously liberal view on nearly every issue, from economics to foreign policy to climate change to abortion.

But it wasn’t just that he held clear political stances and stuck with them (rare as that is nowadays).  It’s that he defended his worldview with guns blazing, arguing for his side until his throat grew hoarse—often to the point of rudeness—never giving an inch and never entertaining any doubt that, in the end, he was right.

Specifically, Frank made himself a champion of two would-be lost causes:  Government and liberalism.  That is to say, on the former, he advocated not merely for his own particular government-led solutions to various national ills, but also for the notion that government should be in the business of helping people whenever it possibly can.  On the latter, he not only gave voice to left-wing ideas, but to liberalism itself as a noble means of seeing the world and running the country.

In short, he was (and still is) a big government Democrat and damned proud of it.

For any left-wing politician, this should go without saying.  But it doesn’t.

Unlike most Republicans in Washington, who fall all over each other to claim themselves as the most “conservative” person in the room, today’s Democrats do a fairly rotten job of sticking up for their own brand.  As Frank himself has disapprovingly observed, most Democrats attempt to have it both ways by championing government programs but then echoing the GOP mantra that government should be as small as humanly possible.

They do this out of fear—namely, fear that voters are too conservative to ever be sold liberalism as a governing philosophy.  They have effectively ceded the moral high ground that, in the Roosevelt and Johnson eras, liberalism so firmly held.

Instead, they have adopted non-ideological centrism as their M.O.—a tactical approach that, to be sure, helped to elect Bill Clinton and Barack Obama to four combined presidential terms, but which has also left the party vulnerable to the charge that it doesn’t believe in anything except winning elections.

Barney Frank had no truck with this lame political maneuvering, and instead took the gamble that he could convince people that his left-wing views were the right ones, not least by showing that he believed in them himself.

Indeed, when speaking on issues about which he was passionate, he was seemingly a man without fear.  Even when he knew his position was unpopular—and he certainly had a knack for skirting popularity—he went right ahead to make his opinion clear.  Morally speaking, he didn’t care if he was the only one stumping for this or that cause.  He was determined to say what he truly thought and shape America into what he dearly wanted it to be.

The results were mixed.  In his 32 years in Congress, Frank notched some glorious victories and some devastating defeats.  The real challenge—for him and for any intellectually honest public figure—was to emerge from a lifetime of political and ideological battles with his dignity intact.  On balance, he succeeded.

At this moment, it’s worth appreciating just how difficult it is for a lawmaker to remain true to his convictions while also logging some genuine legislative accomplishments along the way.  For most congressmen, it’s one or the other:  Either you hold firm to your principles and get nothing done—newly-minted presidential candidate Ted Cruz is a sterling example—or you bend and compromise, effecting laws that are not quite what you had in mind but are, under the circumstances, good enough.

In fact, Frank spent a great deal of his tenure bowing to certain political realities, acknowledging that politics is always a mixture of idealism and pragmatism and that intractable opposition cannot simply be wished away.  When push came to shove, he would opt to cut a deal with Republicans to get half of what he wanted, rather than obstinately sticking to his guns and ending up with nothing.

He tried hard not to make the perfect the enemy of the good, and it resulted in an awful lot of good.

The key, through all of it, was that Frank almost always came clean to his constituents as to why he acted as he did.  This would often require an explanation similar to the one I just gave—that politics is the art of the possible—and if the voters didn’t accept that, it was just too bad.

Frank has always prided himself on intellectual honesty, and on the basis of his collected public statements over the years, there may be nothing he despises so much as disingenuousness and hypocrisy—character traits that he still takes enormous joy in calling out.

To wit:  Before his surname became synonymous with financial reform, there was such a thing as the “Frank Rule,” which stated that a congressman who was secretly gay could be “outed” by others if said congressman publicly opposed gay rights and/or supported anti-gay legislation.  As Frank put it, “The right to privacy does not include the right to hypocrisy.”

In a way, the Frank Rule is where all the elements of Barney Frank’s awesomeness converge.  It demonstrates his searing disdain for double standards—the practice, in this case, of a lawmaker privately engaging in behavior that he publicly condemns.  It underlines Frank’s penchant for loudly and consistently condemning such conduct when it occurs.

In addition, it alludes to Frank’s outsized concern for ordinary people—especially members of minority groups—who are left vulnerable by unprincipled politicians who consider themselves to be above the law.

And, of course, it concerns the most important cause of Frank’s life and career:  Legal equality for gays.

Frank was America’s second openly gay congressman.  When he came out in 1987, the most pressing civil rights issue was amending the Immigration Act of 1965, which had classified homosexuals as “sexual deviants” who could be denied entry to the United States.  Same-sex marriage was scarcely an idea, let alone a reality.

While Frank has not been personally responsible for every civil rights victory in the quarter-century since, his fingerprints are everywhere, and his public oratory in defense of legal equality for gay people is as arresting and passionate as that of any public figure.  In an interview shortly after retiring, he cited the repeal of Don’t Ask, Don’t Tell, in which he played a part, as possibly the finest moment of his career.

And his concern for fellow gays is really just one component of his work to secure civil rights for all oppressed groups, itself motivated by his most zealously-held, and seemingly contradictory, belief:  That people should be left the hell alone by the government.

For all his true blue liberalism, Frank is a social libertarian of the first degree, defending the right of individuals to engage in any activity they want, provided that it doesn’t directly harm anyone else.  For him, this includes the right not just to marriage but also to gambling, to drug use, to prostitution and, fittingly, to free speech.  When the Westboro Baptist Church came under fire for its anti-gay demonstrations at the funerals of soldiers, Frank was one of only three congressmen to side with the church, arguing that even rank homophobia is not a sufficient cause to stifle free expression.

This is precisely the sort of nerve and political boldness of which Congress has been deprived since Frank departed its storied halls, and of which it could not possibly have enough.

We need more public servants like Barney Frank to defend the lost causes that will always need a champion.  For the time being, we can be thankful that, even in retirement, we still have Barney Frank himself to fill the role.

Here, There and Everywhere

From a new survey on sexuality in the United States, two conclusions can be drawn.

One:  San Francisco is still the gayest city in America.

And two:  Everywhere else is tied for second place.

OK, the latter is not precisely true.  But it’s pretty darned close, and it serves as a critical wake-up call for those who think they know how sexual orientation works and are mistaken.

It is often thought that our country’s demographics are segregated by geography—that different regions are populated by different types of people.  Sometimes this assumption is true.  However, here is an instance in which it could not be more false, and it is far past time for us to acknowledge it loud and clear.

The new study is from Gallup, which sought to measure the percentage of self-identified gay, lesbian, bisexual and transgender people in the 50 largest metropolitan areas in the United States.  To no one’s surprise, the region in and around San Francisco came in first, with 6.2 percent of its residents falling under the LGBT umbrella.  Portland, Ore., was second with 5.4 percent, followed by Austin, New Orleans and Seattle to round out the top five.  (My hometown of Boston was sixth.)

Meanwhile, the metro area of Birmingham, Ala., boasted the lowest proportion of publicly LGBT people, with 2.6 percent, followed by Pittsburgh, Memphis, San Jose and Raleigh, N.C.

Viewing the complete results of Gallup’s poll, one could conceivably devise any number of theories about America’s gay, bisexual and transgender population and how it is distributed from one coast to the other.

My own takeaway is as follows:  Gay people are everywhere, and in almost equal amounts.  Whereas black people are disproportionately concentrated in the Deep South, and Jews are most plentiful in New York and Southern Florida, sexual orientation does not discriminate based on geography.  A baby born today has the same probability of being gay—or straight or bisexual—no matter where in the United States he or she is born, and Gallup has just proved it.

At this point, you would be right to cast a skeptical eye on such a claim, since the numbers I have just quoted would seem to suggest the opposite.  If the Bay Area has nearly 2.5 times as many self-identified LGBT folks as Birmingham, shouldn’t we assume that sexuality is, in fact, a byproduct of one’s environment?

No, we shouldn’t, and the key is in the term “self-identified.”

You’ll note that Gallup here has made absolutely no attempt to calculate the actual number of gay people who live in different areas of the United States.  In fact, it would be nearly impossible to do this with any accuracy, since there are so many gays and lesbians who prefer to keep their sexual identity a secret—not least from poll takers, who are duty-bound to take respondents at their word.

So long as a significant proportion of the LGBT contingent remains in the closet—a group whose size, by definition, we can never know for sure—any answer to the question, “How many gay people are there?” will remain elusive.

Our best available option, then, is to take the limited information we have and engage in a bit of learned conjecture.

To wit:  Gallup informs us that, percentage-wise, the area around Portland, Ore., contains roughly twice as many openly LGBT people as Birmingham.  Now tell me something:  Knowing what we know about both places, is a closeted gay person living in Birmingham as likely to come out as one living in Portland?

(Hint:  That was mostly a rhetorical question.)

Indeed, why would any gay person in Birmingham come out if they could possibly avoid it?  The state of Alabama has certainly made the notion of living openly as a gay person as unappealing as possible.  Last month, for instance, when a federal judge ordered the state to issue marriage licenses to same-sex couples, the Alabama Supreme Court immediately countermanded that decision, leaving some probate judges so flustered that they stopped issuing marriage licenses to anybody.

That, just for starters, is the toxic atmosphere that a gay person faces in the Deep South, a region where public support for gay equality lags far behind the country as a whole—let alone a place like Portland, which, by contrast, has held its annual Pride Festival every year since 1994 and which elected an openly gay mayor in 2008.

So of course there are twice as many openly gay people in Portland as in Birmingham:  The latter gives its residents every reason in the world to remain in the closet, while the former provides an environment as safe and as welcoming as anywhere in the country.

As such, when we learn that 2.6 percent of Birmingham is openly gay, we can only wonder about the untold scores of Birminghamians who are in the closet and, for reasons of self-preservation, have no immediate plans to slip out—men and women who, had they grown up in a place like Portland, likely would have publicly embraced their true selves years ago.

Again, we have no meaningful way to ascertain precisely how many of these poor people there are, but I would be amazed if the number isn’t substantial.

My inkling is that there are just as many members of the LGBT community growing up in Dixie as there are in the Mid-Atlantic and the Pacific Coast.  That if you factored in every municipality’s down-low gays with its out-and-proud gays, the numbers would be roughly equal from one town to the next.

If you insist on more concrete evidence for this hypothesis, you need only look slightly deeper into Gallup’s own data.

Note, for instance, how No. 11 on Gallup’s list, Louisville, Ky., is 4.5 percent openly gay, while No. 43, Milwaukee, Wis., is 3.5 percent openly gay.  As it happens, the survey’s margin of error is +/- 1 percent.  That means that two-thirds of the entire sample—and, by extension, two-thirds of the country—is in a statistical tie on this metric.  And that’s before any of my fancy sociological theories come into play.

Long story short (too late?), there is no credible argument that being born in a certain place makes you more or less likely to be gay.  Period, full stop.

The reason this matters—the reason we must recognize that human sexuality knows no geographical boundaries—is that it serves to counter the idea—implicit in so much of our legislation and rhetoric—that homosexuality can somehow be contained, if not fully stamped out.

While it has been left to other, more authoritarian countries to attempt to literally eradicate would-be sexual deviants—namely, by making their bedroom activities punishable by death—American anti-gay lawmakers are similarly obsessed with the notion that gayness can be made to go away—in this case, by nudging it out of places where it isn’t welcome, such as Alabama, and into modern-day Sodoms and Gomorrahs like Boston and Seattle, which will just have to deal with the hellfire that will inevitably follow.

And this would be fine—an illustration of the wonders of federalism in a heterogeneous society—if homosexuality only existed in blue states, or if every gay person had the ability to pack up and move upon realizing who they really are.

But, alas, that’s just not how it works.

Gay people are everywhere, as are the bisexual and the transgender.  You can try as hard as you like to push them out of places like Alabama, but they will just keep on being born.  So all you’ve really done, then, is make your state a hotbed of hostility and ignorance toward a group of people who are never going away.  People who, sooner or later, may decide that being targeted and discriminated against for the crime of existing isn’t quite as much fun as it sounds, and will seek other accommodations.

It is a fundamental law of human nature that people will allow themselves to be unjustly victimized for only so long before insisting that their basic dignity be respected.  The police department of Ferguson, Mo., learned this the hard way with respect to black people.  Is it too much to ask that the residual injustices toward gays be resolved with a little less violence and drama?

We had better hope not.  We can’t all move to San Francisco.


It’s exactly as Alfred Hitchcock described it.

There’s a bomb under the table and it explodes.  That’s surprise.

There’s a bomb under the table and it doesn’t explode.  That’s suspense.

At the Boston Marathon on April 15, 2013, two bombs went off without warning, killing three and injuring 260.  That’s surprise.

In raw video released last week, we see the bombers getting into their positions.  We know where the bombs are, and we know that, sooner or later, they’re gonna go off.  We see throngs of Bostonians milling about, having a good time.  They have no earthly idea that something terrible is about to happen, but we do.

That’s suspense.

Included in the barrage of film footage that prosecutors presented to the Tsarnaev jury and the public beginning last Monday—a series of clips from before and after the explosions—is the surveillance video from outside Forum, the restaurant where the second bomb went off.

This segment—four and a half minutes long—is eerie both for what it shows and what it obscures.  We see Dzhokhar Tsarnaev approach and stop, carrying a backpack, eyeing the race that’s still in full swing.  We see him make a brief cell phone call (to his brother, apparently) and drop his backpack to the ground.  Suddenly, there’s a tremor and every head turns to the left—except for Dzhokhar, who, in a sudden hurry, shuffles away to the right.  About 15 seconds elapse, as everyone in view tries to register whatever just happened down the street.  Then, in the very last frame of the clip, we see a flash, and the screen goes black.

It’s become a cliché to hypothesize that Hitchcock would appreciate this or that film, or sequence, or some cinematic technique or other.  Indeed, for so long has Hitch been synonymous with the very concept of screen suspense that his presence is felt any time it is done well.

As it turns out, the Boston Marathon attack was one such instance.  We often say that art imitates life, but every now and again, it’s life that imitates art.  The surveillance footage of Boston’s Public Enemy No. 1 in his final moments as a private citizen, kept under wraps until now, could not have been creepier if the Master of Suspense had directed it himself.

Anyone with a keen sense of dramatic irony will appreciate the contrast, in those four-odd minutes, between the relaxed merriment of Boylston Street on Marathon Monday and the obscene, horrible carnage that was wrought by Tamerlan and Dzhokhar Tsarnaev on Patriots’ Day 2013—the manner in which that one city block, otherwise filled with so much cheer, was suddenly, silently occupied by the presence of evil.

The clip I’ve described, you’ll note, contains none of that carnage, and is not as viscerally sickening as any number of videos and photographs of the attack that have been freely available for the past two years.  Rather, it is sickening in a more indirect and unusual way:  It disturbs us because we find ourselves in a position of knowing a grave crime is about to be committed, yet we are powerless to stop it.  And so our access to it is purely an act of voyeurism—the most Hitchcockian of all sins.

Surely it is also a sin to speak of a tragic real-life event as if it were a movie.  Hitchcock himself was an entertainer above all else—“I enjoy playing the audience like a piano,” he once said—and “entertaining” is not exactly a word that leaps to mind regarding a terrorist attack that killed an eight-year-old boy (among others) and blew off the limbs of 16 survivors.  As with the September 11 attacks, being able to speak of it in such detached, unemotional terms is a luxury of those who weren’t actually there.

But the Marathon bombing was a public tragedy, in addition to being several hundred private ones, and we onlookers cannot hide our grisly fascination with the various forces that brought it about.

Besides, it is probable that someday a narrative film will be made from the ruins of the modern-day Boston Massacre, allowing us to consider it in cinematic terms for real.  It’s anyone’s guess how this hypothetical movie might approach the event it depicts.  It could focus—in the vein of Oliver Stone’s World Trade Center vis-à-vis 9/11—on one or several of the men and women who saved the day—the cops and medics who, amidst mass chaos and confusion, transported victims from the sidewalk to the hospital.  It could focus on the victims themselves and the effect of their horrific injuries on their lives and careers.

Or, as I would prefer, it could take the gamble that Paul Greengrass did in making United 93, which approached 9/11 by depicting it more or less as it actually occurred—specifically, by following the hijackers of United Airlines Flight 93 all the way through the day, from the moment they pray in their hotel rooms to the moment the plane crashes in the Pennsylvania woods following a struggle between the terrorists and some extremely brave passengers.

United 93 was premised on two risky assumptions.  First, that 9/11 was so inherently compelling that no additional drama was necessary.  And second, that even terrorists deserve to be treated as three-dimensional human beings, and can be presented as such without diminishing the wickedness of their actions and ideas.

As it happens, the Tsarnaevs claim to have attacked Boston for approximately the same reason Osama bin Laden and company attacked New York and the Pentagon:  As retribution for American involvement in the Middle East.

Maybe, then, it would be appropriate to depict the former like the latter:  As men who have firm religious convictions and follow through with them by murdering innocent people, which they justify by insisting their victims are not innocent at all.

To us, this is completely insane and not worthy of our attention.  Unfortunately, we don’t have a choice, because they’ve already gotten our attention and we can’t pretend their actions come from nothing.

They say the best revenge is to live well.  Similarly, the best way to combat evil is to do good.  Western culture is superior to radical Islamic culture not because we say it is, but because the former saves lives while the latter destroys lives.  This was certainly the case in Boston, where so many onlookers to the explosions found themselves running toward the fire instead of away from it, not allowing their fellow citizens to bleed to death on the street.

The bombers, meanwhile—with their supposed high ideals and dreams of martyrdom—didn’t even have the nerve to stay with their bombs.  They ran away.  As if being heartless and homicidal weren’t enough, they were also cowardly.

So if we ever turn the whole ordeal into a film, let’s keep it simple by sticking to the facts of the case—not because historical accuracy is paramount (it isn’t), but rather because the facts of the Marathon attack are more interesting than anything Hollywood could make up.  That’s how you beat the terrorists:  By exposing them as the worthless losers that they are.

In this instance, truth is more compelling—and more suspenseful—than fiction.

Hitchcock would approve.

E-Crimes and Misdemeanors

Here’s the thing about the Hillary Clinton e-mail story:  Clinton flagrantly broke a rule.  And if there’s one thing we know for sure about Hillary Clinton, it’s that she is an absolute stickler for the rules.

That is, except in her own case.

Let me take you back, if I may, to the Michigan presidential primary of 2008.  The state scheduled its vote for January 15—three weeks earlier than Democratic Party rules allowed.  As punishment, the Democratic National Committee stripped Michigan of its delegates to the Democratic Convention, rendering its primary meaningless and leading most candidates—including Barack Obama, John Edwards and Joe Biden—to remove their names from the ballot.  However, Clinton opted to remain on the ballot and, having no opponents, won the primary.

Smash cut to May 31, when party leaders held a meeting and decided to seat Michigan’s delegates after all, because, well—forgive and forget, right?

This meant, of course, that the DNC was retroactively changing the rules—to the benefit of Clinton and to the detriment of everyone else.  Obama, Edwards and Biden had withdrawn from Michigan on the understanding that the primary wouldn’t count and therefore wasn’t worth contesting.  Had they known the DNC would change its mind after the fact, they obviously would have campaigned differently.

Meanwhile, Clinton suddenly won a whole batch of free delegates for no reason except that she deliberately violated the spirit of DNC rules by staying on the Michigan ballot while the rest of her party turned its back.  When asked how, under these circumstances, the Michigan primary could possibly be considered fair, Clinton responded, “We all had a choice as to whether or not to participate in what was going to be a primary, and most people took their name off the ballot but I didn’t.”

You see, this is how Hillary Clinton thinks.  Following protocol is all well and good, but if there is any way around the rules—namely, one that can guarantee a leg up on the competition—you’d be foolhardy not to try it.  It’s a principle that any professional athlete would be well-acquainted with.

Which brings us back to the present, and this silly business about her e-mails.

As it turns out, for the entire time that Clinton was America’s secretary of state—and unlike every other secretary of state—she conducted all e-mail correspondence through a personal account, rather than an official government server.  What is more, not one of those thousands of messages was copied into the official record at the time it was sent.

The reason this is a problem is that—at the risk of sounding like a total prude—the latter of those things was, and is, against the rules.

In point of fact, there is such a thing as the Federal Records Act, which since 2009 has said, “Agencies that allow employees to send and receive official electronic mail messages using a system not operated by the agency must ensure that federal records sent or received on such systems are preserved in the appropriate agency recordkeeping system.”

We know that Clinton’s State Department did no such thing while Clinton was there, because when this issue was belatedly brought to the department’s attention and made public in recent days, Clinton’s aides suddenly went into overdrive to sift through some four years’ worth of files.

Which means one of two things is true:  Either Clinton (and everyone on her staff) was unaware of the requirement to archive all of her relevant e-mail messages, or she was aware of it but hoped that nobody else was.  Considering that Clinton is a lawyer and, by all accounts, one of Washington’s most legendary control freaks, one would do well to place one’s bets on the latter.

To be clear, there is absolutely no mystery as to why Clinton—or anyone—would want to bury all of her messages in a private account.  Given the choice of making all of your work-related e-mails public versus strategically deleting any messages that might make you look bad…well, we don’t even need to finish that thought, do we?

However, so far as the Federal Records Act is concerned, the secretary of state does not have a choice in the matter.  She is mandated to save everything for the historical record.  We can argue about whether this policy is good or bad, but we cannot deny that it is, in fact, the policy.

Nor can we deny the obvious advantage Clinton gave herself by flouting it for her entire term.  Because she has exclusive control over her personal e-mail account, she can cherry-pick all she wants without anyone (except maybe the NSA) knowing what she might be hiding.

When she tweeted last week, “I want the public to see my email,” it had all the intellectual honesty of President Obama’s responding to the Edward Snowden NSA leaks by saying, “I welcome this debate.”  If the president desired a discussion about government spying, he sure went to a lot of trouble to ensure that one never occurred.  And if Clinton truly wants her diplomatic correspondence made public, she has nonetheless exerted an awful lot of effort over the past six years to keep it private.

Who are you going to believe—Hillary, or your own eyes?

Having said all of the above, this is probably a good time to note—as so many Clinton supporters have—that shielding four years’ worth of e-mails from public view is not a capital offense.  It’s not a Watergate-level act of deception, it’s not indicative of a State Department rotting from the inside out, and it does not disqualify Clinton from running for president of the United States.

What it does is merely remind us that Hillary Clinton is an abnormally secretive and disingenuous person who will bend (if not break) the rules at every possible opportunity.  We can go ahead and elect her president, anyway—there have been dishonest people in the Oval Office before—but if we do, we should be under no illusions about what we would be getting.

Killing a Ghost

What if Tamerlan had survived?

What if the Boston Marathon bombing had occurred exactly as it did, but without the older of the two perpetrators getting mortally wounded during the would-be getaway?

What if both Tsarnaev brothers had been captured alive, instead of just one?

What if they were both on trial, either separately or together, giving us the chance to extract all the justice that could be got from their crimes, not just half of it?

It’s a shame, really, that Tamerlan Tsarnaev found himself on the losing end of a firefight with police and then run over by the getaway car driven by his brother, Dzhokhar.  As unpleasant as that must have been, it allowed him to effectively cheat the system.  He should be rotting in a prison cell, not six feet underground.  At least not until a jury has had its say.

While all “what if?” speculation regarding the Marathon attack is ultimately meaningless, it’s worth pondering in light of the actual trial that began last week—the one that charges Dzhokhar Tsarnaev with 30 counts of criminal conduct, 17 of which could lead to his being executed by the state.

The defense team’s grand strategy—at least on the basis of its opening statements—is to blame everything on Tamerlan, whom it will portray as the Marathon plot’s mastermind with an overpowering influence on his otherwise laid-back younger brother.

It’s a terribly convenient approach, insofar as Tamerlan will be in no position to refute it.  Truly, he can’t seem to catch a break:  First he was thrown under a car.  Now he’s being thrown under a bus.

The defense surprised many people on the first day of the trial by conceding that its client is, in fact, guilty of all charges.  The idea, according to experts, is that if Team Tsarnaev doesn’t contest Dzhokhar’s guilt, it can more easily persuade the jury not to sentence him to death.

It’s a fascinating legal gamble, but what makes it even more interesting is how much it depends on Tamerlan’s being dead.  Just imagine how different the trial would be if he weren’t.

Were both brothers equally alive, and thus equally at the mercy of the U.S. judicial system, the defense team would presumably not be attempting to exonerate one by scapegoating the other.  While it may well have painted Dzhokhar as a reluctant supporting player in any case, its portrait of Tamerlan would have been a touch more nuanced and sympathetic than the one currently on display.  How one might square that delicate circle is anyone’s guess, and it’s too bad we won’t ever find out.

But really, the more compelling counterfactual in this mess lies with the prosecution and how it would have presented a case against both brothers, rather than just one.

Let’s not kid ourselves:  The main reason Dzhokhar’s lawyers are insisting the Marathon attack was Tamerlan’s idea is that it’s true.

As all available evidence has made clear, it was Tamerlan who “self-radicalized” into an anti-Western Islamic extremist, while Dzhokhar kept his religious views so private that many of his close friends didn’t even know he was Muslim.  It was Tamerlan who retained a kinship with the former Soviet Union, where he grew up, and never fully assimilated into American culture, whereas Dzhokhar became a naturalized U.S. citizen in 2012.  It was Tamerlan who boasted a long history of violent behavior, including an alleged triple homicide in 2011, whereas Dzhokhar’s pre-Marathon record was clean.

While none of this excuses Dzhokhar’s eventual descent into terrorism and mass murder, it underlines the uncomfortable fact that one of these two losers was more morally repulsive than the other, and that the brother more deserving of a public accounting for his crimes is also the one who will never get it, because he’s already dead.

We have been left to deal—literally—with the lesser of two evils.

Which brings us—as it must—to the small matter of capital punishment.

Considering that the Justice Department has sought the death penalty for Dzhokhar, we can safely assume it would have done the same for Tamerlan, who by all appearances would have been a more urgent candidate for it.

But I wonder:  Might prosecutors have stopped with Tamerlan and settled on a life sentence for Dzhokhar, on the understanding that the former’s conduct was (slightly) more irredeemable than the latter’s?  And, even if both were on the hook for execution, wouldn’t the jury be far more likely to endorse it for Tamerlan than for Dzhokhar?

In buying the theory that Tamerlan was the ringleader while Dzhokhar was a loyal foot soldier, wouldn’t a jury—and, by extension, our whole society—be content with executing just one of them?  Wouldn’t that be a reasonable means of making a point about punishing evil without getting carried away—a way to balance justice with mercy?

But of course we don’t have that option, and that leads us to a discomforting possibility:  We are going to punish Dzhokhar as if he were his brother.  We are going to harness the righteous ire meant for two people and unleash it upon the only one who is available to receive it, because, dammit, somebody’s gotta die for what happened on April 15, 2013, and if it’s the less deserving of the two, that’s close enough.

In other words, we are in danger of blurring the line between justice and vengeance.  We are prepared to execute a criminal not so much because he deserves it as because it would make us feel better.

This is problematic, because it gets perilously close to turning Dzhokhar Tsarnaev into a martyr.  We would be killing him in retribution for his brother’s crimes, in addition to his own.  In effect, he would be dying for another person’s sins.  Are we sure that’s what we want him to be remembered for?

But then that’s always the risk you run when capital punishment enters the equation:  Lending pity to someone who deserves none.

Among the reasons that Tamerlan’s premature death is a shame is how it robbed us of a more honest reckoning with capital punishment itself.

Like Timothy McVeigh, the most infamous death row inmate in modern times, Tamerlan presents a real challenge to death penalty opponents such as myself:  He killed and maimed indiscriminately—in a setting containing a large number of families with young children, no less—and his guilt is absolutely beyond doubt.  Combine this with his long criminal record and his unambiguously adult status (he was 26; Dzhokhar was 19), and you have about as clear-cut a death row candidate as you’re ever likely to find.  Indeed, so long as the death penalty still exists at all, how could you possibly justify forgoing it here?

Having Tamerlan in the courtroom would have allowed us to fully hash out the Marathon tragedy once and for all.  To have all members of civilized society look evil in the face and arrive at some kind of catharsis.

Instead, we have to settle for the second-most reprehensible specimen to inflict himself on the people of greater Boston, and give him what he has coming.

Let us just make sure that that is all that we do.

Blood For Oil

Every now and again, a large bridge collapses somewhere in America.

It’s just a shame it doesn’t happen more often.

At least that’s the conclusion I drew after watching John Oliver’s report on infrastructure on last Sunday’s episode of Last Week Tonight.

As we all know, the highways and byways that physically hold our country together are in a fairly wretched state.  Our roads, bridges, tunnels and dams are slowly crumbling beneath our feet and we are doing precious little to stop it.

According to a 2013 report, for instance, roughly 11 percent of all U.S. bridges have been deemed “structurally deficient,” meaning they require “significant maintenance, rehabilitation or replacement.”  When asked if the 66,000 bridges in question were “unsafe,” former Transportation Secretary Ray LaHood rather unhelpfully answered, “I don’t wanna say they’re unsafe, but they’re dangerous.”

The primary source of federal infrastructure money—as Oliver went on to explain—is the U.S. Highway Trust Fund, which collects 18.3 cents for every gallon of fuel consumed by American cars.  It’s a fair enough trade:  If you drive on our country’s highways, then you should help pay for their upkeep.

The trouble is that 18.3 cents per gallon simply isn’t enough to cover all the work that needs to be done, and the Trust Fund is about to go bust.  The main reason is that the tax rate has not been raised in 22 years.  In effect, we are maintaining our federal structures based on the economy of 1993, when gas was just over a dollar per gallon and 18.3 cents was actually worth something.

What we should do—if we insist on linking the Trust Fund to fuel consumption—is allow the gas tax to rise automatically with inflation, so that its real value remains constant.  Many experts recommend this, and several states already take such an approach to state gas taxes.
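A minimal sketch of what indexing would do to the nominal rate—using an assumed average annual inflation figure for the sake of the arithmetic, not actual CPI data—might look like this:

```python
# Back-of-the-envelope sketch of inflation indexing.  The 2.3 percent
# average annual inflation rate below is an illustrative assumption,
# not an official CPI figure.
BASE_RATE_CENTS = 18.3  # federal gas tax, unchanged since 1993
AVG_INFLATION = 0.023   # assumed average annual inflation

def indexed_rate(base_cents: float, years: int, inflation: float) -> float:
    """Nominal rate needed after `years` to keep the tax's real value constant."""
    return base_cents * (1 + inflation) ** years

# After 22 years of compounding, the 1993 rate would need to be roughly:
print(round(indexed_rate(BASE_RATE_CENTS, 22, AVG_INFLATION), 1))  # → 30.2
```

In other words, under even modest inflation assumptions, the frozen 18.3-cent rate has lost well over a third of its real value since 1993.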

But we don’t do this, and there is absolutely no mystery as to why.

In point of fact, politicians are terrified of raising taxes of any sort, because they understand a central truth about the psychology of the American voter:  We refuse to pay for anything that doesn’t seem completely necessary at the exact moment that we pay for it.  And we are certainly not, in any case, going to make investments that might not eventually benefit us personally.

Indeed, we Americans are nothing if not selfish and short-sighted—two character traits that will prove our downfall here, as they have on so many other occasions.

To wit:  Pretty much everyone believes in highways that are reasonably well-paved and dams that don’t spontaneously burst open and drown the entire Pacific Coast.  Everyone understands that those things cost money and that only the government can handle it.

But then, when you follow the logic to its natural conclusion, suddenly everyone starts coming up with reasons why infrastructure isn’t such a pressing concern after all.  Or, alternatively, insisting that the government find a different way to fund it.  Namely, one that spends other people’s tax dollars without spending one’s own.

You can argue all you want that an increase in the federal gas tax will yield greater infrastructure capital, but to most people, all it means is a costlier drive back and forth to work, and who wants that?  Even if you succeed in drawing the line from higher gas prices to safer highways, how do we know that that money will benefit the roads that we drive and the bridges that we cross?  Why should folks in New England invest in the well-being of some crumbling interstate in the Midwest?

That’s why it will probably take a few more high-profile bridge collapses for the whole concept to really kick in.

As with most other forms of insurance, people do not like to put money down to protect against something that may not ever happen.  Better—and cheaper—to just hope that everything will work out for the best.  And if it doesn’t?  Well, that’s what grandchildren are for.

Alas, often the only time people truly care about disaster prevention—let alone routine maintenance and inspection—is when a disaster actually occurs, and the same principle applies to the country as a whole.  It’s why our “national conversations” about gun control only break out in the aftermath of some horrid school shooting, or why we ponder hurricane preparedness only once the citizens of New Orleans are waist-deep in toxic sludge.

And sometimes not even then.

We like to think that large-scale calamities, be they natural or manmade, only ever happen to other people.  If we assumed, instead, that they will eventually happen to us, perhaps we wouldn’t mind tossing a few extra bucks into the hat.  You know, just in case.

Unfortunately, not enough of us think like that, which means our elected representatives will have little choice but to follow our lead, kicking the can further down whatever’s left of the road.

America’s infrastructure will continue to get worse before it gets better, and a lot more people will needlessly be killed because we’ve decided, as a people, that it’s not worth paying more for gas in order to save them.

That’s what “national values” are all about, and we might consider improving upon them in the future.  One more bridge for us to cross.

Oscar Soapbox

Would it be considered a lost cause to complain about the mixing of politics and the Oscars?  Is it just too late in the game for us to do anything about it?

Probably.  But every losing issue needs somebody to argue it for the last time, and on this occasion, that person might as well be me.

From this year’s Academy Awards, broadcast a week ago Sunday, arguably the most admired moment came from Patricia Arquette, the winner of Best Supporting Actress, who devoted the final chunk of her acceptance speech to calling for equal pay for women.  “We have fought for everybody else’s equal rights,” said Arquette.  “It’s our time to have wage equality once and for all and equal rights for women in the United States of America.”  The remarks yielded howls of approval inside the Dolby Theatre and wide support on the interwebs in the hours and days thereafter.

Indeed, I can’t say I have any quarrel with the substance of Arquette’s remarks.  While I think the specific issue of wage equity is slightly more complicated than it appears—not every case is a matter of out-and-out discrimination by an employer—it’s just about impossible to dispute the principle of equal pay for equal work.

Here’s my question:  What does any of this have to do with the Oscars?

In theory, the Academy Awards are nothing more than the recognition of the film industry’s best work in a given year, as determined by members of the industry itself.  Acceptance speeches by the winners are meant to be exactly that:  A show of gratitude for having been singled out by one’s peers.  And—as has become the practice—an opportunity to thank everyone who helped get them there in the first place (which, as we know, tends to be everyone the honoree has ever met).

As such, Oscar speeches, at their best, are exercises in humility—ironic as that sounds, considering that the speakers are effectively being crowned kings and queens of the universe, or at least of American culture.

To that end, my own favorite moment from last Sunday was Eddie Redmayne winning Best Actor for his performance as Stephen Hawking in The Theory of Everything.  Although I thought Michael Keaton slightly more deserving of the honor for his work in Birdman, I sort of hoped Redmayne would win, anyway, because I figured (from his previous wins this year) that he would react exactly as he did:  By jumping up and down like a giddy schoolgirl, completely overwhelmed.

There’s a certain feigned modesty that many British actors have turned into a shtick, but with Redmayne—33 years old, with no major starring roles until now—you sense that the gratitude is real.  That he works hard and takes his job seriously, but never in a billion years expected to wind up on the Oscar stage, and knows precisely how lucky he is.  That in a Hollywood overstuffed with jerks and prima donnas, Redmayne is one of the good ones.

That’s what the Oscars are all about:  Giving a moment in the spotlight to stars whose very existence elevates show business to something pure, noble and joyous.

And joy, it must be said, was oddly hard to come by during the balance of the Oscar telecast.  We had Best Song winners Common and John Legend lamenting the continuing racial injustices in the American legal system (and elsewhere).  We had Dana Perry, producer of the documentary short Crisis Hotline: Veterans Press 1, invoking her son’s suicide in a plea for more public discussion of the subject.  We had Imitation Game screenwriter Graham Moore citing his own brush with suicide and begging today’s tortured young people not to give up hope.

Sheesh, what an unholy string of letdowns.

Surely, these are all deathly important issues that deserve a thorough public airing, as they all have in recent times—albeit some more visibly than others.

But is the Dolby Theatre on Oscar night really the proper setting for them?

Can’t the Oscars just be the whimsical, frivolous, bloated Hollywood orgy we all think we’re tuning in to on the last Sunday of every February—curled up, as we are, on the couch with a tub of microwave popcorn and a cosmo?

We deal with the discomforting horrors of real life at all other moments of the year.  Why can’t the Oscars, of all things, be a temporary respite?  Arguably the single central function of movies, after all, is escapism.  Shouldn’t the event that celebrates movies follow suit?

Movie stars can, and do, stake out public opinions on any issue that interests them.  But must they do so at the very moment when most of us would just as well not be reminded of the fraught and complicated real world to which we must return in the morning?

I know this is a line of reasoning with holes large enough to drive a tank through.  I know movies are not only about escape.  I know the Oscars represent the largest audience that any artist will ever have.  I know that the Academy is, itself, a highly political organization and that Oscar voting is subject to the same cynical political maneuvering as any presidential election.  I know that the gripes about sexism and racism are as germane to the film industry as to any other.

And I know that, barring a totalitarian freak-out by future Oscar producers, winners are going to continue to say whatever the hell they want when they get up on that stage, even if it means talking over that infernal orchestra and harshing the buzz of everyone at home.

There is no escape from facing the hard facts of life—not even at silly award shows, which you’d think would be immune to them.  Apparently they’re not.

So instead, we are left with the second-best option:  Awarding trophies only to artists intelligent enough to climb on their political soapboxes in an articulate and entertaining fashion, as (it must be said) nearly all of them did last week.

Or we could just give everything to Eddie Redmayne.