Laughing at Evil

Tomorrow, May 1, marks two years since we got Osama bin Laden:  The night when a team of Navy SEALs descended on a fortress-like residence in Abbottabad, Pakistan, and fatally shot the mastermind of the September 11, 2001, attacks on New York and Washington, D.C.

On the occasion of this happy anniversary, coupled with our rather unhappy present dealings with an accused terrorist who recently wreaked havoc on the Boston Marathon, let us reflect upon the state of America’s enemies in today’s world and the manner in which we react to them.  It may well serve to lift our spirits.

One of the so-called “lessons of 9/11,” nearly a dozen years since that great national wound was inflicted, is that America overreacted in the aftermath.  While this overreaction took many forms—waging a nine-year war in Iraq arguably was one—perhaps the most essential is also the easiest for us to self-correct.

That is:  Our tendency to overestimate our enemies.

Let us review the record.

Exactly a year before the SEALs sent bin Laden to his watery grave, a gentleman named Faisal Shahzad attempted to blow up Manhattan’s Times Square, but his plan was foiled when, as Bill Maher quipped, “he locked himself out of his car bomb.”

A few months earlier, on Christmas Day aboard a Northwest Airlines flight from Amsterdam to Detroit, 23-year-old Umar Farouk Abdulmutallab failed to detonate explosives hidden in his undergarments, managing only to set fire to himself before being tackled by fellow passengers and taken into custody.

Then of course there was the original post-9/11 would-be airline bomber, Richard Reid, who failed to detonate explosives stuffed in his shoe because the fuse was too damp to ignite.

These are mere anecdotes—not a statistically significant representative sample—but we are nonetheless entitled to observe a common denominator:  These men were not geniuses.  In fact, we might go so far as to call them idiots.  Idiots with weapons of mass destruction, but idiots all the same.

In this respect, 9/11 was the exception rather than the rule—an attack that was well-financed, meticulously planned and executed with a 75 percent rate of success (a group of extraordinarily brave civilians in the skies above Pennsylvania prevented it from being a clean sweep).

In the Boston case, Tamerlan and Dzhokhar Tsarnaev did, of course, succeed in detonating a pair of homemade bombs and inflicting extreme suffering on scores of innocent people.

At the same time, their actions following the Marathon attack suggest a pair of amateurs essentially making things up as they went along, and doing so with extreme sloppiness.  Their supposed “plot” to blow up Times Square was apparently conceived on the spot and then abandoned when they realized there wasn’t enough gas in their Honda Civic to reach New York.  Although we cannot yet be certain, it is not terribly likely that Dzhokhar running over Tamerlan with the getaway car and later hiding in a boat bleeding half to death were part of the plan, either—that is, if a plan existed at all.

These are what America’s enemies look like today.  Yes, they are dangerous, and yes, they mean us harm.  But they are not supervillains, and we have no cause to flatter them by treating them as if they are.

Over the weekend I attended a screening of a documentary called Broadway Musicals: A Jewish Legacy, which includes an interview with Mel Brooks, who reflects that the impetus for his 1968 film (and 2001 musical adaptation) The Producers was the question, “How do you get even with Adolf Hitler?”  The answer, as far as Brooks was concerned, was by mocking him—by turning this most evil of human specimens into a buffoon, an object of ridicule rather than fear.

That is what our attitude toward today’s evildoers ought to be, if only for the maintenance of our own sanity.

Don’t forget:  The primary purpose of terrorism (insofar as people such as the Tsarnaevs could be called terrorists) is to instill fear—to make you uneasy about walking out the front door every morning.

That is why the right and proper response is simply to be not afraid.  To not let them get to you.  To laugh in their faces.  To regard them as the pathetic losers that they are.

This will not, by itself, prevent future acts of violence, and vigilance is always required in the free and open society that we so proudly inhabit.

But it will help to restore our sense of perspective about good and evil in the world around us.  Contrary to the claims of another Mel Brooks bad guy, good will always triumph in the end, because evil is dumb.

Librarian-in-Chief

The George W. Bush Presidential Library and Museum opened for business at the end of last week, in a dedication ceremony that included speeches by Bush’s three fellow living ex-presidents and the current president, all of whom were naturally on their best behavior.

In the broader media, this momentous event was used as an opportunity—as these things are—to revisit the presidency of our 43rd chief executive, and to reflect upon his so-called “legacy,” four years after his departure from Washington, D.C.

This national conversation about the meaning of President Bush and his new library encompassed an impressive array of concerns—from the presidency itself to the nature of presidential library funding to the striking magnanimity with which all of Bush’s fellow ex-presidents spoke of him in his hour of glory—but the most essential question remained largely, and regrettably, unexamined.

Why should the American president get his own library and museum in the first place?

President Bill Clinton, in his remarks, drew chuckles when he cheekily observed that the ceremony marked “the latest, grandest example of the eternal struggle of former presidents to rewrite history.”

If only that were all that is wrong with these institutions.  The bigger picture is far more troubling.

A presidential library and museum is, at its core, an exercise in vanity.  Regardless of the particular chief executive under consideration, the institution bearing his name will inevitably approach his presidency in a sympathetic, biased manner, underlining his successes and glossing over (if not outright ignoring) his flaws, faults and failures.  What possible incentive does such a place have to do otherwise?

Clinton’s museum, unable to credibly omit his 1998 impeachment completely, presents the Lewinsky affair strictly under the heading, “politics of persecution.”  The Ronald Reagan Library barely mentioned the Iran-Contra episode until a 2011 renovation—two decades after the building’s original dedication.  At the Richard Nixon Library, things got so bad vis-à-vis Watergate that Congress was compelled to intervene, passing legislation that transferred control of the museum’s contents to the National Archives.

This is what happens when you erect a monument to a human being—and what is worse, a human being who is still alive and personally involved in the monument’s creation.  It is history as propaganda, in its purest, baldest form.

What we should realize, however, is that even if presidential museums could be made to be objective about their subjects, they would still be a rotten idea.

What would it mean, we might ask, to have an institution dedicated to chronicling the life and times of George W. Bush in 2013?  What would a sober, objective set of exhibits in such a time and place look like?  How might it present the wars in Iraq and Afghanistan?  What conclusions might it draw?  How would it assess the Bush presidency with respect to those of Bush’s predecessors?

On the prospect of arriving at definitive answers to these queries, I humbly ask:  Isn’t it a bit soon?

There is an old story that when asked in 1972 to assess the impact of the French Revolution of 1789, Chinese Premier Zhou Enlai remarked that it was “too early to say.”  Although this exchange turned out to be a case of misunderstanding in translation (Zhou was actually referring to the Parisian unrest of 1968), the underlying point is nonetheless true.

To study history is to understand that it can only be viewed in the long term, and that the past is ever subject to change based on the present.  Any museum that presumes to present history so recent that it can barely be called history would be subject to constant revision and refinement, raising the question of whether such a place could rightly be called a museum.

What would remain is the individual who sat in the Oval Office during the period when the contested events occurred, whose life is only interesting in the context of those very events.  In that way, presidents’ long-term reputations are hostage to fortune, as dependent upon the actions of their successors as upon their own.

So we return to my earlier question:  Do these people deserve a museum at all?  Does the mere fact of having been president warrant immortalization of this sort?  And even if it did, might we consider resisting the urge all the same?

We might.

Peace of Cake

We had a birthday party over the weekend, to celebrate an adorable cousin of mine turning three.  He is currently in the midst of a Spider-Man phase—he assumed the famed web-slinger’s identity last Halloween—and so we were treated to a Spider-Man birthday cake, complete with Marvel figurines and an intricate web design woven into the icing.

How did it taste?  Well, it tasted like birthday cake.

Of course, there are many things to which one looks forward at such a festive event—the gathering of family and friends, the merriment, the opening of presents—but somehow it is always the cake that holds the special place in people’s hearts.  It is the one component of every party that can be counted upon, should all else fail.

It is curious that this should be true.  I cannot say that I have ever met a birthday cake I didn’t like, but by the same token, I equally cannot recall any particular one that stood out from the rest.  It is a dependable treat, but also dependably forgettable.

Perhaps this is merely an extension of the rule governing food and entertainment at well-attended casual social events:  Nothing fancy.  Something everyone will enjoy, or at least tolerate.

To the extent this is the case, it only serves to illustrate the singular mystery and allure in America of all things cake.

In the course of my life’s culinary adventures—hopefully in yours as well—I have experienced cakes of various sorts that were quite memorable, indeed—some of which inspired behavior of which I am not especially proud, yet which I don’t particularly regret.

There was that family-and-friends dinner at my brother’s fraternity house the night before his graduation, featuring all-you-can-eat cheesecake topped with fresh berries.  OK, it wasn’t technically all-you-can-eat, but, well, some people hadn’t touched theirs and I, having earlier taken advantage of the all-you-can-drink whiskey bar, hated to see these delectable stray slices just lying around, and before I knew what was happening my waistline had expanded halfway out the door.

But then that illustrates a fundamental principle of mathematics, which states that the concentration of sugar and saturated fat in a given dessert is inversely proportional to one’s ability to resist eating absurd amounts of it.

Cheesecake is a particularly salient example, being composed of nothing but butter and cream cheese, as is another favorite of mine, carrot cake, with its melt-in-your-mouth consistency that deceives you into thinking the first five or six slices don’t really count.

You will recall the scene in Matilda in which the sadistic school principal, Miss Trunchbull, punishes the poor fat kid who stole her dessert by forcing him—in front of the whole student body—to consume an entire chocolate cake.

It is a singular form of torture, turning a boy’s weaknesses against him by blowing them all out of proportion.  As we all know from experience, there is nothing quite so unpleasant as becoming violently ill from partaking in one’s guilty pleasures just a little bit too much.

Nonetheless, in re-watching Matilda recently, I promptly added to my mental bucket list, “Consume an entire cake in one sitting,” albeit one not quite as large, and probably without a cringing audience.

It is rather a shame that, according to science, it is more or less impossible to literally die from ecstatically stuffing your face, as it would seem a rather fine way to go, considering some of the alternatives.  A nice, sweet conclusion to the story of one’s life, you might say.

That, in a way, points to the broader appeal of cake in all its wondrous forms:  In the culinary canon, it is both a metaphorical and literal happy ending.  And who doesn’t like a happy ending?

Justice

I want Dzhokhar Tsarnaev represented by the finest lawyer in America.

I want him put on trial in an ordinary American courthouse to face judgment by an impartial jury of his peers.

I want him read his Miranda rights and subjected to all manner of due process afforded any other American citizen.

I want him to have his day in court.

Tsarnaev is, of course, the surviving member of the pair of brothers who stand accused of plotting and executing last Monday’s Boston Marathon bombing, which killed three and injured 170.  He is further accused of the murder of an MIT police officer named Sean Collier at the start of a car chase and shootout that led to the wounding of several more police officers and the killing of Tsarnaev’s brother, Tamerlan.

Given the extraordinary circumstances of the whole business, a movement has sprung up—led by Senators John McCain and Lindsey Graham, among others—to deny Tsarnaev the usual processes of the American justice system, and instead to treat him as a so-called “enemy combatant.”

The essence of McCain and company’s argument, in effect, is that the American justice system is simply too good for the likes of Tsarnaev.  That the crimes he has allegedly committed are too horrid, too outsized—too much of an affront to all the values we Americans hold sacred—for our usual system to handle.  Courtesies such as due process and trial by jury?  Why, he doesn’t deserve them.

That, in so many words, is the point.

The American justice system is too good for someone like Tsarnaev, if he did what we assume he did.  His crimes were horrid, profane and utterly out of proportion.  He doesn’t deserve the sort of patience and accommodation the United States court system offers all those who are accused of breaking United States law.

And that is precisely why we are going to go through with it anyway, and why, if we don’t, we’ll regret it for the rest of our days.

Our system is designed to be too good for those who undergo its oh-so-elaborate machinations.  It is meant to accommodate those who don’t deserve to be accommodated.  It is built to withstand the most horrendous crimes imaginable and, what is more, to give those accused of committing them every possible opportunity to proclaim their innocence.

Never forget:  The burden of proof is always on the prosecution, not the defense.  As Americans, we would rather let a guilty man go free than send an innocent man to prison.

Were we suddenly to abandon these core principles, granting an exception in any particular case, then our Constitution would be rendered meaningless.

Let’s not dance around what is surely a central, if unspoken, concern:  The danger of Tsarnaev being found not guilty on the basis of a technicality, such as a key piece of evidence being gathered in an improper fashion.

Well:  By all known accounts, Boston’s law enforcement has conducted itself with the utmost care and professionalism throughout this whole ordeal.  What better way to prove it—if only to ourselves—than to put all their hard work to the test?

Some argue the brothers Tsarnaev should be treated as foreign terrorists, their plot as some unorthodox act of war against the United States.

In point of fact, Dzhokhar Tsarnaev is an American citizen who lives in Cambridge, Massachusetts.  He is no less American than you or me or John McCain.  While he (and especially his brother) may well have been influenced by ideas and rhetoric from his place of birth in the Caucasus, in the Boston bombing he and Tamerlan appear to have acted on their own.  Can America really be said to be engaged in a war against an individual?

In any case, we ought to try Tsarnaev as we do any other accused person because that is what a great and strong country does.

A great country is one that follows its own rules.  That takes every opportunity to demonstrate that its system of laws is superior to every other.  That treats its citizens as equals.  That presumes a man innocent until he is proved guilty beyond a reasonable doubt.

In the case of Dzhokhar Tsarnaev, his guilt in the Boston Marathon atrocity seems so self-evident and, indeed, so far beyond doubt that any trial will surely amount to a mere formality—a slam dunk.

All the more reason not to deviate from the usual routine.  As Charlie Brown might say:  We are going to do this trial, and we are going to do it right.

Heaven and Earth

There was a mad killer on the loose, and for upwards of a dozen hours on Friday, he was the only person in Boston allowed to walk the streets.

Yes, as law enforcement embarked upon a massive manhunt for Dzhokhar Tsarnaev, the surviving presumed perpetrator of Monday’s Boston Marathon bombing (his brother and co-suspect, Tamerlan, had been fatally wounded in an overnight shootout with police), the Massachusetts governor, Deval Patrick, took the extraordinary step of asking all residents of Boston and the surrounding communities of Watertown, Cambridge, Belmont, Newton and Brookline—roughly one million citizens in all—to lock themselves inside their homes until further notice.

The governor officially lifted the “shelter in place” request around dinnertime, and not more than an hour later, Tsarnaev was apprehended and the nightmare was over.

The relative swiftness of the whole operation saved us the trouble of having to ask a lot of deeply uncomfortable questions, both practical and philosophical.

Now that the drama appears to have drawn to a close, let us ask them anyway.

Getting right to the point:  Was the near-complete lockdown of six towns and cities a wise and reasonable decision?

As a consequence of keeping all residents indoors, authorities effected the shutdown of all businesses, all restaurants, all public transportation, all entertainment and all sporting events—a suspension of commerce expected to cost the region several hundred million dollars in lost revenue.

Sooner or later, we have to ask:  Was it worth all of that to hunt down and capture a single human being?

In general, under what conditions would such a move be unquestionably justified, and when would it not?  Where might we draw the line—or is this not something we could possibly know in advance?

In 2007, Mitt Romney expressed the view—not entirely unpopular then and at other times—that “it’s not worth moving heaven and earth and spending billions of dollars just trying to catch one person”—and this was when the person was Osama bin Laden and the theater of war was outside the United States.  In certain contexts (if not that one), there must surely be wisdom in those words.

Sticking with the events of last week:  Suppose it took two days for authorities to locate and detain Tsarnaev, instead of one.  Would a lockdown have been as justified on Day Two as it apparently was on Day One?  How about on Day Three, or Day Seven?

Had Tsarnaev managed to tiptoe out of town, slipping through the FBI’s fingers entirely unseen, how many billions of dollars of revenue would the Boston area have been allowed to lose—how long would a million residents have been instructed to remain frozen in place—before the authorities decided that enough was enough?

The immediate justification for the “shelter in place” call—particularly in Watertown, where Tsarnaev was (correctly) believed to be—was that it would make it easier for law enforcement to methodically comb the neighborhood for clues and evidence—in other words, to do their jobs with minimal interference.  But was fully clearing the streets truly necessary to do this?  Could they not have completed this task in a relatively normally functioning environment?  Isn’t it a bit worrying to think that they couldn’t?

It is true (so far as we know) that “shelter in place” ended before the capture of Tsarnaev was confirmed, suggesting that it was a strictly short-term measure (as it is designed to be) that was never going to last beyond Friday.  In fact, this only invites further questions.

We were told, for instance, that the public transportation aspect of the shutdown—service was suspended on buses, subways, commuter rail and Amtrak—was to prevent Tsarnaev from spiriting himself away at high speed.  If we accept this premise, wouldn’t we then need to keep accepting it until the moment he was found, no matter how long it took?

I could go on, but if I haven’t made my point by now, the effort is probably futile.

As I am not the first (or last) person to observe, we Americans are extraordinarily lucky not to have to face these kinds of questions more frequently than we do.  New York Times columnist Ross Douthat rather eloquently tweeted, “If terror attacks were even slightly more common, [this] response would be [an] unsustainable folly,” adding that such an unprecedented event “is a luxury of domestic peace.”

So it is.  And with continued good fortune, the prospect of bringing life in the big city to a screeching halt will remain the exception, not the rule, in how we respond to random acts of terror.

And yet I cannot help but wonder:  What will happen—and what should happen—the next time?

Conventional Wisdom

The breaking news from Ireland this week—understandably overlooked by breaking news closer to home—is that the not-particularly-secular Emerald Isle has put itself on the fast track to recognizing same-sex marriage.

At a gathering of the country’s constitutional convention, 79 percent of those present voted in the affirmative on the gay marriage question—a recommendation the Irish government will decide upon in the coming months.

As surprising as this latest seal of approval for gay marriage might be, to me the even more striking revelation was also the more all-encompassing one:  That the Irish Republic had convened a constitutional convention in the first place.

It is not every day that a stable, peaceful, modern democracy decides to embark upon a wholesale rewrite of the defining document of its society, government and culture.

One cannot help but wonder:  Could it happen in America?

The proposition is not a new one.  No less a figure than Thomas Jefferson, in a 1789 letter to James Madison, mused that “every constitution […] naturally expires at the end of 19 years” and can (and should) be completely redone thereafter—Jefferson’s premises being that 19 years constitutes a single generation of Americans and that “the earth belongs to the living and not to the dead.”

Periodic calls for a “second constitutional convention” have sounded through the years, both at the state and federal levels, although none has yet borne any fruit.

The relevant clause in our current Constitution is Article V, which outlines the ways in which the document may be amended, as it has been on 27 occasions to date.  In addition to the usual method, whereby an amendment is proposed by two-thirds of both the House and Senate and then ratified by the legislatures of three-fourths of the states, a national convention may be established “on the Application of the Legislatures of two thirds of the several States.”

A key question here—unanswered, since it’s never been tested—is whether these provisions apply only for the purpose of enacting a single amendment, or whether we could go the extra step and, as Jefferson recommended, actually rewrite the whole thing from scratch.

As (and if) such lawyerly matters get sorted out in due course, we are surely entitled to engage in thought experiments and wishful thinking in the meantime.  With the Irish example now before us, some of the issues involved become less abstract than they might previously have been.

For instance:  Were a full-fledged second constitutional convention to occur, of whom would it be composed?

Our original convention, for all the political wrangling involved, contained such minds as James Madison, Alexander Hamilton and Benjamin Franklin.  By contrast, in Washington, D.C., today, we have a Congress that recently managed not to pass legislation on gun control that is supported by roughly 90 percent of the public.  Are we really prepared to entrust this lot with refashioning our most sacred national document?  And if not them, then whom?

The convention in Ireland is composed of exactly 100 individuals:  A government-appointed chairman, 33 members of the legislatures of Ireland and Northern Ireland, and 66 randomly-chosen citizens who constitute “a representative sample of the Irish public” with respect to age, gender and geography.

Transplanting this approach to America, we might wonder what a “representative sample” of Americans would look like in such a context.  Would we not want to include a wide cross-section of political views?  Religious affiliations?  Economic statuses?  Levels of education?  Occupational backgrounds?  Or would we be content to have 100 white male lawyers hammering this thing out?

Oh yes, and then there is the minor matter of what a brand new U.S. Constitution might say.

The Irish convention is focused on eight particular items.  Besides same-sex marriage, these include such disparate considerations as shortening the presidential term of office, lowering the voting age, formally encouraging a greater presence of women in public life and—my personal favorite—discontinuing the status of blasphemy as a common law offense.

Would our First Amendment survive a hypothetical rewrite?  How about the Second?  Would we retain term limits on our commander-in-chief, perhaps imposing them on senators and representatives as well?  Would a revamped Bill of Rights be longer or shorter than it is now?

Finally, what of the wisdom of Jefferson’s sentiment in the first instance?  Is it really true that the world belongs only to those who presently inhabit it, or is there value in retaining the basic outline of the views of those (such as Jefferson) who, however fallible and unable to predict the future, are due a certain deference owing to their unique position in history?

To some of these questions, we might never receive adequate answers.  And so we have no choice, then, except to keep on asking them.

Onward

If there is one thing I do not understand, it is why anyone would be afraid of death.

I am acutely aware that some of humanity’s most clever individuals have devoted lifetimes to contemplating the mysteries of the deep, not least one of my artistic heroes, Woody Allen.

Nonetheless, I resolved the issue for myself years ago, with a philosophy articulated most succinctly (funnily enough) by Allen himself, as screenwriter.  In his film Hannah and Her Sisters, Allen’s character probes his father as to why he does not worry about his own death, to which the old man responds, “I’ll be unconscious.  Or I won’t.  If not, I’ll deal with it then.”  Voilà.

We say that death is tragic, but that is only true because it happens one at a time—because there are survivors who must wrestle with their grief and yet somehow soldier on.  Death would be neither tragic nor particularly interesting if we all snuffed it at the exact same moment—an eventuality made possible by the wonders of nuclear weapons, although that is another story entirely.

I muse on these questions, in part, because earlier this week some as-yet-unidentified jerk took it upon himself to blow holes in my hometown of Boston with a pair of improvised explosive devices, murdering three and wounding 170 more.  In the intervening time, I have been asked—as all residents of stricken big cities are—if the Boston Marathon bombing has led me to fear for my safety a bit more than usual.

The question, however compassionate and well-meaning, rather confused me.  Of course I don’t fear for my safety more than at any other time.  I don’t understand why anyone would.

Sure, in the minutes immediately following the Boylston Street blasts, when all hell was breaking loose and the prospect of further explosions was entirely in the cards, one is entitled to a cold rush of fright.

But once the initial shock has receded and life in the big city returns to its normal rhythms?  Well, as a certain British cliché enjoins, you carry on.

Yes, amongst the seven billion souls on our young planet are a few who wish the rest of us harm and have access to weaponry that can do the job.  None of this was news before Monday’s madness, yet one must believe it was in order to argue that we should suddenly cower now, having held our heads high up until this point.  It is an utterly irrational attitude to strike.

Could I (or anyone else) meet my end by being blown to smithereens?  Sure.  Anything is possible.

On the other hand, as a recreational bike rider, I am far more likely to flutter toward the pearly gates as the result of turning a sharp corner, looking the wrong way and being rendered a human pancake by a passing city bus.

Heck, I live in the attic of a rickety old tree house of an apartment, with every trip down the staircase a delicate fencing match with the Grim Reaper, particularly after a few cocktails.

I could slip out of the shower, bang my head and never be heard from again.

Or I could live a hundred years and go peacefully in my sleep, my weary heart deciding, at long last, that enough is enough.

An individual’s cause of death is a matter for actuarial tables.  It is only death itself that is certain.

In that way, we all step out the door every day under an illusion of immortality, confident that today, at least, we will triumphantly avert death.  But then, we scarcely have a choice in the matter and, after all, on most of these days we are correct.

It is not simply a matter of “not letting the terrorists win,” you see.  As if life only has meaning in the face of a homicidal nemesis.

No, it is about those universal American values that never go away—the ones about life and liberty and pursuing one’s happiness without intimidation and without fear.

I do not wish to expend inordinate amounts of my remaining time on Earth contemplating my own death, or avoiding all activity that could possibly cause it.  It will happen sooner or later, and until it does, I would prefer to devote my life to living.

Tweeted Terror

It began with a phone call.

“Are you aware what’s been going on?”

I had been catching up on last week’s episodes of The Daily Show, so the answer was “no.”

I promptly typed “Boston.com” into my laptop’s address bar, and when the Boston Globe website failed to load, I compensated by opening Twitter in one tab, The New York Times in a second, Facebook in a third and the livestream of Brian Williams’ NBC News broadcast in a fourth.

Within five minutes, I knew as much about what happened on Boylston Street as anyone else in America.

I knew which of my fellow Bostonians were safe and sound.  I knew the basic timeline of the madness downtown, with accompanying video and images marking every moment.  I knew the number to call if I wanted to give blood, and I knew which streets were cordoned off and which subway lines had suspended service.

This was all to be expected—an illustration of the state of mass communication in the United States in 2013.  Since technology only marches forward, it was only natural that all available means of getting the word out would be employed, and as robustly as possible.

We never stop gushing about how much our technology has improved in so little time, and the reason is that it never stops being true.

Recall:  Twitter did not exist on September 11, 2001.  Neither did Facebook.  There were certainly cell phones, but neither I nor most of my friends owned one.  We had the Internet, but in a horribly primitive iteration we would not recognize today.

In the days and weeks following 9/11, I would return home from school every day and ask my parents if anything interesting happened in the world while I was in class.  For the six hours between homeroom and the final bell, we kids had no way of tracking the news on our own.  On 9/11 itself, I didn’t know the Twin Towers had collapsed until I heard it on the school bus’s radio, some three or four hours after the fact.

We have had this conversation before, about the influence of social media on actual world events and in the ways we react to them.

We agreed, for instance, that the primary reason the popular uprising in Iran in June 2009 lasted as long as it did was Twitter.  As Iran’s iron-fisted government attempted to shut down all forms of communication and crush its people’s will to resist, the people found they could tweet the latest developments to the rest of the world without being censored.  Consequently, a real movement took root.

In the story of the world’s continuing technological advancements, the central question seems to be which aspects of mass communication and social networking are truly revolutionary, and which are merely evolutionary.  That is, to what extent resources such as Twitter are simply streamlined versions of stuff we already had, or are entirely new entities that really are changing the world around us.

The answer, as always, is that they are both.

One cannot help but wonder how past traumas might have been altered by present-day tools—a rather depressing thought experiment, tempered ever-so-slightly by the reminder of our natural yearning for self-improvement.  Yes, the current ubiquity of smart phones would’ve been nice a decade ago, but hey, at least we have them now.  Small consolation is better than no consolation at all.

Another memory from 9/11:  The walls of photos of missing loved ones in Lower Manhattan, each posted with a phone number and other contact details for anyone with information to use.

The same system emerged Monday in Boston, except it was electronic:  Everyone verifying his or her own status on Facebook and, more importantly, the swift implementation by Google of a “person finder”—a central database both to find and provide information about a particular person’s whereabouts in the bombing’s immediate aftermath.

It is encouraging to imagine, in whatever horrific incident we will inevitably face a decade from now, how much our means of communication will have further evolved from their capabilities today.

With privacy steadily becoming a thing of the past, is it not a foregone conclusion, for example, that we will all soon be equipped with some sort of personal tracking device?  With Google’s forthcoming mad invention known as “Google Glass” introducing the concept of a computer worn directly on one’s face, what cause have we to despair about the continued promise of the human mind to produce good in the face of the occasional evil?

Speak For Yourself

I promise this whole column will not be about Justin Bieber.

But the 19-year-old pop music superstar, who was already not having a terribly great month, unwittingly stirred a rather improbable ruckus the other day that can serve as a proverbial “teachable moment” for us all.

In Amsterdam, amidst the European leg of his current tour, Bieber visited the Anne Frank House and signed its guestbook thus:  “Truly inspiring to be able to come here.  Anne was a great girl.  Hopefully she would have been a belieber.”

(“Belieber” is the official term for a Justin Bieber fan.  As if you didn’t know.)

Of course, to a normal, mentally balanced human being, those are just about the most innocuous, uninteresting three sentences one could possibly write, scarcely requiring any further comment.  Contemporary teenager expresses affection for an historical figure roughly his age, dreaming that, had they occupied the same time and place, she might have liked him back.  End of story—if, indeed, this could even be called a story.

However, as Bieber’s every action has become the object of acute fascination by a not-insignificant gaggle of followers—admiring and despising alike—for whom a sense of proportion is not a strong point, this utterly harmless episode has ballooned into a controversy over the perceived lack of humility on Bieber’s part in presuming to speculate about Anne Frank’s tastes in music.

It’s an exercise in silliness—a demonstration of the spectacle one becomes when one is prepared to be offended by absolutely anything.  However, although Bieber meant no disrespect in his sweet nothing of a guestbook message, we can nonetheless wring some small semblance of meaning from it by reflecting upon the pitfalls of a practice that long predates the Biebs—that of ventriloquizing the thoughts of those long past.

Touring bookstores across the United States in 2005 to promote his biography Thomas Jefferson: Author of America, Christopher Hitchens regularly cautioned historians and biographers against assuming more than they could possibly know about their respective subjects.  To theorize about what a particular historical person might have thought on a given subject, Hitchens argued, is beyond the competency of even the most learned student of history, and should be avoided at all costs.

(Hitchens granted himself one exception:  That in seeing Sally Hemings for the first time, Thomas Jefferson must surely have thought, “Maybe there is a God after all.”)

We all like to claim our favorite historical and literary authorities as corroborators of our most deeply-held views, figuring that the collected writings and opinions they churned out while alive license us to infer what they would say about things now, if only they weren’t dead.

Certainly Jesus Christ has fallen victim to this practice over the last few centuries, becoming an unsolicited spokesman for believers and nonbelievers alike.  (My favorite example:  Max von Sydow lamenting in Woody Allen’s Hannah and Her Sisters, “If Jesus came back and saw what was going on in his name, he’d never stop throwing up.”)

Men such as George Orwell and Abraham Lincoln have proved especially malleable in recent years, welcomed as moral leaders of pretty much every ideological movement currently in business.  Then there was the recent hilarity of the chairman of “Gun Appreciation Day” saying unironically that Dr. Martin Luther King, Jr., would oppose gun control “if he were alive today.”

Quite apart from the practical difficulties of conjuring imagined opinions of the dead to make a point about the living, there is a far more troubling matter:  The implication that our convictions are only as valid as the individuals who might share them.

To wit:  Suppose it were demonstrated that King really would oppose an assault weapons ban, say, or that Orwell would find President George W. Bush’s warrantless wiretapping program appalling.  So what?

The case for or against a particular policy ought to stand or fall on its own merits, with the identity of its supporters and detractors a distantly secondary consideration.

I refrain from smoking cigarettes because medical science has demonstrated that tobacco causes cancer—not because Adolf Hitler recommended that I do so.

I was in favor of a right to same-sex marriage both before and after President Barack Obama gave it his seal of approval.  The arguments for and against did not change just because the president did, much as the case for abolishing slavery did not hinge on President Lincoln’s personal endorsement.

Do not depend on the reputations of others to determine what is right.  Have the nerve to think for yourself, as if your own opinions and powers of reason were as legitimate as anyone else’s.

Be a belieber in your own self-worth.

The Virtues of Vices

It finally happened.

The first hint came directly from the United Kingdom, where “Ding Dong!  The Witch is Dead” shot up the British pop charts.

Then there were the leads and headlines in newspapers around the world, which made no concerted effort at deference or respect for the newly departed.

But what finally sealed the deal for me was watching Late Late Show host Craig Ferguson, who normally avoids giving his political views in public, devote a chunk of his monologue to say that he did not much care for Margaret Thatcher, the former British prime minister who shuffled off this mortal coil last Monday, at the age of 87.

My epiphany from the aforementioned reactions to Lady Thatcher’s death—and plenty more besides—was that the unspoken prohibition on speaking ill of the dead has, at long last, been lifted.  In today’s world, if you want to vent what you truly think about a person who has just passed—no matter how unflattering it might be—the sky is the limit.

Not that we hadn’t been moving in this direction already.  When Jerry Falwell snuffed it in 2007, his longtime critic Christopher Hitchens did a round of cable TV interviews in which he condemned Falwell as a “toad” and an “ugly little charlatan,” theorizing that “If he had been given an enema, he could have been buried in a matchbox.”  Whatever blowback Hitchens received from this vitriol, so soon after Falwell’s demise, it sure didn’t lower his visibility on TV or any other news medium.

Nonetheless, the concept of spitting on someone’s grave while the body is still warm has long been frowned upon in polite society.  When a person dies—be they a world-famous celebrity or a member of one’s immediate family—the default reaction is to speak as highly of the deceased as one can muster, usually through a deft mixture of suppressio veri—withholding certain truths—and suggestio falsi—telling outright lies.

I recently wrote a short and utterly glowing appraisal of the late film critic Roger Ebert, a man of many flaws that I easily could have noted without taking away anything from the qualities that made him great.  Why did I choose to omit them anyway?

We are hesitant to speak ill of the dead.  It strikes us as somehow impolite, disrespectful—bad form.  Even if the corpse in question was a ruddy bastard in his time amongst the living, we give it the old college try to spin his faults in the most positive possible light.  (As but one example:  Get a load of what was said of Richard Nixon upon his death in 1994.)

Nowadays, this tendency appears to be steadily withering away, with underlining a dead person’s blemishes becoming far less of a taboo than it once was.  I say this is good.

Against any notions of impropriety, the case for speaking both good and ill of the dead is, interestingly enough, founded on the notion of respect.

“When I was growing up, ‘respect’ meant that you took people seriously,” said Salman Rushdie, speaking on an entirely different subject.  “It didn’t mean that you never disagreed with them.”

In like spirit, we should more widely recognize that we do our dear departed no favors by acting as if they were saints when we well know that they weren’t.  That the harsher, more honest approach to eulogizing them may well be the more respectful as well.  Would it not be slightly embarrassing, witnessing your own funeral, to be made into a far finer person than you truly were?

To be human is to inhabit three dimensions.  To possess both vices and virtues.  To be flawed.

A funeral or memorial service—be it public or private—is a means of, and an occasion for, assessing and reflecting upon the totality of a particular human being.  What a waste, in this most sacred moment, to include only half the story.