Embracing Limits

You don’t look to the State of the Union address for moments of illumination in the world of government and politics, but we got one nonetheless this past Tuesday night.  Not from President Obama, mind you, but rather from Representative Cathy McMorris Rodgers of Washington, who delivered the “official” Republican Party response.

The United States, said Rodgers in her introductory remarks, is “a nation where we are not defined by our limits, but by our potential.”

What a silly thing to say.  And what a wholly dangerous thing to believe.

If there is one thing we have learned for absolute certain in the first 237 years of the American republic, it’s that nothing gets us in greater trouble than not knowing and working within our limits, both individually and as a nation.

During much of the George W. Bush administration, for instance, Washington, D.C., operated under the assumption that America’s military might was effectively infinite.  President Bush and company viewed the rest of the world as a laboratory in which the United States could test its theories about democracy at its leisure, with or without the consent of the people who happened to live there.  They truly believed—or at least professed to believe—that America was capable, in Bush’s words, of “ending tyranny in our world.”

However, in the fullness of time—and at the cost of billions of dollars and thousands of lives—we realized that this triumphalist view was a trifle naïve and rose-colored on the president’s part, and that even the U.S. Armed Forces are not invincible.  That America’s resources to carry out its interests abroad are finite, as is foreigners’ willingness to bend to America’s ideological will.

And all because our country chose to define itself by its (supposed) potential rather than by its limits.

Oops.

I know, I know:  That’s not really what Congresswoman Rodgers meant.  Nor is it what all of our other public officials (not least President Obama) have in mind when they pass along the exact same sentiment regarding unlimited potential.

No, what they mean to evoke is the proverbial “American dream”:  The proposition that someone born poor can become rich.  That the most humble of origins can be transcended through nothing more than honest hard work and a little bit of luck.  That the daughter of an orchardist can become a member of Congress and a black boy born to a single mother can become president of the United States.

And they’re wrong about that, too.

Not about the dream.  Indeed, it is a matter of historical and contemporary fact that the circumstances of one’s birth do not necessarily determine the course of one’s entire life, even as the latest studies on the subject suggest that America’s economic mobility levels have essentially flatlined over the past four decades.

Sure, given the right circumstances, anyone can achieve greatness in America.  That point is valid as far as it goes.

But that has nothing at all to do with this essential concept of limits.

A limit is exactly that:  A ceiling through which one cannot possibly break.  The point where all resistance is futile.  The moment when one must accept one’s fallibility and find another way to achieve one’s objectives—if, indeed, they can be achieved at all.

If one nonetheless manages to secure a particular station in life, then it wasn’t a limit in the first place.  Maybe you thought it was, but it turns out you were laboring under a misconception that had never been properly tested or challenged.  You know, a misconception such as, “America will never elect a black president,” or, “Gay people will never be able to get married.”

These assumptions were only true so long as we kept telling ourselves so.  There was nothing preventing such supposed impossibilities from being realized except our own resistance to them.

However, there are some challenges in life that really are insurmountable, and the mark of true intelligence and sophistication is being able to recognize them when they appear and to know when not to push one’s luck.  It’s not always easy to distinguish a limit from a mere setback, but it should always be one’s goal to do precisely that.

I submit, in other words, that the United States should in every way be defined by its limits, and should regard this as a point of pride rather than a source of shame.  Understanding and working within one’s limitations is how all wise policy is made.

America cannot do everything, nor can individual Americans.  It’s high time we stopped acting like we can.

A World Without Oscar

What do I think of this year’s Oscar nominees, you ask?

Well, my favorite movie of 2013 received a whopping one Academy Award nomination.  My second favorite garnered none at all, as did three other entries on my personal top 10.

That’s what I think of this year’s Oscar nominees.

Actually, I think this year’s lineup is just fine.  The year 2013 produced many excellent films with many exceptional performances, and a commendable number of both turned up among the Academy’s honorees at the annual nominations announcement on January 16.  Some of the esteemed film society’s selections caused many analysts to scratch their heads, but plenty of others were well-deserved and, dare I say, inevitable.

You know.  Just like every other year.

In truth, in the movie world throughout January and February, the only thing more fashionable than the Oscars is complaining about the Oscars.

These critiques take a dizzying number of forms, each one more predictable than the last.  Some folks gripe about the media’s intensive focus on fashion and the red carpet, while others bitch about the ceremony’s boring hosts and self-indulgently long running time (the last telecast to run less than three hours occurred in 1973).

The more pointed dissents, however, cut directly to the institution’s primary objective, which is to bestow the title of “best” upon a given year’s assortment of motion picture releases.

After all, when it comes to movies, what does “best” mean anyway?  Isn’t weighing one movie against another—movies with utterly disparate subject matter and purpose—the ultimate exercise in comparing apples to oranges?  Was Humphrey Bogart not onto something when he mused, “The only honest way to find the best actor would be to let everybody play Hamlet and let the best man win”?

Even more so than other awards shows—at least the Grammys divides its categories by genre—the Academy Awards is a patently absurd attempt at applying some sort of objective standard to an inherently subjective art form.

This year, then, let us take this curmudgeonly thought a step further and pose the following query:  What if we got rid of the Oscars once and for all?  What if there wasn’t this annual gathering of Hollywood’s best and brightest congratulating themselves on a job well done, with the rest of us following along for the ride?  How would the movie industry change, and would such a scenario produce an environment preferable to the one that currently exists?

The upsides are certainly tempting to ponder.

For starters, the abolition of a movie awards season might engender a more evenly distributed release schedule (qualitatively speaking) than the lopsided, bottom-heavy one we have now.

With no Academy voters with whom to curry favor, studios would not be tripping over each other to release all their high-quality “prestige pictures” in the final weeks of December while spending the year’s remaining eleven and a half months churning out utter cinematic dreck.  There would be little reason not to unload Important Films by Important Directors about Important Subjects at any old time of the year.

Studio executives would still care about nothing except profit, of course, and would still employ careful strategery regarding when (or if) a particular project might see the light of day.  (Don’t expect future installments of Star Wars and Star Trek to ever open on the same weekend.)

But these considerations would no longer be tethered to some gold-plated grand payoff that so often comes at the expense of the very consumers whose dollars these executives seek in the first place.

What’s more, minus Oscar’s sinister allure, the relationship between producers and consumers would become considerably more direct, and a bit more honest as well.  For the film industry, the only real point of the Oscars is to sell more tickets and DVDs.  In its absence, moviemakers would no longer be able to rely on a political, drawn-out, behind-the-scenes marketing campaign to reel in unsuspecting viewers.

Instead, any such scheme would need to be directed squarely at those viewers themselves.  The message would shift from, “Watch this movie because it won a bunch of shiny trophies,” to, “Watch this movie because you just might enjoy it.”

And that brings us to the real question in all of this:  Is it useful to turn the experience of watching movies into some sort of competition?  Does it ultimately do more harm than good to reduce a medium that many still regard as a form of art into a horse race?

So long as great movies can be seen and cherished for their own sake, shouldn’t we stop pretending they require anything more?

Lies, Damned Lies and Embellishments

Details matter.

When it comes to the million little factoids that make up a person’s life—particularly when that person promulgates those factoids in the course of running for public office—vagueness and imprecision simply will not do.

Wendy Davis, a Texas state senator, finds her gubernatorial campaign under siege now that certain supposed facts about her background have been shown to be false.

Davis became nationally known last June when she spoke against an anti-abortion bill on the Texas Senate floor for 11 hours straight.  Now seeking the highest office in the state, she has centered much of her candidacy on her experience as a young, poor single mother who managed to build an exceptional career in academia and law more or less on her own.

“She began working after school at 14 to help support her single mother and three siblings,” reads her bio on the State Senate website.  “By 19, Wendy was a single mother herself, working two jobs to make ends meet in hopes of creating a better life for her young daughter.”

Similarly, a video on Davis’s campaign homepage has her elder daughter explaining, “She married young and by 19 was divorced and raising me as a single mother,” adding, “They say everything’s bigger in Texas.  Well, that certainly wasn’t the case for the trailer we lived in.”

As well, the campaign site’s bio reads, “With the help of academic scholarships, student loans, and state and federal grants, Wendy became the first person in her family to earn a bachelor’s degree, graduated first in her class, and went on to Harvard Law School.”

Upon further review, we find that all of these details about Davis’s life are true—except for the parts that are not.

As reported last Sunday by the Dallas Morning News, it turns out that while Davis separated from her first husband at age 19, they were not formally divorced until she was 21.  She and her daughter did live in a mobile home, but only for a few months before moving in with Davis’s mother and then getting an apartment of their own.

Additionally, while her education was indeed partly funded by scholarships and government grants, it was also made possible by Davis’s second husband, who paid the tuition for her final two years at Texas Christian University and her final year at Harvard Law School.

In short:  Without approaching outright fabrication, Davis embellished, contorted and cherry-picked the details of her upbringing in order to seem as appealing as possible to the good people of Texas, and to put forth a personal narrative that ties in perfectly with her stated policy principles—not least regarding the empowerment of women—in her pursuit of political power.

While there is no mystery as to why she—or anyone—might resort to such machinations (see previous paragraph), in this case I am compelled to ask that question nonetheless.

Studying the discrepancies between the facts and Davis’s claims, one wonders how it could have possibly been worth her while.

I would posit—and I think most would agree—that there is nothing inherently less honorable about being a 21-year-old divorced single mother compared to being a 19-year-old divorced single mother.  Wouldn’t any voter drawn to the inspiring grit of the latter be equally enamored of the former?

Politically speaking, Davis’s origin story is a goldmine as it is.  Why exaggerate the particulars when the truth is plenty compelling enough?  By embellishing, even just a little bit, Davis accomplished nothing except making herself vulnerable to charges of being a liar and a fraud.

That’s not to say she actually is either of those things.  In the garden of political deviousness, her so-called “misstatements” are extremely small potatoes, indeed.  By themselves, they should not (and probably will not) cause irreparable damage to her campaign or her career.

And yet.

We live in an age in which no assertion gets past online “fact checkers” and everything is subject to the greatest possible scrutiny.  Like it or not, every public official knows this to be true and has been made to behave accordingly.

As much as we might disapprove of a political and media system in which major attention is focused on the most minor matters—it is, after all, a system that fosters an awful lot of nonsense and unwarranted hysteria—this dynamic is nonetheless grounded in the highly attractive principle of valuing truth over propaganda, and of conducting a public discourse in which facts matter and untruths must be exposed.

Not all lies are created equal, but that does not erase the fact that any lie is still a lie.  Period, full stop.  The foundation of any healthy relationship—personal or political—is a basic trust between the parties involved, and that is something that no one—not least a prospective governor of Texas—ought to forget or forsake.

Arguing With Himself

One of the more irritating things about disagreeing with President Obama is that you suspect he can make your case better than you can.  That he knows exactly where you’re coming from, that he empathizes with your point of view, and worst of all, that he therefore has prepared the perfect counterargument to your most deeply held beliefs.

This was never a problem when George W. Bush was president.  At least in public, Obama’s immediate predecessor professed not to care one whit about how and why his intellectual adversaries viewed the world as they did.  As grating as this often was to such critics, it at least offered a certain comforting clarity:  Bush believed what he believed, he was never going to change his mind, and that was that.

By contrast, Obama—the professor and former president of the Harvard Law Review—is seemingly forever weighing both sides of every issue, both in private and in public.  Even when he ultimately takes a firm stand, as his job requires him to do, he makes it plain that he understands the point of view of those in the opposing dugout, and that he values their opinions as legitimate and worth hearing.

This fact about our sitting commander-in-chief has been well established during his first five years in office, but it comes into ever sharper focus in a sprawling new profile by David Remnick in the current issue of the New Yorker, in which Obama grants all sorts of points to the other side even as he defends his administration’s policies as strongly as ever.

Most of the time, this habit of ideological accommodation is simply a byproduct of the president’s background in law and academia—worlds in which one must constantly play devil’s advocate and never be too certain about one’s rightness on any point, if only for the sake of getting along with one’s peers and not coming off as irretrievably arrogant.

At other times, however, I wonder whether Obama’s rhetorical hedging serves a more calculated and cynical political purpose:  Namely, to provide him the wiggle room to change his mind later on down the road, without appearing like a rank hypocrite.  I wonder as well, in some such cases, whether he has already changed his mind privately and is merely laying the groundwork for the moment when he will summon the nerve to say so publicly.

The Rosetta Stone for this theory was Obama’s flip-flop on gay marriage, and he may well be preparing the same stunt on legalized marijuana.

As you will recall, the president officially endorsed same-sex marriage in May 2012, although many suspected he secretly backed the idea already.  Indeed, in the New Yorker profile, Obama discloses to Remnick that it’s “fair to say that I may have come to that realization slightly before I actually made the announcement.”  As Remnick notes, the evidence would suggest that “slightly before” actually means “16 years before,” since Obama explicitly endorsed gay unions in a Chicago newspaper questionnaire in 1996, when he was running for the Illinois State Senate.

Even apart from the questionnaire, Obama always framed his views on the subject in a way that allowed him to reverse himself without appearing to abandon his original stated principles.  Asked to explain his position in years past, he floated the usual buzzwords like “tradition” and “faith” to justify withholding marriage from homosexuals, but always also extolled the need to treat everyone as equal before the law.

In short, his official opposition to gay marriage was half-hearted at best, and quite possibly phony from the start.  Political opportunism as usual.

On marijuana today, the president makes it clear to Remnick that he opposes its full legalization, saying of pot-smoking, “It’s not something I encourage, and I’ve told my daughters I think it’s a bad idea, a waste of time, not very healthy.”

Yet he spends just as much time—more, in fact—musing about the myriad injustices of America’s current anti-marijuana policies, particularly the fact that a disproportionate number of those imprisoned for smoking weed are either black, Hispanic or poor.

Says Obama, “[W]e should not be locking up kids or individual users for long stretches of jail time when some of the folks who are writing those laws have probably done the same thing.”  Does this sound like a guy who will still be defending the continued prohibition of marijuana, say, three or four years from now?  I may be wrong, but I would have to say no.

Indeed, I would even hazard to guess that his opposition to legalized pot is as false today as his opposition to gay marriage was before 2012, and that it is only a matter of time before circumstances compel him to publicly abandon a sentiment he does not truly believe in his heart.

A Gun in Every Seat

A man finally finds a good use for movie theater popcorn, and this is the thanks he gets?  For shame!

As widely reported, there was a ridiculous and horrifying incident in a Florida movie theater at the beginning of last week, in which a moviegoer shot a fellow moviegoer to death for the crime of sending a text message while the coming attractions rolled across the screen.

We all fantasize about doing terrible things to the people in the theater who won’t shut up.  Well, here’s a guy who actually did.

And we can’t say he wasn’t provoked.  According to the official report, 71-year-old Curtis Reeves had asked 43-year-old Chad Oulson to put his phone away, but Oulson refused.  After Reeves tried and failed to locate an usher, a full-on argument broke out between the two men, which escalated when Oulson launched a mysterious projectile in Reeves’ direction—an object that has since been identified as a bag of popcorn.

Reeves, apparently fearing for his life, squeezed off one shot from the .380 semi-automatic handgun he happened to be carrying.  The bullet, after passing through Oulson’s wife’s outstretched hand, struck Oulson squarely in the chest, and that was that.

What makes this sordid little episode interesting is the way it calls to mind the philosophy of America’s gun rights community, which presumably would look at such an unfortunate incident and conclude that the main problem is that Reeves, a retired police officer, was the only person in that theater who was packing heat.

As far as the National Rifle Association and its supporters are concerned, the ideal scenario in any public setting is for everyone to be armed and to assume that everyone else is as well.  After all, why would anyone ever behave rudely toward anyone else with the sure knowledge that it will result in a firefight?

It’s “mutual assured destruction” on a micro scale, and the logic has a certain simplistic charm:  You can kill me and I can kill you.  Since neither of us wants to die, we’ll do our best to leave each other alone and co-exist peacefully.

They say good fences make good neighbors.  Why shouldn’t the same be true with deadly weapons?

Having always resided in the Northeast, I have long tried to understand this mentality, precisely because it is so alien to the environment in which I grew up.  I have never so much as held a gun, nor (to my knowledge) have there ever been any gun owners among my family and friends.

Walking the streets of Boston or New York, I assume that some passersby are carrying lethal weapons of one kind or another—obtained legally, one would hope—but mostly I don’t think about it at all.  It’s none of my business and, with a little luck, will never become so.

And so it is with sheer idle curiosity that I wonder whether, and in what situations, this NRA utopia of near-universal gun ownership might work—“work” as in “foster a safer, more virtuous society.”

On paper, it certainly could.  Imagine:  You sit in a movie theater and there’s a guy texting in front of you.  Aren’t you (and everyone else) less likely to start a nasty argument if you assume his weapon of choice is a pistol rather than popcorn?

For that matter, would anyone in that auditorium dare to send a text message in the first place, knowing that it could incite any number of folks in close proximity to whip out semi-automatics and ensure it was the last message he ever sent?  Sure, he may be armed as well, but he would also be horribly outnumbered.  Would it really be worth the risk?  If his message were really that important, would he not be compelled to take it outside, as basic movie theater etiquette demands?

In a world of rational actors, we could do a lot worse than to conduct our daily lives as if the slightest breach of social decorum could—or rather, would—bring about our sudden and violent demise at the hands of our dear fellow travelers.  To foster a culture in which such offenses as texting in a movie theater were subject to such harsh reprisals that no sane person would undertake them.

It would be nice to live in such a world, and I wish we did, because it might well lead us to treat each other better and truly think before we act.

Unfortunately, the world we actually inhabit is replete with irrational actors who behave in insane ways.  Who mistake a bag of popcorn for a weapon of mass destruction and destroy innumerable lives as a result.

It’s a shame that the social policies affecting so many seem to be dictated by the actions of so few.

Constitutional Buffer Zones

Long have I wondered exactly where the line is between the freedom of speech and the maintenance of public order.

As it turns out, the answer is 35 feet from the front door of Planned Parenthood.  And it’s not a line, but a semi-circle.

That’s the situation in the commonwealth of Massachusetts, which today defended itself in front of the U.S. Supreme Court against a challenge to such a policy.

As reported in the New York Times on Monday, in 2007 the Bay State passed a law that created a “buffer zone” around the entrances to reproductive health care centers, marked by a painted yellow arc on the sidewalk, in order to prevent confrontations between anti-abortion activists and patients or staff from getting out of hand.

Now, one such pro-life voice—77-year-old Eleanor McCullen of Boston—has filed suit, arguing that her right to peaceably stand outside a Planned Parenthood clinic and try to talk women out of procuring abortions, as she does regularly, is being unfairly abridged.

That’s the question before the Supreme Court, and also before all of us:  Does McCullen’s right to say what she wants about abortion take precedence over an unsuspecting young woman’s right to walk, unmolested, into a facility that offers abortions?

Fourteen years ago, the Supreme Court said no.  In the 2000 case Hill v. Colorado, the court ruled that a similar “buffer zone” law in Colorado did not violate the First Amendment to the U.S. Constitution, on the grounds that the policy did not restrict free speech, as such, but merely “one arena for speech.”

Further, the court argued, the law was “content neutral,” meaning it did not differentiate between anti-abortion and pro-abortion views.  The “buffer zones” were off-limits to everyone.

Finally, the majority opinion held that, with those two conditions met, the state had a “compelling interest […] to protect citizens entering or exiting a medical facility from unwanted communication.”

“Even though speakers have a right to persuade,” the ruling explained, “this cannot extend to unwilling listeners because people also have a right ‘to be let alone.’”

Massachusetts today, in the person of Attorney General Martha Coakley, is defending its own protest-free perimeters on many of the same grounds, with a strong emphasis on that final point.  In point of fact, several of the state’s abortion providers had been the scene of heated and sometimes violent confrontations prior to the 2007 law’s passage.  In the intervening years, such scuffles have become far scarcer, which Coakley and others attribute to the “buffer zones” now under scrutiny.

“This law is access balanced with speech balanced with public safety,” said Coakley.  “It has worked extremely well.”

Taking this to be true, the critical question is:  Does it matter?  Is the mere possibility of public safety being compromised sufficient to justify shuffling (non-violent) protesters to one side, forcing them to express their views from a distance?

On my own better days, I prefer to fancy myself a First Amendment absolutist.  That is, one who thinks the freedom of expression must be extended to everyone in nearly every circumstance, particularly when the views in question are repulsive or challenging, and that the state had better have a damned good reason to act or legislate to the contrary.

The freedom of speech is a right, not a privilege.  As with all rights that become subject to regulation, the burden of proof necessarily falls on the regulator to show why such an act is warranted.

Does sparing pregnant women from feeling bad qualify?  Count me skeptical.

We can probably agree that no one should be made to feel physically intimidated in public.  That’s what our various assault and harassment laws are for.  Why should free speech enter into it?  Being offered a leaflet and being physically obstructed from entering a building are not equivalent, and we must be very careful not to suggest they are.

Are women entering Planned Parenthood really so fragile that they must be cordoned off from opinions that might cause them distress?

I must say I find the “right to be let alone” argument bizarre in this context.  The fact is that the moment you leave your house every morning, you subject yourself to the possibility of encountering people you would rather avoid and views you would rather not hear.  That’s what it means to live in a free and open society.

Is one person’s constitutional right to procure an abortion more important than another person’s constitutional right to advise against it?  In this case, is it even necessary to choose one over the other?

Christie on the Cross

And the Oscar for best performance by a governor in a leading role goes to….

Or is that “supporting role”?

I must say, I did not intend to view all 108 minutes of Chris Christie’s press conference Thursday, at which the New Jersey governor denied any involvement in the plot to gridlock traffic on the George Washington Bridge for four days in September as a form of petty political payback.

But there is just something about the Garden State’s chief executive that commands one’s attention in a way that few other public officials do.  In this case, it was the sort of attention one might otherwise reserve for a high-speed car chase or a film by Paul Thomas Anderson—the fascination that comes from a pure, audacious, jaw-dropping spectacle.

The kerfuffle in question entails the revelation that a top Christie aide conspired with the Port Authority of New York and New Jersey to close precious access lanes to the George Washington Bridge—the most heavily traveled span in the world—for no reason except to punish the mayor of Fort Lee, New Jersey—the borough at the bridge’s western end—for the crime of not endorsing Christie’s bid for re-election last year.

Facing the obvious questions—What did the governor know and when did he know it?—Christie offered one giant mea non culpa:  An exhaustive insistence that he himself had nothing to do with this dastardly scheme, complete with a passionate and thorough accounting of how and why this was the case.  (For one thing, Christie repeatedly claimed, he never sought the mayor’s endorsement in the first place.)

As supposedly “out of character” as the governor was, trading his usual bluster for contrition and introspection, what was most remarkable about the whole episode was the signature of the Christie we know and (sometimes) love:  His knack for saying exactly what he thinks and leaving no doubt whatever as to precisely what he means.

Yes, by flatly denying involvement in a flagrant act of government mischief, Christie followed in the footsteps of every public official who has ever been so accused.  With no smoking gun to prove his culpability, what other choice did he have?

The difference, then, is one of degree.

In point of fact, Christie did not mimic the usual routine of vagueness and lawyerly evasion that might get him off the hook in some future legal showdown.  There was no “I do not recall” or “It depends upon what the meaning of the word ‘is’ is.”

On the disputes that matter most—Did he know the true motive behind the lane closings?  Was he aware that some of his top deputies are callous, vengeful pukes?  Was the whole thing his idea?—he offered his side of the story in the most certain possible terms.  He has established a narrative of innocence and he’s sticking with it to the bitter end.

What this means—as we will learn soon enough—is that either he is telling the truth, or he is lying in the most spectacular and self-destructive possible way.

By denying allegations of corruption in such an absolute, “read my lips” manner, Christie has left himself no feasible trap door through which to drop, should his pleas of non-involvement be proved false.  Were evidence of his own handiwork on the bridge caper to come to light—as he must assume it would, if such evidence existed—his status as a serious statewide (and potentially national) leader would vaporize on contact.

Accordingly, if Christie harbors even half the ambition for higher office that everyone assumes, and if he possesses any semblance of political self-preservation, his performance at Thursday’s press conference suggests he is either not guilty of all charges, irretrievably deluded, or simply the dumbest man on planet Earth.

In any case, “performance” is the correct term for the ridiculousness to which curious onlookers were treated last week, for better or for worse.  By proclaiming a lack of culpability for a bizarre bridge debacle until his voice grew hoarse and reporters’ quills ran dry, Chris Christie reaffirmed his status as America’s most singularly watchable chief executive, even as he left many wondering whether it might soon become that much more tempting to look away.