Everybody’s a Critic

Actor-writer-director Jon Favreau has a warm and wonderful new movie called Chef, in which Favreau plays a renowned cook who leaves his prized position at a swanky Los Angeles eatery and dips his toe into the food truck industry.

The hinge event behind this sort-of career shift is a bad review.

After a popular local food blogger pans the restaurant’s latest roster of courses, Favreau bitterly lures him back to give him a piece of his mind, which swiftly takes the form of a full-blown meltdown.  How unfair, he protests, that he should passionately and lovingly devote his life to achieving culinary greatness, while this wretched, lazy little puke does nothing but stuff his face and scribble snarky putdowns on the web and call himself a foodie.  Who does he think he is?

While this tantrum is ostensibly at Favreau’s expense—both for those in the room and for us in the audience—the chef nonetheless earns our sympathy, as we acknowledge the basic truth in what he says.  Presiding over the kitchen of a great restaurant is a backbreaking job, requiring tireless dedication, creativity and raw skill.  Blogging, meanwhile, requires little more than an Internet connection, a thesaurus and a little too much spare time.  (Cough cough.)

Very few of us could ever hope to be a halfway decent chef.  But anybody can be a critic.

The French filmmaker Jean-Luc Godard famously asserted, “In order to criticize a movie, you have to make another movie.”  This is the attitude toward the art of criticism (if it can be called an art) that seems the most just and ideal.

While it is certainly possible to be a great art critic without being an artist yourself, those who have actually immersed themselves in the subject at hand possess a particular wisdom, and warrant a particular respect, that the armchair quarterbacks of the world do not.  In the spirit of the famous axiom that the only true way to learn how to do something is by doing it, there is no finer means of tearing something down than building something up in its place.

In politics (as we move from the sacred to the profane), our elected representatives would be well-advised to take this truth to heart.

In Washington, D.C., under the Obama administration, the GOP has assumed the role of the “Party of No.”  Whenever President Obama or Democrats in Congress propose some major piece of legislation, Republicans oppose it almost as a reflex—as if the mere fact of Obama’s support axiomatically calls for Republican dissent.

While it is to be expected that America’s two major parties would disagree about the big issues of the day—indeed, that’s sort of the point—today’s GOP is distinctive for its tendency to meet ideas not with other ideas, but rather with nothing at all.  Be it on health care, immigration, gun control, Syria, Ukraine—and on and on and on—the Republican leadership’s view is “Obama is doing it wrong, and never mind how we would do it right.”

Obviously, this is not the case on every last issue, and there have been lonely efforts by individual senators and congresspersons to craft alternative approaches to things like the Affordable Care Act, immigration reform and all the rest.  But these proposals have all been dead on arrival, blocked by the likes of Speaker John Boehner and Senate Minority Leader Mitch McConnell, who seem to have very little interest in actually solving problems.

The reason this should be a cause for our concern is that the GOP is widely expected to win control of the Senate in November’s midterm elections, meaning the party would take on far more responsibility for actually proposing, debating and passing laws.  At that point, Republicans would cease being mere critics and start being active participants in governing.  Are they prepared for this eventuality?  For the sake of the republic, let us hope so.

From the moment he assumed the presidency, Barack Obama (and his supporters) rather woundingly learned of certain key differences between campaigning for president and actually being president.  Namely, that the latter is considerably more difficult than the former.

It’s easy enough to stand at a podium and bitch about all that’s wrong with the world and how wonderful things would be if you were in charge.  But holding the keys to the Oval Office and assuming personal responsibility for all actions taken in the name of the United States?  Well, it gives you a far greater appreciation for the challenges of governance than does reading about them in a newspaper or watching cable news.

Much in the same way that knowing how to prepare a molten chocolate cake yields more wisdom about food than merely eating it ever could.

Party to a Scandal

It seems somehow profane—if grimly appropriate—that America is spending this Memorial Day weekend sifting through evidence that the U.S. Department of Veterans Affairs has become so unwieldy, antiquated and (in some cases) corrupt that some ailing veterans have effectively been left to die in the waiting room, and then, for good measure, had their files deleted or fudged so that the agency would not arouse suspicion.

Let us hope the timing of this horror story does not become even more tragically ironic by carrying on until the eleventh of November.

In the meanwhile, as Congress, the executive branch and the fourth estate proceed to ascertain precisely what malfeasance occurred at the VA and to what extent, we can take mild comfort in the assurance that such a thorough investigation is now taking place—albeit after an unpardonably long period of not taking place when it could have.

Further, we might take the opportunity to examine how and why this story came to the public’s (and Washington’s) attention and, more broadly, why certain scandals are dealt with and why others are ignored.

To be sure, some of these so-called investigations are bald, cynical displays of partisanship—a means by one political party of humiliating the other in order to curry favor with voters, no matter how silly or innocuous the supposed crimes might be.

Into this set can be placed the Republican Party’s continued fascination with the attack on the American diplomatic compound in Benghazi, Libya, on September 11, 2012. The allegation, as it were, is that the Obama administration knew immediately that the assault had been premeditated, but publicly asserted otherwise in order to avoid looking weak in the months leading up to the 2012 election.

It would be a compelling story if there were any evidence to support it, but alas, there is not. Instead, there is the steely determination of some in the GOP to keep this narrative alive for the purpose of making President Obama look bad and effecting the rise of a Republican majority in the Senate in November’s midterms.

But what about the real scandals in the recent past—the ones that truly do undermine the integrity of our democratic system? Is the official scrutiny of these also irretrievably tethered to partisan political calculations?

It would appear so, albeit with varying degrees of importance.

The Watergate burglary and subsequent cover-up might have been textbook examples of obstruction of justice and abuse of power, but the Senate hearings on the matter convened in a Congress that was heavily Democratic, and therefore predisposed to making President Richard Nixon’s life a living hell. Likewise with the Congress in 1987, during the investigations of the Iran-Contra caper.

Would these investigations have proceeded under Republican chairpersons? We cannot know for sure.

In any case, what does it take for the majority party in Congress to embark upon a full and honest accounting of crimes committed by officials in the same party? How does the rare comprehensive ethical housecleaning in government come about?

In an early episode of the ABC drama Scandal, a gang of political operatives hatches a scheme to commit a particularly heinous offense against the democratic process. When one of the conspirators expresses moral qualms, another chimes in, “It has to be unanimous. The only way we trust each other is if everybody’s ass is on the line.”

I think that’s the answer. In order for the investigation to be bipartisan, the scandal has to be bipartisan. The only way one political party will purposefully inflict self-harm is when the other party is in the line of fire as well.

That, in so many words, is what has occurred at the VA.

Yes, the sorts of book-cooking and wait-listing that have so enraged folks across the political spectrum have undoubtedly occurred on President Obama’s watch. But it is equally plain that the root causes of these transgressions—namely, the horrifyingly sloppy and short-sighted ways in which the nation has sent its young men and women into war in the first place—are the responsibility of nearly every administration over the last half-century, Democratic and Republican alike.

This point has become well understood among the electorate, and so both parties in Congress can (and now do) feel free to right this wrong without fear of being singularly throttled for it on the first Tuesday in November. Everyone’s posterior is already on the line, and so if there is to be punishment, it will be felt by all. Politically speaking, it’s a wash.

Taking all the above to be true, we are left with the frustrating prospect that politics really does drive policy, and always will. That a party will always put its own interests before those of the country, whenever the two conflict.

How do we get them to stop? Only by ensuring that the party’s interests are America’s interests as well, and that’s an undertaking that can only begin at the ballot box.

Fact, Faith and Truth

In politics and elsewhere, there is an old refrain, “You are entitled to your own opinion, but not to your own facts.”

Today, a small university in Tennessee is arguing that, actually, yes, you are.

Bryan College, a Christian liberal arts institution with some 700 undergrads, has long required its professors to sign a Nicene Creed-like “statement of belief,” which includes such assertions as, “the origin of man was by fiat of God,” and, “the holy Bible […] is inerrant in the original writings.”

Recently, the school amended this declaration to also say that Adam and Eve are “historical persons created by God in a special formative act, and not from previously existing life-forms.”

A considerable proportion of the student body has protested the revision.  As well, two faculty members have filed suit, and one professor has resigned.

The college’s president, Stephen Livesay, for his part, has explained that the new clause is meant as a mere elucidation of Bryan’s already-existing views about the origins of the cosmos.  The impetus for it, it would seem, is the acceptance of Charles Darwin’s theory of natural selection among today’s young people, which the school evidently finds alarmingly high.  At least that is what Livesay might be referencing when he says, “We want to make certain that we view culture through the eyes of faith, that we don’t view our faith through the eyes of culture.”

It is noteworthy—if not immediately relevant—that Bryan College is named for none other than William Jennings Bryan, the three-time presidential candidate best known for arguing against the teaching of evolution in the Scopes Trial of 1925.  The school sits in the city of Dayton, where the trial took place, and was founded, in 1930, for the purpose of “the higher education of men and women under auspices distinctly Christian and spiritual.”

On its website, the school expands on this theme thusly:

Bryan College is founded upon the belief that God is the author of truth; that He has revealed Himself to mankind through nature, conscience, the Bible and Jesus Christ; that it is His will for man to come to a knowledge of truth; and that an integrated study of the arts and sciences and the Bible, with a proper emphasis on the spiritual, mental, social and physical aspects of life, will lead to the balanced development of the whole person.

All well and good, so far as I’m concerned.  There is no reason why a Christ-centric young person shouldn’t have an institution of higher learning at which to feel at home.

Where the problem lies is in the words “historical persons.”

Of course, one is free to believe anything one wants about life, the universe and everything, and about the origins thereof.  What is more, a private college is equally free to establish a particular set of beliefs as its core philosophy, and even to require its faculty to affirm it.  (Bryan College is careful to note that “students are neither required to subscribe to any statement of belief nor placed under any duress with regard to their religious position.”)

But there is a massive difference between professing a belief to be morally true and claiming a belief to be literally true.  A university has many responsibilities, but none is greater than the pursuit of knowledge and truth in all realms, including the field of science.  In making assertions of fact—rather than professions of faith—a college, like a politician, is not entitled to its own reality.

To say Adam and Eve are “historical persons” is objectively not true, as demonstrated by more than a century of findings in geology, genetics and evolutionary biology, from On the Origin of Species onward.  One can posit Adam and Eve and Creation as a scientific hypothesis, but it is one that has yet to bear fruit (so to speak).

By requiring its professors to accept this baseless claim as read, Bryan College is actively engaging in the promulgation of an untruth, which it may not do if it is to be taken seriously within the university system.

Then again, this inevitably calls into question the relationship between a university and religion itself, and whether the two can ever truly coexist.

As a nonbeliever, I am certainly tempted to simply answer “no” and leave it at that.  In certain essential ways, the respective core functions of churches and universities are not merely different, but are actively at odds with each other.

And yet I am also willing to take a narrower view of the question by conceding, for instance, that one can believe both in God and in evolution, or that the denial of science does not necessarily prevent one from being a scholar in other subjects.  One should not completely dismiss F. Scott Fitzgerald’s claim, “The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function.”

But this does not make the claim that Adam and Eve were living, breathing humans any less untrue.  Bryan College should knock it off.

Buying Free Speech

There’s something I finally realized about Donald Sterling:  He just doesn’t care.

And far more important:  He doesn’t have to care.

Here is a guy—the for-the-moment owner of the NBA’s Los Angeles Clippers—who is worth somewhere north of a billion dollars.  Who could own a home in every state and park a yacht in every port.  Whose $2.5 million fine from the NBA for telling his mistress not to associate with African Americans is proverbial chump change, compared to what remains in the Sterling family vault.

Indeed, Sterling could continue to say horrible things about black people for the rest of his natural life, and he would go right on marinating in a lap of luxury the rest of us can scarcely even conceive of.  He could be fined and otherwise “penalized” over and over again for his abject wretchedness, and it would all be nothing more than a drop in the financial bucket.  In the grand scheme of his lavish lifestyle, Sterling wouldn’t feel a thing.

Over the years, we Americans have come to grudgingly accept that there are certain things rich people can get away with that the rest of us cannot.  For instance, wealthy folks have a way of manipulating the U.S. tax code in ways the less well-off tend not to do.  The haves can afford the high-priced lawyers and endless litigation processes that often enable them to evade well-deserved time in prison—an advantage rarely heaped upon the have-nots.

But what the Sterling episode so handsomely illustrates and underlines is an equally (if not more) worrisome facet of America’s class divide, and that is the fact that, in practice, rich people have a disproportionate right to the freedom of speech.  As with so much else, the ability to escape the consequences of unwelcome free expression is something that can ultimately be bought.

Consider:  Were some fast-food employee, living on minimum wage, to be found to harbor and express the exact views presently attributed to Sterling, such a person would presumably be dismissed from his job for fostering a hostile work environment—and justifiably so.  From this point, he would then be unemployed and possibly unemployable, getting by on practically nothing and becoming extraordinarily hesitant (not without reason) to exercise his First Amendment rights ever again.

For this hypothetical working man, harboring nasty, ignorant views is very nearly a matter of life or death.

For a guy like Sterling?  Not so much.

To be sure, $2.5 million is one heck of a sum to surrender for the “crime” of being a bad person, and one with which a lower-class person would obviously never be slapped.  As well, no one has credibly argued that racism in the workplace is acceptable under any circumstances.  Whether the racist in question is rich or poor, the basic rules of social etiquette are the same.  The right to say what is on one’s mind does not imply the right to avoid the consequences of doing so.

But let us not pretend that there is not a universe of difference in the real-world application of this principle.

The NBA’s exorbitant fine for Sterling’s ugliness—or, say, the comparable fines levied on Howard Stern for “indecency” at various points in his career—only goes to show what blessed lives our most wealthy brethren lead.  To restate my original point:  If you’re a billionaire, can a loss of $2.5 million really be considered a “punishment” at all?

No.  The well-being of the rich can never truly be threatened by what they say or think.  They’ll always be able to pay their way to ever more freedom.

In the 2010 Supreme Court case Citizens United v. Federal Election Commission, the high court established the principle (at least in the popular mind) that money is a form of speech, and therefore that independent political spending need not be subject to certain limits.  This implied that the more money you have, the more “speech” you are able to express in the midst of a political campaign, meaning that the degree of one’s influence on public officials depends almost entirely on one’s wealth.  More than ever, Citizens United demonstrated that money equals power.

Alarming as this was (and is), the dynamic of which I speak is slightly different and even more insidious, because it extends this premise well beyond politics and into every sphere of American life.

It’s one thing for wealth to inflate the lengths to which your free speech might extend.  It’s quite another for that wealth to determine whether you have the right to free speech at all.

Disorder in the Court

Given the choice, which kind of Supreme Court would you prefer:  One openly driven by ideology and politics, or one that pretends to be driven by anything else?

In the world we now inhabit, those are options A and B, and I’m afraid there is no C.  So take your pick.

Sure, we all pine for the days when members of the high court made decisions objectively—interpreting the law with complete disregard for their personal views, prejudices and politics.

Remember those days?  Neither do I.

It would be nice to think that a given judge—not least a Supreme Court justice—could rule on a subject of great import using nothing but pure logic and legal know-how.

However, this assumes that the law itself can be interpreted objectively, which it very plainly cannot—at least not in certain circumstances.  Indeed, that’s the reason we have nine Supreme Court justices instead of, say, one or none at all.  If the meaning of the law and the Constitution were obvious, why bother even having a third branch of government in the first place?

No, the output of our elected representatives—and the Founding Fathers themselves—is always a matter of debate, and every legal mind in America views the law in his or her own way.  As Bill Maher once tartly observed, the law means whatever a lawyer says it means.

I underline this point in light of a highly informative recent column by Adam Liptak in the New York Times, which asserts that the U.S. Supreme Court today is more divided along ideological and partisan lines than it has ever been in its history.  That is to say that, in the most contentious cases (with the rare exception), all the justices appointed by Democratic presidents wind up on one side of the ruling, while all the Republican appointees land on the other.  Liptak adds that this chasm “reflects similarly deep divisions in Congress, the electorate and the elite circles in which the justices move.”

The implication of this fact (assuming it’s true) is that the American people will become increasingly justified in viewing the Supreme Court’s actions as nothing more than “politics by other means,” thereby losing whatever residual faith they have that the institution is either separate from or independent of the two other branches of government, which it is meant to check and balance.

“The perception that partisan politics has infected the court’s work,” Liptak writes, “may do lasting damage to its prestige and authority and to Americans’ faith in the rule of law.”

While I do not doubt Liptak’s (and others’) assessment that the high court’s recent tendency to rule along party lines is something more than a coincidence—and also that this development is an historical anomaly—I do not see why we should necessarily regard it as regrettable or surprising.

To wit:  Why should we expect a given justice not to possess some sort of “ideology” when it comes to doing his or her job?  While there are many justices, past and present, who insist that their legal views are informed only by the law itself, these self-appointed paragons of purity are being too clever by half.

When two people look at the same case and reach completely opposite conclusions, one of two things must be true.  Either one of the two people is mistaken, or there is more than one way to apply the law to a given situation.

Assuming the latter (although many would prefer to assume the former), we are left to speculate as to what might lead two intelligent people to view the law differently.  My answer is that they simply have different ways of seeing the world—influenced by their particular life experiences—and we call this point of view an “ideology.”

Recently, for instance, the Supreme Court ruled, 5-4, that a local town board did not violate the Constitution by beginning each session with a prayer.  The majority argued—correctly—that the First Amendment does not explicitly prohibit all prayer in public places, while the minority insisted that the “wall of separation” between church and state, as articulated by Thomas Jefferson in 1802, is necessarily broken whenever sectarian religion is introduced—even though the Constitution does not say so.

The above can be seen as different readings of the First Amendment, both completely defensible and endorsed by hefty minorities of the American public.  But they are also indicative of divergent ideologies—in this case, an ideology of narrowness (“the Constitution says X”) versus an ideology of broadness (“in saying X, the Constitution means Y”).

It is my view that so long as a coherent legal argument can be made to support any ruling in any case, it should not concern us precisely how that argument materialized in its speaker’s head.

Everyone has a worldview that informs how they think and live their lives, and Supreme Court justices are no different.  This is true whether we admit it or not, so we might as well admit it.

Declining to Serve

Is patriotism overrated?

I do not mean the sort of patriotism that leads one to join the U.S. Marine Corps, or to petition the government for a redress of grievances.

Rather, I speak of the impulse of certain people to take high-ranking positions in the government out of a sense of patriotic duty, at the expense of all other considerations.

I ask because I happened upon a lengthy profile of former Treasury secretary Timothy Geithner in Sunday’s New York Times.  In this feature—written by Andrew Ross Sorkin, in conjunction with Geithner’s forthcoming book, Stress Test: Reflections on Financial Crises—there is an intriguing passage detailing the moment in 2008 when then-candidate Barack Obama approached Geithner about possibly being his Treasury head.

By Geithner’s (and Sorkin’s) telling, the meeting amounted to Geithner essentially making the case against himself, explaining point-by-point why he was the wrong man for the job.

The punch line, as it were, is that in the fullness of time, Geithner’s reservations proved entirely correct.

“I’ve been up to my neck in this crisis,” Geithner says he told Obama at the time, referring to his role as president of the Federal Reserve Bank of New York amidst the 2008 Wall Street meltdown and the resulting government bailouts.  “You’re going to have a hard time separating me from these choices if you ask me to work with you.”

“[B]y appointing Geithner, one of the main architects of the rescue strategy,” Sorkin explains, “the president would essentially be forced to endorse the bailouts, which could have negative political consequences.”

Could they ever.

Even without the benefit of hindsight, Geithner saw his nomination as fraught, and he makes clear now that he did not “want” the position in any meaningful sense.  And yet when it was formally offered, he accepted it all the same.  In the words of Rahm Emanuel, Obama’s then chief of staff, “[Geithner] said he wasn’t going to lobby for the job or put a campaign on for it, but obviously, if asked, he would serve.”

If asked, he would serve.  Famous last words.  Indeed, it recalls the explanation by Jon Huntsman, the Republican former governor of Utah, about becoming America’s ambassador to China under Obama, a Democrat:  “The president asked me, the president of all the people.  And during a time of war, during a time of economic difficulty for our country, if I’m asked by my president to serve, I’ll stand up and do it.”

When he said this in the spring of 2011, Huntsman was plotting to run against Obama in 2012 and needed to neutralize the inevitable charge within the GOP of being a little too intimate with the enemy.  (As for how well that strategy worked out, the polls speak for themselves.)

On the one hand, “if asked, I will serve” is an admirable formulation—a means of “putting country first” and sacrificing such things as quality time with your family or a seven-figure gig in the private sector.

Conversely, this altruistic attitude is oftentimes utter nonsense—a bit of false modesty acting as a rhetorical shield for high ambitions and starry eyes.  After all, to be an ambassador or Cabinet official might not yield great wealth or prestige, but it’s one heck of a résumé-booster.  Then there’s the old Samuel Johnson line, “Patriotism is the last refuge of a scoundrel.”

Taking Geithner at his word—that he genuinely viewed himself as ill-suited for Treasury secretary and accepted it only out of a sense of obligation—we are left to wonder whether he should have declined the offer, and, more broadly, whether “if asked, I will serve” is a lousy reason to work in government in the first place.

To wit:  If you’re unsure of your fitness for a particular post, wouldn’t it be more “patriotic” to just say no?  If your involvement in the executive branch could do more harm than good, wouldn’t it be in the country’s best interest for you to resist the call of duty?

To enter public service for its own sake is a wonderful sentiment, and it has produced innumerable good works over the centuries.  However, government and politics ultimately depend on old-fashioned competence and cold-eyed realism, and patriotism is not always sufficient to pull those things off.

In some circumstances, putting country first means putting yourself last.

Build Up This Wall

My fellow Americans:  Is it really that hard to separate church from state?

I admit that for an atheist like me, the challenge of keeping my religious convictions to myself in public is no challenge at all.  I have no religious convictions in the first place, and thus no church from which to part ways when involving myself in the affairs of state.

I understand that for the supermajority of my countrymen for whom God and/or religion play a significant role, this is not such an easy task.  I understand that one’s sincerely held articles of faith cannot simply be checked at the door upon leaving one’s house—any more than can my own view that God is neither real nor necessary for leading a virtuous life.

And yet I nonetheless wonder why America’s believers are so very insistent upon foisting their godliness upon us nonbelievers—we, who would rather be left to practice our devil worship in peace.  Is it really too much of an imposition to keep one’s religion within one’s own heart without introducing it into the public square?

Apparently so.

The reason I ask, you see, is because of this week’s Supreme Court decision in Town of Greece v. Galloway, in which the court ruled, 5-4, that a town board could begin its public sessions with a prayer without violating the First Amendment to the U.S. Constitution.

In the upstate New York town of Greece, legislative meetings have indeed kicked off with a formal prayer, led by some chaplain or other, since 1999.  The town says it welcomes religious adherents of all faiths (and atheists) to deliver these invocations.  In practice, however, they have been overwhelmingly Christian.

Consequently, two residents of Greece—Susan Galloway, who is Jewish, and Linda Stephens, who is an atheist—issued a formal complaint, which was dismissed by the U.S. District Court in 2010, but reversed in 2012 by the U.S. Court of Appeals for the Second Circuit, which ruled that Greece’s tradition violated the Establishment Clause of the First Amendment.  This past Monday, the U.S. Supreme Court found precisely the opposite, and that is where we now stand.

The Establishment Clause to which we refer reads simply, “Congress shall make no law respecting an establishment of religion.”  Thomas Jefferson famously elaborated on this concept in a letter in 1802, writing of a “wall of separation between Church and State,” justified by the fact that, in his view, “religion is a matter which lies solely between Man and his God.”

In this context, one cannot help but wonder how the Greece case is not a textbook example of religion intruding where it plainly does not belong.

The high court’s majority view, as articulated by Justice Anthony Kennedy, is that the prayers at Greece’s board meetings serve a purely ceremonial role—they are “meant to lend gravity to the occasion and reflect values long part of the Nation’s heritage,” as Kennedy put it—and, as such, cannot reasonably be seen as an endorsement of any particular religion or, indeed, any particular doctrine therein.  No one is being coerced into accepting the chaplains’ remarks as gospel, so to speak, and (as previously noted) no member of a minority faith is excluded from participating.

I am prepared to accept all these premises as true, but I would nonetheless argue that it ultimately doesn’t matter.

To wit:  The fact that a chaplain’s prayer at the start of a government session does not actively seek to convert its audience does not negate the fact that it is a plainly religious act being performed in a government building, on government time, with the full endorsement of the government.

If it really is just a symbolic gesture to “lend gravity to the occasion,” why not replace it with a piece of prose equal in significance but without the problematic sectarian bent?  Say, the Preamble to the Constitution or the Pledge of Allegiance (albeit without the meddlesome “under God”)?  Why does this invocation need to have religious overtones, when our country’s founding documents so strenuously caution against mixing church and state?

It is worth recalling that the Virginia Statute for Religious Freedom, drafted by Thomas Jefferson in 1777 and later a template for the First Amendment, was enacted in explicit opposition to an initiative by Patrick Henry for the government to financially support all churches, rather than none at all.

Henry lost this argument in Virginia, yet his scheme sounds an awful lot like what the good folks of Greece, New York—and, thanks to the Supreme Court, any locality in America—have just won for themselves.  Namely, the notion that so long as all faiths are accorded equal moral weight and consideration (if only in theory), the principle of removing religion from government need not be strictly enforced.

It might sound like a sensible idea, but it sure ain’t in the Constitution.