It’s the Court, Stupid

There was a moment last week—thankfully, it was only a moment—when American liberals’ hearts stopped and it felt like the world was about to end.

It came when the U.S. Supreme Court announced that Associate Justice Ruth Bader Ginsburg had recently undergone radiation treatment for a tumor in her pancreas—the latest in a long line of cancer scares for Ginsburg going back several decades. (She was diagnosed with colon cancer in 1999 and pancreatic cancer in 2009.)

While this most recent brush with mortality apparently ended well—“The tumor was treated definitively and there is no evidence of disease elsewhere in the body,” the court said—it served as a reminder—which we most certainly needed—that, at 86, the Notorious RBG will not be on the Supreme Court forever; that she is as susceptible to the ravages of age as the rest of us; and that her long and storied history of cheating death will one day come to an end.

Sooner or later, one way or another, Justice Ginsburg will be forced to relinquish her seat on the Supreme Court, enabling the then-president to nominate a successor—someone who, in all likelihood, will serve for the next 30 or 40 years.

As four out of five actuaries will tell you, that president will be Donald Trump.

Consider: Beyond Ginsburg’s own series of health calamities, only three Supreme Court justices in history have lived longer while on the bench than Ginsburg already has. Should Trump be defeated in 2020, Ginsburg would be two months shy of 88 when the new president is sworn in, at which point she could safely retire without the court’s center of gravity swinging irreparably to the right.

But if Trump is re-elected and serves until January 20, 2025? Well, what’s 88 plus four?

Did I mention that Stephen Breyer, the other long-serving liberal on the court, is just five years younger than Ginsburg and possibly less indestructible than she is?

I bring all of this up for one exceedingly simple reason: While the 2020 election may come to signify any number of things—about America, about democracy, about the future of Western civilization writ large—it will most assuredly determine the composition of the Supreme Court for a generation or more, and there is no more compelling reason for left-leaning voters to support the eventual Democratic nominee than that.

Long story short: The re-election of Trump all but guarantees a 7-2 conservative majority on the nation’s highest court. Just for starters, that means the disintegration of Roe v. Wade; the end of Obamacare as we know it; the solidification of the so-called “unitary executive theory,” whereby the president can do pretty much whatever the hell he wants for any reason. It means further erosion of the Voting Rights Act and firmer entrenchment of unchecked voter suppression. It means LGBTQ equality is no longer guaranteed but corporate personhood is. It means guns for all and unions for none.

It’s the great flaw of the Democratic Party (among many others) that its leaders can’t turn these dire, self-evident truths into a foundational election year issue—that they can’t seem to impart the monumental importance of the judicial branch in Americans’ day-to-day lives, and the singular role the president plays in shaping the composition thereof.

You know who did understand this dynamic and communicated it repeatedly, and to great effect, in 2016? Donald Effing Trump.

For all his blabbering, unprincipled incoherence on the campaign trail, candidate Trump made it crystal clear at every available opportunity—particularly when his back was against the wall and it looked like his entire candidacy was going up in smoke—that a vote for him was a vote for a right-wing judiciary from one end of the federal government to the other. That if Republicans entrusted him with control of the executive branch, he would bequeath them an unimpeachably conservative roster of judges—all with lifetime appointments—in return.

It was a brazen quid pro quo of the first order, and boy oh boy, did he deliver.

Ask a certain breed of conservative—the sort who found Trump by turns offensive, odious and embarrassing—why he held his nose and voted for him anyway, and he’ll simply rattle off two names: Neil Gorsuch and Brett Kavanaugh.

That’s to say nothing of the president’s myriad appointments to the all-important circuit courts, filling vacancies that Senate Majority Leader Mitch McConnell cynically—and, in retrospect, brilliantly—kept open while Barack Obama was in office.

This is neither to excuse nor justify the conscious enabling of an authoritarian, racist windbag by millions of voters who supposedly knew better.

Rather, this is to remind Democratic presidential candidates and their advocates that scaring their own voters about the future of the Supreme Court is an entirely valid and potentially fruitful strategy, and if self-preservation is an instinct they possess—a debatable question, at best—they could do a lot worse than to order a few million yard signs reading, "Democrats 2020: Because RBG Isn't Getting Any Younger."

Night and Day

If there is one thing I have learned for sure about Hillary Clinton, it’s that she is both better and worse than everyone seems to think.

Worse because of her ongoing paranoia, deceit and iron-fistedness vis-à-vis her quest for the Oval Office.

Better because of her wit, intelligence, compassion and jaw-dropping stamina as they relate to the exact same goal.

In the spring of 2008, I wrote an op-ed for my college newspaper in which I petulantly griped about how Hillary Clinton has a way of getting under your skin even as you find yourself agreeing with most of what she stands for.  How her single-mindedness and love-hate relationship with rules and facts tend to overshadow her finer qualities, even for those who are otherwise prepared to accept her as the standard-bearer for the Democratic Party.

Re-reading that article seven-and-a-half years later, I am somewhat alarmed by how well it holds up.  While my writing has matured (arguably), my hang-ups about a potential President Clinton Part II were pretty much exactly the same then as they are now.  They include:  Her penchant for making up stories when the truth is readily available for all to see; her brazen disregard for the rules whenever they are inconvenient; and her tendency, in any case, to exacerbate the little scandals that pop up whenever she is in power, invariably by blaming the whole thing on her would-be enemies, be they Republicans, foreign governments or a White House intern.

All of those quirks still apply, and must forever be held in consideration when one endorses Clinton for president or any other office.  As ever, a vote for Hillary is a vote for all the baggage that comes with her.  And that's before we get into the issues that involve actual substance.  As the enduring success of Bernie Sanders demonstrates, there remains a sizable minority of Democratic primary voters who consider Clinton the wrong candidate at the wrong time and who, should she become the party's nominee, might even stay home on Election Day rather than pull the lever for her.

Against all of that, however, I come bearing news:  Politics has changed a lot over the last two election cycles and we no longer have the luxury of voting only for candidates we like.  When and if we make it to November 8, 2016, most of us will be faced with two people whom we don't particularly want to be president, but we'll need to choose one of them all the same, because that's how elections work.

I know:  This sounds like a “lesser of two evils” lecture.  It’s not, because presidential campaigns are not a choice between two evils.  Deciding to ally with Stalin against Hitler—that was a choice between two evils.  When we vote for a commander-in-chief, the decision is between not just individuals, but two opposing philosophies of how to run the government of the most important republic in the world.  There’s nothing evil about it, but the choice is stark nonetheless—now more than ever before.

If you think there is no meaningful difference between Republicans and Democrats, you’re not paying close enough attention.  If you’re unwilling to vote for either because their candidates just aren’t perfect enough, you’re a child and a fool.

Last Saturday’s Democratic debate drew only a fraction of the audience of any GOP contest this year.  That’s a real shame, because, if nothing else, it affirmed Bill Maher’s observation in 2008 that to see both parties talk, it’s as if they’re running for president of two completely different countries.

Case in point:  At the most recent Republican forum, you would be forgiven for thinking that 9/11 happened yesterday and that terrorism is the only thing worth caring about when it comes to the welfare of the United States and its citizens.  It was practically the only subject that came up, while such things as the economy, health care, infrastructure and even immigration received little more than a passing shout-out from any of the nine candidates.

The Dems spent plenty of time on terrorism, too—the San Bernardino massacre made it unavoidable—but they placed equal, if not greater, emphasis on subjects that are—let's be honest—considerably more urgent and germane to all of us at this moment in time.  Along with the issues I just mentioned, these included gun control, race relations, income inequality, college affordability and the fact that America's prisons are overstuffed with people whose only "crime" was getting high and having a good time.

This isn’t your ordinary, run-of-the-mill disagreement over national priorities.  This is a dramatic, monumental clash over whether the only thing we have to fear is fear itself.  The whole GOP platform has been reduced to, “Be afraid all the time, because you could die at any moment,” while the Democrats act as if tomorrow might actually come and we might as well live and govern accordingly.

Is this the lowest bar we’ve ever set in the history of presidential elections?  Possibly.  Indeed, it’s downright depressing that the very act of governing is no longer seen as a given for anyone in public office.

What is far more depressing, however, is that so many citizens seem to think it doesn’t matter which party is in charge, or that both parties are equally at fault for all of the preventable problems that have occurred throughout the Obama era.  Neither of those assumptions is true, and there are tangible consequences to thinking otherwise.

Care for some examples?  Listen to the GOP’s own rhetoric:  If a Republican is elected president next year, it means the Affordable Care Act is in danger of actual repeal, as is the nuclear agreement with Iran.  It means reversing climate change is no longer a priority, along with the rights of black people, gay people, poor people, women, immigrants, Muslims and refugees.  It means the Supreme Court will net at least one conservative justice, which could easily lead to decisions adversely affecting all of the above and more.  It means our “war” against ISIS will almost certainly escalate to include actual boots in the sand, and God knows what impact that’ll have on our national debt (to the degree that anyone cares).

I realize, of course, that America’s conservatives would be thrilled by such results, but that’s not really who I’m talking to right now.

No, I would mostly just like to remind my fellow leftists that there is a limit to what your disgust with “establishment” Democrats like Hillary Clinton can accomplish.  Clinton is most certainly a flawed candidate, and a flawed messenger for the liberal view of good governance.  She is plainly compromised by her close relationship with the financial industry and remains insufficiently skeptical of large-scale military interventions in the Middle East.  She hasn’t yet mastered the art of damage control and offers little assurance that she won’t create more damage in the future.  A second Clinton presidency would guarantee a fair share of political nonsense from the day she arrives to the day she leaves.

Know what else it would guarantee?  Health insurance for tens of millions of people.  Funding for Planned Parenthood.  Increased protections for the LGBT contingent.  A more liberal Supreme Court.

And it would guarantee our first female commander-in-chief.  Sure, I know we’re supposed to be a meritocratic society that doesn’t care about race, sex, etc., but let’s not pretend that following our First Black President with our First Woman President wouldn’t be unimpeachably gratifying.  We already know beyond doubt that a woman can manage a country at least as well as a man—perhaps you noticed that, for the last 10 years, one such woman has been more or less running all of Europe—but wouldn’t it be great to have it actually happen here?

Of course, none of this matters during the primary phase of the campaign, where we are now.  So long as Democratic voters still have a legitimate choice between Clinton and Bernie Sanders (and, I suppose, Martin O’Malley), they have every obligation to argue about which option makes the most sense for where the party ought to be, and that choice is always a balance between ideological purity and perceived electability.  If you want Sanders as your nominee, you’d best make your case now, before it’s too late.  (I’ve already made mine.)

But should time run out and your preferred candidate lose, realize that our whole electoral system operates on the principle that the party is ultimately more important than any individual within it, which means a great number of people will be forced to compromise some of their deepest-held beliefs in the interest of party unity—because it's better to support someone with whom you agree 60, 70 or 80 percent of the time than to ensure victory for someone with whom you agree not at all.

If total ideological alignment leads to total electoral defeat, then what good did those principles do you in the first place?  Republicans have been learning this lesson continuously since the moment President Obama was elected.  Are Democrats on the verge of making the same stupid mistake?

National Shame

Should national pride still be a thing?  Is it good for us to eternally sing America’s praises, or should we finally give it a rest?

For no particular reason, this question has risen to prominence in recent weeks, in the form of a handful of disparate events across our great (or not great) land.  We’ve covered this territory before, but it would appear the issue has not yet been resolved.

It began at the University of California, Irvine, where the student-led Legislative Council voted to ban the display of the American flag—or any other flag—from the lobby of its building.

In a lengthy resolution explaining its decision, the group noted that because the U.S. flag "has been flown in instances of colonialism and imperialism" and because "flags construct paradigms of conformity and set homogenized standards," it is necessary to remove said flag in order to foster "a culturally inclusive space."

In other words, the American flag is inherently nationalistic and exclusionary, representing America's sins while pretending to extol its virtues.  As such, to display it is to tacitly condone the entirety of American history—slavery, genocide and all.  As far as the UC-Irvine Legislative Council is concerned, we'd be better off without it.

As it happens, the anti-flag resolution was vetoed by the student government’s Executive Cabinet two days later, following an uproar that led the school’s administration to condemn the original motion as “misguided” and “not endorsed or supported in any way by the campus leadership, the University of California, or the broader student body.”

So the controversy is over at Irvine, but it certainly isn’t over everywhere else.  And it shouldn’t be, because several of the Legislative Council’s assertions about the flag, and patriotism in general, were absolutely correct, and we might as well fess up to them.

Like it or not, national flags are symbols of a particular set of ideals that, by definition, do not necessarily encompass the values and experiences of every last individual.

Like it or not, the Stars and Stripes do represent the totality of the United States as a nation and an idea, dating back at least to 1776, if not 1607 or even 1492.

And like it or not, the story of America is an ugly one—a veritable horror show of racism, religious intolerance, ethnic hatred, extermination of Natives, subjugation of women and—as the Irvine group noted—the practice of imposing our way of life on foreign populations that did not ask for it.

Are we sure we’re proud of this?  Is it worth even implying that we are?

Sure, most people do not include any of the above when enumerating the reasons America is a great country.  When we talk (or sing) about being “proud to be an American,” we’re just thinking about the good things:  The First Amendment, free enterprise, due process, trial by jury, the Super Bowl, apple pie and so forth.

When it comes to the bad things that make America America, we compartmentalize and rationalize—two of our finest national traits—by insisting that while the United States has committed plenty of sins, they occurred a long time ago and we have learned our lesson and corrected course.  Done and done.

It is certainly appealing to think that the story of the United States is one of constant positive evolution—a breaking away from all our old habits into the actual beacon of liberty we have always claimed to be.  In this way, we regard our country like we do a child who does something wrong but then realizes his mistake and gradually becomes a better person.

The difference, however, is that children generally do not commit genocide against almost the entire Native American population, or systematically prevent all women and black people from voting.

It’s easy enough to be proud of America at its regal, idealistic best.  But it’s awfully hard to shrug away everything else without making yourself look like a damned fool or a mindless jingoist.

Don’t get me wrong:  I think it’s terrific and commendable that the United States has (mostly) abandoned its racist, sectarian past.  On matters of equality and civil rights, America has never been better.  That’s to say nothing of our superior technology, economy and armed forces.  We might not be the Greatest Country in the World in every category, but we’re well above average, and that’s something to be thankful for.

But as we are reminded every time a white police officer shoots an unarmed black civilian—or when our government unlawfully records our phone calls and e-mails, or systematically tortures prisoners—our country and our culture are not half as perfect as our constant displays of patriotism would suggest.  Many of our national successes are little more than the clearing of a very low bar.

While we have a right to be satisfied with clearing any bar—especially when so many other countries are content not to—we should more readily acknowledge our limitations and residual imperfections, and the fact that we’re not nearly as superior to the rest of the world as we think.

Perhaps this is what inspired the other recent micro controversy on this subject:  The debate at Lexington High School in Lexington, Massachusetts, about whether the theme of an upcoming school dance should be changed from “American Pride” to “National Pride,” so as to accommodate students whose families hail from other countries.

As with Irvine, Lexington’s irrepressible patriotism prevailed and the “American Pride” dance will go on as scheduled.  However, the very fact that there was a scuffle about it—in the town where the Revolutionary War began, no less—suggests that the notion of a more introspective and humble America is alive and well.

But we are left with the problem of national pride itself—regardless of which nation we’re talking about—and whether it should still exist.

I think the concept is silly and absurd, and that George Carlin was onto something in saying, “Pride should be reserved for something you achieve or attain on your own, not something that happens by accident of birth.”

In other words, even if America really were perfect, to be "proud" of it would imply that you, personally, had something to do with creating that perfection.  Since neither of those things is the case, all you're really saying is that hundreds of years of trial-and-error living and governing by hundreds of millions of people have made America a really nice place to live, and you're extremely happy that you happen to live here, too.

So why not just say that?  Why not be grateful for having the unbelievable luck of being born into a free, multicultural, pluralistic society and leave it at that?  Why get all uppity and arrogant about it, as if it’s necessary to assert something over and over again in order for it to be true?

To get a sense of how unappealing this sort of mindset can appear to outsiders, look no farther than Texas, where the Supreme Court is about to decide whether the Confederate flag can be stamped onto state license plates.  The case is being brought by the group Sons of Confederate Veterans, which claims that its free speech rights were violated when Texas refused to issue specialty license plates bearing the controversial Southern emblem.

As in past squabbles over whether the symbol should appear in public, the Sons of Confederate Veterans argues the flag represents “sacrifice, independence, and Southern heritage,” while opponents say it represents slavery and racism.

And of course, both sides are correct.  As the best-known emblem of the Confederacy, the flag embodies a group of states that held millions of black people in bondage for centuries, right up until a terrible war put an end to it once and for all.

How is this any different from the American flag and what it represents?

We Northerners like to claim moral superiority on the grounds that we always opposed slavery while so many Southerners seemingly still won't disavow it.  On the other hand, groups like Sons of Confederate Veterans are adamant (however dubiously) that their continued "pride" has nothing to do with slavery, and meanwhile, although few slaves remained outside the South by 1865, Northern states profited handsomely from the slave trade and have hardly been immune to racial tension ever since.

We shouldn’t let ourselves off the hook so easily.  It’s unseemly and it’s unwarranted.  There’s a reason that pride is one of the seven deadly sins, while shame is not.

Blue Sunday

Why should the government tell me when I ought to buy booze?  Why should the government dictate when businesses are permitted to sell booze?

Alcohol is a legal commodity.  Why should the government be involved at all?

I’m not talking about age limits, which are probably necessary and which the state can be said to have a “compelling interest” in enforcing (to use the legal jargon).

No, I speak of the tradition whereby a state or local government can restrict the hours during which local businesses—namely, liquor stores—can sell alcoholic beverages to their customers, necessarily abridging people’s purchasing power and, to a degree, cutting into those businesses’ profits.

Specifically, I refer to the famous “blue laws” in the commonwealth of Massachusetts, which most concern me because, well, that’s where I live.

Until this past week, no Massachusetts package store ("packies," we call them) could sell alcohol before noon on Sunday.  This regulation was established in 2003, amending a previous law that prohibited Sunday alcohol sales outright.

On Tuesday, however, both houses of the state legislature approved a further loosening of this statute, moving up the opening liquor bell on Sunday from noon to 10 a.m., the hour at which bars and restaurants are already able to serve alcoholic beverages to their guests (mostly in the form of Mimosas and Bloody Marys, one assumes).  Whether or not Governor Deval Patrick ultimately signs this bill, state law will continue to stipulate that off-premises liquor sales not occur between 11 p.m. and 8 a.m. on the six remaining days of the week.

By no means is Massachusetts the only corner of America where alcohol cannot be bought and sold at all hours of the day and night.  Nearly all 50 states have such restrictions of one kind or another, be they statewide or on a town-by-town basis.  (A notable exception is Nevada, where it’s very nearly illegal to be sober.)

But the Sunday issue is a singular phenomenon, slightly separate from (and more interesting than) all other liquor laws in these United States, and also more specifically tied to the history and sensibilities of old-fashioned New England.

While no one can quite agree on the origin or exact meaning of the term “blue laws,” the concept arose for unambiguously religious reasons.  Blue laws were, in the first instance, a means of enforcing the Fourth Commandment, “Remember the Sabbath day, to keep it holy.”

Accordingly, the term encompasses proscriptions on all manner of formal activity performed on Sunday, when everyone is supposed to be at church.  These include prohibitions on selling cars, opening grocery and department stores, hunting game and, for a short time in one part of New Jersey, “singing vain songs or tunes.”

While a great many of these ordinances have since been relaxed or abolished, some are still on the books, particularly regarding booze.

Crucially, the rationale for them has evolved from its Biblical roots, since any attempt to regulate business practices based on a few lines from Exodus would be seen today as flatly unconstitutional.

Instead, we have the Supreme Court asserting, in the 1961 case McGowan v. Maryland, “The present purpose and effect of most of our Sunday Closing Laws is to provide a uniform day of rest for all citizens […] [T]he fact that this day is Sunday, a day of particular significance for the dominant Christian sects, does not bar the State from achieving its secular goals.”

In other words, the government is within its rights to mandate that certain businesses take a day off, on the grounds that it is in the best interests of everyone—believers and nonbelievers alike—for them to do so.

And so the question becomes:  Is this rationale good enough to merit the forced halting of free enterprise during certain designated hours?  What “secular goals” are we talking about, anyway?

Is it to protect the lowly employees of these establishments from being overworked?  Tell that to the minimum-wage laborers at 24-hour Walmarts and IHOPs, which somehow manage to evade such regulation of their business hours.

Is it to stop people from drinking too much, with the added assumption (to borrow an adage from How I Met Your Mother) that no good purchasing decisions are ever made after 2 o'clock in the morning?  The argument seems reasonable and desirable on its face, until you begin to apply the lessons of Prohibition, which include the fact that the most surefire way to get people to drink is to make it difficult for them to do so.  That alcohol today is so widely available only reinforces the peculiarity of thinking its effects can be reined in by locking the cash register for a few hours every week.

Which returns us, in a way, to my original question:  What’s the point?

So long as booze remains a legal product, and so long as individuals continue to enjoy it, you cannot physically prevent them from doing so, and you probably shouldn’t try.

Those who want to take a “day of rest”—at church, at home or anywhere else—are free to do so.  No one is stopping them.

As for people who would love to take Sunday off but can’t because they have to work:  I’m afraid the shuttering of package stores will not be of much help in this regard (except, of course, for those who work in one).

Meanwhile, come September there will be a significant chunk of the American public for whom the prospect of Sunday-as-Sabbath means precisely one thing:  NFL football.  And you know what really complements a nice, relaxing day of game-watching—say, something cold and refreshing that you could pick up on the drive over to your friend’s place?

Here in Massachusetts, let’s say it’s probably a member of the Adams family.  And I don’t mean the presidents.

The Right to Hate

I have no evidence that the Westboro Baptist Church is secretly a pro-gay rights organization masquerading as a gang of religious extremists in order to make anti-gay groups look ridiculous.

However, if such a cheeky cabal were formed, I suspect it wouldn’t look a heck of a lot different.

For the past many years, the Westboro Baptist Church has served two essential purposes in American public life.  First, to be arguably the most universally detested organization in our 50 states united.  And second, to ensure, beyond all doubt, that the First Amendment to the U.S. Constitution is as healthy and muscular now as ever it has been.

To review:  The WBC are the folks who shuttle from place to place wielding signs with such heart-warming messages as “God Hates Fags,” “God Hates America” and “Thank God For Dead Soldiers.”  Most of its members are related, either by blood or marriage, to its founder and patriarch, Fred Phelps, who died on March 19, at age 84.

The group is perhaps most notorious for its practice of picketing the funerals of U.S. soldiers, who it claims were killed as a consequence of America's tolerance for homosexuality, among other things.  This ritual led to a Supreme Court case, Snyder v. Phelps, in which the Court ruled in favor of the church in 2011, holding that picketing a funeral is a form of free expression protected by the First Amendment.

While the death of Fred Phelps does not necessarily mark the demise of the Westboro Baptist Church itself, it may well hasten its diminished presence in the public eye.  As such, we might entertain the notion of referring to the WBC in the past tense, if only for its cathartic effects.

On this subject, I have but one question:  On balance, has the Phelps family been good for America?

My answer:  Yes, but it’s complicated.

I say the WBC is the most hated organization in America—a fairly uncontroversial sentiment—but we might also say it has come by this distinction rather lazily, as far as generating mass hatred goes.

After all, what could be more of a “slam dunk” in the quest for amassing public scorn than to spit on the graves of fallen soldiers and to craft placards with the sort of radioactive language that leads even those who otherwise agree with you to recoil in disgust?

The WBC can be accused of being any number of things, but subtle is not one of them.

Quite to the contrary, they are cartoon characters—hysterical, childish, simplistic, ideologically absolutist to an extent previously not thought possible, and—surprise, surprise—completely convinced of their moral rightness on all fronts.

Indeed, the more time one spends reading the WBC's various statements on matters of public import, the more one feels precious seconds of one's life being irretrievably wasted.

In other words, the WBC seems to incite the world’s rage and indignation for their own sake, as if it were all one big piece of performance art.  As such, the church can hardly be taken seriously in the first place.  To coin a phrase:  Its antics are not worth dignifying with a response.

Yet we have done exactly that, be it through satire and counter-protests, or in the case of people like Albert Snyder, through lawsuits alleging the infliction of deep emotional distress.

And we cannot blame some folks for taking WBC at face value, since its views do not exactly come from nowhere.  In point of fact, the church's basic beliefs about homosexuality are drawn directly from the Old Testament, and its musing that God kills Americans as punishment for homosexuality is an almost word-for-word plagiarism of Jerry Falwell's infamous explanation for the attacks of September 11, 2001.

In any case, the church’s flagrant ridiculousness has proved exceedingly useful in reminding us that enforcement of the First Amendment can be a very nasty business, since the right to free expression must be extended even to those whose views no one else on planet Earth wishes to hear.

In this way, the Phelps family’s victory at the Supreme Court was a great relief, because it demonstrated that—at least in this case—our federal institutions still take the Bill of Rights seriously.  That our most sacred liberties apply even to those who probably don’t deserve them.  Yes, even organizations like the Westboro Baptist Church, which expresses nothing but scorn toward the very country in which these liberties are practiced.

For better and for worse, that is what America is all about.

What’s the Use?

The U.S. Supreme Court recently struck down Section 4(b) of the Voting Rights Act of 1965, which was the provision that singled out several states and counties—mostly in the Deep South—for special scrutiny regarding how they conduct their voting procedures, requiring that any changes thereto be cleared by the Justice Department.

The presumption—borne out by facts—was that such designated states and counties had rigged their voting rules, usually through literacy tests or poll taxes, in order to suppress the votes of black people and other minorities, a practice the Voting Rights Act was designed to stop and prevent.

The essence of the Supreme Court’s argument in invalidating Section 4(b) is that the law has achieved its desired purpose and today is no longer just or necessary.  It is, rather, a mere relic of a bygone time.

While the accuracy and implications of this judgment have, themselves, fallen under careful scrutiny—can states such as Texas and Alabama truly be trusted not to disenfranchise certain voters?—the suggestion that a law that was once legitimate can become illegitimate, and thus subject to repeal, is an intriguing one and worth pondering further.

As a test case, let us consider the Third Amendment to the U.S. Constitution.

Admittedly, not many people at present spend much time pondering the Third Amendment, at least not compared with Amendments One and Two.

That, in so many words, is my point.

The Constitutional clause in question—you may well have forgotten—concerns the “quartering” of troops in the homes of U.S. citizens.  It reads, simply, “No Soldier shall, in time of peace be quartered in any house, without the consent of the Owner, nor in time of war, but in a manner to be prescribed by law.”

The background of, and justification for, this provision in the Bill of Rights is the pair of Quartering Acts, enacted by the British Parliament in 1765 and 1774, respectively, which stipulated that British troops could—and, in the proper circumstances, shall—be housed in private residences in the American colonies, whether the rightful owners of those residences liked it or not.

The latter iteration of this legislation was condemned by rebel American colonists as one of the so-called “Intolerable Acts” that were imposed as a response to the Boston Tea Party.  Among the grievances in the Declaration of Independence was an attack on King George III for “quartering large bodies of armed troops among us.”

Accordingly, the Third Amendment can reasonably be seen as a direct response to a particular problem of a particular time.  Given the realities of the epoch in which we now reside, we are entitled to ask:  Has this law outlived its usefulness and relevance to the maintenance of the American republic, and therefore made itself a candidate for repeal?  In the 21st century, is the Third Amendment nothing more than a solution in search of a problem?

Divorced from its original context and framed strictly in relation to the rest of the Constitution, the amendment and the rights it guarantees would seem to have been taken care of elsewhere.

The principles underlying the prohibition of quartering, we might agree, are the right to privacy and the protection against unreasonable search and seizure.  The former is addressed in the “Due Process Clause” of the Fourteenth Amendment, while the latter is explicitly guaranteed by Amendment Number Four.

Is the threat of U.S. troops being stationed in private housing any more complicated than that?  Were the proviso to be repealed, would anything actually change?

More than two centuries of American legal history prove the quartering amendment to have exerted extremely limited influence, indeed.  To date, only one federal court case has ever been directly shaped by it:  In Engblom v. Carey in 1982, the U.S. Court of Appeals for the Second Circuit held that striking New York correction officers had a Third Amendment claim when the state evicted them from employee housing and moved in members of the National Guard.

That’s about it.  Some amendment, eh?

It seems to me that if we are prepared, as we apparently are, to dismiss a central provision of the Voting Rights Act on the grounds that it no longer has any practical function in today’s world, it stands to reason that we might direct this attitude toward aspects of American law where it much more persuasively applies—not necessarily in the pursuit of justice, but simply in the pursuit of not clogging our law books with unnecessary piffle.

The Definition of Family

My mom and dad got married 28 years ago today, and in spite of all they have in common, they are still together.

On this final day of the most popular month of the year for weddings, let us reflect upon the rather momentous recent development in the history of marriage in the United States—namely, the twin Supreme Court rulings last Wednesday that struck down Section 3 of the Defense of Marriage Act and, in effect, rendered California’s Proposition 8 moot.

The first decision establishes the equal treatment of same-sex unions under federal law, while the second allows such unions to resume and flourish in America’s most populous state.

Asked for his reaction to these pronouncements, New Jersey’s governor, Chris Christie, reiterated his opposition to gay marriage and his view that the issue ought to be adjudicated by the public rather than the judiciary, offering the familiar trope, “You’re talking about changing an institution that’s over 2,000 years old.”

Indeed, this question about “changing the definition of marriage”—the cornerstone of the argument against gay marriage—is one that must always be addressed, and which holds particular interest for your humble servant at the present time.

In addition to my parents’ anniversary, this weekend sees a sizable family reunion on my mother’s side—a gathering, as such events tend to be, of all sorts of couples (and non-couples) marking their places within the larger family unit, which itself serves as a microcosm of the melting pot that is America.  In discussing the meaning of the country, considering the definition of marriage is inescapable.

For example:  While both my parents are Jewish, three of their four combined siblings married people who were not.  (One spouse later converted, but the others have retained their religion of birth.)  Under traditional Jewish law, these so-called “interfaith marriages” are invalid:  The Talmud expressly forbids them, and most rabbis refuse to officiate at interfaith wedding ceremonies in their synagogues.  Only through contemporary civil laws are they allowed to exist at all.

Another of my family’s matrimonial duos consists of my white cousin and her black husband, a union that in 1967 would have been illegal in 16 states and, in 1948, in 14 others.  What is more, American public approval of interracial marriage—Governor Christie’s standard for determining which marriages are legitimate and which are not—did not eclipse 50 percent until 1994.  In the early 1980s, when my cousin and her husband were born, only a third of their fellow citizens thought their eventual merger was a good idea.

(As a footnote:  Despite the Supreme Court ruling anti-miscegenation laws unconstitutional in Loving v. Virginia in 1967, the state of Alabama did not remove its own ban from the books until a 2000 ballot proposal to do so, which passed with 59 percent of the vote.)

This is all an illustration of a very simple point:  My family, as I know it, only exists because the definition of marriage has changed on several occasions within my parents’ own lifetimes, both in law and in the minds of the people.

When my uncle brought a nice Christian girl home for dinner for the first time, my traditional Jewish grandparents were slightly less than welcoming toward the idea, and toward her.  With time, however, they came to accept a non-Jew into the family, growing to love their daughter-in-law as their own.

A generation later, faced with a suitor for their granddaughter who was not Jewish and (gasp!) not even white, no objection was raised because no objection was felt.  He was a great guy, they were in love, and that was that.

Neither of my grandparents lived to see their great-grandson be born and quickly become the most delightful member of every family gathering, so it is left to the rest of us to appreciate the radical changes to a 2,000-year-old institution that allowed him to come into existence.

I recount this set of personal anecdotes in light of the latest turn in America’s understanding of marriage because I suspect mine is not the only family affected by such turns in the past.  To the contrary, I cannot imagine there are many families that have not been.

Accordingly, in a culture that has come to regard the phrase “changing the definition of marriage” negatively—people such as Christie use it as a slur, while members of Team Gay tend to avoid it altogether—I offer the humble proposal that, in light of the facts, we instead treat the concept as a necessary and welcome one, and something to which every one of us, in one way or another, literally owes his or her life.