Cold Clichés

There may be no greater cliché than talking about the weather.

If there is, it’s complaining about the weather.

And if there’s an even more ubiquitous cliché than that—at least here in the frigid Northeast—it’s bitching about the evident lack of the global warming we were promised.

We’re all familiar with the script.  The calendar turns, the wind blows, the mercury drops, the snow falls and everyone from Washington, D.C., to the Canadian border shouts in unison, “What the hell is going on here?”

That is, except on weekends like the one we just had, in which temperatures rose a solid 10-15 degrees above normal and we were treated to a brief, tantalizing preview of spring.

During which, of course, everyone turned to the unseasonably sunny heavens and shouted, “What the hell is going on here?”

That’s the thing about clichés.  They require no thinking at all.  Indeed, it is in the absence of critical analysis and rational deduction that they fester and thrive.

It would not seem a terribly arduous undertaking to grasp that some days are warmer or colder than others, that winter equals snow and summer equals heat, and that whatever Mother Nature happens to deliver on one day in one neighborhood is not necessarily representative of the entirety of planet Earth.

And yet we talk about inclement weather as if it were as mysterious as it was in the days before modern meteorology and climate science, because, well, what else are we gonna talk about?  Rain and snow bring us together as a people—they are concepts we can all relate to, because they affect each of us in one way or another.

This becomes a real problem, however, when the meteorological griping reaches epidemic levels and stands in direct conflict with the actual truth of the matter, leading large groups of people to believe something that just ain’t so.  Such as the belief, in this case, that climate change isn’t real.

From a crucial recent story in the New York Times, titled, “Freezing January for Easterners Was Not Felt Round the World,” we learn that for all the snow and cold spells the Acela Corridor has experienced, this winter isn’t even close to the Biblical anomaly most people assume it to be.

In my hometown of Boston, for instance, last month registered as the 29th coldest January in the past 95 years.  In New York City, it was the 23rd coldest in the same period.

Nationwide, according to the Times, the mean temperature last month was, in fact, below the historical average over the past century.  By one-tenth of one degree Fahrenheit.

Meanwhile, a far more germane (and alarming) statistic concerns the Earth as a whole, for which this January was the fourth-warmest—yes, warmest—on record.

While this might surprise those in the Eastern U.S., folks in the West likely feel the opposite, since states there have faced temperatures that really are extreme—that is, extremely warm.  Parts of California have suffered a crippling drought stretching back several years, while otherwise tundra-like Alaska has been outright balmy, with temperatures regularly besting those of locales several thousand miles to the south.

What’s really going on here—that was our original question, wasn’t it?—is explained in an equally crucial New York Times piece, “Freezing Out the Bigger Picture,” by science writer Justin Gillis, who tersely notes, “Scientists refer to global warming because it is about, well, the globe.  It is also about the long run.  It is really not about what happened yesterday in Poughkeepsie.”

As Gillis goes on to write, we amateur meteorologists tend to refer to “weather” and “climate” as if they are the same thing, which they most decidedly are not.  The effects and consequences of climate change can only be properly assessed and appreciated by examining the Big Picture, which means tempering the narcissism and ignorance that come with treating your own local habitat as a stand-in for the entire planet and for a few hundred years’ worth of observation and research.

Once you do that, you realize the term “global warming” has always been a misnomer, since the ecological mayhem to which we humans have subjected our home planet has taken far more complicated forms than merely making everything a little bit hotter.

Climate change, or whatever you wish to call it, is a problem of extreme conditions of every imaginable sort—not simply extreme heat or extreme cold.

As such, if we insist on carrying on about local weather patterns being not quite what we had in mind, let us cease acting as if they bear any immediate relevance to the broader trends of the wider world, lest we make ourselves look like complete idiots.

What’s the Matter With Kansas?

Just when you thought certain American principles were settled law, along come the good people of Kansas to argue otherwise.

In this case, it’s the principle that no persons in the United States can be denied service at a place of business purely on the basis of who they are.

According to the Kansas House of Representatives, however, an individual can be turned away in such a fashion, provided that the owner of such an establishment has a faith-based objection to that person’s genetic and/or cultural makeup.

Long story short:  Kansas wants to be able to discriminate against gays, on the grounds that God deems homosexuality an abomination and that its practitioners are therefore undeserving of equal treatment under the law.

To be precise, House Bill No. 2453 stipulates, among other things, that “no individual or religious entity shall be required by any governmental entity to […] [p]rovide any services, accommodations, advantages, facilities, goods, or privileges,” if doing so “would be contrary to the sincerely held religious beliefs of the individual or religious entity regarding sex or gender.”

The bill passed in the Kansas House on February 12 by a score of 72-49.  However, it died in the State Senate this past Tuesday, due to a lack of support in that chamber, although backers have vowed to resurrect it in some form.  Other states have introduced similar bills in recent months, including one in Arizona that passed both houses of the state legislature this past week.

We have faced this “religious exemption” question for a long time.  It has featured prominently in the debate over the Affordable Care Act’s provision regarding employer-funded contraception.  (To wit:  Should an employer be compelled to pay for employees’ birth control if he or she is morally opposed to birth control?)

On gay marriage and gay civil unions, a “religious exemption” has sometimes been included in a particular bill as a political compromise:  A state will agree to recognize same-sex marriages, but not force objecting clergymen to officiate over them.

However, bills like the one in Kansas represent a different animal altogether, owing to how far beyond the walls of synagogues, churches and mosques they reach.

Supporters of the Kansas bill insist their intent is merely to “protect” people such as florists and wedding photographers from effectively endorsing a practice to which they might object.  However, the language of the bill—not least the excerpt quoted above—would seem to cover any business owner of any sort who happens to find homosexuality repulsive.

As Andrew Sullivan rather arrestingly pointed out, had such a bill passed, the proprietor of a lunch counter could refuse to serve gay customers on account of their gayness, and the government could do not a thing to stop it.  Does that sound like justice to you?

The underlying assumption in all this is that the First Amendment’s guarantee that “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof” is absolute.  That the government cannot prevent someone from engaging in an activity that his or her religion demands he or she perform.

Actually, it can.  And it does so all the time, albeit in a manner that calls into question the meaning of the words “free exercise.”

Sikhism, for example, mandates that its adherents carry a ceremonial dagger, known as a Kirpan, on their person at all times.  This injunction has led to various scuffles and lawsuits over the years challenging whether, and to what extent, the government may regulate the carrying of a weapon in public places such as schools and airports.  Religious accessory or not, a knife is still a knife.

Prior to 1890, the Church of Jesus Christ of Latter-day Saints allowed for men to take more than one wife—an arrangement the United States plainly did not, and does not, recognize as legal.  The Mormon Church ultimately abandoned its tradition of plural marriage—not least to secure statehood for Utah—but were it to change its mind, the United States would be under no obligation to honor such a “free exercise” of one’s faith any more than it would for a religious sect that preached in favor of slavery, rape or the subjugation of women.

In short:  The stipulations of a particular religion cannot take precedence over the law of the land, should the two come into conflict.

In the instance of Kansas (or anywhere else), the right to the free exercise of religion does not extend to denying others the rights otherwise guaranteed them by their government.

Whatever their motivation, some acts are just plain wrong.

Romney, Revisited

I suppose the most surprising thing about Greg Whiteley’s Mitt is how unsurprising it all is.

The film, which premiered at the Sundance Film Festival last month and has since become available on Netflix, is a documentary of Mitt Romney’s two runs for the White House, in 2008 and 2012, covering six years in the life of the former Massachusetts governor and his family, to whom Whiteley was granted exclusive access.

A mere 90 minutes in length, Mitt inevitably is able only to scratch the surface of Romney’s journey from long-shot also-ran in 2008 to frontrunner and Republican nominee four years later.  I would be extremely interested to view, say, a three-hour cut culled from the same material.  Except I suspect that such a version, if it existed, would be no more informative or enlightening than the one we currently have.

I don’t mean this as a critique of the film itself, but rather as a reflection that Romney, the presidential candidate, is exactly who we thought he was all along.  It may be that unless and until he runs yet again—it’s not impossible, if you believe the rumors—we have learned all we ever will about what makes Mitt Romney tick.

Mitt, the documentary, is an agreeable mixture of fly-on-the-wall observation and one-on-one interaction with its subjects.  There are poignant moments with Ann Romney, who expresses her exasperation with the whole process of running for president, and a flash of amusing candidness from Josh Romney, one of the couple’s five sons, who asks the camera whether he should say what he really thinks or what he has been “trained” to say about his perfect, wonderful old dad.  (He ends up doing both.)

On the other hand, Mitt contains precious few interactions with the man himself, and makes only nominal attempts to truly get inside Mitt Romney’s head.  Was it a condition of Whiteley’s access that he only probe so deep?

That’s not to say the movie contains no moments of illumination or clarity.  My sense is that Romney voters will see Mitt as a validation of their support, while those on Team Obama will feel their efforts to defeat Romney were well-founded and worth the trouble.

In other words, most people’s perceptions about the GOP standard-bearer will be confirmed by this film, and there is definite value in learning that your superficial hunch was right on the money.

In perhaps the movie’s most compelling scene, we find Romney in a hotel room with family, railing self-righteously against the longstanding “flip-flopper” label as unfair and untrue—but also, interestingly, as perhaps politically insurmountable.  “Maybe I’ve gotta live with it,” he concedes.  “In which case, I think I’m a flawed candidate.”

It is a moment in which Romney, even in the midst of a temper tantrum, is able to soberly and realistically assess his prospects, treating his own personality as if it were any other product that he, a CEO by trade, might be selling to a wide audience.  Few presidential candidates are as clear-eyed and self-critical as that.

We see this quality elsewhere in the movie, as well.  Following his triumphant first debate performance against President Obama, Romney wastes no time in concluding (correctly, it turned out) that the second and third matchups will not be so easy, citing historical examples to make his point.

Further, it was indeed Romney, all the way back in January 2008, who most deeply understood what a formidable force Barack Obama had become in the Democratic primaries, following his victory in the Iowa caucuses.  (Romney’s fellow Republican candidates were still focused entirely on Hillary Clinton.)

One conclusion we might draw is that, in the world of politics, Mitt Romney is perhaps better suited to being a strategist than a candidate—a man who understands human nature but struggles to forge connections with actual humans outside of his familial inner circle.

We could even go so far as to say that he would make a better president than candidate.  There is something to be said for a commander-in-chief with the ability to resist drinking his own Kool-Aid—a person who can diagnose a problem when he sees it, even if he doesn’t necessarily possess a ready-made solution to it.

While the theory of “running America like a business” is hardly settled political science, there are clear advantages to having a chief executive who thinks like a businessman.

The final takeaway of Mitt, then, is that despite Romney’s evident strengths, he was unable to convince the American public that the product he had on offer was something they were interested in buying in the first place.

Who Knows Best?

Should an art museum care what you think?

The Museum of Fine Arts in Boston is testing that question with a special exhibition that opened on St. Valentine’s Day, called, “Boston Loves Impressionism.”

Modest in size but large in scope, the display seeks to showcase the museum’s impressive holdings from the famed 19th century art movement, and to underline Boston’s status as an “early adopter” of a genre that the rest of the art world had yet to take seriously.

What makes “Boston Loves Impressionism” unusual is that its contents were selected not by the museum, but by the public.  In effect, it is the MFA’s first-ever exhibition to have been crowdsourced.

In the weeks preceding the show’s opening, the MFA’s website allowed users to vote for their favorite Impressionist works in the museum’s collection.  The presentation, which runs through May 26, comprises the 30 works (actually, 29 paintings and one sculpture) that received the most votes.

That’s not normally how it works.  While the curators of any great art museum must always bear the tastes and interests of their visitors in mind, the decisions in building a particular show are ultimately up to the museum itself.

And why not?  Aren’t the people who run such an institution precisely the sort of folks who know what constitutes great art?  Is it not their duty to educate and edify the masses, rather than pandering by giving them what they already know they like?  Shouldn’t some things be left strictly to the experts?

The world of electoral politics would seem to function entirely differently from the world of fine arts.  It is, after all, a system in which voting is the rule rather than the exception.  In theory, elections are the purest manifestation of the principle of self-rule—the idea that we, the people, are the experts who should, and do, have the final say.

In practice, not so much.

While the first primaries and caucuses of the 2016 presidential campaign are still nearly two years away, behind the scenes the race has already begun.  We like to think that a presidential primary is an opportunity for us to choose from a wide cross-section of possible candidates for high office, but the truth is that by the time the polls actually open, many of the most consequential decisions have already been made for us, and by people who fancy themselves much more important than us, indeed.

On the Democratic side, for instance, two facts have become abundantly clear.  First, that Hillary Clinton is probably going to run.  And second, that should Clinton proceed, several other possible candidates will be pressured by certain Democratic Party operatives and fundraisers to take this cycle off and wait until 2020 or beyond to take the plunge themselves.  They will be informed, in effect, that it’s Clinton’s “turn” to be president and that her candidacy is “inevitable,” and we might as well clear the deck of all the alternatives in order to make her path to the Oval Office as easy as possible.  Never mind if it turns out that actual voters think otherwise.

To be sure, not all of this pressure will be external.  As we have already learned, figures like New York Governor Andrew Cuomo and Vice President Joe Biden would love to run for president in 2016 but are waiting for Clinton to decide before laying out their own plans.  The implication is that a “yes” from Clinton would mean a “no” from Cuomo and Biden, in the interest of so-called party unity.  To challenge the Democrats’ presumed nominee would be seen as downright rude.  Again, never mind the part of the process where actual citizens cast actual votes.

And should some traitor break through the lines, accrue popular support and manage to secure more pledged delegates than Hillary Clinton, the party has a final veto mechanism in the form of “superdelegates.”  That is, a group of party members—some elected, some not—who can effectively overturn the will of primary voters by throwing their support behind the person who came in second place, if they believe the person who came in first could not possibly win the general election.  Say, a black first-term senator with a Muslim father and a funny name.

This didn’t happen in 2008, but it jolly well could have, and it could certainly happen in the future if the rules remain the same and primary voters go crazy and nominate someone of whom the party’s backroom dealers disapprove.

And it’s all predicated on the notion that the higher-ups of a political party in what’s supposed to be the World’s Greatest Democracy hold a lower opinion of the views (and rights) of the public than do the curators of an art museum in Boston.

The Man Who Would Not Be King

This weekend we celebrate the birthday of America’s most famous quitter.

George Washington is known by all as the first president of the United States and, before that, the commanding general of the Continental Army during the Revolutionary War.

Perhaps the most germane fact about Washington—slightly less-known—is how very grudgingly he assumed both of those roles.  He was America’s original Reluctant Hero.

As a member of the Continental Congress in Philadelphia, he was nominated for the post of army commander based on his distinguished record during the French and Indian War, as well as for his unquestioned integrity and self-confidence.  He accepted the commission, but insisted all along that he was wholly unqualified for the job.

Following the revolution, Washington returned to civilian life the most beloved man in America, and the gaggle in Philadelphia would have happily anointed him King for Life while the convention hammered out a constitution.  But Washington preferred to retire to his home at Mount Vernon, which he briefly did.

The idea of being president, as with being general, was very much not of Washington’s own design, but rather was foisted upon him by those who figured there was no finer available specimen—a notion rather thunderously reinforced by his receiving 100 percent of the electoral votes for both of his presidential terms.

After four years, he was utterly prepared to call it a career and go home for good.  However, the emerging political schism between the era’s two major parties, the Republicans and the Federalists, proved enough of a threat to the life of the nascent republic that the mere presence of Washington—a figure who was morally above the fray—looked to be the only thing that could hold the country together.  True to form, Washington took it as his patriotic duty to serve a second term, before finally securing his long-elusive retirement in March of 1797.  (He died on December 14, 1799.)

For his unwavering humility and patriotism in all these endeavors, Washington is often dubbed the Cincinnatus of his day:  A man who is called to power and greatness, but who then relinquishes that authority at the first possible opportunity in exchange for more modest pursuits.

“If he does that,” said King George III of his American counterpart, “he will be the greatest man in the world.”

There is little mystery as to why anyone (let alone the head of the British Empire) would hold such an act in such high regard.  Considering the way most of us scratch and claw for even the most minimal wealth and prestige, for someone to turn away from it on purpose seems downright perverse.

If the world offers you a silver spoon, then doggone it, you wrap your lips around it and never let go!  To have the restraint to do otherwise…well, it makes the rest of us look bad.

On this Presidents’ Day, we might turn to our elected leaders of today and wonder:  Are George Washington’s most noble qualities really so out of reach?  Is the idea of selflessly serving one’s country and then gracefully returning to private life really so dead?

In this early stage of the current election cycle, we have already seen an uncommonly high number of sitting congresspersons announce that they will retire at the end of this term.  However, in nearly every case, the given reason is not one of high-minded ideals, but rather of abject disgust and exhaustion regarding the whole governing enterprise.

One after another, senators and representatives have told us that Washington D.C.’s famous dysfunction has spun so wildly out of control that they just can’t take it anymore.  Whether it’s the fault of executive intransigence on the left or Tea Party extremism on the right, the difficult but essential art of governing has become a fool’s errand and is no longer worth their time.

In short, they have given up.

As such, the problem is not merely that most of today’s American leaders have fallen so spectacularly short of Washington’s golden standard.  Rather, it is that they no longer regard it as an ideal in the first place.  They run for office on the promise of doing the people’s hard, grueling work, but when the work proves just too hard and too grueling, they run away.

By retiring in this fashion, our representatives are making the right decisions for the wrong reasons.  And so America has, I fear, left its founding-est father with some very naughty children, indeed.

Locker Room Readiness

Remember the time when the National Football League said it was not ready for an openly gay player in its ranks?

Gosh, it seems like it was only yesterday.

Oh, that’s right.  It was only yesterday.

Well, okay.  In fairness, it was actually on Sunday.  And it wasn’t the league itself, but rather a handful of NFL executives and coaches who didn’t have the nerve to publicly reveal who they are.

As reported by Pete Thamel and Thayer Evans of Sports Illustrated, these anonymous self-appointed football spokespersons harbor serious doubts that Michael Sam, the University of Missouri defensive end who recently announced he is gay, would be able to make it in the NFL.

The primary concern of these skeptics—if “concern” is the right word—involves the so-called realities of the league’s “locker room culture,” which they imply, with varying degrees of bluntness, is incompatible with open homosexuality.

“There’s nothing more sensitive than the heartbeat of the locker room,” said an assistant coach.  “If you knowingly bring someone in there with that sexual orientation, how are the other guys going to deal with it?”

On this and similar claims, I might simply refer you to Frank Bruni of the New York Times, whose blisteringly funny Tuesday column, titled, “Panic in the Locker Room!” tore the whole notion to shreds (“How is it that gladiators who don’t flinch when a 300-pound mountain of flesh in shoulder pads comes roaring toward them start to quiver at the thought of a homosexual under a nearby nozzle?”).

Bruni’s point—made by countless others, gay and straight alike—is that this supposed fear of sharing a locker room with gays is just plain silly, since every pro player in history already has.  It’s just that most of them don’t realize it, because the homosexuals in question were either closeted or—gasp!—minding their own business.

I can certainly confirm this assertion as far as it goes.  In the locker room of my middle school gym classes, I was so self-conscious—in such a rush to undress, dress, then get the hell out of there—that I never quite found the time to ogle those around me.  What is more, even if I had found the time, I would not have dared use it in such a way:  Deeply in the closet as I then was, so thoroughly blowing my cover in such a potentially explosive environment could not possibly have been worth the risk.

But what I also notice in the language of Sports Illustrated’s anonymous interview subjects is a disconnect that closely resembles the talk in 2008 about whether America was “ready” for a black president.

As you will recall, in every opinion poll on the subject, you’d have 70-something percent of respondents affirming that, yes, America was prepared to vote for a black person for commander-in-chief.  However, when the wording of the question was altered to, “Would you personally consider” voting for the same, the number jumped into the 90s.

In other words, there was a small but measurable group of people whose opinion apparently was, “I’m personally evolved enough not to discriminate between black and white—it’s everyone else that’s the problem.”

In effect, we were a (nearly) racism-free electorate even as we told ourselves we weren’t.  Oh, we of little faith!

That’s the phenomenon that appears to be happening now in the NFL with regard to gays.  In one corner are these higher-ups insisting the league has not reached a point of comfort.  (As another of SI’s anonymous spokespeople put it, “In the coming decade or two, it’s going to be acceptable, but at this point in time it’s still a man’s-man game.”)

And in the other corner—particularly on Twitter—is a seemingly bottomless well of support from players and others who evidently feel that it has.

As such, with the suddenly imminent prospect of an actual, real-life, open homosexual in the league, we have good reason to expect much the same result.  Namely, that following a brief period of unease and awkwardness on the part of players, coaches and the media, normalcy will set in and, soon enough, everyone will be at a loss to explain what all the fuss was about.

For better or worse, this is always how these sorts of things play out:  As a manifestation of that cliché about the impossible becoming inevitable.  We assume X will never occur, then it does occur, we get used to it, and life marches on.

It’s the American way to be skeptical of things we think we don’t understand, but then to realize that we sort of understood them all along.  The key is to have the Michael Sams of the world with the guts to get the proverbial ball rolling.

Satanic Impulses

This Friday, St. Valentine’s, marks 25 years since Salman Rushdie was sentenced to death.

We can count it as a victory for the cause of freedom around the world that Rushdie is still alive today.

For those unfamiliar with the story:  In 1988, the India-born British author published a novel called The Satanic Verses, which at one point quotes a controversial would-be passage from the Quran whose very utterance is considered blasphemous by some Muslims.

By including these so-called “Satanic Verses” in his novel, Rushdie himself was accused of blaspheming against Islam, and protests against the book erupted around the world.

When word of the kerfuffle reached Iran, that country’s leader, Ayatollah Ruhollah Khomeini, called upon all the world’s Muslims to hunt down and kill Rushdie, along with his editors and publishers, promising a monetary reward for those who succeeded.  This so-called “fatwa” was issued on February 14, 1989.

It was no idle threat.  Although Rushdie survived the ordeal, he did so only after spending some 12 years in hiding, constantly changing his address, moving under armed guard 24/7, effectively devoting his life simply to not getting himself murdered.

While Rushdie himself was never harmed, the novel’s Japanese translator was fatally stabbed, its Norwegian publisher shot and wounded, and countless bookstores bombed or otherwise disrupted for daring to offer the book for sale.

This was the price for expressing an unwelcome thought about organized religion.

Today, a quarter-century after the fact, we might reflect on how depressingly little the world has changed.  How the act of transmitting unpopular or controversial views remains terrifyingly fraught.  And how, on this subject, America really is exceptional.

To wit:  The president of the United States has in recent years asserted his authority to order and execute the killing of an individual—even an American citizen—if such a person is found to have colluded with groups such as al Qaeda to physically harm America or American interests.

What the commander-in-chief cannot do, however, is act likewise toward an individual who writes a book or makes a speech suggesting (for example) that America is a wicked, imperial power, or who makes disparaging comments about Christianity or the NRA or anything else.

What is more, while the art of offense-taking is alive and well in American culture, it is very rarely expressed through violence within our borders.

For instance, when some prolific atheist publishes a book denying the existence of God, the usual gaggle of clergymen, “family values” spokespeople and the like descend upon cable news shows and other outlets to vent, and sometimes to call for boycotts of the offending work.  But that’s about as far as it goes.

This is no accident.  Rather, it is a direct and purposeful consequence of our country’s indispensable First Amendment.

Because the U.S. Constitution stipulates that everyone has the right to say whatever the hell they want, and can practice any religion that they want, no one has any cause to feel that their views are being institutionally suppressed or abridged, and violent uprisings thus become far less likely to occur.

Elsewhere?  Not so much.

Over in Russia, we find two members of the punk rock group Pussy Riot arrested and jailed for “hooliganism motivated by religious hatred,” following a 2012 performance of a protest song called, “Mother of God, Drive Putin Away,” in a Russian Orthodox church.  (The women were released this past December, likely as a propagandistic show of goodwill in preparation for this month’s Winter Olympics in Sochi.)

Just this week in Nigeria, the government sharpened its already harsh anti-gay policies by criminalizing not only gay marriage and gay sex, but also the “public show” of gay relationships, exhibited “directly or indirectly,” as well as the formation and mere support of various gay organizations.  In effect, saying a kind word about gay people is now illegal in Nigeria, as it is, in one way or another, in countless other African and Middle Eastern countries.

In short, the dissemination of dangerous ideas will always be a struggle, and the right to free expression will always need to be fought for, so long as there are those with the determination (and the weaponry) to fight against it, often with the support—be it latent or overt—of their government.  Salman Rushdie learned this lesson more painfully than most, and he will not be the last to suffer the consequences of saying things that some people would prefer not to hear.

And so it is up to each of us, as defenders of America’s most sacred principles, to see that the battle is not lost, and that the cause of freedom lives to fight another day.