Gay Talk

I confess that I do not know how it feels to be falsely accused of being gay.

Counting myself among the 5 percent of humanity (give or take) that finds itself drawn to members of the same sex, the only presumptions of homosexuality I have faced have been entirely correct.

On the other hand, I have considerable experience in attempting to convince the world that I am straight, and being rather terrified by the possibility that my efforts had failed.

Truth be told, a leading reason I finally came out was to ask my family and friends whether they had suspected all along that I was gay and were simply being polite until I was ready to make it public.  Whatever curiosity everyone else may have had about me was dwarfed by the curiosity I had about them.

Cory Booker apparently does not have this problem.

Booker, you may know, is the mayor of Newark, New Jersey and the Democratic U.S. Senate candidate in October’s special election to replace the late Frank Lautenberg.

He is also 44 years old and unmarried, and this week was the subject of whisperings that his lifelong bachelorhood is due not to an open preoccupation with politics, but rather to a secret preoccupation with men.

These gay rumors are not themselves particularly newsworthy; Mayor Booker has faced them before.  Indeed, the ruckus this week came about not through outside chatter, but through Booker himself, who in a Washington Post profile referred to past interest in his personal life in order to express his general thoughts on the matter.

What is newsworthy is precisely what Booker said, and, more compelling still, what he did not say.

In the Post feature, when the nature of his love life popped up, Booker remarked, “[P]eople who think I’m gay, some part of me thinks it’s wonderful.  Because I want to challenge people on their homophobia.  I love seeing on Twitter when someone says I’m gay, and I say, ‘So what does it matter if I am?  So be it.  I hope you are not voting for me because you are making the presumption that I’m straight.’”

Such a reaction is interesting for at least two reasons:  First, for its jollity; second, for its deliberate ambiguity.

On the first point, we might concede that it has become de rigueur for any ambitious Democrat to go whole hog on all matters relating to homosexuality and gay rights—to make it plain that, in the year 2013, to be suspected of being gay should be taken, if not as a compliment, then at least not as the insult and provocation it used to be.

Then there is the matter of Mayor Booker’s own sexual preference, which he didn’t trouble himself to define one way or another.  It should be noted that he has, in the past, identified as heterosexual and referred to having had relationships with women.  This time, however, he did not.

Why not?

He could have made himself clear.  He could have said, “I don’t mind if people think I’m gay, even though I am actually straight,” rather than asserting the former without acknowledging the latter.  There is no rule that an openly straight person cannot be an effective advocate for gay rights, which Booker evidently intends to be.

The fact is that Booker is attempting to have it both ways.  He says that his sexual orientation is irrelevant to whether he would make a decent U.S. senator, yet he apparently does not trust the good folks of New Jersey enough to make it clear what his orientation is.

As an analogy:  A candidate for office might insist that his religion should not determine whether one votes for him or not, but he would nonetheless feel no hesitation in saying to which church he belongs.  It would be outright bizarre if he didn’t, and it would lead people to wonder whether he is withholding valuable information about his true self.

One’s sexual preference ought to be no different:  It is part of the makeup of an individual’s identity, and something of which, according to Booker himself, one has no cause to be ashamed or evasive.

Accordingly, Booker should quit dancing around the issue and just be straight with us (so to speak).  If he is confident that voters will not view his sexuality as a drawback, then he should have the nerve to give them an honest choice from which to prove him correct.

After all, while it might be wrong to withhold one’s vote on the suspicion that a candidate is gay, it is entirely reasonable to withhold one’s vote on the suspicion that a candidate is a liar.

The Boys Who Cried ‘Impeachment’

It is now official.

Certain members of the Republican Party in Washington, D.C., are prepared to draft articles of impeachment against President Barack Obama.

For what high crimes and misdemeanors might the commander-in-chief be prosecuted?

Not to worry.  We’ll figure that out later.

I jest not.  At a town hall in Oklahoma last week, Senator Tom Coburn responded to an attendee’s demand for the president’s head by assenting that, on the question of impeachment, “I think you’re getting perilously close.”

To be precise, the questioner accused the Obama administration of being “lawless,” to which Coburn replied, “I think there’s some intended violation of the law in this administration but I also think there’s a ton of incompetence.”

In any event, Senator Coburn conceded that, for the moment, he could not identify a presidential transgression that would justify impeachment proceedings, adding that he is not sufficiently versed in such matters to make an informed judgment one way or the other.

He is not alone.  The truth is that no one quite knows what an impeachable offense really is, because it’s one of those nebulous legal concepts that can be made to mean whatever one wants it to mean.

In theory, Congress’s impeachment powers are an essential component of any democratic republic—a check by the legislature on the actions of the executive, to ensure the commander-in-chief and his deputies do not abuse their authority as servants of the people.

In practice, not so much.

During the impeachment of President Bill Clinton, Democrats accused Republicans of using impeachment as a merely political instrument—a means of embarrassing and one-upping someone the GOP simply didn’t like.  The opposition was itching for an excuse to bring Clinton down, the theory went, and his lying about oral sex became their smoking gun.

The implication, then and now, is that the whole Monica Lewinsky episode was an unnecessary and trivial use of Congress’s power to impeach, which was intended only for the most egregious of presidential crimes.  That even if Clinton’s actions did constitute “obstruction of justice,” it was not the sort of obstruction that posed an existential threat to the American republic.

The fact of the matter is that it doesn’t make a difference.  For all that is unclear about the principle of impeachment in the United States, a rudimentary reading of history makes it perfectly plain that it can be employed at any time for any reason.

In Federalist No. 65, Alexander Hamilton characterized impeachment as retribution for “those offences which proceed from the misconduct of public men, or, in other words, from the abuse or violation of some public trust.”

Benjamin Franklin, in typically pithy fashion, offered that impeachment, and even removal from office, was justified whenever the president “rendered himself obnoxious.”

In other words, the “original intent” of America’s founders was not for impeachment to be particularly rare or exceptional.  Rather, it was meant to be used in any instance in which the Big Cheese got out of line, with the yardstick for what constitutes “out of line” planted wherever Congress sees fit.  Historically speaking, one could argue that Congress has broached the subject too little, rather than too much.

The problem today is twofold.

First, we live with a national legislature that is shaped by political loyalty even more than during the Clinton years, meaning that any possible future vote to impeach President Obama would be a purely partisan exercise.  Regardless of the merits, Democrats will rally to the president’s defense, while Republicans will vote “Aye” as reflexively as they have voted to repeal Obamacare, which they have done 40 times thus far.

On the other hand, we also live with an executive branch that, in the last several decades, has exerted ever-greater authority over the other branches and over many aspects of American life.  Activities and programs that used to be unthinkable are now routine (wiretapping, drones, Gitmo, etc.), with executive fiat becoming the rule rather than the exception.

In short:  The America we presently inhabit would seem a prime target for the periodic article of impeachment, yet is represented by people incapable of drafting one properly.

Is this an irony of politics, or is it precisely what one might expect?  Perhaps it is both.

What I fear in any case—as Commissioner Gordon might put it—is that we have been saddled not with the government we need, but with the government we deserve.  A government that cannot seem to accomplish anything except for what it has no business accomplishing in the first place.

And if this is indeed the case, then the most heinous high crimes of all have been committed not by our elected officials, but by us.

The Forgotten Dreamer

This week, as the United States observes the 50th anniversary of the March on Washington for Jobs and Freedom, President Barack Obama is leading the nation in honoring one of the march’s most important figures, if not the most important of all.

This honoree is a man who fought all his life to ensure that the promise of equality for all Americans would not be a mere dream.  Who knew from personal experience the horrors of prejudice and injustice, yet refused to be intimidated into keeping his unpopular and sometimes dangerous views to himself.

He was an indispensable leader throughout the Civil Rights Movement of the 1950s and 1960s.  Without him, the march we commemorate on Wednesday would hardly have been possible.

I speak, of course, of Bayard Rustin.

Fifty years out, one of the more unfortunate legacies of the March on Washington is the notion that it was all about Dr. Martin Luther King, Jr.  That on a muggy August afternoon in 1963, several hundred thousand supporters of racial equality spontaneously assembled at the Lincoln Memorial in Washington, D.C., to hear Dr. King declare, “I have a dream.”

In this “official” narrative, people such as Bayard Rustin have for decades been almost entirely left out.  I certainly don’t remember his name popping up in my high school history textbook.  There is no national holiday celebrating his birthday, nor are there streets bearing his name that cut across Harlem or Chicago’s South Side.  He is, if not an invisible man, an unjustly overlooked man.

Perhaps that is finally changing.  Rustin, who died in 1987, is among this year’s recipients of the Presidential Medal of Freedom.  As well, the National Museum of American History in Washington, D.C., currently boasts an exhibit about the March on Washington that pays scrupulous attention to the many men and women, beyond Dr. King, who were instrumental in bringing the idea of a mass demonstration to fruition.

Rustin’s role was as follows:  Having previously organized one of the earliest “Freedom Rides” to protest bus segregation laws throughout the South, he was put in charge of drafting the program, recruiting activists and other marchers, coordinating the buses and trains to transport them all to Washington, and hiring marshals and traffic directors to ensure everything ran smoothly.  All of these things he did more or less single-handedly.

In short:  While the March on Washington owes its sterling reputation to Martin Luther King, it owes its very existence to Bayard Rustin.

My question:  Why do you need me to tell you this?  Why has such an essential character spent most of the last half-century being expunged from the history books?

The likeliest explanation for this is threefold:  Rustin was a socialist.  He was a draft dodger.  And he was gay.

None of these would-be revelations was a secret at the time.  He had been arrested and jailed in 1953 for engaging in “sex perversion,” i.e. consensual sex with another man.  A lifelong pacifist, he had refused to serve in World War II.  As for his political affiliations, he was a member of the Socialist Party of America for much of his adult life, becoming its chairman in 1972.

For these reasons, many within the Civil Rights Movement fought to prevent Rustin from playing such a leading role, including for the March on Washington.  No less than Roy Wilkins, executive secretary of the NAACP, admonished Rustin to keep strictly behind the scenes, lest Wilkins and others be forced to answer for Rustin’s background during what was, after all, a rather delicate operation on the “winning hearts and minds” front.

And so we have one of the great ironies of the 1960s:  A central figure in the fight for the rights of minority groups has been very nearly absent in the popular mind because he was a member of one too many minority groups.

It is useful to remember, in this celebratory week, that history is never as simple or as morally clear as we would prefer it to be.  Like all the civil rights battles therein, it is a messy, complicated business whose participants are neither saints nor devils.

My hope, in light of Bayard Rustin finally getting his due, is that we make a greater effort to render our country’s most colorful episodes in a realistic, rather than idealistic, light.  That we treat our heroes and villains as if they existed in all three dimensions, not as proverbial cardboard cutouts.  That we forgo our usual national tendency never to let the facts get in the way of a good story.

After all, oftentimes the truth can make for a mighty good yarn as well.

The Next ‘Last’ Battle

It was, you might say, a slight changing of the subject.

Army Private Bradley Manning, upon being sentenced to 35 years in prison for leaking classified government documents, announced that from this point forward, he wishes to be regarded as a woman and intends to undergo the necessary processes and procedures to make this so.

In truth, this revelation’s timing was more noteworthy than its substance.  It had long been known that Private Manning had a complicated sexual identity.  In 2010, when the WikiLeaks story first broke, his effective status as a gay man in our “Don’t Ask, Don’t Tell” military added an additional layer of intrigue to the whole saga.

By disclosing his intention to become a woman at the moment he is to begin a lengthy prison term, Manning complicated the matter a bit, as American military prisons are apparently not prepared to grant such a request.  (He did not announce it during the trial for fear of its becoming a needless distraction, as it probably would have.)

In any case, the extremely high-profile nature of his situation will make his attempt at gender reassignment uncommonly visible as well.  Purposefully or not, Manning seems to have made himself a poster child for the rights of those who identify as the opposite sex—a fight that will not go away even if Manning does.

I raise this point, in part, because of how the gay marriage movement is regularly referred to by sympathetic voices—so much so that it has become a cliché—as “the last great civil rights struggle in America.”

What a silly thing to say.  As if the moment when all 50 states legalize same-sex marriage will signify the end of all civil rights disparities in the United States.  That nothing else stands in the way of America achieving its objective of treating all of its citizens as equal before the law.

No.  There will always be some group or other that could rightly claim that its members’ rights as free citizens are not being respected or enforced.

And why is that?  Because there will always be prejudices against people we think we don’t understand.

On the subject of addressing civil injustice, we should recognize that it is a battle with two fronts—one is legal, while the other is cultural.

In some cases, advances in the former help to bring about advances in the latter—for instance, the prohibition of slavery proved paramount in enabling white people (albeit not all of them) to recognize black people as their equals—while in other cases, the reverse is true—for instance, the present-day evolution of gay marriage laws is a consequence of the evolution of gay marriage views in the minds of the people.

While the precise intermingling of cultural and legal forces on a particular issue is not always clear, a relationship between the two nearly always exists to some extent.

In the case of the transgender community, it is probably fair to say that the American culture has not yet decided what to think about it.

While there have been a handful of transsexual folks on the pop culture scene in recent years—Cher’s daughter-turned-son Chaz Bono is one; Matrix co-director Lana (formerly Larry) Wachowski is another—the concept is still a fairly mysterious one to most people, and not always for the same reason.

Some are baffled by the very notion that one could possibly identify as the opposite sex, seeing it as unnatural and/or sinful.  Others are not necessarily hostile to the idea but nonetheless cannot quite grasp it.  As was long the case with gays, transsexuals have been culturally invisible; most non-transsexuals have never actually met one (or think they haven’t), and thus have no practical frame of reference.

The antidote to prejudice against the transgendered, then, would be to have more positive role models, which means that those who have kept their sexual identities a secret might want to consider “coming out,” in order to get the ball rolling.

Which returns us to Bradley (or rather, Chelsea) Manning, who is perhaps not quite the role model we had in mind.

Where Manning might prove useful is on the legal side of the equation.

Should he and his lawyers proceed to raise hell about a prisoner’s right to be treated according to the sex with which he or she identifies, the broader concerns about the civil rights of transgender people might be given the thorough public hearing they have thus far been denied.

It would be a messy way for the transgender rights movement to get reinvigorated, but then civil rights struggles are rarely clean or easy.

Whatever It Takes

At a Republican Party gathering last week in Boston, New Jersey governor and presumed future presidential candidate Chris Christie asserted to his fellow attendees, “I am going to do anything I need to do to win.”

The remark was part of a general critique by Christie about certain characters in his party—particularly Kentucky Senator Rand Paul, with whom he recently feuded—who seem content to express their political philosophies from the sidelines, but are not terribly interested in the business of actually governing or, indeed, winning elections at all.

“I think we have some folks who believe that our job is to be college professors […] they basically spout out ideas that nobody does anything about,” said Christie.  “For our ideas to matter we have to win.  And if we don’t govern all we do is shout to the wind.”

The tension between those who insist on defending their principles at every turn and those who are prepared to compromise is a debate that never gets old and will never go away.  It is the essence of politics, and we cannot possibly discuss it enough.

Today, however, let us resist the urge to do so (at least directly) and instead confine our attention to the quotation with which we began, “I am going to do anything I need to do to win.”

Not many political figures say such a thing so bluntly, but then Christie is not generally known for either subtlety or restraint.  Indeed, one could reasonably conclude that his rather remarkable national following has been built on precisely the opposite:  He is going to tell you exactly what he thinks, and if you don’t like it, you can shove it.

Because Christie so prides himself on his penchant for straight talk, we are surely entitled to regard the things he says as if he really means them.

In that spirit, let us unpack the precise implications of his “anything to win” pledge.

I note that most public officials do not disclose a victory-at-all-costs attitude in public, including those who probably hold one in private.  A leading reason for this is that the willingness to achieve a specified end by any means necessary is usually regarded as unattractive, particularly in politicians, who are presumed to be ethically suspect from the first.

What one hopes, in electing a particular person to a particular position, is that he or she possesses some modicum of a moral center.  That for all the principles one is prepared to bend for the sake of producing a greater good, there are certain codes that one simply will not violate.  That some acts are so morally repulsive that the candidate would sooner lose the election than bring himself to commit them.

It recalls the famous question from the Gospel of Mark, “For what shall it profit a man, if he shall gain the whole world, and lose his own soul?”

To vow to do anything and everything required to win, then, suggests the absence of a soul, or a soul that is rotten to the core.  The real-world consequences of this are not something to be casually shrugged off.

Ask yourself:  Is it likely that someone who abandons any and all scruples in the course of a campaign is suddenly going to regain them once in office?  Is it not logical to infer that someone who will do “anything it takes” to win will also do anything it takes to govern?

Further, if a man is shown to have no principles he is not willing to abandon for the sake of political expediency, then what point is there in voting for him in the first place?  Can a man whose political views are only as good as the present situation allows truly be said to have political views at all?

You may vote for someone because you agree with his position on a particular matter, but what reason have you to trust that his position will not change at a moment’s notice, and possibly never change back?

To be sure, these are necessary questions to ask about anyone running for high (or low) office, and it is the reason why a person’s character is an essential thing to bear in mind when weighing one candidate against another.

The difference with Christie—at the risk of repeating myself—is that he has taken the unusual step of announcing his untrustworthiness in advance, for all the world to hear.

One could go ahead and vote for the guy anyway, justifying it in any number of ways.  However, should Christie prove himself a man of his word by proving that he is, in fact, not a man of his word, one is no longer entitled to be surprised.

A First Name Basis

One of the more memorable excursions on my recent trip to Israel was the afternoon we spent in Tzfat, a charming little northern town known as the birthplace of Jewish mysticism, or Kabbalah.

While there, we encountered a fellow named Avraham, an artist who gave us a crash course in the ways of Jewish mystics, with particular emphasis on the significance of one’s first name to one’s identity.  Every name has an origin and a narrative behind it, Avraham explained; thus, inherent in your own name lies the entire story of you.

While Avraham’s disposition and speaking patterns suggested someone who had ingested perhaps a few too many domed plants in the course of his life, his point was taken:  To an extent that we don’t often appreciate, our first names are an essential and unavoidable component of our personalities and our very selves.

We build our reputations by how we think and what we do, but it all begins with the name by which we are called.  All else is secondary.

What makes this interesting is that, with the odd exception, all of us earthlings go about our daily lives with a name that was chosen by somebody else.  If nomenclature is the most fundamental distinction of one person from another, it is also the rare aspect of our outward identity that we do not shape ourselves.

Usually our name is determined by our parents, although as we learned from the scuttlebutt in Tennessee last week, it can also be ordained by a judge.  As well, various states have laws barring certain names from the lexicon, whether for reasons of length, taste or the simple keeping of the peace.

In any case, your first name is something you have to live with, so to speak, and most people do.  Upon coming of age, one is entitled to legally change one’s name for almost any reason (or none at all), but we generally decline the offer.

Certain athletes, singers and other manner of celebrities do occasionally take the plunge, precisely in order to shape a particular image of themselves for the wider world to absorb, but most of us regular folks do not follow their lead.  After 18 years of getting accustomed to a particular moniker, we figure we might as well see it through to the end.

Suppose we didn’t.  Suppose that, rather than merely being offered the opportunity to reinvent ourselves when we become adults, we were actually compelled to do so.  That your birth name expired at the stroke of midnight on your eighteenth birthday, at which point you would have to either renew it or choose a new one.

What would you call yourself, if you had to start all over again?  What are the sorts of names that are more “you”?  Would you opt for originality or emulate a beloved family member or personal hero of some sort?  Would you veer wildly from the name you have thus far worn, or would you alter it just the slightest bit?

How might such considerations play out across the wider population?  Which names would suddenly become ubiquitous, and which would become scarce?  Would the “most popular names” lists mirror those of names bestowed by parents upon their newborn children, or would they not?  Would the voter rolls come more to resemble a Who’s Who of players from the Bible, or would we quickly become a nation of Kims and Kanyes?

The question this raises returns us to our friend Avraham:  What does your name mean to you?  Would it mean more to you had you selected it yourself?  Based on the person you have become, might your parents have chosen differently, or was their at-birth decision right on the money?

In your life, have you attempted to live up (or down) to the connotations typically associated with your first name, or have you rather tried to appropriate it to your own ends?  Have you taken ownership over your identity, or has your identity taken ownership over you?

A Tale of Two Cities

Watching last week’s debate among the nominees for mayor of New York, I finally understood why so many reporters preferred to spend the last few months talking about Anthony Weiner’s penis.

They just wanted to ensure the race did not descend into pettiness and farce.

On the debate stage, there was Christine Quinn, speaker of the City Council, responding to her opponents’ every attack not by addressing the attack, but by attacking the opponent.

(For instance, when Weiner took Quinn to task for changing the city’s term limit law to enable Mayor Bloomberg to run again, Quinn offered nothing in reply except to say that Weiner was in no position to demand other people’s apologies.)

There were Weiner and former city comptroller Bill Thompson, arguing over which of them was their party’s nominee for mayor in 2009—as if the question were a matter of opinion.  (Thompson was the 2009 nominee.  Weiner sought the party’s nomination in 2005 but lost, and ultimately opted not to run in 2009.)

Certainly there was some substance to be found in the hour-long forum, particularly from Bill de Blasio, the city’s public advocate, who evoked Mario Cuomo’s old narrative about a “tale of two cities,” and from John Liu, the city comptroller, who is faring so dismally in the polls that he can afford a smidgen of dignity and humor now and then.

But on the whole, the contest to succeed Mike Bloomberg has thus far proved an uninspiring and trivial affair—an alarming prospect for America’s largest, greatest and most important urban center.

Meanwhile, in my hometown of Boston, our own mayoral race has been of such comparatively high class and substance that many residents have yet to even notice it.

To be fair, the fact that the campaign to succeed the retiring Thomas Menino currently boasts 12 candidates, none of them particularly well-known, might explain the public’s faint attention and interest.

By the same token, the fairly low-profile standing of the race thus far can be equally attributed to the lack of any notable scandal or silliness on the part of any candidate or the local media.

However, these shallow levels of visibility could soon change.  This week saw the dozen hopefuls debate the issues for the first time, an exercise that will presumably help many voters differentiate among their many options and gravitate in one direction or another.

These debates—there were three, with four participants apiece—were a pleasure to behold.  With only the occasional lapse into inanity—an extended discussion about cage fighting, for instance—they were serious, sober affairs that, at least in my own case, served to enlighten and entertain rather than to depress and annoy.

Included in the three fora, among other things, were extended dialogues about the vices and virtues of charter schools, the practicalities of running public transportation 24 hours a day, how to root out racism in the Boston Police Department, and whether the city has a workable contingency plan in the event that a mishap at Boston University’s new biolab necessitates a large-scale evacuation of the city.

While some candidates were plainly better-prepared than others to tackle such questions, all took them seriously and none strayed too far from the subject at hand, all the while keeping interruptions and petty squabbles to an absolute minimum.

So there you have it.  In Boston, a mayoral race of real weight and maturity.  In New York, a circus.

Of this, I humbly ask:  Why?  Why can’t the most consequential city in America manage a contest for chief executive as grown-up as the one in a municipality one-thirteenth its size?

Is it simply that New York’s nature as an outsized metropolis naturally attracts outsized personalities to lord over it?  That, like the American presidency, the very fact of the office’s bigness and impossible complexity tends to repel the sort of sane, well-rounded individuals who would do the job the highest service?

Or is it rather the case that the challenge of running New York actually turns normal people into caricatures?

Would the New York candidates behave better if they were running in Boston instead?  For that matter, would the Boston candidates behave worse if they campaigned in New York?

Is New York simply having a particularly bad year and Boston a particularly good one?  If the major league baseball standings are any indication, perhaps there is no likelier explanation than that.

Give Me That Thing I Love

I am old enough to remember when exposure to popular music meant watching “Total Request Live.”

If you wanted to see the newest, hottest music videos, you tuned in to MTV at three in the afternoon and followed along with Carson Daly as he counted down the day’s top ten, which were determined by the number of votes each video received over the previous 24 hours.

Following its premiere, a given video would be allowed to air for up to 65 days, regardless of its popularity, after which it would be “retired” as the world eagerly awaited its artist’s next move.

For the pop music stars of the day, this was the way of the world for a solid decade, from the late 1990s until the onset of YouTube, iTunes and the like.  If you wanted to matter, you were expected to churn out a new song and a new video according to the TRL schedule.  While one could resist adhering to the MTV model and still find success, those who played ball tended to ascend the Billboard charts with a greater velocity than those who did not.

That was then.

Fast-forward to this past Monday, when Lady Gaga released “Applause,” the first single from her forthcoming album, Artpop, due out in November.  The track is the music superstar’s first new single in nearly two years, so one can imagine the high levels of anticipation within her fan base, whose numbers remain considerable.

“Applause” was intended to drop next Monday, but suddenly appeared on the artist’s YouTube page a week earlier than planned.  Why?  Because unauthorized copies of the song had already leaked onto the interwebs, and Gaga figured she might as well release the genuine article before the cat strayed any farther from the bag.

I offer two observations on this, one from each side of the transaction that is the release of any new piece of popular music.

From the production side, it is plain enough that, compared to the rigid structure of TRL and TV-based entertainment in general in the late 1990s and early 2000s, music artists today can do what they damn well please with regard to the timetables of their creative works.

Any singer or group with a loyal fan base need not rush out new material for fear of becoming irrelevant or forgotten.  With the occasional reminder that a new album or tour is somewhere on the horizon, one is assured of keeping his or her career afloat.

This is notable because it seems to contradict the prevailing view that Americans have such profoundly short attention spans that no celebrity can afford an extended absence from the cultural bloodstream, lest they become incapable of rejoining it when they return.

And from the consumer end of the equation, we might notice the remarkable decentralization of the ways we can now access the music we desire.

Far from huddling around the TV or radio at the specified time of day to hear the latest hits, we need exert no greater effort than plugging the name of a song or artist into YouTube’s search bar—that is, if the track is not already floating across our customized home page, waiting to be clicked.

In effect, music is no longer “requested” so much as demanded, and we prefer not to wait.  If the artist herself will not deliver fast enough, we will gladly turn to an inferior bootleg to tide us over in the meanwhile.

In this way, the artist’s obligation to the suits at MTV has not disappeared after all:  It has merely shifted to the listeners themselves, who are ever more rabid for fresh material and have grown up in a culture that promises whatever they want, whenever they want it.

As a thought experiment for the day, then, let us imagine retrofitting today’s mentalities with yesterday’s formats.

Without a leaky Internet from which one can play just about any piece of music ever produced, could today’s young folks possibly cope with an America where one’s life’s soundtrack is hostage to the whims of TV executives and radio DJs?  Where one does not have on-demand access to the tunes of one’s choosing without—I shudder at the thought—paying for them?

Millennials are famously adept at adapting to ever-changing modes of technology, but could they adapt backwards?  If YouTube suddenly vanished without a trace, plunging us into the dark ages when listening to the radio was not a last resort, could we stand it?  And if not, what does that say about us?

Not Born in the U.S.A.

Ted Cruz is the junior senator from Texas.  Elected just last November, he has swiftly garnered national attention, along with fellow Washington neophytes Marco Rubio of Florida and Rand Paul of Kentucky, as a passionate adherent to, and banner-carrier for, the enduring Tea Party movement in American politics.

As well, thanks to the dreadful precedent established by President Obama that less than a half-term in the Senate qualifies one to seek higher office, Senator Cruz is now regularly included among serious and semi-serious contenders for the Republican presidential nomination in 2016.

Should he heed the call and take the plunge, one very particular fact about the senator’s background will summarily and sharply come to the fore.

Ted Cruz was born in Canada.

His father, Rafael, grew up in Cuba before fleeing to America shortly before the 1959 revolution.  His mother, Eleanor, was born and raised in Delaware.  In 1970, when Cruz was born, the family happened to be living in Calgary, Alberta, where the couple worked in the oil business.

Should Senator Cruz run for president—in 2016 or any other year—his place of birth will be no small piece of trivia.  On the contrary, it could single-handedly derail his campaign before it even begins.

As outlined in Article II, Section 1 of the Constitution, there are precisely three qualifications to be president of the United States.  One must be at least 35 years old; one must have lived in the United States for at least 14 years; and most interestingly, one must be a “natural born citizen.”  (The Twelfth Amendment would later clarify that the vice president is bound by this same triad.)

As we were reminded during the nonsensical non-controversy regarding President Obama’s birth certificate, determining precisely what constitutes “natural born citizenship” is where the real fun begins.

Rather surprisingly, this term has yet to be officially defined.  The Constitution is of no help at all and the Supreme Court has never been tasked to offer an opinion of its own.  Like certain other clauses in our founding documents that we have never fully hammered out, the meaning of a potential president being (or not) a “natural born citizen” has been hitherto backlogged as a “we’ll worry about it when it happens” problem.

Of course, this doesn’t mean there isn’t a trove of legal hypothesizing on the matter, both by independent thinkers and government officials.  In 2011, a report by the Congressional Research Service offered the following:

The weight of legal and historical authority indicates that the term “natural born” citizen would mean a person who is entitled to U.S. citizenship “by birth” or “at birth,” either by being born “in” the United States and under its jurisdiction, even those born to alien parents; by being born abroad to U.S. citizen-parents; or by being born in other situations meeting legal requirements for U.S. citizenship “at birth.”  Such [a] term, however, would not include a person who was not a U.S. citizen by birth or at birth, and who was thus born an “alien” required to go through the legal process of “naturalization” to become a U.S. citizen.

This fairly expansive definition, if formally adopted, would cover people such as Senator Cruz, who has argued that his mother’s standing as a U.S. citizen at the time of his birth made him a citizen as well.

It would also include someone like Senator John McCain, who was born in the Panama Canal Zone, where his father was stationed as a naval officer.  The Canal Zone was under U.S. control at the time and was therefore technically a part of the United States, although under the above definition, Senator McCain would be “natural born” even if it were not.

And by the way, this definition would also apply to the imaginary Barack Obama who was born in Kenya.  Like Cruz, Obama was born to a foreign-born father and an American mother, and would not necessarily need to have been born on U.S. soil to qualify for the White House.

Of course, all of this is merely the tip of the iceberg, as even the broadest interpretations of “natural born citizen” do not extend to immigrants, who make up an ever-growing share of the United States.

There have been several attempts in recent decades to open the opportunity to run for high office to those who were born abroad and became U.S. citizens later in life, including a 2003 proposal designed to allow Austrian-born Arnold Schwarzenegger to occupy the Oval Office.

While all such proposed legislation has failed thus far, one senses it is only a matter of time before the question arises once again.  After all, were we to definitively establish that a foreign-born U.S. citizen with one non-citizen parent is good enough to be president, is it really that much of a jump to say the same for those with none?

Contrary to Unpopular Belief

In the middle of last week, a meme swept across Twitter called “confess your unpopular opinion.”  Emboldened by the hashtag-ification of this injunction, scores descended upon the Twitterverse to express their deepest, darkest convictions about anything their hearts desired.

While one generalizes about the entirety of Twitter at one’s peril—the site does, after all, represent pretty much everyone on Earth with a computer or a cell phone—perhaps the most useful tweet under this “unpopular” header was by dallin33, who wrote, “95% of #confessyourunpopularopinion tweets are in fact a popular opinion.”

Following a brief random sampling, I found this to be very much the case, with tweeters making such “bold” assertions as “gay marriage should be legal” (a view currently shared by 52 percent of Americans) and “God is real and he loves you and me” (more than 9 in 10 Americans believe in God in one form or another, though the figure is lower among young people).

I have written from time to time about the importance of expressing dissenting views.  Just last week I bemoaned the fact that so few people in our free and open society seem to think differently from the majority in the first place, and how this makes the airing of contrary positions that much more essential.

This Twitter twaddle demonstrates a related but distinct phenomenon:  The innate desire of many people to be intellectual and philosophical rogues, even when they are plainly not.  People who value and exercise opposition for its own sake, wearing it almost as an accessory.

On Internet social networks, one encounters such specimens all the time.  My college film classes were teeming with them:  Rebellious hipsters who churned out pages of copy about how Citizen Kane is worthless trash and Billy Madison is a masterpiece.

My problem with such oddball, improbable sentiments is not that they exist, but rather that, deep down, they probably don’t.  That the sorts of people to whom I refer are not being straight with us:  They don’t really believe the uncommon opinions they express, but they publicly assume them because doing so is more fun than simply agreeing with everyone else.

A part of me wants to applaud these self-appointed devil’s advocates for at least attempting to think outside the box, in a world where far too many just go with the flow, never questioning the conventional wisdom and assuming that if the majority agrees on a particular proposition, then it must be true.

And yet, I would much prefer if these intellectual gadflies came by their views honestly, as I suspect many of them don’t.

Christopher Hitchens once wrote a book called Letters to a Young Contrarian, which he began with a denunciation of the volume’s own title.  A “contrarian,” Hitchens argued, is not some sort of vocation to which one could or should aspire.  It is merely a descriptive term for one who, for whatever reason, tends to disagree with the majority most of the time, as Hitchens himself was known to do.

In this way, Hitchens saw his book as a means of reassurance for those who find themselves in the minority and worry that there is something wrong with them as a result.  However, Letters was not meant to stoke or encourage dissent in those who do not naturally possess it.

I think that is the correct balance to strike.  People should feel confident in expressing their views, whether popular or not, and should arrive at them honorably.

There is nothing inherently special about thinking differently from others.  The point is that one has the right to do so, and should not be deterred from espousing such thoughts out loud over such shallow considerations as political correctness or simple agreeableness.

However, just as one should have the courage to express inklings that are unfashionable or strange, one should be equally willing to acknowledge when one’s conclusions fall squarely in line with popular belief.  After all, once in a blue moon, the majority is actually correct.

Give the poseurs credit for one thing:  At least they know an unpopular opinion when they see one.