A Terrible Titan

John Silber, president of Boston University from 1971 to 1996, died on Thursday.  Combing the obituaries and appraisals of his life and career, one senses a positively titanic figure in the world of academia—a man whom it would be impossible to ignore, and precisely the kind of person one could simultaneously hate and respect.  A man almost beyond understanding.

That is the essence of America, and the point Orson Welles evoked with Citizen Kane, whose subject’s life was likened to a jigsaw puzzle.  We fool ourselves when we say our leaders should be Superman-like cardboard cutouts with absolute moral clarity.  Complexity is the new black.

I did not attend Boston University during Silber’s long tenure as president, although he was still hanging around campus as president emeritus.  (He also served as Chancellor from 1996 to 2003.)  The legend of President Silber rang loudly in the halls and the dorms, both from those who experienced his leadership firsthand and those merely passing along rumors in a 16,000-person game of “telephone.”

In many young minds, Silber’s legacy was and will continue to be predominantly defined by his extremely turbulent relationship with all matters relating to homosexuality.  It was not merely that, as acting president in 2002, he disbanded a gay-straight alliance at the Boston University Academy (a BU-affiliated high school), but that he condemned members of the group as practitioners of “evangelism” and “homosexual militancy” for purposes of recruiting and training young cadets to engage in unspeakable sexual exploits.

It would be nice if the BUA incident were the sum total of Silber’s executive accomplishments and we could simply write him off as a wretched human being with not a shred of human empathy or any discernible connection with the real world.

But then we read that Silber had a son, David, who was gay and died of complications from AIDS in 1994, and realize the enormously complex role the subject of homosexuality must have played in Silber’s life—that it was more personal to him, for good or ill, than we might ever know.

We also cannot dismiss him outright because, as the record makes clear, he presided over, and was largely responsible for, a veritable renaissance in the life of his university.  During his rule, Boston University’s endowment exploded from $18 million to $422 million, its research grants from $15 million to $180 million, and he balanced every one of its budgets.  The school’s total real estate holdings tripled, with $700 million going towards various construction projects, and the pedigree of its personnel took a definite upward turn.

I have always reserved a soft spot in my heart for people who say exactly what they think about controversial subjects without fear of what might happen to them personally or professionally.  People who, like Joan Jett, don’t give a damn about their reputations.  Silber was such a man, to put it in the mildest possible terms.

I was tickled to read how, for instance, Silber’s initial BU job interview consisted of him ticking off his myriad complaints about the state of the university at the time.  As reported in his New York Times obituary, “He called the campus ugly, bemoaned a faculty laden with ‘deadwood’ and said the university might be dying.”  He was hired on the perceived strength of his ideas.

Having watched on YouTube a fairly recent interview with Silber by the Texas Tribune, I found a man who still very much lived up to the title of his first book, Straight Shooting, and—to my mild surprise—a man with whom I could find much common ground on matters of bureaucracy in education.  (His views on the dangers of political correctness in academia are well worth debating.)

Silber was also a man of irrepressible bluster.  Asked for self-assessment by a local TV reporter during his 1990 run for Massachusetts governor, he cited “honesty” as his primary strength, and then proceeded to stonewall the question of personal weakness.  As Mitt Romney has learned in 2012, the public is not terribly impressed when the second half of a sentence contradicts the first half.

I mentioned Citizen Kane as an entry point into understanding the lives of certain men—or rather, into realizing they can’t be understood at all.  John Silber’s was one that might make a decent film in its own right, and he seemed to share many qualities with Charles Foster Kane himself.  “Here’s a man who could have been president,” we are told in Kane, “who was as loved and hated and as talked about as any man in our time.”

I would be very interested in what a search for Silber’s childhood artifacts might dig up.

Electoral Math Problems

My presidential campaign epiphany of the week:  It’s over.  The election is over.  Democrats can get their bottles of Korbel in position.  Republicans might as well begin Monday morning quarterbacking now.  The president will be re-elected on the sixth of November.  Not a doubt in my mind.

This moment of clarity occurred with a studied gaze at the electoral map.  To coin a phrase:  It is a simple matter of arithmetic.

You begin with the map as it looked in 2008, when Barack Obama walloped John McCain by a score of 365-173.  Concede that Mitt Romney will win Indiana, which McCain lost, as practically every poll suggests.  Despite the DNC’s flag-planting in Charlotte, Romney is likely to carry North Carolina as well.

The third surprise of 2008 was Obama’s victory in Virginia.  While the numbers show him likely to win the Old Dominion a second time, for the purposes of this exercise we’ll move it into Romney’s column anyway.  Finally—just for yuks—we will do the same with Ohio and Florida, the two biggest prizes of all, which, at the moment, are leaning very definitely in favor of the incumbent.

The result of this electoral tweakage, in which we give Romney every plausible benefit of the doubt?  Obama still wins, albeit by a razor-thin margin of 272-266.  So there you have it.
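For the record-keepers, the arithmetic sketches out as follows.  Under the post-2010-census apportionment, Obama’s 2008 map is worth 359 electoral votes rather than 365; the state-by-state counts below are my own lookup, not figures from any campaign, and reaching 272 also assumes Romney reclaims Nebraska’s 2nd congressional district, the lone district Obama carried in 2008.

```python
# Hypothetical 2012 scenario: Obama's 2008 map, minus everything conceded
# to Romney above.  Electoral-vote counts reflect post-2010-census
# apportionment (my own lookup, labeled as an assumption in the lead-in).
obama_2008_map = 359  # Obama's 2008 states, revalued at 2012 EV counts

conceded_to_romney = {
    "Indiana": 11,
    "North Carolina": 15,
    "Virginia": 13,
    "Ohio": 18,
    "Florida": 29,
    "Nebraska 2nd district": 1,  # Obama's lone district win in 2008
}

obama = obama_2008_map - sum(conceded_to_romney.values())
romney = 538 - obama
print(obama, romney)  # 272 266
```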

The presidency, as everyone knows, is not determined by which candidate receives the most votes, but rather by which candidate receives the most votes in a combination of states whose cumulative number of senators and representatives exceeds that of the states won by the other candidate.  As God intended.

I speak, of course, of the Electoral College.  There is a time in every cycle to reexamine its usefulness in our democratic system, and as Robert Plant so pungently crooned, “Now’s the time; the time is now.”

My favorite piece of revisionist history from 2004 concerns the state of Ohio, which George W. Bush carried by 119,000 votes, a margin of 2.1 percent.  Had those votes swung to John Kerry, Kerry would have won the state, the Electoral College and thus the election, while trailing Bush in the national tally by nearly three million votes—roughly five times the margin by which Al Gore led Bush in 2000.

What does this mean?  Constitutionally speaking, not a damn thing.  No president has ever been elected by national popular vote, and our system used to be far less democratic than it is now.  (For instance, U.S. senators were not elected by popular vote until ratification of the 17th Amendment in 1913.)

Yet somehow it just feels wrong that, as in the above hypothetical, a man could become president with three million fewer votes than his opponent.  In fact, it is theoretically possible for a candidate to win 72 percent of the national popular vote and still lose the election in the Electoral College.  (This would entail winning 100 percent of the vote in small states and a hair under 50 percent in big states.)
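The parenthetical scenario reduces to one line of algebra:  if the states a candidate sweeps outright cast a fraction s of the national vote, and he takes just under half of the vote everywhere else, his popular-vote share approaches s + (1 - s)/2.  Hitting 72 percent therefore requires the swept bloc to hold about 44 percent of the voters; the figure below is implied by that arithmetic, not drawn from census data.

```python
def national_share(s):
    """Popular-vote share for a candidate who wins 100 percent of the vote
    in states casting fraction s of the ballots, and just under half of
    the vote everywhere else (the limiting case)."""
    return s + 0.5 * (1 - s)

print(round(national_share(0.44), 4))  # 0.72
```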

You could argue from dawn to dusk that such a ridiculous outcome will never actually come to pass—and you would be correct—but that is pure evasion.  It could happen, and there would be nothing we could do about it.  Those are the terms from which we must begin this debate.

The debate itself, once it does commence, can spin off in a thousand directions.  Some considerations are more a matter of opinion than objective fact, as one weighs the relative importance of competing American values, chief among these being the perennial argument over federalism.  It began amidst the drafting of the Constitution itself and it has never quite abated.

Make no mistake.  More than anything, the Electoral College has endured because the country began with it.  There may be no more powerful force in American government than that of precedent.  An entire judicial doctrine, stare decisis, is built upon it.  The essence of conservatism (at least on a good day) is to err on the side of long-established tradition and to alter or abolish such traditions with the utmost care and skepticism.

You don’t need me to point out that the “we’ve always done it this way” argument has occasionally led the U.S. astray.  Pick your favorite example; there are many from which to choose.

As far as the Electoral College is concerned, the smart money would say that if the 2000 election failed to generate a robust movement to abolish it, we just might need to accept that we’re stuck with it for as long as most of us will live.

For those made unhappy by this state of affairs, I hope some solace can be found in how entertaining the system has made the prognosticating profession.  You wouldn’t want Nate Silver out of a job, now, would you?

A Matter of Debate

Next Wednesday, October 3, is the first debate between Mitt Romney and Barack Obama.  It and the two others later in the month will determine who wins the election.  We know this to be true, because people on television have said so ever since the conventions ended—conventions that, themselves, were also going to determine who wins the election.  Along with Romney’s vice presidential selection.  And the economy.  And Libya.

If all else fails, maybe we’ll just try voting.

I have to level with you:  I did not watch the Republican primary debates early in the year.  However enlightening the prospect of nine people deliberating the world’s problems in 90 minutes might have looked on paper, I somehow found them a slight waste of time.

One-on-one debates, for all their faults, at least offer the possibility that an actual, coherent argument might break out.  We get to see our candidates forced to think on their feet and speak not only in complete sentences, but in complete thoughts.

But about those faults.

Despite all the tweaking the modern-day presidential debate has undergone over the past several election cycles, the final product never seems to improve.  Griping runs the spectrum:  Some complain the candidates are not given enough time to speak, while others insist they are given too much.  Some would like to see more genuine interaction between the two speakers; others prefer monologues.

The problem nearly everyone seems to agree upon—and rightfully so—is the ease with which the contestants manage to evade and subvert the truth.  With barely enough time for the candidates to speak, there certainly isn’t a moment for the moderator to call “bull.”

On this point, there is one way to improve the format that is so obvious, it has never even been considered:  Fact-checking in real time.

By now everyone is familiar with “fact checkers.”  They are the folks on television and especially the Internet who take it upon themselves to find out, through basic research, whether the things that politicians say are actually true.  In our parents’ day, these people were known as “journalists.”

Today’s fact checkers became famous for their meticulous rebukes of various claims by the Romney and Obama campaigns, particularly during the conventions.  No doubt they will be kept very busy—as they have been in years past—during the debates, afterwards assessing which candidates told the biggest whoppers and fudged the numbers the most.

Why can’t we hire these sleuths to work the room while the event is still going on?

The conundrum, as it now stands, is that most Americans turn off and tune out by the time the analysts roll onto the screen.  They see the candidates’ claims, but not the information that disproves the claims.  And so we are left with a voting public that is highly misinformed about the facts, in ways that are not entirely its own fault.

The proper corrective, then, is a line of quick-fingered researchers seated alongside the moderator, computers at the ready.  Every time a candidate says something that is empirically untrue or misleading, a red light will flash and the debate will grind to a halt as the record is corrected.  The speaker will not be allowed to continue his thought until he acknowledges the truth.

Presumably, this would make for terrible television, and would therefore never be tried.  Nor would I expect either campaign to agree to a format that all but guarantees its candidate will be humiliated.  Indeed, the newest innovation to this year’s series is quite the opposite:  Both men will be informed of the topics of discussion in advance, effectively eliminating any element of surprise.  Isn’t that always how it goes?

Newt Gingrich, for his part, has suggested returning to the format of the Lincoln-Douglas debates of 1858.  Those proceeded as follows:  One man would speak, uninterrupted, for 60 minutes straight.  Then the other candidate would offer a 90-minute rebuttal, followed by a 30-minute closer from the first guy.

To transplant that into today’s world…well, I can only think of the rule of thumb from Christopher Hitchens, “One has to be a spellbinding person to speak for more than half an hour.  And if one is spellbinding, one doesn’t really need the half-hour.”

On the other hand, three hours of near-continuous talking would sure resolve the issue of evasiveness.  If a 90-minute soliloquy does not give the viewing public a sufficient impression of a candidate’s mettle, his character and his command of the issues at stake, then the real problem is not the candidates or the networks, but those poor souls on the other end of the TV screen.

Don’t Vote

Oftentimes in life, abstinence is the way to go.

Joseph C. McMurray, an economics professor at Brigham Young University, recently posted a frightfully interesting article about the wisdom of voting, titled, “The mathematics of democracy:  Who should vote?”  Drawing from 18th century political philosopher Nicolas de Condorcet, McMurray argues that, statistically speaking, society would be better served if a certain hunk of the population did not vote at all—namely, the least-informed hunk.  He recommends, in effect, that such lowly individuals stay home on November 6, entrusting the future of the country to those more in the know.
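McMurray’s statistical argument descends from Condorcet’s jury theorem, and the core effect is easy to verify numerically.  What follows is my own toy illustration, not McMurray’s actual model:  it computes the exact probability that a majority of voters picks the “better” of two candidates, and shows that diluting five reasonably informed voters (each right 70 percent of the time) with six coin-flippers makes the group as a whole less reliable.

```python
from math import comb

def majority_correct(groups):
    """Probability that a majority vote picks the better candidate.

    groups: list of (num_voters, prob_each_votes_correctly) pairs;
    the total number of voters should be odd so there are no ties.
    """
    total = sum(n for n, _ in groups)
    # Convolve the binomial distributions of correct votes across groups.
    dist = {0: 1.0}
    for n, p in groups:
        new = {}
        for k, prob in dist.items():
            for j in range(n + 1):
                p_j = comb(n, j) * p**j * (1 - p) ** (n - j)
                new[k + j] = new.get(k + j, 0.0) + prob * p_j
        dist = new
    return sum(prob for k, prob in dist.items() if k > total / 2)

print(round(majority_correct([(5, 0.7)]), 4))           # 0.8369
print(round(majority_correct([(5, 0.7), (6, 0.5)]), 4)) # 0.7374
```

The drop from roughly 0.837 to 0.737 is the nub of the argument:  each marginal coin-flip voter adds noise, and on these assumptions the group would choose better if the uninformed stayed home.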

It has long been a pipe dream of mine, and of many people I know, to bring literacy tests back into the electoral process.  That is, to require that citizens know a few basic things about the country in which they live before being allowed into a voting booth.  The theory is that the more educated and informed the voter is, the more likely he or she will elect good leaders who will enact good policies.

Of course, the last time America tried this, it was for the less-than-noble purpose of suppressing the votes of black people in the Jim Crow-era South.  Possibly we could agree that, should we resurrect the literacy test for the present day, a fair amount of tweaking would be in order before putting such a system into effect.

In any case, McMurray’s notion of all voters not being created equal is an intriguing one, and worth exploring further.  In practice, I would argue, voting for its own sake is neither noble nor particularly smart.

There will come the moment, for instance, when you’ll be marking your own ballot and arrive at the race for city comptroller.  For all the due diligence you have paid to this year’s campaigns, the comptroller battle somehow eluded your attention:  You know nothing about any of the candidates, nor are you certain precisely what a comptroller does.  Nonetheless, as with every race, you are as entitled as anyone else to cast a vote.  Indeed, all your life you have been taught it is your duty to do so.

Resist the temptation.

Sure, you could have fun with this:  Voting for the candidate with the silliest name, say, or for the representative of your preferred political party.  Or picking the female candidate, because there aren’t enough female comptrollers in the world.  And so forth.

Except that these are real people who take this process seriously; their jobs are on the line.  Do they really want to be elected in such a facetious manner?  Is it not wholly irresponsible—an abuse and a mockery of this most sacred democratic exercise—to cast a vote that is so plainly uninformed?  I say—in case I haven’t yet made myself clear—that it is, and should be avoided at all costs.  Leave the comptroller-selecting to those who are sufficiently well-versed in comptrollership.

As the recent violence in Egypt and Libya was breaking out, ostensibly in reaction to an Islam-insulting film, Salman Rushdie made the point that just because one has the right to free expression does not mean one should necessarily exercise it.  Speaking about the mysterious cabal behind the film in question, Rushdie said, “Even jerks have the right to free speech, but they’re still jerks.”

Ignorant people have the right to vote, but they are ignorant nonetheless, and they are part of the problem rather than the solution.  Thomas Jefferson understood this when he wrote, “If a nation expects to be ignorant and free in a state of civilization, it expects what never was and never will be.”

People often complain that voter turnout in presidential elections is far too low, and therefore the true desires of the American public are never really represented or given voice.  Often left un-pondered, then, is the possibility that turnout is actually too high—at least among those who have no business involving themselves in the electoral game.  What point is there in canvassing the views of the greatest possible sampling of Americans if it includes the views of people who have no idea what they’re talking about?

The real problem, you see, is that the ignorant tend not to know who they are, as suggested by those amusing polls that show 80-90 percent of the people consider themselves “above average.”  So this is not, alas, simply a matter of voluntary abstention—hence the appeal of some theoretical literacy test to winnow the field of potential voters into a truly elite corps.

Since we are not actually going to do such a thing anytime soon, the next-best option lies in what founders like Jefferson so strongly recommended:  Education.  Said John Adams in 1776, “Laws for the liberal education of the youth, especially of the lower class of the people, are so extremely wise and useful, that, to a humane and generous mind, no expense for this purpose would be thought extravagant.”

That is the kind of government spending we need.

The Heart, the Head and ‘The Master’

I spent a good deal of high school thinking I would devote my life to writing about movies.  In this pursuit, four years of college taught me exactly one thing:  I don’t know the first thing about them.

Higher education, properly understood, exists for two reasons:  First, to impart knowledge; second, to impart wisdom.  In my own case, the knowledge was that I did not comprehend the world of film.  The wisdom was in realizing I didn’t need to.

The allure of movies, I have slowly come to appreciate, is that they cannot (and possibly should not) be completely, fully understood in the usual sense of the word.  They exist in a dimension beyond simple logic, and are driven as much by emotion as by reason.  Our favorite films are the ones that linger in our minds in ways we can’t quite put into words.  As an old Supreme Court justice famously said, we know it when we see it.

I make these points having just seen The Master, the latest project by Paul Thomas Anderson, at whose feet I will partly lay blame for scaring me out of becoming a movie critic.  His pictures, which include one of the very best films of the 1990s (Magnolia) and of the 2000s (There Will Be Blood), are as much symphonies as films—pure, visceral experiences, like an opera in a language one doesn’t speak—and can be seen and appreciated as such.

The Master very much follows in this tradition, with its relentless and brooding score, its uber-sharp-focused (yet paradoxically dreamy) photography, and especially its macabre lead performances by Joaquin Phoenix and Philip Seymour Hoffman.  The storyline is not difficult to follow, per se, but the real pleasures in The Master are tangential to its narrative thrust.  One needn’t take it literally, comprehending every detail, to be wholly enraptured.

I told you about movies so I can tell you about politics.

People like to think that electoral politics is a simple matter of appraising “the issues” and choosing the candidate whom we deem “better” for them.  That the dilemma of whom to cast one’s ballot for is reducible to some kind of equation—wholly rational, with no emotional or “gut” considerations necessary.

Ah, were it to be so!

By what rationale, might I ask, did most fans of Barack Obama in 2008 lend their ever-so-enthusiastic support?  Was it from the sheer force of his arguments about the need for affordable health care and withdrawing troops from the Middle East?  Or was it rather from the tenor with which he made such arguments, and the words and phrases he employed?  Was the attraction logical or visceral?

Could the average Romney voter, if pressed, name the specific policies of the Republican ticket that draw him or her to support it over the Democrats’?  Does the proposition of voting for “change” or as a means of “taking the country back” withstand the plain light of day, or does it merely reflect the way people feel?  Do I need to stay for an answer?

Needless to say (but I will say it anyway), not every citizen votes his heart over his head, just as many moviegoers can’t stand anything that approaches the ponderous or the abstract.  Just the facts, thank you very much.

What I wonder, though, is whether viewing politics with an emotional, rather than rational, bent is actually the preferable approach.  David Brooks argues frequently that good decision-making requires a healthy combination of both, and that dismissing emotional considerations entirely is impossible.  Might this be a good thing?

We would do well to consider the limits of pure rationality in the complex world of governance, particularly when our present enemies—in Afghanistan, Iran and elsewhere—find so little use for it themselves, thanks in no small part to the outsized influence of organized religion in that region of the world.

By no means should we abandon reason altogether—that way madness lies.  I am acutely aware that when comedian Lewis Black joked that the best way to defeat psychotic nemeses is to “be more psychotic than they are,” he was being (mostly) ironic.  The whole point of The Master is that following men who make unverifiable claims tends to lead one astray.

The fact remains, however, that most voters do not comprehend the nuances of public affairs any more than I comprehend the nuances of film.  Our faculties of thought can only get us so far before the wisdom of our gut kicks in to take care of the rest.  Such an impulse need not be batted away.  Humans are rational creatures, but that is not all we have to offer.

Gay Linguistics

I am apparently a member of the gay community.  I guess my invite to the Labor Day community barbecue got lost in the mail.

But then I forgot:  We are no longer called the “gay community,” but rather the LGBT (lesbian, gay, bisexual and transgender) community.  No, scratch that—I believe the moniker is now LGBTQ (“questioning”) or, in some circles, LGBTQIA (“intersex” and “asexual”).

Nothing to make you feel included like being reduced to an initial.

Gay rights has been called the final frontier in the quest for civil equality in the United States.  A major reason for its recent breakthroughs, I am convinced, is the way the movement has learned and successfully exploited a most fundamental rule of politics:  He who controls the language, controls the debate.

The argument about abortion, after all, is nothing if not a competition over defining terms:  One side would have it be a fight of “pro-abortion” versus “pro-life,” while the other prefers, simply, “pro-choice” versus “anti-choice.”

So it has gone with gay civil rights.

We might begin with the word “gay” itself.  As Frank Bruni chronicled in the New York Times, the prime time speakers at this month’s Democratic National Convention referred to (and boasted about) gay issues with unprecedented ubiquity, all the while shielding them in euphemism and indirection.

The repeal of “don’t ask, don’t tell”—the Obama administration’s signature achievement on the subject thus far—was defined, with near-perfect synchrony, as a matter of not gauging soldiers’ military aptitude based on “who they love.”  Same-sex marriage was alluded to along similar lines, with “love” as the buzzword of choice.

Republicans, for their part, have become no less queasy about using “gay” for their own purposes, shrouding the marriage question, as ever, in terms such as “traditional family values” and the like.  Long gone are the days of Pat Buchanan pounding his fist and tarring Bill Clinton and Al Gore as “the most pro-lesbian and pro-gay ticket in history,” which, in 1992, they surely were.

On same-sex marriage—the gay rights initiative to beat the band—the movement has achieved its most significant linguistic coup:  “Marriage equality.”  Not unlike the anti-abortion cohort that coined the term “pro-life,” gay rights activists have made a most savvy attempt to frame the debate as a matter of a fundamental founding American principle—in this case, the business about all men being created equal.  With those ends in mind, “changing the definition of marriage” suddenly sounds a lot less sinister to the average skeptic.

“Marriage equality” is the turn of phrase that appears in this year’s Democratic Party Platform—the first such platform to endorse the practice, whatever its name.  It is the term all politicians who support the cause have been trained to employ, just as they have adopted the aforementioned alphabet-soup approach in referring to the “community” itself.  Again, the word “gay” has been all but whitewashed from the conversation.

As successful as this linguistic recalibration has been, it nonetheless rather rubs me the wrong way.  Precisely because language is so important in public discourse, there is something shady afoot when one side of a debate makes an effort to hide its very nature behind unnecessary jargon—particularly a group that places so much stock in the concept of “pride.”

Why, for instance, is the most prominent gay rights group in the country called the Human Rights Campaign?  Do they fear their support and influence would diminish if they made it clear, in the title, what they actually stand for?

Why is a straight man’s significant other his “girlfriend” while a gay man’s significant other is his “partner”?  Why is a straight woman’s spouse her “husband” while a gay woman’s spouse is her “spouse”?  If we want to be treated exactly like everyone else, why do we allow this rhetorical distinction to persist?

Words matter, as a certain president once said.  So long as the movement goes on, I would advise that it conduct itself with the greatest possible intellectual honesty and forthrightness, which means saying what it means in language that clarifies, rather than distorts or conceals.  Trust that the force of the argument itself is sufficient to win the hearts and minds of those whom we need and desire to join the cause.

Have a little backbone.  A victory without integrity is not a victory at all.

Beer, Pandering and Politics

Perhaps you heard the excellent news on the first of this month, that the White House has released the recipe for the executive mansion’s first-ever homebrewed beer, known as White House Honey Ale.  Rejoice!

It turns out the Obama administration had been surreptitiously serving its hometown hops for such occasions as the White House Super Bowl party and St. Patrick’s Day, as well as during a one-on-one chat between the president and Dakota Meyer, a U.S. Marine Corps Sergeant and Medal of Honor recipient.  But it was not until late last month, when a series of formal recipe requests were filed by enterprising, homebrewing attorneys, that the existence of such a beverage became widely known.

One need not be a beer enthusiast—I am more of a whiskey man myself—to appreciate the surprising primacy beer has enjoyed in the Obama administration.  Recall, for one, the “beer summit” the president held in July 2009 between Harvard Professor Henry Louis Gates and Sergeant James Crowley, following Crowley’s arrest of Gates in the doorway of his own house.  (Crowley and Gates ordered Blue Moon and Sam Adams Light, respectively; the president opted for Bud Light.)

In his quest for a second term, Obama has been photographed regularly in local watering holes in various swing states, chugging his fair share of brewskies with adoring (and highly inebriated) supporters.  Having allegedly kicked his nicotine habit, the president is known to be a casual drinker on his own time, but as with everything else, swilling kicks into overdrive during campaign season.

Obama has clearly embraced the well-worn election-year Rorschach test, “Which candidate would you rather have a beer with?”  Taken literally, the beer battle of 2012 is not exactly a fair fight, inasmuch as Mitt Romney, a practicing Mormon, abstains from all varieties of alcohol.  It seems almost unfair for Obama to crack open a cold one as he mingles with the common folk while Romney is compelled to settle for O’Doul’s.

In his 2008 primary showdown against Hillary Clinton, Obama truly seemed to have met his match, as Clinton could be seen engaging in all manner of imbibing as a means of loosening her image as some kind of stiff.  But it was not pure theater on her part, as we saw in a priceless photo log from a recent trip to Colombia in her role as secretary of state.

Taken less-than-literally, the beer question is really a roundabout way of asking, “Which candidate is more of a regular guy?”  Or, more directly still, “Which candidate is more like me?”  The implication is that we, as a people, would prefer our leaders to be just like us and we intend to vote on that basis.

As with all forms of mass pandering, there is a self-fulfilling prophecy at work here.  The candidate downs a beer because he thinks it is what his audience wants him to do.  Presumably, if the public communicated that a candidate should not be drinking on the job, he would stop.  Our leaders are nothing if not trained monkeys we can command at will.  We can blame only ourselves for making them this way.

My own view on the beer matter (so long as I’m here) is that candidates should act as if dignity and self-respect still mattered in public life—to act as free agents, not the mere instruments of the people’s whims.  In short:  To lead.

I don’t mean to say that our representatives should abstain from drinking or, for that matter, from pandering in general.  After all, they are called “representatives” for a reason.  There are plenty worse ways for a candidate to spend his or her time than interfacing with the public.

All I ask for is truth in advertising.  If a pol insists on saying and doing what he thinks the people want to hear and see, he must maintain the façade throughout his entire term in office, realizing that that is the implicit deal he has struck with his constituents.  Whether he truly means it—whatever “it” is—is secondary.

With luck, such pretensions will not endure the plain light of day and the people will recognize a phony when they see one, which would be to everyone’s benefit in the long run.  St. Mark asks us, “What shall it profit a man, if he shall gain the whole world, and lose his own soul?”  Some politicians have had the misfortune to learn this lesson the hard way, and that was when the real drinking began.