Notorious THC

I didn’t inhale on 4/20 this year.

However, I did ingest.

Specifically, I sucked on two pieces of watermelon-flavored hard candy infused with THC—the active ingredient in cannabis—until they completely dissolved on my tongue and entered my bloodstream.

To be clear, I didn’t pop two pieces into my mouth in rapid succession.  I’ve read Maureen Dowd’s infamous 2014 column about her ill-fated run-in with a hopped-up chocolate bar in Colorado (“I barely made it from the desk to the bed, where I lay curled up in a hallucinatory state for the next eight hours”), and I know better than to overindulge in an edible with so little experience under my belt.

No, I did exactly what the packaging told me to do:  “Start with one piece.  Wait two hours.  Be mindful.  Enjoy.”

In retrospect, I should’ve been a little less mindful.

Precisely 120 minutes after my first dose, I felt no physical or psychological effects whatsoever.  At that point, I rather restively administered dose number two, from which proceeded another hour-plus of nothing, followed, at long last, by a slight tingle of…something.  Not a high, per se, let alone a full-blown case of the giggles and/or the munchies.  Nope, just a passing wave of vague euphoria that ended almost as quickly as it began—five, ten minutes, tops.

And that, regrettably, was that.  An evening’s worth of buildup to a virtually non-existent payoff.  So much for the warning on the back of the box:  “The effects of this product may last for many hours.”

What made this 4/20 test run all the curiouser was how very different it was from the night before, Good Friday, when I introduced myself to the world of edibles for the very first time.  In that case, I consumed a single lozenge around 8 o’clock.  At 9:15, while sprawled on the couch watching TV, I found myself breaking into a spontaneous full-body sweat, my heart thumping 50 percent harder than it was a moment before, my mind unable to concentrate on anything beyond removing my socks so my feet wouldn’t suffocate.

While I wouldn’t describe this scene as Maureen Dowd-level paralysis—“hallucinatory” is too grand a word to apply here—I nonetheless kept more-or-less completely still as the weird and less-than-wonderful sensation swept over me, resigned to sit quietly until the perspiration subsided and my heart tucked itself back into my chest, where it belongs. 

When both of those things occurred—again, it didn’t take terribly long, although it sure felt like it—I had no particular urge to double my money with another hit of THC just then.  As a newbie, better to quit while I’m ahead, declare the experiment a success (of sorts) and spend the balance of my Friday night with a relaxing—and altogether predictable—bottle of merlot.

It’s a truism of the pot world that marijuana affects everyone differently.  As has now been proved to me, it is equally true that its effects on a given individual can vary from one day to the next.

Of course, none of the above would be of the slightest interest to anybody, except for one extraordinary fact:  It was all perfectly legal.

Through a voter referendum, the commonwealth of Massachusetts legalized the sale and consumption of marijuana for recreational purposes on November 8, 2016.  And last Thanksgiving—a mere 742 days after the fact—the state’s first two cannabis retail establishments officially opened for business.

Today, there are 15 pot shops (and counting) sprinkled across Massachusetts—including, as of last month, the first recreational dispensary in Greater Boston, New England Treatment Access (NETA) in Brookline, which is where I found myself last Friday morning.  When I arrived at 9:50, there were at least 30 people lined up outside the former Brookline Bank where NETA is housed, waiting to get in.  When the place opened 10 minutes later, at least as many cheery townsfolk were lined up behind me.  Apparently I wasn’t the only one who knew that April 20 was mere hours away.

Customers were escorted inside the ornate marble building five at a time—after getting their IDs checked and scanned, with a Brookline police officer stationed casually nearby—and were promptly handed an eight-page menu of the shop’s litany of products as they waited for the next available register.  (As with the bank that used to occupy the same space, all valuables were concealed safely behind the counter.)

While tempted by the Belgian dark chocolate bar—Maureen Dowd’s experience notwithstanding—I finally opted for the 16-piece “D-Line Gems,” which the sales associate fetched and rang up for an even $30—$25 for the candy itself, plus a 20 percent sales tax that, per the law, is added to all cannabis-related purchases.  (Actually, it’s three different taxes in one—“local marijuana tax,” “sales tax (cannabis)” and “marijuana excise tax”—but who’s counting?)

Oddly, I wasn’t the slightest bit interested in purchasing an actual cannabis plant, nor the various accessories that go with it.  At my advanced age (31), I suppose I just don’t have the patience for the rituals that old-fashioned pot smoking entails.  As a working man who regularly interacts with the general public, I could certainly do without the smell.

In truth, I could probably do without marijuana altogether, whether smoked, sucked, swallowed or swilled.  Before last week, I hadn’t touched the stuff in nearly nine years, and only a handful of times before that.  Sometimes it’s been a blast; other times, a bust.  I expect I’ll be paying NETA another visit sooner or later, although I doubt it will become much of a habit.

In a country that still occasionally calls itself the land of the free, I’m just happy, at long last, to have the choice.

Get on the Cannabus

I did not smoke pot this past April 20.  Truth be told, I haven’t smoked pot at all since the summer of 2010—and only a handful of times before that.  I don’t say this to impress you.  Were a joint to spontaneously appear in front of me, I’d likely grab it faster than Donald Trump grabs a Filet-o-Fish (and to greater effect).

I first encountered (read: inhaled) marijuana during my freshman year of college—specifically, in my dorm’s communal bathroom on Good Friday—because some guy down the hall had a secret stash and I happened to be idling nearby.  While I wouldn’t call that evening life-changing—if memory serves, it consisted mainly of eating a family-sized bag of Doritos and avoiding eye contact with the RA—it set the template for every weed-smoking episode that followed:  I didn’t actively seek it out, but when the opportunity presented itself—invariably through some vague acquaintance whom I’d probably never see again—I didn’t put up much resistance.  Following years of curiosity—and all the hysterical anti-drug propaganda that went with it—I wanted to understand what the fuss was about, and I was seldom disappointed with the result.

That was then—a blessedly distant world of prohibition in which to get high was to put oneself at the mercy of the American legal system—a risk that, as with underage drinking, undoubtedly added to the allure and pleasure of the overall experience.  (White privilege probably helped, too.)

In the intervening years, however, something rather strange has happened:  Marijuana has become legal.  As of this writing, nine states and the District of Columbia have OK’d the personal recreational use of the cannabis plant in all its forms, while another 20 states have sanctioned it for medicinal purposes—a gateway maneuver if I ever saw one.

Among the nine-and-a-half states that have gone whole hog on the pot question is my home commonwealth of Massachusetts, whose voters approved a pro-pot ballot referendum on November 8, 2016—an admittedly ironic day for such a liberal, forward-thinking decision.

Strictly speaking, marijuana became legal in Massachusetts less than six weeks after Election Day, with residents allowed to grow, possess and consume small amounts of the substance to their hearts’ content in the privacy of their own homes.  However, government bureaucracy being what it is, it will not be until July 1—fully 20 months after the vote—that recreational pot shops will open their doors and, for the first time, their products will be commercially available to those, like me, who have been largely cut off from the cannabis black market up to now.

Of course, the $1 billion question is whether the normalization of weed will turn me—and, in time, the entire state—into a lazy-eyed smokestack who spends all day listening to Pink Floyd and giggling at the wallpaper.  Whether ease of access will translate into frequency of use, and all the productivity-depleting horrors that supposedly follow.

Having never tended my own private marijuana nursery, I cannot know that answer for sure until the magic hour arrives.  However, my hunch is that very little will change in my consumption habits overall, and I would wager the same about most of the fellow inhabitants of my state.

How so?  First, because, as a rule, the per-serving market rate for legal weed tends to exceed that of alcohol—already the far more entertaining of the two drugs—and I am nothing if not a cheap date.  Second—and speaking of booze—I can’t help but notice that, pound-for-pound, I imbibed a lot more liquor before turning 21 than after.  As enjoyable as moderate drinking can and will always be, once all the legal barriers fell—once I could walk into a package store without a fake ID and emerge with a six-pack of Sam Adams unmolested—the temptation to overindulge was just never the same.  Call me an old fogy, but I find that spending the majority of one’s Sunday hunched over a toilet bowl isn’t nearly as fun at age 30 as it is at 18, 19 or 20.

The dirty little secret about drugs—as with pretty much everything—is that nothing dulls the appetite like legalization, and the most surefire way to create a culture of addicts is to take their favorite product away from them.  History is littered with examples of this very phenomenon—not least in the United States between 1920 and 1933—although my personal favorite is the observation made in the 1990s to Salman Rushdie—then under fatwa for writing The Satanic Verses—that “in Egypt, your book is totally banned—totally banned!—but everyone has read it.”

To be honest, it’s unlikely I’ll be smoking cannabis ever again—even after July 1.  Having never learned to roll a joint properly and not wanting to set off smoke alarms in my own house, my pot consumption, such as it is, will almost surely come in edible form, be it candy, chocolate or whatever else the kids are cooking up these days.  While I understand the pitfalls of ingesting marijuana-laced baked goods for the first time—elucidated most memorably by Maureen Dowd in a 2014 New York Times column—the notion of sucking smoke deep into my lungs has struck me as an increasingly unappetizing means of getting high when biting into a slightly odd-tasting cookie will produce more-or-less the same result.

But that’s just me.

Vote for Burr!

“I’ve never agreed with Jefferson once / we’ve fought on like 75 different fronts / but when all is said and all is done / Jefferson has beliefs / Burr has none.”

So raps America’s first treasury secretary at a critical moment toward the end of Hamilton, Lin-Manuel Miranda’s sublime remix of U.S. history that has taken Broadway (and my iTunes library) by storm.  The moment occurs in the heat of the presidential election of 1800—a campaign still regarded as among the ugliest and most over-the-top in American history—which saw an Electoral College tie between the top two candidates, Thomas Jefferson and Aaron Burr, meaning the presidency would ultimately be decided by the House of Representatives.

As congressmen remained deadlocked after 35 rounds of voting, it fell to Alexander Hamilton to act as unofficial kingmaker.  And what a nauseating choice it was:  Jefferson, as leader of the rival Democratic-Republicans, had long served as Hamilton’s singular political nemesis.  Burr, meanwhile, had once been a fellow Federalist but abruptly switched parties in 1791 to run—successfully—for the U.S. Senate against Philip Schuyler, a respected Federalist who also happened to be Hamilton’s father-in-law.

In short, the choice in 1800 was between a man who embodied everything Hamilton hated and a man who embodied nothing at all except sheer, naked ambition.  In the end, Hamilton sided with Jefferson, the House followed suit and the rest…well, you know the rest.

(Prior to the vote, Hamilton had effectively killed off incumbent President John Adams with a 54-page pamphlet attacking his administration.)

I recount this pivotal episode in American electoral history partly as a rebuttal to the longstanding myth that the Founding Fathers were essentially perfect.  That in addition to being uncommonly learned and intelligent, they were also uncommonly virtuous and civil and refreshingly devoid of any pettiness or ego.  That they sacrificed everything—their lives, their fortunes and their sacred honor—for the noble cause of American independence, wholly unencumbered by personal agendas or other selfish interests.  That, in short, they were totally unlike the lowly political leaders we’re stuck with today.

Deep down, of course, we know that most of the above is complete nonsense.  We have read first-hand accounts of the founding era for eons and understand how personally and tragically flawed the authors of America truly were.

With Hamilton—arguably the richest and most historically accurate depiction of the founding ever created (in spirit, if not in verse)—we have been given a singularly visceral insight into how those flaws actually played themselves out.  How, for instance, a certain orphan immigrant had his life cut short—likely by several decades—because he dared to question the honor of another man and, when challenged, couldn’t summon the nerve to take it back.

The rivalry between Alexander Hamilton and Aaron Burr is the stuff of legend, although until recently the intrigue had been confined to their famous duel on July 11, 1804—a face-off that ended Hamilton’s life and Burr’s career.  In fact, their relationship ran more or less continuously from 1776 onward, and in Miranda’s play, Burr is both the narrator and co-protagonist.

In a show that aims to narrow the gulf between yesterday and today—and thus make the past more accessible to us in the present—the decision to grant Burr such narrative primacy has proved eerily prescient, given our contemporary political climate.

After all, we are right now engaged in a sustained national debate about whether the likely Republican presidential nominee is—to quote Maureen Dowd—“more like Hitler, Mussolini, Idi Amin, George Wallace or a Marvel villain.”  Donald Trump, in any case, is widely recognized as a man with no core convictions except to become the most powerful man on Earth and, to that end, a man willing to alter his views at the drop of his dopey red hat.

In this way, Donald Trump is Aaron Burr.

What is more, should the GOP contest lead to an open convention in July—an increasingly plausible scenario—Republican delegates will be confronted with a strikingly similar prospect to that faced by Hamilton in 1800:  Should they allow the party—and possibly the country—to be ruled by someone whose only objective is the acquisition of unlimited power?

The party’s dilemma, of course, is that there is no Jeffersonian figure to save them from themselves.  There is, instead, Senator Ted Cruz—a candidate who is nominally an across-the-board conservative but is also, alas, a smarmy, cynical narcissist with no friends or accomplishments to speak of.  (Jefferson, however ambitious, was at least capable of feigning humility.)

In fact, it is America’s non-Republicans who now gaze upon the GOP primary with the same horror Hamilton felt when forced to choose between Jefferson and Burr.  It’s been a nagging question for us liberals:  Assuming no other alternatives, do you go for the guy who is decent enough to believe in something—albeit the opposite of everything you believe in—or is it preferable to roll the dice with someone who only believes in himself and, thus, could possibly be dealt with under certain conditions?

For a good long while, I leaned ever-so-slightly toward the latter, figuring that in a country whose major global enemies are apocalyptic religious fundamentalists, the last thing we need is to elect a fundamentalist of our own.  (Say what you will about Trump, but I find his rank indifference to religion among his more reassuring qualities.)

But now I’m not so sure.  While I foresee no universe in which the politics of Ted Cruz suddenly become tolerable—to say nothing of Cruz himself—there is an equally compelling argument that someone like Cruz would at least bring a certain predictability to the presidency that a reckless barbarian like Trump—by his own boastful admission—would not.

In 1800, Hamilton and others viewed Burr as not just unprincipled but as outright dangerous to the continued health of the nascent republic—not least because of his inherent unknowability and bottomless opportunism.  Indeed, isn’t it more or less tautological that men with no firm concept of good are the most liable to commit evil?  As Hamilton asks early in the play, “If you stand for nothing, Burr, what’ll you fall for?”

That, in a sense, points to the arguments both for and against Donald Trump.  By standing for nothing in particular, he becomes capable of pretty much anything, good and bad.  By choreographing a willingness to hedge, hem and haw as circumstances require, he suggests a capacity to be all things to all people—or, conversely, nothing to no one.  By steadfastly refusing to box himself into a consistent and coherent set of political views, he conjures the image of a man existing outside the box entirely, for better and for worse.

Among those who are so exasperated by Washington, D.C., that they want to blow the whole damn place to smithereens, Trump’s pitch may well make perfect, if perverse, sense.  In this moment, maybe a human wrecking ball is just what the doctor ordered.

Or maybe—just maybe—Trump is exactly what he looks like:  A clueless, shameless charlatan who will be utterly in over his head in the Oval Office and will realize—belatedly—that however much he wanted the job at first, he never really had a plan for following through.

And yet—as farcical as it seems now—it’s still possible that November’s election will see the final victory of Aaron Burr.  Wait for it:  History is happening in Manhattan, and the world will never be the same.

A Land Far, Far Away

On this Labor Day—the grand finale of summer—I presume that you, the average American, have recently gone on some sort of vacation.

If so, may I ask why?

Of course, the potential answers to this are endless.

If you’re like me, perhaps you got away to explore some exotic foreign city, visiting museums, touring historic sites, taking in an unfamiliar culture to learn more about the world around you.

Or, if you’re like me at other times, you went on holiday to do nothing of the sort.  You sat on the beach, by a pool or in a hotel room and didn’t do jack squat.  You slave away and zig-zag from one activity to the next all the rest of the year, and this was your one chance to cool down and tune out.

Then there’s everything in between.  Visiting family and friends.  Catching up on House of Cards and the latest bestsellers.  Hiking the tallest peaks and diving in the deepest oceans.  Hauling off to the most remote villages or squeezing into the densest of modern metropolises.

In a way, all of these are really the “what” rather than the “why” of vacationing.  Disparate as our myriad getaway adventures are, they do, in the end, serve the same essential underlying purpose:  Allowing us to be somewhere else.

Bill Maher summed it up well enough in one of his stand-up specials:  On any TV game show, the winners are invariably ecstatic upon winning an all-expense-paid trip, no matter what the destination happens to be.  Whether it’s Paris or some third-world hellhole, the point is that they’re able to escape.  To leave their normal routines behind, if only for a few days, and experience something different and new.

Well, we all have that need, do we not?  However comfortable our day-to-day lives, however content we are with the place we call home, sooner or later it’s necessary to mix things up.  Familiarity may not always breed contempt, but it most certainly breeds idleness and boredom, not to mention a depletion of one’s creative faculties.

They say the definition of a romantic is one who always wants to be somewhere else.  What do you call a person who always wants to be in exactly the same place?  If you ever meet such a person, be sure to let me know.

And why do we feel this way?  What do we desire—what do we gain—from a periodic, temporary change of scenery?

Fresh air, for starters.  First in a literal sense, if you happen to reside in some stuffy apartment or a town that’s a wee bit overcrowded.

But also in the figurative sense of getting a fresh view of the world and many of the people in it.  As many a wise man and woman has said, the main benefit of travelling abroad is to see your home country in a whole new light.  As Plato teaches us in the “Allegory of the Cave,” to know that the world exists beyond your immediate field of vision is the first step in the pursuit of knowledge and wisdom.

Or, to bring this point more down to Earth:  Have you not noticed how your mind tends to clear and relax when you are physically removed from your natural habitat?  How certain problems that seemed insurmountable suddenly ease up and dissipate like the morning mist?

I have found that much of my cleverest and most interesting writing has emerged from some remote location, be it a cabin in Vermont or upon the seat of my bicycle.  I dare say I am not alone in this fact.

We all know that humanity’s greatest insights invariably arise in the shower—particularly at a blissfully hot temperature, first thing in the morning.  Well, in the context of an ordinary day, is there any finer example of a momentary respite from reality than that?

No, I think it is self-evident that vacations—mental and physical—are necessary and pleasurable not only for their own sake, but for their lasting benefits once our lives have returned to normal.  They allow us to broaden our horizons and improve ourselves and others in a manner that cannot come about in any other way.

All of which leads me to a very simple question:  Why do we make an exception for the president of the United States?

If the principle of relocating in order to recharge one’s mental batteries is good enough for us, why isn’t it good enough for the most important man on the face of the Earth?  Isn’t he precisely the sort of person who needs and deserves to be tanned, rested and ready for any problems that might come his (read: our) way?

I ask in light of the incessant and utterly predictable blather about the Obama family’s annual sojourn in Martha’s Vineyard for two weeks in August, and particularly about the commander-in-chief’s borderline obsessive proclivity for playing golf—a criticism most entertainingly leveled by Maureen Dowd in a priceless recent column in the New York Times.

Of course, attacking the sitting president for taking time off is a thoroughly bipartisan affair—liberals never tired of pointing out how much time George W. Bush spent at his Texas ranch while in office—and it has become such a cliché of late that many have decided to give it a rest.

However, insofar as the issue of presidential vacation days is no longer a genuine bugaboo for most folks, it is for all the wrong reasons.  And it is worth underlining why the chief executive’s free time is so valuable to the republic.

The conventional gripe is that the president’s responsibilities are too momentous to be given even an hour’s reprieve.  And the conventional retort is that the president is never truly on vacation, anyway:  However exotic his surroundings, he is always and forever in his insulated universe of Secret Service agents, security briefings and the fact that everything he does and says is under the closest possible scrutiny.

Both of those things are true, but neither accounts for the value of the change in locale and routine, as documented throughout history, from Dwight Eisenhower’s own golf outings to Franklin Roosevelt’s extended stays in Hyde Park, New York, and Warm Springs, Georgia.

But then very little imagination is required on our parts, if anything I have said here is true.  Anyone who has ever worked in an office setting knows how confining the experience can be over time, and how beneficial it is to one’s mental health to, say, eat one’s lunch on a park bench rather than at one’s desk.

Now picture what the Oval Office is like, where the stakes are incalculably high and the workday never really comes to an end.

To those who truly believe that the American president should not be hanging around a golf course so long as something terrible is happening somewhere in the world, I can only ask:  Would you prefer that he never left the White House at all?

Perhaps you would, and certainly most of us like to high-mindedly bang on about a president’s duty to be always at the ready, as if he were Batman.

But the truth is that the human need for the occasional breather does not end at the Oval Office door.  That the president be allowed to travel and (partially) relax like anyone else is in the best interest not only of him, but of the country as a whole.  It is a small paradox worth acknowledging on this of all days, when we celebrate the working man, and work itself, by taking the day off.

Popularity Fallacy

Jeez, can we knock it off about Bill Clinton’s amazing popularity, already?

You see the talk everywhere these days, including most recently in a column by Maureen Dowd in Sunday’s New York Times.

“As Hillary stumbles and President Obama slumps,” Dowd writes, “Bill Clinton keeps getting more popular.”  As evidence, Dowd cites a Wall Street Journal poll from June ranking the “most admired” presidents of the last 25 years (Clinton won by a mile); a YouGov survey measuring the perceived “intelligence” of the last eight commanders-in-chief (again, Clinton finished first); and a May Washington Post poll putting Clinton’s overall “favorable” rating at a 21-year high.

Indeed, strictly to the question, “Do most people today like Bill Clinton?” the answer is an indisputable “Yes,” and it hardly depends on the meaning of the word “like.”

However, I would argue the question itself is a silly and fairly useless one, as it is with regard to every living (or recently dead) ex-president.

Of course Bill Clinton is more popular today than he was, say, during the “Gingrich revolution” in 1994 or the Lewinsky fiasco in 1998.  Of course he enjoys more general goodwill than President Obama or possibly-future-President Hillary Clinton.

Bill Clinton left the White House on January 20, 2001.  Know what he’s been doing in the 13-and-a-half years since?

Not being president, that’s what.

George W. Bush, for his part, ended his presidency with an approval rating of 34 percent.  Today, that number is 53 percent.  What has Bush been doing these past five years to merit such a rise in stature?

Not being president and painting.

Bush’s father, George H.W. Bush, also clocked approval numbers in the mid-30s during his final months on the job.  Today, he is nearly as admired as Clinton.  What’s he been up to?

Jumping out of airplanes, fishing, and (all together now) not being president.

Of course, I am being a tad unfair and simplistic.  America’s modern-day ex-presidents have, to varying degrees, done a great deal of good work after leaving office, for which they deserve kudos and a second look.  (Jimmy Carter has probably accomplished more in “retirement” than half our presidents did while in power.)

What is more, my “not being president” theory doesn’t even begin to address the large variance in overall perception among the many former presidents under examination (e.g.  Clinton ranks considerably higher than Carter), and the myriad possible explanations for it.

But the fact remains that nearly every president in modern history has become more admired in retirement than he was for much (if not all) of his time in office.  To this extent, I think my reductionist hypothesis holds, and I’m sticking to it.

Consider:  To assume the presidency is to become the servant of each and every citizen of these United States, and to be personally responsible for their well-being (as far as they’re concerned, at least) and that of the country as a whole.  To be president is to be constantly photographed and broadcast, to be forever seen, heard and discussed, and to be drenched in a bottomless well of gripes and crises from every corner of the known universe.

However, the moment your term expires, all of that goes away.  To become an ex-president is to be freed not only from the duties and burdens of the office, but also from any expectations of leadership.  You can disappear into the woods, and no one will go looking for you.  You can play golf and eat junk food and no one will give you a second thought.  Constitutionally speaking, a former president doesn’t have to do a damn thing for the rest of his life, and many have been quite happy to do just that.

Long story short (too late?), we Americans approve of our former chief executives because we have no immediate or compelling reason not to.  Because they no longer wield supreme influence over our daily lives.  Because they are no longer on every TV screen every hour of every day.  Because they have transitioned from celebrities with power to mere celebrities.  Because their every move and every word are no longer of any relevance to our own existence, and maybe—at least in some cases—because we have forgotten the days when they did.

Today, Bill Clinton’s long-windedness and snark are adorable.  Would we feel the same way if he were employing them back in the Oval Office on the public dime?

George W. Bush has garnered near-universal praise for his marked lack of interest in the nuances of foreign policy in his time away from Washington, even though this same quality yielded a decidedly different response when he was squarely in the middle of the action.

Time may not heal all wounds, but it can certainly numb them and render them moot.  As Paul McCartney said, reflecting on his years with the Beatles, “You always forget the bad bits.”

As we now consider the supposed “inevitability” of Clinton’s leading lady in her possible campaign for president, let us bear in mind that Hillary Clinton’s own popularity—not as high as her husband’s, but certainly an improvement over President Obama’s—is largely the product of her nearly six-year absence from the rough-and-tumble world of retail politics.  Once and if she returns to the arena, are the Democratic primary voters who so loathed her in 2008 going to be able to forgive and forget this time around?  Or is the thawing of their icy hatred contingent on her present status as an above-the-fray figure?

I think it is all-too-obvious that our views of one famous person or other are shaped by that person’s role in our own lives, and that the more benign and unobtrusive such a person is, the more popular he or she tends to be.

So stop talking about Bill Clinton’s enduring popularity as if it’s some sort of anomaly or in any way newsworthy.  It’s not and it’s not.  Rather, it is exactly what you would expect, particularly for a guy who wants nothing more than to be liked and who will go to extraordinary lengths to make it so.

A world leader being relieved of his power and becoming less admired as a result?  Now that would be news.

Everybody Might Get Stoned

I must confess I’ve never ingested marijuana in chocolate form.  From what I’ve been reading lately, I’m not sure I’d ever want to.

It turns out the cannabis baked into pot brownies and other such “edibles” is far more concentrated and potent than we inexperienced noobs had previously grasped.  One need hardly take more than a small nibble to become buoyantly blazed for the better part of the evening.

With my relationship with sugar being what it is—I take a few bites, everything goes dark, and suddenly the whole box of Tagalongs is empty—I would be liable to inadvertently gorge myself into a stoned oblivion from which I might never completely return.

A sugar high is disorienting enough.  One need not pile an actual high on top of it.  (To say nothing of pot’s well-known ability to direct one’s hand deep into the cookie jar.)

But of course many people do exactly that, and with Colorado having become the first state to open retail sales of recreational marijuana, the buying and selling of cannabis-infused baked goods has entered the legal free market for the first time.  As such, the country has been compelled to think more critically and carefully than ever before about precisely how this new industry should operate.

While this question has myriad angles—many of which mirror those about the regulation of legal marijuana overall—perhaps the most essential involves the wide dissemination of basic scientific facts.  Namely, how much pot does one need to eat in order to achieve the desired effect?

It’s a rather important piece of information to possess if one has even the pretense of wanting to make intelligent consumption decisions.  And yet, an alarmingly high number of potential marijuana users are completely clueless.

They can’t be blamed too much:  When a substance is totally banned, discussions about proper dosage tend not to pop up all that frequently.  (Much like how abstinence-only sex education doesn’t bother teaching how to operate a condom.)  And so, when it then becomes legit, there is a lot of catching up to do.

(We should also not fail to note that, thanks to the Controlled Substances Act of 1970, the prohibition of a particular drug effectively prevents the scientific community from conducting all kinds of research into how the drug works.)

In any case, the pot edibles debate barged into the mainstream press in the last week after New York Times columnist Maureen Dowd recounted a recent evening in a Denver hotel room, during which she unwittingly ate several times more chocolate weed than necessary (at least for a first-timer like her) and proceeded to experience something resembling Leonardo DiCaprio’s Quaaludes adventure in The Wolf of Wall Street.  (Thankfully, she did not get behind the wheel of a car.)

Dowd’s column went viral, thereby alerting the masses to what is an entirely legitimate critique of the pro-pot push:  Widespread legal marijuana will necessarily invite its use by a sizable pool of new customers—perfectly intelligent in all other respects—who have no idea what they’re getting themselves into, and who will very predictably make highly regrettable decisions that will not be completely their fault.

That is, unless the marijuana-smoking-and-eating community makes a concerted effort to educate the public about exactly what its product does.  Do pot proponents not have an obligation—moral, if not legal—not to simply assume that everyone else is as informed about the powers of weed as they are?

This wouldn’t seem to be an especially arduous challenge.  If the wrapper of a regular candy bar is capable of quantifying a “serving size,” then why can’t a weed-laced version of the same bar?  It may be true that marijuana, like alcohol, affects everyone in a slightly different way, but surely it is possible for a label to explain, “If you eat this whole bar at once, terrible things will happen.”

I speak from relative ignorance on this subject, owing to my aforementioned lack of interest in patronizing the “edibles” industry myself.  (Honestly, can’t y’all just smoke it from a pipe like everyone else?)

But then I might change my mind one day, and I would rather the relevant dosage information be planted directly in front of my nose—not to mention the noses of my countrymen, some of whom are not nearly as cautious or clever as I.

Plus, it would clearly be in the interests of the marijuana industry overlords to see that this happens.  It would, after all, relieve them of much of the culpability when their customers ignore the warnings and eat the whole brownie anyway.

Stupid people can always be counted upon to do stupid things.  But when smart people start doing them, too—well, that’s quite a high risk to take.