Thursday, March 31, 2011

Movie Quote of the Day (“Bull Durham,” on Baseball and Walt Whitman)


Annie Savoy (played by Susan Sarandon): [narrating] “Walt Whitman once said, ‘I see great things in baseball. It's our game, the American game. It will repair our losses and be a blessing to us.’ You could look it up.”—Bull Durham (1988), written and directed by Ron Shelton

Like Annie, I’m a member of the Church of Baseball. And so, with Opening Day here, I say: Amen, and play ball!

Wednesday, March 30, 2011

TV Dialogue of the Day (“Frasier,” on Proper Pet Training)


Dr. Lilith Sternin (played by Bebe Neuwirth): (to Eddie the dog, sternly--or, perhaps better to say, Sternin-ly) “Go away!” (Eddie runs off)


Dr. Frasier Crane (played by Kelsey Grammer): “Now why does he listen to you and not to me?”


Lilith: “By my tone of voice. He knows I mean business.”


Frasier: “I see, so you're saying your voice is more commanding than mine.”


Martin (played by John Mahoney): “Hell, I took half a step before I realized she was talking to the dog!”—“The Show Where Lilith Comes Back,” Frasier, Season 1, Episode 16, written by David Isaacs and Ken Levine, directed by James Burrows, air date Feb. 3, 1994

Tuesday, March 29, 2011

Quote of the Day (Pat Moynihan, on the Rosenbergs and Other Cold War Spies)


“It had been governmental secrecy that had allowed critics of the Rosenberg and Hiss cases to construct their elaborate theories about frame-ups and cover-ups. For years the Rosenbergs' defenders demanded that the government reveal its secrets about the case. When the government gave in and released the documents, the secrets made the government's case even stronger….As the secret archives of the Cold War are released, the original case made against Soviet espionage in this country has received ever more conclusive corroboration. Secrecy raised doubts about the great internal-security cases of the Cold War; ending that secrecy has resolved them.”—Daniel Patrick Moynihan, Secrecy: The American Experience (1999)


The conviction of Julius and Ethel Rosenberg on espionage charges, 60 years ago today, came only one-third of the way through their ill-fated legal odyssey, but it marked an early milestone on their passage into American legend.


The title of Alistair Cooke’s account of the Alger Hiss case, A Generation on Trial, only hints at the volcanic emotions of the Rosenberg case. Somehow, in the minds of the couple’s supporters, this was the Dreyfus case and Sacco-Vanzetti multiplied.


The late Senator Moynihan’s discussion of the U.S. government’s long refusal to disclose a key source of its certainty of the couple’s guilt—the VENONA decryption of intercepted Soviet diplomatic communications—not only highlights an important element of the case, but also brings to the fore an issue of continuing importance today: When does the need to keep a national-security secret expire?


At the time the VENONA transcripts became available to scholars, in 1995, the Rosenbergs had been dead for more than four decades, and even the Soviet Union had ceased to exist. It’s easy to see that at that point, there was no longer any need to keep this intelligence motherlode secret.


But why couldn’t the secrets have been divulged two decades earlier? People can argue—and have—that the initial decryptions were so small and fragmentary compared with the mass of intercepted communications that it took a long time to make sense of them. There’s also the claim that, however long the decoding took, the need to ferret out at-large Soviet intelligence agents remained compelling.


The 1970s represented a critical point, when the writing and study of postwar American history—not to mention the course of American culture—could have been positively affected by disclosure. Weren’t any weapons systems depicted in the intercepts outdated by this point? Didn’t discussion of American troop movements belong to history by then? But the mania for secrecy took on its own logic.


In the meantime, a thousand conspiracy theories, born of New Left historiography, with the U.S. overwhelmingly seen as at fault, were allowed to bloom. All kinds of questions, including the origins of the Korean War, remained unanswered. Moreover, the guilt or innocence of people accused of espionage remained a live question.


It became possible for the Rosenberg children to argue in the 1970s—the decade of Richard Nixon and Watergate—that their parents were only guilty of living in a time of hysteria and government coverup. Over time, the Rosenberg case formed the backdrop of the likes of E.L. Doctorow’s fictionalized The Book of Daniel (and the Timothy Hutton film Daniel) as well as Tony Kushner’s depiction of Ethel Rosenberg as avenging spirit in Angels in America.


VENONA’s release left the Rosenberg defenders with an increasingly eroded defensive position. It takes a long time for an illusion to die, and we should not be surprised that the illusions cherished by so-called "Red Diaper Babies" proved no different. As the probability strengthened that Julius did, as charged, steal and pass to the Soviets the secrets of the A-bomb, and that Ethel, at minimum, knew it, die-hards tried one excuse after another, including that the pair did not trade anything important and that the espionage began while the U.S. and U.S.S.R. were still allies.


After all this time, it really is hard to say how much the U.S. government gained by clamping down on the VENONA secret. It is certain, however, that the New Left gained a pair of dubious martyrs. After all, how could the Rosenbergs not have suspected, by the time of their execution in 1953, that they were dying for a murderous ideology and a tyrant that in no way merited leaving their sons orphans?

Monday, March 28, 2011

Quote of the Day (Will Rogers, Anticipating Jon Stewart by Decades)


"Everything is changing. People are taking the comedians seriously and the politicians as a joke."—American humorist and man-of-all media Will Rogers (1879-1935)

Sunday, March 27, 2011

Quote of the Day (Annie Dillard, on Redemption)


“Only redemption—restoration, tikkun—can return the sparks of light to their source in the primeval soul; only redemption can restore God’s exiled presence to his being in eternity. Only redemption can reunite an exiled soul with its root. The holy person, however, can hasten redemption and help mend heaven and earth.”—Annie Dillard, For the Time Being (2000)

Saturday, March 26, 2011

This Day in Theater History (Tennessee Williams, Poet of “The Wild at Heart,” Born)

March 26, 1911—His first and middle names were “Thomas” and “Lanier,” and he was born in the 11th year of the 20th century. But playwright Tennessee Williams was, like many of his protagonists, a misfit who from the beginning “found life unsatisfactory,” and perhaps for that reason was not above playing with the facts he didn't like, starting with his own age.
That instinct has found its way into otherwise reputable reference sites. Even the estimable “American Masters” Website lists his year of birth, erroneously, as 1914. Gore Vidal, who made the acquaintance in 1948 of the Pulitzer Prize-winning dramatist of A Streetcar Named Desire, Cat on a Hot Tin Roof, and other plays of greatly varying worth, chuckled, in the essay “Some Memories of the Glorious Bird and an Earlier Self,” that Williams, instead of copping to being 37, “claimed to be thirty-three on the sensible ground that the four years he spent working for a shoe company didn’t count.”
Williams’ centennial has not gone unnoticed. Unlike Eugene O’Neill’s centennial in 1988, which did not inspire much rummaging through his artistic attic, Williams’ has sent theater troupes all over the country scouring even his lesser work, as noted in this Newsweek essay by Jeremy McCarter. (I myself saw one of these productions several weeks ago: The Milk Train Doesn’t Stop Here Anymore, mounted by New York’s Roundabout Theatre Co., with Olympia Dukakis in the lead.)
McCarter’s explanation for the Williams revival—that he’s a “poet of liberation” who pioneered frank treatment of human sexuality, especially gay themes—only goes so far to account for this rebirth of interest. After all, no such artistic resurrection has come for William Inge, another gay man who once occupied a rarefied place atop the American theater with Williams, only to be beaten down by critical arrows and his own substance abuse before dying a lonely death.
Now, Vidal’s pose as a provocateur is so reflexive that I find myself discounting, oh, at least 15% of whatever he writes, on any subject, from the get-go. But in his amusing and affectionate retrospective of starting out in the Forties with Williams, he puts his finger unerringly on why “The Glorious Bird” endures: “Most beautifully, the plays speak for themselves. Not only does Tennessee have a marvelous comedic sense but his gloriously dramatic effects can be enormously entertaining. He makes poetic (without quotes) the speech of those half-educated would-be genteel folk who still maintain their babble in his head.”
After The Night of the Iguana, Williams went without another Broadway hit for the last two decades of his life. Critics noted that, even as his later works became more lurid, they also became more heavily symbolic. Many tied the change in atmosphere to Williams’ growing pill and alcohol abuse.
But Vidal also might have been onto something in noting of Williams, “Constantly he plays and replays the same small but brilliant set of cards.” In other words, even while his dramaturgy changed, his underlying themes remained the same.
And that underlying theme remained, overwhelmingly, in the playwright’s own words, his preoccupation with offering “a prayer for the wild of heart that are kept in cages.”

Quote of the Day (Niall Ferguson, on Improving High-School History)


“Here are three positive suggestions to make high-school history more engaging and thereby more memorable. First, replace those phone-book-size tomes with Web-enabled content. Second, make the new stuff more interactive. (There's solid evidence that well-designed games and simulations hugely improve learning.) And third, ask more exciting questions.”—Niall Ferguson, “How to Get Smart Again,” Newsweek, March 20, 2011

Friday, March 25, 2011

Song Lyric of the Day (Carly Simon, on “Love’s Debris”)


“You say we can keep our love alive
Babe, all I know is what I see
The couples cling and claw
And drown in love's debris.”—“That’s the Way I’ve Always Heard It Should Be,” lyrics by Jacob Brackman, music by Carly Simon, from the Carly Simon LP (1971)

The above credits indicate that the title of this post is, in a way, misleading. Technically, it’s lyricist Jacob Brackman who came up with these searing verses. But Carly Simon composed the music and delivered the angst-ridden vocals. And, because she poured her heart out to friend Brackman, I would say that he is merely channeling her voice.

Forty years ago this month, Simon’s self-titled debut album was released by the up-and-coming label Elektra, with “That’s the Way I’ve Always Heard It Should Be” the single that launched her long career. The song was a most unlikely success, removed alike from the protest songs that had dominated the charts until a year or so earlier and from the cheerful pop (the Partridge Family!) increasingly taking over. It chronicled not a foreign war zone, but brooding domestic warfare, where the wounds weren’t so obvious.

In a prior post, I discussed Simon’s father, Richard Simon--exiled from the publishing house he founded, stricken with heart disease, too depressed to protest the affair his wife was conducting in their own home with a younger man hired as companion for her pre-adolescent son. His plight was unforgettably evoked, a decade after his death, in what I consider the most powerful song of his daughter’s career:

“My father sits at night with no lights on
His cigarette glows in the dark.
The living room is still;
I walk by, no remark.”

Despite its autobiographical roots, “That’s the Way I’ve Always Heard It Should Be” seems anomalous amid the rest of Simon’s work. The dominant themes there, remember, are desire (“Anticipation,” “Give Me All Night”) and loss (“Coming Around Again,” the entire Torch LP). But the single flew in the face of what critic Robert Christgau, in an otherwise condescending summation of her early career, termed “pop music’s wedding-bell clichés.” In its dread of intimacy’s restrictions on freedom, it sounds more like the work of Simon’s folk-rock contemporary, Joni Mitchell.

For all the commercial success that arrived after months of careful radio promotion by advocates (such as longtime family friend and then-WNEW-FM deejay Jonathan Schwartz), the song became something of a cause celebre in the feminist community. Many voiced their deep dismay over the ending, in which the narrator, after repeating her lover’s urging to “raise a family of our own, you and me,” surrenders: “We’ll marry.” Was that all women existed for, they asked?

They missed the point, a common problem with those who assume art stems directly from the author rather than from a persona that may or may not be the author‘s--or, for that matter, from reality. She might be in a different time and place, but Simon’s narrator, in her isolation and emotional brittleness, has something in common with Carol Kennicott, the would-be rebel who yields to the small-town mores she’d desperately opposed in Sinclair Lewis’s 1920 novel Main Street.

Simon’s vocal delivers the capitulation to convention in a dying fall, whose short, wan quality contrasts dramatically with the powerful images and arguments against marriage given free rein earlier. You can’t help feeling that the union of the would-be partners in the song will not end happily--either enduring, as Simon’s parents’ marriage did, beneath the thinnest of veneers, or openly fracturing, as the singer’s marriage to James Taylor finally did in the early 1980s.

Thursday, March 24, 2011

Movie Quote of the Day (“Steel Magnolias,” on Elizabeth Taylor)


Truvy (played by Dolly Parton): “When it comes to pain and suffering, she's right up there with Elizabeth Taylor.”—Steel Magnolias (1989), screenplay adapted by Robert Harling from his stage play, directed by Herbert Ross

First she was the child star, then the adult beauty, then the star onscreen and off. But, in the end, she made the greatest public impact as friend and survivor.

Steel Magnolias was made over 20 years ago, but by that time Elizabeth Taylor had already endured nearly three decades of medical conditions that could have killed her. Her first Oscar, for Butterfield 8 (1960), was widely regarded at the time (including, evidently, by the actress herself) as a sympathy vote, coming a mere few months after a near-fatal bout with pneumonia. Over the years she also contended with a bad back (courtesy of a fall from a horse, at age 12, while shooting National Velvet), respiratory problems, weight issues, alcohol and pill addiction, and, in the end, congestive heart failure.

Time, W.H. Auden wrote, “is indifferent in a week/To a beautiful physique.” As indicated by a rather nastily titled Doonesbury collection by Garry Trudeau from the early ‘80s--A Tad Overweight, But Violet Eyes to Die For--Taylor’s fame lasted longer than her looks.

But the fact is, amid the wreckage of so much of her life (eight marriages!), she did endure. If Lindsay Lohan wants a model of how to emerge on the other side of the worst that life can throw at you, she could do far worse than look to Taylor.

People should be remembered not simply for the sum of their worst moments, but for what they were at their best. In that light, I chose for the image accompanying this post a movie still that Taylor herself, I suspect, might have cherished. It comes from A Place in the Sun (1951), a film released when she was only 19, co-starring Montgomery Clift. Yes, it shows the looks that made women envious and men temporarily insane (critic James Agee, more than two decades her senior, admitted, after seeing her in National Velvet, that he was "choked with the peculiar sort of adoration I might have felt if we were both in the same grade of primary school"), but it conveys so much more.

Clift’s conflicted sexuality prevented the two from ever having a physical relationship, but Taylor’s deeply felt bond with her co-star was, in its way, as loving as any she enjoyed with her husbands. He died young, but knew that she would always be there for him. You might be down on your luck, but as long as Taylor was around, you were rich in friends.

Maybe it was that ability to be there for someone else that formed the instinct behind her advocacy for AIDS victims (most prominently another closeted gay friend, Giant co-star Rock Hudson), notably through the Elizabeth Taylor AIDS Foundation. Maybe it was through this instinct that--years after the end of a career she increasingly regarded as superficial, and years after medical problems that would have killed anyone else--Taylor endured and is recalled instantly even by those who never saw her in her celluloid heyday.

Wednesday, March 23, 2011

Quote of the Day (Anthony Trollope, Showing Why He Never Had Writer’s Block)


“There are those who...think that the man who works with his imagination should allow himself to wait till—inspiration moves him. When I have heard such doctrine preached, I have hardly been able to repress my scorn. To me it would not be more absurd if the shoemaker were to wait for inspiration, or the tallow-chandler for the divine moment of melting. If the man whose business it is to write has eaten too many good things, or has drunk too much, or smoked too many cigars—as men who write sometimes will do—then his condition may be unfavourable for work; but so will be the condition of a shoemaker who has been similarly imprudent....I was once told that the surest aid to the writing of a book was a piece of cobbler’s wax on my chair. I certainly believe in the cobbler’s wax much more than the inspiration.”--Anthony Trollope, An Autobiography (1883)

Tuesday, March 22, 2011

Song Lyric of the Day (Rodgers and Hart, Evoking the Melancholy in Spring)


“Spring is here!
Why doesn’t the breeze delight me?
Stars appear,
Why doesn’t the night invite me?”—"Spring Is Here,” music by Richard Rodgers, lyrics by Lorenz Hart, from the musical I Married an Angel (1938)

At least temporarily, I understand a bit what Rodgers and Hart were expressing. While standing by the side of a duck pond in Demarest, N.J., over the weekend, taking this and other photos, I felt that the season was not quite there yet in my heart.

Maybe in a few more weeks—after that snow we‘re supposed to get again tomorrow…

Monday, March 21, 2011

Movie Quote of the Day (Barbara Stanwyck, Putting the Make on Poor Henry Fonda)


Charles Pike (played by Henry Fonda): “Now you, on the other hand, with a little coaching you could be terrific [at playing cards].”

Jean Harrington (played by Barbara Stanwyck): “Do you really think so?”

Charles: “Yes, you have a definite nose.”

Jean: “I'm glad you like it. Do you like any of the rest of me?”—The Lady Eve (1941), written and directed by Preston Sturges

The Lady Eve, released 70 years ago today, furnished Barbara Stanwyck with one of her golden opportunities to do what she did best onscreen: demonstrate that she was forever deadlier than the male. Even the sharpest of studs become inadvertent prey for "Stanny," as in the unforgettable moment when she sidles down the stairs in Double Indemnity, making you sense immediately that Fred MacMurray’s seemingly wised-up insurance salesman is already toast.

Henry Fonda’s Charles Pike is such a poor dumb sap that, had his character been transposed to film noir, it would have been a case of cruel and unusual punishment. Instead, this shy, none-too-bright heir to an ale fortune provides Stanwyck with some of the great moments in the screwball comedy genre that Hollywood buffed to a high sheen in the Forties and Fifties.

Even before Jean’s nose becomes the topic of conversation, it’s Jean’s legs that get Charles’ attention: this card sharp trips the ale heir when they’re on an ocean liner. Later, annoyed by his rejection, she puts him through another scam: impersonating a grande dame.

Early on, she evokes chuckles when, sizing Charles up, she announces, “I need him like the axe needs the turkey.” By the film’s denouement, she’ll discover, to her surprise, that she needs him, period--the most unlikely and delightful of love stories.

One of the great crimes of Hollywood is that it never saw fit to present Stanwyck an Academy Award while she was in competition, waiting until she was in her mid-70s before giving her one of those honorary Oscars that attempt to redress wrongs to aging former box-office idols while there’s still time. (The presenter that night was John Travolta, who admitted later to being stunned to find himself standing next to a woman who had long been an idol of his family while he was growing up in my hometown, Englewood, N.J.)

The year that The Lady Eve came out was one of the years she could have won. (Instead, the Academy awarded the Best Actress trophy to Joan Fontaine, for Suspicion.) She could easily have been nominated that year for The Lady Eve, or even for her tough-as-nails reporter in Frank Capra’s Meet John Doe. As it happened, her nomination was for a turn maybe even funnier than the one she had in The Lady Eve: the deliciously named nightclub singer Sugarpuss O’Shea in Ball of Fire.


In a wonderful centenary tribute published three years ago in The New Yorker, critic Anthony Lane wrote of the woman in the accompanying photo: “It was a face that launched a thousand inquisitions: the mouth too tight to be rosy, and a voice pitched for slang, all bite and huskiness. When I think of the glory days of American film, at its speediest and most velvety, I think of Barbara Stanwyck.”

Sunday, March 20, 2011

Quote of the Day (John Bunyan, on “Want of Repentance”)


“If you have sinned, do not lie down without repentance; for the want of repentance after one has sinned makes the heart yet harder and harder.”—John Bunyan, The Pilgrim’s Progress (1678)

Saturday, March 19, 2011

Flashback, March 1861: Confederate Congress Adopts Pro-Slavery Constitution

Declaring its principles immediately, the Provisional Congress of the Confederate States of America adopted a document that, in nearly all respects, was identical to the U.S. Constitution. The most significant differences, however, involved what the seven seceding states had in common: ownership of slaves, which received special protection.

The last two sentences are so unexceptional that, not too long ago, there would have been a “dog-bites-man” quality to them. But some events and commentary over the last year or so suggest, amazingly enough, that the place of slavery as the primary cause of the Civil War is still very much a live question.

(See, for instance, Virginia Governor Bob McDonnell’s statement that he didn’t include slavery in a proclamation of Confederate History Month because he didn’t feel that it was a “significant” factor for his state, or claims by the Sons of Confederate Veterans that slavery wasn’t the “sole” cause of the conflict.)

How, these people ask, could slavery cause the conflict when a majority of Southern whites did not own slaves, when several border states that (barely) stayed in the Union permitted slaveholding, and when so much of the North was every bit a match for the South in virulent racism?

All granted—and all beside the point.

In a prior post, I discussed how Abraham Lincoln’s Second Inaugural Address—most famous for its eloquent phrase, “with malice toward none, with charity for all”—logically identified the exact way in which slavery lay behind the origins of the war. It involved Southerners’ insistence that the institution be extended into new territories, and the resistance by Northerners to this idea. “All knew,” he said, that “somehow” slavery was the war’s cause.

Some Southern naysayers (the temptation is overwhelming to call them “slavery deniers”) would undoubtedly scoff at citing Lincoln, the Confederacy’s greatest rhetorical foe, on this point. He would say such a thing, wouldn’t he? they might ask.

Contemporary Southern Voices on Slavery

But you don’t have to turn to a Northerner to see how the origins of the war were perceived in its own era. You can turn to sources impossible to turn away from—Southern ones.

I’m not even going to discuss here all the major events of the decade before Lincoln’s election that aggravated North-South relations: the admission of slave vs. free states, the Fugitive Slave Law, Harper’s Ferry, the rise of a Republican Party that opposed slavery’s expansion to the territories, a Democratic Party torn asunder in the election of 1860 by Southern belief that frontrunner Stephen Douglas would not guarantee the introduction of slavery into lands west of the Mississippi.

No, all we have to do is look at what those who mattered in the South—the men who made the laws, who governed, who ruled with the consent of the mass of whites—said or wrote about the place of slavery in their would-be new nation.

Nearly three decades ago, one of my college professors told me he felt textbooks were “a cultural menace to our society.” I think I know what he meant. Already by that point, they were being “dumbed down” so much—so stripped of any unusual language or allusion that a reader might have to grasp from context—that, over time, they had become joyless to read and dead on arrival in one’s hands.

How much do high-school history courses encompass primary-document reading outside of texts? And are college history courses that much better in this regard?

I ask these questions because primary documents disclose so much. In the case of slavery as primary cause of the Civil War, they settle the issue resoundingly.

Start with South Carolina, the hotbed of antebellum secessionism. Its secession declaration, released only a few weeks after Lincoln’s election, mentioned slavery no fewer than 18 times! (If the subject didn’t matter, why did South Carolina keep dwelling on it?)

Slavery Without the Euphemisms

Let’s turn now to the Confederate Constitution. True, some sections, dealing with the structure of the government, departed somewhat from the document molded in Philadelphia by the Founding Fathers. (For instance, the President was limited to a single six-year term, appropriation bills had to clear higher voting hurdles, and Cabinet members were given the right to speak in Congressional debates.)

But you couldn’t tell many sections of the two documents apart if you placed them side by side.

Except, as I’ve written earlier, those concerning slavery.

Begin with the word itself. The creators of the founding document of the Confederacy had no qualms about using it—unlike many of their descendants in the 20th and 21st centuries, who have preferred to use the far more genteel “servants,” or the Founding Fathers, who couldn’t bring themselves to name at all the African-Americans laboring against their will.

Northern delegates to the Philadelphia convention in 1787, knowing that anti-slavery sentiment was percolating in their states after a momentous war fought in the name of liberty, feared that using the term would enrage many constituents. 

At the same time, they worried that the young nation, hemmed in by foreign powers, would not survive at all without the Southern states.

In contrast, Southern delegates wanted the power and influence of their states recognized as far as possible by counting everyone within their borders as part of the census. 

Yet there were limits to their unanimity. Many Southerners were already either hoping slavery would disappear (James Madison) or moving in that direction (George Washington).

So, in the end, the South yielded to the sensitivities of the North. Only three-fifths of the slave population would count in censuses; moreover, Southern delegates even went along with the North’s euphemisms for slaves, i.e., “other persons,” “such persons” and “persons owing service.”

Move forward more than 70 years, to Montgomery. Now representatives of the seceding states had no compunction about naming this group: “slaves.” In fact, “slave” or “slavery” appears 10 times in seven different clauses.

These clauses were rewritten to build firewalls around the rights of slaveowners, who in this document were guaranteed the right to take their property wherever they wanted in the states or territories. 

They even adopted the hard line of Chief Justice Roger Taney, who ruled in his notorious Dred Scott decision that the peculiar institution could never be prohibited in any territory.

The "James Madison" of the Confederate Constitution

Again, there should be no surprise in any of this. One of the Montgomery delegates, Thomas R. R. Cobb of Georgia, regarded as the "James Madison" of the Confederate Constitution (the original manuscript is believed to be in his handwriting), had written the influential Inquiry into the Law of Negro Slavery in the United States of America (1858), which not only argued that slaves, dating back to Roman times, lacked any recognition as persons but also gave short shrift to the extensive manumission that occurred in ancient times.

The other Confederate delegates were fully prepared to follow Cobb’s lead. As William C. Davis noted in Look Away! A History of the Confederate States of America, elected officials in the South formed an oligarchy, largely based on requirements that officeholders own considerable property. 

The key aspect of this wasn't merely land but value. In the mountain areas of states such as Virginia, for instance, the land itself had little value for farming. Possession of slaves, then, constituted value. These officeholders were used to deference from not only slaves but even poor whites.

The most oligarchic Southern state was South Carolina, where not only did property-value qualifications rise with the office, but even the state’s presidential electors were chosen by the legislature rather than by popular vote. It was also the state most gripped by secessionist fever.

But elites were hardly confined to that state: 49 of the 50 Montgomery delegates were slaveowners. Many of them regarded fellow officials as, in effect, members of a club. If ever there was a self-interested founding national document, it was the Confederate Constitution.

Slavery the "Cornerstone" of the Confederacy: Alexander Stephens

Finally, we have the word of Cobb’s fellow Georgian, Alexander H. Stephens.

Bear in mind that Stephens was as far from a fire-eater as the delegates could get. A Whig Unionist who had become friendly with Abraham Lincoln while the two served in the House of Representatives in the late 1840s, he had only yielded after his state had voted in favor of secession in the winter of 1861. 

His election as Vice-President of the Confederacy was meant to signal to the wider world that the South had seceded with only the greatest reluctance.

Yet even Stephens—what passed for a “moderate” in Montgomery—expressed his admiration for the Confederate Constitution in the most radical terms. 

In a speech in Savannah, Ga., on March 18, only a week after the Provisional Congress had adopted the document, he observed that it corrected one of the major errors of Thomas Jefferson and other statesmen of his time: the belief that “the enslavement of the African was in violation of the laws of nature,” a belief resting on “the assumption of the equality of races.”

In contrast, the Confederacy, he noted, “is founded upon exactly the opposite idea; its corner stone rests upon the great truth that the negro is not equal to the white man, that slavery--subordination to the superior race--is his natural and normal condition. This, our new government, is the first, in the history of the world, based upon this great physical, philosophical, and moral truth.”

The "Cornerstone" speech--which goes on to liken this "discovery" to the ideas of Galileo and Adam Smith--deserves recognition, along with James Henry Hammond's 1858 "Cotton is King" speech, as the summa of Southern self-delusion, an intellectual justification of pseudo-science by a self-interested elite that would bring untold carnage and grief in its wake to hundreds of thousands across the nation.

One Founding Document That Endured--And Another That Didn't

It took only 10 days for the Confederate Congress to debate its constitution, and only another two weeks before the required five states put the document into force through votes in favor of ratification.

It was in marked contrast to the U.S. Constitution, which required four months of deliberation and nearly another year after that before it went into effect.

But then again, the document forged in bitter behind-the-scenes disputes on slavery, reflected publicly in clauses reeking of ambiguity and embarrassment, endured far longer than the South’s version--a document passed with overwhelming consensus by an elite anxious to preserve its ancient privileges, and without any ambivalence over its founding principle: not just racism but the all-encompassing moral and legal subjugation based on that belief system.

Quote of the Day (Tom Stoppard, on Murder and Criticism)


"Nobody would kill a man and then pan his book. I mean, not in that order.”—Tom Stoppard, Arcadia: A Play (1993)

Friday, March 18, 2011

Quote of the Day (Bernard Malamud, on Injustice vs. Optimism)


“I respect man for what he has to go through in life, and sometimes for how he does it, but he has changed little since he began to pretend he was civilized, and the same thing may be said about our society. That is how I feel, but having made that confession let me say, as you may have guessed, that I am somewhat of a meliorist. That is to say, I act as an optimist because I find I cannot act at all, as a pessimist. One often feels helpless in the face of the confusion of these times, such a mass of apparently uncontrollable events and experiences to live through, attempt to understand, and if at all possible, give order to; but one must not withdraw from the task if he has some small thing to offer.”—Investigative Magistrate B.I. Bibikov to prisoner Yakov Bok, in Bernard Malamud, The Fixer (1966)

Bernard Malamud’s The Fixer influenced me as much as any novel I read in high school—both for its unobtrusive but effective style and its message about the indomitable spirit of the individual, even in the face of crushing injustice and circumstance. Many times, when I feel things are hopeless, I especially recall the above quote. It has, in a way, become a kind of personal credo, like the W.E. Henley poem "Invictus."

Malamud passed away writing at his desk 25 years ago on this date, only a day after telling editor and friend Robert Giroux that he expected to finish the last four chapters of his latest novel, The People, by the fall. It was an unjust twist of fate that he wasn’t able to complete it, but nothing like what befell his hero Yakov Bok in his Pulitzer Prize-winning novel.

The most unlikely of heroes, Bok leaves his village for Kiev in 1911 to make a better life for himself as a handyman. Instead, he finds that, even though he is a nonobservant Jew, he is accused of the ritual murder of a Christian child, in a Czarist Russia rocked by periodic pogroms over the age-old "blood libel" canard.

(Several weeks ago, Sarah Palin plunged into controversy by using the term “blood libel” to describe how liberals had tarred her with the extremism associated with the shooting of Rep. Gabrielle Giffords. Before she appears again on Fox News, she might want to read The Fixer and ask herself whether her case even remotely compares to the one in this book—a situation based on an all-too-common reality of early 20th-century Europe, such as the real-life case of Menahem Mendel Beilis, a Ukrainian Jew whose sensational trial inspired Malamud's treatment here.)

In the scene from which the above quote is taken, a window of hope is opened for Bok, only to be cruelly closed again. He’s finally found, in Bibikov, the closest thing to decency among those examining his case—true, a man who’ll urge Bok’s prosecution for the “crime” of living in an area forbidden to Jews, but at least determined to drop the far more serious murder charge. "If the law does not protect you, it will not, in the end, protect me," Bibikov notes.

Almost immediately, that remark becomes unexpectedly true, as Bibikov is murdered. His death only adds to a string of troubles that make Bok a modern-day Job.

But, against all odds, the “fixer”—a proverbial “little man” standing against a giant, monstrous legal system—endures. By at least surviving until his trial, he stands a chance of disproving the murderous falsehood that would doom not only himself but all Jews.

What helps him go on? Who is to say that it isn’t the words of the proto-existentialist Bibikov, urging, decades before Camus, the necessity of action even when all seems hopeless?

For all their differences in station and outlook, Bibikov and Bok end up sharing something: an ability to transcend former beliefs and circumstances by committing, come what may, to doing the right thing. Bibikov, a bureaucrat in the service of an absolutist ruler, realizes that twisting the law will destroy his country as surely as it destroys any prisoner. And Bok, early on a self-described nonreligious, nonpolitical Jew, is fortified on the last page by this hard-earned understanding: “One thing I've learned, ...there's no such thing as an unpolitical man, especially a Jew.”

How did I, a parochial school student, come to empathize so powerfully with a victim of anti-Semitism? It might derive from a statement Malamud made late in life: “All men are Jews, except that they don’t know it.” At one and the same time, Malamud depicts both the particular details of the lives that made Jews the scapegoats of the 20th century and the universal instincts that made them irrevocably a part of humanity.

In his first published novel, The Natural (1952), Malamud invoked the mythic overtones associated with baseball, only, in the end, to debunk them. But in The Fixer, he endowed a common man with an almost mythic heroism.

At the height of his career, Malamud ranked with his contemporary Saul Bellow and younger colleague Philip Roth in a kind of triumvirate of great Jewish-American writers. In the quarter-century since his death, while their reputations stand high or have even risen, his stock has mysteriously declined. The Fixer demonstrates why a reassessment restoring him to his rightful honored place in American letters is long overdue.

Song Lyric of the Day (“No Nukes,” Urging an End to “Atomic Poison Power”)


“Just give me the restless power of the wind
Give me the comforting glow of a wood fire
But take all your atomic poison power away.”—“Power,” by John and Johanna Hall, from the No Nukes CD (1980)

For a long time, I felt that “Power,” the anthem of the 1979 all-star "No Nukes" concert at Madison Square Garden, was a song of the moment, with no lasting relevance. Events of the last week have proven how wrong I was.

If a country such as Japan—which has taken extraordinary measures to prepare for an earthquake—can suffer the kinds of horrors we are now seeing associated with nuclear power, how much hope do we have of evading a similar fate in the United States, where industries have become accustomed to chipping away at regulations meant to ensure the public good?

The next time politicians—Republican or, amazingly enough, Democratic—ask us to consider expanding atomic energy in this country, just remember this: They’ll do to the environment what they did to our financial system a few years ago.

Thursday, March 17, 2011

Quote of the Day (Bob Dylan, on the Clancy Bros. and Irish Rebel Songs)


“I got to be friends with Liam [Clancy] and began going after-hours to the White Horse Tavern on Hudson Street, which was mainly an Irish bar frequented mostly by guys from the old country. All through the night they would sing drinking songs, country ballads and rousing rebel songs that would lift the roof. The rebellion songs were a really serious thing. The language was flashy and provocative – a lot of action in the words, all sung with great gusto. The singer always had a merry glint in his eye, had to have it. I loved these songs and could still hear them in my head long after and into the next day. They weren’t protest songs, though, they were rebel ballads … even in a simple, melodic wooing ballad there’d be rebellion waiting around the corner. You couldn’t escape it.”—Bob Dylan, Chronicles: Volume One (2004)

Wednesday, March 16, 2011

This Day in Texas History (Sam Houston Defies Secessionists)


March 16, 1861—Twenty-five years to the day that the republic he helped establish approved a constitution, Sam Houston, now governor of the state of Texas, made the last great stand of his long, stormy public career, defying secessionist sentiment by refusing to join the Confederate States of America.

Like another Andrew Jackson protégé, Missouri Senator Thomas Hart Benton, Houston was a slaveholder who, for the last decade, had found himself on increasingly thin political ice for not backing unlimited expansion of slavery into Western territories. The victorious commander of the Texas War of Independence was now watching a large segment of the state turn its back on him.

Perhaps because he had seen so much bloodshed in his life—first as an Indian fighter under Jackson, then in his win over Mexican dictator Santa Anna at the Battle of San Jacinto—Houston was prescient about the cost of secession. “I see only gloom before me,” he observed in rejecting calls to link up with the other states breaking away from the Union.

In a prior post, I touched on Houston’s improbably victorious campaign for governor in 1859, only three months after he had been unceremoniously turned out of the U.S. Senate. Why talk about him again?

Because he was a colossus—a man whose virtues and failings (including alcoholism for much of his life) were as large as his frame. Because his life and career were filled with one surprising and dramatic turn after another (e.g., adoption by Cherokees, then becoming an Indian fighter several years later).

And because the current occupant of the governor’s mansion in Austin, Rick Perry, has made his outsized predecessor all too relevant. By paying heed to the loudest voices in his state—i.e., those who insist there can be no compromise with anyone who sees any form of constructive role for the federal government—Perry is deviating from the wise example of Houston, who sought to conciliate factions.

Last year, the Dallas Morning News speculated that Perry’s invocation of “states’ rights” posed problems for voters who associated the term with segregation. But the term had an even longer, and equally problematic, association: with the agitation that led to the Civil War.

Houston correctly foresaw that passage of Stephen Douglas’ Kansas-Nebraska Act--in particular, its advocacy of “popular sovereignty,” enabling residents of a territory to choose whether they wanted slavery or not--would open up unparalleled agitation concerning “the peculiar institution.” His delicate balancing act in the 1850s--denouncing abolitionists and “fire-eaters” alike--failed to placate the latter, who increasingly--and correctly--saw him as their foe.

The election of Abraham Lincoln in November 1860 put Houston on the spot as no other event did. He stalled on holding a special legislative session just before the state’s secession convention, then yielded when he thought he could influence the vote. (Even then, he didn’t make any secret where his sympathies lay. Working on the first floor of the Capitol building, he referred to the delegates convening on the floor above as “the mob upstairs.”)

He was mistaken about his ability to sway events. On February 1, 1861, only four days after delegates began to convene in Austin, they voted in favor of the Texas Ordinance of Secession, 166-8. A downcast Houston, seeing another opening, said he could abide by the vote if the people endorsed it. Though this vote was less lopsided than the first, it was just as decisive, 44,317 to 13,020.

Now Houston tried to argue that, though the voters wanted secession for the state, it wasn’t as part of the Confederacy. Rather, it was as an independent republic--the kind he’d brought into being a quarter century before.

The last maneuver Houston could summon against the secessionists--securing the federal arsenal at San Antonio by calling on Texas Rangers supporting him--likewise failed. With the Confederacy calling on all state officeholders to swear allegiance to the new provisional government forming in Montgomery, Ala., all escape routes out of his dilemma were closed off.

No matter how much he might have talked about supporting states’ rights over the years, people sensed where Houston’s heart really was. It undoubtedly related to sentiments like this, voiced at the time of the Kansas-Nebraska Act: "Mark me, the day that produces a dissolution of this [Union] will be written in the blood of humanity."

It was undoubtedly because he feared that “blood” that Houston refused to take the one active step that would have enabled him to stay in office while keeping Texas in the Union. Abraham Lincoln offered him 50,000 troops and the rank of major general if he would put down the rebellion in the state. But Houston couldn’t fire on his own people--and, after five decades of military conflict, had grown too tired of the tumult.

Instead, Houston opted for a defiant act of resignation. On March 16, the bluff-speaking 68-year-old wrote a letter to the people of Texas that repeatedly cried out his opposition:

“Fellow-citizens, in the name of our rights and liberties, which I believe have been trampled upon, I refuse to take this oath. In the name of the nationality of Texas, which has been betrayed by this convention, I refuse to take this oath. In the name of the Constitution of Texas, which has been trampled upon, I refuse to take this oath.”

The post of governor being declared vacant, Houston retired to his farm in Huntsville. His last two years alive were spent in fear of the calamity he was certain would strike the state--and that did, in fact, hit his family. A year after his warning of bloodshed, his firstborn son with wife Margaret, 18-year-old Sam Houston Jr., was so badly wounded at the Battle of Shiloh fighting for the Confederacy that he was left for dead. It took him months under the care of his mother before he recovered.

In July 1863, Houston himself died. Not long before the end, he confided his torment about and love for his state to friend Ashbel Smith:

“It is my misfortune to be a prophet like Cassandra, for my warnings are disbelieved. This war will be disastrous to the South and to Texas. The Northern armies will cut the Confederacy asunder. But while forecasting the perils and woes of Texas, I love her. Texas may spurn my counsels; Texas may cast me off, but in my abiding love for Texas, her fortunes are my fortunes; I shall lay my bones to repose in her bosom, I shall leave my blessings on her.”

Quote of the Day (Christopher Caldwell, on the Decline of Borders)


“Borders became one of the hottest corporate chains of the 1990s because it didn’t do things by halves. Its bookstores were of an unheard-of size and sophistication. They stretched not just from coast to coast but around the world. Since it filed for U.S. bankruptcy protection in mid-February, it has shown the same thoroughness in dismantling its empire that it did in building it. The Borders bookstore down the block from my Washington office, where I have browsed almost weekly for the past decade and a half, looks gutted, sacked. At least 200 of Borders’ 642 stores are to close. Those that remain will continue the chain’s drift away from books, and towards cat calendars and stationery.”—Christopher Caldwell, “A Fate Written in the Stores,” Financial Times, March 5, 2011

I took the accompanying photo of the Borders Books and Music store in Fort Lee, N.J., only a few miles from where I live. Once, this Borders possessed every bit—and more—of the vitality that Caldwell mentioned in his piece—not just brimming with all sorts of books, CDs and DVDs, but also hosting author appearances and all kinds of groups (including a writers’ group to which I belonged).

But over the last few years, as digitization cut into the market for CDs and DVDs, the shelves devoted to them emptied out, and the store found itself with far more space than it could afford. The coup de grace came with the rise of the e-book, for which the chain was not prepared.

In the weeks since the closing announcement, the Fort Lee location—at least on the weekends—has been filled with people hoping for great bargains, in lines I hadn’t seen even the past Christmas or two.

Undoubtedly, many former independent bookstore owners are shedding no tears over the fate of Borders. That megastore, along with archrival Barnes & Noble, knocked off numerous smaller stores that couldn’t compete, in much the same way that former behemoth Blockbuster (until last year, in the same shopping center as the Fort Lee Borders) had crushed mom-and-pop video outlets.

But I felt a sense of sadness at the closure. Borders provided a place to come to--not just a vast bookselling emporium, but also a spot to meet others--authors, fellow bibliophiles, friends. (Even up to the last weeks before the closing announcement, you couldn’t get a seat in the Borders coffeeshop. Too bad more of those kids weren’t using the space to read books and magazines instead of making it a de facto homework hangout.) I’m not sure the greater convenience afforded by digitization compensates for the loss of such a space.

In this case, digitization might only lead to the further atomization of a society that doesn’t need to be broken down any more than it has already into small, solitary units.

Tuesday, March 15, 2011

Quote of the Day (Peggy Noonan, Flaying Rumsfeld Alive)


“I like Donald Rumsfeld. I've always thought he was a hard-working, intelligent man. I respected his life in public service at the highest and most demanding levels. So it was with some surprise that I found myself flinging his book against a wall in hopes I would break its stupid little spine.”—Peggy Noonan, “Bin Laden Got Away,” The Wall Street Journal, March 12, 2011

I have a funny feeling these two won’t be sending each other Christmas cards again anytime soon, don’t you?

Monday, March 14, 2011

Quote of the Day (Newt Gingrich, From Patriotism to Hanky-Panky)


“There’s no question at times of my life, partially driven by how passionately I felt about this country, that I worked far too hard and things happened in my life that were not appropriate."—Presidential hopeful Newt Gingrich, explaining to David Brody of the Christian Broadcasting Network how two prior marriages collapsed because of his extramarital affairs, quoted in Maggie Haberman, “Newt Gingrich: ‘I Was Doing Things That Were Wrong,’” Politico.com, March 8, 2011

Faithful Readers, you might have noticed that on Mondays, my “Quote of the Day” tends to be humorous. The liberal contingent among you might have assumed from the scary image accompanying today’s post that this time my circuits got crossed and I was running a Halloween-oriented quote. You’re undoubtedly angry with me, then, for not giving you sufficient warning to steer the kiddies away from this Internet horror show.

But once you get past that alarming image of Newt Gingrich, I’m certain you’ll agree that I am, in fact, adhering to my Funny Monday routine. If the above quote isn’t the funniest thing I’ve ever put out there for you—well, it’s got to be among the top five, anyway.

In her New York Times column on Saturday, Gail Collins put me onto the scent of this preposterous speculation by what should have been, by all rights, this year’s most unlikely Presidential candidate. It turned out that she could only hint at the half of it.

Throughout the 1990s, I groaned about the Clinton-Gingrich Era, an age of polarization led by two baby-boomer politicos who, in temperament if not party, had more than a little in common:

* Both hailed from the South;
* Both sought to weasel their way out of military commitments during the Vietnam War;
* Both were, at one time or another, college instructors who were policy wonks at heart;
* Both reshaped their party in their own image, leading the faithful back from the political wilderness;
* Both possessed volcanic tempers;
* Both possessed mighty high opinions of their potential (Clinton famously marketed himself in the 1992 Presidential election as an “agent of change,” while Gingrich, according to Bush I budget director Richard Darman, trashed negotiations with House Democrats in 1990 to further his own ambitions).

But I find this especially fascinating: Both men thought they could surmount any trouble over serial infidelities by holding to the novel notion that oral sex really wasn’t sex. (At the height of the Lewinsky imbroglio, I quizzed a couple of married friends about how their wives would react if they made a similar claim. Both agreed that they wouldn’t live to tell the tale afterward.)

A couple of years ago, I rejoiced. Slick Willie had been effectively neutered--blamed for Hillary’s loss of a Democratic nomination that was hers to lose, then sidelined while she ran the State Department. Meanwhile, I vividly recalled a friend telling me how, after his resignation as Speaker of the House following his own missteps, Gingrich had been spotted in New York, visiting a publisher--and nobody seemed to pay him any mind on the street.

The Clinton-Gingrich Era belonged to the ages, I thought.

Boy, was I wrong.

I attribute Gingrich’s desire to achieve the Oval Office in the face of embarrassing disclosures about his personal life--not to mention his most unusual justification for said walk on the wild side--to Clinton. The Comeback Kid’s relationships were so multiple, so out there, that they provided other politicos across the country with practically step-by-step instructions on how to make centers of government action also centers of personal action. The outcome of the Lewinsky scandal led more than one commentator to conclude that perhaps America was finally shedding its Puritanism and acting more like Europe in its attitude toward sins of the flesh.

In an interview with Esquire, Gingrich’s second wife, Marianne, discussed how her (now ex) husband had been called to a 1998 Oval Office meeting by Clinton, who told him, “You’re a lot like me.” What that meant exactly would soon become apparent with revelations of his own tomcatting, but even before then, Gingrich became unusually hesitant about resorting to his usual rhetorical flame-throwing approach. Clinton survived the 1998 midterm election very well indeed, but Gingrich didn’t. A dozen years later, it still must eat at him.

Now watch the wheels of Newt’s mind spin. Last year, he surely took note of Clinton’s justification to friend and historian Taylor Branch about how the Lewinsky affair transpired. (It occurred, according to the ex-President, after the death of Clinton’s mother, the Democrats’ loss of Congress in the ‘94 elections, and the widening Whitewater scandal. In this telling, the leader of the free world felt unexpectedly vulnerable when the intern came by during the Gingrich-engineered government shutdown of November 1995.)

That revelation--if that’s the right word for it--didn’t cause much of a stir, let alone merriment, when it was trotted out a year ago, Gingrich must have reasoned. “Why can’t I try it out with evangelical voters?” he surely thought. “And I can subliminally support it by constantly repeating the name of another divorced politician who went on to win the Presidency: Ronald Reagan.”

Gingrich--and Clinton--might have chosen another route in speaking about their errant ways. Hugh Grant pioneered this approach, and his account to Larry King (not to mention Jay Leno and other talk-show hosts) about his mad encounter with Divine Brown got him off the hook with the public: “I could accept some of the things that people have explained, 'stress,' 'pressure,' 'loneliness' -- that that was the reason. But that would be false. In the end you have to come clean and say 'I did something dishonorable, shabby and goatish.'"

Several months ago, describing the President’s 2008 campaign, Gingrich called him “authentically dishonest.” He has also speculated that the President might be subject to impeachment for giving up the “don’t ask, don’t tell” stance concerning gays in the military.

The ex-Speaker does all of this at enormous peril to his own thin hopes of winning the high office he has craved for so long. Americans still want to feel comfortable with the character of the man they put in the nation’s highest office. If they must choose between a still-married father of two and a man who requested divorces from two wives as they faced health crises--a man, moreover, with a hilarious explanation for past misconduct--then Gingrich, and a party that might just be silly enough to nominate someone who has repeatedly done something, in Grant’s memorable phrase, “dishonorable, shabby and goatish,” shouldn’t be surprised at the outcome.

Sunday, March 13, 2011

Bonus Quote of the Day (Horace, Anticipating Charlie Sheen by Two Millennia)


Ne, quicunque Deus, quicunque adhibebitur heros,
Regali conspectus in auro nuper et ostro,
Migret in Obscuras humili sermone tabernas:
Aut, dum vitat humum, nubes et inania captet.


(Translation:

“But then they did not wrong themselves so much,
To make a god, a hero, or a king,
(Stript of his golden crown, and purple robe)
Descend to a mechanic dialect;
Nor (to avoid such meanness) soaring high,
With empty sound, and airy notions fly.”—Horace, Ars Poetica, translated by Wentworth Dillon, Earl of Roscommon)

I don’t know about you, Faithful Reader, but I’ve given up tracking the daily pronouncements and news surrounding Charlie Sheen. He’s put out of business not only the crew of his own show, but also late-night comics, prime-time entertainment journalists and bloggers such as myself who hoped to say something definitive that would not be superseded by each successive news cycle involving the (now former) star of Two and a Half Men.

Heck, he’s even trying to sideline the editors of Bartlett’s Familiar Quotations: the number of catchphrases he’s minting with each Tweet and TV appearance (“tiger blood,” “bi-winning,” etc.) has grown so ridiculously immense that he now requires not just a few pages, but an entire CD unto himself.

When did Sheen transform from a bratty, limited-talent son of a Hollywood star into the highest-paid actor on TV, hellbent on taking down his long-running show? In other words, when did this example of garden-variety Tinseltown megalomania become a tale of not-so-ordinary madness?

Sheen likens himself to “a total freakin’ rock star from Mars.” Indeed, in our current culture, it takes only a nanosecond to morph from a rock star to a rock god. (And doesn’t a god deserve “goddesses,” like the twentysomething women from the adult-entertainment industry in his pad?)

You have to go back a long way to find people who thought they had so many divine powers—the Roman emperors, to be exact. My guess is that Sheen knows only one phrase from the centuries of Roman domination of the world: Horace’s Nunc est bibendum (“Now we must drink”).

But the great poet of the Augustan Age, in Ars Poetica, has some things to say apropos of the descent of gods into the common muck.


Fundamentally, Sheen has to watch out. It’s not just because the world outside his hermetically sealed, “bi-winning” environment shows signs of tuning him out (even the witches of Salem became so offended by his use of “Vatican assassin warlock” that they performed a “magical intervention”).

It’s also because that same public is emitting increasing signs that, though it is willing to forgive the worst—and repeated—excesses of stars, it expects repentance. Mental illness only goes so far to excuse someone who endangers his own life and his family members’, then goes on a round-the-clock, ad hoc reality show, then sues the creative powers that tried to use tough love to save his life. The crowd, as Horace shrewdly observed, scorns performers who, “With empty sound, and airy notions fly.”

Far more talented actors than Sheen have come a cropper, especially for excesses eerily reminiscent of his. Had he opened his paper or turned on his TV this week, he might have seen someone in his corner, Oscar winner and past box-office star Mel Gibson, pleading guilty to a misdemeanor assault charge for battering the mother of his child, his career grounded for the last five years after a drunk-driving incident that included an anti-Semitic rant as out-of-left-field as Sheen's own. (See last month’s Vanity Fair article on the roots of Gibson’s decline.)

But, if Sheen really wants a glimpse of his frightening future, he would do well to rent or catch on TCM the 1933 golden oldie, Dinner at Eight—and, in particular, concentrate on John Barrymore, in a role based on his persona and in a performance as emotionally naked and terrifying as any Sheen can ever hope to see.

“The Great Profile” was just a little more than a decade removed from his electrifying Broadway turn as Hamlet, but he was already headed straight for his sorry career finale—an inebriated has-been whose failing memory--and consequent need to improvise anything on the spot--led to the pathetic spectacle of audiences laughing at his expense.

In Dinner at Eight, the situation faced by Barrymore’s character Larry Renault should strike a chord of recognition in Sheen: a star in the grip of substance abuse, abandoned at last by a press agent exhausted from covering for his endless excesses. Before his lonely end, Renault/Barrymore looks in the mirror and finds only exhaustion and emptiness. Like Sheen, he finds himself, in Horace’s words, “Stript of his golden crown, and purple robe”—and the discovery shatters him.

Quote of the Day (Edmund Campion, on Being “Made a Spectacle Unto God”)


“Spectaculum facti sumus Deo, angelis et hominibus. These are the words of St. Paul, Englished thus, ‘We are made a spectacle unto God, unto His angels and unto men,’ verified this day in me, who am here a spectacle unto my Lord God, a spectacle unto His angels and unto you men.”—English Jesuit Edmund Campion (1540-1581), standing on the scaffold, awaiting martyrdom for alleged conspiracy to overthrow Queen Elizabeth I, December 1, 1581, quoted in Evelyn Waugh, Edmund Campion (1935)

Friday, March 11, 2011

Quote of the Day (Christopher Buckley, on His Year at Sea)


“I remember dawn coming up over the Strait of Malacca; ragamuffin kids on the dock in Sumatra laughing as they pelted us with bananas; collecting dead flying fish off the deck and bringing them to our sweet, fat, toothless Danish cook to fry up for breakfast. I remember sailing into Hong Kong harbor and seeing my first junk; steaming upriver toward Bangkok, watching the sun rise and set fire to the gold-leafed pagoda roofs; climbing off the stern down a wriggly rope ladder into a sampan, paddling for dear life across the commerce-mad river into the jungle, where it was suddenly quiet and then suddenly loud with monkey-chatter and bird-shriek, the moonlight lambent on the palm fronds.”—Christopher Buckley, “My Year at Sea: Recalling the Splendid Isolation of Travel by Freighter,” The Atlantic, December 2010


He may not be as consequential (for better or worse) as his father, William, the founder of modern conservatism. But for my money, Christopher Buckley is a thousand times more engaging as a writer. I’m not sure that there’s a better satirist writing today, and anytime I see one of his works in print—an article, say, or, increasingly over the last several years, one of his marvelous, laugh-out-loud novels (No Way to Treat a First Lady, Florence of Arabia)—I pounce, knowing that I’m in for something good.


The first item I bought on my Kindle more than a year ago, in fact, was a Buckley novella, a Kindle-only product for The Atlantic, Cynara. That turned out to be funny and, in the end, surprisingly moving.

The nonfiction essay from which today’s quote comes is every bit as good, but in a different key. It’s as if the hypnotic, lyrical prose rhythms of Joseph Conrad had been absorbed, then refracted through the experience of an irreverent, late-20th-century American, recalling his “year of adventure,” going around the world on a tramp freighter in 1970, at 18 years old. Marvelous stuff.

Over the last year, it’s become obvious, from Garry Wills’ recent memoir, Outside Looking In, as well as Buckley’s own narrative of coping with the deaths of his parents, Losing Mum and Pup, that he had a deeply ambivalent relationship with his father. But in certain ways, William F. Buckley Jr. lived most intensely on the water in his boat, and had he had the chance to read Christopher’s essay on life at sea, he would have been thrilled that, in this sense at least, he greatly influenced his son.

Thursday, March 10, 2011

Quote of the Day (Graham Greene, on Writing as Therapy)


"Writing is a form of therapy; sometimes I wonder how all those who do not write, compose, or paint can manage to escape the madness, the melancholia, the panic fear which is inherent in the human situation."—Graham Greene, Ways of Escape (1980)

Wednesday, March 9, 2011

TV Quote of the Day (“Rumpole of the Bailey,” on An Englishman’s Castle)


“An Englishman's gin bottle is his castle.”-- Barrister Horace Rumpole (played by Leo McKern) to wife Hilda (played by Peggy Thorpe-Bates), a.k.a. “She Who Must Be Obeyed,” in “Rumpole and the Married Lady,” from Season One of Rumpole of the Bailey, directed by Graham Evans, written by John Mortimer, air date April 24, 1978

Tuesday, March 8, 2011

Quote of the Day (John McPhee, Paying Tribute to His Headmaster)

“[Deerfield Academy headmaster Frank] Boyden has the gift of authority. He looks fragile, his voice is uncommanding, but people do what he says. Without this touch, he would have lost the school on the first day he worked there. Of the seven boys who were in the academy when he took over in 1902, at least four were regarded by the populace with fear, and for a couple of years it had been the habit of Deerfield to cross the street when passing the academy….The boys were, on the average, a head taller and thirty pounds heavier than the headmaster. The first day went by without a crisis. Then, as the students were getting ready to leave, Boyden said, ‘Now we’re going to play football.’ Sports had not previously been a part of the program at the academy. Scrimmaging on the village common, the boys were amused at first, and interested in the novelty, but things suddenly deteriorated in a hail of four-letter words. With a sour look, the headmaster said, ‘Cut that out!’ That was all he said, and—inexplicably—it was all he had to say.”—John McPhee, The Headmaster: Frank L. Boyden of Deerfield (1966)

Today is the 80th birthday of prolific New Yorker contributor John McPhee (in the image accompanying this post). Far be it from me to argue with the deliberations of those who select Pulitzer Prize winners, but over the last several decades, as the Princeton, N.J. resident has concentrated increasingly—almost obsessively—on the physical world (e.g., Basin and Range), I have tended to avoid his work. 

(I blame The New Yorker, which, in the last years of the William Shawn era, became so musty that it allowed favored writers to muse, often at interminable length, on just about anything—see, for instance, E.J. Kahn Jr.’s Staffs of Life, a book that grew out of his multi-part series on grain.)

But in McPhee's early days, when he profiled real human beings, he gave an extraordinarily vivid picture of their world.

A Sense of Where You Are, for instance, remains, more than four decades after its appearance, the essential account for understanding why Bill Bradley made such a huge impression on the basketball world in college.

Likewise, The Headmaster masterfully describes Deerfield Academy’s benevolent despot, Frank Boyden, toward the end of his 66-year career at the Massachusetts prep school. The piece benefits more than a little from intimate familiarity with its subject. (McPhee was a product of the school himself during Boyden’s long reign.)

If you want to know not just about the rise of one of this country’s major prep schools—more than that, what makes an institution-builder tick—then this is the book for you.

It’s astonishing to realize that, in his six decades with the school, Boyden—whose demeanor, according to McPhee, suggested “a small, grumpy Labrador”—not only kept no written rules but also expelled only a half-dozen students altogether.

Would that record be possible to maintain in today’s world of broken homes that damage young lives, the substance abuse to which teens are exposed—and litigators ready to pounce on the lack of any written record of school policies?

(By the way, film fans: the 1982 Diane Keaton-Albert Finney movie Shoot the Moon was based on a screenplay by Bo Goldman, a former Princeton classmate of McPhee’s. More than a decade after its premiere, McPhee’s ex-wife sued the filmmakers, alleging that the events onscreen depicted her marital strife as witnessed by Goldman while he was a guest of the couple. The case was settled out of court.)

Monday, March 7, 2011

Quote of the Day (Heywood Broun, Imagining a Hogwarts for Knights, Decades Before J.K. Rowling)


“Of all the pupils at the knight school Gawaine le Cœur-Hardy was among the least promising. He was tall and sturdy, but his instructors soon discovered that he lacked spirit. He would hide in the woods when the jousting class was called, although his companions and members of the faculty sought to appeal to his better nature by shouting to him to come out and break his neck like a man. Even when they told him that the lances were padded, the horses no more than ponies and the field unusually soft for late autumn, Gawaine refused to grow enthusiastic.”—Heywood Broun, “The Fifty-First Dragon,” from Seeing Things at Night (1921)

Sunday, March 6, 2011

Quote of the Day (Alexander Solzhenitsyn, on “The Line Dividing Good and Evil”)


"If only it were all so simple! If only there were evil people somewhere insidiously committing evil deeds, and it were necessary only to separate them from the rest of us and destroy them. But the line dividing good and evil cuts through the heart of every human being. And who is willing to destroy a piece of his own heart?"—Alexander Solzhenitsyn, The Gulag Archipelago: an Experiment in Literary Investigation (1974)

Saturday, March 5, 2011

Quote of the Day (Winston Churchill, on the “Iron Curtain”)


“From Stettin in the Baltic to Trieste in the Adriatic, an iron curtain has descended across the continent. Behind that line lie all the capitals of the ancient states of Central and Eastern Europe. Warsaw, Berlin, Prague, Vienna, Budapest, Belgrade, Bucharest and Sofia, all these famous cities and the populations around them lie in what I must call the Soviet sphere, and all are subject in one form or another, not only to Soviet influence but to a very high and, in some cases, increasing measure of control from Moscow. Athens alone - Greece with its immortal glories - is free to decide its future at an election under British, American and French observation. The Russian-dominated Polish Government has been encouraged to make enormous and wrongful inroads upon Germany, and mass expulsions of millions of Germans on a scale grievous and undreamed-of are now taking place. The Communist parties, which were very small in all these Eastern States of Europe, have been raised to pre-eminence and power far beyond their numbers and are seeking everywhere to obtain totalitarian control.”—Sir Winston Churchill, Address At Westminster College, Fulton, Mo., March 5, 1946

Look beyond that first sentence in the above quote, the one that the 20th century’s greatest phrasemaker turned into a Cold War soundbite. With his good friend President Harry Truman in the audience, watching as he accepted an honorary degree from Westminster, the former British Prime Minister—the man who had led his country in its hour of greatest peril—was now, after a dutiful acknowledgement of Joseph Stalin’s massive contribution to the Grand Alliance, somberly outlining the Soviet leader’s recent catalogue of electoral crimes.

The 65th anniversary of the Iron Curtain speech comes the same week that the media reported the death of Judith Coplon Socolov, a Justice Department analyst who lived nearly 60 years after her convictions in two espionage trials. (An appellate court judge tossed out the convictions because of FBI agents’ perjury concerning the wiretapping of Socolov and their failure to obtain search warrants, and, in 1967, the government decided not to pursue the case any longer. However, the judge affirmed that Ms. Socolov was guilty, a judgment confirmed by the 1995 disclosure of the VENONA decrypts of intercepted cables concerning Alger Hiss, Klaus Fuchs, Ted Hall, Julius and Ethel Rosenberg and other spies, evidence that the government could not disclose at the trials.)

Over 30 years ago, Vivian Gornick’s The Romance of American Communism showed how, for a group of Old Leftists, Soviet-style Marxism became a golden ideology from the 1930s until Nikita Khrushchev’s 1956 “secret speech” outlining the terror of Stalin. In Judith Socolov's case, however, “romance” took on a double meaning, as becomes clear in the lede of Sam Roberts’ New York Times obit:

“Judith Socolov, who as a diminutive Barnard graduate named Judith Coplon was convicted of espionage more than 60 years ago after embracing a utopian vision of communism and falling in love with a Soviet agent, died Saturday in Manhattan.”

Oh, what she did for love....

A little later, the article quotes Ms. Socolov’s daughter Emily on whether her mother was guilty: “Was she a spy? I think it’s another question that I ask: Was she part of a community that felt that they were going to bring, by their actions, an age of peace and justice and an equal share for all and the abolishing of color lines and class lines?”

The obit does not describe the younger Ms. Socolov’s current profession, but it sounds as if she is either currently a politician or should seriously consider becoming one. After all, she appears to have absorbed one of the key rules that any budding politico learns in an encounter with the media: When faced with a question you don’t like, make believe it’s another question and answer that one.

There were unquestionably victims of anti-Communist hysteria in the United States, but Ms. Socolov was certainly not one of them. Her defense at her two trials—that she was meeting a Soviet intelligence officer because she was working on a book for which she was culling the most intimate of secrets during pillow talk—was simply not credible, since she could not produce any manuscript or even an outline.

And so, Judith Socolov’s remaining defense—the one her daughter raises now like a fallen flag—is that American Communism was the best force for a brave new world of labor and civil rights.

All too many academic historians have allowed that argument to go unchallenged. It should be demolished on several points:

1) American Communists were engaged in willful ignorance about what was occurring behind the Iron Curtain. The Moscow “show trials” of the late 1930s were demonstrated to be miscarriages of justice by a panel that included American philosopher and educator John Dewey, hardly a conservative. As if that weren‘t enough to make them wonder what was going on, American Communists who traveled to the U.S.S.R. repeatedly faced the dilemma of how to account for the sudden, unexplained disappearance of friends in Stalin’s labor paradise.

2) American Communists called for enhanced electoral rights at home and their destruction abroad. What a contradiction: while the great cause of African-American voting rights would be upheld by the party faithful in the U.S., anyone who attempted to exercise the franchise in a way that deviated from the party line would be crushed in Eastern Europe.

3) American Communists faced more than a simple choice between country-club Republican capitalism and Democratic Party racism. The New Deal demonstrated that there was a small-d democratic alternative that safeguarded the rights of labor. Moreover, the liberal wing of the Democratic Party had, by 1948, moved decisively toward embracing civil rights. (It should also be stated that the GOP had not yet embraced the “southern strategy” to which it would turn in Richard Nixon’s 1968 Presidential campaign.)

4) The rights championed by Communists did not extend to religious liberty. When I attended parochial school as a child, many classmates had parents who had emigrated from Eastern Europe and Cuba in the wake of Communist takeovers. Yet Catholics were hardly the only religious victims of Communism: Stalin’s effort to install atheism suppressed people of many religious faiths, including Jews, who had to endure the closure of synagogues as he consolidated his hold on power in the 1930s.


5) The legal rights possessed by American Communists simply did not exist in the Soviet Eden they supported. Ms. Socolov was able to live out the remainder of her life in freedom because a judge, despite certainty of her guilt, believed that her right to a fair trial had been fatally undermined by prosecutorial misconduct. No such safeguards existed in the Soviet Union, where people were summarily executed.

Defenders of American Communists are not content to say that the old Leftists lived out the remainder of their lives doing good (in Ms. Socolov’s case, raising a family, earning a master’s degree in education, and tutoring women in prison in creative writing). They also insist that these were idealists who did not know the truth about the dictator they defended.

But how could these Old Leftists not know at least some of what was occurring? Anyone who picked up a newspaper could have read even Churchill's recital of electoral abuses in Eastern Europe--and the many college-educated devotees of this new political gospel, such as Ms. Socolov, who tried to make contact with people in the Soviet Union, had even more reason to sense that something was awry. To claim otherwise is to adopt what has been termed “the Sergeant Schultz defense” (named, of course, for the hapless Hogan’s Heroes German soldier): “I know nothing.”

In a number of lands freed from the yoke of dictatorships, truth commissions have been established that have determined the extent of past political crimes. The lack of critical thinking in academe and journalism about the cost of the crimes of Ms. Socolov and other Soviet spies makes one long for such a body (or, at least, something like the Dewey-headed commission on the show trials) here in the U.S. There is a cost for accepting uncritically the claims of these Old Leftists, best expressed by Alexander Solzhenitsyn’s The Gulag Archipelago:

“In keeping silent about evil, in burying it so deep within us that no sign of it appears on the surface, we are implanting it, and it will rise up a thousand fold in the future. When we neither punish nor reproach evildoers, we are not simply protecting their trivial old age, we are thereby ripping the foundations of justice from beneath new generations."