Friday, February 29, 2008

This Day in Colonial History (Deerfield Massacre Foreshadows American Struggles of Uncertainty and Terror)

February 29, 1704—After trekking 300 miles through winter snow and cold, a combined French-Indian war party of nearly 300 attacked the frontier town of Deerfield, Massachusetts, while the inhabitants were still fast asleep. With a night watchman away from his post and heavy snowdrifts muffling sound, the settlers inside the fort were taken completely by surprise.

By the end of the attack, 56 men, women and children lay dead; 109 were carried away into captivity; and half of this Connecticut River Valley town’s houses were burned.

I wrote “end” in the last sentence, but the ordeal had only begun for the survivors of the Deerfield Massacre.

In a certain sense, long after the English colonists had died, the vocabulary used to describe the experience continued to live on in American literature and, in a curious half-life, even in classic Hollywood films.

I visited Deerfield in October 2001 (when I took this picture), only a month after another surprise attack in which Americans viewed scenes of burning and mourned their dead: the attacks on the World Trade Center. Talking with the owner of the bed-and-breakfast where I was staying, I relived my shock on 9/11 and my numbness in the days immediately afterward.

Some of the same feelings must have been experienced by the survivors of the Deerfield Massacre—human nature hasn’t changed all that much over three centuries.

The massacre was just one incident in a long series of wars between the English and French colonists, in which Native Americans assumed a complex, often dreaded, role.

In my high school, a troubled administrator demonstrated, in a small but telling way, why he had been deeply admired as a history teacher: one morning he gave our class an excellent mnemonic device for recalling these assorted conflicts that almost nobody cared about anymore—the acronym WAGS, standing for King William’s War, Queen Anne’s War, King George’s War, and the Seven Years’ War.

Deerfield played a significant part early on in WAGS. 

In September 1675, Indian forces led by King Philip beat English settlers at Bloody Brook, only two miles from Deerfield, then proceeded to massacre 64 Deerfield men. The town was abandoned twice before settlers finally were able to erect, with the help of stronger fortifications, a permanent settlement.

Or, at least, what looked permanent until the night of February 29, 1704.

The First “Long, Twilight Struggle”

Francis Parkman’s history of the period is called A Half-Century of Conflict. In an unusual parallel, nearly three centuries later, the Cold War between the United States and the Soviet Union might be described in the same way. And, with no end in sight to the West’s encounter with Islamic fundamentalism, I would argue that we live in an age that might last just as long.

In his inaugural address, President John F. Kennedy memorably called on Americans to “bear the burden of a long twilight struggle.” In the early years of his home state of Massachusetts, and especially in Deerfield, citizens were asked to engage in a struggle of similar duration and uncertainty.

Though Indian raids seldom reached the bloodbath level of Deerfield, settlers were continually picked off one or two at a time, away from fortifications, when they were most isolated and vulnerable.

“This system of petty, secret, and transient attack put the impoverished colonies to an immense charge in maintaining a cordon of militia along their northern frontier--a precaution often as vain as it was costly,” observed Parkman.

The volume on Massachusetts in the justly famous “American Guide” series, produced by the Works Progress Administration during the Depression, called Deerfield “a beautiful ghost,” representing “the perfect and beautiful statement of the tragic and acute moment when one civilization is destroyed by another.”

I don’t think I’d go that far: after all, the town retains the tony prep school Deerfield Academy, the Yankee Candle headquarters (in nearby South Deerfield), and enough money made off of tourists to keep the tumbleweeds away. 

But as I walked the mile-long stretch of Old Deerfield Street, with its carefully preserved colonial homes, I sensed something unique about the setting. 

I didn't grasp what it was until I came across the statement by historian Richard Slotkin that, with its incessant demands for vigilance, "The frontier made Europeans into Americans."

The “Redeemed Captive” and the Unredeemed One

An entire literary genre evolved, a kind of spiritual crisis memoir, written by survivors of attacks by Native Americans: captivity narratives. The first I encountered in Carl Hovde’s brilliant American Literature class at Columbia University was Mary White Rowlandson’s A True History of the Captivity and Restoration of Mrs. Mary Rowlandson, A Minister's Wife in New-England (1682). 

These “captivity narratives” combined two literary forms: the jeremiad, or a call on the faithful to return to the way of truth before it’s too late, and a cracking good adventure story.

Another landmark of the form was produced by the most important Deerfield captive, the Reverend John Williams. His 1707 account of the massacre, The Redeemed Captive Returning to Zion, became the colonial equivalent of a bestseller, going through six editions by the end of the eighteenth century.

Far more palpable than Williams’ gratitude to God for deliverance from his ordeal, however, was his revulsion at witnessing the slaughter of his wife and a couple of his children. 

While the minister was one of 88 prisoners who had survived the forced march to Canada, and among the 59 eventually ransomed, he was horrified that his daughter Eunice was among the 29 captives who refused to return.

According to a penetrating analysis in Richard I. Melvoin’s New England Outpost: War and Society in Colonial Deerfield, Eunice and the 28 other intransigents were overwhelmingly young and female. Thus, they had few if any memories of the life and culture from which they had been snatched, and were more open to psychological manipulation by their captors.

Three years after the attack, the minister’s daughter had forgotten both her English and her catechism. Nine years later she married a Caughnawaga Indian, thereafter going by the name of Marguerite A’ongote Gannenstenhawi—with its evocation of her new Catholic and Native American identity, a double poke in the eye of the Puritans.

In 1740, a decade after her father’s death, Eunice/Marguerite was prevailed upon to visit Deerfield again with her husband, and made three more trips after that. On the one occasion when she stepped inside her old church in her “civilized” garb, “she impatiently discarded her gown and resumed her blanket,” wrote Parkman.

In their loathing of Eunice/Marguerite’s Catholicism, the Rev. Williams and Parkman sound an awful lot like late 20th-century parents ready to hire a deprogrammer to snatch a child back from a religious cult.

Yet another feeling, more alien now to a country in a post-civil rights era, also emerges in their accounts: fear of miscegenation. It’s the same panic, in a different setting, shared by Southern slaveholders over the possibility of racial mixing.

More than that, it speaks to a violation of hearth and home—a psychic and even physical rape—and the impotence of the white male protector of community and family order.

The Redemption Narrative on Film

That same disgust is the coiled spring that propels the plot of the great John Ford western, The Searchers. In that 1956 film, an Indian attack on the Edwards homestead at the edge of the wilderness leaves all dead but for two young daughters carried off by the war party.

At the heart of the Puritan venture into the New World was what minister Samuel Danforth called, in one of the most famous jeremiads, an “errand into the wilderness.” 

Yet the isolation of the wilderness not only afforded the colonists unfettered freedom to worship as they pleased but also exposed them to Satan’s temptations: desire, for sex and for vengeance for wrongs committed against them.

The religious element does not enter into the Ford film, but the same elements of the captivity narrative endure: homes in a fragile outpost between civilization and wilderness, where people long for freedom and a chance to advance in life, suddenly left smoking ruins (in a scene alluded to in George Lucas’ Star Wars); a young girl carried off to God-knows-what fate; and a long, often fruitless multiyear search to bring her back to her relatives.

Now, however, in a subtle shift, the captivity narrative morphs into a redemption narrative, with the emphasis on a would-be savior who seeks not only to return a long-lost loved one but to redeem a family, a community, and especially his own heart hardened by bitterness.

The agent of that redemption is the girls’ uncle, Ethan, with little of the Rev. Williams’ piety but every bit of his revulsion over the disturbing mixture of blood.

Ethan Edwards is surely one of the most problematic heroes in American culture. As embodied by John Wayne in perhaps the most complex role he ever took on, he is gifted with all the hardy self-reliance and know-how of the archetypal American frontiersman, but also cursed with a loner streak that has left him fatally unable to have a family of his own.

Mix into all of this Ethan’s guilt over his unacknowledged love for his brother’s wife and over his inability to prevent what the movie implies is her torture, rape and death. By the time Ethan learns that one of his nieces, Debbie, has become the squaw of the Native American mastermind of the raid, the rescuer has become an avenger ready to destroy what he now regards as his polluted flesh and blood—and whatever soul he himself has left.

The climax of the film shows Ethan finally catching up with his niece, years later, roughly seizing her and lifting her body toward the sky. The question is posed in its starkest terms now: Will the hero become the destroyer?

But suddenly, the presence of God so often missing from the film returns as the force of love. The white racist, the mirror image of Debbie's Indian husband, the appropriately named chief Scar, softens. With utter tenderness, Edwards says, “Let’s go home, Debbie.”

Like Ethan Edwards and the Rev. Williams, America in the age of terror looks to snatch back something seized from it—a dream of innocence, a moment of peace—and, as the anxious initial days turn to years, it struggles with feelings of resentment. 

Facing the necessary task of remembering and rebuilding, we are less far removed than we imagine from our wilderness origins.

Quote of the Day

“To the broody hen the notion would seem monstrous that there should be a creature in the world to whom a nestful of eggs was not the utterly fascinating and precious and never-to-be-too-much-sat-upon object which it is to her.” – William James, The Principles of Psychology

Thursday, February 28, 2008

This Day in Television History


February 28, 1983 – After 251 episodes and 11 seasons – three times the length of the conflict it depicted—the CBS comedy about the Korean War, M*A*S*H, came to a triumphant end, with a record 125 million viewers watching what became of Dr. Benjamin Franklin “Hawkeye” Pierce and the rest of the 4077th unit.

Eight writers were credited with the script for “Goodbye, Farewell and Amen,” including series star Alan Alda, who also directed the episode, and future Judging Amy executive producer, blogger and Roman Catholic convert Karen Hall.

In a way, that reflects the multiple hands that shaped material which began as a novel by Richard Hooker, was adapted by Ring Lardner Jr. into an Oscar-winning screenplay (with extensive improvisation and overlapping dialogue encouraged by director Robert Altman), then developed into a TV series in 1972. Along the way, events set in one Asian conflict served as a not-so-distant mirror of a later one: the Vietnam War.

So much about the show seems not just from another decade, but in TV terms, another lifetime ago:

* It survived first-season ratings so abysmal that nowadays, it wouldn’t have made it past the second episode. (But networks had greater patience then—witness a similar ratings trajectory for Cheers a decade later.)
* It was not just a comedy but, like another ‘70s CBS comic blockbuster, All in the Family, one that challenged censors and stretched notions of the genre (dispensing with the laugh track in operating-room scenes, freely mixing dramatic as well as comic elements, even killing off the beloved Colonel Henry Blake).
* Its massive audience at its denouement could never be repeated today, when cable TV, the Internet, and God-knows-what-other-new media compete with networks for viewers’ attention.

Faithful reader, I have a confession: I have never watched that last episode. My only guess as to why I missed it 25 years ago is that I was taking a continuing-ed course that night and, without a VCR at the time, couldn’t tape it. But somehow I have never gotten around to seeing the 2 1/2-hour episode in the quarter-century since—not even given the ubiquity of the show in video and DVD formats, not to mention in reruns on one channel or another. (TV Land? Fox? Who knows what else?)

Mea culpa, mea culpa. I’ll rectify this. Someday. Hopefully soon.

Believe it or not, I have had close encounters with the two creative forces most closely associated with the show: writer/co-producer Larry Gelbart and Alda.

The first of the two I saw was Alda. Back when I was still a pup (okay, a teenager), I worked as a page (i.e., student assistant) at my local public library. One day, glancing at the circulation desk, I saw a tall, dark-haired, lean man in his late 30s standing there. I was staggered to recognize Alda, though, in truth, I shouldn’t have been—at the time, he was a resident of the New Jersey town to the immediate south of mine.

Emptying the book bin at the desk before it became too heavy, I encountered an older student assistant whom I knew from my high school. She was practically devouring the star with her eyes. “I can’t believe it!” she said. “It’s Alan Alda!”

Unfortunately for Alda, he was being served not by a star-struck teen who would have expedited his checkout time but by a short, slight, gray-haired Englishwoman with not a clue about his identity. Adjusting her glasses as she stared at the piece of white plastic he’d given her, she finally announced: “Sir, your card has expired!”

Unlike other celebrities given to throwing their weight around under any and all circumstances (take a bow, Oprah Winfrey, for your after-hours hissy fit at that exclusive Paris store!), Alda didn’t throw a tantrum or even look a bit irked when the English circ-desk clerk told him this…nor when she sat down at a typewriter and, in her best hunt-and-peck style, created an entirely new card for him…not even when a long line of other patrons waited behind him at the desk, winking and nudging each other about who else was there.

This incident occurred when the show was very close to its creative peak. More and more, particularly after Gelbart departed following the fourth season, Alda took up much of the writing and directing slack. It provided a perfect springboard for similar work by the actor in films, when, with The Seduction of Joe Tynan and The Four Seasons, he seemed for a time to bid fair to become another Woody Allen-style comic star-writer-director hyphenate.

But during this time, M*A*S*H became less like the ribald series I had loved in the first place—hardly a surprise, given that Alda had taken on an increasingly high profile as the prototypical late ‘70s Sensitive New Age Guy. Add to that the restlessness and constant changes in the ensemble cast over the years (blazing a trail that Law and Order would later follow) and you had a series more sanctimonious, more obvious in telegraphing its feelings. (The photo accompanying this blog entry comes, of course, from the series' early incarnation, when Wayne Rogers still played Hawkeye sidekick Trapper John.)

I met Gelbart 10 years ago, at an appearance at Fairleigh Dickinson University in Teaneck, N.J., for his new memoir, Laughing Matters. A member of the audience asked Gelbart about the very thing that had troubled me: the show’s change from its hard-edged early seasons to its softer tone later.

Gelbart admitted that it was not necessarily a direction he would have taken, but by that time it had in effect become Alda’s show and it was his prerogative to do as he wished.

The writer, who time and again that night let loose one outspoken, wildly funny line after another, now seemed to choose his words more carefully. I sensed that he had major concerns about how the show had changed after his departure, but he was too respectful of Alda’s contributions—and too diplomatic—to give full vent to his opinions on this score.

Quote of the Day

"I'd demand a recount." – William F. Buckley, Jr., the longshot 1965 Conservative Party candidate, when asked what he would do if he won the race for Mayor of New York.
(He was wrong—often—on a host of matters large and small, from the South’s right to resist civil rights legislation and Edgar Smith's innocence to the Beatles’ lack of musical skill. But as this quip demonstrates, he also brought—often—to political discourse uncommon wit, grace, and appreciation for the humanity of opponents. Requiescat in pace, WFB.)

Wednesday, February 27, 2008

This Day in World History

February 27, 1933—A fire that destroyed the home of Germany’s parliament, the Reichstag, provided Chancellor Adolf Hitler and his newly installed Nazi Party the flimsy pretext they needed to crack down on civil liberties; wipe out their principal political opponents, the Communist Party; and end their nation’s fledgling democratic republic.

Best known today for “Charlie and the Chocolate Factory,” “James and the Giant Peach” and other stories for children, the British writer Roald Dahl also wrote a series of brilliantly unnerving short stories and teleplays for adults in the 1950s. One such story, “Genesis and Catastrophe,” seems at first to be a conventional tale of how a mother kept faith, even after three miscarriages, until she was safely able to deliver her fourth child. The twist comes at the end, when it is revealed that the infant who lived was none other than Adolf Hitler.

If the birth of Hitler was the “genesis,” then the Reichstag fire might mark the point at which the governing style of Hitler – the “catastrophe,” if you will--came into unmistakable being. It featured thuggery, the search for a convenient scapegoat, opportunistic exploitation of events, and, most of all, a massive superstructure of lies.

Empty since December, the immense Reichstag was awaiting the results of new elections scheduled for March 5. A little after 9 on the night of Feb. 27, a fire broke out in the building, and within a half hour all that was left were the walls.

Today, nobody knows exactly what happened—largely because at least a few people at the heart of the mystery turned up dead, probably on orders of Hitler.

Two witnesses at the postwar Nuremberg trials—one from the Prussian Ministry of the Interior, the other from the Gestapo—indicated that Joseph Goebbels (sporting the Orwellian title of “Minister for People's Enlightenment and Propaganda”) and Prussian Interior Minister Hermann Göring, respectively, had advance knowledge of the blaze. Feeding the suspicions was the fact that an underground passageway led from Göring’s office to the Reichstag.

However, the British historian A.J.P. Taylor—a leftist with no motive for exonerating the Nazis—did not believe that Hitler’s men manufactured evidence against the Communists in the trials that followed. Though five Communists were put on trial for setting the fire, all but one were acquitted.

That one was a half-crazed 24-year-old Dutch construction worker, Marinus van der Lubbe, who had been making his way toward the Soviet Union—a state he had come to admire—when he circled back to Germany, hoping to rouse the dormant working class with a spectacular action.

According to the careful account of the incident in the highly praised The Coming of the Third Reich, by Richard J. Evans, van der Lubbe chose the Reichstag as “the supreme symbol of the bourgeois political order that, he thought, had made his life and that of so many other unemployed young men a misery.” He spent his last bit of money on the 27th for matches and firelighters.

When apprehended by the non-Nazi Prussian political police, van der Lubbe readily confessed to sole responsibility for the blaze. Hitler and his men, however, were having none of it. The Chancellor declared to the group gathered around him on a balcony above the Chamber: “There will be no more mercy now; anyone who stands in our way will be butchered.” The Communists would be “shot,” “hanged,” “arrested,” he vowed.

He was as good—or as bad—as his word. An emergency decree was drafted by Göring aide Ludwig Grauert, providing legal cover for the wholesale arrests of Communists that were already taking place. An additional clause, added by Reich Minister of the Interior Wilhelm Frick, enabled the Cabinet, rather than the German President (an office then in the hands of the non-Nazi, but superannuated, Paul von Hindenburg), to intervene in affairs of the federated states. Hitler presented the decree to his cabinet the morning after the fire and promptly secured Hindenburg’s agreement.

Within three weeks, 10,000 Communists had been put in custody. For all intents and purposes, after the emergency decree took effect on February 28, the Communists were outlawed.

In 1932, the Communists gave as good as they got from the Nazis in the street fighting that swirled around that year’s elections. The Reichstag fire, however, took them completely by surprise. Their removal from Hitler’s path also marked, in a larger sense, the end of any organized opposition to Germany’s new dictator, for socialists and anarchists were also rounded up and a variety of other political organizations and newspapers were shut down.

In January 1934, after being chained for up to seven months in his cell and undergoing a hunger strike to protest his conditions in detention, Marinus van der Lubbe was beheaded. In January of this year, a German federal prosecutor overturned the luckless Dutchman’s conviction. The arsonist had been inadvertently overlooked for nearly 10 years after a 1998 law provided for pardons for people convicted by the Nazi regime, on the grounds that Nazi law “went against the basic ideas of justice.”

Quote of the Day

“There is a tragic flaw in our precious Constitution and I don't know what can be done to fix it. This is it: Only nut cases want to be president.” – Kurt Vonnegut, Jr. (thanks to Brian for the suggestion)
(As of this morning, only three nutcases – and counting downward—left this year.)

Tuesday, February 26, 2008

This Day in World History

February 26, 1848 – The Communist Manifesto, written by Karl Marx and Friedrich Engels in six weeks, was published in London as a 23-page document featuring a green cover and measuring 21.5 by 13.4 cm.

The program was developed at the request of the Communist League, an organization of German émigré workers in London. Associate member J.E. Burghard hastily set the text in type. Appropriately enough for a political economy text that left so much collateral damage in its wake over the years, the printed book was riddled with errors.

“A specter is haunting Europe—the specter of Communism,” read the opening sentence, which turned out to be prematurely boastful. Even as revolution overtook France, Germany, Hungary, and Italy in the manifesto’s year of publication, the document made little impression, even though translated copies were sent to several countries.

When Communism finally became the reigning ideology of a government, nearly 70 years later, it took root in a nation—Russia—which had imploded because of a ruinous world war. Its next “great leap forward” (to use a phrase from the Marxist Mao Tse-tung that described a period in China that was anything but) took place as a result of another cataclysm: World War II.

In an op-ed piece for The Wall Street Journal nearly two years ago, business historian Louis B. Galambos noted President Dwight Eisenhower’s fear that “the U.S. might soon have no trading partners, no allies dedicated to democracy or capitalism.”

Though one European country after another was overrun or threatened by Communism in the middle of the twentieth century, it’s easy in retrospect to dismiss the concerns of Ike and others as overblown. As important as the West’s military defenses—advocates of “soft power” would insist, even more important—was the realization that Marxism reduced the complexity of life to the ridiculously simplistic foundation of class.

Although I feel deep misgivings about many aspects of Pope John Paul II’s governance of the Roman Catholic Church, I believe that history will treat him kindly, much as it has treated Winston Churchill, another figure of frequent wrongheadedness and bulldog tenacity: because, in a very dark hour of the world, in the most important cause of his time, he rallied the forces of freedom against oppression.

Jonathan Kwitny’s 1997 biography, Man of the Century: The Life and Times of Pope John Paul II, is far more nuanced than its hyperbolic title might suggest. Kwitny itemizes, in detail, the damage done to the future of the church because of the pope’s refusal to accord women a role in the priesthood, to allow priests to marry, or to crack down on corruption inside the church. (And this was written before the sex abuse scandal had reached crisis proportions in America.)

Yet Kwitny does not scant the pontiff’s greater achievement: taking on Communism. The victory over the Marxist-inspired Soviet regime was the greatest nonviolent revolution of the past century. An empire that held millions under its fearful sway collapsed without a shot being fired.

And, as Kwitny makes plain—disputing Carl Bernstein—the Pope saved the anti-communist opposition in its worst days, after the 1981 crackdown on Solidarity, at a point when the Reagan administration had left the dissident Polish labor movement to its own devices. No less an authority than Mikhail Gorbachev said, “Everything that happened in Eastern Europe during these past few years would have been impossible without the pope.”

Likewise, in a fine blog posting, “John Paul II, Communism and Liberalism,” Canadian theologian Craig Carter notes: “John Paul II believed that the core weakness of Communism was a flawed concept of the human person and a shriveled and reductionistic concept of freedom.”

Marx might have derided religion as “the opiate of the people.” But in the 21st century, the ideology revealed as the true hallucinogen is the one created by the German economist-philosopher-journalist with the hyperbolic style in the Reading Room of the British Museum.

Marx’s 20th-century disciple Joseph Stalin asked, “How many divisions does the Pope have?” As John Paul knew, a soulless materialistic philosophy is far weaker than it believes.

Quote of the Day

“My doctor told me I shouldn’t work out until I’m in better shape. I told him, ‘All right; don’t send me a bill until I pay you.’” – Comedian Steven Wright
(Hey, maybe I’ll try that, too!)

Monday, February 25, 2008

This Day in American History


February 25, 1855—William Poole, better known as “Bill the Butcher” of the Bowery Boys Gang, was shot at Stanwix Hall in New York by gunmen doing the bidding of Tammany Hall tough John Morrissey. The Butcher, who died two weeks later, was buried in an unmarked grave in Green-Wood Cemetery.

Poole became adept in using knives by working in his father’s butcher shop, then by opening his own. His muscular frame made him a natural “shoulder-hitter,” or enforcer, for the virulently nativist True American or Know-Nothing Order of the Star-Spangled Banner (the Know-Nothings earned their nickname because their oath of secrecy enjoined them to answer, when questioned about the group, “I know nothing”). It was in this latter capacity that he made the fateful acquaintance of Morrissey.

Some months earlier, Morrissey—who made his name among the immigrant Irish in challenging the True Americans’ control of the streets and the polls by smashing up their meeting halls—had gone into the American Club on Water Street and challenged heavyweight champ Tom Hyer and Bill the Butcher. Morrissey was cut up so badly that The Butcher decided to let him live, as an example of what could happen to the unwary.

On the night he was shot, Poole was in Stanwix Hall when Morrissey came in, leveled his gun, and pulled the trigger three times—with no result. (Misfires of that kind happened surprisingly often in those days.) Poole was ready to use his knife on the Tammany tough when Morrissey was hustled out in the nick of time. Some time later, Morrissey’s henchmen came back and gunned The Butcher down.

In 2003, a granite headstone was placed on Poole’s grave bearing his last words: “Goodbye, boys: I die a true American.” The fact that this anti-Irish, anti-Catholic hood was posthumously remembered in this fashion might derive from his recent curious cinematic fame, as “Bill the Butcher” Cutting, in Martin Scorsese’s 2002 film, Gangs of New York.

As in There Will Be Blood, Daniel Day-Lewis allows his Snidely Whiplash moustache to perform half his acting job in this role. (That's the actor in costume in the photo accompanying this blog entry.) Moreover, Scorsese took dramatic license with the real-life facts by placing The Butcher’s death closer to the New York City draft riots of 1863.

The director rather freely adapted the Herbert Asbury history of the 19th-century New York underworld. Scorsese leaves his own very individual thumbprint on every one of his films (notably through his fascination with crime, betrayal, and the costs of living within a group's codes), so even his worst work is not without interest.

But aside from that ridiculous handlebar moustache of Day-Lewis', how much do you recall about the film? One great scene where the Irish gang led by Liam Neeson comes in from all sides to form a great mass to face off against the nativists; that curly wig of Cameron Diaz’s; and anything else? I didn’t think so.

Instead of concocting a ramshackle plot with events that nobody can remember, Scorsese would have been better off adapting a novel that covered the same time period and milieu (and even includes the Poole killing as a background incident)—Peter Quinn’s Banished Children of Eve—which has the advantage of an actual plot and characters an audience could really care about.

Quote of the Day

“I…drink…your…milkshake!”—Daniel Plainview (played by Daniel Day-Lewis), in There Will Be Blood, screenplay by Paul Thomas Anderson, based on the novel Oil!, by Upton Sinclair
(Day-Lewis certainly drank Hollywood’s milkshake last night in winning Best Actor, especially given what might be the hammiest, most over-the-top scene by an actor since Jack Nicholson’s “Heeere’s Johnny!” in The Shining, maybe even since John Barrymore imitated a camel in Twentieth Century)

Sunday, February 24, 2008

This Day in American History

February 24, 1868—By a straight party-line vote of 126 to 47, the Republican-dominated House of Representatives impeached President Andrew Johnson. Several months later, in the Senate, he escaped conviction and removal from office by a single vote.

Public perceptions of the impeachment of Johnson have changed, then changed again, over the years, in a fascinating case study of historiography. In the years immediately after the event, the President and his defenders were vilified in the North.

In the first half of the 20th century, a number of historians, following the lead of Columbia University professors William Dunning and John W. Burgess, depicted Reconstruction as an era marked by corruption and Radical Republican vengeance. They viewed Andrew Johnson as at worst an imperfect upholder of the Constitution and at best a man of courage.

Starting in the mid-1950s, with the rise of the civil rights movement, however, a more complicated picture emerged. Kenneth Stampp, James McPherson, David Herbert Donald, John Hope Franklin, Eric McKitrick and Eric Foner (the last two in the same Columbia history department as Dunning) pursued the lead of African-American historian W.E.B. DuBois’s Black Reconstruction in America, 1860-1880 (minus its Marxist-influenced ideology) in showing this era as a lost opportunity for American racial and social progress. These “revisionists” see President Johnson as deeply racist and his adversaries as more moderate than the loaded term “Radical Republican” would suggest.

The events leading up to the impeachment votes bolster the argument for the last viewpoint. Johnson, a lifelong Democrat, was nominated by Republicans as Abraham Lincoln’s running mate in the 1864 election as a way to win the votes of Union Democrats (including those from Southern border states) that year. Lincoln’s assassination, then, put in the White House a man largely out of sympathy with the party that controlled Congress.

The Dunning school posited that Johnson was merely following his predecessor’s lenient Reconstruction policy. Its adherents ignored, however, both Lincoln’s political suppleness and the evolution of his policies. Unlike Johnson, Lincoln was a master of men, a politician with a highly attuned sense of exactly what was politically possible. If anyone could have reconciled Southerners and Radical Republicans, it was he.

But there was a limit to how much even this man who called for “malice toward none” and “charity for all” would accept. In Abraham Lincoln: The Man Behind the Myths, Stephen B. Oates takes issue with the idea that Lincoln would have been accommodating to the South. In fact, he shows how, on issue after issue (notably compensation for slaves), the South’s stubbornness pushed him increasingly toward progressive, pro-freedmen policies.

Though initially interested in indicting leading Confederates (especially the Southern aristocrats that he blamed for starting the war) for treason, Johnson soon parted ways with the Republicans over using government to promote economic development and over greater political and economic opportunity for freed slaves.

At every turn, Johnson used his office to circumvent the will of Congress: by appointing conservative generals to administer Southern military districts, by giving civilian governments the authority to control voter registration and the election of convention delegates, and by obstructing any role for freedmen in the new Southern governments.

Johnson’s personal failings didn’t help. His intoxication during his inaugural address as Vice-President (Lincoln kept his eyes shut throughout the painful ordeal, then rose to his feet to deliver his immortal Second Inaugural Address) enabled detractors to depict him (wrongly) as a habitual drunkard. His customary intransigence and public belligerence toward opponents – even, on Washington’s birthday in 1866, accusing Congress of trying to assassinate him and of fomenting a second rebellion—unnecessarily alienated many.

The showdown between President and Congress came over the Tenure of Office Act, which provided that all officials confirmed by the Senate could only be removed with that body’s approval. Johnson sought to remove Secretary of War Edwin Stanton for defiantly claiming in a Cabinet meeting that military governors in the South were answerable to Congress, not to the President.

At first, Johnson sought to fire Stanton by offering his post to General Ulysses S. Grant. The Union hero, having his own disagreements with the President on Reconstruction, wouldn’t take the bait. A long, painful search for a successor then led the President to General Lorenzo Thomas, who made the mistake of acceding to Stanton’s wish for “time for reflection.”

Bolstered by a one-word message from Massachusetts Senator Charles Sumner—“Stick”—Stanton proceeded to barricade himself in his office! For a second time, Johnson removed him. A day later, on February 22, the spokesman for the powerful House Reconstruction Committee, Thaddeus Stevens, drafted 11 counts of impeachment—two dealing with Johnson’s indecorous behavior toward Congress, the others with the Tenure of Office Act.

(Particularly in the early 20th century, when he was caricatured as “Austin Stoneman,” the wild-eyed Congressman in D.W. Griffith’s Birth of a Nation, Stevens came in for harsh criticism. His constitutional case against Johnson was not the strongest, but this passionate supporter of civil rights long before it was fashionable deserves better than his caricature, and is finally getting his due from historians.

Though nearly all of his correspondence with his household “servant” has been lost, the few letters that survive strongly suggest that the two were an interracial couple in a common-law marriage. When Stevens died, not long after the impeachment proceedings, he was buried, at his insistence, in the only cemetery in his area that allowed for the interment of blacks and whites together, so he could lie next to the woman he loved.)

Partly because the coming trial had such important implications for the later impeachment proceedings against Nixon and Clinton, and partly because the trial was such a fascinating spectacle in itself, I will be returning to the story of Johnson’s impeachment at a later point. Suffice it to say for now that his prospects for acquittal were enhanced immeasurably by his legal team’s insistence that he stay silent and not even attend the proceedings – his mere presence might have tipped the balance decisively against him.

Johnson, like all Presidents, had the responsibility to ensure that the laws be “faithfully executed.” The Radical Republicans surely overreached with impeachment charges based on the Tenure of Office Act, but Johnson just as surely failed in that constitutional duty.

Appreciations: Sondheim’s “Company” on PBS

There’s a song in the Stephen Sondheim musical Merrily We Roll Along called “Our Time,” but for the next week, the composer could be forgiven for thinking it’s “My Time.” Filmgoers are taking in Johnny Depp’s typically spot-on performance in Sweeney Todd; a revival of Gypsy, his youthful 1959 collaboration with Jule Styne, is preparing to open on Broadway soon; Sunday in the Park with George has just opened to critical acclaim. (I saw it last weekend, and will have more to say on it in the near future.)

But the immediate cause of this post is the PBS airing of Company, as part of the Great Performances series. Though John Doyle’s re-imagining of the classic 1970 musical was shown a few days ago on Channel 13, it’s bound to be repeated in the New York metropolitan area within the next week, either on that station or another public broadcasting station in the area. By any means necessary, watch it.

The taping is a pleasant reminder for me of the day I attended the show—its July 1, 2007 final performance at the Ethel Barrymore Theater. From first to last, these were the most ecstatic, moving, memorable hours of musical theater I’ve ever experienced—more like a rock show than Broadway, if you can believe it.

As the lights came up, a roar went up from the audience, which proceeded to cheer wildly throughout the show, urging cast members to heartfelt performances. By the end, when Raul Esparza, in the central role of 35-year-old bachelor Bobby, had concluded the climactic “Being Alive,” the applause had become thunderous, going on and on for five minutes, finally bringing the actor to tears.

Doyle’s staging of this musical resembles what he did with Sweeney Todd: The entire cast not only acts and sings but plays musical instruments—with one significant difference: Bobby does not play any instrument until “Being Alive,” when his piano playing—at first slow and tentative, finally relentless and impassioned—underscores that he’s ready to stop being a romantic bystander and will now participate fully in life and love.

Sondheim’s songs here—which are not only central to his own career songbook but, I’d suggest, also to the history of American musical theater—still stand the test of time 38 years after the fact. The book, by George Furth (a character actor perhaps best known for playing the persnickety railroad clerk in Butch Cassidy and the Sundance Kid), holds up less well.

I was bothered not so much by its ramshackle form (Furth wrote it as a revue until Hal Prince advised him to turn it into a more conventional musical) as by Doyle’s attempts to modernize some aspects of the show while leaving several anachronisms intact. For instance, it may have been considered “cool” for a middle-aged couple to get stoned in the late ‘60s and early ‘70s, but that has so lost its shock value that, I would bet, drug use in this age group is actually down today. Moreover, the couple in the hilarious “Not Getting Married Today”—a Jewish groom and Catholic bride—are nowhere near as uncommon now as then, given Catholicism’s loosening of restrictions on marriage outside the faith.

But this play is performed—and deserves to be performed again and again—because it is Sondheim at his zenith. Each song offers each member of the ensemble cast behind Bobby a chance to shine by providing a glimpse into that character. At the same time, each song seemingly represents an entirely different technical challenge that Sondheim has created for himself, then met. “Another Hundred People,” “Not Getting Married Today,” “The Ladies Who Lunch,” “You Can Drive a Person Crazy,” and “Barcelona” make extraordinary vocal demands on the cast, overturn audience expectations at every turn, or both.

Finally, Company puts paid to the cliché that Sondheim is all head and no heart. “Sorry/Grateful” and “Being Alive” are adult songs about everyone’s need for love and their terror at the emotional nakedness to which they are reduced by their longing. Company, together with the musical pastiche Sondheim and longtime producer Hal Prince created the following year, Follies, retains the ability to rend the heart even as other musicals woven inextricably into the fabric of their time, such as Hair and Rent, already begin to show their age.

Quotes for Oscar Night

From past award recipients:

"Welcome to the Academy Awards. Or as it's known in my house - Passover." –Bob Hope, perennial Oscar host, honorary winner, but never acting award winner

“I haven't had an orthodox career and I wanted more than anything to have your respect. The first time I didn't feel it, but this time I feel it and I can't deny the fact that you like me -- right now, you like me!”– Sally Field, upon winning her second Best Actress Oscar, for Places in the Heart

''If New York is the Big Apple, tonight Hollywood to me is the big nipple”—Bernardo Bertolucci, in accepting his Best Director statuette for The Last Emperor

“If I'd known this was all it would take, I'd have put that eyepatch on 40 years ago.” – John Wayne, upon winning his award for his role as “one-eyed fat man” Rooster Cogburn in True Grit

“I accept this very gratefully for keeping my mouth shut for once, in Johnny Belinda. I think I'll do it again.”—Jane Wyman, in accepting Best Actress for her role as a deaf mute in Johnny Belinda

“I'd like to thank everybody who ever punched or kissed me in my life, and everybody who I ever punched or kissed.”—John Patrick Shanley, upon winning Best Original Screenplay for Moonstruck

“This is the highlight of my day. I hope it's not all downhill from here.”—Kevin Spacey, after winning Best Actor for American Beauty

"This is the only naked man that will ever be in my bedroom." –Melissa Etheridge, clutching her Oscar statuette for Best Original Song for “I Need to Wake Up” from An Inconvenient Truth

"I'm in shock. And I'm so in love with my brother right now, he just held me and said he loved me."—Angelina Jolie, in accepting Best Supporting Actress for Girl, Interrupted

“I bet they didn't tell you that was in the giftbag.”—Adrien Brody, after kissing a startled Halle Berry on his way to accepting his Best Actor Award for The Pianist

“This is too - I mean, my hormones are just too way out of control to be dealing with this.”—A very pregnant Catherine Zeta-Jones, accepting Best Supporting Actress for Chicago

"Most of all, I want to thank my father, up there, the man who when I said I wanted to be an actor, he said, 'Wonderful. Just have a back-up profession like welding.' " --Robin Williams, after winning Best Supporting Actor for Good Will Hunting

"I guess this proves there are as many nuts in the Academy as anywhere else."—Jack Nicholson, in accepting his first Oscar, for Best Actor, for One Flew Over the Cuckoo’s Nest

“I want to thank the members of the Academy who were bold enough to give me this award for this song which, obviously, is a song that doesn't pussyfoot around or turn a blind eye to human nature. God bless you all with peace, tranquility and good will.”—Bob Dylan, after winning “Best Original Song” for his “Things Have Changed” in Wonder Boys

“I feel like diving into this ocean of generosity … I would like to be Jupiter and kidnap everybody and lie down in the firmament making love to everybody.”—Roberto Benigni, after jumping over chairs, rushing to the stage, and kissing Sophia Loren on his way to picking up his Best Actor statuette for Life is Beautiful
(Yeah, baby!)

Saturday, February 23, 2008

This Day in Medical History (Salk Starts Last Phase of Polio-Vaccine Program)

February 23, 1954 – As part of the third and most important pilot program of his major medical discovery, at Arsenal School in the working-class Pittsburgh neighborhood of Lawrenceville, Dr. Jonas E. Salk (pictured) immunized 137 five-to-nine-year-old schoolchildren with his new polio vaccine. The school was the first to participate in the city-wide experiment that within a month would inoculate 7,000 children.

The vaccine came none too soon: the summer of 1952 had produced the worst polio epidemic in U.S. history, with 57,628 people coming down with the disease. On the same day that Salk tried his vaccine out on the Arsenal students, The New York Times reported on a scientific panel’s conclusion that mass inoculations with gamma globulin of 185,000 children in 23 areas the prior summer had failed to produce demonstrably beneficial results against polio.

Salk developed his vaccine while working at the University of Pittsburgh’s School of Medicine. In his laboratories on the ground floor and in the basement, he used rhesus monkeys. According to Franklin Toker’s Pittsburgh: An Urban Portrait (1986), the walls “were said to bear the paw prints of animals that had scampered out of their cages.” (The building—the former Municipal Hospital—is now named in Salk’s honor.)

On April 12, 1955—ten years to the day after the death of the world’s most famous polio victim, President Franklin D. Roosevelt—Salk announced the successful conclusion of this medical experiment.

Though not well known nowadays, a major force behind Salk’s pioneering work was Basil O’Connor, FDR’s law partner. Establishing the National Foundation for Infantile Paralysis at his old partner’s request, O’Connor spearheaded the March of Dimes fundraising campaign that brought in $50 million a year ($250 million in 1995 dollars) by the late 1940s.

Intent on moving beyond supplying iron lungs, O’Connor saw killed-virus vaccines as an alternative to live vaccines, which had given researchers unexpected problems. He persuaded Salk to come to the University of Pittsburgh with the promise of a professorship.

O’Connor may have performed his greatest bit of service for Salk, however, in a case of medical politics. Twice, Salk saw his applications for funding for the killed-virus vaccine rejected by the scientific advisory committee of the National Foundation for Infantile Paralysis. On Salk’s third try, the committee told O’Connor that it would not stand in the way of the doctor’s grant.

By 1960, Salk’s vaccine had all but eliminated polio from the general population.

When O’Connor died in 1972, Salk paid tribute to his old benefactor at a memorial service: “In 80 years, what centuries could not do, he did…He used time well and time did not misuse him. Hence, miracles were wrought. His accomplishments are woven into the fabric of existence. He made history happen.”

Quote of the Day

“It is our responsibilities, not ourselves, that we should take seriously.” – Actor Peter Ustinov

Friday, February 22, 2008

This Day in American History

February 22, 1819—The Adams-Onis Treaty, also called the Transcontinental Treaty, was signed between the United States and Spain, drawing a definite boundary between Spanish-held lands in North America and the United States.

In the election of 1824, supporters of Andrew Jackson claimed that John Quincy Adams secured enough votes in the House of Representatives to win the Presidency through a “corrupt bargain.” In exchange for being appointed Secretary of State, they alleged, Henry Clay threw his support behind Adams. The charge led four years later to an even more vitriolic Jackson-Adams rematch that Adams lost.

What has often been forgotten in the years since is that before their Presidential ambitions divided the two men, Adams had not only done Jackson a good turn, but had even saved his career. I’ve always felt that it was a particular blind spot that Jackson, usually loyal to a fault, did not appreciate what Adams did for him.

In a larger, more important sense, Adams converted an embarrassing international incident into a diplomatic triumph for the United States. It is just one of several achievements that have led many diplomatic historians to regard him as America’s greatest Secretary of State.

In 1817, the Seminole tribe launched raids into Georgia from Spanish-held East Florida. President James Monroe, convinced that the Pinckney Treaty required Spain to restrain Native American incursions, believed that moderate pressure could eventually persuade the faded colonial power to cede the land to the United States.

Into this situation stepped Jackson, who on January 6, 1818, blithely offered the President his views: not only could Amelia Island, an outpost for pirates, slave traders and smugglers, be seized, but East Florida as a whole could be taken to indemnify the U.S. for past outrages. “This can be done without implicating the government. Let it be signified to me through any channel…that the possession of the Floridas would be desirable to the United States, and in sixty days it will be accomplished.”

The letter received no response. Monroe, seriously ill, passed it along to acting Secretary of War William Crawford and the incoming secretary, John C. Calhoun. Six months later, the President arrived in Washington to discover a media firestorm about the general. In late April, Jackson had executed two British subjects for inciting the Native Americans against the United States, then, less than a month later, had attacked Pensacola, forcing the departure of the Spanish governor and his troops to Havana.

The Spanish representative to the United States, Luis de Onis y Gonzalez, hastened to Washington from his vacation to lodge a bitter protest. Monroe conferred with his cabinet in July to consider what to do.

Most of the cabinet, none more vehemently than Crawford and Calhoun, called for publicly repudiating Jackson, believing his seizure of Pensacola to be an act of war. Adams disagreed. In subsequent discussions with Onis, the Secretary of State insisted that if Spain could not restrain the Native Americans, it should yield the land to a country that could: the United States. The trump that Adams held was growing colonial unrest that sapped the Spanish monarchy’s will.

Under the treaty that Adams negotiated, the U.S. gained possession of Florida for $5 million, to be paid to Americans with claims against Spain; the boundary between the Louisiana Purchase region and Spanish Texas was set to American advantage; and Spain surrendered any claim to Oregon.

The agreement removed major impediments to U.S. expansion westward—and by persuading Monroe that Jackson need only be privately chastised for exceeding orders rather than publicly repudiated, Adams spared his boss a political mudbath and the general a possible loss of his command and the prestige he had won with the Battle of New Orleans. Not a bad piece of work, all around.

Quote of the Day

“Well I used to be disgusted,
But now I try to be amused.” – Elvis Costello, “(The Angels Wanna Wear My) Red Shoes”
(A thought to ponder while contemplating the possibility of the usual quadrennial Presidential election scandal)

Thursday, February 21, 2008

This Day in Native American History

February 21, 1828—The first newspaper in a Native American language, the Cherokee Phoenix, was published in New Echota, Georgia. The publication, which remains in existence today, was part of a wider movement among the five Southern or “Civilized” Tribes (Cherokees, Choctaws, Chickasaws, Creeks, and Seminoles) toward white cultural patterns, including raising large cash crops on plantations and modeling their tribal government after the federal system.

The Phoenix used the 86-character Cherokee syllabary known as “Talking Leaves,” created by the Native American linguist Sequoyah and rapidly adopted because of its ease of use. The paper’s editor was the twentysomething (accounts differ as to his exact birth date), Moravian-educated Gallegina (Buck) Oowatie, who took the name Elias Boudinot in tribute to the writer-poet-statesman who served as President of the United States under the Articles of Confederation.

Boudinot’s tragedy epitomized the larger tragedy of white-Native American relations during the 19th century. In 1829, he and friend John Ridge advocated the death penalty for any Indian giving away Cherokee land. Only three years later, however, worried about the threats posed by white encroachment, Boudinot reluctantly came to believe that relocation was the only reasonable alternative to tribal extermination. The decision opened a breach with Cherokee leader John Ross, who forced his resignation from the Phoenix.

In December 1835, 20 members of the “Treaty Party” met in Boudinot’s home and signed the Treaty of New Echota, relocating the tribe to the Cherokee Nation in what is now Oklahoma. On June 22, 1839, Boudinot was working on a translation of the Bible with Ridge and Ridge’s father when fellow Cherokees decided to impose the penalty Boudinot had advocated a decade earlier for ceding tribal land. Taken completely by surprise, the three were stabbed and tomahawked by members of the John Ross faction.

More than two decades after Boudinot’s death, his brother, Stand Watie, became a brigadier general in the Confederate Army—the only Native American to become a general in either the Confederate or Union Armies during the Civil War.

Movie Quote of the Day

Otto Ludwig Piffl (a young East German Communist, suddenly disillusioned): “Is everybody in this world corrupt?”
Peripetchikoff (a defector): “I don't know everybody.”
--Horst Buchholz (Piffl) and Leon Askin (Peripetchikoff), in One, Two, Three (1961), starring James Cagney; screenplay by Billy Wilder and I.A.L. Diamond

Wednesday, February 20, 2008

This Day in Literary History


February 20, 1852—The Springfield (Mass.) Republican published “sic transit gloria mundi,” a mock valentine by Emily Dickinson—the first ever printed in that paper by the twenty-one-year-old poet, and one of only 11 published out of an estimated 1,700 written by the mysterious “Belle of Amherst” during her lifetime.

Like these surviving gems, “sic transit gloria mundi” was sent by an admirer without the poet’s permission—“love turned to larceny,” in the words of sister-in-law and neighbor Susan Dickinson.

Emily mailed the poem to William Howland, a young clerk in her father’s law office, who was so taken with it that he submitted it to the Republican. Almost a week later, she was embarrassed to find her work printed, and took steps to make sure her father did not discover it. This proved easy to do, as the poem had been printed anonymously.

The Shrine of the “Belle of Amherst”
Several years ago, I visited the Dickinson Homestead, now a beloved shrine in Amherst, an academic community in Massachusetts’ Pioneer Valley.

In a rectangular, Federal-style home at 280 Main Street, Emily Dickinson (1830-1886) lived for all but 14 of her 55 years—cooking, gardening, caring for her parents, and seeing precious few visitors. Most of all, she molded verses of bewitching power.

Samuel Dickinson, the poet’s paternal grandfather, built what is believed to be Amherst’s first brick residence here in 1813. His bankruptcy led to the loss of the home during the poet’s youth. When her father, Edward Dickinson, finally repurchased it in 1855, he transformed it into a fashionable Italian-style villa. Inevitably, the homestead’s cupola, symmetrical windows, slight elevation, and stand of trees in front suggested reason and rectitude, Edward’s bulwarks against disorder.

Today the Dickinson Homestead belongs to Amherst College, which operates the National Historic Landmark as a museum. A constant stream of Dickinson aficionados like me makes pilgrimages to the home.

“The heart has many doors,” Dickinson wrote. I came here hoping to find the one that would dissolve her enigma. At best, like so many other visitors, I was only partially successful.

The Materials of Domestic Life
A short tour of the property reveals how Dickinson managed to construct a universe from the simplest domestic materials. Her poems are filled with locked doors, chimneys, candles, church bells, meadows, mountains, bees, lamps, and flowers—all items she could see in her house, in her backyard, or through her window.

Amherst’s major institutions press close to the Homestead, providing anchors for the rest of her family, anchors that Dickinson herself rejected. The house stands only a few hundred yards from Amherst College, which Samuel Dickinson helped found with an endowment so generous that it burdened his descendants with debt.

Across the street tower the steeples of the First Congregational Church. Dozens of Dickinson’s friends and family members joined the denomination as part of the Protestant revival sweeping mid-19th century New England. Yet Emily, in quiet but unmistakable defiance, refused to join the converts. She saw the church her brother had helped build only after midnight, when nobody else was around.

Her poems’ multiple ironies trace back to Dickinson’s ambivalent relationship to the Homestead. Rather than this house, the poet preferred a white clapboard dwelling, where she lived from ages 10 to 24, next to the local burial ground. (A Mobil station stands today on the site of the old North Pleasant Street home, which fell victim to the wrecking ball in the 1920s.) This was the home where she wrote “sic transit gloria mundi.”

The North Pleasant Street dwelling represented her brief window of social activity: going to dances, calling on friends, and attending book club readings and concerts. Perhaps not coincidentally, her mother fell into a prolonged illness after the move back to Main Street—and Dickinson embarked in earnest on her poetry.

Dickinson only ventured out of the Homestead once in her last two decades, to attend the funeral of her beloved nephew Gilbert or “Gib.” The house was not hers, she said, but “my father’s.”

Beyond “my father’s grounds” stood the homes of relatives and friends who received her many gifts—not just picked flowers and gingerbread cakes (lowered by basket from her bedroom window to delighted neighborhood children below), but her short verse.

To preserve many of her poems, Dickinson took writing paper, wrote on both sides, and tied it all together with string. Forty of these booklets, called “fascicles,” each containing 25 or so poems, were discovered after her death. Guides hold up copies of these fascicles during tours, noting that the poems are untitled.

The Riddles of a Poet’s Life
Despite contemporaries’ near-unanimous agreement about Dickinson’s wit, intelligence, warmth, and generosity, she increasingly cloistered herself as the years went on. That stance contrasted strongly with the public careers of three generations of Dickinson men (the poet’s grandfather, father, and older brother Austin), all of whom became lawyers.

This seclusion has fueled one of the great debates about the poet. Some psychologists have ascribed to her all kinds of neuroses and shocks, from agoraphobia to rape. Feminist scholars, by contrast, see in Dickinson’s withdrawal a conscious strategy to circumvent a patriarchal society that undervalued feminine intellectual and creative endeavors.

Whatever the case, by her 20s, she could conquer her fear of the outdoors only by walking her enormous dog, Carlo, through her father’s 11-acre hay field across the street.

On the other hand, in the backyard, protected from onlookers, Dickinson could work undisturbed in her garden. Her mother nurtured this interest, and Dickinson deepened it by taking botany at Amherst Academy. A conservatory allowed her to care even in winter for such plants as daphne, jasmine and wildflowers. (It was torn down in 1916 for safety reasons.)

The Dickinsons kept several servants, mostly Irish, during their second stay here. One, Tom Kelly, served as one of Emily's pallbearers.

Guides take small groups through the house at regular intervals, answering a barrage of questions:

* How accurate a likeness is the daguerreotype of Dickinson in the parlor? (The photo--the only one taken in her lifetime--was made when she was recovering from an illness, so she appears more fragile than she probably was, and her naturally curly hair was straightened for the camera.)
* Did Dickinson withdraw from people because of a broken heart? (Most likely not: her seclusion may have grown out of her need to care for her mother, a chronic invalid who probably suffered from postpartum depression, and in her late 40s Dickinson seriously considered a marriage proposal from a judge whom she loved.)
* How did Dickinson die? (The death certificate cited Bright’s Disease, but recent medical speculation has pointed to congestive heart failure.)

“The Soul selects her own Society,” Dickinson wrote. Perhaps she sensed that her poems’ unconventional grammar and capitalization and her relentless questioning of God would not play well in her lifetime. Now, however, the reclusive poet speaks to and for thousands of readers, from generation to generation.

Quote of the Day

“The cure for writer's cramp is writer's block.” -- Inigo DeLeon
(Thanks to Brian for the suggestion)

Tuesday, February 19, 2008

This Day in Scientific History (Edison Patents Phonograph)

February 19, 1878 – Inventor Thomas Alva Edison secured US patent No. 200,521 for the phonograph, the first machine to record someone’s voice and play it back.

Edison came up with the idea in the second half of 1877 (the precise date remains uncertain, though he filed for the patent in December). He tested his cylinder phonograph with the old nursery rhyme, “Mary had a little lamb.”

“Of all my inventions, I liked the phonograph best,” Edison later claimed. At the time of his patent, however, he conceived of music recording as only a secondary use for his new machine. He expected the primary market would involve “Letter writing and all kinds of dictation without the aid of a stenographer.” But the hearing-impaired inventor was not so stubborn that he wouldn’t oblige his public.

Edison’s invention took the democratization of music to another level. From then on, people no longer needed to be satisfied with sheet music but could listen to the best of modern music (Caruso, Tchaikovsky, and the like) in their own homes.

Robert V. Bruce’s entry on “The Wizard of Menlo Park” in The Reader’s Companion to American History (edited by Eric Foner and John A. Garraty) points out that Edison was “clumsier in entrepreneurship than in invention, distracted by the demands of management, [with] his inventive genius ebbing with age.” What could he mean?

I got a pretty good idea of what Bruce was talking about when I visited the Edison National Historic Site in West Orange, N.J. The ugly red-brick buildings were where Edison and his group of “muckers” labored long hours to come up with more than half of his 1,000 inventions.

One anecdote illustrates Edison’s management style as he aged. One morning, annoyed with an associate, he told the man he was fired and should be off the premises by noon. By the end of the day, struggling with a knotty technical problem, he asked where the employee was. “You fired him this morning,” he was told. “Well, get him back,” the great man answered.

Quote of the Day

“The welfare of the people in particular has always been the alibi of tyrants, and it provides the further advantage of giving the servants of tyranny a good conscience.” – Albert Camus
(Good riddance to Fidel Castro, and may his brother Raul follow him quickly onto the ash heap of history.)

Monday, February 18, 2008

A Traveler's Tale for Presidents' Day


No matter how small the town, it's bound to have generated some history, often of far more than local interest. This lesson was reinforced for me anew last summer when I was visiting the Chautauqua Institution, in upstate New York.

Surprisingly enough, it wasn't Chautauqua itself, or even nearby Jamestown (home at one point to Lucille Ball, Supreme Court Justice Robert H. Jackson, and ornithologist Roger Tory Peterson), that was the site of a fascinating bit of Americana, but the nearby village of Westfield, the self-described “Grape Juice Capital of the World.”

As I was traveling one blindingly sunny mid-summer afternoon on Route 3 into this town of 3,400, my eyes caught sight of a pair of facing bronze statues in a small corner park. One statue depicted a small, young girl; the other, a lanky, bearded man. I couldn't recognize the girl, but there was no doubt at all about the man. (He's in the photo I took that accompanies this blog entry; due to lighting conditions that time of day, I could not fit the girl in as well.)

I was on my way into town, in the middle of my vacation, to get some photos developed. But the sight was so arresting that I just had to stop to find out: What on earth was a statue of Abraham Lincoln with a little girl doing in such a comparatively remote part (more than 60 miles inland from Buffalo) of upstate New York?

Janice Hogenboom, a reference librarian at Westfield's Patterson Library, helped me unlock this tale. My thanks to her for allowing me to view the library’s file of clippings on onetime resident Grace Bedell—the little girl responsible for one of the iconic images in American history.

(A word of advice to researchers of all levels of sophistication: Be kind to librarians. Take it from one who has been on both sides of the reference desk: It’ll pay you dividends!)

The Face of the President-Elect
For all of our current emphasis on image, it would be a mistake to believe that appearance never entered the minds of pre-20th-century statesmen. When the Second Continental Congress considered who should be appointed commander in chief of the army, George Washington showed up in the blue-and-buff uniform of his Fairfax (Va.) County militia. And Abraham Lincoln was fully conscious of his own appearance, using it as the subject of some of his best lines. (Accused by a detractor of being two-faced, Lincoln responded with a joke that brought down the house: “If I were two-faced, would I be wearing this one?”)

In her fine study of Lincoln and his cabinet, Team of Rivals, Doris Kearns Goodwin noted how his face could change. Unutterably sad in repose, it lit up so much when he told a humorous anecdote that throngs of onlookers gathered around him in taverns along the Illinois court circuit he traveled, creating a circle of friends that would be instrumental in his runs for the Senate and the Presidency.

Take a look at how Lincoln’s private secretary John Nicolay described that face in motion: “Graphic art was powerless before a face that moved through a thousand delicate gradations of line and contour, light and shade, sparkle of the eye and curve of the lip, in the long gamut of expression from grave to gay, and back again from the rollicking jollity of laughter to that far-away look.”

Unlike conventionally handsome Presidents such as FDR, John F. Kennedy or Ronald Reagan, who photographed as well in still images as in motion, Lincoln had a face that would have benefited dramatically from the coming of the motion picture.

A Mathew Brady photo, taken at the start of the 1860 campaign, depicted Lincoln as he had appeared for his entire adult life to that point: clean-shaven. His cheekbones protruded noticeably, casting dark shadows across the face that highlighted his frequent melancholy.

Posterity's opinion of the President may have been shaped irrevocably by his subsequent decision to grow a beard. Facial hair disguised the hollowed-out look of the cheekbones without covering over the deep-set gaze. Inevitably, facial hair also gave him the appearance of an Old Testament patriarch, inspiring a subsequent nickname: "Father Abraham."

The name might have been more appropriate than many realized at the time. Just as the biblical Abraham came to be revered as “the father of nations” by the three monotheistic traditions of Judaism, Christianity and Islam, Lincoln became the father of “a new birth of freedom” in what one of my Columbia University professors of history, James Shenton, liked to refer to as the “Second American Republic.”

A Little Girl’s Suggestion
During the campaign of 1860, the Bedell family of Westfield were ardent supporters of "Honest Abe." After a poster of the candidate was brought home, 11-year-old Grace told her mother that he would look much better with whiskers, then set down her opinion in a letter to Lincoln.

Of the many letters the busy candidate received in this stressful time, this one caught his attention. As he told Westfield notable G.W. Patterson, the letter “so differed from the many self-seeking and threatening ones I was daily receiving that it came to me as a relief and a pleasure.”

In mid-October, Lincoln replied to the young girl. After writing that it was too bad he did not have any daughters, he turned to her suggestion: "As to the whiskers, having never worn any, do you not think people would call it a piece of silly affectation if I were to begin it now?"

But the idea began to appeal to Lincoln, and after the election he began to grow a beard in earnest. By January 1861, it had grown so substantial that his young aide (and future biographer) John Hay even composed a couplet on it.

Lincoln stopped at Westfield as part of a 12-day railroad tour of the Northeast before his inauguration. Throughout the tour, the President-elect made sure he said as little as possible about the conflict everyone knew was coming.

Under normal circumstances, Lincoln was not fond of off-the-cuff speaking. Now, with the question of war in the balance, he did not want to risk a gaffe that would exacerbate an already troubling situation. (Emotions were so high that Lincoln’s personal safety was jeopardized: Detective Allan Pinkerton persuaded him to take a night train through Baltimore, a hotbed of Confederate sentiment, to avoid an assassination plot he had uncovered.)

All the more reason, then, to keep the Westfield appearance more along the lines of a photo op than a major policy address.

When Lincoln's train pulled into town, young Grace went to see it with two older sisters. At first, the large crowd blocked her view of the tall President-elect. But when he asked that, if she were present, she step forward, the boyfriend of one of her sisters led her up to him.

Stepping down from the platform of the railroad car, Lincoln took the girl’s small hands in his large ones, stooped down from his immense height and kissed her on the cheek, saying, “You see I let these whiskers grow for you, Grace.”

The unexpected attention from the candidate and the cheering crowd so embarrassed the girl that she forgot all about the bouquet of roses she was going to present him. She ran home, speaking to no one.

For all her consternation, however, she did not forget, then or subsequently, the expression on the great man’s face as he bent down toward her. It was so characteristic of what everyone else said of him: “He seemed so very kind but looked so very sad.” He hadn’t even started his new job yet, but, as he told his old Springfield, Illinois neighbors in bidding them farewell, he knew he faced “a task before me greater than that which rested upon Washington.”

A Footnote to the Lincoln Legend
Surprisingly enough, this was not the only letter that Grace would send Lincoln. In 1864, now “grown to the size of a woman,” she wrote him again after financial reverses cost her father nearly all his property, requesting a job in the State Department, where, she had heard, a number of young girls were employed at good wages cutting Treasury notes. This time, she received no reply.

The historian who discovered this second letter in the National Archives in March of last year believes it never reached the President.

Four years later, Grace married a man named George Billings and moved with him to Delphos, Kansas. They encountered the usual hardships of prairie life and had another close encounter with American history (George became friendly with “Wild Bill” Hickok), but they eventually settled into a comfortable life—George as a banker, Grace as (this is the part I love!) the town’s first librarian.

Over the years, as reverence for the martyred President grew, every bit of lore associated with his life became fodder for journalists. For her last six decades, Grace Bedell Billings recounted to them her 15 minutes of fame. She died in 1936, at age 87.

As I faced the two statues in Westfield’s pocket-sized Lincoln-Bedell Park, I did not feel the same sense of rapt awe I experienced in the presence of the Lincoln Memorial, that immense secular temple on Washington’s great national mall. But I did find these figures located far more in a specific moment, far more approachable, and far more human than Daniel Chester French’s masterpiece.

And all because I took an unexpected drive in a remote town…

This Day in European History (Gestapo Arrests 'White Rose' Dissidents)

February 18, 1943 – The first members of the “White Rose” anti-Nazi group, centered at the University of Munich, were arrested by the Gestapo.

Since 1942, as the pace of atrocities quickened and the German army became bogged down at Stalingrad, these nonviolent activists, consisting mostly of medical students, had surreptitiously printed and distributed pamphlets calling for an end to Hitler’s regime, and had even scrawled large graffiti all over Munich: “Down with Hitler! . . . Hitler the Mass Murderer!” and “Freiheit! . . . Freiheit! . . . Freedom! . . . Freedom!”

Yet for months, despite the most strenuous efforts, the Gestapo couldn’t locate the source of this insurrectionary activity, even though it correctly guessed that the group had access to a duplicating machine as well as large quantities of paper, envelopes and postage.

On this date, however, two members of the group, siblings Hans and Sophie Scholl, both former members of the Hitler Youth, were caught distributing leaflets at the university. Four days later, the siblings, along with their close friend Christoph Probst, were executed after a farce of a trial. Later, three other group members met the same fate.

Today, a square at the University of Munich is named after Hans and Sophie Scholl. A documentary on the group was released in the 1980s, and more recently the film Sophie Scholl: The Final Days earned a much-deserved Oscar nomination for Best Foreign Language Film. (The interrogation scenes were based on actual Gestapo records that became available only after reunification.) The group has taken its place of honor among those who set their faces against Nazism, including the Jesuit Alfred Delp, the Lutheran theologian Dietrich Bonhoeffer, and the leader of the plot to kill Hitler, Count Claus von Stauffenberg (the subject of the upcoming Tom Cruise film that has caused such controversy).

Two of the finest foreign-language films of recent years (two of the finest films, period) dealt with Germany’s experiences with 20th-century totalitarian regimes and the resistance formed against them: Sophie Scholl and The Lives of Others, last year’s Oscar winner for Best Foreign Language Film, about East Germany under Communism.

Why aren’t more such thoughtful, serious movies made? More to the point, why aren’t they made in Hollywood? Part of the problem might stem from the attitudes of Hollywood’s finest stars.

In a New Yorker profile last year, Julie Christie remarked, “I’m not sure I can bear to see a film they gave the Oscar to, that tells you what awful people Communists are.”

Having just watched her Oscar-winning performance in the 1965 film Darling, I know how superb an actress Christie is, and I’m waiting patiently to see how, four decades later, she has transformed herself into an Alzheimer’s patient in Away From Her.

But comments like these are simply fatuous—particularly from one such as Christie, who chose to appear in tripe like the Brad Pitt film Troy but somehow thinks it beneath her to watch a film that might say something essential about her times.

You have to wonder why she said this. Is it because there are already too many films on this subject? But by the same reasoning, why have another film about Alzheimer’s after the similarly themed Judi Dench film Iris?

This leads logically to the conclusion that the actress might be suffering from either an intellectual or moral deficit. But none of the interviews that Christie has given over the years, nor her longtime anti-nuclear and peace activism, can lead one to believe that she is anything but intelligent.

That leaves us with a moral deficit. How sad that a woman who came to personify all the allure of London in the 1960s possesses such a terrible blind spot. Clearly, she wasn’t paying attention to the part of the script in Doctor Zhivago where her character Lara's friend Pasha (Tom Courtenay), an idealistic revolutionary turned remorseless Communist functionary, says: “History has no room for personal feelings.”

But Christie is hardly unusual in the filmmaking community in her moral failure. Oliver Stone made an admiring documentary about Fidel Castro, Comandante, and has stated that he admires him “because he’s a fighter.” A few years ago, The Motorcycle Diaries related the evolution of Che Guevara’s political thought while on a journey, with no attention, except for a caption at the end, to his career with Castro. (This is a little bit like making a film about Josef Goebbels at the University of Heidelberg and his activities as poet, playwright and novelist without getting into that unfortunate association with Adolf Hitler.) The Robert Redford movie Havana also presented a highly romanticized, Casablanca-influenced version of the comandante’s rise to power.

In fact, the first film I can recall that took Castro to task was Before Night Falls, in which the Cuban dictator was criticized for his cruel anti-gay regime. Yet even before that, Castro had appropriated private property, interfered with religious institutions, refused to allow free and open elections, and jailed and, when the need arose, executed political opponents.

Why has the film community so rarely, if ever, covered this? It can’t have anything to do with a lack of witnesses: my own elementary and secondary schools were filled with the children of refugees from his tyranny, and I’m sure there are hundreds of similar schools across the country.

The fact is that nearly 20 years after the end of the Iron Curtain (and more than 60 years after the so-called “Thousand-Year Reich” met its inglorious Gotterdammerung), the world needs to know all the history it can about totalitarian regimes of all kinds.

In his great elegy “In Memory of W.B. Yeats,” W.H. Auden urged his fellow poets: “In the prison of his days,/Teach the free man how to praise.” That same responsibility, to tell the truth about totalitarianism no matter what the cost, as the White Rose group did, falls to artists of all kinds, including Ms. Christie, and especially so in this land where today we celebrate the two men who gave us a new republic and “a new birth of freedom.”