Tuesday, June 03, 2014




ART AND THE AMERICAN CENTURY

Jerry Harkins



On December 5, 1985, the New York Philharmonic led by its Laureate Conductor, Leonard Bernstein of Lawrence, Massachusetts, presented an extraordinary program consisting of three symphonies: the Third of Roy Harris of Chandler, Oklahoma, the Third of William Schuman of Manhattan and the Third of Aaron Copland of Brooklyn.  This was distinctly American music of a high order—three serious, engaging works by contemporary American composers conducted by the first American-born Music Director of a major American orchestra.  All three works had been premiered by Serge Koussevitzky and the Boston Symphony Orchestra in 1939, 1941, and 1946 respectively.

Everything about that concert—the program, the conductor and the fact that it was happening in New York—said that America had come of age musically.  One hundred forty-eight years after Ralph Waldo Emerson claimed that “…our long apprenticeship to the learning of other lands draws to a close,” his assertion seemed at last to be credible. [1]  It will be noted that all four principals were protégés of Koussevitzky, who was a Russian immigrant.  Both Copland and Harris had been students of the French composer Nadia Boulanger.  All of them, teachers and students, had deliberately set out to develop and nurture a classical musical idiom that was specifically and recognizably American.  It was their success Bernstein was celebrating that night.

National styles in the arts are not, of course, unusual and were, in fact, flourishing in several countries during the same period, notably in England, France, Hungary, Finland and the Soviet Union.  Throughout the Western world, classical composers were mining their own national folk traditions.  European painters, sculptors and writers were also taking inspiration from aboriginal material that was seen as emblematic of the subconscious archetypes being described by psychologists and anthropologists.  But everywhere else, such national or ethnic sourcing was peripheral.  In America, it was the mainstream.

It was also pervasive.  Eugene O’Neill’s plays and Martha Graham’s dances pioneered the exploration of psychological and anthropological themes.  Gertrude Stein, Vachel Lindsay, Wallace Stevens, E. E. Cummings, Langston Hughes and Robert Frost gave poetry its modernist form and function.  In architecture, Frank Lloyd Wright was already building a second wave of masterpieces in the late 30’s while the best Europeans were unable to get their designs executed.  Eliel Saarinen and his son Eero left Finland for America in 1923 and were followed by Walter Gropius and Mies van der Rohe in 1937.  Europeans learned filmmaking from D. W. Griffith, Mack Sennett and Charlie Chaplin.  In spite of the Bauhaus, European design had become remote, academic and soulless in the aftermath of Edwardian excess.  Thus, the brilliant post-war European designers had to learn the modern craft from Americans like Walter Teague, Norman Bel Geddes, Henry Dreyfuss, Raymond Loewy and Brooks Stevens.

Thus it is fair to see the Bernstein concert as both a celebration and an assertion of American ascendancy.  New York was suddenly the center of the art world.  New York:  big, boisterous, smart, anxiety-ridden, electric, busy as hell.  Some European countries suddenly felt the need to enact laws against American “cultural imperialism” while their intellectuals heaped scorn on American philistinism and gaucherie.  By the 1950’s, they were talking about American “culture” as though it meant only movies and television, theme parks and rock and roll.  Some added morals and values to the pot.  Interestingly, they rarely spoke the same way about the visual or performing arts, possibly because they considered these beneath their notice.  In spite of everything, the world became increasingly Americanized, especially in the arts and sciences, in economics, language and cuisine.

“Specifically and recognizably American” is a phrase sure to get a writer in trouble.  It is debatable on many levels but, for the moment, it is sufficient to point out that a distinct American idiom was precisely what the artists of the era thought they were pursuing.  Copland, for example, returned from his three years with Boulanger in 1924 and immediately wrote his first symphony with its blues-infused scherzo.  Of it, he said his intention had been, “…to write a work that would be immediately recognized as American in character.” [2] He later experimented with serialism and with the neo-classicism of Stravinsky but the body of his work is unabashedly American.  Pieces like “Appalachian Spring,” “Rodeo,” and “A Lincoln Portrait” could not and would not have been written anywhere else.  Even the Europeans who made use of European folk themes—composers like Bartók and Vaughan Williams—wrote music that could never be confused with the Americanist style.

It is admittedly difficult to define the elements of “Americanism” in music but it certainly begins with the use of jazz harmonies and rhythms.  Copland’s appreciation of jazz may have been ambiguous but, even in his more formal works like “Quiet City,” there is a blues sensibility that imparts an introspective feeling like that of a painting by Edward Hopper or Andrew Wyeth.  More obvious examples could be multiplied in the works of George Gershwin, Leonard Bernstein, Virgil Thomson and Gunther Schuller.  Thomson was a dedicated modernist but even his experimental operas Four Saints in Three Acts and The Mother of Us All draw from American folk themes and employ jazz riffs.

Jazz had its roots in the music of the African slaves in the American South.  They brought with them the blue notes, improvisation and complex rhythmic patterns of African music and applied them to their call and response work songs and to the hymns and popular songs of their new environment.  After emancipation, jazz practice coalesced in New Orleans before spreading north and diversifying into multiple styles.  Within thirty years, it had become the defining characteristic of the American songbook and of much classical music.

But there is more to American music than jazz.  There is a narrative hallmark that derives from three peculiarities of the American experience:  ethnic diversity, the frontier, and the commitment of the immigrants to the future.  The Europeans who settled the land, even those who arrived with little but the clothes they wore, were never Emma Lazarus’ “huddled masses” or “wretched refuse.”  Many, of course, were destitute but all were proud, strong, defiant people with the courage to test themselves against the unknown in search of a better life.  Until the nineteenth century, the lives of ordinary people did not change much from century to century and people generally did not expect or value change.  But the frontier seduced the immigrants.  Long after it ceased to exist physically, there remained the idea of a better future and a social contract that valued change and experimentation.  They had the sense of being part of the Novus ordo seclorum, a new order or new beginning of the ages.  They were and are a religious people but, from the earliest days, they were divided between hellfire and brimstone fundamentalists and less dogmatic modernists.  Their politics were often raucous and just as often corrupt but they valued democracy and made it work for them.  They insulated themselves against the worst of their frustrations through a healthy sense of humor.  Americans invented the joke, the one-liner, the tall tale, slapstick and the situation comedy.  They developed a culture that was individualistic, innovative, bold or brash, pragmatic and not classless but mobile as to class.

There had never been anything like it.  When they came to translate these values into cultural pursuits, they became the first to embrace the idea that the only purpose of art is art.  American art was almost never esoteric or rarefied even if it occasionally ventured into obscurity.  It can be amusing to read commentaries of critics who think art should flow from academic ideologies.  The works of painters like Grant Wood, Edward Hopper and Norman Rockwell are nowhere near as complex as they can be made to seem.  The self-identity of critics is typically rooted in their investment in esoteric theory which may explain why they so often denigrate American artists and audiences.

If all this sounds like the Magic Kingdom, so what?  You don’t have to love it or even approve of it.  As Simone de Beauvoir wrote, “America is a pivotal point in the world where the future of man is at stake.  To like or not like her—these words have no sense.  Here is a battlefield, and one can only follow with excitement the struggles she carries on within herself, the stakes of which are beyond measure.” [3] The struggles were and are mighty.  The immigrants encountered the vicissitudes of the business cycle with its periodic panics and depressions.  They discovered that America was as much about the Dust Bowl as amber waves of grain.  It was about slavery and Jim Crow as much as the pursuit of happiness.  It displayed strong streaks of Puritanism and elitism and its intellectuals were always happy to denounce as shallow what they saw as the most overt symbols of its culture—Madison Avenue, strip malls and the Golden Arches. [4] Still, after all the disappointments, failures and outrages have been catalogued and castigated, there remains a sense that anything is possible and everything should be given a hearing.

And so it happened that during the first half of the twentieth century America emerged from nowhere to become a dominant force in virtually all the arts.  In each case, there had been a deliberate attempt to create something new that would be marked by a distinctly American character.  It happened suddenly.  America, and especially New York, became the world’s cultural center more rapidly and more completely than even Athens in the Golden Age or London in the Elizabethan Era.  Pretty much the same revolutionary process unfolded in music, painting, drama, literature, dance, design, architecture, photography, fashion and film.  And it tended to happen the same way:  a talented cadre of young native born Americans and older, already prominent immigrants came together in New York with the deliberate goal of breaking with the past and creating radically new art.  The art they sought would derive from, depict and explain the inexplicable character of America.  For painters, the cubism and surrealism of Europe were interesting but irrelevant.  For musicians, atonality was too intellectualized to comport with the directness, the high contrast of the frontier experience.  Europe was too refined, too precious.  America was, in Walt Whitman’s words, “blithe and strong.”  Its people were “…singing with open mouths their strong melodious songs…robust, friendly, clean-blooded, singing with melodious voices, melodious thoughts.”

Whitman reminds us that the American upsurge did not occur ex nihilo.  It was thoroughly if not exclusively grounded in Western, which is to say European, esthetics.  Aristotle and perhaps even Plato would have been comfortable debating with the abstract expressionists of the Cedar Tavern.  In addition, the Americans were standing on the shoulders of giants.  Beginning in the middle of the nineteenth century, European writers, composers and painters had begun the modernist break-away from traditional forms.  This happened at the same time as, and to some degree because, church and state were becoming less relevant as patrons of the arts and artists were free to court their own muses.  Their status and that of art itself grew and the artists gradually became cultural leaders.

Just as American art was flourishing, Europe descended into moral and esthetic desolation.  World War I had been an abattoir, a danse macabre that left society and culture as well as millions of people on the slaughterhouse floor.  The effects were traumatic.  Between the wars, European intellectuals inside and outside the academy were derailed by the efforts of the existentialists to rationalize the barbarity and by the decadence that infected much of popular culture.  For all its youthful rebellion and sexual latitude, the Jazz Age in America was far less dissolute, less introspective.  Angst was replaced by exuberance fueled in part by the near universal defiance of prohibition.  The war had been shorter for Americans and had been fought far from home.  The economy survived and prospered and, even after the Great Depression arrived, Americans were able to mount a response that did not destroy hope.  The WPA’s Federal Art Project alone employed thousands of painters and sculptors and tens of thousands of writers, architects and musicians.  Today, the iconic images of the Depression are the bread lines, the Dust Bowl and the photography of Dorothea Lange, Walker Evans and Berenice Abbott.  The Yip Harburg lyrics for “Brother, Can You Spare a Dime?” might be remembered as the anthem of privation, but from Jimmy McHugh and Dorothy Fields’ “On the Sunny Side of the Street” (1930) to Harburg’s “Over the Rainbow” (1939), almost all the popular music remained upbeat and forward-looking.  The immortal Woody Guthrie’s 1940 song “This Land Is Your Land” was written because Woody thought Irving Berlin’s “God Bless America” was too soupy.  But “This Land” is at least as optimistic if somewhat edgier. [5]

Americans entered the years between the wars as the heirs, not only of late nineteenth century Europe but also of a long list of both immigrant and home grown artists of startling originality.  Among the latter are Emily Dickinson, Walt Whitman, and Mark Twain.  In classical music, Charles Ives was arguably as much the father of modernism as Wagner, Schoenberg or Stravinsky. [6] If there was a single trait that linked American innovators, a case could be made that it was eccentricity.  Indeed, eccentricity and innovation are closely related and both are on intimate terms with the risk and the reality of failure.  The twentieth century in America witnessed an explosion of truly radical genres in all the arts.  In music, these included a variety of methods of “composing” by means of random or other “aleatoric” processes for placing notes on paper with or without staves.  In some cases, the “composer” would have the players determine the notes using the readings of the I Ching.  John Cage did exactly that in his “Imaginary Landscape No. IV” of which he wrote, “It is thus possible to make a musical composition the continuity of which is free of individual taste and memory (psychology) and also of the literature and ‘traditions’ of the art.”  The ultimate expression of his philosophy is 4’ 33” which consists of four minutes and thirty-three seconds of silence in three movements.  These experiments had their parallels in the work of avant-garde choreographers.  Isadora Duncan broke from the conventions of classical ballet and sought to deemphasize the role of the feet in dance.  Merce Cunningham made dances, many in collaboration with John Cage, on the basis of random decisions made by his dancers.  In the late 1980’s, the Guggenheim Museum in New York mounted an exhibition of ultimate minimalist paintings many of which were blank canvases.  
Andy Warhol, one of the original pop artists, re-invented himself as a mass production artist and sold works he had never set hand to and maybe had never seen.  Many of these movements now seem excessively contrived and some were vehemently rejected by critics, audiences or, occasionally, both.  But art, like science, advances as much by failure as by success.

Ars gratia artis.  The American avant-garde may have taken its enthusiasms to an extreme but they became the driving force of twentieth century culture.  When Wagner started to compose The Ring of the Nibelung, he said his aim was to create “the artwork of the future.”  George Bernard Shaw admired it in spite of or perhaps because of its “gloomy, ugly music, [without] a glimpse of a handsome young man or pretty woman.”  He said it stood as a turning point in the history of opera and, indeed, of all music.  If so, it was a very tentative beginning.  Stravinsky and the members of the Second Viennese School extended Wagner’s chromaticism into atonality but it never did take hold in opera, which reverted to the more traditional harmonies of Verdi and Puccini.  It was left to the Americans, immigrants and aliens among them, to bring to fruition the next stage in the evolution of musical drama and it did not occur in traditional opera houses.

In 1927 Jerome Kern and Oscar Hammerstein II created Show Boat based on Edna Ferber’s novel of the same name.  Although it is often said to be the first modern Broadway musical, it is not.  It is a fully realized opera except that the story is not the least bit frivolous, the music is accessible and every song fits perfectly in advancing the narrative.  The show was produced by Florenz Ziegfeld and he cast it with several of his Follies stars including Helen Morgan. [7] The casts of its many revivals, however, have regularly included opera singers including Frederica von Stade, Bruce Hubbard and Teresa Stratas in 1988 and Audra McDonald in 2012.  In 1935, George and Ira Gershwin and DuBose Heyward introduced what is still the quintessential American opera—they called it a “folk opera”—Porgy and Bess.  Ten years later, Richard Rodgers and Oscar Hammerstein II premiered their second major collaboration, Carousel.  In general it was well received although several critics thought it was excessively sentimental when compared to the 1909 play Liliom by Ferenc Molnár on which it is based.  But as Stephen Sondheim has observed, “Oklahoma! is about a picnic, Carousel is about life and death.”  In 1949, Lost in the Stars, based on Alan Paton’s novel Cry, the Beloved Country, opened to a lukewarm reception in New York with music by Kurt Weill and lyrics by Maxwell Anderson.  Weill thought of it as a “choral play.”  His German collaborator Bertolt Brecht denounced it as the work of a sellout hack.  Brecht was wrong.  The libretto may be a throwback to nineteenth century melodrama but the music and lyrics are both masterful and modern.  This kind of serious musical drama—works such as Sunday in the Park with George (1984), The Lion King (1997) and Wicked (2003)—has by now become standard fare on Broadway and has laid legitimate claim to being the true heir of classical opera. [8]

The era of American hegemony in the arts, as in other spheres, is now fading under the pressure of globalization.  Culture, like business, economics and governance, is becoming more homogenized than ever, a trend that seems to run counter to the political fragmentation that is on the rise everywhere. [9] One of these forces may ultimately yield to the pressure of the other.  But the driving force—the great enabler—of both homogenization and fragmentation is the Information Revolution and it is just beginning.  It took more than a century to ameliorate the dislocations brought about by the Industrial Revolution.  It would be foolhardy to try to imagine how the current upheaval will work out or how long it will take.  But it does seem safe to predict that the next century will be at least as unsettling as the last one.

Notes

1.  The quotation is from “The American Scholar,” an address delivered to the Phi Beta Kappa Society of Cambridge, Massachusetts on August 31, 1837.

2.  “Composer from Brooklyn: An Autobiographical Sketch,”  Magazine of Art,  32, 1939, p. 549. 

3.  America Day by Day, translated by Patrick Dudley, London, 1952, p. 296.

4.  American critics and intellectuals, like their European cousins, are and always have been notoriously elitist.  One of my favorite examples is the reception that greeted Winslow Homer’s paintings of children.  Henry James wrote they were “barbarously simple” and “horribly ugly” which they were not.  Other critics denounced them as done to appeal to the simple tastes of American collectors.  George Gershwin fared even less well at the hands of the critics.  Lawrence Gilman of the Tribune reviewed the “Rhapsody in Blue” (February 13, 1924) with this broadside: “How trite, feeble and conventional the tunes are; how sentimental and vapid the harmonic treatment, under its disguise of fussy and futile counterpoint! ... Weep over the lifelessness of the melody and harmony, so derivative, so stale, so inexpressive!”  To this day, The New York Times rarely likes any expression of what this essay regards as American art.  On September 9, 1971, its senior critic Harold Schonberg called Leonard Bernstein’s Mass pretentious and thin, cheap and vulgar.  The following Sunday, he added superficial and said it was, “…the greatest mélange of styles since the ladies’ magazine recipe for steak fried in peanut butter and marshmallow sauce.”  Schonberg never liked anything Bernstein did.  But ten years later, the paper’s dismay was still unrelieved.  On September 14, 1981, Donal Henahan wrote that the piece, “…finds no time to say anything worth hearing.”  Moreover, “…much of the evening would sound as if it were being improvised by the cast of ‘Saturday Night Live’ except that the humor is vapid and superficial.”  Nicolas Slonimsky filled a 325-page book with the stupidities of critics (Lexicon of Musical Invective, Coleman-Ross, 1953).

5.  Berlin’s original 1918 song was considerably less sugary than the 1938 version he wrote for Kate Smith which is the one sung today.  Right wingers sometimes say “This Land” is a Marxist anthem which is nonsense.  The song celebrates endless skyways, golden valleys, sparkling sands, diamond deserts and freedom highways.  It may also approve of trespassing and lament hunger but such attitudes do not require a close reading of Das Kapital.  Like many prominent Americans, Woody was a self-proclaimed socialist in the 1930’s and may have joined the Communist Party for a time.  He was listed as a subversive by the House Committee on Un-American Activities but was never officially blacklisted.

6.  It will be noted that both Schoenberg and Stravinsky emigrated to the United States and became American citizens in 1941 and 1945 respectively.  Neither participated in the events discussed in this paper although Schoenberg seems to have been influenced by the music of his good friend and tennis partner, George Gershwin.

7.  Those of us who never had the opportunity to hear Helen Morgan live can be forgiven for agreeing with the critic who called her voice “high, thin and somewhat wobbly.”  In recordings from the 30’s, she seems an unlikely torch singer who might have made a credible bel canto soprano.  Her voice was never sexy the way Julie London’s was, but she knew how to put a song across with the best of them.  She was wildly popular and, like her peer Billie Holiday, her candle truly did burn at both ends.

8.  One of the first composers to follow and expand the new kind of musical drama was the classically trained British composer Andrew Lloyd Webber whose spectacularly successful works include Evita (1978), Cats (1981) and Phantom of the Opera (1986).

9.  The idea of the Global Village was propounded by Marshall McLuhan in his 1962 book, The Gutenberg Galaxy:  The Making of Typographic Man.  He foresaw that technology, especially television and “automation” (by which he meant computers) would result in people all over the world coming to share cultural assumptions and expectations.  He did not think this would produce unity or tranquility but, on the contrary, expected it would lead to increased dissension and political fragmentation as, indeed, it has.  Buckminster Fuller had a similar vision and developed his World Game in 1961 to counter what he called “desovereignization.”

Tuesday, January 08, 2013

“GROW UP, CONSERVATIVES”

Jerry Harkins

We must get the American public to look past the glitter, beyond the showmanship, to the reality, the hard substance of things. And we'll do it not so much with speeches that will bring people to their feet as with speeches that bring people to their senses.
                                                                                                              –Mario Cuomo


            History and myth are often hard to tell apart because of all the glitter that surrounds current events.  The media love that glitter because it is easy, it attracts attention and it sells ads.  Mario Cuomo was right about this as he was about so many things.  Thus, I ask you to bear with me while I bring out a bit of modern history for a curtain call.  History without the glitter.
            Once upon a time, February 12 was a holiday honoring the birthday of Abraham Lincoln.  But Congress, never noted for wisdom or common sense, decided there were too many holidays for a month that had only twenty-eight days to begin with so they killed it in a comedy of legislative errors which also moved Washington’s Birthday to the third Monday.  Now George was born on February 22 and the third Monday can never come later than the 21st.  Congress intended that this day should honor all Presidents but somehow Virginia objected and that desire was omitted from the final version of the bill.  Given that most of our Presidents have been fools, knaves or worse, I think that was a lucky mistake and I now propose to restore the twelfth as a national holiday.  It seems to me that February, the bleak mid-winter, needs as many holidays as it can get.  I propose to call the twelfth Impeachment Day.  On that date in 1999, the Senate of the United States acquitted Bill Clinton of perjury and obstruction of justice, the last two charges standing from the cornucopia of complaints lodged against him by Red Meat Republicans.  It was the final chapter of a story that had more serious consequences than we thought at the time. 
            For many of the players, the impeachment drama remains a dark blot hovering over the first sentence of their obituaries.  It will identify them forever with an embarrassing malfeasance that will eclipse whatever they did before or after.  For most of them, of course, it is only one of several such blots.  There is, for example, Ken Starr exercising his adolescent taste for salacious prose by drawing up a fantastical 445-page indictment complete with 480 footnotes.  Nor will we ever forget the Mighty Mormon, Orrin Hatch, who for months went around muttering inanely, “This isn’t about sex.”  Or Henry Hyde [1], the Ayatollah of Illinoiah, the Lead Manager of the House Impeachment Committee, the adulterer and home wrecker who told the Senate his job was “…to tell, nay shout truth to power.”  His emotional summary invoked the heroes of Valley Forge, Gettysburg, Iwo Jima, and Desert Storm and then ended with the gushy sentiment, “My solitary, solitary hope is that 100 years from today people will look back at what we've done and say, ‘they kept the faith.’  I'm done.”  This is the guy who, when confronted with his own sexual escapades, merely said, “The statute of limitations has long since passed on my youthful indiscretions.”  After all, he was only 41 when he began his affair with Cherie Snodgrass, a married woman with three small children.  Cherie was 29 at the time and described by a friend as a “glamour queen.”  And, of course, there was Newt Gingrich, currently on his third wife, a woman 23 years his junior with whom he was having a torrid extramarital affair while he was leading the impeachment coup against Clinton.  He must love Callista though because he converted to Catholicism to marry her.  He explains his serial infidelities as “not appropriate” but driven by “how passionately I felt about this country.”  Patriotic peccadilloes?
The list goes on:  the sanctimonious, the hypocritical, the self-righteous, the unctuous, the mendacious, the holier-than-thou.  In short, the Army of Virtuous Christians disgorging itself of its worst nightmares.
            Fortunately, there were moments of comic relief.  We were treated to the stuff of soap opera as Pat Moynihan frenetically searched for a single word to define his disgust with the President’s behavior without lending gravitas to the jejune crusade of the hypocrites.  And Chief Justice Rehnquist all dolled up in a black robe with four gold stripes on each arm, a design he borrowed from a Gilbert and Sullivan operetta to symbolize the solemnity of the occasion and the momentousness of his own role therein.  Or the late Senator Ted Stevens delicately picking his nose on national television—with his thumb no less, pinky fully extended.  Ted was representing the great state of Alaska where they have to wear mittens nine months a year, so it was very much a homespun skill.
            Bill Clinton probably did perjure himself and do everything else he could think of to obstruct what purported to be justice.  That much was obvious at the time but it took the Special Prosecutor nearly three years and $100 million to say so.  In fact it was so obvious that Mr. Clinton was dissembling that no one in the whole world believed the poor bastard when he went on national television to announce, “I did not have sexual relations with that woman.”  His tortured denials were pro forma only.  The unvarnished truth might have sounded like this: “Of course I did it and I had a lot of fun doing it.  No one got hurt.  No disaster befell the country.  It does not even rise to the level of triviality.  It’s nobody’s business except mine, my family’s and Ms. Lewinsky’s.  I apologize to them.  No one else needs or merits my apology.  Next question.”  All of which was true and none of which would have appeased the Lilliputian hypocrites who beset him.  In fact, Mr. Clinton did sort-of apologize on September 11, 1998 at a White House Prayer Breakfast attended by religious leaders.  He said, “I don't think there is a fancy way to say that I have sinned.  It is important to me that everybody who has been hurt know that the sorrow I feel is genuine -- first and most important, my family, also my friends, my staff, my Cabinet, Monica Lewinsky and her family, and the American people.”  The next day, The Washington Post reported:

By the end of his speech, many of the clergy said they had experienced a rare moment of the spirit stirring and were moved to offer their pastoral protection.

"It was a truly holy moment for me," said Rajwant Singh, director of the National Sikh Center in Washington.

"I felt it was a religious experience," said the Rev. Joan Brown Campbell, general secretary of the National Council of Churches.

"It was a scene of biblical proportions," said Rev. Robert Franklin, director of the Interdenominational Theological Center in Atlanta.

            Please note that the President was apologizing for his sins, not for high crimes and misdemeanors.  He paraphrased the classic plea of King David in Psalm 51, “The sacrifices of God are a broken spirit; a broken and contrite heart, O God, you will not despise.”  Kind of an awkward translation but, like David, Mr. Clinton was not repenting for adultery as much as for forgetting that the divine right of kings does not exempt one from the ten commandments.
It is of course a whole other sin to tell a lie but I don’t think Mr. Clinton was apologizing for perjuring himself either.  Rather it sounded like he was ashamed for being unfaithful to the vow he made the day he married—the part that says, “…forsaking all others.”  If so, he was right to be sorry.  A person’s word should be his or her bond.  Lying about it, however, was never an important issue whether he did it under oath or otherwise.  He wasn’t hoping to deceive anyone, all he was trying to do was maintain a modicum of dignity.  We’ve all been there, especially the Republican mountebanks who wailed most loudly about the desecration of family values.
            As I have often said, moral theology is a subtle discipline but the first thing you learn is that, in order to be evil, a sin must actually hurt somebody.  A victimless sin may still be a sin but it is not evil in human terms.  The same thing is true of a law.  A violation becomes serious only when somebody gets hurt.  Lying about one’s sex life may be illegal but only if you think the Roundheads were right and the duty of government is to define and defend public and private morals.  Alcohol is abused by a small minority of people and causes a great deal of social trouble and expense.  But “Demon Rum” was never diabolical except to people like Anthony Comstock and Carrie Nation who thought it could be legislated out of existence.  Perjury is certainly a crime and so is obstruction of justice.  But they are technical crimes—they offend against an abstract principle, the so-called “rule of law.”  In moral theology, you owe the truth only to those who have a right to know the truth.  In this case, even the court had no moral right to the truth about Bill Clinton’s sex life except as it directly affected the plaintiff, Ms. Jones.  Which it did not.  Not a shining moment of high virtue to be sure but hardly an impeachable offense.
            Hatch and Hyde knew all this and they and their cohorts did not care a fig about anything quite so technical as perjury or obstruction.  They were putting the President on trial for adultery, not to convict him, which was never in the cards, but to embarrass him.  These were not high crimes and misdemeanors; this was celebrity gossip.  Sex sells.  The Roundheads wanted to sew a big red A on his breast pocket.  The whole idea of sex in the Oval Office actually turned them on and they were hard pressed to hide their guilty pleasures.
            Okay, we’re coming to the point of this essay.  First, though, in the interest of full disclosure, you should know that I think Bill Clinton was among the best presidents the USA ever had—a smart, charismatic, compassionate leader who was despised by the elitists of the political and media establishments but much beloved by everybody else.  I think it might be a good idea to take a jackhammer to Teddy Roosevelt and Thomas Jefferson on Mount Rushmore and replace them with FDR, Harry Truman and Bill Clinton.  It’s only an opinion but I base it on an assessment of the complexity of the problems they faced and their ingenuity in addressing them.  Mr. Clinton won some and lost some, all the while suffering the indignity of having to deal with the likes of Newt Gingrich’s temper tantrums and the racism of the elites.
            We’re getting closer.  I used to think The New York Times despised Bill Clinton even though it endorsed him twice.  It never had a kind editorial word for him.  When it was forced to admit he had done something right, there was always a “but,” always the sense that it was damning him with faint praise.  But I was wrong.  It was not so much hatred as it was fear.  This smart, uppity, sexually voracious Southerner fed into the worst nightmares of editors whose mothers had lived in constant fear that their black maids would steal their silverware, contaminate their china and seduce their husbands.  Make no mistake about this.  As Toni Morrison pointed out, Clinton was our first black President.  “After all, Clinton displays almost every trope of blackness:  single-parent household, born poor, working-class, saxophone playing, McDonald’s-and-junk-food-loving farm boy from Arkansas.”  With certain kinds of Americans, Bill Clinton never had a chance.  Day after day, The Times excoriated him as a “low rent” pervert.  The day after the House Judiciary Committee released his testimony before the Starr Chamber, it piled on with seventeen signed stories in the news hole, a full page of Monica’s testimony, a full page of letters between various lawyers, an 8-page transcript of the testimony itself, an editorial, seven letters to the editor, three op-ed pieces, and a sidebar on the famous dress.  It lacked only a partridge in a pear tree but was otherwise a full, ranting, raving lynch mob.  The Times was not duped by the red meat radicals.  It went willingly, enthusiastically, to wallow in the journalistic swill.  The editors knew he would not and should not be convicted.  They knew his offense was not the end of the world.  They merely wanted to take him down a peg or two, to put him in his place.
            Does this sound like a conspiracy theory of history?  Well, it was a conspiracy.  It was orchestrated by a coterie of wealthy right-wing activists to drive Mr. Clinton from the White House because they did not accept the legitimacy of his election.  The New York Times (and much of the rest of what passes these days for the fourth estate) and the Republican Party merely fronted for them.  The right wing always works through puppets it can buy or seduce.  In 2000, it was a bunch of operatives in Florida that delivered the election to the Supreme Court and, hence, to George W. Bush.  Four years later, they hired another bunch of thugs to libel and slander the military service of John Kerry on behalf of a Republican draft dodger.  Again in 2008, they spent a small fortune on rumors and innuendos that Barack Obama was not American-born, that he was a closet Muslim.  In 2012, Governor Romney’s entire campaign was based on a litany of big and blatant lies.
            Okay, we have arrived.  There is evidence that America is awakening from the long nightmare of reactionary right wing politics.  We may be past the worst of it.  The defeat of the radicals in the 2012 elections represents to me the culmination of a process that began with the acquittal of Bill Clinton in 1999.  People have finally realized that the crazies and the anarchists will stop at nothing to spew their hatred and promote their big lies.  For the moment the worst of contemporary conservatism is in retreat. [2]  Those of us who think progressivism is the mainstream of historical evolution must take no pleasure in this.  We remember with appropriate humility the mindless antics of the extreme left during the 1960’s.
            Extremism—of the right or the left—in the name of liberty or for any other reason is always a threat to a community’s ability to govern itself and, therefore, to freedom itself.  It is particularly dangerous when pressure groups combine to advance otherwise unrelated agendas.  We witnessed this in 1994 when Newt Gingrich engineered a sweeping turnaround in the House of Representatives and then led his fanatical new troops in a march down Pennsylvania Avenue waving copies of their so-called Contract With America.  They represented virtually every right wing issue from pro-gun to anti-abortion.  They included isolationists, fundamentalists, Promise Keepers, anti-immigrants, and opponents of affirmative action, environmentalism and the minimum wage.  The only thing they seemed to have in common was that they were all middle-aged white males without a sense of humor.  The question, of course, is who voted for these people?
Notes
1.   For example, the first sentence of Hyde’s obituary in The New York Times, November 30, 2007, read, “Former Representative Henry Hyde, the powerful Illinois Republican who won battles to prohibit federal financing of abortions and to impeach President Bill Clinton but who failed to persuade the U.S. Senate to convict Clinton and remove him from office, died Thursday in Chicago.”  His hometown paper, the Chicago Sun-Times, wrote, “Former Rep. Henry Hyde, the Illinois Republican who steered the impeachment proceedings against President Clinton and championed government restrictions on the funding of abortions, has died.”
2.  Obviously I was as wrong as a person can be to think that the 2012 election represented a turning point in American politics.  Four years later, the Republicans elected Donald Trump as the result of an elaborate conspiracy.  I take a small measure of comfort in the fact that the Don actually lost the popular vote by more than three million.  Fortunately I learned to accept disappointment early in life by rooting for the Brooklyn Dodgers.


Monday, November 19, 2012



STRIP FOOTBALL

Jerry Harkins



On September 5, 1998, the University of Southern Mississippi Golden Eagles seemed to lose their season-opening football game to Penn State by a score of 34-6.  The Nittany Lions went on to post 8 more wins and 3 losses that year and then defeated Kentucky in the Outback Bowl.

Not so fast, my friend: the story was not over.  On July 23, 2012, the National Collegiate Athletic Association declared that Southern Miss had actually won that game and Kentucky had prevailed in the bowl game.  In fact, Penn State was declared the loser of every single game it had played in the 14 years following its loss to Mississippi.  (But see Subsequent Note 2.)  In the technical parlance of the sports world, the NCAA “vacated” 112 victories.  This was meant to “strip” Joe Paterno of the title of “winningest coach in major college football history” for the crimes of one of his assistants.  Those crimes, centered on a long-running pedophilia scandal, were horrendous.  The punishment, however, requiring the re-writing and falsification of history, made a mockery not only of the sanity of the NCAA--nothing new there--but also of the probity and integrity of the academic enterprise.  If a university does nothing else, it must be undeviating in the pursuit of truth, as elusive and frustrating as that pursuit often is.  Penn State, however, meekly bowed to a travesty.  It might as well have closed its doors and thrown away the keys.  Instead, early on the day before the NCAA announced its decree, the school blanketed the stadium plaza with blue construction tarps to hide what was going on, called in the police to guard the site from crazed fans and removed a 900-pound bronze statue of Mr. Paterno to a secure but undisclosed location.  He thereby became an official non-person six months to the day after he had died, suddenly one with the legion of Russians obliterated by the censors of the old Soviet Union, disappeared by the Penn State Thought Police.

Organizations that govern sports tend to be staffed by lowlife ne’er-do-wells, has-beens who never rose above the second string but played just enough to get a letter and keep a tenuous grasp on their athletic scholarships.  Universities, on the other hand, tend to be run by academics who are bored by sports except for their salutary effect on alumni loyalty.  This is a lethal combination.  Not only does it confer power on the otherwise inept, but it provides them a platform on what they perceive to be the moral high ground.

In the interests of this higher morality, the immortal Jim Thorpe was “stripped” of his gold medals from the 1912 Olympic Games.  Seventy years later, the International Olympic Committee admitted its mistake.  It didn’t do much good for Thorpe, who had died in poverty thirty years earlier.  The redoubtable Tonya Harding was “stripped” of her 1994 national figure skating championship but not her 1991 title.  It was never clear exactly what her crime was other than extremely bad taste in ex-husbands but one thing led to another and she did jail time, including service on what passes for a chain gang in Oregon, for offenses committed in Detroit.  Lance Armstrong was “stripped” of all his cycling victories, including seven consecutive wins in the Tour de France.  He was accused of doping and investigated many times at the behest of his opponents and the French racing authorities but never convicted or even charged in a court of law.  Pete Rose, one of the greatest baseball players who ever lived, was declared a “permanently ineligible” non-person for gambling.  The penalty means that he cannot be elected to the Hall of Fame which, like the Methodist Church, does not approve of gambling.  (Rose also did six months in the federal pen and 1,000 hours of community service for failing to report income from, among other things, gambling at race tracks.)  Marion Jones was “stripped” of her five Olympic medals for committing perjury about performance enhancing drugs.  She did six months in some federal medical lock-up and a half-way house because the judge thought she was a chronic liar.  The athletic bureaucrats (Pooh-bahs) also tried to strip her teammates in the 4 x 100 and 4 x 400 relay races but the accessory malefactors prevailed on appeal.  So, as things stood, three-fourths of the American team won those events and one-fourth did not.

Jacques Rogge, president of the International Olympic Committee, said he was disappointed by the arbitrator’s decision.  Dr. Rogge is a physician and sixteen-time winner of the Belgian national yachting championship.  As Patrick Sandusky, a spokesperson for the United States Olympic Committee (and no relation to the Penn State child molester Jerry Sandusky) said, “Although we continue to believe that the U.S. medals in the 4 x 100 and 4 x 400-meter women’s relays were unfairly won due to Ms. Jones’s doping, we have always recognized that the athletes who made up the U.S. teams might have a legal basis on which to defend these medals.”  Mr. Sandusky, who in his youth had been a long snapper and reserve center for the Northern Illinois Huskies, turned out to be right.  The court ordered the I.O.C. to pay the athletes 10,000 Swiss francs, about $9,500, toward their legal expenses.  The women are currently pursuing a case against the U.S.O.C., arguing that in failing to support them it breached a contractual duty.  A decision is pending in that case.

“Stripping” has been imposed by the International Olympic Committee 55 times since 1968.  Somehow, though, the Committee never noticed the East German “Doping for Gold” program which involved thousands of athletes over a period of at least 24 years.  But no matter.  The idea is to get ink for the enforcers and you don’t do that by going after non-celebrities.

Let us be clear that we do not approve of child molestation or the use of performance enhancing drugs or, in Ms. Harding’s case, hanging around with unsavory boyfriends.  We regret that such offenses are widespread in our culture and understand that their prevalence does not excuse them.  We are, however, not overly worried that such offenses threaten the very fabric of civilization.  Moreover, we are dismayed by persecutors, inquisitors and prosecutors so desperate for public notice that they seize upon any opportunity to make life miserable for celebrities.  We are also dismayed by the hypocrisy of the sports bureaucracy and the media.

In announcing its decision the NCAA said, “These events should serve as a call to every single school and athletics department to take an honest look at its campus environment and eradicate the 'sports are king' mindset that can so dramatically cloud the judgment of educators.”  From now on, educators will share the vision of the NCAA in which 7-foot-tall basketball players win graduate fellowships to pursue their love of ancient Greek poetry and/or superstring theory.  As punishment for their past failures, the educators will be stripped of their Ph.D.’s.  In keeping with the theory that history is malleable, I have decided to strip Adolf Hitler of the Chancellorship of Germany he acquired on January 30, 1933.  While I’m at it, I hereby vacate Bobby Thomson’s home run, the so-called “shot heard round the world,” of October 3, 1951.  Henceforth, the record books will record that the Dodgers won that game and went on to beat the Yankees in the World Series.  As a result, they never left Brooklyn.

What bothers me about these scandals is that they bring out the worst in the moralistic classes including the press which treats them like the end of the world.  The New York Times is one of the worst offenders.  On October 10, 2012, its lead headline screamed, “Details of Doping Scheme Paint Armstrong as Leader.”  The story was about accusations made by the United States Anti-Doping Agency.  That name suggests an official government agency but USADA is actually a private organization dedicated to the purity and integrity of the Olympic movement.  According to its web site, its “Vision/Mission” is, “To be the guardian of the values and life lessons learned through true sport.”  Which is to say, it can destroy careers and reputations without reference to anything so gooey as due process of law and can do so on the basis of testing programs that are of doubtful reliability.  Of course it could not do this without the active cooperation of the vast consortium of sponsors, regulators, promoters, governing bodies and media pundits that lives parasitically on sports.

Obviously sports contests require rules.  Obviously someone has to set standards.  Thus, in baseball there is a rule against pitchers throwing spitballs.  Why?  Well, to make a long story short, because a moistened ball moves erratically and is hard to hit.  Which, of course, is the basic intent of all pitching.  A spitball is not immoral or even unfair;  it’s just a rule we have agreed to in order to encourage more hits.  We draw a line between a spitball and a knuckle ball which is equally hard to hit but a lot more difficult to pitch.  The justification for anti-doping rules is slightly different.  The purpose is to assure a level playing field by eliminating artificial means of enhancing skill.  But make no mistake:  anti-doping rules require the drawing of arbitrary lines.  For example:

·       Millions of Americans take perfectly legal prescription medications to relieve anxiety and thereby gain a competitive edge in their business and other dealings.  Should we “strip” them of their MBA’s?

·       Many more millions of people drink alcoholic beverages to relax and perform more easily in social situations.  This was illegal in the United States for thirteen years and is still illegal in some places, notably (and laughably) in some parts of Texas.  But prohibition did not work and now our rules specify permissible levels of alcohol in the bloodstream.

·       Thousands of musicians smoke illegal marijuana to help them chill out and improve their performance skills.  Should we fill the jails with jazz players?

·       One of the crimes Mr. Armstrong is accused of is drawing his own blood, chilling it and returning it to his body.  This apparently increases its oxygen content which leads to improved performance.  Why is that different from eating a candy bar in the middle of a marathon?  Why does anyone think of Mr. Armstrong’s blood as an illegal substance?  Or, for that matter, oxygen?

The fact is Mr. Armstrong has never failed a drug test.  But the Doping Agency and The Times report that he once withdrew from a competition.  He said it was because of an injury.  They said he was afraid of failing their tests.  How do they know that?

These and thousands of other questions are complex and decent people can differ on all of them.  Some people, however, find them so difficult that they are happy to have other people make the decisions for them.  Those other people—famously called the “deciders” by George W. Bush—include the Pope, Pat Robertson and the editors of The New York Times.  The Times churns out some fifteen hundred editorial opinions a year, providing moral guidance to anyone or any institution they think needs their help.  The scope is breathtaking.  The most common action verb is must.  The President must do this, the banking industry must do that, the trustees of Yale University must do the other thing.  Not only does The Times know better than you do, it is better morally and in every other way.

I do not know what Lance Armstrong did or did not do and neither do you.  Nor do I know what Marion Jones did or did not do.  I think Ms. Jones must have lied at one point because her stories seemed to contradict each other.  But if we’re going to put people in jail for lying, there’ll be nobody left on the outside.  I do know people do foolish things.  But I also know people like Mr. Armstrong and Ms. Jones do a great deal of great work for society—work that needs to be done but no one else seems interested in.  On net, I’d rather have a beer with either one of them than with an editor of The Times or a bureaucrat of some private sports-policing authority.  I admire Jim Thorpe a lot more than the modern day witch hunters.  Mr. Thorpe’s crime by the way was playing baseball for money.  Horrors!  He thereby violated the purity standards of the wealthy Olympic Pooh-bahs who never had to work a day in their lives.

In many cases, the actual crimes people go to jail for are not doping itself or knee-whacking but the allegation that they lied about their wicked behavior.  They lied to a federal agent, in Barry Bonds’ case to FBI agents.  Well, he didn’t really lie.  He gave an “evasive” answer.  They tried Roger Clemens for a similar offense of lying to Congress which, given the intelligence of Congresspersons, must be at least as heinous as stealing candy from a child.  They couldn’t convict him but the baseball Pooh-bahs will keep him and Barry Bonds out of the Hall of Fame until hell freezes over.  Add them to the Pete Rose scandal.  Three of the best baseball players in the history of the game.  At least we know Pete really did bet on games.  Gambling:  a clear sign that the Antichrist is at hand.

But lying.  That’s the really infamous crime.  A sin that cries to heaven for vengeance.  Lie under oath and you clearly place yourself with the guys on Arlo Guthrie’s Group W bench.  They had to impeach Bill Clinton for lying about his sex life, for God’s sake.  Anyone who fails to tell the whole truth about his or her sex life is spitting in the face of the public’s right to know and the right of the press to expose every scandal of the day.  They should be branded with a big L on their foreheads so mothers can keep them away from their children.

Bill Clinton went on national television and proclaimed, “I did not have sexual relations with that woman.”  Not a single human being in the whole world believed him for a nanosecond.  He knew that at the time.  He never intended to deceive anyone.  He was only saying what any other male would have said, indeed what any male would be expected to say.  It wasn’t even a real lie, just a misleading little fib.  As anyone educated by the nuns will tell you, there is a clear line between sex and other scandalous activities.  The key is penetration, however slight.  But, you say, he lied under oath and that, by God, is perjury!  Maybe, but if so the law has not kept up with the latest discoveries of the medieval philosophers.  A “lie” is something you say that is known by you to be untrue.  But you can only lie to someone who has a moral right to the truth.  And for a lie to be perjurious, it must also be material to the matter at hand.  No one, including the court in the Paula Jones matter, had any right to the truth in the Clinton case.  Only the President’s wife had a legitimate claim.  Furthermore, his behavior had absolutely nothing to do with the Jones case except in the perfervid imaginations of red meat Republicans.

The sins discussed in this essay exist along a spectrum of heinousness and the worse they are the more careful we should be in denouncing the alleged perpetrators.  We need also be very careful before equating a lie with a crime.  If I’m forced to admit doping, for example, to an FBI agent, I am very, very close to being compelled to be a witness against myself in a criminal matter.  I know all about the Fifth Amendment, and the first thing to know about it is that anyone who invokes it and expects the world to suspend its disbelief has got to be one of God's innocents.

H. L. Mencken said, “The objection to Puritans is not that they try to make us think as they do, but that they try to make us do as they think.”  Americans still think we should all think like Puritans and act as they imagine the Puritans might have acted.  The witch trials of Salem are not quite finished business.

Subsequent Note

Shortly after this essay was posted, Lance Armstrong confessed to Oprah Winfrey that, yes, he had used performance enhancing drugs, essentially saying everybody did it.  It's a safe bet that he's wrong about everybody but, if it's true, then we still have the right to ask what all the fuss is about.  The playing field would have been even and the only ones hurt would have been the athletes.  Of course, it's almost certainly not true which means that he cheated and is still lying about it.  A sad story in the life of a man who has otherwise done a great deal more good in this poor world than those who have been hounding him for years.

Subsequent Note 2


On January 16, 2015, the NCAA settled a suit brought by the Pennsylvania Senate Majority Leader, Jake Corman, and the State Treasurer, Rob McCord, accusing it of stupidity (my word, not theirs).  Senator Corman said, "Clearly the NCAA looked to make their own name on the backs of Penn State University instead of doing their own internal investigation or any sort of due process to find out what happened."  The 112 vacated games were "restored," making Mr. Paterno once again college football's "winningest coach."  The fate of the Paterno statue is not yet clear but it will probably be restored if the University can remember where it put it.