Thursday, August 10, 2006


THE GENTLE JOYS OF MAYBE [1]


Jerry Harkins




Pete Seeger tells a story about a king who asked his wise men to reduce all the world's knowledge and wisdom to a single book he could use to educate his son. It took them nearly a year and, when they were done, the king challenged them to distill the book down to a single sentence. Five years of intense labor and constant debate followed until, at last, the wise men were able to agree on the sentence, "This too shall pass." Again the king sent them back to the academy. "Find me," he commanded, "the single word my son can live by, the irreducible essence of everything we know." Ten years passed. At length, the sages, old, gray and worn with disputation, returned. "Sire," said their leader, "the one word at the heart of all wisdom, is maybe." [2]

The Day Science Acquired A Sense Of Humor. Paul Johnson began his survey of Twentieth Century history with the remarkable claim that, “The modern world began on 29 May 1919 when photographs of a solar eclipse... confirmed the truth of a new theory of the universe.” The theory was General Relativity and Professor Johnson rightly says that it revolutionized the way people perceive reality. It was also, he argues, the first piece of high science to be followed by the mass media. [3] Eventually, millions of people did come to know the world had changed even if, as was also widely reported, only three of those people actually understood what Einstein was talking about. To the extent that this was true, it may have been because the press itself displayed remarkable confusion about relativity. It still does. The famous formula E = mc² became journalistic shorthand for everything in the modern world that befuddled people — a sort of modernist mantra replacing the old saw, "It's Greek to me."

In fact, compared to most modern thinkers, Einstein was a classicist. It was not until 1927 that Werner Heisenberg really upset the Newtonian apple cart by demonstrating that nature does not permit us to know both the position and the momentum of a particle at the same moment, a phenomenon he called the Uncertainty Principle in recognition of the implication that we cannot ever fully know anything. There is, it seems, an important, inherent and invincible ignorance at the heart of our knowledge of the universe. Around the same time, Kurt Gödel was proving that there is a class of mathematical problems that have no technical solution, that is, there are questions without answers. His two Incompleteness Theorems showed that any consistent logical system rich enough to describe arithmetic contains true statements it cannot prove and cannot be proven, from within, to be free of inconsistency. Einstein hated the entire drift of things. "God," he later said, "does not play dice with the universe!" He may yet be proven right, but uncertainty — in art and philosophy as well as in science — is, for the moment, well established as the defining characteristic of modernism.

For those who are finicky about the benchmarks of history, then, 1919 was a false alarm. The modern world really began on Saturday, October 25, 1930. It was the last day of the Sixth Solvay Conference. Heisenberg had been working on uncertainty for several years, but that was the day the titans of physics, with Einstein himself leading the opposition, argued through a famous thought experiment that convinced most of them it must be true. Science had suddenly acquired a Zen-like overlay which at first seemed extremely threatening.

A Brief History of Science. The activities we refer to as "science" are a way of knowing things based on systematic observation and dispassionate analysis. These tools have evolved gradually as a response to the experience of intellectuals that the explanations of the divines, derived from revelation and tradition, are invariably out of sync with the knowledge we gain through our senses. The church might reason by biblical exegesis to the conclusion that the sun revolves around the earth, but Galileo, who had actually looked at the solar system and measured its movements, knew differently. Even as he recanted under the paternal guidance of the Holy Inquisition, he muttered under his breath, "E pur si muove," — in effect, might and right make strange bedfellows. [4] Science replaced revelation and deductive reasoning first with systematic observation and induction, then with experimentation and statistical inference.

In spite of its apparent challenge to divine order, there is something irresistibly comforting about scientific truth. [5] The square of the hypotenuse of a right triangle equals the sum of the squares of the other two sides. Always. Every action creates a reaction which is equal in force and opposite in direction. There are no exceptions. Certainty, whether religious or scientific, is seductive and gradually the certainties of science became a substitute for the crumbling dogmas of theology. This state of affairs lasted a bit more than two centuries until, in 1859, Darwin published his theory of evolution. The Origin of Species started an epistemological war that has persisted to the present day.

For the first sixty years or so, the battle was joined largely over a symbolic issue, the descent of man. The fundamentalists believed (and some still do) that humankind was created from the clay of the earth on Friday, October 28, 4004 BCE. [6] They accept without question the report of Genesis 2:7 that "...God formed the human of dust from the soil" and reject without hesitation any notion that they might personally be related to a monkey. This, at least, is the controversy that always gets all the ink. [7]

But a more troubling problem lurked just below the surface of evolution. If species evolve, exactly how do they do it? How does a frog turn into a prince? Evolution posits a fair amount of one thing turning into another and it's hard to imagine God bothering to map it all out in advance. William Blake asked the tiger, “Did he who made the lamb make thee?” But more to the point, did God see the tiger in the paramecium? Or did she leave it to chance? It’s fine to say that the paramecium did not turn into a tiger, that both were created simultaneously out of nothing. But, like Galileo, the biologists have actually looked and the fundamentalists can only rejoin that looking is cheating. [8]

In one of history's truly brilliant insights, Darwin had proposed that environmental changes — sudden or gradual — present new challenges and opportunities to old species. Some members of a species evolve into new species by responding with adaptations that prove to be advantageous. Darwin called this "natural selection" and others referred to it as the survival of the fittest. [9] Still, no one knew exactly where these adaptations came from or why some individuals adapted while others did not. When the answers did begin to emerge they were more threatening than even Bishop Wilberforce had imagined.

Essentially, the early neo-Darwinians discovered that gene pools carry many variations within them, some of which find frequent expression (e.g., hemophilia) while others are rare (e.g., albinism). [10] The variations are often (but not always) harmful to the individuals that carry them. But they persist in the gene pool against the day that the environment changes and they become advantageous. These variations arise primarily — and this is the real sticking point — from seemingly random genetic mutations. A cosmic ray from deep space happens to pass through a sex cell and rearranges its DNA. It just so happens that that sex cell is the one in a billion that gets a chance to reproduce. [11]
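The bookkeeping is easy to caricature. The following toy simulation (a sketch of my own in Python, with every number an arbitrary assumption chosen for illustration rather than taken from any real population) keeps a mildly harmful variant drifting at low frequency in a small gene pool, then flips the environment so that the same variant is suddenly favored:

    import random

    # Toy illustration only: one locus, two alleles, in a small haploid gene pool.
    # The variant allele "v" is mildly harmful until the environment changes at
    # generation SHIFT, after which the very same allele is favored. Every number
    # here is invented to make the story visible.

    POP = 500              # individuals in the pool
    GENERATIONS = 400
    SHIFT = 200            # generation at which the environment changes
    MUTATION = 5e-4        # chance per copy per generation of flipping an allele

    def fitness(allele, environment):
        """Relative chance of being copied into the next generation."""
        if environment == "old":
            return 1.00 if allele == "w" else 0.99   # variant slightly disfavored
        return 1.00 if allele == "w" else 1.05       # variant now advantageous

    pool = ["v" if random.random() < 0.05 else "w" for _ in range(POP)]

    for gen in range(GENERATIONS):
        env = "old" if gen < SHIFT else "new"
        weights = [fitness(a, env) for a in pool]
        # Each member of the next generation copies a parent chosen in
        # proportion to fitness; mutation occasionally flips an allele.
        pool = [random.choices(pool, weights=weights)[0] for _ in range(POP)]
        pool = [("v" if a == "w" else "w") if random.random() < MUTATION else a
                for a in pool]
        if gen % 50 == 0:
            print(f"generation {gen:3d} ({env}): variant frequency "
                  f"{pool.count('v') / POP:.3f}")

The particulars are beside the point; what matters is the shape of the run. The variant bumps along near the floor set by mutation, sometimes vanishing and trickling back by chance, until the flip, after which selection carries it steadily upward.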

This model is now known to be incomplete and new models have been devised to account for the speed and direction of change. Still, the Darwinian challenge is seen by many as heretical, not because evolution is thought to be beyond the ability of an omnipotent God, [12] but because it suggests that our most basic sense of reality rests on the quicksand of chance. This implies a dictatorship of capriciousness; like Alice, we react in horror to the idea of a croquet game in which the mallets are flamingos, the balls are hedgehogs and both are free to act without regard to law, convention or prophecy. We are the slaves of our genes which, in turn, are at the mercy of random packets of energy that zip across the universe for millions of years only to wind up making mischief with them. There is no grand design, no meaning, no memory, no destiny. There is only chance, indifferent and implacable.

"...un grand peut-etre" The Uncertainty Principle deepens the Darwinian dilemma and restores the tyranny of Plato’s cave. Contemplating the confused state of nuclear physics in the wake of Bohr’s 1913 planetary model of the atom, it occurred to Heisenberg, that the laws of classical physics might not apply in the world of the infinitely small. This insight led him in two directions, one explicating uncertainty itself and the other applying it to the creation of a probabilistic description of fundamental reality.

Quantum mechanics seeks to describe the structure and behavior of sub-atomic particles — the more or less familiar electrons, protons and neutrons that comprise the atom, the photon which is the basic unit of electromagnetic radiation, and the six flavors of quarks and their antiquarks currently thought to be the basic building blocks of the universe. It turns out that these are not really particles and that they behave in ways that seem radically “weird” to us. For example, they appear to be able to exist in more than one place simultaneously; under certain conditions, they appear to travel faster than the speed of light; and some of them are “entangled” in ways that seem to let them influence each other instantaneously over vast distances. [13]

Perhaps the most bothersome characteristic of these manifestations is that they seem to exist in an indeterminate state until someone measures them. In other words, the measurement imparts the reality. If a photon were a child and a teacher gave it an IQ test, the act of testing would force the child to choose between smart and non-smart. This is bad enough but now, enter uncertainty. Since the teacher is measuring intelligence, we cannot know whether the child is a boy or a girl. Indeed, it would be neither; its sex would be indeterminate. Moreover, if the child had a twin and you wanted to know only its hair color, it wouldn’t have any until you looked. But the instant you looked at one child, it would “become,” say, a redhead. And so would its twin even if the two were located at opposite ends of the universe. Of course, you couldn’t know where either one was actually located. But the color information would travel faster than light and the more barriers you tried to place in its path, the faster it would travel.

You can see why quantum weirdness inspires truly amazing flights of fancy in the public press: it is for the same reason editors gravitate to man-bites-dog stories. But there is no paradox and, properly understood, not even a decent puzzle. When we try to measure the photon, we are applying standards from our own perspective where things exist in time and have dimension. But from the photon’s perspective (or from the perspective of an observer on the photon), the universe is quite different. There is no up or down, no right or left, no here or there and, most importantly, no now and then. At this level, nature is indifferent to such artifacts and only begrudgingly allows us to fumble around with our silly rulers.

Silly or not, we do want to understand things from our own worm’s eye view. Thus, just as the Victorians responded to Darwin in the context of their horror at being descended from monkeys, the modernists assumed that, if the building blocks of the universe are fundamentally insubstantial, then so must we be. To those who thought about such things, uncertainty seemed a cold, ominous doctrine, playing upon the racial nightmare of free fall. It seemed to separate us from history and ethics and, in some ways, from meaning itself. Like a fifth horseman, it brought death to the soul by mocking our intellectual attainments, our social constructs and our moral imperatives. If uncertainty is the ultimate fruit of the tree of knowledge, then all that is left is to lament with Solomon that everything is literally vanity. For in an uncertain universe, the plight of the individual — the grand theme of the Renaissance, the Age of Reason and the Romantic Era — is of no significance. If nothing is certain, then nothing is sacred and nothing is important and nothing is interesting. Life is, as Macbeth observed, “…a tale told by an idiot, full of sound and fury, signifying nothing.”

Probable-Possible, My Black Hen. You remember her: “She lays eggs in the Relative When. / She doesn’t lay eggs in the Positive Now / Because she’s unable to Postulate How.” [14] You can’t blame the poor hen for being confused, and it’s not her fault that the eggs she lays are not very nourishing. Roosting, as she does, on the Left Bank — she’s that sort of hen, you know — she spends her days consulting with intellectuals who frighten her with visions of the sky collapsing on the ash heap of philosophy. Now you may protest that it is overstating the case to claim that uncertainty gave birth to existentialism or that existentialism was responsible for the nihilistic tendencies in modern culture. [15] But there is no doubt that, in the wake of uncertainty, the existentialists felt compelled to restore some sense of meaning to individual experience. The old ideas about human dignity had to be jettisoned, they felt, under the crushing burden of an indifferent universe. Perhaps they tried their best. They kept insisting on their own optimism while writing about the most depressing themes: dread, alienation, instability and, my personal favorite, the "nullity of existential possibilities." Sartre was actually trying to say something positive about the transcendence of the human mind over the uncertainties of physical being. There is a perfectly good French noun for this — transcendance — but Sartre chose néant — nothingness. L'Être et le néant, eh bien? He meant it in the sense of no thingness, lacking substance, not concrete. But he had to have known the subtlety would escape almost every reader. Like other European philosophers of the time, having experienced the abattoir of World War I—or at least its aftermath—and having felt compelled to justify it, Sartre was not by nature an optimist and his insistence to the contrary is merely pathetic self-deception. [16]

It is not surprising, though, that many of the best and brightest sank into a Slough of Despond. Uncertainty killed philosophy and came very close to killing art. A urinal is not transformed into art by being mounted on a gallery wall, nor is it rendered anti-art by being disconnected from the plumbing. [17] A blank canvas is no more or less uplifting than any other sophomoric statement. Four minutes and thirty-three seconds of silence is not music simply because it is divided into three movements and published on staff paper. Whatever else you may think of such things, they are all expressions of nihilism, of surrender to the blind forces of the cosmos.

But the impulse to make art is strong and many artists soon began to grapple with the new sensibility. Eugene O'Neill, for one, saw, "...the death of the old God and the failure of science and materialism to provide a new one" as the central challenge to dramatists. Like many others, he turned to Freud and Jung, seeking new truths in the interior landscapes they were illuminating. For a time it worked, and psychoanalytic insight supplied the raw material or the methodology for a great deal of art. But it wore poorly when it became apparent that it consisted of nothing more nourishing than facile talk.

The dilemma for the artists was very real. The challenge arose from science, a language they did not speak. So they sought out and fell prey to those who purported to be intermediaries, one-eyed kings promoting half-baked sciences. In addition to psychoanalysis (scientific shamanism) they chased the gurus of Marxism (scientific hooliganism) and cultural anthropology (scientific gossip mongering). They produced a fair amount of great art but also more than their share of impenetrable novels, inarticulate poetry and incomprehensible paintings. Academia, in turn, responded with incoherent critical theories purporting to explain the inexplicable. [18]

Eventually, though, artists came to understand that uncertainty had never been the black hole of despair it seemed at first. In the place of discredited certainty, it offered probability which turned out to be attractive in its own right. In a universe devoid of certainty, probability restored the faith that, as unknowable as things may be, they are not capricious. Indeed things are more or less predictable within the parameters of well established laws that are intuitively satisfying. Probability is objective without compromising the possibility of surprise and ecstasy. It is operational without being a straitjacket. It was never certainty that we had to learn to live without. It was rather the collective myth that any particular certainty was the source of whatever small security we could muster and the resulting urgency we felt to keep uncertainty at bay.

These realizations took decades to gain a foothold, and in the interim, modern art lost much of its audience. But gradually uncertainty began to seem natural, comfortable and even — properly appreciated — playful as a source of thematic material. We smile with recognition each fall as Charlie Brown kicks off the new season rebelling against the almost perfect certainty that Lucy will snatch away the football at the crucial moment. [19] Our delight reflects the power of almost — of maybe — in our lives. Maybe, this time, the universe will be benevolent. The chances are remote, but they keep Charlie tilting bravely at his windmills.

Probability is the measure of maybe. It is virtually but not perfectly certain that Lucy will, once again, send Charlie flopping head over heels. You expect it. You smile because you know that Charlie is staring down the near certainty of his own humiliation, and you admire him for doing so. He plays the game not because he is good at it, but because he is unwilling to concede the certainty of defeat. Charlie is like us; we are modernists: certainty is boring; it is the enemy of interest, engagement and passion. Oh, we still seek it out, but we have learned that the pleasure is in the chase.

Probability encouraged artists to explore the regions between the known and the unknown, between order and chaos. Some tilted one way, some the other but these regions are the medium of modernist art and, thus, in McLuhanist terms, its message. John Cage, for example, composed his Music of Changes (1951) in which every musical element was determined by a throw of the I Ching. Think of it as order imposed at random. Earle Brown took the idea a step further in December 1952, whose “score” is a page of abstract graphic notation meant to be interpreted musically by performers. Indeterminism was not new to music; Charles Ives had experimented with it as early as 1906 in his seminal work The Unanswered Question. But in Cage, it is the subject as well as the medium of the work. Other composers tilted toward order. The serialists, notably Karlheinz Stockhausen and Pierre Boulez, composed almost by formula but then made a clear separation between composition and performance, often asking the performers to improvise.
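For the flavor of what “order imposed at random” can mean in practice, here is a trivial sketch of my own (in Python, and emphatically not Cage’s actual charts or procedure): a simulated coin-oracle chooses every element of a short melodic line from tables the “composer” has fixed in advance.

    import random

    # A toy gesture toward chance composition, in the spirit (not the letter) of
    # Cage's chance operations: six simulated coin tosses stand in for a hexagram
    # cast, and each cast picks one entry from a fixed table. The tables, the
    # sixteen-note length and the mapping are all invented for illustration.

    PITCHES   = ["C4", "D4", "E4", "F4", "G4", "A4", "B4", "C5", "rest"]
    DURATIONS = [0.25, 0.5, 1.0, 2.0]          # in beats
    DYNAMICS  = ["pp", "p", "mf", "f"]

    def cast():
        """Six coin tosses yield a number from 0 to 63, our stand-in hexagram."""
        return sum(random.randint(0, 1) << i for i in range(6))

    def choose(table):
        """Map a cast onto a table (the modulo mapping is slightly biased;
        this is a toy, not a treatise)."""
        return table[cast() % len(table)]

    for _ in range(16):
        pitch, duration, dynamic = choose(PITCHES), choose(DURATIONS), choose(DYNAMICS)
        print(f"{pitch:>4}  {duration:4} beat(s)  {dynamic}")

The chance is real but so is the order: the tables themselves, and the decision about which choices to surrender to the oracle, remain entirely the composer’s.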

Many of these artists took themselves too seriously, as people with an almost religious mission. They were ultimately relegated to the academy and to the least accessible reaches of critical theory. But others caught the liberating spirit of uncertainty and probability. Alexander Calder reinvented sculpture with mobiles that relied on random breezes to impart movement to them. The art of Calder is not in the randomness but in the control of random elements to produce movement that is both lyrical and slightly magical. In a similar vein, Lawrence Ferlinghetti reinvented the sonnet, that most rigorous of poetic disciplines. He substituted not blank verse but an almost random meter that relies on assonance — In Goya’s greatest scenes we seem to see… — instead of traditional syllabification. In “Coney Island of the Mind,” the lines have anywhere from nine to thirty-four syllables and in the earlier “11” from Pictures of the Gone World, there is one line of 89 words and 120 syllables.

In retrospect, this coming to grips with the uncertain was probably inevitable given our thirst for the gambling proposition, and it is no coincidence that gamblers, not Newtonian physicists, were the first to work out the laws of probability. [20] The Latin word for game is ludus which also has the sense of a trifle, an amusement, hence the English word ludicrous. Probability is an element of all games. It is the essence of roulette. It plays a minor but crucial role in most competitive sports. America loves a winner but only if there is a chance that the best player will lose. Babe Ruth was both the home run king and the strikeout king. You never knew.

Magic exerts a similar pull. You see it, and you know it can’t be true. But for just a moment, the magician forces you to think that, this time, the laws of nature have been repealed. The assistant has been cut in half at the waist and has then been restored. This time, there is no other explanation. Magic does not work for a child too young to know that it cannot be true. And it does not work for the sophisticate who knows how it was done. It can only work for the person capable of gasping, however briefly, at a miracle.

Lastly, we should note the bond between magic and religion. Death is the eternal verity and religion teaches us to die, ameliorating the certainty of death with the possibility of salvation or rebirth. Religious magic — whether sacramental such as transubstantiation or liturgical such as glossolalia — is always a metaphor for immortality. It is very strong magic compounded equally of faith and skepticism, hope and reason, fierce courage and naked fear.

As it turns out, truth is elusive. It yields only tentatively to science, art, religion and philosophy. Its pursuit is our destiny and perhaps also our doom. We eat the apple of the Tree of Knowledge and are cast into a journey whose destination is unknown. Like Plutarch’s boatman, we look astern while rowing forward. If probability is the measure of maybe, then maybe is the measure of our progress, our claim to glory and the source of much of our joy.

Scholarly Notes and Diversions

1. In the spirit of Our Lady’s Juggler, this essay is dedicated to the memory of George M. Glasgow, late Kavanaugh Professor of Speech in Fordham University. Dr. Glasgow was a gentleman, a scholar and a true eccentric who believed that the only decent way to study logic was to cultivate fallacy.

2. Like all good stories, this one is hard to track down. Pete Seeger used it to introduce "Seek and You Shall Find" on Waist Deep in the Big Muddy and Other Love Songs (Columbia Records, CS 9505). He got it from his father, the noted musicologist Charles Seeger. The first half of the story, ending with the line, "This too shall pass," had previously been used by Abraham Lincoln in a speech he gave in Milwaukee in 1859 and the line appears by itself in Nathaniel Hawthorne's 1860 novel, The Marble Faun. As a child, I remember being told that when Caesar paraded in triumph through the streets of Rome, a slave rode with him in the chariot repeatedly whispering in his ear, "Etiam transibit!" This too shall pass! It made an impression.

As long as we're at it, we may as well save a few footnotes by owning up to the other dubious quotations cited in this paper. Einstein probably never put it in exactly those words, but you will find the famous dice metaphor attributed to him in Philipp Frank's 1947 biography and it has been repeated so often that the great man should have said it even if he didn't. Presumably he did say, "Raffiniert ist der Herrgott, aber boshaft ist er nicht," roughly, God is subtle but not malicious — because that line is engraved on a mantelpiece at Princeton University. For years, it escaped me that this is an elegant pun because boshaft, malicious, is almost homophonous with Botschaft which, in some contexts, refers to the gospels.

Similarly, Rabelais, asked on his deathbed where he expected to go, probably did not say, "Je vais chercher un grand peut-être" — I go to seek a great perhaps. Still, it seems perfectly consistent with the character of one of history's greatest cynics.

Those of us who have never imagined being subjected to the tender mercies of the Holy Inquisition need not inquire as to what if anything Galileo might have or should have said. The sotto voce remark attributed to him is probably a pious fraud perpetrated by admirers who worried that he might seem insufficiently defiant.

Finally, it has been nearly 40 years since I encountered Eugene O'Neill's remark about the death of god and the failure of science. I think I remember it being in a letter to George Jean Nathan but I have never been able to find it again and have no reference for it.

3. Modern Times: The World From the Twenties to the Eighties, Harper and Row, 1983, p. 1. Professor Johnson says (p. 3) that the Eddington experiments, "...aroused enormous interest throughout the world. No exercise in scientific verification, before or since, has ever attracted so many headlines..." He is stretching the truth more than a little bit. The New York Times did report the eclipse the next day in an unsigned, three inch dispatch from, of all places, Chicago where it was not visible. It was on Page 6. It did not, then or later that year, mention Eddington, Einstein or relativity but focused on speculation about a "big cloud" on the eastern edge of the sun. The cloud metaphor was simply an inept stab at describing an especially brilliant solar flare. There had been lucid pre-eclipse descriptions of the Eddington experiments in the British journal Nature (Feb. 6, 1919, p. 444 f.) and in a Scientific American column of May 3, 1919 (which, however, failed to mention Professor Eddington). I suspect Johnson was thinking of the renown that relativity did gradually attain as the 1920's progressed. By 1931, Herman Hupfeld could reasonably write the lyric "...we get a trifle weary of Mr. Einstein's theory" for "As Time Goes By," the song made famous 11 years later by Dooley Wilson as Sam in Casablanca. For the record, Eddington took 16 photographs of the eclipse which showed several stars that were not where they should have been because their light was bent by the sun’s gravity. The “bendability” of light was one of the most important implications of the General Theory of Relativity.

4. This is a fascinating story. Galileo believed — wrongly I think — that the book of nature is written in mathematical characters (see his letter Il Saggiatore of 1623). The Church, on the other hand, believed — rightly in my view — that mathematics is a self-referential system that does not bear on physical reality except incidentally and fortuitously. Mathematics can describe certain aspects of reality but it is purely a figment of the human creative imagination. The numbers are no more the reality than notes on a staff are the music or the word is the thing. Please bear this in mind as we discuss “reality” as it appears to the photon. Time and dimension are measurements of human perceptions. They do not compute for photons.

5. Even the church found science hard to resist and for every scientist it persecuted there were a dozen who enjoyed its patronage. From Roger Bacon to Gregor Mendel, an impressive number of scientists were monks and science itself often masqueraded as natural philosophy. They got away with it by the simple expedient of publishing in Latin which constituted no threat to the faith of the masses.

6. This, of course, is the sixth day of creation (in the Julian calendar) according to Bishop Ussher's calculation which is still found in the marginal notes of many bibles. James Ussher (1581-1656) was the (Anglican) Archbishop of Armagh from 1625 and Primate of Ireland from 1634 to 1640. He was an accomplished scholar but his dating of the creation, "...could not have been more wrong" in the view of Steve Gould. (See Gould, Stephen Jay, "Fall in the House of Ussher," Natural History, November, 1991, p. 12 ff, reprinted in Eight Little Piggies, Norton, 1993.) Still, Gould finds it, "...an honorable effort for its time" and urges us not to ridicule it. Well, of course not. Still, it is hard to resist having a bit of fun at the expense of an Irish Calvinist whose view of biblical inerrancy was considerably to the right of Luther. I suspect anyone with any sense of metaphor would not devote 2,000 pages of dense Latin prose to deducing a date for the creation. And life is much too short to spend a lot of time worrying about the sensibilities of anyone without a sense of metaphor.

7. Fundamentalism refers essentially to the belief that the Bible is the inspired word of God and, therefore, contains no error. As such, it has been an important issue in Christianity since the Reformation. But modern fundamentalism is a direct outgrowth of the debate over evolution. It emerged gradually from about 1890, became an institutional force in the United States in 1919, and was elevated to the status of dogma in a coup engineered by W.A. Criswell at the Southern Baptist Convention of 1979. Harold Bloom, no friend of the neo-fundamentalists (he calls them Know-Nothings only because, “…Fascist has never domesticated itself as an American term,” whatever that means), offers an incisive understanding of their position. In their quest for bedrock certainty, the neo-fundamentalists have proposed not only that the Bible is free of error but that inerrancy is the prime meaning of the Bible. “Neo-fundamentalists want a densely substantial inerrancy, a truth beyond language, beyond ambiguity, beyond any possibility of refutation” (The American Religion, Simon and Schuster, 1992, p. 230).

8. It is often said that evolution is only a theory, that no one has ever seen one thing evolve into another in nature. Such people know very little about orchids.

9. This unfortunate phrase was coined by one of Darwin's great explicators, Herbert Spencer, but Darwin embraced it immediately much to the regret of generations of his defenders. In truth what is interesting about evolution is that unfit variations survive against the day that they might be needed. Of course Darwin had no way of knowing this.

10. My mother always said, “What’s natural isn’t wonderful” by which I think she meant that values such as beauty and intricacy are not mysterious even if they are hard to explain. The incidence of albinism in the population is about 1 in 17,000. That of hemophilia B is 1 in 40,000. But the gene for hemophilia is actually expressed much more frequently — 1 in 7,500 live male births. Sadly, hemophilia is often fatal while albinism is not. The persistence of hemophilia in the gene pool is hard to imagine in terms of selective advantage but there it is against the day when it might be needed.

11. This is not quite right because it takes many such mutations to produce a noticeable effect. This was the central point of the 1966 Wistar Symposium Mathematical Challenges to the Neo-Darwinian Interpretation of Evolution (Wistar Monograph No. 5, 1967) in which the mathematicians argued that there has not been enough time for one form of hemoglobin to evolve into another, just one of millions of changes needed for an ape to become a human. The great Neo-Darwinian Ernst Mayr responded, in essence, “Yet evolve it certainly has.” Shades of Galileo! Both sides, of course, were right. Evolution has occurred. That is a fact. We do not yet fully understand how it happened, never mind why. Probably we never will.

12. Fundamentalists think the fossil record is the spawn of Satan but orthodox Christians have never flatly condemned evolution even if they often wished it would go away. On October 23, 1996, Pope John Paul II published a formal statement endorsing the theory of evolution "...as more than just a hypothesis." Several months later, Steve Gould (Note 6) devoted his This View of Life column to the pope’s statement, noting its conformance with his own view that “…evolution is both true and entirely compatible with Christian belief.” The entire Curia, I am sure, breathed a sigh of relief. Meanwhile, I was the only one who noticed that the pope had chosen to publish his address on the 5,999th anniversary of the Creation.

13. Quantum weirdness serves our upscale media in much the same way that celebrity gossip provides fodder for supermarket tabloids. The Times, for one, cannot resist paradoxical headlines. For example, some years ago (September 16, 1997, Page F5), it reported on a series of experiments in which two colliding beams of radiation were used to produce an electron and a positron. The results, according to The Times, confirmed a longstanding prediction, “…that light beams colliding with each other can goad the empty vacuum into creating something out of nothing.” This sentence betrays a very profound ignorance. First, one cannot goad a vacuum into doing anything. Second, the notion of an empty vacuum implies that the author thinks there may be some other kind. But what is really disturbing is the idea of creating something out of nothing. That is not what happened. Energy got converted into matter, just as Einstein said in 1905 and just as the longstanding predictions predicted. The author of the article in question, Malcolm W. Browne, is not himself profoundly ignorant. Rather he is the victim of a journalistic culture that values the spectacular claim over accuracy and that routinely indulges in sloppy editing.
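The bookkeeping behind "energy got converted into matter" is easy to check on the back of an envelope. The following sketch (mine, and only arithmetic on Einstein's 1905 formula, not a description of the 1997 experiment itself) computes the minimum energy the colliding beams must supply to pay for an electron-positron pair:

    # Back-of-the-envelope arithmetic on E = mc^2: creating an electron and a
    # positron requires at least the rest energy of the pair. Constants rounded.
    m_e = 9.109e-31        # electron mass, kilograms
    c   = 2.998e8          # speed of light, meters per second
    eV  = 1.602e-19        # joules per electron-volt

    rest_energy_J   = m_e * c ** 2
    rest_energy_MeV = rest_energy_J / eV / 1e6

    print(f"one electron:   {rest_energy_J:.3e} J, about {rest_energy_MeV:.3f} MeV")
    print(f"pair threshold: about {2 * rest_energy_MeV:.3f} MeV")

Nothing comes from nothing: roughly a million electron-volts of perfectly ordinary energy goes in, and two very small somethings come out.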

14. In case you’ve forgotten, this is from The Space Child’s Mother Goose by Frederick Winsor, Simon and Schuster, 1958.

15. What can you expect from a Jesuit-educated writer? To me, nihilism refers not to an obscure political/philosophical movement that arose in Russia in the 1850’s but to a strain of cynicism about reality that infects numerous intellectual fashions of the last century and a half. There is a certain kind of academic to whom the great issues of 2,000 years of Western philosophy are trivial. Meaning itself is a bourgeois affectation. Deconstructionism, for example, postulates that all texts undermine themselves because meaning can never be fixed by any human attempt to represent the world in words. It follows from this view that the "meaning" of a text bears only an accidental relationship to the author's conscious intentions. Thus, trying to understand the last two sentences is futile.

16. Sartre’s politics were more significant than his philosophy. He was a committed Stalinist until the Soviet invasion of Hungary in 1956. He then became a Maoist, making the shift from the diabolical to the merely insane with ease and a certain amount of grace. He ultimately denounced Being and Nothingness as the product of “aristocratic idealism” but it was a strange renunciation. It seems to have been the book itself or maybe the process of writing it that was being rejected, not its contents.

17. "Fountain" by Marcel Duchamp was completed in 1917, two years before the modern world began which argues not that art caused uncertainty but that uncertainty fell on fertile soil. When he submitted it pseudonymously to the avant garde Society of Independent Artists for exhibition, it was rejected.

18. By the early 1960's, it had become evident that literary criticism in particular had gone off the deep end. There were so many isms afloat that they became immune even to parody. One disgruntled Berkeley professor produced a casebook containing an even dozen fictitious critical appraisals of Winnie-the-Pooh only to have it taken seriously (See: Crews, Frederick C., The Pooh Perplex, E.P. Dutton and Co., 1965). One of his essays, for example, postulates that the "...stories in the book are really dictated by compromises between the edificatory wishes of the Milnean voice and the self-indulging wishes of the Christophoric ear." And this was before the advent of deconstructionism which killed literature entirely in a chamber of toxic verbiage.

19. Charlie Brown is the Existential Christian, a modernist version of John Bunyan's Pilgrim seeking the way, the light and the truth in this Vale of Humiliation. Lucy, of course, is the devil — not the medieval Prince of Darkness, but the boulevardier of heaven we meet in Job and as Sportin' Life in Porgy and Bess. Snoopy, naturally, is God, again, not the fierce God of Genesis but the somewhat less engaged version you run into in mainline Protestant churches. (Readers suspecting the writer of making all this up are invited to consult The Gospel According To Peanuts by Robert L. Short, John Knox Press, 1964. Those who think it blasphemous should consider that it could be worse. In L. Frank Baum's version of the story, The Wizard of Oz is God and he turns out to be a frightened little nerd with a big megaphone. In The Book of J, Harold Bloom casts Yahweh as a sexually frustrated Jewish-American prince.)

20. Perhaps “gamblers” is not completely accurate to describe the professions of Gerolamo Cardano (1501-1576), a physician, Pierre de Fermat (1601-1665), an attorney, and Blaise Pascal (1623-1662), a mathematician. Each made multiple contributions to mathematics and none can be said to have specialized in probability. Rather, all were attracted to it through a shared interest in games of chance.