Tuesday, November 10, 2020

 

BY THE NUMBERS

 

Jerry Harkins

 

On March 21, 1927, Werner Heisenberg published an early version of a paper with the unpromising title “On the Perceptual Content of Quantum Theoretical Kinematics and Mechanics.”  Seven months later, the basic idea expressed in that paper was the subject of an intense discussion among 29 of the world’s most brilliant physicists who ultimately agreed that what came to be known as the Heisenberg Uncertainty Principle stood to revolutionize theoretical physics.  It did much more.

The Uncertainty Principle holds that the precise position and momentum of a subatomic particle can never be known simultaneously.  Any attempt to measure them both at a single moment is subject to a combined error of at least Planck’s Constant, h, divided by 4π.  This is a vanishingly small number.[1]  But in the world of the subatomic particles of which the universe is made, it is very significant.  Moreover, it became a philosophical issue well beyond the realm of theoretical physics.  Artists and academics in many disciplines came to think that it implied that at the core of reality there is an important, inherent and invincible element of ignorance.[2]  Coupled with Einstein’s General Theory of Relativity, it profoundly shaped our understanding of reality in a way that was not flattering.
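In modern notation (the sharpened form later proved by E. H. Kennard, given here as standard textbook background rather than anything original to this essay), the tradeoff reads:

```latex
\sigma_x \,\sigma_p \;\ge\; \frac{\hbar}{2} \;=\; \frac{h}{4\pi},
\qquad \hbar \equiv \frac{h}{2\pi}
```

where σ<sub>x</sub> and σ<sub>p</sub> are the statistical spreads of repeated position and momentum measurements.  The bound caps how sharp both can be at once, no matter how good the instruments are.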

Throughout our history, ignorance and its alter ego, curiosity, have inspired both fear and attraction like the moth and the flame.  But Uncertainty raised the ante.  In essence it claimed that ignorance is, ultimately, at the heart of all knowledge.  It is King Solomon’s vanity of vanities, Plato’s Cave, Sartre’s nothingness, Kurt Gödel’s Incompleteness Theorems and E.E. Cummings’ “no of all nothing” written large.  It seems to alienate us from our history and culture and from every human construct of art, science, philosophy and religion.  If you can never know anything for sure, reality may be only a dream of which we are merely an illusory part.  As King Solomon laments:

 I applied my mind to study and to explore by wisdom all that is done under the heavens. What a heavy burden God has laid on mankind.  I have seen all the things that are done under the sun; all of them are meaningless, a chasing after the wind.

Uncertainty bothered Einstein,[3] who challenged it at the Fifth Solvay Physics Conference with one of his famous “thought experiments.”  At one point he went to the blackboard to correct what he thought was an error in Heisenberg’s equations but wound up devising a mathematical proof of the theory, which carried the day.[4]  Perhaps without realizing it, he also highlighted the fact that Uncertainty and quantum mechanics itself were not so much laws of physics as laws of mathematics.  Subsequent history has witnessed a similar phenomenon in every branch of physics and in several of the social sciences, such as anthropology and psychology.  Economics and finance have been dominated by mathematical modeling for many decades.  Practitioners of both hard and soft sciences have been drawn to mathematical analysis, some for its supposed precision and others for its ability to deal with imprecision.  This represents the flowering of a discussion that goes back to the debates of the medieval Arabic philosophers who invented (or discovered) many of the foundational methods of mathematical analysis.

A good example of these debates was the trial of Galileo Galilei for heresy by the Holy Inquisition in 1633.  Like most stories from history, this one is usually oversimplified as merely a contest between science and religion.  More fundamentally, it hinged on what the principals thought about the nature of mathematics.  Galileo believed that mathematics is not merely a tool useful for describing nature but is actually the ultimate reality, or at least as close as we can come to it.  It can be thought of as the Platonic pure idea behind all other pure ideas.  His chief antagonist in the earlier stages of the affair, Cardinal (later Saint) Robert Bellarmine, took an almost opposite position.  To him, mathematics was a human artifact, a tool useful for measuring certain aspects of things but always with the caveat that the Bible was the repository of all truth as revealed by the creator of all truth.  The Bible, of course, seemed to teach that the sun revolves around the earth.  To the modern reader, this may seem primitive but, in 1633, most people still relied on the common sense observation of the sun “rising” in the east and “setting” in the west.  Bellarmine knew that Galileo might prove to be right and worried about how theology would cope with the consequences.

The great German mathematician Carl Friedrich Gauss referred to mathematics as “the Queen of the Sciences,” by which I think he meant that science could not exist without mathematics.  Gauss was also one of the Enlightenment thinkers who laid the groundwork for modern probability theory and statistical analysis which, of necessity, became the principal mathematical tool for quantum theory.  If the Uncertainty Principle is correct, then the best we can do is describe the probability that an event is occurring, has occurred or will occur or, in other words, to assert or predict reality.  So, in a sense, probability is reality.  To the poor benighted creatures who are restricted exclusively to the world of their senses, this proposition is absurd.  An apple released from its tree will fall down with a probability of 1.  It will not ever fall up.  Moreover, if it is subject only to gravity, its velocity will increase by about 32 feet per second with each succeeding second, so that by the end of the second second it will be moving at 64 feet per second and will have fallen 64 feet.  In exactly the same way, so will a cannon ball and so will a feather.  Of course, all these objects will be subject to more than gravity, for example, to air resistance, which will affect them differently.
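The apple’s arithmetic can be checked in a few lines.  The sketch below is illustrative only (the function names are mine, not the essay’s); it applies the standard constant-acceleration formulas v = gt and d = ½gt² with g ≈ 32 ft/s²:

```python
# Constant-acceleration free fall from rest, ignoring air resistance.
# v = g * t          (speed after t seconds)
# d = (1/2) * g * t**2   (distance fallen after t seconds)

G_FT_PER_S2 = 32.0  # approximate gravitational acceleration near Earth's surface

def fall_speed(t_seconds: float) -> float:
    """Speed in feet per second after t seconds of free fall from rest."""
    return G_FT_PER_S2 * t_seconds

def fall_distance(t_seconds: float) -> float:
    """Distance in feet fallen from rest in t seconds."""
    return 0.5 * G_FT_PER_S2 * t_seconds ** 2

for t in (1, 2, 3):
    print(f"t = {t} s: speed = {fall_speed(t):.0f} ft/s, "
          f"distance = {fall_distance(t):.0f} ft")
```

At t = 2 the object is moving at 64 ft/s and has fallen 64 ft, and the same numbers hold for the cannon ball and the feather in a vacuum.  It is exactly this kind of deterministic certainty that the probabilistic description of quantum mechanics replaces.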

Now you will be forgiven if you think the law of acceleration is every bit as weird as the notion that two particles can communicate with and affect each other instantly (that is, infinitely faster than the speed of light) even if they are at opposite ends of the universe.  But you can verify acceleration, while the kinetic behavior of particles cannot be seen or heard but must be inferred from indirect evidence.  You can’t see a Higgs boson.  Peter Higgs couldn’t either, but in 2012 the ATLAS and CMS experiments at CERN detected the trail it leaves as it decays.  That trail had been predicted mathematically.  So the boson itself is “real” only at two removes, the trail and, ultimately, the mathematics that predicted the trail.

In the not-too-distant past, physics was a branch of metaphysics but it has now evolved almost entirely into a branch of mathematics, which is why the default photograph (or cartoon) of a physicist shows him or her standing by a blackboard filled with abstruse formulas.  It has also extended its compass to include phenomena that lie outside the sensory universe, the realm of traditional logic and even conventional language.  It traffics in theories that are incomprehensible to non-specialists because ordinary language is not suitable for describing them.  Mathematics works because it is the language of abstraction.  Like language, every kind of math is a system for combining symbols.  But while the word “porridge” symbolizes a nutritious dish, you cannot eat the word.  Of course, you can’t eat a scientific theory either but, increasingly, you can’t even conceptualize it unless you understand its language.  Once more, you are left with the brilliant but infuriating insight of Marshall McLuhan, “The medium is the message.”



Notes

 

[1] Planck’s Constant is 6.62607015 × 10⁻³⁴ joule-seconds.  A joule-second is the product of energy and time; one joule is the energy delivered by one watt in one second.  Pi, of course, is the ratio of a circle’s circumference to its diameter.  It is an irrational number commonly approximated as 3.14 or 22/7.

 

[2] The influence of Uncertainty on a wide range of artists is discussed in two earlier essays, “The Gentle Joys of Maybe” and “The Gnostic Glow,” both of which can be accessed at my blog, Jerry’s Follies.

 

[3] Einstein was strongly committed to the idea that nature is, in principle, fully knowable.  The line usually attributed to him, that God does not play dice with the universe, is an English paraphrase of a remark in his 1926 letter to Max Born, and he also said, “Raffiniert ist der Herrgott, aber boshaft ist er nicht” (“Subtle is the Lord, but malicious he is not”).

 

[4] As a statistician, I am hugely attracted to the Uncertainty Principle but, lurking in the back of my mind, is the heretical suspicion that it may not be true.  I believe I could devise a thought experiment in which a particle at rest is energized in a collider.  The collision would simultaneously activate a double switch that instantly measures both the position and the momentum of the target particle at the moment it moves.  Of course, the clinker in this fantasy is hidden in the word “instantly.”  But “almost instantly” would at least reduce the error so that it might ultimately be less than Planck’s Constant.  As the instruments got closer and closer to instant, the error might become negligible.  (Another problem with my fantasy is the notion of a particle at rest, which may be a state antithetical to the particle’s nature.  True, Hans Dehmelt and Wolfgang Paul shared the Nobel Prize in 1989 for developing the ion traps that can hold a single electron in place for months, but there is no evidence the electron enjoyed being held.)