ON BECOMING DIGITAL THINKERS
Jerry Harkins*
On the morning of April 6, 1964, I presented a carefully
researched, closely reasoned report on a question posed by the Head of the
Department of Mechanical and Technical Equipment of the United States Army Engineer
School at Fort Belvoir, Virginia.
Three months earlier, he had asked whether the military field computers
of the future would employ analog or digital technology. Now I was addressing him and his top
aides, armed with a 3,000-word text and a deck of overhead projector
slides. I had interviewed experts
and learned about the kinds of decisions made by battlefield commanders. I drew up a list of the kinds and
sources of the information they needed and their special requirements for
portability, ruggedness, reliability, and practicality. The take-away was that analog computing
was likely to be the dominant technology for the foreseeable future.
Twenty-four hours later, IBM introduced System/360, thereby rendering my analysis obsolete. Luckily, the Colonel was an engineer
with a sense of humor. He recited
the immortal words of Robert Burns: “The best laid schemes o’ Mice an’ Men / Gang aft agley,
/ An’ lea’e us nought but grief an’ pain, / For promis’d joy!”
Not that anyone could operate a 360 or any other digital machine outside a large air-conditioned environment or, especially, inside a Vietnamese rice paddy. But as the first compatible family of general purpose solid state computers, it pointed the way to the future of everything. Apple’s handheld smartphone is much faster and has far more computing power than the 360 did but is nonetheless its direct descendant.
For IBM the 360 represented the third generation of digital computing, a
technology that had evolved in the 1930’s and that was represented in the
popular press by ENIAC, developed for the Army and introduced in 1946. Designed by scientists at the University of Pennsylvania, ENIAC weighed 30 tons and employed 17,468 vacuum tubes. Among its successors was MANIAC, designed at Los Alamos and introduced in 1952.
Digital technology works by reducing everything to
discrete bits and bytes and using its vast power and speed to bull through them
to a solution. “Discrete” is the
key word. Every part of every
problem is represented by a series of 1’s and 0’s corresponding to the on and
off conditions of solid state devices such as transistors. This is the meaning of the term
“binary,” the either/or condition in which there is nothing in between. The world, or at least the world we
perceive with our unaided senses, is not like that. There are (or at least our brains think there are) infinite
shades of gray, spectra of color, nuances of meaning, subtleties of expression
and gradations of touch, taste, sight and sound. The digital world can simulate these seamless transitions
but cannot duplicate them. A common example is digitized music, which can reproduce a performance with impressive exactitude. Still, an experienced
listener cannot ultimately be fooled. Of
course, the same listener would not be fooled into thinking an analog recording
was anything but a recording, in part because the old technology introduced
extraneous noise. Still, the sound
of vinyl is often heard as more “natural” and more “human.” Digital sound is often perceived as too
perfect, too cold, too engineered.
No one has ever heard anything in the real world quite so detached.
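For readers who want to see what “reducing everything to discrete bits” means in practice, here is a minimal sketch in Python. It is purely illustrative (the sample rate and bit depth are arbitrary choices of mine, far cruder than any real recording), but it shows how a continuous wave is replaced by a ladder of discrete values that can only approximate the original.

```python
import math

SAMPLE_RATE = 8        # samples per second -- deliberately tiny for illustration
BIT_DEPTH = 3          # 2**3 = 8 possible levels; an audio CD uses 16 bits (65,536 levels)
LEVELS = 2 ** BIT_DEPTH

def sample_sine(frequency_hz, duration_s):
    """Measure a continuous sine wave only at discrete instants in time."""
    count = int(SAMPLE_RATE * duration_s)
    return [math.sin(2 * math.pi * frequency_hz * t / SAMPLE_RATE)
            for t in range(count)]

def quantize(value):
    """Snap a value in [-1, 1] to the nearest of a fixed number of levels."""
    step = 2 / (LEVELS - 1)
    return round((value + 1) / step) * step - 1

continuous = sample_sine(1, 1)               # one second of a 1 Hz wave
digital = [quantize(v) for v in continuous]  # what actually gets stored as 1's and 0's

for original, stored in zip(continuous, digital):
    print(f"{original:+.4f}  ->  {stored:+.4f}")
```

Raise the sample rate and the bit depth far enough and the steps become too fine for the ear to resolve, but they are steps all the same.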
Before proceeding, we must pause to recognize quantum theory and
its offspring which suggest that in the subatomic world matter and energy do
indeed exist in discrete packets.
These exotic phenomena are the true building blocks of the sensible
universe. Thus, elegance seems to
demand that all creation must be discrete at heart. If so, it would follow that reality can be better
represented by digital than analog.
At least if digital is taken as discrete or discontinuous. A black and white photograph made on film is
composed not of infinite gradations of gray but of eleven discrete tones. Our eyes are capable of much finer
discriminations—a visit to any paint store will confirm there are more than fifty
shades of gray—but the number is not infinite. No matter, that is not the way we perceive things. Our minds fill in the gradations. Hamlet says, “…there
is nothing either good or bad, but thinking makes it so.” In a similar manner, there is nothing
digital or analog but perception makes it so. And from the time we came down out of the trees, our senses
have suggested that, for the most part, the world is continuous or analog.
True binaries or opposites like “on” and “off” do exist: positive and negative in the algebraic
sense, up and down in the physical sense, absent and present, yes and no,
clockwise and counterclockwise, left and right. But there is a rich class of antonyms that constitute more
complex dualities, things that are not absolute opposites but are contrasting
in varying degrees. The known and
the unknown. Good and evil. Order and chaos. Male and female. Long term and short term. Individual and community. Self and other. In each case, there is an element of
tension, vigorous or feeble, between the members of the pair. Good and evil are certainly opposed but
there are gradations of both. Not
only are these dualities not opposites but, in some cases, they depend on
each other. “Male” is without
meaning except in relation to “female.”
Yet both exist independently.
And both manifest themselves across a single, fairly wide spectrum of
gender characteristics. The common
requirement to check a box for “male” or “female” is almost without meaning. Facebook provides 58 choices for its
members, including “neither,” “other” and “pan.” If none of these works, you can provide your own description. Imagine Plato
attempting to discuss the pure idea of “male.” Anything he might say must ultimately include reference to
“female.” Like yin and yang, each is half of an idea yet both can stand alone.
Rather than think of such ideas as “opposites,” I prefer to call them
“complementarities.” They are
fundamentals of the human experience.
To take another example, very early in life we come to the realization
of “self” and “other.” The
recognition is simultaneous; “self” implies “other” and vice versa. Except for people with multiple
personality disorder, the distinction between self and other is
absolute—binary—but they are not opposites. “We” addresses a community made up of multiple “self” and
“other” pairs. As we mature, we learn
to subdivide “self” into “me” and “mine” which are different but again not
opposite.
The real world of our senses is complex, ambiguous and
subtle. An analog computer can
target artillery fire very accurately and fast enough for most situations. But the genius of digital technology is
to make it seem simple while doing so with much greater precision. Deep Blue is
an IBM supercomputer that can play chess at the grandmaster level and Watson is
an even more advanced breed that can beat the best Jeopardy players in the
world. But, in spite of decades of
development, neither can carry on a real conversation with a human being. You can ask Apple’s Siri for restaurant
recommendations. She will make
reservations for you and call a taxi to get you there. But don’t ask her to discuss the
differences between Beaujolais and Burgundy or even choose between Pinot gris
and Pinot noir.
In 1950, Alan Turing published a paper purporting to answer the
question “Can machines think?” By
the criteria he established for the famous Turing Test, his answer was that there was no reason to suppose they would not someday do so. In the mid-1960s, researchers at MIT developed ELIZA, an artificial intelligence program that simulated psychotherapy
sessions between a computer/therapist and a human patient. The developers knew it was therapeutically
primitive but were surprised by how readily it was accepted by its human
users. It seemed to pass the
Turing Test even if many felt it was due to the simplicity of the test rather
than the sophistication of the software.
Today’s “chatterbots” are less ambitious but much improved in terms of
conversational realism. Still, anyone
who has ever used a computerized Help Line knows they are sometimes efficient
but often hateful substitutes for knowledgeable humans.
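What made ELIZA convincing was not understanding but simple keyword matching and canned reflection. The sketch below, in Python, is written in that spirit; the handful of rules are my own invention for illustration, not the original program’s script.

```python
import re

# A tiny, invented rule set in the spirit of ELIZA: keyword pattern -> canned reply.
RULES = [
    (r"\bI am (.+)", "How long have you been {0}?"),
    (r"\bI feel (.+)", "Why do you feel {0}?"),
    (r"\bmy (mother|father|family)\b", "Tell me more about your {0}."),
    (r"\bbecause\b", "Is that the real reason?"),
]
DEFAULT_REPLY = "Please go on."

def respond(statement):
    """Return a reflection of the first matching pattern, or a stock phrase."""
    for pattern, template in RULES:
        match = re.search(pattern, statement, re.IGNORECASE)
        if match:
            return template.format(*match.groups())
    return DEFAULT_REPLY

if __name__ == "__main__":
    print(respond("I am unhappy about my job"))  # How long have you been unhappy about my job?
    print(respond("Because nobody listens"))     # Is that the real reason?
    print(respond("The weather is fine"))        # Please go on.
```

The real program also swapped pronouns so that “my” came back as “your,” but the principle is the same: pattern, not comprehension.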
The question is still being asked as to whether digital computers
will ever reach a stage where they can realistically mimic the human
brain. It would be foolhardy to
suggest such progress is impossible.
However, if it ever does come about, the computer in question will
probably employ some hybrid technology.
Not to worry. At present,
such verisimilitude seems unlikely for economic as well as technical
reasons. For one thing, the
product development emphasis remains on traditional device speed and capacity, both of which put a premium on data compression. For example, when music or video is streamed, the result is
lower fidelity than that delivered by CD’s or DVD’s. Each successive generation of Apple’s iTunes application has
delivered slightly lower fidelity, which most users either don’t notice or choose to ignore. The quality of digital photographs
taken by smart phones has increased with each generation of the technology but
is still lower than that of either conventional film or high end digital technology. Cost and convenience are dramatically
improved and even users who notice the decline in quality are happy to make the
compromise. Digital material is
also easier to manipulate. The old
saw, “A photograph doesn’t lie,” was never entirely true, given the ability of
darkroom technicians and airbrush artists to modify what the camera saw. But programs like Photoshop make it
easy to alter an image completely.
Whether the intent is benign or malicious, this level of control alters
the nature of truth itself. A viewer
cannot know anything about what the camera actually saw or how the photographer
felt about it.
This is an ancient problem. Writing and printing, music,
photography and other “extensions” of human senses (to borrow Marshall
McLuhan’s locution) are all languages.
The great semanticists of the 1930’s knew that language acts to
constrict thought in order to serve the interests of communication. Poetry, for example, is a compromise
between logic and emotion effected through the intermediation of vocabulary,
grammar and other conventions. It
proceeds from mental imagery we refer to with words like inspiration, intuition
and imagination, processes that cannot be quantified. Moreover, every language treats the constriction process
differently so that the architecture of a language accounts for a significant
part of the world’s cultural diversity.
Language is the central tool of knowledge but it
also burdens us with an unavoidable problem. Except for the possibility of telepathy, there is no way for
people to communicate anything other than the most basic ideas without using
twice-translated symbols. A
speaker or writer chooses a word to convey an idea that is not a word and the
listener or reader translates that symbol into his or her own mental
construct. Both run the risk of
treating the word as though it is the
idea it refers to. But language is really just an undisciplined coding system that is more or less analogous to the
way we think.
Digital language is altering culture not only
because numbers are more abstract than words and binary numbers are more
abstract than continuous numbers, but also because it is insidious. The 1’s and 0’s are hidden from view and
the vast majority of people are unaware of them. To a noticeable extent, it has already made us digital
thinkers. As a society, we
have become data-driven. We assess
our scholarship by counting the number of times our publications are cited in
other publications, our visual culture by the prices investors pay for art at
auction and our spirituality by the frequency of church attendance. We follow opinion polls assiduously at
a time when societal dispositions are conspiring to significantly reduce their
validity and reliability. We
reduce a sport like baseball to a vast number of sums and ratios and believe
religiously that the data can help us predict the future. But such numbers are merely imperfect
surrogates for the things they purport to measure. They are never the referents themselves.
The idea that Galileo’s dispute with the Holy
Inquisition was about heliocentricity is simplistic at best. It was rather a debate about the nature
of reality. Galileo, of course,
was right about the earth’s revolution around the sun because he had observed and
measured it. But he was wrong in
believing that the laws of nature are written in mathematics or, if not wrong,
he was merely expressing an ingenious but unsupportable opinion. To me, mathematics is a human
construct. It is often quite useful, but pi is only a ratio. Indeed, the “laws of nature” are
nothing more than artifacts. There
is nothing to suggest that “life” in some parallel universe must be
carbon-based or that the hare will always outrun the tortoise. Pythagoras was able to prove that the
expression:
A² + B² = C²
defines the relationship between the hypotenuse
and the legs of a right triangle. His
discovery turned out to be one of the most useful in the history of science. But it is not a right triangle. It derives from one aspect of the
triangle. Similarly, 261.625565
hertz describes one important aspect of Middle C but is not music.
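Both of those numbers can be verified with a few lines of arithmetic. The sketch below assumes the usual conventions, a 3-4-5 right triangle for Pythagoras and twelve-tone equal temperament tuned to A = 440 Hz for Middle C, which is presumably where the figure of 261.625565 hertz comes from.

```python
import math

# Pythagoras: for legs A and B of a right triangle, A**2 + B**2 == C**2.
a, b = 3.0, 4.0
c = math.sqrt(a**2 + b**2)
print(c)                   # 5.0 -- the relationship, not the triangle itself

# Middle C in twelve-tone equal temperament: nine semitones below A = 440 Hz.
middle_c = 440.0 * 2 ** (-9 / 12)
print(round(middle_c, 6))  # 261.625565 -- a frequency, not music
```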
The human race has turned a corner in its evolution. Until recently, energy was what Daniel
Bell called the “strategic and transforming resource” of the developed
world. It was energy that, when
harnessed and applied to raw materials by technology, added value to them. If you date the industrial revolution
from James Watt’s first practical steam engine in 1776, it lasted about 200
years. Today energy has been
displaced as the great value adder by information harnessed and applied by digital
technology. The information
revolution is different from anything that preceded it. For one thing, information is the only
resource that cannot be depleted.
The more we use it, the more it grows and the more valuable it
becomes. For another, it is
ubiquitous, instantly available to anyone anywhere and acting as though it
wants to be free. For better and
sometimes for worse, it both enables and compels the global economy.
In 1964, I had a conversation with a scholarly officer
at the Command and General Staff College at Fort Leavenworth, Kansas. He showed me a sandbox configured to
display the terrain and troop dispositions around Cemetery Ridge near
Gettysburg, Pennsylvania on the morning of July 3, 1863. Twenty or so students were supposed to
re-think Lee’s strategy and revise Pickett’s tactics accordingly. The Army wanted to computerize the
exercise so that each student’s responses could be critiqued in real time. Given the sheer number of variables and
the fluidity of the situation, it would be hard to imagine a more complex
problem whether it was considered from the perspective of a commander on the
ground or a computer programmer in a software laboratory. The textbook answers had failed Lee,
Longstreet and Pickett. Although
the Confederate attackers outnumbered the Union defenders by more than 2 to 1,
their artillery could not see the enemy through the smoke and their infantry
occupied the lowest part of an undulating field. The software was supposed to assess each decision every
student made and to display the results in the same amount of time it would have taken for the outcome to become clear to the battlefield commanders in 1863. Today’s most sophisticated computer
games are nowhere near as complex and I am not sure today’s digital computers
could handle the challenge. Given
that feedback time was meant to correspond to real time, an analog system would
have a better chance, especially if a student came up with a brilliant new
solution. The analog machine
“thinks” more like a creative human being.
It reassures me to believe that neither technology
could whisper the only good solution in Lee’s ear: Don’t do this;
get the hell out of here.
Brave new world indeed! There is an important advantage to be gained by thinking
digitally or numerically: it may nudge us away from the scandalous
scientific illiteracy that besets our society. If, as we think, language is the springboard of culture,
constant exposure to the language of science is sure to ease our anxieties about
it. The caveat is we must
recognize that science, like every other approach to knowledge, employs a
metaphoric language. For all
our history we have thought, reasoned, imagined and even dreamed in metaphors. Dealing with their imprecision, their
fuzziness, has served us well. It
is important that we disconnect our young people from their smart phones long
enough to teach them to use the brains they were born with.
_______________
*Jerry Harkins is a statistician who, long ago, spent
two years as a member of the Army Corps of Engineers rising to the rank of
Specialist Fourth Class, the same pay grade as a corporal. In addition to regular turns at KP and
guard duty, he was a member of a small unit of displaced draftee academics that
undertook special projects related mostly to automated learning systems. He is well aware of the epistemological
problem posed by quantum weirdness even if he remains stubbornly unable to
understand anything else about it.