Sunday, July 19, 2015


TAKING STOCK:  THE MADNESS OF MARKETS

Jerry Harkins

“The current crisis has demonstrated that neither bank regulators, nor anyone else, can consistently and accurately forecast whether, for example, subprime mortgages will turn toxic.”
                     —Alan Greenspan, March 10, 2010


Poppycock!  Total, utter nonsense!  The truth is that it is in the nature of bubbles to burst.  The truth is that you can’t make silk purses out of sows’ ears.  The truth is that anyone who tries to slice and dice subprime mortgages—or subprime anything else—and thereby turn them into investment grade securities needs his or her head examined.  The run-up to the crash of 2008 was an obvious bubble and the subsequent meltdown was entirely predictable.  Its ramifications were near catastrophic and are still being felt around the world.

The markets for stocks, bonds, commodities and other asset classes play several crucial roles in all capitalist societies.  Economically, they provide the capital companies and governments need to promote innovation and efficiency.  Socially, they provide the basis for virtually all savings, especially those meant to support major family expenditures such as housing, education and retirement.  They act as the principal means of encouraging sustainability in terms of resource utilization and other social goals.  By their very nature, they act as insurance that the long term interests of both the economy and the society are taken into consideration in the decision-making process.  Or they should do all these things.  But the crash of 2008 was compelling evidence that these functions were being jettisoned in favor of the extremely dangerous idea that markets are nothing more than ultra short term, high stakes casinos.

There had been early warning signs.  Amidst the euphoria characteristic of the markets around the turn of the century, there were some naysayers, among them the executives of Procter and Gamble who saw portents of stormy weather.  In early 2000, P and G had been looking forward to profit growth of 13% above 1999.  That would have been spectacular performance but on the morning of Tuesday, March 7, 2000 the company released a statement saying that earnings for fiscal year 2000 would likely be only 7% higher than 1999.  By any rational standard this would still be outstanding performance.  But, the minute the stock market opened, P and G took a 30-point or 33% hit.  Within three minutes, $36 billion of its market valuation simply evaporated.  Poof!  By week’s end, the stock shed an additional $4 billion.  It wasn’t a complete surprise.  The market had already taken the company down by 25% from its 1999 high.  Even at 13%, it was no longer a sexy “new economy” company. 

Procter and Gamble was and is among the bluest of the blue chips.  Founded in 1837, it has always been one of the best managed entities in the world, right up there with IBM and GE.  It is as recession proof as any company can be, home to a roster of global consumer brands including Ivory Soap (1880), Crisco (1911), Tide detergent (1946), Crest toothpaste (1955), Pampers diapers (1961), Folgers coffee (acquired 1963), Bounty paper towels (1965), Clairol hair coloring (acquired 2001) and dozens of others.  By contemporary standards, it is big but not huge.  In 2013, its sales amounted to $84 billion and its net earnings were about $11 billion.  Its market valuation was $220 billion, only $10 billion less than Facebook.  In 2014, it ranked 31st on the Fortune list of the 500 largest U.S. corporations.

In a rational world (or in an efficient market), it simply could not have lost 33% of its value in a matter of minutes.  But what happened to it was not uncommon in 2000 as one company after another performed well but not well enough to satisfy the voracious expectations of growth that Wall Street had come to believe were its birthright.  Stock prices were now being governed by these expectations rather than by conventional measures of value.  James J. Cramer, a pundit for TheStreet.com, taunted traditional value investors by writing, “All of that price-to-earnings flotsam and discount-to-normalized-earnings jetsam didn’t save you today.”  Still, the idea that investors would bail out of P and G because it might grow by only 7% was a sign that something was very wrong in the stock market, something that has gotten progressively worse in the succeeding fifteen years and that threatens both the economic and the social functions of markets.

There had been other warnings.  On October 19, 1987, “Black Monday,” the Dow Jones Industrial Average [1] lost 22.6% of its value on volume that was twice the previous record.  It was the worst day in the history of the market and no one knew why.  It subsequently inspired a vast speculative literature but to this day no one really knows why it happened.  Then and now, much of the discussion centered around “program trading” which was initiated by computerized systems responding to various “trigger” events.  But there had been no obvious triggers.  There were concerns about rising interest rates and an unexpected increase in the trade deficit but there is nothing unusual about such concerns or about Wall Street being surprised by them.  Wall Street is easily surprised.  It was also in some ways a slow motion crash.  On the previous Wednesday, the Dow Jones had lost a then-record 95 points and two days later it shed another 108 points.  On Monday, intense selling pressure overwhelmed the trading systems which were reporting over an hour late thereby adding to the anxiety.  But it was not Armageddon.  The market began to recover on Tuesday suggesting that traders knew it had been oversold.  Two months later, it ended the year up slightly over 1986 and began an historic rise that lasted twenty years with only minor interruptions.  On a long term chart, the 1987 crash appears as a mere blip.  Still it was not your grandfather’s classic panic and the psychology of the market has never been the same.

Extreme crashes are blessedly infrequent but since 1987 global markets have witnessed a spectacular increase in average volatility. [3] For example, in the first quarter of 2015, the DJIA posted an increase of 0.008%, a virtually flat performance.  But on March 30, it had a gain of 263 points, more than 1%.  The following day, the last session of the quarter, yielded a loss of 200 points.  This was not unusual.  Since 1987, the indices have often gone down 1% or more one day and up 1% or more the next day and such volatility can continue for weeks on end.  Why?  It has absolutely nothing to do with the underlying value of the economy or even with its prospects.  Most recently, oil futures prices and interest rate fears have taken much of the blame.  When oil is down, though, the market may rise or fall.  When the Federal Reserve is thought to be signaling the end of low interest rates, it is taken as cataclysmic news.  We seem to be living through a period when tea leaves set off earthquakes.  The market has become a gambling den for professional players who constantly strive to game the system by shedding risk and passing it off to others.  They are true believers in the adage that there’s a sucker born every minute.

Again this is nothing new.  As St. Paul points out, “…the love of money is a root of all kinds of evil.”  He might have added that the love of money can readily transform itself into the love of more money which we call greed.  And greed seems to be an inherent concomitant of free markets and a constant challenge to regulators.  It is essential that all market participants have incentives but it is human nature that incentives tend to become addictive and require more and more to satisfy the hunger.  Like all addictions, the process is insidious, corrupting some players in ways they may not even notice.  Some very smart people wind up doing stupid things in the belief that they won’t be caught.

Shortly before he went to jail for insider trading, Ivan Boesky famously told students at the University of California, “I think greed is healthy. You can be greedy and still feel good about yourself.”  This line became famous when paraphrased by Gordon Gekko in “Wall Street” (1987) and by Larry the Liquidator in “Other People’s Money” (1991).  But it is not true.  Once natural self-interest metastasizes into avarice it becomes both sociopathic and self-destructive.  The trick is to recognize and excise the cancer before it reaches that point.  This is easier said than done as is evident from the headline scandals of the last generation.  Among them were Archer Daniels Midland (price fixing, 1995), Enron (accounting fraud, 2001), Tyco (racketeering, 2002), WorldCom (accounting fraud, 2002), HealthSouth (accounting fraud, 2003) and, of course, Madoff Securities (Ponzi scheme, 2008).

What is captivating about this list is the diversity of the alleged crimes from simple theft to the most exotic financial skullduggery.  The Tyco executives were accused of stealing $600 million from the company by giving themselves unauthorized loans and issuing stock fraudulently.  It was little more than a mugging with the money going to high living including a $2 million birthday party for the chairman’s wife and a $6,000 shower curtain for his bathroom.  It actually hurt almost no one including the shareholders who have seen their stock rise nicely.  The Enron case was quite the opposite:  a grab bag of elaborate, almost incomprehensible financial schemes meant to present to the world the picture of a huge, successful energy company.  The outside world knew it as Number 7 on Fortune Magazine’s list of the 500 largest U.S. corporations.  In fact, it was nothing of the sort.  It had already been sucked dry by gross mismanagement and Wonderland accounting designed to conceal the losses and bleed California electric ratepayers.  The key players were paying themselves enormous salaries and bonuses but, like Tyco, those had little long term effect on the company.  Mostly it seemed these geniuses were covering up for their own operating mistakes.  In any event, the important point is that very few people could ever claim to understand their financial machinations.  In this regard, Bernie Madoff was a throwback to a simpler era.  So simple that he was able to fool regulators and clients for twenty years or more depending on whom you believe.  Ultimately the law caught up with him.  His penalty did nothing to deter the next dozen or so Ponzi schemes.

As bad as these scandals were, they did little to affect the market as a whole.  Enron, for example, hurt a lot of people, mostly shareholders to the tune of $74 billion (including employees’ pension savings) and creditors totaling about $67 billion.  However, substantial recoveries were realized and the total actual loss (not counting overpayments by Californians for their utilities) was probably just under $11 billion.  The scandal also contributed to the market downturn of 2000-2003 but its impact was dwarfed by the events of 9/11.  The Madoff scandal may have exacerbated the 2008 meltdown but, again, it was overshadowed by the subprime mortgage fiasco which represents a whole new level of financial deviltry.

Traditionally, residential mortgages were safe investments.  For one thing, people who were not creditworthy generally could not get them.  For another, borrowers tend to prioritize their mortgage payments each month.  Beginning in the 1980’s, however, banks began to market adjustable rate loans to borrowers with less than excellent credit.  This virtually guaranteed that the default rate would increase.  About the same time, Michael Milken was discovering that high yield or “junk” bonds issued by financially stressed corporations could be packaged together.  Each such package would have a known risk of default which would be fully reflected by their high yields.  In essence, this meant that they could be marketed as though they were high quality investment grade securities.  Milken’s theory might work in an environment of stable markets but it was soon being applied to packages of subprime mortgages and other instruments that were unstable by definition.
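
To see why the packaging arithmetic was so seductive, consider a toy calculation of the expected one-year return on a large, diversified pool of junk bonds.  This is a minimal sketch in Python with invented numbers, not a model of any actual instrument:

    # Toy arithmetic behind pooling high-yield bonds.  All figures are
    # hypothetical illustrations, not market data.
    default_rate = 0.04    # assumed chance a given bond defaults this year
    recovery = 0.40        # assumed fraction recovered after a default
    coupon = 0.12          # the high yield that compensates for the risk

    # Expected one-year return on a large, diversified pool:
    expected = (1 - default_rate) * coupon - default_rate * (1 - recovery)
    print(f"Expected pool return: {expected:.2%}")   # about 9.1%

The arithmetic holds only if defaults are independent of one another and the default rate is truly known and stable.  Subprime mortgages violated both assumptions, which is why applying the theory to them was reckless.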

“Securitization” itself can be a useful financial tool and it is important to separate the theory from the insider trading and other abuses that came to be associated with its practice.  Unfortunately, the “rocket scientists” of Wall Street could not leave well enough alone.  They began to create more and more exotic investments by separating out every conceivable aspect of the underlying loans and focusing exclusively on their short term performance prospects.  Given a basic package consisting of, say, 100,000 subprime mortgages, an investor, usually a bank or an insurance company, could buy an instrument that exposed it to only the principal repayment stream or only the interest stream or to some combination of both based on the average yield and/or the average maturity.  The possibilities got even more abstract when investors were able to buy and sell puts and calls on options or futures contracts or engage in custom-tailored “swaps” of any combination thereof.  In the monkey-see-monkey-do world of high finance, it soon became possible to make exotic bets on interest rates, weather conditions and space on oil tankers.  And this was only the beginning of a process that has rendered investments increasingly abstract and increasingly divorced from the real world and its long term interests.  In many cases, Wall Street sold instruments that had no palpable connection to any underlying value.  In the most notorious cases the bankers who engineered these instruments did so specifically in order to bet against them and thus against the customers who bought them.  It was roulette.  For the customers, Russian roulette.
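
To make the separation of streams concrete, here is a minimal sketch of splitting a single loan’s monthly payments into an interest-only stream and a principal-only stream.  The loan terms are invented for illustration:

    # Splitting a mortgage into interest-only (IO) and principal-only (PO)
    # cash flows.  A hypothetical 30-year, 6% loan of $100,000.
    balance = 100_000.0
    monthly_rate = 0.06 / 12
    payment = 599.55              # level monthly payment on such a loan

    io_total = po_total = 0.0
    for month in range(12):       # the first year of cash flows
        interest = balance * monthly_rate    # goes to the IO investor
        principal = payment - interest       # goes to the PO investor
        balance -= principal
        io_total += interest
        po_total += principal

    print(f"Year-1 IO cash: ${io_total:,.2f}, PO cash: ${po_total:,.2f}")

Multiply this by 100,000 mortgages, layer on options and swaps, and the distance between the instrument and any actual house becomes apparent.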

We have already noted that the increasing volatility of the stock market is not caused by anything in the real world.  A good example of Wonderland markets is the price of oil.  By now we are accustomed to sudden spikes in the price of gas at the pump and we have learned to pay attention to the price of crude oil.  Supply and demand play a small role in this phenomenon but the real culprit is the little understood futures market.  A futures contract is an agreement to buy or sell some commodity—in this case 1000 barrels of crude oil—at a given price on a given date in the future, often 90 days from the date the contract was originally made.  These contracts can be traded at any time during their life and are usually traded many times before they expire.  A few players actually want to buy or sell the oil but the vast majority have no interest in or ability to cope with the real stuff.  They are gamblers, pure and simple, who will eventually settle their positions in cash.  They will be the first to claim they provide important liquidity to the market which is true except that the market needs only a tiny fraction of that liquidity.  Meanwhile, their speculation overwhelms the market for real oil and sets the price for vital commodities like gasoline at the pump and home heating oil.  It is as though the value of the dollar were determined by the Las Vegas blackjack tables.  The same insanity applies to the markets for wheat, soybeans, bacon and, if you remember the movie “Trading Places,” frozen orange juice concentrate.
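
The cash settlement described above is simple arithmetic.  A minimal sketch, with invented prices:

    # How a speculator settles a crude oil futures position in cash
    # without ever touching a barrel.  Prices are hypothetical.
    contract_size = 1000        # barrels per contract, as described above
    entry_price = 60.00         # price agreed when the contract was bought
    exit_price = 63.50          # price when the position is closed out

    profit = (exit_price - entry_price) * contract_size
    print(f"Cash settlement per contract: ${profit:,.2f}")   # $3,500.00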

Futures contracts were invented 5,000 years ago in Mesopotamia and have been essential enablers ever since in agriculture and large scale extractive industries.  They provide a level of financial stability to actual producers and users of a wide range of commodities.  Futures contracts on interest rates and other financial benchmarks are important risk management tools for banks.  But as a form of gambling, they become more complex over time.  Complexity increases the probability of what mathematicians call “cascading failures” of their component parts which lead to catastrophic failures.  As this is written, the major disaster scenarios are related to “algorithmic trading” most notably to the form known as high frequency trading (HFT), a strategy that redefines the notion of short term from minutes to microseconds.

In today’s market, 2% of the traders employ HFT and related strategies which allow them to buy and sell huge positions instantaneously.  The most popular of these strategies account for between 50% and 75% of daily trading, all of which is transacted by computers programmed with highly sophisticated algorithms, statistical models that track a wide variety of trigger events.  The SEC has said that such a system caused the “Flash Crash” of May 6, 2010.  It began with the sale of 75,000 “E-Mini” futures contracts worth $4.1 billion at 2:32 PM.  (E-Minis are bets on the Standard and Poor’s 500 Index.  Each contract has a nominal value of 50 times the value of the Index at any given moment.)  The May 6 sale triggered other black box systems with the result that E-Minis came to define the real value of the real securities in the Index.  Within three minutes, the index dropped 3%.  Between 2:41 PM and 3:00 PM, the Dow dropped 998 points or 9.2% of its value and then turned around and regained most of the loss, closing the day down 348 points or 3.2%.  For a half hour, the market experienced a total disconnect from anything resembling the real world. [4]
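
A quick sanity check on those numbers, using the contract definition just given:

    # Each E-Mini is worth 50 times the S&P 500 index level, so the
    # reported sale implies an index level consistent with that afternoon.
    contracts = 75_000
    sale_value = 4.1e9                      # dollars, per the SEC account
    implied_index = sale_value / (contracts * 50)
    print(f"Implied S&P 500 level: {implied_index:,.0f}")   # about 1,093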

If all this were merely a case of the inmates running the asylum it would be a tightly circumscribed problem of little concern to most people.  But the stock market is too important to ignore even at a time when fewer than half of Americans are exposed to it. [2]  The various commodity markets are crucial to those directly involved in producing and using the commodities.  Real estate is central to almost everyone.  In short, the financial system is the foundation of the economic, social and cultural architecture of the entire community and it is in desperate need of reform which will not be easy.  Even if the solutions were obvious there is at present no political appetite for reform.  Throughout the developed world there is an epidemic of self interest and economic disarray that bodes ill for the kind of concerted and determined action that was achieved at Bretton Woods in 1944.

We tend to forget that money is an artificial construct and that it and the systems involved in storing and circulating it are highly abstract, fragile and reactive.  Neither dollars nor diamonds have any intrinsic value but are “worth” only what someone is willing to pay for them. [5] The systems used to regulate markets tend, on the other hand, to be clumsy compromises based on political ideology and are therefore slow to recognize changing conditions.  Moreover, any changes that are made in the governance of the financial markets must be coordinated with equally profound changes that need to be made in the broader economy.  We face a daunting challenge.  But as timeframes collapse and the global economy becomes ever more interconnected, market contradictions become less tenable and more threatening.  Time is short and the river is rising.

Subsequent Events

In the first quarter of 2020, volatility went through the roof of every stock market in the world, supposedly because of the pandemic caused by the coronavirus.  Actually, the down days seemed like nothing but a typical panic in the face of uncertainty.  It was as though the computers thought markets were normally certain and therefore risk-free.  The up days, on the other hand, looked like computers grasping at straws.  It was not edifying.

Earlier, Navinder Sarao who had singlehandedly caused the Flash Crash (see Note 4 below) was convicted in a Chicago courtroom and sentenced to a year of home confinement at his parents' house in London.  He apologized for his prank, attributing it to Asperger's Syndrome, and claimed to have "found God."  The leniency was supported by the prosecution which thanked him for educating them about exotic stock market fraud.

Notes

1. The Dow Jones Industrial Average (DJIA) of 30 large U.S. companies is a reasonable and widely understood simulacrum for the U.S. equities market.  There are several more accurate measures, notably the Standard and Poor’s 500 or the Russell 3000, but all are fairly well correlated and the Dow Jones has the advantage of a much longer history.  The Dow of course is no longer restricted to industrial companies but includes such companies as American Express, Disney, Goldman Sachs and Wal-Mart.

2.  Not everyone.  In 2013, Pew Research Center reported that investors were primarily white, middle aged college graduates with annual incomes over $75,000.  Currently, slightly less than half of Americans are exposed to the stock market either directly or indirectly, the lowest participation rate in many years.  The recent high was 65% in 2007, the year before the real estate bubble burst and the economy crashed.  Most Americans are not good savers (in part because wages have not kept up with inflation).  The average net worth of American adults is about $301,000 but the median is only $45,000, another reflection of economic disparity.  Most of that net worth is in real estate.

3.  The academic literature is full of analyses of every conceivable kind of volatility almost all of which is useless in the real market.  I use the term here to discuss the fluctuation of price over a given period of time.  Obviously the greater the change and the shorter the period the higher the volatility.
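
For the curious, here is a minimal sketch of how such a measure can be computed from closing prices.  The prices are invented and the 252-day annualization is one common convention among several:

    import statistics

    closes = [100.0, 101.2, 99.8, 101.5, 100.1, 102.0]

    # Daily returns, then their standard deviation (realized volatility):
    returns = [(b - a) / a for a, b in zip(closes, closes[1:])]
    daily_vol = statistics.stdev(returns)
    annualized = daily_vol * 252 ** 0.5   # roughly 252 trading days a year
    print(f"Daily: {daily_vol:.2%}, annualized: {annualized:.2%}")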

4.  On April 21, 2015, British authorities arrested one Navinder Sarao on a 22-count complaint of market manipulation issued by the U.S. Department of Justice.  Mr. Sarao is a London day trader accused of causing the Flash Crash by “spoofing,” a technique of entering thousands of large but fake buy and/or sell orders simultaneously in an attempt to influence HFT algorithms.  Conservative publications rushed to his defense and legal pundits emphasized that the government would have a hard time proving its case.  The charges against him carried a prison term of 380 years which may have been what encouraged him to cop a plea in late 2016.  He was sentenced in early 2020.  Anyone seriously interested in the mechanics of high frequency trading and their sociopathic implications should read Flash Boys:  A Wall Street Revolt by Michael Lewis (Norton, 2014).

5.  Intrinsic value is another one of those slippery terms you struggled with in Economics 101 and this is not the place to attempt to resolve it.  Once upon a time, gold was defined as being worth $35 an ounce.  Why?  Because some economists and diplomats said so.  It was arbitrary but useful in a simpler age.  The price (and therefore the value) of gold is now set by the gold market which, like the stock market, is made up mostly of gamblers.  A painting by Pablo Picasso recently sold at auction for close to $180 million.  Please note that no combination of canvas and paint has an intrinsic value of $180 million.



Thursday, July 16, 2015


OLD TIMES THERE ARE NE’ER FORGOTTEN

Jerry Harkins



In the aftermath of the horrendous massacre at Mother Emanuel AME Church in Charleston, the Republican Governor of South Carolina, Nikki Haley, led a courageous and successful campaign to remove the Confederate battle flag from the grounds of the State House.  The next day, Congressional Republicans from the South introduced a bill to allow the display of the same flag in national cemeteries and permit the sale of souvenirs printed with its image by the National Park Service.  As they pointed out, it would be legal at gravesites only one day a year, on Confederate Memorial Day, which was instituted in Georgia one year after Appomattox and is still celebrated in ten of the eleven rebel states.  The bill would have passed but for the embarrassment already suffered by the party over the racist rants of Donald Trump, who is running first in the polls in the run-up to their presidential race.

Anyone who thinks the Civil War ended on April 9, 1865 at Appomattox Court House should pay a bit more attention to the assassination of Abraham Lincoln six days later.  Nor did it end with the implementation of the 13th, 14th and 15th Amendments to the Constitution in 1865, 1868 and 1870 respectively.  As soon as the Federal troops withdrew from the South, the Confederate states began to pass a variety of Jim Crow laws which gained acceptance with the decision of the Supreme Court in Plessy v. Ferguson in 1896, legalizing the infamous doctrine of “separate but equal.”  Fifty-eight years later, the Court reversed itself declaring in Brown v. Board of Education that separate educational facilities are inherently unequal.  For years after that thousands of southern billboards proclaimed “Impeach Earl Warren.”  The fact is the Civil War continues to this day.

America has come a long way toward the ideals expressed in its Declaration of Independence but it still has a long way to go.  The fierce resistance to Brown in the South led a frustrated but unanimous Supreme Court to approve the use of busing to force integration in 1971 (Swann v. Charlotte-Mecklenburg Board of Education).  It was dealing with state mandated segregation in the public schools of the South—in other words de jure segregation—but the same remedy was later approved for de facto segregation in the North.  The reaction was spectacular.  Boston didn’t like forced busing any more than Montgomery did and there was rioting in Southie, the Irish enclave of Beantown.  Busing was not perfect but it improved things enough so that it gradually waned in the North and the more rational precincts of the South.  In less sensible precincts such as the suburbs of Atlanta, there arose new forms of resistance.  Parents who had fled to the white suburbs to avoid integration in the cities now sent their children to all-white “Christian” academies.  If that didn’t work, they turned to home schooling.  The Lost Cause, after all, is well worth a generation or two of poorly educated children.

As recently as 2011, former House Speaker Newt Gingrich (who represented those same Atlanta suburbs mentioned above) denounced another unanimous Supreme Court ruling that required the city of Little Rock, Arkansas to obey court rulings on desegregation (Cooper v. Aaron, 1958).  Since he was running for the Republican presidential nomination at the time, a spokesperson quickly claimed that was not what he meant.  Of course that raises the question of just what he was thinking when he said he would send the police to drag the justices before Congress to explain their criminal behavior.

No, the Civil War is not over in Dixie.  The Lost Cause is alive and well, not everywhere in the South but probably among a majority of white folks in most places.  Several states have active secession movements endorsed by many of their elected officials including former Governor Rick Perry of Texas who thinks Texas should secede again and he should become President of the United States.  He is not the first Southern politician to hold such a self-contradictory fantasy.  In the immortal words of three-time presidential hopeful Governor George Wallace of Alabama, “In the name of the greatest people that have ever trod this earth, I draw the line in the dust and toss the gauntlet before the feet of tyranny, and I say segregation now, segregation tomorrow, segregation forever.”  It is often said Wallace recanted his racist views in the years before his death but what he was really saying was he regretted that the media never understood the difference between his political philosophy and his social beliefs.  It probably did not occur to him that both were depraved.

I love the South.  I have often enjoyed its legendary hospitality and its sense that life should be enjoyed.  On one memorable occasion I was visiting rural Alabama on a fact-finding mission for a huge paper-making company.  I was invited to play softball with the factory team which was fully integrated and was welcomed by the players, their wives, children and sweethearts.  The barbecue after the game was the best I ever experienced.  I bow to no man in my love of fried green tomatoes.  I think of racism not as a character defect but as an historical inevitability.  Slavery was America’s original sin.  Fittingly it was an economic sin first brought about by the fact that cotton was a labor intensive crop just at the time when cotton fabric was enjoying spectacular growth in demand.  Once the cotton gin was invented in 1793, the slave economy began a long decline.  As it did, the status of the slaves became an increasingly thorny problem first recognized by the northern abolitionists.  What next?  For slaves and masters both, what next seemed unanswerable.  Most Southerners opted to ignore the problem and war came.

Shortly before the end, Abraham Lincoln delivered his second inaugural address.  Speaking of the opposing sides, he said, “Both read the same Bible and pray to the same God, and each invokes his aid against the other. It may seem strange that any men should dare to ask a just God's assistance in wringing their bread from the sweat of other men's faces, but let us judge not, that we be not judged.”  It was a noble but impossible dream.  Slavery was an evil that had to be judged as such.  But not in the South where instead it was explained away.  The former slaves were said to be ignorant, childish, devious, lazy and, of course, frightening.  We cannot be forced to associate with them or expose our women and children to them.

Such primal hauntings exist in the deepest recesses of the subconscious.  They are preverbal but subject to access by unscrupulous politicians and preachers who master catalytic code words.  One master of the form was George Wallace.  Mark the rhetoric and the cadences of that “Segregation Forever” speech:

…this Cradle of the Confederacy, this very Heart of the Great Anglo-Saxon Southland
…the heel of tyranny does not fit the neck of an upright man
We intend, quite simply, to practice the free heritage as bequeathed to us as sons of free fathers.
…the false doctrine of communistic amalgamation.
States rights.  Freedom of association.  Crime in the streets.  Welfare reform.  Teenage pregnancy.  Gang warfare.  Slums.  Absentee fathers.  Be afraid:  the vomitus of racial hatred.  As we gear up for the 2016 presidential race, you will hear a whole dictionary of code words employed mostly by Republicans seeking their party's nomination.  Not a single one of them – not Mike Huckabee and not even Donald Trump – will admit to holding any untoward thoughts about race but if you add up their stated positions you'll find 75% of them are threatening to the interests of all minorities but especially to black people.

This too shall pass.  Slowly perhaps.  Too slowly.  But it has been passing ever since Brown.  The massacre at Mother Emanuel and a rash of individual black deaths at the hands of police have nudged it further toward the dustbin of history.  But before it can receive a fitting burial, the South and especially Southern politicians must concede that the Civil War is over.  The issue was settled in blood and treasure 150 years ago.  The South fought well against tremendous odds but it fought not in defense of liberty but in support of one of history’s greatest evils.  We cannot rewrite history or relive it.  It is hard enough to live with the contemporary consequences of slavery which include poverty and the fragility of black family life.

The South lost the war.  No amount of political claptrap or academic pettifoggery can change history.  No amount of noxious nostalgia can convert the sow’s ear of racism into a silk purse of folk wisdom.  There may have been a time when the white people took refuge in such nonsense but it was always fantasy.  “All up and down de whole creation / Sadly I roam, / Still longing for de old plantation, / And for de old folks at home.”  Don’t believe it.

The Old South is literally gone with the wind and good riddance.  The New South has been proclaimed several times but until recently always on the wings of some variation of the right wing creed, first Jim Crow, then anti-unionism, then as a congenial corporate headquarters location.  Industry, shameless as always when confronted with the prospect of higher profits, fled to a South that courted companies with low wage, anti-union policies.  The politicians got away with it because things had never been different.  The South, since the days of slavery, has been a stronghold of David Ricardo’s Iron Law of Wages which holds that companies naturally and properly pay wages just sufficient to keep the workers’ bodies and souls together.  There have been exceptions of course but Southern DNA is structured to promote the interests of the Jeffersonian freeholding class which are invariably at odds with those of the working class.  In this regard, Southern blacks are disadvantaged by class and race both.

The South will rise again.  So proclaimed the slogan of the Redeemers, an unholy alliance of landowners, businessmen and professionals formed during Reconstruction to suppress or banish the freed slaves and their allies, the northern carpetbaggers and southern scalawags.  Less violent than the Ku Klux Klan, they were nonetheless the primary force in ending the federal presence in the South and inaugurating the Jim Crow era.  But they were right about the South.  It has risen several times in 150 years and it will do so again as soon as the politicians realize they are pandering to a dying breed of rednecks.  As Congressman John Lewis said during the debate on displaying the Confederate battle flag, “Hate is too heavy a burden to bear.”




Saturday, July 11, 2015


FRANCIS:  A PRELIMINARY PERFORMANCE APPRAISAL

Jerry Harkins

God writes straight with crooked lines.
                                                      —Traditional Portuguese adage



The election of Jorge Mario Bergoglio, Archbishop of Buenos Aires, as Pope Francis was a stroke of unaccustomed genius by the 2013 conclave of the College of Cardinals.  A pastor with a deep caring for the poor, a humble and welcoming man with a friendly smile and, not least, a Jesuit intellectual, he engendered worldwide optimism for reform of the church which is facing numerous and daunting challenges.  So far, he has proceeded cautiously but he did resolve the Curia’s bootless witch hunt against the American nuns.  He wrote a landmark encyclical about global climate change and its impact on the poor.  He revamped the Vatican’s scandal-ridden financial apparat and took decisive action against several bishops who had failed to protect children from predatory clerical perverts.  He apologized for the evils wrought by the church during the colonial period in Latin America.  He had kind words for gay people, stunning observers when he asked, “Who am I to judge?”  I suspect this question has never before occurred to a Pope.

So far, Francis’ method has been compromise.  Responding to an irresistible groundswell of support for Santo subito!, he canonized the autocrat John Paul II but used the same occasion to canonize the truly saintly John XXIII.  He disappointed liberals by agreeing with his predecessors that the question of ordaining women is settled forever but he has spoken favorably about placing women in important Curial positions.

All Francis’ progressive actions to date concern matters of discipline even if his two immediate predecessors tried to raise them to the level of divine truth.  He has been notably silent on the far more important dogmatic issues confronting the church.  (But see Subsequent Event below.)  It may be that he is reluctant to provoke an existential debate within the hierarchy.  He would win such a debate among most lay Catholics but he would almost certainly lose it among members of the hierarchy most of whom were appointed by those same ultra conservative predecessors.  Or he may think that reforming the medium will make it easier to reform the message later on.  As I have said, he is a Jesuit.  Finally, it is possible that he thinks dogmatic theology is already a dead letter in the twenty-first century.  As I have also said, he is pastoral and he may not care how many angels can dance on the head of a pin.  If so, he is one with all but a handful of Catholics and mainline Protestants.

The problem is Francis is getting on in years and the church does not have a deep bench.  If he does not at least set it on a course of dogmatic reform before the next conclave, it is hard to imagine a successor as hopeful as he is.  And if the church keeps teaching asinine, unbiblical, illogical doctrines it cannot help but devolve into a late night laughingstock.  At the moment, he seems to actually believe most if not all of the mumbo-jumbo packaged in medieval mummery.  He has, it is true, abandoned the imperial dress favored by John Paul and Benedict.  He has declined to live in the papal palace, preferring a modest apartment where he can cook his own meals.  But he is under pressure to declare Mary as the Mediatrix of All Graces and even the Co-redemptrix of mankind.  Even if he couldn’t swallow such nonsense himself, it might appeal to him as a bone to be tossed to the conservatives in return for their support on, say, a new theology of homosexuality or even perhaps a revocation of the absurdist teachings on contraception.  Did I mention he’s a Jesuit?

The most important doctrine that needs to be consigned to the ash heap of history is papal infallibility.  Unfortunately, it is also going to be the most difficult to get rid of because it is central to the church’s claim to power.  At the insistence of Pope Pius IX, the first Vatican Council issued the “Dogmatic Constitution” Pastor Aeternus (Eternal Shepherd) in 1870.  The key ruling said:

"We teach and define that it is a dogma Divinely revealed that the Roman pontiff when he speaks ex cathedra, that is when in discharge of the office of pastor and doctor of all Christians, by virtue of his supreme Apostolic authority, he defines a doctrine regarding faith or morals to be held by the universal Church, by Divine assistance promised to him in Blessed Peter, is possessed of that infallibility with which the Divine Redeemer willed that his Church should be endowed in defining doctrine regarding faith or morals, and that therefore such definitions of the Roman pontiff are of themselves and not from the consent of the Church irreformable.  So then, should anyone, which God forbid, have the temerity to reject this definition of ours:  let him be anathema."

The logic is appalling.  The historical claims are pure fiction.  Every shred of evidence and experience points in the opposite direction.  The author, Pius IX, was an Orwellian sociopath.  Nevertheless, it may indeed be true that the entire house of cards is irreformable.  What, after all, is the Roman Catholic Church if it is not the infallible keeper of the keys to heavenly bliss?  Infallibility binds Francis to the ignorance of the past.

The mental gymnastics needed to seriously maintain such an absurdity are impressive.  Heresy, for example, is surely a matter of “faith or morals.”  Thus the ruling of Pope Urban VIII’s Inquisition in 1633 must remain infallibly true:

“We pronounce, judge, and declare, that you, the said Galileo… have rendered yourself vehemently suspected by this Holy Office of heresy, that is, of having believed and held the doctrine (which is false and contrary to the Holy and Divine Scriptures) that the sun is the center of the world, and that it does not move from east to west, and that the earth does move, and is not the center of the world.”

Actually Urban was a well-educated and sophisticated nobleman and Cardinal Robert Bellarmine, who had conducted the church’s first proceedings against Galileo in 1616, was one of the great intellects of the age.  (Bellarmine, of course, was a Jesuit and is now a Saint.  Urban had been Jesuit educated.)  But the Bible is wrong and they were wrong and almost certainly both knew it.  This is no disgrace and no surprise.  Better minds than theirs have often been wrong.  Aristotle was wrong about the laws of motion.  But you can hear the chorus of curial bureaucrats insisting that, while Urban ratified the Inquisition’s finding, it was not done ex cathedra and therefore can be “reformed.”  But it has not been reformed!  John Paul had an opportunity in 1992 when he spoke about the Vatican’s finding, after thirteen years of formal inquiry, that Galileo had probably been right.  He did not however consider it sufficiently probable to rescind the decree of heresy.  So it remains the official doctrine that the sun revolves around the earth and poor Galileo remains in hell.

It’s not important.  The only one hurt was Galileo and he’s been dead nearly 400 years.  But other, equally foolish doctrines hurt millions of people.  The diatribe about contraception, for example, influences American politicians and exposes vast numbers of people, mostly poor non-Christian Africans, to sexually transmitted diseases including AIDS.  But it too is not important because most people, including most Catholics, do not really believe the nonsense the hierarchy insists on so vehemently.  Teaching such nonsense squanders the church’s credibility.

Do you really think the eucharist is the literal body and blood of God Almighty, that Mary was a virgin when she gave birth to Jesus not to mention James and his other brothers, that a divorced woman is automatically an adulteress in the eyes of a loving God?  If you do, it is only because you worry that the church’s teachings may indeed be infallible and you fear the eternal fire and brimstone of hell.  But the truth is you do not really believe these things.  You may like them.  The Bible tells great stories which teach lessons that are central to our ability to live in harmony with each other.  They are not truths but metaphors.  They are loving and loved companions that make you feel warm and comfortable.

Like most children I had my favorite stories that I insisted be read to me again and again.  One was “The Tinderbox” by Hans Christian Andersen.  Now, even as a four-year-old, I knew that there was no such thing as a dog with eyes as big as a teacup or another with eyes as big as a dinner plate or a third with eyes as big as a windmill.  But I remembered the poor soldier and the princess and their travails and eventually I realized that Andersen was writing about the universal theme of coming of age and facing a world of both infinite possibility and the challenges of learning adult self-control.  “The Tinderbox” is a parable just like the laborers in the vineyard and the prodigal son.

Parables do not come with scholarly footnotes, money back guarantees or expiration dates.  Their truth is not literal and they do not always begin with a simile.  Every Christian loves the story of Jesus’ birth as related in the second chapter of Luke’s gospel.  The instant you hear the opening words, “In those days Caesar Augustus issued a decree that a census should be taken of the entire Roman world,” you are suffused with a sense of joy and peace and even déjà vu.  You know it so well you don’t have to think about it.  But you do.  You wonder what it means that the son of God chose to be born in a manger.  Luke says there was no room in the inn.  But this was Joseph’s hometown in a part of the world where hospitality has always been a cardinal virtue.

The story of the first Christmas has absolutely nothing to do with the doctrine of original sin or Mary’s immaculate conception or the human nature of the triune God.  As the late Father Andrew Greeley often said, the church has traveled a long way from Bethlehem and Calvary.  It survived for two thousand years mostly because its adherents were illiterate peasants easily dominated by bishops and popes who were mostly self-serving, power hungry dictators.  Only the stories kept the enterprise alive and meaningful.  The stories fed the people of God as they sought to encounter the divine.

The question then is this:  can Francis restore the centrality of the stories absent the terrorism of fire and brimstone?  He is about to convene the second session of a synod of bishops to consider issues of importance to the family including contraception, divorce and remarriage, same-sex marriage, premarital sex and in vitro fertilization.  For centuries, the teaching on every one of these topics has been irrelevant to the lives of people and corrupt in every way, designed only to maintain the power of the hierarchs.  There is no room for compromise and no room for Vaticanspeak.  His decency, his humanity and his smile have given him an opening but time is short and the river is rising.  The good news is he’s a Jesuit.

Subsequent Event

Shortly after this essay was posted, Francis dipped his toe into the cold waters of dogma and announced that women who had had an abortion could be absolved from their grave sin by ordinary priests during the forthcoming Holy Year.  Previously only a bishop could grant such forgiveness and, presumably, once the Holy Year ends, Holy Mother the Church will revert to that policy.  So hurry on down to take advantage of this limited time offer!  It is disappointing that Francis did not think to concern himself with the issue of why abortion is a sin.  It has to do with the church's belief that life begins at the moment a sperm comes in contact with an egg.  Scientifically this is equivalent to its official belief that the sun revolves around the earth.  Politically, it is part and parcel of the power strategy that sees no irony in granting a coterie of elderly celibates absolute control over women's bodies.  Realistically, however, Francis has done as much as he dare.  Anything more would have run right up against the infallible rantings and ravings of his predecessors.  So our preliminary appraisal has to be that he seems to be doing all he thinks he can do.  If he's right, the church is irredeemable and will not outlive the century.








Thursday, July 09, 2015


ON BECOMING DIGITAL THINKERS

Jerry Harkins*



On the morning of April 6, 1964, I presented a carefully researched, closely reasoned report on a question posed by the Head of the Department of Mechanical and Technical Equipment of the United States Army Engineer School at Fort Belvoir, Virginia.   Three months earlier, he had asked whether the military field computers of the future would employ analog or digital technology.  Now I was addressing him and his top aides armed with a 3,000-word text and a deck of overhead projector slides.  I had interviewed experts and learned about the kinds of decisions made by battlefield commanders.  I drew up a list of the kinds and sources of the information they needed and their special requirements for portability, ruggedness, reliability, and practicality.  The take-away was that analog computing was likely to be the dominant technology for the foreseeable future. 

Twenty-four hours later IBM introduced System 360, thereby rendering my analysis antiquated.  Luckily the Colonel was an engineer with a sense of humor.  He recited the immortal words of Robert Burns:  “The best laid schemes o’ Mice an’ Men / Gang aft agley, / An’ lea’e us nought but grief an’ pain, / For promis’d joy!”

Not that anyone could operate a 360 or any other digital machine outside a large air conditioned environment or, especially, inside a Vietnamese rice paddy.  But as the first family of compatible general purpose computers, it pointed the way to the future of everything.  Apple’s handheld smartphone is much faster and has far more computing power than did the 360 but is nonetheless its direct descendant.  For IBM the 360 represented the third generation of digital computing, a technology that had evolved in the 1930’s and that was represented in the popular press by ENIAC developed for the Army and introduced in 1946.  Designed by scientists at the University of Pennsylvania, ENIAC weighed 30 tons and employed 17,468 vacuum tubes.  Among its successors was MANIAC, designed at Los Alamos and introduced in 1952.

Digital technology works by reducing everything to discrete bits and bytes and using its vast power and speed to bull through them to a solution.  “Discrete” is the key word.  Every part of every problem is represented by a series of 1’s and 0’s corresponding to the on and off conditions of solid state devices such as transistors.  This is the meaning of the term “binary,” the either/or condition in which there is nothing in between.  The world, or at least the world we perceive with our unaided senses, is not like that.  There are (or at least our brains think there are) infinite shades of gray, spectra of color, nuances of meaning, subtleties of expression and gradations of touch, taste, sight and sound.  The digital world can simulate these seamless transitions but cannot duplicate them.  A common example is digitized music which can reproduce a performance to an impressive exactitude.  Still, an experienced listener cannot ultimately be fooled.  Of course, the same listener would not be fooled into thinking an analog recording was anything but a recording in part because the old technology introduced extraneous noise.  Still, the sound of vinyl is often heard as more “natural” and more “human.”  Digital sound is often perceived as too perfect, too cold, too engineered.  No one has ever heard anything in the real world quite so detached.
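
What “discrete” means in practice can be shown in a few lines of Python.  Here a smooth sine wave is reduced to eight samples at three bits of depth, numbers chosen absurdly low to make the point visible:

    import math

    samples_per_cycle = 8
    bits = 3                      # only 2**3 = 8 possible amplitude levels
    levels = 2 ** bits

    samples = []
    for n in range(samples_per_cycle):
        x = math.sin(2 * math.pi * n / samples_per_cycle)   # continuous wave
        q = round((x + 1) / 2 * (levels - 1))               # snapped to a level
        samples.append(q)

    print(samples)   # the whole "wave" is now a short list of integers

Real digital audio uses tens of thousands of samples per second and sixteen or more bits of depth, which is why the simulation is so convincing.  But the principle is the same:  between any two levels there is nothing.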

Before proceeding, we must pause to recognize quantum theory and its offspring which suggest that in the subatomic world matter and energy do indeed exist in discrete packets.  These exotic phenomena are the true building blocks of the sensible universe.  Thus, elegance seems to demand that all creation must be discrete at heart.  If so, it would follow that reality can be better represented by digital than analog.  At least if digital is taken as discrete or discontinuous.  A black and white photograph made on film is composed not of infinite gradations of gray but of eleven discrete tones.  Our eyes are capable of much finer discriminations—a visit to any paint store will confirm there are more than fifty shades of gray—but the number is not infinite.  No matter, that is not the way we perceive things.  Our minds fill in the gradations.  Hamlet says, “…there is nothing either good or bad, but thinking makes it so.”  In a similar manner, there is nothing digital or analog but perception makes it so.  And from the time we came down out of the trees, our senses have suggested that, for the most part, the world is continuous or analog.

True binaries or opposites like “on” and “off” do exist:  positive and negative in the algebraic sense, up and down in the physical sense, absent and present, yes and no, clockwise and counterclockwise, left and right.  But there is a rich class of antonyms that constitute more complex dualities, things that are not absolute opposites but are contrasting in varying degrees.  The known and the unknown.  Good and evil.  Order and chaos.  Male and female.  Long term and short term.  Individual and community.  Self and other.  In each case, there is an element of tension, vigorous or feeble, between the members of the pair.  Good and evil are certainly opposed but there are gradations of both.  Not only are these dualities not opposites but, in some cases, they depend on each other.  “Male” is without meaning except in relation to “female.”  Yet both exist independently.  And both manifest themselves across a single, fairly wide spectrum of gender characteristics.  The common requirement to check a box for “male” or “female” is almost without meaning.  Facebook provides 58 choices for its members including “neither,” “other” and “pan.”  If these don’t work you can provide a customized description.  Imagine Plato attempting to discuss the pure idea of “male.”  Anything he might say must ultimately include reference to “female.”  Like yin and yang, each is half of an idea yet both can stand alone.

Rather than think of such ideas as “opposites,” I prefer to call them “complementarities.”  They are fundamentals of the human experience.  To take another example, very early in life we come to the realization of “self” and “other.”  The recognition is simultaneous; “self” implies “other” and vice versa.  Except for people with multiple personality disorder, the distinction between self and other is absolute—binary—but they are not opposites.  “We” addresses a community made up of multiple “self” and “other” pairs.  As we mature, we learn to subdivide “self” into “me” and “mine” which are different but again not opposite.

The real world of our senses is complex, ambiguous and subtle.  An analog computer can target artillery fire very accurately and fast enough for most situations.  But the genius of digital technology is to make it seem simple while doing so with much greater precision.  Deep Blue was an IBM supercomputer that played chess at the grandmaster level and Watson is an even more advanced breed that can beat the best Jeopardy players in the world.  But, in spite of decades of development, neither can carry on a real conversation with a human being.  You can ask Apple’s Siri for restaurant recommendations.  She will make reservations for you and call a taxi to get you there.  But don’t ask her to discuss the differences between Beaujolais and Burgundy or even choose between Pinot gris and Pinot noir.

In 1950, Alan Turing published a paper purporting to answer the question “Can machines think?”  By the criteria he established for the famous Turing Test, the unequivocal answer was that there was no reason to suppose they would not someday do so.  In the mid-1960’s, Joseph Weizenbaum at MIT developed ELIZA, an artificial intelligence system that simulated psychotherapy sessions between a computer/therapist and a human patient.  Weizenbaum knew it was therapeutically primitive but was surprised by how readily it was accepted by its human users.  It seemed to pass the Turing Test even if many felt it was due to the simplicity of the test rather than the sophistication of the software.  Today’s “chatterbots” are less ambitious but much improved in terms of conversational realism.  Still, anyone who has ever used a computerized Help Line knows they are sometimes efficient but often hateful substitutes for knowledgeable humans.
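
The trick behind ELIZA was pattern matching and pronoun reflection, not understanding.  A minimal sketch of the principle; the real program was considerably more elaborate:

    import re

    # Swap first-person words for second-person ones, as ELIZA-style
    # programs do when echoing a patient's statement back.
    reflections = {"i": "you", "my": "your", "am": "are", "me": "you"}

    def reflect(text):
        return " ".join(reflections.get(w, w) for w in text.lower().split())

    def respond(statement):
        match = re.match(r"i feel (.*)", statement.lower())
        if match:
            return f"Why do you feel {reflect(match.group(1))}?"
        return "Tell me more."

    print(respond("I feel nobody listens to me"))
    # -> Why do you feel nobody listens to you?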

The question is still being asked as to whether digital computers will ever reach a stage where they can realistically mimic the human brain.  It would be foolhardy to suggest such progress is impossible.  However, if it ever does come about, the computer in question will probably employ some hybrid technology.  Not to worry.  At present, such verisimilitude seems unlikely for economic as well as technical reasons.  For one thing, the product development emphasis remains on traditional device speed and capacity both of which put a premium on compression of data.  For example, when music or video is streamed, the result is lower fidelity than that delivered by CD’s or DVD’s.  Each successive generation of Apple’s iTunes application has delivered slightly lower fidelity which most users don’t notice or ignore.  The quality of digital photographs taken by smart phones has increased with each generation of the technology but is still lower than that of either conventional film or high end digital technology.  Cost and convenience are dramatically improved and even users who notice the decline in quality are happy to make the compromise.  Digital material is also easier to manipulate.  The old saw, “A photograph doesn’t lie,” was never guaranteed given the ability of darkroom technicians and airbrush artists to modify what the camera saw.  But programs like Photoshop make it easy to alter an image completely.  Whether the intent is benign or malicious, this level of control alters the nature of truth itself.  A viewer cannot know anything about what the camera actually saw or how the photographer felt about it.
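
The scale of the compression compromise is easy to work out.  The CD parameters below are the published standard; the streaming figure is a typical compressed bit rate, not a claim about any particular service:

    # Uncompressed CD audio: sample rate x bit depth x channels.
    cd_bps = 44_100 * 16 * 2          # 1,411,200 bits per second
    stream_bps = 256_000              # a common compressed bit rate

    print(f"CD: {cd_bps / 1000:.0f} kbps, stream: {stream_bps / 1000:.0f} kbps")
    print(f"The stream carries {cd_bps / stream_bps:.1f}x less data")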

This is an ancient problem.  Writing and printing, music, photography and other “extensions” of human senses (to borrow Marshall McLuhan’s locution) are all languages.  The great semanticists of the 1930’s knew that language acts to constrict thought in order to serve the interests of communication.  Poetry, for example, is a compromise between logic and emotion effected through the intermediation of vocabulary, grammar and other conventions.  It proceeds from mental imagery we refer to with words like inspiration, intuition and imagination, processes that cannot be quantified.  Moreover, every language treats the constriction process differently so that the architecture of a language accounts for a significant part of the world’s cultural diversity.

Language is the central tool of knowledge but it also burdens us with an unavoidable problem.  Except for the possibility of telepathy, there is no way for people to communicate anything other than the most basic ideas without using twice-translated symbols.  A speaker or writer chooses a word to convey an idea that is not a word and the listener or reader translates that symbol into his or her own mental construct.  Both run the risk of treating the word as though it is the idea it refers to.  But really it is merely an undisciplined coding system that is more or less analogous to the way we think. 

Digital language is altering culture not only because numbers are more abstract than words and binary numbers are more abstract than continuous numbers, but also because it is insidious.  The 1’s and 0’s are hidden from view and the vast majority of people are unaware of them.  To a noticeable extent, it has already made us digital thinkers.   As a society, we have become data-driven.  We assess our scholarship by counting the number of times our publications are cited in other publications, our visual culture by the prices investors pay for art at auction and our spirituality by the frequency of church attendance.  We follow opinion polls assiduously at a time when societal dispositions are conspiring to significantly reduce their validity and reliability.  We reduce a sport like baseball to a vast number of sums and ratios and believe religiously that the data can help us predict the future.  But such numbers are merely imperfect surrogates for the things they purport to measure.  They are never the referents themselves.

The idea that Galileo’s dispute with the Holy Inquisition was about heliocentricity is simplistic at best.  It was rather a debate about the nature of reality.  Galileo, of course, was right about the earth’s rotation around the sun because he had observed and measured it.  But he was wrong in believing that the laws of nature are written in mathematics or, if not wrong, he was merely expressing an ingenious but unsupportable opinion.  To me, mathematics is a human construct.  It is often quite useful but Pi is only a useful ratio.  Indeed, the “laws of nature” are nothing more than artifacts.  There is nothing to suggest that “life” in some parallel universe must be carbon-based or that the hare will always outrun the tortoise.  Pythagoras was able to prove that the expression:

A² + B² = C²

defines the relationship between the hypotenuse and the legs of a right triangle.  His discovery turned out to be one of the most useful in the history of science.  But it is not a right triangle.  It derives from one aspect of the triangle.  Similarly, 261.625565 hertz describes one important aspect of Middle C but is not music.
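
That frequency is itself the artifact of a convention.  In equal temperament it follows from the arbitrary choice of 440 hertz for concert A, as a two-line calculation shows:

    A4 = 440.0                        # concert pitch, fixed by convention
    middle_c = A4 * 2 ** (-9 / 12)    # C4 lies nine semitones below A4
    print(f"{middle_c:.6f} Hz")       # 261.625565 Hz

The number is exact only relative to the convention:  a description of Middle C, not the note.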

The human race has turned a corner in its evolution.  Until recently, energy was what Daniel Bell called the “strategic and transforming resource” of the developed world.  It was energy that, when harnessed and applied to raw materials by technology, added value to them.  If you date the industrial revolution from James Watt’s first practical steam engine in 1776, it lasted about 200 years.  Today energy has been displaced as the great value adder by information harnessed and applied by digital technology.  The information revolution is different from anything that preceded it.  For one thing, information is the only resource that cannot be depleted.  The more we use it, the more it grows and the more valuable it becomes.  For another, it is ubiquitous, instantly available to anyone anywhere and acting as though it wants to be free.  For better and sometimes for worse, it both enables and compels the global economy.

In 1964, I had a conversation with a scholarly officer at the Command and General Staff College at Fort Leavenworth, Kansas.  He showed me a sandbox configured to display the terrain and troop dispositions around Cemetery Ridge near Gettysburg, Pennsylvania on the morning of July 3, 1863.  Twenty or so students were supposed to re-think Lee’s strategy and revise Pickett’s tactics accordingly.  The Army wanted to computerize the exercise so that each student’s responses could be critiqued in real time.  Given the sheer number of variables and the fluidity of the situation, it would be hard to imagine a more complex problem whether it was considered from the perspective of a commander on the ground or a computer programmer in a software laboratory.  The textbook answers had failed Lee, Longstreet and Pickett.  Although the Confederate attackers outnumbered the Union defenders by more than 2 to 1, their artillery could not see the enemy through the smoke and their infantry occupied the lowest part of an undulating field.  The software was supposed to assess each decision every student made and to display the results in the same amount of time the consequences would have become clear to the battlefield commanders in 1863.  Today’s most sophisticated computer games are nowhere near as complex and I am not sure today’s digital computers could handle the challenge.  Given that feedback time was meant to correspond to real time, an analog system would have a better chance especially if a student came up with a brilliant new solution.  The analog machine “thinks” more like a creative human being.

It reassures me to believe that neither technology could whisper the only good solution in Lee’s ear:  Don’t do this;  get the hell out of here.

Brave new world indeed!  There is an important advantage to be gained by thinking digitally or numerically:  it may nudge us away from the scandalous scientific illiteracy that besets our society.  If, as we think, language is the springboard of culture, constant exposure to the language of science is sure to ease our anxieties about it.  The caveat is we must recognize that science, like every other approach to knowledge, employs a metaphoric language.  For all our history we have thought, reasoned, imagined and even dreamed in metaphors.  Dealing with their imprecision, their fuzziness, has served us well.  It is important that we disconnect our young people from their smart phones long enough to teach them to use the brains they were born with.

_______________
*Jerry Harkins is a statistician who, long ago, spent two years as a member of the Army Corps of Engineers rising to the rank of Specialist Fourth Class, the same pay grade as a corporal.  In addition to regular turns at KP and guard duty, he was a member of a small unit of displaced draftee academics that undertook special projects related mostly to automated learning systems.  He is well aware of the epistemological problem posed by quantum weirdness even if he remains stubbornly unable to understand anything else about it.