Reparations [5]:  Moral Compulsion

Reparations for American slavery require a sense of moral compulsion. Moral compulsion requires humility. Are we capable of it?

There is no hope for reparations if the topic is left to business and politics as usual – to the customary manner in which decisions are made, national affairs are conducted, pundits and media outlets clamor for sensationalism, social media serves up clickbait, religion and social science and academia offer their apologetics to an unappreciative public, and the elected and electorate alike close their minds to any opinion other than the one they already hold.

Reparations have no place in a culture given over to polarization, rage, and post-truth subjectivity.

The case for reparations cannot be heard by a society deafened with the noise of the daily outrage and distracted with the madness du jour.

The case for reparations cannot reach a national identity hijacked by endless competing and ever-shapeshifting agendas, histrionic accusations, and the exigencies of life ever more difficult and dystopian.

Reparations have no place where populists fan the fires of rage, and the enraged populace persists in voting against its own self-interest.

Reparations have no chance to gain the support of people long-starved of commitment to their communal welfare, unaware that their own beliefs and truths have done this to them, have dumbed them down with despair and chained them to the incessant grinding of life with no cushion against their misfortunes or safety net to catch them when they fall.

Reparations cannot capture the imagination of a nation that denies its people leisure time for renewal and reflection, that accepts as logical, normal, and virtuous that they should be compelled to labor in a state of total work without respite or gain or opportunity for improvement.

Reparations will not find a way in a nominally democratic country where the practice of democracy languishes under polarized ideologies, where systemic inequalities and social Darwinism are not merely accepted but revered as true and right and just and godly proof of their nation’s superiority.

That, and more, is why reparations don’t have a chance in contemporary America. Is there any countervailing force strong enough to pave the way for them?

Yes, there is:  it is moral compulsion.

Moral compulsion is an urgency to set things to right, an overweening determination to be cleansed of an enduring ugliness, to be freed from the burden of national shame, a commitment to individual, cultural, and national transformation, an uncompromising will to transcend the mistakes of the past and meet the unprecedented challenges of today.

Moral compulsion would provide an irrepressible energy to displace the inevitable failure of reparations with robust action to ensure their implementation.

But what place does moral compulsion have in American policy-making at this time? Moral compulsion does not make the agenda of an administration devoted to consolidating its power by fomenting division and perverting the rule of law into a “law and order presidency.” Moral compulsion is also missing from the agenda of an opposition party incapable of anything other than the pathetic hope that if they stay still they will not be seen, if they remain silent they will not be singled out. Reparations have no chance when moral compulsion is unknown on one side of the aisle and a terror on the other. No conversation will be had and no compromise ever reached when even the least of moral consensus – common decency – cannot find common ground.

America’s current moral vacuum was not always the case.

“In the past, America has played a critical role on the global stage as a model for developing democracies, a crusader for human rights and a bulwark against the spread of authoritarian regimes. Former secretary of state Madeleine Albright once called America “the indispensable nation” for its moral leadership. But unlike ever before, scholars say, America’s commitment to democracy is flagging…. The risk, [Stanford political scientist Larry Diamond] says, is a century defined by the rise of the autocrat.”[1]

That was then.

What is now?

If the 2016 election taught us anything, it was that America had grown tired of its role as the world’s “moral leader.”

Moral leadership had become tiresome, our efforts not worth the return. The catastrophes of recent decades of international policy and a lost taste for globalization suggested we were not as suited to the job of worldwide betterment as we once thought. We could pick a fight anywhere in the world and win it; therefore, our strategy for bringing freedom and democracy to the world had been to impose our moral will by military force, backed by the covert support of right-wing strongmen through corruption, bribery, torture, and other forms of governmental criminality. Our moral duplicity was exposed when a raft of domestic and international whistleblowers and secret-leakers disgorged our tactics into public awareness, turning our times and technologies into apocalyptic revelation. They pulled back the facades of our imperial pridefulness, revealed the behind and beneath, ushered in a Great Revealing of ourselves to ourselves. Our secret vaults were opened, our private and vulnerable selves made known, all motives revealed, alliances betrayed, files ransacked, classified access breached, proprietary information violated, everything hacked and made Open Source, seals all broken, all safes cracked, all containers emptied and their contents strewn across a million conference tables and chronicled in the tabloids.

By 2016, we had lost the stomach for it. Moral leadership had become a “loser.”

There was a moral lesson in all this that we could have learned, and a new national self-awareness we could have gained.

    • What we will see, and what we won’t. The lenses we wear. The silos we construct.
    • What we block, recoil from. The shadows in our souls. The things we fear. The parts of us that threaten our own being.
    • Our biases, assumptions, prejudices, projections and deceptions. The cases we build to advantage ourselves, and the lengths we’ll go to cling to them.
    • The order we have imposed on life and the people in it. Rank, pecking order, winners and losers. Who we’ll talk to, friend, like, follow, ally with, and who we won’t. And why.
    • What we consider reasonable, viable, proper, possible… and their opposites.
    • What we will say, and what we won’t.
    • What we will hear, and what we won’t.
    • The secrets we carry, that we are confident will never be known by anyone but ourselves.
    • The cultivated appearances we can no longer keep up.
    • Our selective memories, choices, regrets. And resentments. Alliances betrayed and relationships broken. Forgiveness neither extended nor received.

The new, unflattering self-awareness we might have gained from these revelations could have given us a realigned perspective on who we had become. But we didn’t want to hear it, so we didn’t learn it. There were some rare feints at remorse:  press conference confessions saying we were sorry while the betrayed stood stoically by. No one was fooled:  we weren’t sorry we did it, we were sorry we got caught.

What have you gathered to report to your progenitors?
Are your excuses any better than your senator’s?
He held a conference and his wife was standing by his side
He did her dirty but no-one died

What are you waiting for, a kiss or an apology?
You think by now you’d have an A in toxicology
It’s hard to pack the car when all you do is shame us
It’s even harder when the dirtbag’s famous

          The Killers, Run For Cover

Mostly, we stormed and swore vengeance against the prophets of our moral recrimination. We labelled them as traitors and enemies, blew their legal cover, strong-armed foreign governments to give them up to our salivating justice. We were defensive because the truth hurt. America was not as blameless as we wanted to think.

It could have been a moral reckoning, but it wasn’t.

The disorienting truth could have reoriented us as a nation, could have shown us how we had shunned and discarded our ideals to make room for the twin pillars of our foreign policy:  capitalism and militarism. We could have become freshly aware of what we had built while no one was looking and we weren’t paying attention. We could have, but we didn’t. We couldn’t separate ourselves from our need to feel good about ourselves, from our national belief — that we breathe in from childhood and begin learning before preschool — that our nation is the apex of civilization — morally, spiritually, militarily, and economically. If we were appalled at all by what we had become, it was not because of what we might have learned about ourselves but because we were terrified to see our shadow selves dredged up from our own hidden vaults, now walking the streets, haunting and pursuing us, calling us out. We completed our denial and purposeful self-deception by concluding that surely some enemy had done this, had sown tares in our heartland wheat. They had done it. And now we were on to Them, newly justified in our judgment and pure in our hatred of Them.

We had been called to reckon, but we didn’t. We still haven’t. We denied and fled – away from Them and into ourselves. Globalization became a dirty word. Among its many faults was that it had made the world too small. We had too many neighbors too close, too unlike us. We needed our open spaces back, needed to feel again our rugged individualism, the spirit that tamed the Wild West.

“Globalization may be partly to blame [for America’s flagging commitment to democracy]: In an increasingly interconnected world, governing has gotten trickier. ‘If you have a constant flow of capital, people and trade goods, it’s harder to figure out what to do in your own country,’ says political science professor Anna Grzymala-Busse, who directs the Global Populisms Project at the Freeman Spogli Institute. The increasing interdependence of the world’s economies also limits the impact of any one nation’s policies. As mainstream politicians struggle to solve ‘national’ problems that are, in actuality, intertwined with the actions and economies of other countries, voters can start to view them as inept.

“Globalization has stoked nationalism and anti-immigrant sentiment among citizens who fear not only the economic but also the cultural changes that can accompany such shifts. There again, Grzymala-Busse says, populists have stepped in, defining ‘the people’ of a country narrowly and subjugating minority interests. ‘Populist movements have this very corrosive impact on democracy,’ she says.”[2]

We abandoned the global village and rushed home to ourselves – the people we wanted to believe we had once been and still were. We put those people and their country first. We demonized and expelled outsiders, built walls against Them, withdrew trade, made capital calls, foreclosed on collateral, imposed tariffs. We imprisoned them, banned their travel, rejected them. It was our turn, our time, and we would make the best of it.

And none of that helped assuage our national conscience, rooted as it was in the lies of lost utopia.

Lashed on by those who stood to gain the most from our disorientation, we stormed the gates of the lost Garden in hyped-up agitation, and the more we ranted, the more we became addicted, drugged with the madness of a mob that promised a return to the unjustified and unaccountable superiority we had granted to our idealized and delusional past. We reconstituted our fictional past into a delirious present, created in the image of every broken promise we had ever made.

We doubled down on a bluff, and when the other worldwide players laughed at our bravado, our national resentment turned spiteful and toxic. We turned our rage not only against Them but against ourselves. We banned the notion of the public welfare and communal good. We forfeited our rights to a living wage, to healthcare and education, to security in retirement, to home ownership, to security against our own human frailty and life cycles. We derided the notion of public welfare as weak and pitiful, and converted all of life and culture, law and economics, government and socio-economic policy over to hyper-competition. We traded moral and societal good for law and order, the triumph of power, and the ascension of socio-economic elitism. We drowned out doomsayers with chanted mythologies that placed humans, and particularly Caucasians, at the apex of Creation, crowned with the divine right to subdue it to our own ruin. We jettisoned science, objective truth, and reasonable discourse in favor of an unbridled right to mangle our own truth until it made us gods, force-feeding our starving souls with “reality” that wasn’t.

And now, into our failed and rejected moral leadership and policies of communal hatred comes the idea of reparations for slavery.

Which is why reparations don’t have a chance under America’s populist overlords and their domestic armies. The moral compulsion reparations require has been crushed in the void of our national implosion.

Reparations offer us a way out – a way to restore ourselves and our nation, to push back the night, to draw ourselves back from the brink of our final self-destruction. Paying the moral debt of slavery offers the salving of our collective conscience through restoring and recreating, repairing and remediating the stain of our beginnings and our stumbling path through our own history. It offers to fill the unfathomable moral trough excavated by the systematic brutalization of an entire class of fellow humans in ways that none, nobody, not one of the rest of us would ever, never, not ever accept for ourselves, not in a million years, but that our ancestors carried out in untroubled allegiance to what for them was normal, legal, and their divine right – an ideological tradition the nation has carried on ever since the ultimately empty “victory” of the Civil War, which officially abolished slavery but left untouched its de facto existence.

In our current moral vacuum, reparations for slavery are not just difficult and troublesome and unlikely, they are impossible – irrevocably not-on-my-watch, over-my-dead-body impossible. They have only one hope:

Reparations will be made only when
they are no longer reparations for slavery.

Not even if they are made for racism.

But when they are made for our lost humanity.

The essence of moral compulsion is humility.

America would need to do as Germany did after the Holocaust — publicly relinquish belief in the superiority of white European ancestry. Germans had to abandon the “Teutonic national myth.” Americans would need to abandon the myth of manifest destiny. Humbling ourselves in that way would be heroic.

If Germany’s example plays out in America, there would be violent opposition. And, as Germany’s example also teaches us, humility is a two-way street:  both those making reparations and those benefiting from them must humble themselves to each other and before the eyes of the watching world. Humility will not be easy on either side:

“Humility is the most difficult of all virtues to achieve;
 nothing dies harder than the desire to think well of self.”

T.S. Eliot

We will look more at Germany’s example next time, and also at the international mechanism created after WWII that could help us with the difficult task of humbling ourselves – a mechanism that America’s government has rejected.

[1] Patton, Jill, An Existential Moment for Democracy? As American leadership falters, scholars say, autocrats are on the rise, Stanford Magazine (December 2019)

[2] Ibid.

Reparations [4]:  The Essential Doubt

And so you see I have come to doubt
All that I once held as true
I stand alone without beliefs
The only truth I know is you.

Kathy’s Song[1]
Paul Simon

We saw last time that the U.S. government could waive its legal defense of sovereign immunity to pave the way for slavery reparations. It would take more than a legal reckoning for that to happen. Law lies on the surface of society, readily visible, but it has deep roots in history and ideology, national identity and mission, values and beliefs, ways of looking at the world and how life works.[2] These ancient root systems invoke fierce allegiances deeply embedded in human psyche and culture. Because the legal doctrine of sovereign immunity is grounded in Biblical doctrine,[3] laying it aside requires doubt and dissent of the highest order – national treason and religious apostasy in a single act.

Doubt of that magnitude is rare beyond description but not without precedent. Consider, for example, Germany’s reparations for World War II, which required not only the international banishment of Nazism, but also the German people’s moral renunciation of Nazism’s philosophical and political roots stretching back to the 19th Century.[4] In comparison, the USA’s roots of slavery (and hence racism) extend back to the earliest New World settlements, which imported English common law, including the divine right of kings and its nationalistic version, sovereign immunity. Renouncing the latter to pave the way for slavery reparations would require a similar American moral renunciation of centuries of related social, economic, and political ideology and set new terms for a post-racism American state.

That, in turn, would require a reckoning with the “first cause” roots of the divine right of kings and sovereign immunity.

The First Cause Roots of Sovereign Immunity

A “first cause” satisfies the human desire for life to make sense by assigning a cause to every effect. Trouble is, as you trace the cause and effect chain to its remotest origins, you eventually run out of causes, leaving you with only effects. That’s when a first cause comes to the rescue. A first cause has no prior cause – it is so primary that nothing came before it but everything came after it. Since knowledge can’t reach that far back, a first cause is a matter of belief:  you take it on faith, declare the beginning into existence, and go from there.

Western civilization’s worldview historically identified God as the ultimate first cause.

“First cause, in philosophy, is the self-created being (i.e., God) to which every chain of causes must ultimately go back. The term was used by Greek thinkers and became an underlying assumption in the Judeo-Christian tradition. Many philosophers and theologians in this tradition have formulated an argument for the existence of God by claiming that the world that man observes with his senses must have been brought into being by God as the first cause.

“The classic Christian formulation of this argument came from the medieval theologian St. Thomas Aquinas, who was influenced by the thought of the ancient Greek philosopher Aristotle. Aquinas argued that the observable order of causation is not self-explanatory. It can only be accounted for by the existence of a first cause; this first cause, however, must not be considered simply as the first in a series of continuing causes, but rather as first cause in the sense of being the cause for the whole series of observable causes.

“The 18th-century German philosopher Immanuel Kant rejected the argument from causality because, according to one of his central theses, causality cannot legitimately be applied beyond the realm of possible experience to a transcendent cause.

“Protestantism generally has rejected the validity of the first-cause argument; nevertheless, for most Christians it remains an article of faith that God is the first cause of all that exists. The person who conceives of God in this way is apt to look upon the observable world as contingent—i.e., as something that could not exist by itself.”[5]

God is the ultimate Sovereign from which all lesser sovereigns – the king, the national government — derive their existence and legitimacy. God’s first cause Sovereignty justifies God’s right to rule as God sees fit. The king and the state, having been set into place by God, derive a comparable right of domination from God. The king and the national government are to the people what God is to them.

The Divine Right of Kings

When kings ruled countries, their divine line of authority took legal form as the Divine Right of Kings.

“The divine right of kings, divine right, or God’s mandate is a political and religious doctrine of royal and political legitimacy. It stems from a specific metaphysical framework in which the king (or queen) is pre-selected as an heir prior to their birth. By pre-selecting the king’s physical manifestation, the governed populace actively (rather than merely passively) hands the metaphysical selection of the king’s soul – which will inhabit the body and thereby rule them – over to God. In this way, the ‘divine right’ originates as a metaphysical act of humility or submission towards the Godhead.

“Consequentially, it asserts that a monarch (e.g. a king) is subject to no earthly authority, deriving the right to rule directly from divine authority, like the monotheist will of God. The monarch is thus not subject to the will of his people, of the aristocracy, or of any other estate of the realm. It implies that only divine authority can judge an unjust monarch and that any attempt to depose, dethrone or restrict their powers runs contrary to God’s will and may constitute a sacrilegious act.”[6]

The Divine Right of Kings was a favorite doctrine of the first King James of England, who commissioned what would become the King James Version of the Bible partly in response to Puritan challenges to the Church of England’s doctrine of an ordained clergy that could trace its lineage to the original Apostles.

“Divine right of kings, in European history, a political doctrine in defense of monarchical ‘absolutism,’ which asserted that kings derived their authority from God and could not therefore be held accountable for their actions by any earthly authority such as a parliament. Originating in Europe, the divine-right theory can be traced to the medieval conception of God’s award of temporal power to the political ruler, paralleling the award of spiritual power to the church. By the 16th and 17th centuries, however, the new national monarchs were asserting their authority in matters of both church and state. King James I of England (reigned 1603–25) was the foremost exponent of the divine right of kings….”[7]

“While throughout much of world history, deified potentates have been the rule, in England, absolute monarchy never got a solid foothold, but there certainly was the attempt. Elements of British political theory and practice encouraged absolutism—the idea and practice that the king is the absolute law and that there is no appeal beyond him. Several movements and ideas hurried along the idea of absolute monarchy in England. One of those ideas was the divine right of kings.

“In England, the idea of the divine right of kings will enter England with James VI of Scotland who will come and rule over both England and Scotland as James I in 1603 and will commence the line of several ‘Stuart’ monarchs. James had definite ideas about his role as monarch, and those ideas included the divine right of kings. Here are just a few of James’ statements that reflect his view that he ruled by divine right:

      • Kings are like gods— “…kings are not only God’s lieutenants upon earth, and sit upon God’s throne, but even by God himself are called gods.”
      • Kings are not to be disputed— “… That as to dispute what God may do is blasphemy….so is it sedition in subjects to dispute what a king may do in the height of his power.”
      • Governing is the business of the king, not the business of the subjects— “you do not meddle with the main points of government; that is my craft . . . to meddle with that were to lesson me . . . I must not be taught my office.”
      • Kings govern by ancient rights that are his to claim— “I would not have you meddle with such ancient rights of mine as I have received from my predecessors . . . .”
      • Kings should not be bothered with requests to change settled law— “…I pray you beware to exhibit for grievance anything that is established by a settled law…”
      • Don’t make a request of a king if you are confident he will say “no.”— “… for it is an undutiful part in subjects to press their king, wherein they know beforehand he will refuse them.”

“James’ views sound egotistical to us today, but he was not the only one that held them. These views were held by others, even some philosophers. For example, the English philosopher Thomas Hobbes wrote a work called Leviathan in 1651 in which he said that men must surrender their rights to a sovereign in exchange for protection. While Hobbes was not promoting the divine right of kings per se, he was providing a philosophy to justify a very strong absolute ruler, the kind that the divine right of kings prescribes. Sir Robert Filmer was a facilitator of the divine right of kings and wrote a book about it called Patriarcha (1660) in which he said that the state is like a family and that the king is a father to his people. Filmer also says that the first king was Adam and that Adam’s sons rule the nations of the world today. So, the King of England would be considered the eldest son of Adam in England or the King of France would be Adam’s eldest son in France.”[8]

King James, Witch Hunter

King James had no impartial academic interest in a Bible translation that supported his divine right:  during his reign, the “Cradle King” accumulated a long list of covered offenses that included mass murder, torture, injustice, treachery, cruelty, and misogyny.

“The witch-hunts that swept across Europe from 1450 to 1750 were among the most controversial and terrifying phenomena in history – holocausts of their times. Historians have long attempted to explain why and how they took such rapid and enduring hold in communities as disparate and distant from one another as Navarre and Copenhagen. They resulted in the trial of around 100,000 people (most of them women), a little under half of whom were put to death.

“One of the most active centres of witch-hunting was Scotland, where perhaps 4,000 people were consigned to the flames – a striking number for such a small country, and more than double the execution rate in England. The ferocity of these persecutions can be attributed to the most notorious royal witch-hunter: King James VI of Scotland, who in 1603 became James I of England.

“Most of the suspects soon confessed – under torture – to concocting a host of bizarre and gruesome spells and rituals in order to whip up the storm.… James was so appalled when he heard such tales that he decided to personally superintend the interrogations… while the king looked on with ‘great delight’.

“James’s beliefs had a dangerously misogynistic core. He grew up to scorn – even revile – women. Though he was by no means alone in his view of the natural weakness and inferiority of women, his aversion towards them was unusually intense. He took every opportunity to propound the view that they were far more likely than men to succumb to witchcraft…. He would later commission a new version of the Bible in which all references to witches were rewritten in the female gender.

“Most witchcraft trials constituted grave miscarriages of justice…. If the actual facts of a case were unsatisfactory, or did not teach a clear enough moral lesson, then they were enhanced, added to or simply changed.”[9]

When the new King James Bible substantiated the King’s divine right to carry on these activities, and when the USA imported the king’s divine right into its legal system as sovereign immunity, both acknowledged God as the first cause of these legal doctrines. Like the King, the U.S. government also has a long list of covered offenses:  the treatment of slaves during the reign of legal slavery mirrors King James’ obsession with brutalizing, lynching, and murdering witches.

In the U.S., where a 2019 Gallup Poll found that 64%–87% of Americans believe in God (depending on how the question was asked), there remain many “Christians [for whom] it remains an article of faith that God is the first cause of all that exists.”[10] As a result, we see in the USA’s current social and political climate both explicit and implicit affirmation of the following Bible passages (which the online source appropriately expresses in the King James Version) to substantiate the ability of national leaders to avoid accountability for acts of governance that sponsor this kind of horrifying treatment of citizens:[11]

“Let every soul be subject unto the higher powers. For there is no power but of God: the powers that be are ordained of God. Whosoever therefore resisteth the power, resisteth the ordinance of God: and they that resist shall receive to themselves damnation. For rulers are not a terror to good works, but to the evil. Wilt thou then not be afraid of the power? do that which is good, and thou shalt have praise of the same: For he is the minister of God to thee for good. But if thou do that which is evil, be afraid; for he beareth not the sword in vain: for he is the minister of God, a revenger to execute wrath upon him that doeth evil. Wherefore ye must needs be subject, not only for wrath, but also for conscience sake.” Romans 13:1-5, KJV

“Lift not up your horn on high: speak not with a stiff neck. For promotion cometh neither from the east, nor from the west, nor from the south. But God is the judge: he putteth down one, and setteth up another.” Psalms 75:5-7, KJV

“Daniel answered and said, Blessed be the name of God for ever and ever: for wisdom and might are his: And he changeth the times and the seasons: he removeth kings, and setteth up kings: he giveth wisdom unto the wise, and knowledge to them that know understanding:” Daniel 2:20-21, KJV

“This matter is by the decree of the watchers, and the demand by the word of the holy ones: to the intent that the living may know that the most High ruleth in the kingdom of men, and giveth it to whomsoever he will, and setteth up over it the basest of men.” Daniel 4:17, KJV

“I have made the earth, the man and the beast that are upon the ground, by my great power and by my outstretched arm, and have given it unto whom it seemed meet unto me.” Jeremiah 27:5, KJV

“The king’s heart is in the hand of the LORD, as the rivers of water: he turneth it whithersoever he will.” Proverbs 21:1, KJV

“For rebellion is as the sin of witchcraft, and stubbornness is as iniquity and idolatry. Because thou hast rejected the word of the LORD, he hath also rejected thee from being king. And Saul said unto Samuel, I have sinned: for I have transgressed the commandment of the LORD, and thy words: because I feared the people, and obeyed their voice. Now therefore, I pray thee, pardon my sin, and turn again with me, that I may worship the LORD. And Samuel said unto Saul, I will not return with thee: for thou hast rejected the word of the LORD, and the LORD hath rejected thee from being king over Israel.” 1 Samuel 15:23-26, KJV

“And upon a set day Herod, arrayed in royal apparel, sat upon his throne, and made an oration unto them. And the people gave a shout, saying, It is the voice of a god, and not of a man. And immediately the angel of the Lord smote him, because he gave not God the glory: and he was eaten of worms, and gave up the ghost.” Acts 12:21-23, KJV

The Ultimate Focus of Doubt:  God

In “Abrahamic” cultures — Jewish, Muslim, and Christian – the Biblical God is the first cause of the divine right of kings and sovereign immunity. The full force of patriotic nationalism and religious zeal therefore originates with God – which explains why a surprising number of European nations had blasphemy laws on the books until not that long ago, and why some nations still do.[12]

“Blasphemy is the act of insulting or showing contempt or lack of reverence to a deity, or sacred objects, or toward something considered sacred or inviolable.”[13]

God, it seems, like kings and sovereign nations, has much to be excused from. Aside from the Biblical God’s sponsorship of war, genocide, mass murder, rape, torture, and brutality to humans and animals, a list of modern labels would include misogynist, homophobe, and xenophobe. But of course you don’t think that way if you’re a believer, because that would be blasphemy, often punishable by death, often after the infliction of the kind of cruel and unusual punishment reserved for the faithful and unfaithful alike. As for the faithful, the Bible makes it a badge of honor to suffer in the name of God:

“Some were tortured, refusing to accept release, so that they might rise again to a better life. Others suffered mocking and flogging, and even chains and imprisonment. They were stoned, they were sawn in two, they were killed with the sword. They went about in skins of sheep and goats, destitute, afflicted, mistreated—of whom the world was not worthy—wandering about in deserts and mountains, and in dens and caves of the earth. And all these, though commended through their faith, did not receive what was promised,” Hebrews 11:35-39, ESV

Transformation Made Possible by Doubt

Nonbelievers not vexed with these kinds of rights of the sovereign and duties of the governed are free to doubt God’s first cause status and its derivative doctrines, laws, and policies. In the USA, doubt embraced on that level would open the door to any number of contrary beliefs – for example:

    • The state does not enjoy superior status — historically, legally, morally, or otherwise – that gives it a right to act without consequence.
    • The people governed are therefore not bound – theologically, morally, or otherwise – to submit to government that is not responsible for its actions.

Once you’re no longer worried about breaking faith with God as the first cause of your national institutional structure, a whole new “social contract” (also discussed last time) between government and the people becomes possible – a contract that would, in effect, not be satisfied with paying only descendants of slaves “damages” for past harm, but would look to establish a fresh national vision of the duties of those who govern and the rights and freedoms of the governed. The result, it would seem, is the possibility of ending the USA’s institutionalized racism for good.

[1] Who was Paul Simon’s Kathy? And whatever happened to her? See this article from The Guardian.

[2] See the Belief Systems and Culture category of posts in my Iconoclast.blog.

[3] The Founding Myth: Why Christian Nationalism Is Un-American, Andrew L. Seidel (2019). Although the USA was not founded as a Christian nation, its core values and beliefs, like those of other Western countries, are Classical and Biblical in origin.

[4]  See Alpha History and The Mises Institute on the historical origins of Nazism.

[5] Encyclopedia Britannica. See also New World Encyclopedia and the Stanford Encyclopedia of Philosophy.

[6] Wikipedia – The Divine Right of Kings.

[7] Encyclopedia Britannica and Wikipedia. See also the New World Encyclopedia.

[8] Owlcation

[9] Borman, Tracy, James VI And I: The King Who Hunted Witches, History Extra (BBC Historical Magazine) (March 27, 2019)

[10] Encyclopedia Britannica. See also New World Encyclopedia and the Stanford Encyclopedia of Philosophy.

[11] “Bill’s Bible Basics.”

[12]  Wikipedia – Blasphemy law.

[13]  Wikipedia – Blasphemy.

Reparations [3]: The Airtight Legal Case Against Them, and the Moonshot That Would Make Them Possible

“We choose to go to the Moon in this decade… not because [it is] easy, but because [it is] hard; because that goal will serve to organize and measure the best of our energies and skills, because that challenge is one that we are willing to accept, one we are unwilling to postpone, and one we intend to win….”

JFK, Sept. 12, 1962[1]

It was 1962 and the Cold War was raging. Soviet leader Nikita Khrushchev gave his “we will bury you” speech in 1956[2] and his shoe-banging speech in 1960[3]. Meanwhile, the competition had turned skyward[4], and the Soviet Union had gotten a leg up.

“History changed on October 4, 1957, when the Soviet Union successfully launched Sputnik I.

“That launch ushered in new political, military, technological, and scientific developments. While the Sputnik launch was a single event, it marked the start of the space age and the U.S.-U.S.S.R space race.

“As a technical achievement, Sputnik caught the world’s attention and the American public off-guard… the public feared that the Soviets’ ability to launch satellites also translated into the capability to launch ballistic missiles that could carry nuclear weapons from Europe to the U.S.”[5]

Astrophysicist Neil deGrasse Tyson compares Sputnik’s impact to the furor that ensued when, on January 11, 2007, China blasted one of its own weather satellites out of the sky:

“The hit put tens of thousands of long-lived fragments into high Earth orbit, adding to the already considerable dangers posed by debris previously generated by other countries, notably ours. China was roundly criticized by other spacefaring nations for making such a mess: twelve days later, its foreign ministry declared that the action ‘was not directed at any country and does not constitute a threat to any country.’

“Hmm. That’s a little like saying the Soviet Union’s launch of the world’s first satellite, Sputnik, in October 1957 was not a threat — even though Sputnik’s booster rocket was an intercontinental ballistic missile, even though Cold Warriors had been thirsting for a space-based reconnaissance vehicle since the end of World War II, even though postwar Soviet rocket research had been focusing on the delivery of a nuclear bomb across the Pacific, and even though Sputnik’s peacefully pulsing radio transmitter was sitting where a nuclear warhead would otherwise have been.”[6]

JFK announced the USA’s comeback with his “we choose to go to the moon” speech[7] to 40,000 people packed into the stadium at Rice University.[8] It was visionary in concept and triumphant in tone. The USA wasn’t going to go to the moon just because the Soviets were trying to beat us there, not just to win a celestial derby for a grand prize of bragging rights, and not just to gain the ultimate battlefield high ground. We were going to do it to further America’s mission of bringing peace to the nations, including the new frontier of outer space.

“Those who came before us made certain that this country rode the first waves of the industrial revolutions, the first waves of modern invention, and the first wave of nuclear power, and this generation does not intend to founder in the backwash of the coming age of space. We mean to be a part of it–we mean to lead it. For the eyes of the world now look into space, to the moon and to the planets beyond, and we have vowed that we shall not see it governed by a hostile flag of conquest, but by a banner of freedom and peace. We have vowed that we shall not see space filled with weapons of mass destruction, but with instruments of knowledge and understanding.

“Yet the vows of this Nation can only be fulfilled if we in this Nation are first, and, therefore, we intend to be first. In short, our leadership in science and in industry, our hopes for peace and security, our obligations to ourselves as well as others, all require us to make this effort, to solve these mysteries, to solve them for the good of all men, and to become the world’s leading space-faring nation.

“We set sail on this new sea because there is new knowledge to be gained, and new rights to be won, and they must be won and used for the progress of all people. For space science, like nuclear science and all technology, has no conscience of its own. Whether it will become a force for good or ill depends on man, and only if the United States occupies a position of pre-eminence can we help decide whether this new ocean will be a sea of peace or a new terrifying theater of war. I do not say that we should or will go unprotected against the hostile misuse of space any more than we go unprotected against the hostile use of land or sea, but I do say that space can be explored and mastered without feeding the fires of war, without repeating the mistakes that man has made in extending his writ around this globe of ours.

“There is no strife, no prejudice, no national conflict in outer space as yet. Its hazards are hostile to us all. Its conquest deserves the best of all mankind, and its opportunity for peaceful cooperation may never come again. But why, some say, the moon? Why choose this as our goal? And they may well ask why climb the highest mountain? Why, 35 years ago, fly the Atlantic? Why does Rice play Texas?

“We choose to go to the moon. We choose to go to the moon in this decade and do the other things, not because they are easy, but because they are hard, because that goal will serve to organize and measure the best of our energies and skills, because that challenge is one that we are willing to accept, one we are unwilling to postpone, and one which we intend to win, and the others, too.

“It is for these reasons that I regard the decision last year to shift our efforts in space from low to high gear as among the most important decisions that will be made during my incumbency in the office of the Presidency.”

The speech didn’t focus on the bad guys, didn’t accuse or blame them, didn’t spout media-speak about protecting our national interests. Instead, it was aspirational. It seized the high ground. We were going to the moon because that’s the kind of thing Americans do — we willingly test ourselves to see how good we are. We routinely “organize and measure the best of our energies and skills because that challenge is one that we are willing to accept.” We do hard things, we take on huge challenges because that’s who we are. We stand on the high ground – on Earth, and in space.

It’s hard to imagine someone making a speech like that today. It feels hokey in the unforgiving hindsight of all that’s transpired in the past 60 years, and especially recently. No, I’m not nostalgic for the 60’s — those were not “the best days of my life.”[9] And no, I’m not beatifying JFK or waving the flag of American superiority – a myth I’ve long since had disillusioned out of me. It’s just that I miss living in a culture, nation, and world where leaders think and act and talk like that. And in particular, if we’re going to talk about reparations for slavery, we need to do so with the kind of attitude and outlook that permeated JFK’s speech. Otherwise, the legal technicalities will shut it down.

The Open-and-Shut Case Against Reparations

Here is the insurmountable legal case against reparations:

  • Slavery wasn’t illegal. There are not, and never have been, criminal penalties or civil remedies against those who carried it out — all of whom are long since dead anyway.
  • The only possible responsible party is the government itself, which sponsored slavery in the first place.
  • But even if there were legal grounds to prosecute or sue the government (there aren’t), you can’t do it anyway. That’s because the government is protected by the legal doctrine of “sovereign immunity,” which means it can’t be held to account for administering its own law.
  • The only tribunal with authority to override the doctrine of sovereign immunity is international law, but submitting to international law is voluntary, a matter of each nation’s willingness to give up some of its sovereignty to its national peers, and that is a choice the U.S. has not made.

“Law and order” adherence to this legal case instantly shuts down the idea. The case against reparations is exemplified in what Senate Majority Leader Mitch McConnell said about the topic:

“I don’t think reparations for something that happened 150 years ago for whom none of us currently living are responsible is a good idea. We’ve tried to deal with our original sin of slavery by fighting a Civil War and passing landmark civil rights legislation. We’ve elected an African-American president. I think we’re always a work in progress in this country, but no one currently alive was responsible for that. And I don’t think we should be trying to figure out how to compensate for it. First of all, because it’s pretty hard to figure out who to compensate.”[10]

McConnell’s comments make it clear that he views reparations in the conventional way of suing for “damages” – money – to recompense a victimized party for past losses.

I wasn’t there. Nobody who’s alive now was there. Everybody who was there is dead now. It’s not my fault. It’s nobody’s fault. The law doesn’t hold anybody accountable.

He was right about all that. The rest of what he said was legally unnecessary, a resort to the kinds of rationalization and platitudes we reach for when what we really mean is “over my dead body.”

Slavery was bad, but why dwell on the past? We’ve been trying to move on, put it behind us. We’re a work in progress. We need to let bygones be bygones.

He didn’t need platitudes. He could have gone straight to the ultimate legal defense:

The Ultimate Defense: Sovereign Immunity

“Sovereign immunity, or crown immunity, is a legal doctrine whereby a sovereign or state cannot commit a legal wrong and is immune to civil suit or criminal prosecution.”[11]

Sovereign immunity came over on the boat with the rest of English common law.

“Sovereign immunity finds its origins in English common law and the king’s position at the ‘apex of the feudal pyramid.’ In that pyramid, lords could not be sued in their own courts, ‘not because of any formal conception or obsolete theory, but on the logical and practical ground that there can be no legal right as against the authority that makes the law on which the right depends.’ Thus, lords could only be sued in the courts of their superiors, but, for the king, ‘there was no higher court in which he could be sued.’” [12]

Where Sovereign Immunity Came From: The Divine Right of Kings

Sovereign immunity is a carryover from the “Divine Right of Kings” – a legal doctrine formulated in the days when monarchies were more than ceremonial. The doctrine was derived from the Biblical worldview that underlies law and culture in America, Europe, and the U.K.

“The theory of the divine right of kings lent support to the proposition that the king was above the law-that he was in fact the law-giver appointed by God, and therefore could not be subjected to the indignity of suit by his subjects…. To Bracton the maxim ‘the king can do no wrong’ meant simply that the king was not privileged to do wrong, but to Blackstone the phrase was not so restricted, and in his Commentaries the following is to be found: ‘Besides the attribute of sovereignty, the law also ascribes to the king in his political capacity absolute perfection… The king, moreover, is not only incapable of doing wrong, but even of thinking wrong: he can never mean to do an improper thing: in him is no folly or weakness.’”[13]

The divine right of kings and non-monarchical sovereign immunity both mean that government – i.e., the people in it who determine and enforce its laws – gets the same hands-off treatment as God. God can do no wrong — neither can the king or the President or their emissaries.

I still recall sitting in a law school class when I learned about this. How could it be, that government would not be held accountable for how it treats the governed? “Government needs to be free to govern,” my law professor explained.

There is, however, one powerful way through this legal barrier:

Sovereign Immunity Can be Waived.[14]

The government can volunteer to make things right – it can waive its own sovereign immunity. (It has in fact done so on other occasions, which we will also look at another time.)

Viewed solely as a legal act, a waiver of sovereign immunity would require the commitment and action of all three branches of U.S. government: an act of Congress, signed into law by the President, and upheld as Constitutional by the Supreme Court.

Beyond legalities, reparations would require a break from centuries-old notions of the right of government to govern as it sees fit. Such a break would require a new “social contract.” As one history teacher explains:

“The Divine Right of Kings represents a ‘Top Down’ approach to government, in contrast with the ‘Bottom Up’ approach of social contract theory, which claims that the people create governments for their own protection and that those governments serve the people who created them.”[15]

A New Social Contract

According to Rousseau, a social contract is the mechanism by which we trade individual liberty for community restraint. As Thomas Hobbes famously said, lack of that tradeoff is what makes life “solitary, poor, nasty, brutish, and short.”[16] Or, as a recent version put it, “For roughly 99% of the world’s history, 99% of humanity was poor, hungry, dirty, afraid, stupid, sick, and ugly.”[17] A social contract suggests we can do better. As Hobbes said:

“As long as men live without a common power to keep them all in awe, they are in the condition known as war, and it is a war of every man against every man.

“When a man thinks that peace and self-defense require it, he should be willing (when others are too) to lay down his right to everything, and should be contented with as much liberty against other men as he would allow against himself.”[18]

The USA was created out of the colonists’ desire for a new social contract when their deal with England grew long on chains and short on freedom. In response, the Founders declared a new sovereign nation into existence:

“We hold these truths to be self-evident, that all Men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty, and the Pursuit of Happiness.”

The new nation was conceived in liberty, but there would be limits. Once the Revolutionary War settled the issue of sovereign independence[19], the Founders articulated a new freedom/chains balance:

“We the People of the United States, in Order to form a more perfect Union, establish Justice, insure domestic Tranquility, provide for the common defense, promote the general Welfare, and secure the Blessings of Liberty to ourselves and our Posterity, do ordain and establish this Constitution for the United States of America.”

That original social contract + revisions and amendments over the course of 250 years of history = the USA as we know it today.

Mitch McConnell was right: our nation’s history is always a work in progress – we are constantly revisiting and readjusting our social contract.

For reparations to happen, we need a new social contract that would enable a waiver of sovereign immunity. And for that to happen, the new social contract needs to explicitly reject a racial perspective articulated by none other than John Wilkes Booth:

“This country was formed for the white, not for the black man,” John Wilkes Booth wrote, before killing Abraham Lincoln. “And looking upon African slavery from the same standpoint held by those noble framers of our Constitution, I for one have ever considered it one of the greatest blessings (both for themselves and us) that God ever bestowed upon a favored nation.”[20]

Reparations Would Require Another Moon Shot

A new social contract is an idea of monumental proportions. People don’t rally behind small ideas. National transformation requires big, bold, decisive initiatives — ideas that are hard, even impossible by current standards, that require voyages into uncharted territory and a commitment to solving unprecedented problems. The USA would make reparations for slavery because that’s what Americans do — we willingly test ourselves to see how good we are. We routinely “organize and measure the best of our energies and skills because that challenge is one that we are willing to accept.” We do hard things, we take on huge challenges. That’s who we are. We don’t make ourselves the good guys and everyone else the bad. We don’t blame them, don’t spout media-speak about national interests, don’t hide behind legal technicalities. We do the aspirational. We stand on the high ground – on Earth, and in space.

If the USA is going to make reparations for slavery, we need a new moonshot.

 

[1] Here’s the full text. See also Wikipedia.

[2] See a previously classified CIA report on that speech here.

[3] See Wikipedia.

[4] See this timeline for the Space Race.

[5] NASA.

[6] Tyson, Neil deGrasse and Lang, Avis, Accessory to War: The Unspoken Alliance Between Astrophysics and the Military.

[7] Here’s the full text.

[8] Wikipedia.

[9] Bryan Adams, Summer of ’69.

[10] Axios.com.

[11] Wikipedia on Sovereign Immunity. See also Wikipedia on Sovereign Immunity in the United States.

[12] McCann, Miles, “State Sovereign Immunity,” National Association of Attorneys General, NAGTRI Journal Volume 2, Number 4. Although the article is technically about state – vs. federal – sovereign immunity, the quoted text applies to both. See also the following quote from this monograph from the law firm of Debevoise & Plimpton, a New York-based firm with a reputation for its commitment to diversity: “At its core, the doctrine of sovereign immunity stands for the proposition that the government cannot be sued without its consent – that is, ‘the King can do no wrong.’ Sovereign immunity is simple in concept but nuanced in application.”

[13] Pugh, George W., “Historical Approach to the Doctrine of Sovereign Immunity,” Louisiana Law Review Volume 13, Number 3 (March 1953). Citations omitted.

[14] McCann, Miles, “State Sovereign Immunity” and Wikipedia on Sovereign Immunity in the United States.

[15] TomRichey.net.

[16] Hobbes, Thomas, Leviathan.

[17] Rutger Bregman, Utopia for Realists (2016).

[18] Hobbes, op cit.

[19] In Hobbes’ terms, social contracts end the battle royale. Ironically, they often also create war as the ideals of one contract conflict with those of another.

[20] Coates, Ta-Nehisi, The Case for Reparations, The Atlantic (June 2014).

Reparations [2]: Slavery, Human Capital, Le Déluge, and Paying the Piper

Après moi, le déluge.
(After me, the deluge.)
King Louis XV of France

The proposal of reparations for the USA’s racial history raises complex legal, economic, and other issues. We’re familiar with these – they’ve been well-rehearsed in op-eds and speeches, by politicians and pundits, activists and the media….

Less familiar are issues more subjective than objective, reflective than combative, instinctual than intellectual. These are the province of shared human experience and sensibility, particularly of virtue — a nearly obsolete concept these days. Virtue prompts change not from the outside, not institutionally, but from a transformation in shared human consciousness, a cultural change of heart. We learn its lessons not from economic models and legal briefs, but principally from truth expressed in fiction — myths and legends, fables and feature films — Aesop’s Fables for adults. As one of Aesop’s contemporaries said about him:

“… like those who dine well off the plainest dishes, he made use of humble incidents to teach great truths, and after serving up a story he adds to it the advice to do a thing or not to do it. Then, too, he was really more attached to truth than the poets are; for the latter do violence to their own stories in order to make them probable; but he by announcing a story which everyone knows not to be true, told the truth by the very fact that he did not claim to be relating real events.”[1]

As we’ll see below, virtue asks more than legal compliance; it demands that we pay the piper.

In this series, we will look at both kinds of issues in detail.

History Lesson: The French Revolution

“After me, the deluge” is sometimes attributed to the King’s mistress, Madame de Pompadour, as “After us, the deluge.” Either way – King or mistress, me or us – the quote is usually taken as a prophecy of the French Revolution, delivered with an attitude of elite indifference that ranks right in there with Marie Antoinette’s “Let them eat cake.” (Which she probably never said.[2]) “We’re getting away with it now, but all hell is going to break loose once we’re gone.” And indeed it did, when King Louis XVI was guillotined a generation later, under the name Citizen Louis Capet.[3]

From that historical context, après moi, le déluge has come to represent an awareness of coming doom, a feeling that we can’t get away with this forever. Things are good now, but watch out, they won’t last. People thought life was good back in Noah’s time, but look what happened to them. We keep this up, we might get our own version of the Flood.

Contemporary Lesson: Economic Inequality

Plutocrat Nick Hanauer offers a modern version of the saying in his TED talk. According to his TED bio, Hanauer is a “proud and unapologetic capitalist” and founder of 30+ companies across a range of industries, including aQuantive, which Microsoft bought for $6.4 billion. He unabashedly loves his yacht and private jet, but fears for his own future, and the futures of his fellow plutocrats, if economic inequality is left unaddressed:

“What do I see in our future today, you ask? I see pitchforks, as in angry mobs with pitchforks, because while people like us plutocrats are living beyond the dreams of avarice, the other 99 percent of our fellow citizens are falling farther and farther behind.

“You see, the problem isn’t that we have some inequality. Some inequality is necessary for a high-functioning capitalist democracy. The problem is that inequality is at historic highs today and it’s getting worse every day. And if wealth, power, and income continue to concentrate at the very tippy top, our society will change from a capitalist democracy to a neo-feudalist rentier society like 18th-century France. That was France before the revolution and the mobs with the pitchforks.”

Whether French Revolution or today, the issue is “paying the piper.”

The Moral of the Story: The Pied Piper of Hamelin

Illustration by Kate Greenaway for Robert Browning’s “The Pied Piper of Hamelin”

Victorian poet Robert Browning brought us the “paying the piper” idiom in The Pied Piper of Hamelin.[4] Here’s a synopsis to refresh our memories:

“‘Pay the piper’ comes from the famous 1842 poem by Robert Browning, The Pied Piper of Hamelin. The story is about a German town called Hamelin which, after years of contentment, was suddenly plagued by a huge increase in the rat population, probably due to some plague or poison which had killed all the cats. The rats swarmed all over, causing much damage. Try as they might, the townspeople could not get rid of the rats.

“Then appeared a mysterious stranger bearing a gold pipe. He announced that he had freed many towns from beetles and bats, and for a cost, he would get rid of the rats for the town.

“Although he only wanted a thousand florins, the people were so desperate that the Mayor promised him 50,000 for his trouble, if he could succeed.

“At dawn, the piper began playing his flute in the town and all the rats came out of hiding and followed behind him. In this way, he led them out of the town. All the rats were gone.

“When the piper came back to collect his pay, the town refused to pay even his original fee of one thousand florins. The mayor, thinking the rats were dead, told the piper he should be happy if he received any pay at all, even fifty florins.

“The pied piper warned the town angrily that they would regret cheating him out of his pay.

“Despite his dire warning, the rats were gone so the townspeople went about their business, at last enjoying a peaceful night’s sleep without the scurrying and gnawing of rats.

“At dawn, while they slept, the sound of the piper’s pipe could be heard again, except this time only by the children. All the children got out of bed and followed behind the piper, just as the rats had before. The piper led the children out of town and into a mountainous cave. After all the children had walked into the cave, a great landslide sealed up the entrance. One little boy managed to escape and tell the town what had happened to the children. Although they tried, they could never rescue them, and they were lost forever.”

After me, the deluge + Pay the piper = Pay the piper or risk the deluge

Virtue says don’t get greedy. Don’t be tempted. Don’t be a fraud. Keep your end of the bargain. Don’t be too smart for your own good. Don’t try to get away with it. You’re better than that. Fess up, take responsibility. Don’t invite the deluge – the sudden and terrible twist of fate, the movement of greater mysteries, the imposition of higher justice.

The rats you get rid of won’t be worth the children you lose.

The mayor and citizens of Hamelin defrauded the Piper at the cost of their own children. Justice was absolute — the mountain vault was sealed. The Piper was fully, awfully paid.

Reparations for American slavery are a proposed remedy – a way to pay the piper – for the lost humanity of slaves, stolen from them by a legal and economic framework that assigned them economic but not human value. Slaves were dehumanized, and virtue will not tolerate it.

Exploitation of Human Capital

Exploitation of capital assets is expected in a capitalist economy. Human labor is a capital asset, and will also be exploited – everyone who’s ever worked for someone else figures that out the first day on the job. But slavery took exploitation too far: slaves were not people; they were capital assets and nothing more. They were no longer human.

“Exploitation can also be harmful or mutually beneficial. Harmful exploitation involves an interaction that leaves the victim worse off than she was, and than she was entitled to be. The sort of exploitation involved in coercive sex trafficking, for instance, is harmful in this sense. But as we will see below, not all exploitation is harmful. Exploitation can also be mutually beneficial, where both parties walk away better off than they were ex ante. What makes such mutually beneficial interactions nevertheless exploitative is that they are, in some way, unfair.

“It is relatively easy to come up with intuitively compelling cases of unfair, exploitative behavior. Providing a philosophical analysis to support and develop those intuitions, however, has proven more difficult. The most obvious difficulty is specifying the conditions under which a transaction or institution may be said to be unfair.

“Does the unfairness involved in exploitation necessarily involve some kind of harm to its victim? Or a violation of her moral rights? Is the unfairness involved in exploitation a matter of procedure, substance, or both? And how, if at all, are facts about the history of the agents involved or the background conditions against which they operate relevant to assessing charges of exploitation?”[5]

Slavery harmed its victims, exploited them both procedurally and substantively. And “the facts about the history” of slavery’s purveyors and “the background conditions against which they operate[d]” are most definitely “relevant to assessing charges of exploitation.” Today, more than 150 years after the nominal end of slavery, those charges remain unanswered, and unpaid.

Slavery and Human Capital

19th-century economist John Elliott Cairnes was “an ardent disciple and friend of John Stuart Mill” and “was often regarded as ‘the last of the Classical economists.’”[6] Writing during the American Civil War, Cairnes analyzed the impact of slavery on both human and other forms of capital in his book The Slave Power: Its Character, Career, and Probable Designs: Being an Attempt to Explain the Real Issues Involved in the American Contest.[7]

“Cairnes’s shining hour was his widely-discussed 1862 treatise Slave Power. Cairnes analyzed the consequences of slavery for economic development, in particular how it speeded up soil erosion, discouraged the introduction of technical innovations and stifled commerce and enterprise more generally. Written during the American Civil War, Cairnes warned British policymakers to think twice about backing the economically-unviable Confederacy. Cairnes’s book was instrumental in turning the tide of popular English opinion against the rebels.”

Writing about slaves as human capital, Cairnes said this:

“The rice-grounds of Georgia, or the swamps of the Mississippi may be fatally injurious to the human constitution; but the waste of human life which the cultivation of these districts necessitates, is not so great that it cannot be repaired from the teeming preserves of Virginia and Kentucky.

“Considerations of economy, moreover, which, under a natural system, afford some security for humane treatment by identifying the master’s interest with the slave’s preservation, when once trading in slaves is practiced, become reasons for racking to the uttermost the toil of the slave; for, when his place can at once be supplied from foreign preserves, the duration of his life becomes a matter of less moment than its productiveness while it lasts.

“It is accordingly a maxim of slave management, in slave-importing countries, that the most effective economy is that which takes out of the human chattel in the shortest space of time the utmost amount of exertion it is capable of putting forth. It is in tropical culture, where annual profits often equal the whole capital of plantations, that negro life is most recklessly sacrificed. It is the agriculture of the West Indies, which has been for centuries prolific of fabulous wealth, that has engulfed millions of the African race. It is in Cuba, at this day, whose revenues are reckoned by millions, and whose planters are princes, that we see in the servile class, the coarsest fare, the most exhausting and unremitting toil, and even the absolute destruction of a portion of its numbers every year.”[8]

Five years after Cairnes wrote that, Karl Marx cited the above passage in Das Kapital[9] in his own analysis of slave labor as capital:

“The slave-owner buys his labourer as he buys his horse. If he loses his slave, he loses capital that can only be restored by new outlay in the slave-mart.

“‘Après moi le déluge!’ is the watchword of every capitalist and of every capitalist nation. Hence Capital is reckless of the health or length of life of the labourer, unless under compulsion from society.

“To the out-cry as to the physical and mental degradation, the premature death, the torture of over-work, it answers: Ought these to trouble us since they increase our profits?”

Marx believed that the ultimate culprit was not the individual slave owners, but the capitalist economic system which sponsored the exploitation of all capital – including human capital – to achieve its competitive goal of profitability:

“But looking at things as a whole, all this does not, indeed, depend on the good or ill will of the individual capitalist. Free competition brings out the inherent laws of capitalist production, in the shape of external coercive laws having power over every individual capitalist.”

Under the reign of capitalism, Marx argued, workers would be exploited – slaves and free alike – and this would be both an economic and cultural norm. This practice would become so entrenched that it could be broken only by a contrary “compulsion from society.”

The Deluge:  Civil War

“The deluge” is a form of “compulsion from society,” and civil war is a form of both.

The American Civil War was the deluge. The war ended almost exactly four years after it began, at the cost of hundreds of thousands of American lives, uncounted non-fatal casualties, and incalculable damage to the rest of the American citizenry, human property, and nature.

“Approximately 620,000 soldiers died from combat, accident, starvation, and disease during the Civil War. This number comes from an 1889 study of the war performed by William F. Fox and Thomas Leonard Livermore. Both men fought for the Union. Their estimate is derived from an exhaustive study of the combat and casualty records generated by the armies over five years of fighting.  A recent study puts the number of dead as high as 850,000. Roughly 1,264,000 American soldiers have died in the nation’s wars–620,000 in the Civil War and 644,000 in all other conflicts.  It was only as recently as the Vietnam War that the number of American deaths in foreign wars eclipsed the number who died in the Civil War.”[10]

Tragically, the course of American racial history would cause many to wonder if all those deaths had been in vain. War – the deluge, the compulsion of society – had its day, but it didn’t change cultural attitudes. The attitudes that had supported antebellum slavery only came to be expressed more belligerently.

In France, Louis XV saw the deluge coming, Louis XVI suffered from it, but eleven years later Napoleon was Emperor.

The Piper was never paid.

In the USA, war gorged itself on the American land and population, but the Union’s victory foundered on the failings of the Reconstruction.

The Piper was never paid.

The law concerning slavery was changed, but de facto[11] slavery lived on. Before the Civil War, slavery had been, like war itself, a legal crime against humanity, justified under the law of the land. After the Civil War, slavery was simply a crime, illegal like all other crimes, but perpetuated by a reign of terror that eventually gained its own legal justification – a justification that would have to be dismantled by another compulsion from society 100 years later.

After the war, you couldn’t own slaves anymore, couldn’t buy and sell them, but you could treat legally freed former slaves just as you had treated them when they were legally enslaved. In fact, it was much worse. Before the war, the ownership and treatment of slaves was by legal right. After the war, de facto slavery relied on a reign of terror grounded in cultural indifference and brutality. Cruel and unusual punishment had been banned by the Eighth Amendment to the U.S. Constitution, but de facto slavery used it to terrorize society into submission.

The Piper was never paid.

The U.S. Labor Movement and Human Capital

The American labor movement’s 400-year history is a chronicle of shifting economic theories and new labor laws brought about by periodic challenges – compulsions from society – to the capitalist norm of the exploitation of human capital.[12] Changing times generated changing attitudes, and American culture demanded accommodations in often violent ways.

And now, in the middle of another deluge – this time a plague, the Covid-19 virus – we have seen the most recent and striking societal shift in the form of the Supreme Court’s ruling that the Civil Rights Act of 1964 protects LGBTQ workers from workplace discrimination.[13] Few would claim that the 56-year-old Civil Rights Act specifically had today’s gender sensibilities in mind, but the law shifts with cultural attitudes when compelled to do so.

The labor movement will continue to change with the times. Issues of sexism remain, and technology – especially robotics, AI, and machine learning – is threatening human labor in ever-accelerating, unprecedented ways. There will be more deluge, more societal compulsion.

The Piper will still need to be paid.

The Racist Roots of Police Brutality

Finally – for today, at least – the Coronavirus deluge has also recharged the force of societal compulsion currently taking on mass incarceration and police brutality, both of which have historical roots in the Reconstruction’s unresolved racism.[14]

The Piper was never paid.

We have much more to talk about. We’ll continue next time.

[1] Philostratus, Life of Apollonius of Tyana, Book V:14. From Wikipedia.

[2] See Solosophie.com and Phrases.org.

[3] For more about what the saying might mean, see this, from Wikipedia: “The most famous remark attributed to Louis XV (or sometimes to Madame de Pompadour) is Après nous, le déluge (“After us, the deluge”). It is commonly explained as his indifference to financial excesses, and a prediction of the French Revolution to come. The remark is usually taken out of its original context. It was made in 1757, a year which saw the crushing defeat of the French army by the Prussians at the Battle of Rossbach and the assassination attempt on the King. The “Deluge” the King referred to was not a revolution, but the arrival of Halley’s Comet, which was predicted to pass by the earth in 1757, and which was commonly blamed for having caused the flood described in the Bible, with predictions of a new deluge when it returned. The King was a proficient amateur astronomer, who collaborated with the best French astronomers. Biographer Michel Antoine wrote that the King’s remark “was a manner of evoking, with his scientific culture and a good dose of black humor, this sinister year beginning with the assassination attempt by Damiens and ending with the Prussian victory”. Halley’s Comet finally passed the earth in April 1759, and caused enormous public attention and anxiety, but no floods.”

[4] Idioms.online.

[5] Exploitation, Stanford Encyclopedia of Philosophy (first published Thu Dec 20, 2001; substantive revision Tue Aug 16, 2016).

[6] The History of Economic Thought.

[7] Cairnes, John Elliott, The Slave Power: Its Character, Career, and Probable Designs: Being an Attempt to Explain the Real Issues Involved in the American Contest (1862).

[8] Cairnes, Slave Power, op cit.

[9] Marx, Karl, Das Kapital (Vol. 1, Part III, Chapter Ten, Section 5).

[10] American Battlefield Trust.

[11] “In law and government, de facto describes practices that exist in reality, even though they are not officially recognized by laws. It is commonly used to refer to what happens in practice, in contrast with de jure, which refers to things that happen according to law.” Wikipedia

[12] See this timeline, which runs from 1607-1999, beginning with complaints about labor shortages in Jamestown in 1607, addressed by the arrival in 1619 of the first slaves stolen from Africa.

[13] Civil Rights Law Protects Gay and Transgender Workers, Supreme Court Rules, New York Times (June 16, 2020).

[14] See, for example, The Racist Roots Of American Policing: From Slave Patrols To Traffic Stops, The Conversation (June 4, 2019) and George Floyd’s Death Reflects The Racist Roots Of American Policing, The Conversation (June 2, 2020).

Reparations [1]: Economics and a Whole Lot More

The current civil rights movement has reopened the discussion about reparations for American slavery[1]:

“As protests continue to convulse cities across America, many wonder where we go from here. It’s impossible to know the future. But if efforts do not include meaningful reparations for African Americans, the omnipresent injustices we face will not be resolved.

“For a long time, the word ‘reparations’ was a non-starter, but it is finally losing its taboo. The movement to provide financial redress to African Americans for centuries of subjugation and racial terror was already growing last year. HR 40, a bill that would establish a commission to study the legacy of slavery and develop reparations proposals to Congress, is enjoying a surge in support. Groundbreaking reparations legislation has been approved in Evanston, Ill. And a bill has been introduced in the California Assembly that would create a task force to study the impact of slavery and offer proposals for reparations for African-Americans in the state.

“The outpouring of anger in every corner of this country in recent days — more than 400 years after the first enslaved Africans arrived in America — could finally put reparative justice within reach.”

The day after the above appeared in the Los Angeles Times, Oprah Winfrey ran a special that contained a segment on reparations. The day after that, the following appeared in the Washington Examiner[2]:

“It was only a matter of time before ‘Justice for George Floyd’ became ‘And while we’re at it, here are a few other things we’d like you to take care of with no questions asked.’

“That’s invariably what happens when the media, Hollywood, and the Democratic Party get involved.

“What started out as an issue over excessive force used by police against minorities has quickly devolved into a jackpot for the social justice people who see oppression, grievance, and victimhood in every aspect of their lives.

“[Bringing up the topic of reparations for slavery] lost the attention of nearly every white person who might have been watching.”

Thus the issue was reframed as a political blasting cap.

We can do better.

In the past couple of weeks, I’ve accumulated a research file on reparations of over two dozen pages of resources and citations that make the topic much larger than who’s for it and who’s against it, who would get paid how much and when and how, how the government would finance it, etc. Instead, my research pulls back to a wide shot that starts with economics and law but then encompasses individual and institutional belief systems, religious and secular notions of morality and ethics, national and cultural identity and worldview, and a whole lot more. I found all of that in the 400 years of American history I never knew, including the history made in my own time. I suggest we start with the latter as a first step toward moving ourselves past polarization paralysis.

Coming of Age in the 1960’s Civil Rights Movement

My hometown was a rural community in the western plains of Minnesota, populated with Scandinavian Lutherans living on Homestead Act farms in family groups where the grandparents still spoke Norwegian. There were also enough German Catholics to support a parish with a K-8 school staffed by nuns. The rest of us – the minorities – were identified mostly by reference to the small Protestant churches where our parents took us on Sundays.

None of us had any reason to be racist, but we were, although we would have been surprised and insulted if somebody had pointed that out, which of course nobody did. Racial slurs were part of the vocabulary: my childhood friends tossed around the N-word as casually as they traded baseball cards, and talked about “putting them on the boat and shipping them back.”[3] Nothing personal, that kind of talk was just… normal. I always felt ashamed to hear it. I didn’t know why. And you didn’t talk that way in my house. The N-word we used was “negro” – blacks weren’t called blacks yet.

In 1954, the year after I was born, the US Supreme Court ruled in Brown v. Board of Education that “separate but equal” violated the Constitution. A few years later – I was four, maybe — I saw a black man for the first time.

Our home was up the hill from the railroad tracks, and the “bums” who rode the rails sometimes camped in a ravine between the tracks and our house, and would come begging. I came downstairs to breakfast one morning to see my mother talking through the back screen door to a black man standing in the middle of our backyard, well away from the house. He wore a wrinkled white shirt and baggy gray trousers held up with suspenders, and was holding his hat with both hands at his chest, head slightly bowed. “I would be so very much obliged, ma’am,” he was saying. Mom turned away from the door and started frying eggs, making toast, and pouring coffee. Her face had that hard, determined look you didn’t cross. I asked who he was, and what he wanted. “He’s a bum,” she said, “and he’s hungry.” My own breakfast was going to wait, so I went up to my room to play. When I came back he was gone.

My dad had the International Harvester farm implements franchise, and now and then he won a sales contest that earned him a trip to a company function. One of those was in the South, with a stop to visit his dad, who had retired to Sarasota. Our family didn’t talk much at meals — mostly sat, ate, and left — but at “supper” (not “dinner” like the city people on TV) on his first night back home he sat looking stunned all the way through pie and ice cream and coffee as he described what he’d seen: a “No Colored” sign over a water fountain, a “Colored” entrance at a restaurant…. We were all stunned with him, that such things existed. We had no idea.

A few years later, LBJ’s Great Society[4] brought Lady Bird Johnson to town for a ribbon-cutting commemoration of a renovation to Main Street. It’s only now that I wonder if a few benches, flower planters, and garish turquoise mushroom-shaped fiberglass shelters were what LBJ and the Congress had in mind when they passed a law promoting urban renewal. Schools closed for the parade, there were speeches and reporters from the Minneapolis Star and Tribune, and we made the 6 o’clock news on the NBC affiliate we picked up with an antenna on the roof.

About then I started drawing pictures of black athletes on my tablet during recess — Lew Alcindor, Cassius Clay, Dr. J…. Kids would gather around to watch. One day one of them snorted, “Nigra,” and walked away. I liked the sound of the word. It wasn’t the usual N-word, and it seemed defiant somehow. I drew another picture of a Black Everyman with an afro, and wrote “Nigra” underneath it. I’ll bet I could still draw it today.

Middle school summers at the lake (you took refuge from the baking humidity at a “cottage at the lake”) were played out to a soundtrack liberally laced with Motown, and two weeks at Boy Scout camp brought letters from home with news of riots. Detroit was burning. L.A. was burning. “Ghetto” entered the national lexicon, and even Boy Scouts in the north woods knew where Watts was.

In high school, my girlfriend went with her Lutheran Youth Group to a civil rights event in the Twin Cities that included a speech from a local Black Panther leader. In those days you didn’t say the F-word even if you were telling a story about somebody who used it, but somehow she communicated that the speaker had used that word a whole lot. I wondered why.

In 1968, USA runners Tommie Smith and John Carlos raised their fists on the medal podium, joined in solidarity by silver medalist Peter Norman, a white Australian runner.

“As the American athletes raised their fists, the stadium hushed, then burst into racist sneers and angry insults. Smith and Carlos were rushed from the stadium, suspended by the U.S. team, and kicked out of the Olympic Village for turning their medal ceremony into a political statement. They went home to the United States, only to face serious backlash, including death threats.

“However, Carlos and Smith were both gradually re-accepted into the Olympic fold, and went on to careers in professional football before retiring. Norman, meanwhile, was punished severely by the Australian sports establishment. Though he qualified for the Olympic team over and over again, posting the fastest times by far in Australia, he was snubbed by the team in 1972. Rather than allow Norman to compete, the Australians did not send a sprinter at all.”[5]

In 1971, six months before I graduated from high school, Sports Illustrated ran its “Black is Best” article.[6]

“It is clear that the black community in the U.S. is not just contributing more than its share of participants to sport. It is contributing immensely more than its share of stars. Black athletes accounted for all eight Olympic records set by U.S. runners at Mexico City in 1968, which led a European coach to observe: ‘If not for the blacks, the U.S. team would finish somewhere behind Ecuador.’”

I was an athlete. Those events and stories meant a lot to me.

Off at college, my R.A. was black (no longer a negro), and two other black guys shared a room two doors down from mine. With them in my life, I felt like I had arrived. Kelly had a springy, athletic way of moving, a short afro and a ready smile. Miles was tall and stooped, had a giant afro, always seemed mad, and never spoke. I wondered why.

I became a Jesus Freak during a gap year, and a Lutheran youth pastor (he had long hair, wore a big wooden cross, and drank beer at Kenny’s Tavern) struck a blow for ecumenism and invited me along as a counselor on a trip with his youth group to a conference in Houston. Our first day at the convention center, a procession of blacks in bright blue robes marched two-by-two through the crowd, dipping and bobbing, two steps forward one step back, singing and chanting, “Y-E-S, oh yes, Y-E-S, oh yes….” We followed them to the Y.E.S. Soul Choir’s gospel music concert. That night’s general session featured Andraé Crouch and the Disciples rocking the house. I had one of their records back home. My new life as a Jesus Freak didn’t get any better than this.

Back at school, I heard about the annual welcome picnic for black students and decided to go. I was the only white guy there, didn’t know anybody and couldn’t think of what to do, so I volunteered for the serving line. A black guy and girl from Houston joined the campus Christian fellowship that fall, and the three of us started a Bible study with their friends in Black House. That winter a movie came out about Corrie ten Boom – the Nazis sent her and her family to concentration camps for aiding Jews – and fifteen black urban kids and one white town kid piled into a couple of college vans and drove to a nearby town for pizza and the movie. The silences that met our arrivals were… thunderous. Not hostile, not threatening, mostly just… pointed. We were something you didn’t see every day. We were the new normal, and it was taking some getting used to.

That spring, we brought my co-leader’s pastor up from her church in Houston. For three days I followed him around, sat next to him at meals and in small groups, watched him — tall, erect, muscular in tailored three-piece suits and gleaming white shirts with cufflinks — as he parted the waters of shabby tie-dyed holey-jeaned flower children, laying down the gospel in a voice that rumbled.

The more I go on, the more I could go on — the memories pour in, scenes from a decade far more turbulent than the worst flight you’ve ever been on; racing across my mind’s theater screen in a blurry fast forward, leaving behind the indelible feel of those times. Incredibly, the Civil Rights Movement wasn’t the only one bringing radical cultural change, just one of many in a Revolution that was everywhere. The times were as thick and pungent with change as the marijuana haze that filled the quad, filled the dorms. The world was changing, and we were changing it. No, we had changed it. One night I attended a guest lecture where a visiting astrophysicist described a new cosmological theory called the Big Bang — the entire universe blasted into existence from an inconceivably compressed pre-temporal mass. It made sense. We could relate. We were living our own Big Bang.

Deep Ignorance and Long Memories

Then it was the 1970’s, and the Revolution staggered along, still tripping but starting to come out of it. Soon every commercial had at least one black person in it, like that was normal. Okay, so maybe it was tokenism, but we didn’t care, it would be normal soon enough. With that attitude, we were making the same mistake every generation seems to make: we assumed we were the enlightened ones, we’d gotten it right in ways our parents hadn’t, and they would have to deal with life on our new terms, and our terms were that “prejudice” (it wasn’t called “racism” yet) was over. The Beast was dead. The stain of slavery had been expunged. Equality was fixed in place, a given, a reality solidly grounded.

Or so we thought.

The first Black History Month was observed in 1970 at an iconic location – the Kent State campus, ground zero of our opposition to the Vietnam War. I heard about it, as I’ve heard about it annually for the past fifty years, but I’ve never participated, never attended because… well, why would I? There was no point in it: the new normal was that the races were now equal. We wouldn’t have a White History Month, so why would we have a Black one?

Or so I thought.

I managed to hold those beliefs, that judgment of history, all the way into this century, even as the justice system carried out its policies of mass incarceration, even as the news increasingly included body cam and cell phone videos of the police beating and murdering black people.

The new Civil Rights Movement has finally awakened me to just how shockingly wrong and blind I was and have been. And not just me, but how wrong and blind many in my generation were and have been. We never grew up, remained children full of ourselves. We made false assumptions, stopped learning from the times that came after ours, and never bothered to learn from the times that came before our own. That level of misjudgment generated the deepest kind of ignorance – not merely a personal failure to know, but the shared ignorance of an entire generation, a massive communal failure to know that history is not a dead letter but an active force still alive in us, still powering us in hidden, subconscious ways, still shaping our attitudes, initiatives, and responses in ways we would vehemently deny if confronted with them, just as my hometown would have denied its racism back in the day. We soak up our history from our surroundings, breathe it in, are immersed in it… and we don’t even know it. That kind of ignorance and arrogance has enabled the systemic racism that today’s protests are now broadcasting to the world.

It seems fitting, then, that my personal reckoning should begin with a century-old cultural memory that, until my research on this article, was part of my massive, hidden Black History file of stupefying ignorance. The 1921 Greenwood Massacre is a particularly pertinent place to begin writing about reparations: it was undeniably a major economic event, but it was also much, much more, and the long-suppressed memory of it has now found its way out, and into the streets.

The Greenwood Massacre

Photo: Tulsa Historical Society

We heard earlier from Damario Solomon-Simmons, a civil rights attorney and adjunct professor of African and African American studies at the University of Oklahoma. He wrote this in his L.A. Times article cited earlier:

“The aversion to making amends for systemic racism is perhaps most evident in my hometown of Tulsa, Okla., which last week commemorated the 99th anniversary of the Greenwood massacre.

“On May 31, 1921, thousands of white Tulsans, 2,000 of whom were deputized by the police, stormed the Greenwood neighborhood, a community known as ‘Black Wall Street.’ In one day and night, the nation’s most prosperous black community was reduced to rubble. Hundreds were killed, and more than 10,000 black Tulsans were left injured, homeless and destitute.

“For decades, Greenwood managed to flourish despite racist Jim Crow laws in Oklahoma. In a matter of hours, millions of dollars in hard-fought wealth — property, homes, businesses, investments — burned to ashes. About 35 square blocks, including 1,200 homes and scores of businesses, were destroyed. Tulsa has not been the same since.”[7]

Ta-Nehisi Coates, a national correspondent for The Atlantic, wrote in 2014 what remains the definitive piece on slavery reparations.[8] There, he wrote this about the Greenwood Massacre:

“Something more than moral pressure calls America to reparations. We cannot escape our history. All of our solutions to the great problems of health care, education, housing, and economic inequality are troubled by what must go unspoken. ‘The reason black people are so far behind now is not because of now,’ Clyde Ross told me. ‘It’s because of then.’ In the early 2000s, Charles Ogletree went to Tulsa, Oklahoma, to meet with the survivors of the 1921 race riot that had devastated ‘Black Wall Street.’ The past was not the past to them. ‘It was amazing seeing these black women and men who were crippled, blind, in wheelchairs,’ Ogletree told me. ‘I had no idea who they were and why they wanted to see me. They said, We want you to represent us in this lawsuit.’ ”

“A commission authorized by the Oklahoma legislature produced a report affirming that the riot, the knowledge of which had been suppressed for years, had happened. But the lawsuit ultimately failed, in 2004. Similar suits pushed against corporations such as Aetna (which insured slaves) and Lehman Brothers (whose co-founding partner owned them) also have thus far failed. These results are dispiriting, but the crime with which reparations activists charge the country implicates more than just a few towns or corporations. The crime indicts the American people themselves, at every level, and in nearly every configuration. A crime that implicates the entire American people deserves its hearing in the legislative body that represents them.

“John Conyers’s HR 40 is the vehicle for that hearing. No one can know what would come out of such a debate. Perhaps no number can fully capture the multi-century plunder of black people in America. Perhaps the number is so large that it can’t be imagined, let alone calculated and dispensed. But I believe that wrestling publicly with these questions matters as much as—if not more than—the specific answers that might be produced. An America that asks what it owes its most vulnerable citizens is improved and humane. An America that looks away is ignoring not just the sins of the past but the sins of the present and the certain sins of the future. More important than any single check cut to any African American, the payment of reparations would represent America’s maturation out of the childhood myth of its innocence into a wisdom worthy of its founders.”

Bottom line: today’s Civil Rights movement is asking me, asking us, to grow up to our own history.

More next time.

[1] Solomon-Simmons, Damario, Reparations Are The Answer To Protesters’ Demands For Racial Justice, Los Angeles Times (June 8, 2020).

[2] Scarry, Eddie, George Floyd Protests Hijacked For Reparations And Other Pet Projects, Washington Examiner (June 10, 2020).

[3] See A History of Hate Rock From Johnny Rebel to Dylann Roof, The Nation, June 23, 2015.

[4] See the story in History.com.

[5] See the story in History.com.

[6] Sports Illustrated, January 18, 1971.

[7] Solomon-Simmons, op cit.

[8] Coates, Ta-Nehisi, The Case for Reparations, The Atlantic (June 2014).

America’s National Character, Revealed in its COVID-19 Response

“The entire man is… to be seen in the cradle of the child. The growth of nations presents something analogous to this; they all bear some marks of their origin. If we were able to go back… we should discover… the primal cause of the prejudices, the habits, the ruling passions, and, in short, all that constitutes what is called the national character.”

Alexis de Tocqueville, Democracy in America (1835)

“Begin as you would continue,” my new mother-in-law told my bride and me. Her advice was good beyond gold – a standard we return to in every new beginning, of which there’ve been many in 40+ years.

Alexis de Tocqueville didn’t offer the principle as advice; he recognized its operation in the America he famously toured and wrote about – a nation shaping itself around its founding principles – its “primal cause.” A country’s “national character,” he said, is revealed in the “prejudices,” “habits,” and “ruling passions” of the government and the people. The specifics may shift over time as certain founding values prevail over others due to political tradeoffs and changing circumstances, but in the long haul the country stays true to its origins. Countries, like marriages, continue as they began.

The same dynamics that apply to individuals and nations also apply to institutions – for example, the societal institutions of law, economics, academia, and commercial enterprise. And for all of them, there’s no such thing as a single beginning to be sustained forever. Personal, national, and institutional histories are shaped around many beginnings and endings. With every new beginning comes an invitation to return to “primal causes” and accept the transformation of historical into contemporary; i.e., each path forward requires a fresh look at how the past’s wisdom can help navigate today’s unprecedented challenges. Trouble is, transformation is perhaps the most difficult thing asked of a person, relationship, institution, or nation. The opportunity to transform is therefore rarely recognized, much less embraced, but without it there will be hardening into what was but no longer is, and soon the person or entity under stress will fray under the strain of forcing the fluidity of today into the memory of yesterday.

The Covid-19 Policy-Making Triumvirate

Covid-19 has brought the entire world to an inescapable threshold of new beginning, with its commensurate invitation to transformation. America’s response reveals no embrace of the invitation, but rather a doubling down on the pre-pandemic version of a currently predominant ideological triumvirate of values.[1] Other “prejudices,” “habits,” and “ruling passions” of the “national character” are clearly evident in the nation’s response as well, but I chose to write about this triumvirate because I’ve previously done so here and in my other blog.[2] The three prongs of the triumvirate we’ll look at today are as follows:

  1. Freemarketism: a hyper-competitive and hyper-privatized version of capitalism that enthrones individual and corporate agency over the centralized promotion of the public good.

Freemarketism is grounded in a belief that marketplace competition will not only prosper capitalists but also promote individual and communal welfare in all social and economic strata. Its essential prejudices and practices are rooted in the transmutation of the western, mostly Biblical worldview into the Protestant work ethic, which judges individual good character and communal virtue by individual initiative and success in “working for a living” and the ability to climb the upward mobility ladder. The state’s highest good is to sponsor a competitive market in which capitalists, freed from governmental regulation and taxation, will build vibrant businesses, generate wealth for themselves as a reward, and activate corollary “trickle down” benefits to all. Granting the public good an independent seat at the policy-making table is considered detrimental to the market’s freedom.

Freemarketism skews Covid-19 relief toward business and charges the state with a duty to restore “business as usual” as quickly as possible. Direct benefit to citizens is considered only grudgingly, since it would encourage bad character and bad behavior among the masses. Particularly, it would destroy their incentive and willingness to work for a living. The employable populace must be kept hungry, on edge, primed to get back to work in service to the capitalist engine that fuels the greater good of all.

  2. Beliefism: The denigration of science and intellect in favor of a form of secular post-truth fundamentalism.

Freemarketism is a belief system that emerged in the 1980’s, after the first three decades of post-WWII economic recovery played out in the 1970’s. Freemarketism addressed the economic malaise with its utopian promise of universal benefit, and its founders promoted it with religious zeal as a new economic science – the rationale being that it had been “proven” in ingenious, complex mathematical models. But math is not science, and however elegant its proofs might have been, they were not the same as empirical testing. Freemarketism was therefore a new economic belief system – something you either believed or didn’t.

To gain widespread political and social acceptance, Freemarketism would need to displace the Keynesian economics that had pulled the U.S. out of the Great Depression of the 1930’s by massive federal investment in infrastructure, the creation of new social safety nets, and the regulation of securities markets. During the post-WWII recovery, neoliberal economic policy had struck its own balance between private enterprise and government intervention, creating both new commercial monoliths and a vibrant middle class. Freemarketism would eventually swing this balance entirely to the side of private enterprise. It did so thanks in part to auspicious good timing. At the dawn of the 1980’s, after a decade of Watergate, the oil embargo and energy crisis, runaway inflation, and the Iran hostage crisis, America was ripe for something to believe in. Its morale was suddenly boosted by the USA’s stunning Olympic hockey gold medal. Then, at the end of the decade, came the fall of the Berlin Wall, followed soon after by the equally stunning collapse of the Soviet Union, hastened by Chernobyl. These two bookend events ensured that Freemarketism had made a beginning that politicians and the populace wished to continue.

By then, Soviet-style Communism had been fully exposed as a horrific, dystopian, failed system. It had begun with Karl Marx’s angry empathy for the plight of the working stiff, but a century and a half later had morphed into a tyranny of fear, mind control, and brutality that turned its nominal beneficiaries into its victims, administered by a privileged, unthinking, corrupt, emotionally and morally paralyzed class of party bosses. When the failed system met its just deserts, the West’s storyline trumpeted that capitalism had won the Cold War. Freemarketism stepped up to receive the accolades, and its political devotees set about dismantling the social structures Keynesian economics had built before WWII.

From that point, as Freemarketism gained acceptance, it stomped the throttle toward fundamentalism, which is where every belief system, whether religious or secular, must inevitably end up. Belief by its very nature demands its own purification – the rooting out of doubt. To endure, belief must become irrefutable, must become certain to the point where doubt and discourse are demonized, conformity becomes the greatest social good, and ideological myths become determinants of patriotic duty and moral status. Accordingly, as Freemarketism evangelists increasingly installed their privatized solutions, any system of government based on state-sponsored promotion of the common good was quickly characterized as a threat of a resurgence of Communism. In the minds of Freemarketers – both priests and proles – the European social democracies were thrown into the same toxic waste dump as Communism, because the state could never again be trusted to know what is good for its citizens, or be given the power to carry out its agenda.

Freemarketism’s blind spot is now obvious: for all its demonization of government policy, it required precisely that to create the conditions in which it could operate. Politicians from the 1990’s forward were happy to comply. Thus empowered, in the four decades since its inception, Freemarketism has ironically failed in the same manner as Soviet Communism, gutting the public good of the working masses and protectively sequestering the wealthy capitalist classes. Along the way, Beliefism as the cultural norm has displaced scientific rationalism with moment-by-moment inanity, expressed in the Covid-19 crisis by everything from drinking bleach to mask and supply shortages, lockdown protests and defiance of mask-wearing, terminating support of the World Health Organization, confusion and skepticism about infection-rate statistics and the value of mass testing, the public undercutting of medical authorities, and much more.

The post-truth flourishing of Beliefism is in turn held in place by the third prong of the triumvirate:

  3. Militarism: The American infatuation with military might and private armaments, and a proclivity towards resolving disputes and achieving policy outcomes through bullying, violence, and warfare.

Militarism is the enforcer for the other two prongs of the triumvirate. Its status as a pillar of the national character is on the one hand entirely understandable, given that the USA was formed because the colonists won their war, but on the other hand perhaps the most ideologically inexplicable when measured against the Founders’ rejection of a standing military in favor of a right to mobilize an armed militia as needed. The displacement of the latter with the former was fully complete only after WWII, grudgingly acknowledged by the General who masterminded the D-Day invasion: “In the councils of government,” President Eisenhower said on the eve of leaving office, “we must guard against the acquisition of unwarranted influence, whether sought or unsought, by the military-industrial complex.” He further warned that, “Only an alert and knowledgeable citizenry can compel the proper meshing of the huge industrial and military machinery of defense with our peaceful methods and goals, so that security and liberty may prosper together.”

The extent to which General Eisenhower’s warnings fell on deaf ears is by now obvious. Meanwhile, the Founders’ concept of the right to bear arms has metastasized into an absolute right to private armaments. The American national character now rests secure in its confidence that it has a big enough stick to forever defend its libertarian version of individual freedoms – including the freedoms of the marketplace – against all opposing beliefs, Communist or otherwise.

Militarism is evident in developments both expressly directed at the pandemic and coinciding with it, spanning macro and micro responses: saber-rattling against Iran (against whom we apparently still feel we have a score to settle), blame-shifting against China accompanied by rhetoric that has quickly escalated to the level of a new Cold War, Congress’s self-congratulatory passage of another record-setting defense budget, and armed militias rallying against the lockdown and supporting protestors in their belligerent non-compliance.

In its Covid-19 response, America put its money where its mouth (ideology) is.

This ideological triumvirate is evident in the spending priorities of the USA’s legislative allocations during the lockdown, as indicated in the following two graphs, which reveal that:

  1. The amount directed to business – mostly big business – was twice again as much as the defense budget;
  2. The amount directed to healthcare – during a pandemic – was the least of all: half the amount directed to individuals;
  3. The 2020 defense budget approved during the lockdown was twice the size of the amount directed to individual citizens under the CARES relief act; and
  4. Meanwhile, defense spending dwarfs that of our seven nearest national “competitors.”

The Anatomy of the $2 Trillion COVID-19 Stimulus Bill (CARES Act)[3]

U.S. Defense Spending Compared to Other Countries[4]

Character Over Time

“True character is revealed in the choices a human being makes under pressure,” screenwriting guru Robert McKee wrote, “the greater the pressure, the deeper the revelation, the truer the choice to the character’s essential nature.”[5]

Pressure of the magnitude brought on by the pandemic catches a nation’s response off guard. It freezes time, requiring instant responses to unprecedented demands. Pretense falls off, values and priorities leap from foundational to forefront. There is no time for analysis or spin, only the unguarded release of words and actions in the pressing moment. The result is national character, fully revealed.

The way out of this dizzying spiral is to embrace the invitation to character transformation, which begins in the awareness that something essential to maintaining the status quo has been lost, life has irreversibly changed, an ending has been reached. Every ending requires a new beginning, every new beginning requires a vision for how to continue, and every vision for continuing requires the perspective of newly-transformed character. If there is going to be systemic change, character must be the one to make concessions. The nation’s policy-makers made no such concession in their Covid-19 response.

Response Without Transformation

We’ve spent a few years in this forum discovering the triumvirate’s development and contemporary dominance of government policy-making, which in turn has been supported by enough of the electorate to keep the system in place. Now, the pandemic has put our “more perfect union” under extraordinary stress.

Given the recent racial issues now dominating the headlines, it isn’t far-fetched to compare the pandemic’s moral and legal challenges to those of the Civil War. Today’s post won’t try to do that topic justice, but it’s interesting to note that slavery was a dominant economic force from before America became the United States, especially buttressing capitalist/entrepreneurial wealth generated in tobacco and cotton, and was both expressly and implicitly adopted as a social, economic, and national norm – for example, in the U.S. Constitution’s denying slaves the right to vote and providing that each slave would count as 3/5 of a person for purposes of determining seats in the House of Representatives. These “primal causes” remained intact for the nation’s first several decades, until a variety of pressures forced a reconsideration and transformation. Those pressures included, for example, a bubble in the pre-Civil War slave market that made slaves themselves into a valuable equity holding to be bought and sold for profit – a practice particularly outrageous to Northerners.[6]

The Covid-19 triumvirate is not Constitutionally recognized as slavery was, but clearly it is based on the current emphasis of certain aspects of the USA’s foundations to the exclusion of others. Many economists argue, for example, that the way out of the deepening pandemic economic depression is a return to a Keynesian-style massive governmental investment in public works and welfare – a strategy that even then was hugely controversial for the way it aggressively rebalanced the national character. The Covid-19 response, along with the military budget, makes no attempt at such a rebalancing – which, among other things, would require policy-makers to retreat from the common assumption that government support of the public good is Communism.

It took a Civil War and three Constitutional Amendments to remove nationalized slavery from the Constitution and begin the transformation of the nation’s character on the topic of race – a transformation which current events reveal is still sadly incomplete.

What would it take to similarly realign the national character in response to the pandemic?

[1] Since we’ve been discovering and examining these for several years in this forum, in this post I’m going to depart from my usual practice of quoting and citing sources. To do otherwise would have made this post far too redundant and far too long. If you want the backstory, I invite you to examine what has gone before.

[2] My two blogs are The New Economy and the Future of Work and Iconoclast.blog. Each has its counterpart on Medium – The Econoclast and Iconoclast.blog (recent articles only).

[3] Visualcapitalist.com.

[4] Peter G. Peterson Foundation.

[5] McKee, Robert, Story: Substance, Structure, Style, and the Principles of Screenwriting (1997).

[6] See the analysis in Srinivasan, Bhu, Americana: A 400-Year History of American Capitalism (2017), and the author’s interview with the Wharton business school.

Narratives of Self, Purpose, and Meaning [Part 2]: The Supernatural

It’s Youth Group night at church; I’m a high school senior and have been tapped to give the sermon. I start with, “Religions are the vehicles through which human beings try to make sense of life.” Honest, that’s what I said. I remember writing it, I remember standing at the pulpit saying it. At home afterward my dad and my sister’s seminarian boyfriend (his name was Luther – honest) were snacking on roast preacher. “Where did you get that?” Luther asked. “‘Religions are the vehicles through which human beings try to make sense of life’ – where did you get that?” He was impressed. I don’t know, it was just an idea, it seemed obvious – religion is one of the things humans do.

Making Sense of Things

As we saw last time, religion is a “teleological”[1] strategy – it’s one of the ways we invest things, events, people, ourselves, our lives, and life in general with purpose and meaning. For many people, religion and the supernatural are the go-to standard for teleological thinking.

“Academic research shows that religious and supernatural thinking leads people to believe that almost no big life events are accidental or random. As the authors of some recent cognitive-science studies at Yale put it, ‘Individuals’ explicit religious and paranormal beliefs’ are the best predictors of their ‘perception of purpose in life events’—their tendency ‘to view the world in terms of agency, purpose, and design.’”[2]

The prefix “super” in “supernatural” means above, beyond, over, apart from. When we say supernatural, we mean there’s something or Someone out there that’s not limited to the natural world and flesh and blood, that has it all figured out, sees what we don’t see, knows what we don’t know, explains what we can’t explain, is better at life than we are. The supernatural is personified or objectified in what we call God, who has a better take than we’ll ever have. As author Madeleine L’Engle wrote: “I have a point of view. You have a point of view. God has view.”

Religion tries to teach us God’s view but generally accepts there are limits. Besides, if we could share God’s view, we wouldn’t need God anymore, we’d be God. Short of that, we can only believe God has view, and that it’s better, more complete, more perfect than our point of view. Which means that, compared to God, we and our existence are lesser, partial, flawed, while God represents the perfected version of us – what we would be if we could be God. And somehow, knowing that is a comforting thought – I know it was for me when I first began to believe in God (a couple of years after I gave that sermon), because at least God was better than the alternative, which was me having lost my bearings and making a mess of life.

“From a scientific point of view, we were not created or designed but instead are the product of evolution. The natural events that shaped our world and our own existence were not purposeful. In other words, life is objectively meaningless. From this perspective, the only way to find meaning is to create your own, because the universe has no meaning or purpose. The universe just is. Though there are certainly a small percentage of people who appear to accept this notion, much of the world’s population rejects it. For most humans, the idea that life is inherently meaningless simply will not do.”[3]

Believe First, Then Rationalize

Enter the supernatural. Now I felt better. And once I was in, I backfilled the case for believing. Over the next few years I built my case, devouring Christian apologetics and other books that were making the rounds of my collegiate fellowship. That ancillary material became part of my new religious narrative, supporting the primary doctrinal narrative.

These days, neuro-psychological research indicates that we believe first, then rationalize. Rationalizing is not the same as acting rationally. Belief in the supernatural is a story – the story we tell about ourselves and our life that gives us identity and our life purpose and meaning. To the believer, it’s nonfiction – the way things really are, who they really are. If we’re not of similar persuasion, we may think it’s fiction – a fish story, or a case of “teleological error”[4] – but neither of us can prove the other wrong. Belief is ultimately indefensible and unassailable – it’s a “first thought” from which a host of others originate. Still, we like to think our beliefs are rational, chosen in the exercise of our own free will.

Free Will (or not)

Take away free will, and you take away a key sense of personal power. Free will gives us something we can do in the face of the apparent nonsense of life: we can stem the onslaught of meaninglessness by choosing to believe – in this case, in the supernatural. We still don’t understand, we still screw up, but at least we can rely on the supernatural to understand and model what we would be like if we weren’t so… mortal.

These days, neuro-psychology also challenges our usual assumptions about the self and free will, holding that our free will isn’t as free and intentional and rational as we’d like to think. Maybe so, but at least one leading brain scientist thinks that sometimes it might be better just to fool ourselves into believing we can choose what to believe – at least we’ll feel better.

“Psychologist Dan McAdams proposes that when it comes to making sense of our lives, we create narratives or personal myths to explain where we have come from, what we do, and where we are going… These accounts are myths because they are not grounded in reality but rather follow a well-worn narrative path of a protagonist character (our self) and what the world throws at them.

“This core self, wandering down the path of development, enduring things that life throws at us is, however, the illusion. Like every other aspect of human development, the emergence of the self is epigenetic — an interaction of the genes in the environment. The self emerges out of that journey through the epigenetic landscape, combining the legacy of our genetic inheritance with the influence of the early environment to produce profound and lasting effect on how we develop socially. … These thoughts and behavior may seemingly originate from within us, but they emerge largely in a social context. In a sense, who we are comes down to those around us. We may be born with different biological properties and dispositions, but even those emerge in the context of others and in some cases can be triggered or turned off by environmental factors.

“We may feel that we are the self treading down the path of life and making our own decisions at the various junctions and forks but that would also assume that we are free to make our choices. However, the freedom to make choices is another aspect of the illusion.

“Most of us believe that, unless we are under duress or suffering from some form of mental disorder, we all have the capacity to freely make decisions and choices. This is the common belief that our decisions are not preordained and that we can choose between alternatives. This is what most people mean by having free will — the belief that human behavior is an expression of personal choice and is not determined by physical forces, fate, or God. In other words, there is a self in control.

“However, neuroscience tells us that we are mistaken and that free will is also part of the self illusion… We think we have freedom but, in fact, we do not.

“For example, I believe that the sentence that I just typed was my choice. I thought about what I wanted to say and how to say it. Not only did I have the experience of my intention to begin this line of discussion at this point but I had the experience of agency, of actually wanting it. I knew I was the one doing it. I felt the authorship of my actions.

“It seems absurd to question my free will here but, as much as I hate to admit it, these experiences are not what they seem. This is because any choices that a person makes must be the culmination of the interaction of a multitude of hidden factors ranging from genetic inheritance, life experiences, current circumstances, and planned goals. Some of these influences must also come from external sources, but they all play out as patterns of neuronal activity in the brain. This is the matrix of distributed networks of nerve cells firing across my neuronal architecture.

“My biases, my memories, my perceptions, and my thoughts are the interacting patterns of excitation and inhibition in my brain, and when the checks and balances are finally done, the resulting sums of all of these complex interactions are the decisions and the choices that I make. We are not aware of these influences because they are unconscious and so we feel that the decision has been arrived at independently — a problem that was recognized by the philosopher Spinoza when he wrote, “Men are mistaken in thinking themselves free; their opinion is made up of consciousness of their own actions, and ignorance of the causes by which they are determined.”

“Even if the self and our ability to exercise free will is an illusion, not all is lost. In fact, beliefs seem to produce consequences for our behavior.

“Beliefs about self-control, from wherever they may derive, are powerful motivators of human behavior.

“When we believe that we are the masters of our own destiny, we behave differently than those who deny the existence of free will and believe everything is determined.

“Maybe that’s why belief in free will predicts not only better job performance but also expected career success. Workers who believe in free will outperform their colleagues, and this is recognized and rewarded by their superiors. So, when we believe in free will, we enjoy life more.

“The moral of the tale is that, even if free will doesn’t exist, then maybe it is best to ignore what the neuroscientists or philosophers say. Sometimes ignorance is bliss.”[5]

It seems we often greet paradigm-shifting scientific findings with a shrug. Maybe somebody in a lab coat figured something out, but there’s no apparent impact on us. Maybe somebody says free will is nothing more than the confluence of multiple neural pathways — okay, fine, but we’ll take our own misguided, self-deceptive sense of agency any day. It’s how we’re used to feeling, and there’s no apparent downside to contradicting a bunch of intellectual hooey. In fact, the downside is all on the side of science, which wants us to think there’s no point in anything.

Plus, if we believe in the supernatural, we enjoy the safety of numbers – especially if we live in the USA, where a 2019 Gallup Poll found that 64% – 87% of us believe in God, depending on how the question was asked. (By contrast, also in 2019, the Pew Research Center found that only 4% of Americans said they were atheists.[6])

For me personally, when I first learned about neuroscience’s case against free will, it didn’t feel devastating or hopeless, didn’t throw me into a pit of despair, didn’t make me want to wallow. It was weird, but no more. I was skeptical, and still assume there’s more to be discovered before we get the whole picture, but in time I came to like the changes in outlook that the absence of God and of belief in God offered. Life and my place in it were cleaner and simpler somehow – if for no other reason than that I no longer needed to expend the energy belief in the supernatural used to require.

The Religious Brain

Also back when I first got religion, I experienced something else current neuroscience tells us: that religion shapes the brain as the brain shapes religion. Jordan Grafman, head of the cognitive neuroscience laboratory at the Rehabilitation Institute of Chicago and neurology professor at Northwestern University, says that religions and their community behavioral codes helped to make the brain what it is today, and vice versa:

“Neurotheology is important in part because early religious practices helped develop our brains to begin with. ‘Religion has played an incredibly important role in human evolution. It’s funny, people want to separate the two but in fact they’re intertwined,’ [Dr. Grafman] says.

“Of course, it’s a two-way relationship between the brain and religion. Our brains had to develop the capacity to establish social communities and behaviors, which are the basis of religious societies. But religious practice in turn developed the brain, says Grafman. ‘As these societies became more co-operative, our brains evolved in response to that. Our brain led to behavior and then the behavior fed back to our brain to help sculpt it,’ he adds.”[7]

The mutual reinforcement loop still operates, so that the brain steeped in religion gets better at religion, finds ways to reinforce and substantiate its beliefs. As a result, the religious narrative becomes more and more true the more you practice it – experience increasingly conforms to religious dictates on both an individual and community level. Neuroscientist Andrew Newberg, a pioneer of “neurotheology,” observes that the religious brain promotes social cohesiveness and conformity to social moral norms.

“‘There’s the argument that religion has benefited human beings by helping to create cohesive societies and morals and help us to determine our behavior and interact with the world more effectively,’ says Newberg. ‘The ability to think about this from a neuroscience perspective is part of that discussion.’”[8]

As a result, when you stop practicing your religious narrative, as I did, your brain circuits are no longer engaged in actively supporting it, and are now available to process alternatives. As you detach from religious immersion, your prior conviction about its truth – i.e., its ability to explain reality, which was increasingly conforming to it – fades away. At that stage, the brain’s formerly religious wiring is equally adept at promoting other individual and communal beliefs and behaviors, as well as other narratives. Andrew Newberg’s website provides a sample of research findings from his book[9] indicating that the formerly religious brain is equally adept at generating rule-breaking behavior:

“The prefrontal cortex is traditionally thought to be involved in executive control, or willful behavior, as well as decision-making. So, the researchers hypothesize, it would make sense that a practice that centers on relinquishing control would result in decreased activity in this brain area.

“A recent study that Medical News Today reported on found that religion activates the same reward-processing brain circuits as sex, drugs, and other addictive activities.

“Researchers led by Dr. Jeff Anderson, Ph.D. — from the University of Utah School of Medicine in Salt Lake City — examined the brains of 19 young Mormons using a functional MRI scanner.

“When asked whether, and to what degree, the participants were “feeling the spirit,” those who reported the most intense spiritual feelings displayed increased activity in the bilateral nucleus accumbens, as well as the frontal attentional and ventromedial prefrontal cortical loci.

“These pleasure and reward-processing brain areas are also active when we engage in sexual activities, listen to music, gamble, and take drugs. The participants also reported feelings of peace and physical warmth.

“’When our study participants were instructed to think about a savior, about being with their families for eternity, about their heavenly rewards, their brains and bodies physically responded,’ says first study author Michael Ferguson.

“These findings echo those of older studies, which found that engaging in spiritual practices raises levels of serotonin, which is the “happiness” neurotransmitter, and endorphins.

“The latter are euphoria-inducing molecules whose name comes from the phrase ‘endogenous morphine.’

“Such neurophysiological effects of religion seem to give the dictum ‘Religion is the opium of the people’ a new level of meaning.”[10]

These findings explain a range of religious behaviors: charitable good deeds, the use of music in worship, and beneficial “fellowship” dynamics at one end of the spectrum; and clergy sexual crimes, cult abuses, and terrorism at the other end. Plus, the entire spectrum is supported not only by the religious neural network, but by the brain’s addictive feel-good hormones — right alongside sex, drugs, and rock ’n’ roll.

Lost in the Story

Religious narratives draw upon ancient storytelling for their source material, making liberal use of metaphors and allegories in scripture and wisdom literature to create parables, koans, riddles, myths, fables, cautionary tales, and poetry. Religious storytelling illuminates the human condition and illustrates what happens when Earthly existence is aligned or at odds with Heavenly purpose.[11]

Normally, metaphors and allegories are representational: they describe one thing in terms of another – i.e., in the case of religion, worldly, fleshly experience in light of divine, spiritual truth. Sometimes, though, religious practice recasts human experience into literal, explicit religious storytelling, in which the devotee is “in but not of the world”[12] to an extreme. As a result, the zealot dwells in religious metaphor, views themselves and others as religious characters, and interprets circumstances in terms of religious drama. At this extreme, reality becomes a pious fantasyland, in which religious imagery supplants worldly experience. Religious storytelling no longer illustrates and represents; it becomes perceived reality, as the believer remains in a closed, self-reinforcing system. The condition is euphoric, supported by feel-good brain hormones – as close to what it feels like to have God’s view as we’ll ever get.

I know this experience well — I did this a lot in my religious days, and not just with religion, but also with film, theater, books, and other stories – just as I had as a child. I have a lively imagination and have “the ability to become easily engrossed, such as in movies, novels or daydreams”[13] – traits that make it easy for me to generate religious experience and make me a good subject for hypnosis.

The best example of this kind of religious storytelling excess that I can think of is the lyrics of a hymn I remember singing in the church where I grew up:

I love to tell the story
Of unseen things above,
Of Jesus and His glory,
Of Jesus and His love.
I love to tell the story,
Because I know ’tis true;
It satisfies my longings
As nothing else can do.

 I love to tell the story,
’Twill be my theme in glory
To tell the old, old story
of Jesus and His love.

I love to tell the story;
More wonderful it seems
Than all the golden fancies
Of all my golden dreams,
I love to tell the story,
It did so much for me;
And that is just the reason
I tell it now to thee.

I love to tell the story;
Tis pleasant to repeat
What seems each time I tell it,
More wonderfully sweet.
I love to tell the story;
For some have never heard
The message of salvation
From God’s own holy Word.

I love to tell the story;
For those who know it best
Seem hungering and thirsting
To hear it like the rest.
And when, in scenes of glory,
I sing the new, new song,
’Twill be the old, old story,
That I have loved so long.

I used to wonder why religious experiences were so easy for me, compared to other people, until I became aware of the neurological underpinnings of this cognitive disposition. Discovering it, and learning to keep it from running away with me, turned out to be a key development in my drift away from religion, and from narrative in general.

More on narratives next time.

[1] Wikipedia.

[2] Andersen, Kurt, How America Lost Its Mind – The nation’s current post-truth moment is the ultimate expression of mind-sets that have made America exceptional throughout its history, The Atlantic (Dec. 28, 2017). See also Routledge, Supernatural, op. cit.

[3] Routledge, Clay, Supernatural: Death, Meaning, and the Power of the Invisible World  (July 2, 2018)

[4] See this blog’s Narratives-Of-Self-Purpose-And-Meaning-Part-1-Fish-Stories.

[5] The Self Illusion: How the Social Brain Creates Identity, Bruce Hood (2012)

[6] The Pew Research Center report is intriguingly nuanced, and worth a look if you like this sort of thing.

[7] “The Neuroscience Argument That Religion Shaped The Very Structure Of Our Brains,” Quartz (December 3, 2016)

[8] Ibid.

[9] Newberg, Andrew, How God Changes Your Brain: Breakthrough Findings from a Leading Neuroscientist (2009)

[10] “What Religion Does To Your Brain,” Medical News Today (July 20, 2018)

[11] For more on metaphor, see the classic and definitive text Metaphors We Live By, by George Lakoff and Mark Johnson.

[12] See, for example, this online Bible study on the phrase.

[13] See The Five Traits Of A Good Hypnotic Subject, Your Visual Mind. See also Wikipedia re: “Hypnotic Susceptibility.”

Narratives of Self, Purpose, and Meaning [Part 1]: Fish Stories

A friend of mine is a Christian, business leader, author, and fisherman. He tells fish stories in each of those roles. At least it feels that way to me, so I take his stories “with a grain of salt.” A Roman luminary named Pliny the Elder[1] used that phrase in a poison antidote in 77 A.D., and he meant it literally. Today, it describes how we respond when it feels like someone’s story – like the fish – just keeps getting bigger.

I don’t care about my friend’s fish, I care about him. When he tells a fish story, he’s sharing his personal narrative. “This is who I am,” he’s saying, “And this is how I believe life works.”

“Each of us constructs and lives a ‘narrative’, wrote the British neurologist Oliver Sacks, ‘this narrative is us’. Likewise the American cognitive psychologist Jerome Bruner: ‘Self is a perpetually rewritten story.’ And: ‘In the end, we become the autobiographical narratives by which we “tell about” our lives.’ Or a fellow American psychologist, Dan P McAdams: ‘We are all storytellers, and we are the stories we tell.’ And here’s the American moral philosopher J David Velleman: ‘We invent ourselves… but we really are the characters we invent.’ And, for good measure, another American philosopher, Daniel Dennett: ‘we are all virtuoso novelists, who find ourselves engaged in all sorts of behaviour… and we always put the best “faces” on it we can. We try to make all of our material cohere into a single good story. And that story is our autobiography. The chief fictional character at the centre of that autobiography is one’s self.’”[2]

“Each of us conducts our lives according to a set of assumptions about how things work: how our society functions, its relationship with the natural world, what’s valuable, and what’s possible. This is our worldview, which often remains unquestioned and unstated but is deeply felt and underlies many of the choices we make in our lives.”[3]

The Self

This kind of narrative assumes the self is an entity all its own, with a purpose also all its own, and that if you get both in hand, you’ll know the meaning of life – at least your own. Current neuro-psychology doesn’t see things that way.

“The idea of there being a single ‘self’, hidden in a place that only maturity and adulthood can illuminate and which, like archaeologists, we might dig and dust away the detritus to find, is to believe that there is some inner essence locked within us – and that unearthing it could be a key to working out how to live the rest of our lives. This comforting notion of coming of age, of unlocking a true ‘self’ endures, even though it is out of step with current thinking in psychology, which denies a singular identity.”[4]

“From a scientific point of view, we were not created or designed but instead are the product of evolution. The natural events that shaped our world and our own existence were not purposeful. In other words, life is objectively meaningless.”[5]

For most people, that scientific outlook is too harsh:

“From this perspective, the only way to find meaning is to create your own, because the universe has no meaning or purpose. The universe just is. Though there are certainly a small percentage of people who appear to accept this notion, much of the world’s population rejects it. For most humans, the idea that life is inherently meaningless simply will not do.”[6]

Self-Actualization

Cultivating a sense of identity, purpose, and meaning sounds good, but who’s got time? Maslow’s iconic “Hierarchy of Needs” pyramid recognizes that adult life puts the basics first.

“Abraham Maslow was the 20th-century American psychologist best-known for explaining motivation through his hierarchy of needs, which he represented in a pyramid. At the base, our physiological needs include food, water, warmth and rest. Moving up the ladder, Maslow mentions safety, love, and self-esteem and accomplishment. But after all those have been satisfied, the motivating factor at the top of the pyramid involves striving to achieve our full potential and satisfy creative goals. As one of the founders of humanistic psychology, Maslow proposed that the path to self-transcendence and, ultimately, greater compassion for all of humanity requires the ‘self-actualisation’ at the top of his pyramid – fulfilling your true potential, and becoming your authentic self.”[7]

Columbia psychologist Scott Barry Kaufman thinks we ought to get self-actualization off the back burner, for the sake of ourselves and our world.

“‘We live in times of increasing divides, selfish concerns, and individualistic pursuits of power,’ Kaufman wrote recently in a blog in Scientific American introducing his new research. He hopes that rediscovering the principles of self-actualisation might be just the tonic that the modern world is crying out for.”[8]

Kaufman’s research suggests that making room for self-awareness and growth helps to develop character traits that the world could use more of:

“Participants’ total scores… correlated with their scores on the main five personality traits (that is, with higher extraversion, agreeableness, emotional stability, openness and conscientiousness) and with the metatrait of ‘stability’, indicative of an ability to avoid impulses in the pursuit of one’s goals.

“Next, Kaufman turned to modern theories of wellbeing, such as self-determination theory, to see if people’s scores on his self-actualisation scale correlated with these contemporary measures. Sure enough, he found that people with more characteristics of self-actualisation also tended to score higher on curiosity, life-satisfaction, self-acceptance, personal growth and autonomy, among other factors.

“A criticism often levelled at Maslow’s notion of self-actualisation is that its pursuit encourages an egocentric focus on one’s own goals and needs. However, Maslow always contended that it is only through becoming our true, authentic selves that we can transcend the self and look outward with compassion to the rest of humanity. Kaufman explored this too, and found that higher scorers on his self-actualisation scale tended also to score higher on feelings of oneness with the world, but not on decreased self-salience, a sense of independence and bias toward information relevant to oneself. (These are the two main factors in a modern measure of self-transcendence developed by the psychologist David Yaden at the University of Pennsylvania.)

“The new test is sure to reinvigorate Maslow’s ideas, but if this is to help heal our divided world, then the characteristics required for self-actualisation, rather than being a permanent feature of our personalities, must be something we can develop deliberately. I put this point to Kaufman and he is optimistic. ‘I think there is significant room to develop these characteristics [by changing your habits],’ he told me. ‘A good way to start with that,’ he added, ‘is by first identifying where you stand on those characteristics and assessing your weakest links. Capitalise on your highest characteristics but also don’t forget to intentionally be mindful about what might be blocking your self-actualisation … Identify your patterns and make a concerted effort to change. I do think it’s possible with conscientiousness and willpower.’”[9]

But What if There’s No Self to Actualize?

If there’s no unified self, then there’s no beneficiary for all that “concerted effort to change” and “conscientiousness and willpower.”

“The idea of there being a single ‘self’, hidden in a place that only maturity and adulthood can illuminate and which, like archaeologists, we might dig and dust away the detritus to find, is to believe that there is some inner essence locked within us – and that unearthing it could be a key to working out how to live the rest of our lives. This comforting notion of coming of age, of unlocking a true ‘self’ endures, even though it is out of step with current thinking in psychology, which denies a singular identity.”[10]

Again, it’s hard for most of us to live with that much existential angst[11]. We prefer instead to think there’s a unique self (soul) packed inside each of us, and to invest it with significance.

“From a scientific point of view, we were not created or designed but instead are the product of evolution. The natural events that shaped our world and our own existence were not purposeful. In other words, life is objectively meaningless. From this perspective, the only way to find meaning is to create your own, because the universe has no meaning or purpose. The universe just is. Though there are certainly a small percentage of people who appear to accept this notion, much of the world’s population rejects it. For most humans, the idea that life is inherently meaningless simply will not do.

“Instead, people latch onto what I call teleological thinking. Teleological thinking is when people perceive phenomena in terms of purpose. When applied to natural phenomena, this type of thinking is generally considered to be flawed because it imposes design where there is no evidence for it. To impose purpose and design where there is none is what researchers refer to as a teleological error.”[12]

Teleological thinking finds design and purpose in the material world[13] to counter the feeling that we’re at the mercy of random pointlessness. We prefer our reality to be by design, so that we have a chance to align ourselves with it – a form of personal empowerment psychologists call “agency.”

“Each of us has a story we tell about our own life, a way of structuring the past and fitting events into a coherent narrative. Real life is chaotic; life narratives give it meaning and structure.”[14]

The Coming of Age Narrative

Further, we look to a specific cultural rite of passage – when we “come of age” in late adolescence — as the time when we first discover and take responsibility for our unique self and its identity and purpose. From there, we carry that sense of who we are and where we fit into responsible adult life.

“The protagonist has the double task of self-integration and integration into society… Take, for instance, the fact that the culminating fight scene in most superhero stories occurs only after the hero has learned his social lesson – what love is, how to work together, or who he’s ‘meant to be’. Romantic stories climax with the ultimate, run-to-the-airport revelation. The family-versus-work story has the protagonist making a final decision to be with his loved ones, but only after almost losing everything. Besides, for their dramatic benefit, the pointedness and singular rush of these scenes stems from the characters’ desire to finally gain control of their self: to ‘grow up’ with one action or ultimate understanding.”[15]

The Redemption Narrative

The coming of age story is a variant of the “redemption” narrative, in which we learn that suffering is purposeful: it shapes and transforms us, so we can take our place in society.

“For the past 15 years, Daniel McAdams, professor of psychology at Northwestern University in Illinois, has explored this story and its five life stages: (1) an early life sense of being somehow different or special, along with (2) a strong feeling of moral steadfastness and determination, ultimately (3) tested by terrible ordeals that are (4) redeemed by a transformation into positive experiences and (5) zeal to improve society.

“This sequence doesn’t necessarily reflect the actual events of the storyteller’s life, of course. It’s about how people interpret what happened – their spin, what they emphasise in the telling and what they discard.” [16]

Redemption narratives make us good citizens, and never mind if there’s some ego involved:

“In his most recent study, the outcome of years of intensive interviews with 157 adults, McAdams has found that those who adopt [redemption narratives] tend to be generative – that is, to be a certain kind of big-hearted, responsible, constructive adult.

“Generative people are deeply concerned about the future; they’re serious mentors, teachers and parents; they might be involved in public service. They think about their legacy, and want to fix the world’s problems.

“But generative people aren’t necessarily mild-mannered do-gooders. Believing that you have a mandate to fix social problems – and that you have the moral authority and the ability to do so – also requires a sense of self-importance, even a touch of arrogance.”[17]

The American Way

Coming of age and redemption stories have been culturally and neurologically sustained in Western and Middle Eastern civilizations since the Abrahamic scriptures first told of the Garden of Eden some 5,500 years ago. Americans, as heirs of this ideological legacy, have perfected it.

“For Americans, the redemption narrative is one of the most common and compelling life stories. In the arc of this life story, adversity is not meaningless suffering to be avoided or endured; it is transformative, a necessary step along the road to personal growth and fulfilment.”[18]

“The coming-of-age tale has become a peculiarly American phenomenon, since self-understanding in the United States is largely predicated on a self-making mythos. Where, in Britain, one might be asked about one’s parents, one’s schooling or one’s background, Americans seem less interested in a person’s past and more interested in his or her future. More cynical observers have claimed, perhaps rightly, that this is because Americans don’t have a clear history and culture; but the coming-of-age tale has also become important in the US because of a constant – maybe optimistic, maybe pig-headed – insistence that one can always remake oneself. The past is nothing; the future is everything.

“This idea of inherent, Adam-and-Eve innocence, and the particularly American interest in it, is perhaps tantamount to a renunciation of history. Such denialism infuses both American stories and narratives of national identity, said Ihab Hassan, the late Arab-American literary theorist. In any case, the American tale of growing up concerns itself with creating a singular, enterprising self out of supposed nothingness: an embrace of the future and its supposedly infinite possibilities.”[19]

American capitalism relies on the redemption narrative as its signature story genre.

“From a more sociological perspective, the American self-creation myth is, inherently, a capitalist one. The French philosopher Michel Foucault theorised that meditating and journaling could help to bring a person inside herself by allowing her, at least temporarily, to escape the world and her relationship to it. But the sociologist Paul du Gay, writing on this subject in 1996, argued that few people treat the self as Foucault proposed. Most people, he said, craft outward-looking ‘enterprising selves’ by which they set out to acquire cultural capital in order to move upwards in the world, gain access to certain social circles, certain jobs, and so on. We decorate ourselves and cultivate interests that reflect our social aspirations. In this way, the self becomes the ultimate capitalist machine, a Pierre Bourdieu-esque nightmare that willingly exploits itself.

“Even the idea that there is a discrete transition from youth into adulthood, either via a life-altering ‘feeling’ or via the culmination of skill acquisition, means that selfhood is a task to be accomplished in the service of social gain, and in which notions of productivity and work can be applied to one’s identity. Many students, for instance, are encouraged to take ‘gap years’ to figure out ‘who they are’ and ‘what they want to do’. (‘Do’, of course, being a not-so-subtle synonym for ‘work’.) Maturation is necessarily related to finances, and the expectation of most young people is that they will become ‘independent’ by entering the workforce. In this way, the emphasis on coming of age reifies the moral importance of work.”[20]

As usual, Silicon Valley is ahead of the game, having already harnessed the power of the redemption story as its own cultural norm:

“In Silicon Valley these days, you haven’t really succeeded until you’ve failed, or at least come very close. Failing – or nearly failing – has become a badge of pride. It’s also a story to be told, a yarn to be unspooled.

“The stories tend to unfold the same way, with the same turning points and the same language: first, a brilliant idea and a plan to conquer the world. Next, hardships that test the mettle of the entrepreneur. Finally, the downfall – usually, because the money runs out. But following that is a coda or epilogue that restores optimism. In this denouement, the founder says that great things have or will come of the tribulations: deeper understanding, new resolve, a better grip on what matters.

“Unconsciously, entrepreneurs have adopted one of the most powerful stories in our culture: the life narrative of adversity and redemption.”[21]

Writing Your Own Story

There’s nothing like a good story to make you rethink your life. A bookseller friend’s slogan for his shop is “Life is a story. Tell a good one.”

“The careers of many great novelists and filmmakers are built on the assumption, conscious or not, that stories can motivate us to re-evaluate the world and our place in it.

“New research is lending texture and credence to what generations of storytellers have known in their bones – that books, poems, movies, and real-life stories can affect the way we think and even, by extension, the way we act.

“Across time and across cultures, stories have proved their worth not just as works of art or entertaining asides, but as agents of personal transformation.”[22]

As a result, some people think we ought to take Michel Foucault’s advice and meditate (practice “mindfulness”) and journal our way to a better self-understanding. As for journaling:

“In truth, so much of what happens to us in life is random – we are pawns at the mercy of Lady Luck. To take ownership of our experiences and exert a feeling of control over our future, we tell stories about ourselves that weave meaning and continuity into our personal identity. Writing in the 1950s, the psychologist Erik Erikson put it this way:

“To be adult means among other things to see one’s own life in continuous perspective, both in retrospect and in prospect … to selectively reconstruct his past in such a way that, step for step, it seems to have planned him, or better, he seems to have planned it.

“Intriguingly, there’s some evidence that prompting people to reflect on and tell their life stories – a process called ‘life review therapy’ – could be psychologically beneficial.”[23]

Consistent with Scott Barry Kaufman’s comments from earlier, the more you can put a coming of age or redemption story spin on your own narrative, the more likely it is that journaling will improve your outlook.

“A relevant factor in this regard is the tone, complexity and mood of the stories that people tell themselves. For instance, it’s been shown that people who tell more positive stories, including referring to more instances of personal redemption, tend to enjoy higher self-esteem and greater ‘self-concept clarity’ (the confidence and lucidity in how you see yourself). Perhaps engaging in writing or talking about one’s past will have immediate benefits only for people whose stories are more positive.

“It remains unclear exactly why the life-chapter task had the self-esteem benefits that it did. It’s possible that the task led participants to consider how they had changed in positive ways. They might also have benefited from expressing and confronting their emotional reactions to these periods of their lives – this would certainly be consistent with the well-documented benefits of expressive writing and ‘affect labelling’ (the calming effect of putting our emotions into words).

“The researchers said: ‘Our findings suggest that the experience of systematically reviewing one’s life and identifying, describing and conceptually linking life chapters may serve to enhance the self, even in the absence of increased self-concept clarity and meaning.’”[24]

An American Life

My friend the storyteller is an exemplar of all the above. He’s an American, a Christian, and a capitalist. And when he starts his day by journaling, he believes he’s writing what he’s hearing from God. I was most of that, too, for the couple of decades he and I shared narratives and a teleological outlook. I’ve since moved on: at this writing, we’ve had no contact for over three years. I wondered if I could still call him a friend — whether that term still applies after your stories diverge as entirely as ours have. Yes you can and yes it does, I decided, although I honestly can’t say why.

Religion: Teleological Thinking Perfected

Personal narratives – especially actually writing your own story – aren’t for everyone. They require quiet, solitude, and reflection, plus doing that feels egotistical if you’re not used to it. Religion offers a more common teleological alternative, with its beliefs, rituals, and practices designed to put you in touch with an external, transcendent source of your identity, purpose, and meaning. “Don’t look inward, look up,” is its message.

We’ll look at that next time.

[1] Wikipedia. Pliny the Elder was a naturalist, military leader, friend of the Emperor, and a victim of the Vesuvius eruption.

[2] I Am Not a Story: Some find it comforting to think of life as a story. Others find that absurd. So are you a Narrative or a non-Narrative? Aeon (Sept. 3, 2015)

[3] Lent, Jeremy, The Patterning Instinct: A Cultural History of Humanity’s Search for Meaning (2017)

[4] The Coming-Of-Age Con: How can you go about finding ‘who you really are’ if the whole idea of the one true self is a big fabrication? Aeon (Sept. 8, 2017)

[5] Routledge, Clay, Supernatural: Death, Meaning, and the Power of the Invisible World  (2018)

[6] Ibid.

[7] Do You Have A Self-Actualised Personality? Maslow Revisited. Aeon (Mar. 5, 2019)

[8] Ibid.

[9] Ibid.

[10] The Coming-Of-Age Con, op. cit.

[11] Urban Dictionary: existential angst.

[12] Routledge, Clay, Supernatural: Death, Meaning, and the Power of the Invisible World  (July 2, 2018)

[13] Wikipedia.

[14] Silicon Phoenix: A Gifted Child, An Adventure, A Dark Time, And Then … A Pivot? How Silicon Valley Rewrote America’s Redemption Narrative, Aeon Magazine (May 2, 2016)

[15] The Coming-Of-Age Con, op. cit.

[16] Silicon Phoenix, op. cit.

[17] Silicon Phoenix, op. cit.

[18] Silicon Phoenix, op. cit.

[19] The Coming-Of-Age Con, op. cit.

[20] Silicon Phoenix, op. cit.

[21] Silicon Phoenix, op. cit.

[22] The Power of Story, op. cit.

[23] To Boost Your Self-Esteem, Write About Chapters of Your Life. Aeon (Apr. 5, 2019)

[24] Ibid.

Reborn Losers: Christian Cosmology and Worldview Are a Setup to Failure

Christian cosmology and worldview are complicated, stressful, and impossible. Trying to comply with them is a setup to failure. That failure begins with the concept of who we are as human beings living in human bodies.

I was a Christian, now I’m not. Sometimes I find it useful to write about what I believed then and compare it to what I don’t believe now. I try to express it simply, avoid religious assumptions and overtones, resist the urge to cringe at what I used to think and exult in what I think now. Instead, I try to lay aside judgment, notice what comes up, and wonder about it. That’s the ideal, anyway — sometimes it’s more difficult than others to remain dispassionate. Today was one of those times.

I wrote about cosmology (how the universe is organized) and worldview (how life works on Earth). Reading it afterward, it seemed that the Christian beliefs, institutions, and culture that dominated my life — and have dominated Western thought for two millennia — are about equal parts quaint and fantasy. I didn’t see it that way when I was immersed in them, but my last several years of study — especially neuroscience, psychology, and history — have upended my former cosmology and worldview, and taken my self-concept with them.

I previously understood “reality” and my place in it by reference to a Truth outside of me. Today, I’m aware that everything I experience – including what I believe or not – is processed within my biological being.[1] My sense of self and reality is now physical, not spiritual.

That shift has brought new clarity, simplicity, decisiveness, energy, focus, hope, joy, freedom, gratitude, and lots of other new dynamics I really like. By contrast, what struck me most about my former beliefs was how complicated they were, how stressful to maintain, and ultimately how impossible. Clinging to them was a setup to failure – I especially like being free of that.

The Trouble Starts With A Soul

Approaching life here by reference to a Truth out there leads us to believe in things that exist outside of us – in people, in ideas, in entities, in institutions…. That kind of thinking derives naturally from another foundational belief: that each person has an independent existence – a soul living inside their body – that sorts through available belief options and chooses this one over that.

“If you were to ask the average person in the street about their self, they would most likely describe the individual who inhabits their body. They believe they are more than just their bodies. Their bodies are something their selves control. When we look in the mirror, we regard the body as a vessel we occupy.

“This sense that we are individuals inside bodies is sometimes called the ‘ego theory,’ although philosopher Galen Strawson captures it poetically in what he calls the ‘pearl view’ of the self. The pearl view is the common notion that our self is an essential entity at the core of our existence that holds steady throughout our life. The ego experiences life as a conscious, thinking person with a unique historical background that defines who he or she is. This is the ‘I’ that looks back in the bathroom mirror and reflects who is the ‘me.’”[2]

My Christian worldview bought all that, and also held that the soul is our highest and best self, because it came from where Truth dwells. It also held that it’s hard on a soul to be in a human body. The doctrinal specifics vary – we deliberately chose to screw things up and our souls took the hit for it, our souls got damaged in transit or in installation, or there was a flaw in the source code that eventually moved them away from their ideal nature, etc. – but the end result is that the soul’s potential good influence is minimized or lost, leaving us in the throes of “sin” – falling short of the perfect divine plan for what our souls could have been if they hadn’t gotten fouled up. And since the soul’s waywardness is foundational, its problem isn’t just sin but “original sin” – the beginning of all our troubles. We don’t just struggle with garden-variety human nature, which is bad enough, but with “the flesh,” which is worse, in fact so dreadful that it puts our eternal destiny in jeopardy.

That’s where it all begins: with a divine, timeless, perfect soul trapped in an imperfect human body. The result is a hapless human subject to all kinds of cosmic misfortune.

And it only gets worse from there.

The Cosmology and Worldview That Was (And Still Is)

It’s tricky to line up a flawed soul in a flawed body with an external perfect standard of Truth. As a result, we’re constantly screwing up our reality compared to Reality. Plus there’s the problem of perception and deception – not seeing Reality clearly – and the problem of temptation – enticements preying on our fleshly nature that just aren’t going to end well. It’s hard to keep a clear head in the midst of those pressures, and for that we have experts – people we have to trust to know things about Reality that the rest of us don’t.

But sooner or later all fall down – experts along with everybody else. Birth is the soul’s doorway into its precarious life in the flesh, and death is the doorway out. It would be nice if the door had been designed to swing both ways so we could check in with Truth and get straightened out now and then, but it shuts firmly in both directions, and no peeking. Which means our attempts to live here by reference to what’s over there are always seriously handicapped.

Sometimes you hear about people who get a backstage pass to go there and come back, and then they write books about it and go on tour and tell us what it’s like. That makes them a special kind of expert, but their reports often are full of all sorts of universality, which makes them doctrinally suspect. Fortunately, there are superhuman beings – kind of like us, kind of not, but at least conscious like us, and able to communicate – to help us out. Sometimes they make the trip over here, sometimes they snatch someone from here and show them around over there and then send them back, sometimes they open up a clear channel to communicate with somebody over here, and sometimes – and this is the best – they can be born as one of us and not have a problem with losing their soul’s connection to Truth while they’re here. The point is, one way or another, when they really need to communicate with us, they figure out how.

The whole lot of them rank higher than we do: the human race is in charge of the Earth, but they’re in charge of us (and everything else). God outranks everyone, of course – He[3] created everything, including them and us, and although the whole thing sure looks like a mess to us it doesn’t look that way to Him – or to them either, I guess. God is the ultimate creator, communicator, executive, and enforcer, and He has more consciousness than all the rest of us combined.

“Across all cultures and all religions, universally, people consider God to be a conscious mind. God is aware. God consciously chooses to make things happen. In physical reality the tree fell, the storm bowled over a house, the man survived the car crash, the woman died prematurely, the earth orbits the sun, the cosmos exists. For many people these events, big and small, must have a consciousness and an intentionality behind them. God is that consciousness.”[4]

Of course, God is busy, which is why He has all these underlings. They’re arranged in a hierarchy – it just makes sense that they would be – and range from great big scary powerful cosmic superheroes who get to make great big scary visitations and announcements and cause all kinds of great big scary events, all the way down to petty bureaucrats, drones, and proles just doing their dull but necessary jobs (but even they outrank us in the grand cosmic scheme).

“When our anthropomorphism is applied to religious thought, it’s notably the mind, rather than the body, that’s universally applied to spirits and gods. In the diverse cultures of the world, gods come in all shapes and sizes, but one thing they always share is a mind with the ability to think symbolically just like a human. This makes sense in light of the critical importance of theory of mind in the development of our social intelligence: if other people have minds like ours, wouldn’t that be true of other agents we perceive to act intentionally in the world?”[5]

These conscious beings from over there sometimes pick a human or a whole tribe of humans to mediate Truth to the rest of us. Those people get a special supernatural security clearance, and we give their key personnel special titles like prophet and priest.

So far so good, but even Truth – also known as Heaven – has its internal power struggles. There’s a war over there between good and evil, God and Satan, angels and demons, and other kinds of beings in the high places, and some of it spills over into reality on our side of the divide. We therefore need to be careful about which of our experts are authentic and which aren’t, who they’re really serving and who they aren’t. The stakes are high, and if we’re wrong we’re going to pay with a lot of pain and suffering, both in this life and forever when we go through death’s one-way door.

And just to make things more complicated, these other-worldly beings sometimes use human experts as their agents, and they can be undercover. Plus, to make things impossibly, incomprehensibly complicated for our by now totally overtaxed souls, God and the other good guys sometimes take a turn at being deceptive themselves. The Cosmic Screenwriter apparently thought of everything in a bid to make our predicament as over-the-top bad as possible. In fact, some of what’s going on behind the scenes, taken right out of the Bible, would make a modern fantasy series blush with inadequacy – for example the part about the war in high places[6]:

“Ask, for instance, the average American Christian – say, some genial Presbyterian who attends church regularly and owns a New International Version of the Bible – what gospel the Apostle Paul preached. The reply will fall along predictable lines: human beings, bearing the guilt of original sin and destined for eternal hell, cannot save themselves through good deeds, or make themselves acceptable to God; yet God, in his mercy, sent the eternal Son to offer himself up for our sins, and the righteousness of Christ has been graciously imputed or imparted to all who have faith…. Some details might vary, but not the basic story.

“Paul’s actual teachings, however, as taken directly from the Greek of his letters, emphasise neither original guilt nor imputed righteousness (he believed in neither), but rather the overthrow of bad angels…

“The essence of Paul’s theology is something far stranger, and unfolds on a far vaster scale… For Paul, the present world-age is rapidly passing, while another world-age differing from the former in every dimension – heavenly or terrestrial, spiritual or physical – is already dawning. The story of salvation concerns the entire cosmos; and it is a story of invasion, conquest, spoliation and triumph.

“For Paul, the cosmos has been enslaved to death, both by our sin and by the malign governance of those ‘angelic’ or ‘daemonian’ agencies who reign over the earth from the heavens, and who hold spirits in thrall below the earth. These angelic beings, these Archons, whom Paul calls Thrones and Powers and Dominations and Spiritual Forces of Evil in the High Places, are the gods of the nations. In the Letter to the Galatians, he even hints that the angel of the Lord who rules over Israel might be one of their number. Whether fallen, or mutinous, or merely incompetent, these beings stand intractably between us and God.

“In descending to Hades and ascending again through the heavens, Christ has vanquished all the Powers below and above that separate us from the love of God, taking them captive in a kind of triumphal procession. All that now remains is the final consummation of the present age, when Christ will appear in his full glory as cosmic conqueror, having ‘subordinated’ (hypetaxen) all the cosmic powers to himself – literally, having properly ‘ordered’ them ‘under’ himself – and will then return this whole reclaimed empire to his Father. God himself, rather than wicked or inept spiritual intermediaries, will rule the cosmos directly.”

Okay then.

But despite all this vast, elaborate cosmic tangle, over there mostly keeps its own counsel about it all, while still not letting us off the hook. And, although it’s tempting, I won’t even get into all the subterfuge and confusion and (over here, at least) just plain stupidity about when the whole mess is going to resolve into that final day when “God himself, rather than wicked or inept spiritual intermediaries, will rule the cosmos directly.”

And On It Goes (And it went on way too long already, but I wanted to make a point.)

Western culture has been living with all that for over two millennia. A couple hundred years ago, in a time we call “The Great Enlightenment,” some thinkers started trying to convince us that enough is enough, maybe we ought to try out a different cosmology and worldview, based on rational thought and not just fantasy and belief. There’ve been some takers, but overall the Great Endarkenment has rolled on. I’m not as old as Yoda, but I’ve personally seen, heard, and lived all of it. A whole bunch of people in the States still do, and not all of them live in Texas.

The cosmology and worldview I just reviewed are complicated, fanciful, stressful, and impose impossible demands on that impaired soul seeing it all through a glass darkly. No wonder belief systems – both secular and religious – devolve into take-it-or-leave-it fundamentalism, where questioning is punished by both God and man, and you can delegate your cosmic responsibilities to the demigods in charge. Fundamentalism dispatches our impossible obligations and blinds us to what the Bible itself says is the final outcome of all our believing: The Big Fail.

The Big Fail

We really should have seen it coming – the Bible lays out the ultimate terms of what it means to believe all of this in brutally unmistakable terms. At the end of a much-quoted and much-beloved recitation of faith heroes, the Epistle to the Hebrews provides this summary of what it means to be your highest and best self:

“Some were tortured, refusing to accept release, so that they might rise again to a better life. Others suffered mocking and flogging, and even chains and imprisonment. They were stoned, they were sawn in two, they were killed with the sword. They went about in skins of sheep and goats, destitute, afflicted, mistreated—of whom the world was not worthy—wandering about in deserts and mountains, and in dens and caves of the earth.

“And all these, though commended through their faith, did not receive what was promised,”[7]

That’s how it ends: total failure — all promises broken, all expectations dashed, all frauds revealed … after it’s way too late for any remedy.

Can We Find a Better Way?

Yes, I am aware that there’s one last phrase in that passage:

“…since God had provided something better for us, that apart from us they should not be made perfect.”[8]

What precisely is that “something better”? I’m clueless, but all the obvious difficulties don’t stop at least one thinker[9] from trying to preserve the value of the soul as our highest and best self, even if modern neuroscience has finally ended its sufferings. The key, he says, is to reinvent the soul to make it relevant to modernity:

“What is the point of gaining the whole world if you lose your soul? Today, far fewer people are likely to catch the scriptural echoes of this question than would have been the case 50 years ago. But the question retains its urgency. We might not quite know what we mean by the soul any more, but intuitively we grasp what is meant by the loss in question – the kind of moral disorientation and collapse where what is true and good slips from sight, and we find we have wasted our lives on some specious gain that is ultimately worthless.

“It used to be thought that science and technology would gain us the world. But it now looks as though they are allowing us to destroy it. The fault lies not with scientific knowledge itself, which is among humanity’s finest achievements, but with our greed and short-sightedness in exploiting that knowledge. There’s a real danger we might end up with the worst of all possible scenarios – we’ve lost the world, and lost our souls as well.

“But what is the soul? The modern scientific impulse is to dispense with supposedly occult or ‘spooky’ notions such as souls and spirits, and to understand ourselves instead as wholly and completely part of the natural world, existing and operating through the same physical, chemical and biological processes that we find anywhere else in the environment.

“We need not deny the value of the scientific perspective. But there are many aspects of human experience that cannot adequately be captured in the impersonal, quantitatively based terminology of scientific enquiry. The concept of the soul might not be part of the language of science; but we immediately recognise and respond to what is meant in poetry, novels and ordinary speech, when the term ‘soul’ is used, in that it alerts us to certain powerful and transformative experiences that give meaning to our lives.

“Such precious experiences depend on certain characteristic human sensibilities that we would not wish to lose at any price. In using the term ‘soul’ to refer to them, we don’t have to think of ourselves as ghostly immaterial substances. We can think of ‘soul’ as referring, instead, to a set of attributes of cognition, feeling and reflective awareness – that might depend on the biological processes that underpin them, and yet enable us to enter a world of meaning and value that transcends our biological nature.

“Entering this world requires distinctively human qualities of thought and rationality. But we’re not abstract intellects, detached from the physical world, contemplating it and manipulating it from a distance. To realise what makes us most fully human, we need to pay attention to the richness and depth of the emotional responses that connect us to the world. Bringing our emotional lives into harmony with our rationally chosen goals and projects is a vital part of the healing and integration of the human soul.”

Full Acceptance

It seems honorable that someone would attempt this kind of synthesis, but I personally don’t see anything worth salvaging. Instead, I think this might be a good time to acknowledge something that Christianity’s troublesome cosmology and worldview have dismissed all along: human nature. In that regard, I find the following thoughts from a writer I particularly admire[10] to be bracingly clarifying, and in that, hopeful:

“Our collective and personal histories — the stories we tell about ourselves to ourselves and others — are used to avoid facing the incoherence and fragmentation of our lives. Chaos, chance and irrational urges, often locked in our unconscious, propel, inform and direct us. Our self is elusive. It is not fixed. It is subject to forces often beyond our control. To be human is to be captive to these forces, forces we cannot always name or understand. We mutate and change. We are not who we were. We are not who we will become. The familiarity of habit and ritual, as well as the narratives we invent to give structure and meaning to our life, helps hide this fragmentation. But human life is fluid and inconsistent. Those who place their faith in a purely rational existence begin from the premise that human beings can have fixed and determined selves governed by reason and knowledge. This is itself an act of faith.

“We can veto a response or check an impulse, reason can direct our actions, but we are just as often hostage to the pulls of the instinctual, the irrational, and the unconscious. We can rationalize our actions later, but this does not make them rational. The social and individual virtues we promote as universal values that must be attained by the rest of the human species are more often narrow, socially conditioned responses hardwired into us for our collective and personal survival and advancements. These values are rarely disinterested. They nearly always justify our right to dominance and power.

“We do not digest every sensation and piece of information we encounter. To do so would leave us paralyzed. The bandwidth of consciousness – our ability to transmit information measured in bits per second – is too narrow to register the enormous mass of external information we receive and act upon… We have conscious access to about a millionth of the information we use to function in life. Much of the information we receive and our subsequent responses do not take place on the level of consciousness. As the philosopher John Gray points out, irrational and subconscious forces, however unacknowledged, are as potent within us as in others. [citing Gray, Straw Dogs]

“To accept the intractable and irrational forces that drive us, to admit that these forces are as entrenched in us as in all human beings, is to relinquish the fantasy that the human species can have total, rational control over human destiny. It is to accept our limitations, to live within the confines of human nature. Ethical, moral, religious, and political systems that do not concede these stark assumptions have nothing to say to us.”

We are not going to “conquer our humanness” by continuing our fundamentalist allegiance to a complicated, stressful, and self-negating cosmology and worldview. How about if instead we try full acceptance of our conflicted and flawed humanity, where we find not grandiose visions but simple hope for our small todays?

[1] I also believe there is an independent reality that is more than my brain’s construction of it. Not everyone thinks so. Maybe more on that another time.

[2] Hood, Bruce, The Self Illusion: How the Social Brain Creates Identity (2012)

[3] We get that theoretically God, as a spiritual being, probably wouldn’t have a gender, but we’re generally more comfortable giving him the male pronouns.

[4] Graziano, Michael S. A., Consciousness and the Social Brain (2013)

[5] Lent, Jeremy, The Patterning Instinct: A Cultural History of Humanity’s Search for Meaning (2017)

[6] Hart, David Bentley, Everything You Know About The Gospel Of Paul Is Likely Wrong, Aeon (Jan. 8, 2018). David Bentley Hart is an Eastern Orthodox scholar of religion and a philosopher, writer and cultural commentator, who recently published a translation of The New Testament (2017).

[7] Hebrews 11: 35-39.

[8] Hebrews 11: 40.

[9] Cottingham, John, What is the soul if not a better version of ourselves? Aeon (Mar. 11, 2020). John Cottingham is professor emeritus of philosophy at the University of Reading, professor of philosophy of religion at the University of Roehampton, London, and an honorary fellow of St John’s College, Oxford University.

[10] Hedges, Chris, I Don’t Believe in Atheists: The Dangerous Rise of the Secular Fundamentalist (2008)

 

Debunking the Debunkers

Then I’ll get on my knees and pray
We don’t get fooled again

Meet the new boss
Same as the old boss

The Who[1]

Debunkers believe we’ll be better off without all the bunk. If only it were that simple: the basic premise of debunking might not hold up, the “truth” that lies on the other side of bunk is elusive, and there are strong social forces that oppose it. Plus, once free of it, we tend to replace old bunk with new.

UVA Professor Emily Ogden defines “bunk”:

“‘Bunk’ means baloney, hooey, bullshit. Bunk isn’t just a lie, it’s a manipulative lie, the sort of thing a con man might try to get you to believe in order to gain control of your mind and your bank account. Bunk, then, is the tool of social parasites, and the word ‘debunk’ carries with it the expectation of clearing out something that is foreign to the healthy organism. Just as you can deworm a puppy, you can debunk a religious practice, a pyramid scheme, a quack cure. Get rid of the nonsense, and the polity – just like the puppy – will fare better. Con men will be deprived of their innocent marks, and the world will take one more step in the direction of modernity.”[2]

Sounds great, but can debunking actually deliver?

“Debunk is a story of modernity in one word – but is it a true story? Here’s the way this fable goes. Modernity is when we finally muster the reason and the will to get rid of all the self-interested deceptions that aristocrats and priests had fobbed off on us in the past. Now, the true, healthy condition of human society manifests itself naturally, a state of affairs characterised by democracy, secular values, human rights, a capitalist economy and empowerment for everyone (eventually; soon). All human beings and all human societies are or ought to be headed toward this enviable situation.”

Once somebody calls something a “fable” you know it’s in trouble. Plus, there’s no indisputable “truth” waiting to be found once the bunk is cleared out.

“There is no previously existing or natural secular order that will assert itself when we get the bunk out… There is no neutral, universal goal of progress toward which all peoples are progressing; instead, the claim that such a goal ought to be universal has been a means of exploiting and dispossessing supposedly ‘backward’ peoples.”

The underlying problem with debunking seems to be the assumptions we make — about what’s true and false, what we’ll find when we sort one from the other, and most importantly, who’s qualified to do that. Debunking requires what cultural anthropologist Talal Asad has called “secular agents” – a species that may not actually exist.

“Secular agency is the picture of selfhood that Western secular cultures have often wanted to think is true. It’s more an aspiration than a reality. Secular agents know at any given moment what they do and don’t believe. When they think, their thoughts are their own. The only way that other people’s thoughts could become theirs would be through rational persuasion. Along similar lines, they are the owners of their actions and of their speech. When they speak, they are either telling the truth or lying. When they act, they are either sincere or they are faking it… Modernity, in this picture, is when we take responsibility for ourselves, freeing both society and individuals from comforting lies.”

I.e., secular agency is a high standard we mostly fall short of. Instead, we do our best to conform to social conventions even if we don’t personally buy into them. A sports star points to the sky after a home run, a touchdown, a goal, acknowledging the help of somebody or Somebody up there… a eulogy talks about a deceased loved one “looking down on us”… a friend asks us to “think good thoughts” for a family member going into surgery… We don’t buy the Somebody up there helping us, the “looking down,” or the “good thoughts,” but we don’t speak up. Instead, we figure there’s a time and place for honesty and confrontation, and this isn’t one of them.[3]

“Life includes a great many passages in which we place the demands of social bonds above strict truth…. [In] the context of some of the stories we tell collaboratively in our relationships with others, the question of lying or truth does not arise. We set it aside. We apply a different framework, something more like the framework we apply to fiction: we behave as if it were true.”

So what’s left of debunking? Well, it still has its place, especially when it’s used to call the Bunk Lords to account.

“What then is debunking? It can be a necessary way of setting the record straight. I’m by no means opposed to truth-telling. We need fact-checkers. The more highly placed the con artist, the more his or her deceptions matter. In such cases, it makes sense to insist on hewing to the truth.

“[On the other hand,] the social dynamics of debunking should not be overlooked …, especially when the stakes aren’t particularly high – when the alleged lie in question is not doing a whole lot of harm.”

To Play Along or Not to Play Along

When I was a late adolescent and lurching my way toward the Christian faith, a seminary student advised me that, “Sometimes you just need to act as if something is true. You do that long enough, and maybe it will actually become true” – which I took to mean that, even if you’re full of yourself right now, in the long haul you might be happier fitting in.

Maybe, maybe not. You might also feel that, since the things we believe are always in progress anyway, why not be real about what’s up for you right now?

“At these times, what is debunking? It’s a performed refusal to play along.… It’s the announcement that one rejects the as-if mode in which we do what social bonds require.”

Plus, there seems to be a countervailing urge that sometimes prevails over socially playing nice: when we feel like we finally got it figured out, the scales fell from our eyes and we can see clearly now, we can see life for what it really is… Get to that beatific place, and you want to tell everybody, even if it steps on their toes – which it does, but being newly enlightened and detoxed, you can’t help yourself.

Thus the “as if” game becomes a choice: playing along preserves social currency, opting out drains it. Which do you want?

Why Bother?

There’s also the “Why bother?” issue. Debunking is often preaching to the choir while the unconverted stay that way – in fact, they never even hear what you have to say; it never shows up in their feed.

“The theory of cognitive dissonance—the extreme discomfort of simultaneously holding two thoughts that are in conflict—was developed by the social psychologist Leon Festinger in the 1950s. In a famous study, Festinger and his colleagues embedded themselves with a doomsday prophet named Dorothy Martin and her cult of followers who believed that spacemen called the Guardians were coming to collect them in flying saucers, to save them from a coming flood. Needless to say, no spacemen (and no flood) ever came, but Martin just kept revising her predictions. Sure, the spacemen didn’t show up today, but they were sure to come tomorrow, and so on. The researchers watched with fascination as the believers kept on believing, despite all the evidence that they were wrong.

“‘A man with a conviction is a hard man to change,’ Festinger, Henry Riecken, and Stanley Schachter wrote in When Prophecy Fails, their 1957 book about this study. ‘Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point … Suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before.’

“This doubling down in the face of conflicting evidence is a way of reducing the discomfort of dissonance, and is part of a set of behaviors known in the psychology literature as ‘motivated reasoning.’ Motivated reasoning is how people convince themselves or remain convinced of what they want to believe—they seek out agreeable information and learn it more easily; and they avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs.

“Though false beliefs are held by individuals, they are in many ways a social phenomenon. Dorothy Martin’s followers held onto their belief that the spacemen were coming … because those beliefs were tethered to a group they belonged to, a group that was deeply important to their lives and their sense of self.

“[A disciple who ignored mounting evidence of sexual abuse by his guru] describes the motivated reasoning that happens in these groups: ‘You’re in a position of defending your choices no matter what information is presented,’ he says, ‘because if you don’t, it means that you lose your membership in this group that’s become so important to you.’ Though cults are an intense example, … people act the same way with regard to their families or other groups that are important to them.”[4]

In light of all this cognitive self-preservation, not rocking the boat can seem like the more reasonable choice:

“Humans’ biggest advantage over other species is our ability to cooperate. Cooperation is difficult to establish and almost as difficult to sustain.

“Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.

“‘Reason is an adaptation to the hypersocial niche humans have evolved for themselves,’ [the authors of a seminal study] write. Habits of mind that seem weird or goofy or just plain dumb from an ‘intellectualist’ point of view prove shrewd when seen from a social ‘interactionist’ perspective.”[5]

But even if acting as-if is socially acceptable, sometimes you just can’t help but go after it.

Take “magical thinking” for example — a socially acceptable practice and favorite debunking target.

Magical thinking is based on a claim of cause and effect, and therefore offers a sense of predictability and control. It sounds scientific and reasonable, which makes it socially acceptable, but it’s neither: it’s faux science because you can’t test or verify it, and it’s not reasonable because there’s no logic to it – you can only believe it or not. The masquerade makes it a prime target for debunking.

“Magical thinking [is] the belief that one’s ideas, thoughts, actions, words, or use of symbols can influence the course of events in the material world. Magical thinking presumes a causal link between one’s inner, personal experience and the external physical world. Examples include beliefs that the movement of the Sun, Moon, and wind or the occurrence of rain can be influenced by one’s thoughts or by the manipulation of some type of symbolic representation of these physical phenomena.

“Magical thinking became an important topic with the rise of sociology and anthropology in the 19th century. It was argued that magical thinking is an integral feature of most religious beliefs, such that one’s inner experience, often in participation with a higher power, could influence the course of events in the physical world.

“Prominent early theorists suggested that magical thinking characterized traditional, non-Western cultures, which contrasted with the more developmentally advanced rational-scientific thought found in industrialized Western cultures. Magical thinking, then, was tied to religion and ‘primitive’ cultures and considered developmentally inferior to the scientific reasoning found in more ‘advanced’ Western cultures.” [6]

Recent converts are notorious for their intolerance of whatever they just left behind[7] and are therefore the least likely to play along with social convention. So, suppose you’re a recent convert from magical thinking and someone drops one of those refrigerator magnet aphorisms. You’ll weigh a lot of factors in the next instant, but sometimes there are just some things people need to stop believing, so you’ll go ahead and launch, and social peace-keeping be damned. You do that in part because you’re aware of your own susceptibility to temptation. This is from Psychology Today[8]:

“How many times a day do you either cross your fingers, knock on wood, or worry that your good luck will turn on you? When two bad things happen to you, do you cringe in fear of an inevitable third unfortunate event? Even those of us who ‘know better’ are readily prone to this type of superstitious thinking.

“Further defying logic, we also readily believe in our own psychic powers: You’re thinking of a friend when all of a sudden your phone beeps to deliver a new text from that very person. It’s proof positive that your thoughts caused your friend to contact you at that very moment! … These are just a few examples of the type of mind tricks to which we so readily fall prey.”

The article provides a list of seven “mind tricks” taken from psychology writer Matthew Hutson’s book The 7 Laws of Magical Thinking, and invites us to “See how long it takes you to recognize some of your own mental foibles.” Here’s the list, with abbreviated commentary from the article:

  1. “Objects carry essences. We attribute special properties to items that belong or once belonged to someone we love, is famous, or has a particular quality we admire… the objects are just objects, and despite their connection with special people in our lives, they have no inherent ability to transmit those people’s powers to us.
  2. “Symbols have power. Humans have a remarkable tendency to impute meaning not only to objects but to abstract entities. We imbue these symbols with the ability to affect actual events in our lives.
  3. “Actions have distant consequences. In our constant search to control the outcomes of events in our unpredictable lives, we build up a personal library of favorite superstitious rituals or thoughts.
  4. “The mind knows no bounds. We are often impressed by the apparent coincidence that occurs when a person we’re thinking about suddenly contacts us. For just that moment, we believe the event “proves” that we’re psychic.
  5. “The soul lives on. [Why] do adults hold on so stubbornly to the belief that the mind can continue even after its seat (the brain) is no longer alive? The answer, in part, comes from the terror that we feel about death.
  6. “The world is alive. We attribute human-like qualities to everything from our pets to our iPhones. We read into the faces of our pets all sorts of human emotions such as humor, disappointment, and guilt. If our latest technological toy misbehaves, we yell at it and assume it has some revenge motive it needs to satisfy.
  7. “Everything happens for a reason. The most insidious form of magical thinking is our tendency to believe that there is a purpose or destiny that guides what happens to us… For the same reason, we believe in luck, fate, and chance.”

Magical thinking is one of my personal bugaboos, so my personal list would be longer than seven.[9] Those things make me twitch. You?

And speaking of mortality…

Miracles: Magic Gets Personal

We can (and do) make up all kinds of things about what it’s like “up there,” but we can’t really imagine it any more than we can our own death. There’s a lot of research about why that’s so[10], but as a practical matter we have to imagine death while we’re still alive in the here and now, when to do it properly we’d have to be there and then — a problem that explains the popularity of books that some call “heavenly tourism,” about people who go there and come back to tell us about it.[11]

We want our heroes and loved ones looking down on us because we miss them. Losing them makes us feel small, helpless, and powerless — like children. So we draw pictures of clouds and robes and harps and locate them there. Childish? Sure. But preferable to the idea that “they” vanished when their body and brain stopped biologically functioning. Why we like one over the other wouldn’t be clear if we stepped back and thought about it, but we don’t. Instead we’re so freaked out about the trip down the River Styx that we follow convention.

For the same reasons, praying for a miracle that staves off death persists in the face of little evidence to support it:[12]

“Writing Fingerprints of God, my 2009 book about the science of spirituality, gave me an excuse to ask a question that I never openly considered before leaving Christian Science, one that was unusually freighted: Is there any scientific evidence, anything beyond the realm of anecdote, that prayer heals?

“It turns out, the evidence is mixed. Beginning in the 1980s, we’ve seen a rash of prayer studies. Some seemed to show that patients who were prayed for recovered more quickly from heart attacks. Another study found that prayer physically helped people living with AIDS.

“But for every study suggesting that prayer heals a person’s body, there is another one showing that prayer has no effect — or even makes you worse. Does prayer help people with heart problems in a coronary care unit? Researchers at the Mayo Clinic found no effect. Does it benefit people who needed to clear their arteries using angioplasty? Not according to researchers at Duke. In another study, prayer did not ease the plight of those on kidney dialysis machines. And don’t even mention skin warts: Researchers found that people who received prayer saw the number of warts actually increase slightly, compared with those who received no prayer.

“The most famous study, and probably the most damaging for advocates of healing prayer, was conducted by Harvard researcher Herbert Benson in 2006. He looked at the recovery rates of patients undergoing cardiac bypass surgery. Those patients who knew they were receiving prayer actually did worse than those who did not know they were receiving prayer.

“In the end, there is no conclusive evidence from double-blind, randomized studies that suggests that intercessory prayer works.

“Prayer studies are a ‘wild goose chase that violate everything we know about the universe,’ Richard Sloan, professor of behavioral medicine at Columbia University Medical Center and author of Blind Faith, told me: ‘There are no plausible mechanisms that account for how somebody’s thoughts or prayers can influence the health of another person. None.’”

“And yet,” the author continues, “science has embraced a sliver of my childhood faith, a century after Mary Baker Eddy ‘discovered’ Christian Science in the late 1800s. If scientists don’t buy intercessory prayer, most do agree that there is a mind-body connection.” She also finds some connections in “another new ‘science,’ called ‘neurotheology,’” citing how the stimulation of certain brain areas can deliver the same sensations as meditation, contemplative prayer, spiritual ecstasy, and even out-of-body experiences. As a result, she wonders if the brain might act as a kind of radio: “Is the brain wired to connect with a dimension of reality that our physical senses cannot perceive?”

“Researchers have tried to replicate such out-of-body experiences, which are always after-the-fact anecdotes that cannot be tested. These experiences, they say, suggest that consciousness can exist separate from the brain — in other words, that there may be a transcendent reality that we tap into when brain functioning ceases.

“I am not asking you to believe that consciousness can continue when the brain is not functioning, that there is a God who answers prayer, or that people who pray or meditate connect with another reality. I’m not asking you to believe that all mystical or inexplicable experiences are simply the interaction of chemicals in the brain or firings of the temporal lobe. That’s the point: You don’t have to choose. Because neither side possesses the slam-dunk argument, the dispositive evidence that proves that there is a God, or there isn’t.”

I.e., she’s saying that the impermeable curtain of death means we can’t prove or disprove either the brain-as-a-radio theory or the materialist belief that when your body stops so do you. Thus we’re free to choose, and one’s as viable as the other. Obviously, unlike the Psychology Today writer, this ex-Christian Scientist is not a committed debunker. On the other hand, her reference to the lack of “dispositive evidence that proves that there is a God, or there isn’t” takes us straight to the ultimate debunking target.

Debunking God (or not)

God is the ultimate debunking target (patriotism is a close second), and the “New Atheists”[13] are the ultimate God debunkers. They’ve also been roundly criticized for being as fundamentalist and evangelical as the fundamentalists and evangelicals they castigate.[14] That’s certainly how I respond to them. I discovered them when I was fresh in my awareness that I’d become an atheist. I put their books on my reading list, read a couple, and deleted the rest. I’d left the fighting fundamentalists behind, and had no desire to rejoin the association. On the other hand, I am grateful to them for making it easier for the rest of us to come out as atheist – something that current social convention makes more difficult than coming out gay.[15]

From what I can tell, there are lots of people like me who didn’t become atheists by being clear-thinking and purposeful;[16] it was just something that happened over time, until one day they checked the “none” box beside “religious affiliation.” Atheism wasn’t an intellectual trophy we tried to win; it was a neighborhood we wandered into one day and were surprised to find we had a home there. As one writer said,

“My belief in God didn’t spontaneously combust—it faded.

“I wasn’t the only kid who stopped believing. A record number of young Americans (35 percent) report no religious affiliation, even though 91 percent of us grew up in religiously affiliated households.

“Our disbelief was gradual. Only 1 percent of Americans raised with religion who no longer believe became unaffiliated through a onetime “crisis of faith.” Instead, 36 percent became disenchanted, and another 7 percent said their views evolved.

“It’s like believing in Santa Claus. Psychologists Thalia Goldstein and Jaqueline Woolley have found that children’s disbelief in Santa Claus is progressive, not instantaneous. First kids think that the Santa in the mall or library is real, then they think he’s not real but still magically communicates with the actual Santa, and so on, until they finally realize that Santa is composed of costumed actors. “Kids don’t just turn [belief] off,” Goldstein says.

“Likewise, losing faith happens in pieces.”[17]

It seems fitting we would exit religion that way, since it’s the way many of us got into it in the first place. Yes, some people seem to have those Damascus Road conversions[18], or maybe a less dramatic “come to Jesus meeting,” as a friend of mine says, but more often religion just kind of seeps into us from the surrounding culture.

“I used to love this illustrated children’s Bible my mom gave me. Long-faced Jonah inside a yawning blue whale felt warm and right. My brain made these feelings. When we enjoy religious or associated experiences, like snuggling up with Mom reading the Bible, our brain’s reward circuits activate. Over time, religious ideas become rewarding in and of themselves. This is a powerful, unconscious motivation to keep believing.

“When I began to see my colorful Bible as boring and childish, those same reward circuits likely became less active. Religious experiences produced less pleasure. This happens involuntarily in people with Parkinson’s disease, which compromises the brain’s reward centers. [That is why] people who develop Parkinson’s are much more likely to lose their faith.”[19]

The New Magic – Or, maybe I’m just skeptical about skepticism.

But then, it’s common that, having been debunked of religion, we transfer that same commitment to something else – maybe magical thinking or some other unverifiable belief system. Turns out there’s a neurological reason for that: the neural pathways that ran our old belief system are still there, so we just load them with new content:

“For many years I believed in both creationism, with a God whose hand I could shake, and evolution, a cold, scientific world that cared nothing about me. Because when we lose faith, our brain’s preexisting belief networks don’t dissolve. They’re updated, like a wardrobe. ‘Even if someone abandons or converts [religions], it’s not like they’re throwing out all the clothes they own and now buying a whole new set,’ says Jordan Grafman, director of brain injury research at the Shirley Ryan AbilityLab and a professor at Northwestern University. ‘You pick and choose what you leave and what you keep.’

“New beliefs join the same neurological framework as old ones. It’s even possible that an existing belief network paves the way for additional beliefs. [Another researcher] has found that kids who believe in fantastical beings are more likely to believe in new ones invented by researchers. “I think it’s because they already have this network that [the new belief] kind of fits into,” she explains. Sometimes the new beliefs resemble the old ones; sometimes they don’t.

“Most non-religious people are ‘passionately committed to some ideology or other,’ explains Patrick McNamara, a neurology professor at Boston University School of Medicine. These passions function neurologically as ‘faux religions.’”[20]

And then, having been newly converted to our new faux religion, we’re set up for another eventual round of debunking.

Meet the new boss.

Same as the old boss.

[1] Here’s the original music video of Won’t Get Fooled Again. Watching it draws you all the way back into the turbulent, polarizing 60’s — if you remember them, that is — and the tone feels eerily similar to what we’re living with today. By the way, who said, “If you remember the 60’s, you really weren’t there”? Find out here.

[2] Ogden, Emily, Debunking Debunked, Aeon (Aug. 12, 2019). Ms. Ogden’s Aeon bio says she is “an associate professor of English at the University of Virginia, and an author whose work has appeared in Critical Inquiry, The New York Times and American Literature, among others. Her latest book is Credulity: A Cultural History of US Mesmerism (2018).” All quotes in this section are from this article.

[3] This social convention has been around a long time: like the Bible (something else we might like to debunk) says, “There is a time for everything under heaven … a time to keep silence, and a time to speak.”   Ecclesiastes 3: 7

[4] “This Article Won’t Change Your Mind,” The Atlantic (March 2017).

[5] “Why Facts Don’t Change Our Minds,” The New Yorker (Feb. 27, 2017).

[6] Encyclopedia Britannica.

[7] See Volck, Brian, The Convert’s Zeal, Image Journal (Aug. 22, 2019). See also this Pew Center report.

[8] 7 Ideas We Really Need to Stop Believing, Psychology Today (May 8, 2012).

[9] Mr. Hutson’s list is based on “a wealth of psychological evidence,” while mine comes from my own anecdotal judgment that magical thinking has led to all kinds of delusional decisions and disasters in my life. The irony of using my own subjective perspective to debunk my own life doesn’t escape me – it ranks right in there with The Who’s resorting to prayer in the hope they won’t be fooled again.

[10] Doubting death: how our brains shield us from mortal truth, The Guardian (Oct. 19, 2019).

[11] Like The Boy Who Came Back from Heaven, by Alex Malarkey. Yes, that’s his real name.

[12] The Science of Miracles, Medium (Feb. 7, 2019).

[13] Wikipedia.

[14] Wikipedia.

[15] What Atheists Can Learn From The Gay Rights Movement, The Washington Post (Apr. 3, 2013). Coming out as atheist is even trickier if you’re in the public eye: ‘I Prefer Non-Religious’: Why So Few US Politicians Come Out As Atheists, The Guardian (Aug. 3, 2019); The Last Taboo: It’s harder in America to come out as an atheist politician than a gay one. Why? Politico Magazine (Dec. 9, 2013)

[16] Such as Andrew L. Seidel, an “out-of-the-closet atheist” and author of The Founding Myth: Why Christian Nationalism Is Un-American (2019).

[17] Beaton, Caroline, What Happens to Your Brain When You Stop Believing in God: It’s like going off a drug, Vice (Mar. 28, 2017).

[18] The Acts of the Apostles 9: 1-9.

[19] Beaton, op. cit.

[20] Ibid.