A Talk at the Rock: How to Instantly Polarize a Crowd and End a Discussion

Areopagus (image from Wikipedia)

The Areopagus is a large rock outcropping in Athens, not far from the Acropolis, where in ancient times various legal, economic, and religious issues got a hearing. A Bible story about something that happened there two thousand years ago provides surprising insight on today’s hyper-polarized world.

Backstory:  A Dualistic Worldview

In the 17th Century, Frenchman René Descartes sorted reality into two categories: (1) the natural, physical world and (2) the unseen world of ideas, feelings, and beliefs. This duality was born of the times:

“Toward the end of the Renaissance period, a radical epistemological and metaphysical shift overcame the Western psyche. The advances of Nicolaus Copernicus, Galileo Galilei and Francis Bacon posed a serious problem for Christian dogma and its dominion over the natural world.

“In the 17th century, René Descartes’s dualism of matter and mind was an ingenious solution to the problem this created. ‘The ideas’ that had hitherto been understood as inhering in nature as ‘God’s thoughts’ were rescued from the advancing army of empirical science and withdrawn into the safety of a separate domain, ‘the mind’.

“On the one hand, this maintained a dimension proper to God, and on the other, served to ‘make the intellectual world safe for Copernicus and Galileo’, as the American philosopher Richard Rorty put it in Philosophy and the Mirror of Nature (1979).

“In one fell swoop, God’s substance-divinity was protected, while empirical science was given reign over nature-as-mechanism – something ungodly and therefore free game.”[1]

Descartes articulated this dualistic framework, but it had been around since prehistoric antiquity. It still persists today, and neurological research suggests the human brain comes pre-wired for it. This is from Psychology Today[2]:

“Recent research suggests that our brains may be pre-wired for dichotomized thinking. That’s a fancy name for thinking and perceiving in terms of two – and only two – opposing possibilities.

“Neurologists explored the activity of certain key regions of the human forebrain – the frontal lobe – trying to understand how the brain switches between tasks. Scientists generally accept the idea that the brain can only consciously manage one task at a time….

“However, some researchers are now suggesting that our brains can keep tabs on two tasks at a time, by sending each one to a different side of the brain. Apparently, we toggle back and forth, with one task being primary and the other on standby.

“Add a third task, however, and one of the others has to drop off the to-do list. Scans of brain activity during this task switching have led to the hypothesis that the brain actually likes handling things in pairs. Indeed, the brain itself is subdivided into two distinct half-brains, or hemispheres.

“Some researchers are now extending this reasoning to suggest that the brain has a built-in tendency, when confronted by complex propositions, to selfishly reduce the set of choices to just two.

“The popular vocabulary routinely signals this dichotomizing mental habit: ‘Are you with us, or against us?’ ‘If you’re not part of the solution, you’re part of the problem.’

“These research findings might help explain how and why the public discourse of our culture has become so polarized and rancorous, and how we might be able to replace it with a more intelligent conversation.

“One of our popular clichés is ‘Well, there are two sides to every story.’ Why only two? Maybe the less sophisticated and less rational members of our society are caught up in duplex thinking, because the combination of a polarized brain and unexamined emotional reflexes keep them there.”

“Less sophisticated and less rational” … the author’s ideological bias is showing, but the “unexamined emotional reflexes” finger points at both ends of the polarized spectrum. And because our brains love the status quo and resist change, we hunker down on our assumptions and biases. True, the balance can shift more gradually over time – objectivity ascended during the 18th Century’s Age of Enlightenment, and Romanticism pushed back in the 19th – but usually it takes something drastic like disruptive innovation, tragedy, or violence to knock us off our equilibrium. Absent that, we’re usually not up for the examination required to separate what we objectively know from what we subjectively believe — it’s all just reality, and as long as it’s working, we’re good. If we’re forced to examine and adjust, we’ll most likely take our cues from our cultural context:

“Each of us conducts our lives according to a set of assumptions about how things work: how our society functions, its relationship with the natural world, what’s valuable, and what’s possible. This is our worldview, which often remains unquestioned and unstated but is deeply felt and underlies many of the choices we make in our lives. We form our worldview implicitly as we grow up, from our family, friends, and culture, and, once it’s set, we’re barely aware of it unless we’re presented with a different worldview for comparison. The unconscious origin of our worldview makes it quite inflexible.

“There is [a] potent force shaping the particular patterns we perceive around us. It’s what anthropologists call culture. Just as language shapes the perception of an infant as she listens to the patterns of sounds around her, so the mythic patterns of thought informing the culture a child is born into will literally shape how that child constructs meaning in the world. Every culture holds its own worldview: a complex and comprehensive model of how the universe works and how to act within it. This network of beliefs and values determines the way in which each child in that culture makes sense of the universe.”[3]

Culture has been sculpting the human brain ever since our earliest ancestors began living complex social lives millions of years ago. It’s only when the cultural balance runs off the rails that our brains scramble to reset, and we’re stressed while they’re at it. We would do well not to wait until then, and learn how to embrace both ends of the dualistic spectrum, argues one computational biologist[4]:

“Neuroscience was part of the dinner conversation in my family, often a prerequisite for truth. Want to talk about art? Not without neuroscience. Interested in justice? You can’t judge someone’s sanity without parsing scans of the brain. But though science helps us refine our thinking, we’re hindered by its limits: outside of mathematics, after all, no view of reality can achieve absolute certainty. Progress creates the illusion that we are moving toward deeper knowledge when, in fact, imperfect theories constantly lead us astray.

“The conflict is relevant in this age of anti-science, with far-Right activists questioning climate change, evolution and other current finds. In his book Enlightenment Now (2018), Steven Pinker describes a second assault on science from within mainstream scholarship and the arts. But is that really bad? Nineteenth-century Romanticism was the first movement to take on the Enlightenment – and we still see its effects in such areas as environmentalism, asceticism and the ethical exercise of conscience.

“In our new era of Enlightenment, we need Romanticism again. In his speech ‘Politics and Conscience’ (1984), the Czech dissident Václav Havel, discussing factories and smokestacks on the horizon, explained just why: ‘People thought they could explain and conquer nature – yet … they destroyed it and disinherited themselves from it.’ Havel was not against industry, he was just for labour relations and protection of the environment.

“The issues persist. From use of GMO seeds and aquaculture to assert control over the food chain to military strategies for gene-engineering bioweapons, power is asserted through patents and financial control over basic aspects of life. The French philosopher Michel Foucault in The Will to Knowledge (1976) referred to such advancements as ‘techniques for achieving the subjugation of bodies and the control of populations’. With winners and losers in the new arena, it only makes sense that some folks are going to push back.

“We are now on the verge of a new revolution in control over life through the gene-editing tool Crispr-Cas9, which has given us the ability to tinker with the colour of butterfly wings and alter the heritable genetic code of humans. In this uncharted territory, where ethical issues are rife, we can get blindsided by sinking too much of our faith into science, and losing our sense of humanity or belief in human rights.

“Science should inform values such as vaccine and climate policy, but it must not determine all values…. With science becoming a brutal game of market forces and patent controls, the skeptics and Romantics among us must weigh in, and we already are.”

That’s probably good advice, but we need to push through a lot of cultural status quo to get there. That’s especially true because the 20th Century brought us change at ever-accelerating rates — objective reality went spinning away and we crashed into the extreme belief end of the spectrum:

“Each of us is on a spectrum somewhere between the poles of rational and irrational. We all have hunches we can’t prove and superstitions that make no sense. What’s problematic is going overboard — letting the subjective entirely override the objective; thinking and acting as if opinions and feelings are just as true as facts.

“The American experiment, the original embodiment of the great Enlightenment idea of intellectual freedom, whereby every individual is welcome to believe anything she wishes, has metastasized out of control. In America nowadays, those more exciting parts of the Enlightenment idea have swamped the sober, rational, empirical parts.

“Little by little for centuries, then more and more and faster and faster during the past half century, we Americans have given ourselves over to all kinds of magical thinking, anything-goes relativism, and belief in fanciful explanation—small and large fantasies that console or thrill or terrify us. And most of us haven’t realized how far-reaching our strange new normal has become.”[5]

When we can agree that our conflict is a matter of my data vs. yours, we can debate rationally. But when it’s my beliefs vs. yours, what used to be discourse dissolves into stonewalling and shouting. Belief seeks its own perfection by eliminating doubt, and therefore devolves into fundamentalism, where discussion is a sign of doubt, punishable as heresy. Fundamentalism can be secular or religious – it’s the dynamic, not the content, that matters:

“Fundamentalism is a mind-set. The iconography and language it employs can be either religious or secular or both, but because it dismisses all alternative viewpoints as inferior and unworthy of consideration it is anti-thought. This is part of its attraction. It fills a human desire for self-importance, for hope and the dream of finally attaining paradise. It creates a binary world of absolutes, of good and evil. It provides a comforting emotional certitude. It is used to elevate our cultural, social, and economic systems above others. It is used to justify imperial hubris, war, intolerance and repression as a regrettable necessity in the march of human progress. The fundamentalist murders, plunders and subjugates in the name of humankind’s most exalted ideals. Those who oppose the fundamentalists are dismissed as savages, condemned as lesser breeds of human beings, miscreants led astray by Satan or on the wrong side of Western civilization. The nation is endowed with power and military prowess, fundamentalists argue, because God or our higher form of civilization makes us superior. It is our right to dominate and rule. The core belief systems of these secular and religious antagonists are identical. They are utopian. They will lead us out of the wilderness to the land of milk and honey.”[6]

Fundamentalism is where the open mind goes into lockdown. Objectivity loses its grip, and the question “Are you with us, or against us?” gives way to its declarative version, “If you’re not with us, you’re against us.”[7] Dualistic thinking ceases to be merely a source of “popular clichés” and becomes instead a rigid disincentive to public discourse, as competing polarized beliefs dig in for a grinding, maddening war of attrition. What used to be public discourse is lost in a no-man’s land of intellectual wreckage created by each side’s incessant lobbing of ideological bombs at the other’s entrenched subjective positions. Each side is convinced it has a God’s-eye view of reality, therefore God is on its side, which motivates securing its position by all necessary means.

A Talk at the Rock

The Christian scriptures illustrate how all this works in a story from one of the Apostle Paul’s missionary journeys.

“Now while Paul was… at Athens, his spirit was provoked within him as he saw that the city was full of idols. So, he reasoned in the synagogue with the Jews and the devout persons, and in the marketplace every day with those who happened to be there. Some of the Epicurean and Stoic philosophers also conversed with him. And some said, ‘What does this babbler wish to say?’ Others said, ‘He seems to be a preacher of foreign divinities’—because he was preaching Jesus and the resurrection. And they took him and brought him to the Areopagus, saying, ‘May we know what this new teaching is that you are presenting? For you bring some strange things to our ears. We wish to know therefore what these things mean.’”[8]

The Epicureans and Stoics were the materialists of their day – their thinking leaned toward the objective side of the dualism. When Paul came to town advocating ideas (the subjective end of the dualism), their brain patterning couldn’t process Paul’s worldview. They needed time, so they invited Paul to a Talk at the Rock (the Areopagus).

At this point, the author of the story – widely believed to be the same “Luke the beloved physician”[9] who wrote the Gospel of Luke – inserts a biased editorial comment signaling that nothing’s going to come of this, because “all the Athenians and the foreigners who lived there would spend their time in nothing except telling or hearing something new.”[10] I.e., reasonable consideration – public discourse – was going to be a waste of time. But Paul had prepared some culturally sensitive opening remarks:

“So Paul, standing in the midst of the Areopagus, said: ‘Men of Athens, I perceive that in every way you are very religious. For as I passed along and observed the objects of your worship, I found also an altar with this inscription: To the unknown god. What therefore you worship as unknown, this I proclaim to you.’”

He then offers up the idea of substituting his ‘foreign god’ for the Athenians’ statuary, altars, and temples:

“The God who made the world and everything in it, being Lord of heaven and earth, does not live in temples made by man, nor is he served by human hands, as though he needed anything, since he himself gives to all mankind life and breath and everything. And he made from one man every nation of mankind to live on all the face of the earth, having determined allotted periods and the boundaries of their dwelling place, that they should seek God, and perhaps feel their way toward him and find him.”

You can sense the crowd’s restless murmuring and shuffling feet, but then Paul goes back to cultural bridge-building:

“Yet he is actually not far from each one of us, for ‘In him we live and move and have our being’ [referring to a passage from Epimenides of Crete], and as even some of your own poets have said, ‘For we are indeed his offspring’ [from Aratus’s poem Phainomena].”

Nice recovery, Paul. So far so good. This feels like discourse, what the Rock is for. But Paul believes that the Athenians’ practice of blending the unseen world of their gods with their physical craftsmanship of statuary, altars, and temples (a practice the church would later perfect) is idolatry, and in his religious culture back home, idolatry had been on the outs since the Golden Calf.[11] At this point, Paul takes off the cultural kid gloves and goes fundamentalist:

“Being then God’s offspring, we ought not to think that the divine being is like gold or silver or stone, an image formed by the art and imagination of man. The times of ignorance God overlooked, but now he commands all people everywhere to repent, because he has fixed a day on which he will judge the world in righteousness by a man whom he has appointed; and of this he has given assurance to all by raising him from the dead.”

That’s precisely the point where he loses the crowd — well, most of them; some were willing to give him another shot, and there were even a couple of fresh converts:

“Now when they heard of the resurrection of the dead, some mocked. But others said, ‘We will hear you again about this.’ So Paul went out from their midst. But some men joined him and believed, among whom also were Dionysius the Areopagite and a woman named Damaris and others with them.”

“Some men joined him and believed….” That’s all there was left for them to do: believe or not believe. You’re either with us or against us.

Paul had violated the cultural ethics of a Talk at the Rock. It was about reasonable discourse; he made it a matter of belief, saying in effect, “Forget your social customs and ethics; my God is going to hurt you if you keep it up.” With that, the conclave became irretrievably polarized, and the session was over.

Paul triggered this cultural dynamic constantly on his journeys – for example, a few years later, when the Ephesus idol-building guild figured out the economic implications of Paul’s belief system[12]:

“About that time there arose no little disturbance concerning the Way.  For a man named Demetrius, a silversmith, who made silver shrines of Artemis, brought no little business to the craftsmen. These he gathered together, with the workmen in similar trades, and said, ‘Men, you know that from this business we have our wealth. And you see and hear that not only in Ephesus but in almost all of Asia this Paul has persuaded and turned away a great many people, saying that gods made with hands are not gods. And there is danger not only that this trade of ours may come into disrepute but also that the temple of the great goddess Artemis may be counted as nothing, and that she may even be deposed from her magnificence, she whom all Asia and the world worship.’ When they heard this they were enraged and were crying out, ‘Great is Artemis of the Ephesians!’”

Jesus had previously taken a whip to the merchants in the Temple in Jerusalem.[13] Apparently Demetrius and his fellow craftsmen saw the same thing coming to them, and made a preemptive strike. The scene quickly spiraled out of control:

“So the city was filled with the confusion, and they rushed together into the theater, dragging with them Gaius and Aristarchus, Macedonians who were Paul’s companions in travel.  But when Paul wished to go in among the crowd, the disciples would not let him. And even some of the Asiarchs, who were friends of his, sent to him and were urging him not to venture into the theater. Now some cried out one thing, some another, for the assembly was in confusion, and most of them did not know why they had come together.”

A local official finally quelled the riot:

“Some of the crowd prompted Alexander, whom the Jews had put forward. And Alexander, motioning with his hand, wanted to make a defense to the crowd. But when they recognized that he was a Jew, for about two hours they all cried out with one voice, ‘Great is Artemis of the Ephesians!’

“And when the town clerk had quieted the crowd, he said, ‘Men of Ephesus, who is there who does not know that the city of the Ephesians is temple keeper of the great Artemis, and of the sacred stone that fell from the sky? Seeing then that these things cannot be denied, you ought to be quiet and do nothing rash. For you have brought these men here who are neither sacrilegious nor blasphemers of our goddess. If therefore Demetrius and the craftsmen with him have a complaint against anyone, the courts are open, and there are proconsuls. Let them bring charges against one another. But if you seek anything further, it shall be settled in the regular assembly. For we really are in danger of being charged with rioting today, since there is no cause that we can give to justify this commotion.’ And when he had said these things, he dismissed the assembly.”[14]

It Still Happens Today

I spent years in the evangelical church – we were fundamentalists, but didn’t want to admit it – where Paul’s Talk at the Rock was held up as the way not to “share your faith.” Forget the public discourse — you can’t just “spend [your] time in nothing except telling or hearing something new,” you need to lay the truth on them so they can believe or not believe, and if they don’t, you need to “shake the dust off your feet”[15] and get out of there. These days, we see both secular and religious cultural institutions following that advice.

Will we ever learn?

[1] “How the Dualism of Descartes Ruined Our Mental Health,” Medium (May 10, 2019)

[2] Karl Albrecht, “The Tyranny of Two,” Psychology Today (Aug 18, 2010)

[3] Jeremy Lent, The Patterning Instinct: A Cultural History of Humanity’s Search for Meaning (2017)

[4] Jim Kozubek, “The Enlightenment Rationality Is Not Enough: We Need A New Romanticism,” Aeon (Apr. 18, 2018)

[5] Kurt Andersen, Fantasyland: How America Went Haywire, a 500-Year History (2017)

[6] Chris Hedges, I Don’t Believe in Atheists: The Dangerous Rise of the Secular Fundamentalist (2008)

[7] The latter came from Jesus himself – see Matthew 12: 30 and Luke 11: 23. Jesus was a belief man through and through. More on that another time.

[8] The Acts of the Apostles 17: 16-20.

[9] Paul’s letter to the Colossians 4: 14.

[10] Acts 17: 21.

[11] Exodus 32.

[12] Acts 19: 23-41

[13] Matthew 21: 12-17; John 2: 13-21

[14] Acts 19: 33-41

[15] Matthew 10:14.

Subjective Science

quantum mechanics formula

What happened to spark all the recent scientific interest in looking for consciousness in the brains of humans and animals, in insects, and … well, everywhere? (Including not just the universe, but also the theoretical biocentric universe and quantum multiverses.)

“It has been said that, if the 20th century was the age of physics, the 21st will be the age of the brain. Among scientists today, consciousness is being hailed as one of the prime intellectual challenges. My interest in the subject is not in any particular solution to the origin of consciousness – I believe we’ll be arguing about that for millennia to come – but rather in the question: why is consciousness perceived as a ‘problem’? How exactly did it become a problem? And given that it was off the table of science for so long, why is it now becoming such a hot research subject?”

I Feel Therefore I Am — How Exactly Did Consciousness Become A Problem? And why, after years off the table, is it a hot research subject now?  Aeon Magazine (Dec. 1, 2015)

From what I can tell, two key sparks started the research fire:  (1) the full implications of quantum mechanics finally set in, and (2) machines learned how to learn.

(1)  Quantum Mechanics:  Science Goes Subjective. Ever since Descartes set up his dualistic reality a few hundred years ago, we’ve been able to trust that science could give us an objective, detached, rational, factual view of the observable universe, while philosophy and religion could explore the invisible universe where subjectivity reigns. But that handy boundary between the two was torn in the early 20th Century, when quantum mechanics found that subjectivity reigns at the subatomic level, where what is observed depends on what researchers decide ahead of time to look for. Scientists spent the rest of the 20th Century trying to restore objectivity to their subatomic lab work, but eventually had to concede.

 “Physicists began to realise that consciousness might after all be critical to their own descriptions of the world. With the advent of quantum mechanics they found that, in order to make sense of what their theories were saying about the subatomic world, they had to posit that the scientist-observer was actively involved in constructing reality.

“At the subatomic level, reality appeared to be a subjective flow in which objects sometimes behave like particles and other times like waves. Which facet is manifest depends on how the human observer is looking at the situation.

“Such a view appalled many physicists, who fought desperately to find a way out, and for much of the 20th century it still seemed possible to imagine that, somehow, subjectivity could be squeezed out of the frame, leaving a purely objective description of the world.

“In other words, human subjectivity is drawing forth the world.”

I Feel Therefore I Am

(2)  Machines Learned to Learn. Remember “garbage in, garbage out”? It used to be that computers had to be supervised — they only did what we told them to do, and could only use the information we gave them. Not anymore. Now their “minds” are free to sort through the garbage on their own and make up their own rules about what to keep or throw out. Because of this kind of machine learning, we now have computers practicing law and medicine, handling customer service, writing the news, composing music, writing novels and screenplays, creating art… all those things we used to think needed human judgment and feelings. Google wizard and overall overachiever Sebastian Thrun[1] explains the new machine learning in this conversation with TED Curator Chris Anderson:

 “Artificial intelligence and machine learning is about 60 years old and has not had a great day in its past until recently. And the reason is that today, we have reached a scale of computing and datasets that was necessary to make machines smart. The new thing now is that computers can find their own rules. So instead of an expert deciphering, step by step, a rule for every contingency, what you do now is you give the computer examples and have it infer its own rules.

“20 years ago the computers were as big as a cockroach brain. Now they are powerful enough to really emulate specialized human thinking. And then the computers take advantage of the fact that they can look at much more data than people can.”
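Thrun describes two eras: an expert hand-writing a rule for every contingency, versus a computer inferring its own rule from examples. A few lines of Python can sketch that shift. Everything here (the toy task, the numbers, the function names) is invented for illustration; it is a minimal caricature of supervised learning, not Thrun’s actual systems:

```python
def hand_written_rule(temp):
    # The old way: a human expert encodes the rule directly.
    return "hot" if temp > 30 else "cold"

def learn_rule(examples):
    """The new way: infer the rule from labeled examples.
    Here we learn a temperature threshold by placing it midway
    between the warmest 'cold' example and the coolest 'hot' one."""
    hots = [t for t, label in examples if label == "hot"]
    colds = [t for t, label in examples if label == "cold"]
    threshold = (max(colds) + min(hots)) / 2
    # Return a classifier built from the data, not from an expert.
    return lambda temp: "hot" if temp > threshold else "cold"

# The "rule" now comes from examples; add more data, get a new rule.
examples = [(35, "hot"), (31, "hot"), (28, "cold"), (10, "cold")]
learned_rule = learn_rule(examples)

print(learned_rule(33))  # classified by the inferred threshold (29.5)
print(learned_rule(20))
```

Real systems infer vastly more complex rules from vastly more data, but the inversion is the same: the programmer supplies examples and a learning procedure, and the machine supplies the rule.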

No wonder science got rattled. Like the rest of us, it was comfortable with all the Cartesian dualisms that kept the world neatly sorted out:  science vs. religion,[2] objective vs. subjective, knowledge vs. belief, humanity vs. technology…. But now all these opposites are blurring together in a subjective vortex while non-human intelligence looks on and comments about it.

Brave New World, indeed. How shall we respond to it?

More next time.

[1] Sebastian Thrun’s TED bio describes him as “an educator, entrepreneur and troublemaker. After a long life as a professor at Stanford University, Thrun resigned from tenure to join Google. At Google, he founded Google X, home to self-driving cars and many other moonshot technologies. Thrun also founded Udacity, an online university with worldwide reach, and Kitty Hawk, a ‘flying car’ company. He has authored 11 books, 400 papers, holds 3 doctorates and has won numerous awards.”

[2] For an alternative to the science-religion dualism, see Science + Religion:  The science-versus-religion opposition is a barrier to thought. Each one is a gift, rather than a threat, to the other, Aeon Magazine (Nov. 21, 2019)

 

Knowledge, Conviction, and Belief [9]:  Reckoning With Mystery

pontius pilate

“What is truth?”
Pontius Pilate
John 18:38 (NIV)

On the science side of Cartesian dualism, truth must be falsifiable — it must be possible, at least in principle, to prove it untrue. On the religious side, to falsify is to doubt, doubt becomes heresy, and heresy meets the bad end it deserves.

Neither side likes mystery, because both are trying to satisfy a more primal need:  to know, explain, and be right. It’s a survival skill:  we need to be right about a lot of things to stay alive, and there’s nothing more primal to a mortal being than staying alive. Mystery is nice if you’ve got the time, but at some point it won’t help you eat and avoid being eaten.

Science tackles mysteries with experiments and theories, religion with doctrine and ritual. Both try to nail their truth down to every “jot and tittle,” while mystery bides its time, aloof and unimpressed.

I once heard a street preacher offer his rationale for the existence of God. “Think about how big the universe is,” he said. “It’s too big for me to understand. There has to be a God behind it.” That’s God explained on a street corner: “I don’t get it, so there has to be a higher-up who does. His name is God.” The preacher’s God has the expansive consciousness we lack, and if we don’t always understand, that’s part of the deal:

“For my thoughts are not your thoughts,
neither are your ways my ways,”
declares the Lord.
“As the heavens are higher than the earth,
so are my ways higher than your ways
and my thoughts than your thoughts.”

Isaiah 55:8-9 (NIV)

Compare that to a cognitive scientist’s take on our ability to perceive reality, as explained in this video.

“Many scientists believe that natural selection brought our perception of reality into clearer and deeper focus, reasoning that growing more attuned to the outside world gave our ancestors an evolutionary edge. Donald Hoffman, a cognitive scientist at the University of California, Irvine, thinks that just the opposite is true. Because evolution selects for survival, not accuracy, he proposes that our conscious experience masks reality behind millennia of adaptions for ‘fitness payoffs’ – an argument supported by his work running evolutionary game-theory simulations. In this interview recorded at the HowTheLightGetsIn Festival from the Institute of Arts and Ideas in 2019, Hoffman explains why he believes that perception must necessarily hide reality for conscious agents to survive and reproduce. With that view serving as a springboard, the wide-ranging discussion also touches on Hoffman’s consciousness-centric framework for reality, and its potential implications for our everyday lives.”

The video is 40 minutes long, but a few minutes will suffice to make today’s point. Prof. Hoffman admits his theory is counterintuitive and bizarre, but promises he’s still working on it (moving it toward falsifiability). I personally favor scientific materialism’s explanation of consciousness, and I actually get the theory behind Prof. Hoffman’s ideas, but when I watch this I can’t help but think it’s amazing how far science and religion will go to define their versions of how things work. That’s why I quit trying to read philosophy: all that meticulous logic trying to block all exits and close all loopholes, but sooner or later some mystery leaks out a seam, and when it does the whole thing seems overwrought and silly.

The street preacher thinks reality is out there, and we’re given enough brain to both get by and know when to quit trying and trust a higher intelligence that has it all figured out. The scientist starts in here, with the brain (“the meat that thinks”), then tries to describe how it creates a useful enough version of reality to help us get by in the external world.

The preacher likes the eternal human soul; the scientist goes for the bio-neuro-cultural construction we call the self. Positions established, each side takes and receives metaphysical potshots from the other. For example, when science clamors after the non-falsifiable multiverse theory of quantum physics, the intelligent designers gleefully point out that the so-called scientists are leapers of faith just like them:

“Unsurprisingly, the folks at the Discovery Institute, the Seattle-based think-tank for creationism and intelligent design, have been following the unfolding developments in theoretical physics with great interest. The Catholic evangelist Denyse O’Leary, writing for the Institute’s Evolution News blog in 2017, suggests that: ‘Advocates [of the multiverse] do not merely propose that we accept faulty evidence. They want us to abandon evidence as a key criterion for acceptance of their theory.’ The creationists are saying, with some justification: look, you accuse us of pseudoscience, but how is what you’re doing in the name of science any different? They seek to undermine the authority of science as the last word on the rational search for truth.

“And, no matter how much we might want to believe that God designed all life on Earth, we must accept that intelligent design makes no testable predictions of its own. It is simply a conceptual alternative to evolution as the cause of life’s incredible complexity. Intelligent design cannot be falsified, just as nobody can prove the existence or non-existence of a philosopher’s metaphysical God, or a God of religion that ‘moves in mysterious ways’. Intelligent design is not science: as a theory, it is simply overwhelmed by its metaphysical content.”

But Is It Science? Aeon Magazine, Oct. 7, 2019.

And so it goes. But what would be so wrong with letting mystery stay… well, um… mysterious?

We’ll look at that next time.

Knowledge, Conviction, and Belief [7]: Science and Metaphysics

college photo op

It’s a fine September day during freshman orientation week, and we’re a photo op:  a circle of students on the grass outside a stately hall of higher education. Our leader asked us to tell each other what we hope to learn while we’re here. “I’m interested in metaphysics,” one girl says. I don’t know what that means, and being a clueless frosh, I don’t bother to find out until decades later. [1]

This is from Online Etymology:

metaphysics (n.)

“the science of the inward and essential nature of things,” 1560s, plural of Middle English metaphisik, methaphesik (late 14c.), “branch of speculation which deals with the first causes of things.” … See meta- + physics.

“The name was given c.70 B.C.E. … to the customary ordering of [Aristotle’s Physics], but it was misinterpreted by Latin writers as meaning “the science of what is beyond the physical.”

Metaphysics is what happens when scholars think about the big picture. René Descartes was doing metaphysics when he split reality into seen vs. unseen, knowable vs. mysterious:  “He developed a metaphysical dualism that distinguishes radically between mind… and matter.” Encyclopaedia Britannica.

Metaphysics catches some grief about whether it’s a legitimate academic discipline, but counters that you can’t think about… well, anything… without first thinking about the bigger picture:

“Metaphysics, the philosophical study whose object is to determine the real nature of things—to determine the meaning, structure, and principles of whatever is insofar as it is. Although this study is popularly conceived as referring to anything excessively subtle and highly theoretical and although it has been subjected to many criticisms, it is presented by metaphysicians as the most fundamental and most comprehensive of inquiries, inasmuch as it is concerned with reality as a whole.”

Encyclopaedia Britannica

Even science has to concede metaphysics’ primacy:

“It turns out to be impossible even to formulate a scientific theory without metaphysics, without first assuming some things we can’t actually prove, such as the existence of an objective reality and the invisible entities we believe to exist in it.

“This is a bit awkward because it’s difficult, if not impossible, to gather empirical facts without first having some theoretical understanding of what we think we’re doing.

“Choosing between competing theories that are equivalently accommodating of the facts can become a matter for personal judgment, or our choice of metaphysical preconceptions or prejudices.”

But Is It Science? Aeon Magazine, Oct. 7, 2019. (The remaining quotes are also from this source.)

Scientific inquiry begins subjectively — with beliefs and assumptions that can’t be scrutinized by the scientific method — a detail which, if left unattended, puts scientific inquiry on a par with, let’s say, late night dorm conjecture about the meaning of life. Science tries to rise above by requiring that its theories be falsifiable:  they have to be expressed in a way that lets you objectively prove them wrong.

“The philosopher Karl Popper argued that what distinguishes a scientific theory from pseudoscience and pure metaphysics is the possibility that it might be falsified on exposure to empirical data. In other words, a theory is scientific if it has the potential to be proved wrong.”

Falsifiability means you can’t appeal to metaphysics to avoid empirical scrutiny. Trouble is, our brains, once wired with our beliefs, make sure our experience conforms to them. But still…

“For me at least, there has to be a difference between science and pseudoscience; between science and pure metaphysics, or just plain ordinary bullshit.”

To be reliable, science has to make sure its metaphysics and physics line up in actual experience. For example, whatever your metaphysical theory of the grand cosmos, you still need physics to make your GPS work:

“When you use Google Maps on your smartphone, you draw on a network of satellites orbiting Earth at 20,000 kilometres, of which four are needed for the system to work, and between six and 10 are ‘visible’ from your location at any time. Each of these satellites carries a miniaturised atomic clock, and transmits precise timing and position data to your device that allow you to pinpoint your location and identify the fastest route to the pub.

“But without corrections based on Albert Einstein’s special and general theories of relativity, the Global Positioning System would accumulate clock errors, leading to position errors of up to 11 kilometres per day. Without these rather abstract and esoteric – but nevertheless highly successful – theories of physics, after a couple of days you’d have a hard time working out where on Earth you are.

“In February 2019, the pioneers of GPS were awarded the Queen Elizabeth Prize for Engineering. The judges remarked that ‘the public may not know what [GPS] stands for, but they know what it is’. This suggests a rather handy metaphor for science. We might scratch our heads about how it works, but we know that, when it’s done properly, it does.”
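The 11-kilometre figure in the quote above can be sanity-checked with a short back-of-the-envelope calculation. The Python sketch below uses textbook constants; the circular-orbit and non-rotating-Earth simplifications (and the variable names) are mine, not the article’s.

```python
# Back-of-the-envelope check of the ~11 km/day GPS error quoted above.
# Simplifications: circular GPS orbit, non-rotating Earth.

G_M = 3.986004418e14     # Earth's gravitational parameter, m^3/s^2
c = 299_792_458.0        # speed of light, m/s
r_earth = 6.371e6        # mean Earth radius, m
r_orbit = 2.656e7        # GPS orbital radius (~20,200 km altitude), m
seconds_per_day = 86_400

# General relativity: satellite clocks sit higher in Earth's gravity
# well, so they run *fast* relative to ground clocks.
gr_rate = (G_M / r_earth - G_M / r_orbit) / c**2

# Special relativity: orbital speed makes satellite clocks run *slow*.
v_orbit = (G_M / r_orbit) ** 0.5
sr_rate = v_orbit**2 / (2 * c**2)

# Net drift per day, and the ranging error it would cause uncorrected.
net_drift_per_day = (gr_rate - sr_rate) * seconds_per_day   # seconds/day
position_error_per_day = net_drift_per_day * c              # metres/day

print(f"net clock drift: {net_drift_per_day * 1e6:.1f} microseconds/day")
print(f"uncorrected position error: {position_error_per_day / 1000:.1f} km/day")
```

Run as-is, this gives roughly 38 microseconds of net clock drift and about 11.5 kilometres of accumulated position error per day, in the neighborhood of the article’s “up to 11 kilometres per day.”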

More about falsifiability vs. faith, subjective vs. objective, real vs. fantasy, and other Cartesian dualisms next time.

[1] Now that I know what “metaphysics” means, I realize I was interested in it, too. In fact, metaphysics has been something of a defining pursuit of mine for most of my life, although less so lately. More on that another time.

Knowledge, Conviction, and Belief [3]: Dualism – Reality A and Reality B

Janus

We’ve been talking about dualistic thinking — the kind that leads us to think we live simultaneously in two realities.

Reality A is “life in the flesh” — bound by space and time and all the imperfections of what it means to be human. It is life carried on in our physical bodies, where our impressive but ultimately limited brains are in charge.

Reality B is “life in the spirit” — the eternal, perfect, transcendent, idealized, supernatural, original source that informs, explains, and guides its poorer counterpart.

This dualistic thinking says there’s more to life than meets the eye, that humans are an “eternal soul having a worldly existence.” The dualism sets up a cascade of derivative beliefs, for example:

There’s a difference between the Reality A identity and experience we actually have and the Reality B identity and experience we would have if we could rise above Reality A and live up to the idealized version of Reality B.

Every now and then, somebody gets lucky or gets saved or called, and gets to live out their Reality B destiny, which gives them and their lives a heightened sense of purpose and meaning.

But those are the chosen few, and they’re rare. For most of us, our ordinary selves and mundane lives are only a shadow of our “higher selves” and “greater potential.”

The chosen few can — and often do — provide guidance as to how we can do better, and we do well to find some compatible relation with one or more of them, but sometimes, in the right setting and circumstance, we might discover that we have receptors of our own that can receive signals from Reality B. We call this “enlightenment” or “conversion” or “salvation” or something like that, and it’s wonderful, blissful, and euphoric.

But most of the time, for the vast majority of us, Reality A is guided by a mostly one-way communication with Reality B — a sort of moment-by-moment data upload from A to B, where everything about us and our lives — every conscious and subconscious intent, motive, thought, word, and deed — gets stored in a failsafe beyond-time data bank. When our Reality A lives end, those records determine what happens next — they inform our next trip through Reality A, or set the stage for Reality B existence we’re really going to like or we’re really going to suffer.

Everybody pretty much agrees it’s useful to have good communication with or awareness of Reality B, because that helps us live better, truer, happier, more productive lives in Reality A, and because it creates a better data record when our Reality A existence ends and we pass over to Reality B.

And on it goes. No, we don’t express any of it that way:  our cultural belief systems and institutions — religious doctrines, moral norms, legal codes, academic fields of study, etc. — offer better-dressed versions. But it’s remarkable how some version of those beliefs finds its way into common notions about how life works.

At the heart of it all is our conviction — not knowledge — that this thing we consciously know as “me” is an independent self that remains intact and apart from the biological messiness of human life, able to choose its own beliefs, make its own decisions, and execute its own actions. In other words, we believe in consciousness, free will, and personal responsibility for what we are and do — and what we aren’t and don’t do — during what is only a sojourn — a short-term stay — on Earth.

Those beliefs explain why, for example, it bothers us so much when someone we thought we knew departs from their beginnings and instead displays a changed inner and outer expression of who they were when we thought we knew them. “Look who’s in the big town,” we say. Or we pity them and knock wood and declare thank goodness we’ve been lucky. Or we put them on the prayer chain or call them before the Inquisition… anything but entertain the idea that maybe Reality B isn’t there — along with all the belief it takes to create it — and that instead all we have is Reality A — we’re nothing but flesh and bone.

It’s almost impossible to think that way. To go there, we have to lay aside conviction and embrace knowledge.

Almost impossible.

Almost.

We’ll give it a try in the coming weeks.

Knowledge, Conviction, and Belief

“For I am convinced that neither death nor life, neither angels nor demons, neither the present nor the future, nor any powers, neither height nor depth, nor anything else in all creation, will be able to separate us from the love of God that is in Christ Jesus our Lord.”

Paul’s letter to the Romans 8:38-39 (NIV)

How did Paul know that? Why was he so convinced?

According to psychology and neuroscience, he didn’t know it, he was convinced of it. The difference reflects Cartesian dualism:  the belief that we can know things about the natural world through scientific inquiry, but in the supernatural world, truth is a matter of conviction.

Academics draw distinctions between these and other terms,[1] but in actual experience, the essence seems to be emotional content. Scientific knowledge is thought to be emotionally detached — it wears a lab coat, pores over data, expresses conclusions intellectually. It believes its conclusions, but questioning them is hardwired into scientific inquiry; science therefore must hold its truth in an open hand — all of which establishes a reliable sense of what is “real.” Conviction, on the other hand, comes with heart, with a compelling sense of certainty. The emotional strength of conviction makes questioning its truth — especially religious convictions — something to be discouraged or punished.

Further, while knowledge may come with a Eureka! moment — that satisfying flash of suddenly seeing clearly — conviction often comes with a sense of being overtaken by an authority greater than ourselves — of being apprehended and humbled, left frightened and grateful for a second chance.

Consider the etymologies of conviction and convince:

conviction (n.)

mid-15c., “the proving or finding of guilt of an offense charged,” from Late Latin convictionem (nominative convictio) “proof, refutation,” noun of action from past-participle stem of convincere “to overcome decisively,” from com-, here probably an intensive prefix (see com-), + vincere “to conquer” (from nasalized form of PIE root *weik- (3) “to fight, conquer”).

Meaning “mental state of being convinced or fully persuaded” is from 1690s; that of “firm belief, a belief held as proven” is from 1841. In a religious sense, “state of being convinced one has acted in opposition to conscience, admonition of the conscience,” from 1670s.

convince (v.)

1520s, “to overcome in argument,” from Latin convincere “to overcome decisively,” from assimilated form of com-, here probably an intensive prefix (see com-), + vincere “to conquer” (from nasalized form of PIE root *weik- (3) “to fight, conquer”). Meaning “to firmly persuade or satisfy by argument or evidence” is from c. 1600. Related: Convinced; convincing; convincingly.

To convince a person is to satisfy his understanding as to the truth of a certain statement; to persuade him is, by derivation, to affect his will by motives; but it has long been used also for convince, as in Luke xx. 6, “they be persuaded that John was a prophet.” There is a marked tendency now to confine persuade to its own distinctive meaning. [Century Dictionary, 1897]

Both knowledge and conviction, and the needs they serve, are evolutionary survival skills:  we need what they give us to be safe, individually and collectively. Knowledge satisfies our need to be rational, to think clearly and logically, to distinguish this from that, to put things into dependable categories. Conviction satisfies the need to be moved, and also to be justified — to feel as though you are in good standing in the cosmology of how life is organized.

Culturally, conviction is often the source of embarrassment, guilt, and shame, all of which have a key social function — they are part of the glue that holds society together. Becoming aware that we have transgressed societal laws or behavioral norms (the “conviction of sin”) often brings not just chastisement but also remorse and relief — to ourselves and to others in our community:  we’ve been arrested, apprehended, overtaken by a corrective authority, and saved from doing further harm to ourselves and others.

Knowledge and conviction also have something else in common:  both originate in the brain’s complex tangle of neural networks:

“It is unlikely that beliefs as wide-ranging as justice, religion, prejudice or politics are simply waiting to be found in the brain as discrete networks of neurons, each encoding for something different. ‘There’s probably a whole combination of things that go together,’ says [Peter Halligan, a psychologist at Cardiff University].

“And depending on the level of significance of a belief, there could be several networks at play. Someone with strong religious beliefs, for example, might find that they are more emotionally drawn into certain discussions because they have a large number of neural networks feeding into that belief.”

Where Belief Is Born, The Guardian (June 30, 2005).

And thus protected by the knowledge and convictions wired into our neural pathways, we make our way through this precarious thing called “life.”

More next time.

[1] Consider also the differences between terms like conviction and belief, and fact, opinion, belief, and prejudice.

Mirror, Mirror, on the Wall…

mirror mirror

“If you were to ask the average person in the street about their self, they would most likely describe the individual who inhabits their body. They believe they are more than just their bodies. Their bodies are something their selves control. When we look in the mirror, we regard the body as a vessel we occupy.

“The common notion [is] that our self is an essential entity at the core of our existence that holds steady throughout our life. The ego experiences life as a conscious, thinking person with a unique historical background that defines who he or she is. This is the ‘I’ that looks back in the bathroom mirror and reflects who is the ‘me.’”

The Self Illusion:  How the Social Brain Creates Identity, Bruce Hood[1] (2012)

The idea that we are a self riding through life in a body is deeply ingrained in western thinking. Descartes gets most of the credit for it, but its religious and philosophical roots are much more ancient. (The same is true of the eastern, Buddhist idea that there’s no such thing as a self. We’ll talk origins another time.)

Descartes’ dualism has the curious effect of excusing us from thinking too closely about what we mean by it. It does this by locating the body in the physical, biological, natural world while placing the self in a transcendent realm that parallels the natural world but remains apart from it. The body, along with the rest of the natural world, is the proper subject of scientific inquiry, but the self and its ethereal realm remain inscrutable, the province of faith and metaphor, religion and the humanities. David P. Barash[2] captures the implications of this dichotomy in Through a Glass Brightly:  Using Science to See Our Species as We Really Are (2018):

“Science differs from theology and the humanities in that it is made to be improved on and corrected over time… By contrast, few theologians, or fundamentalist religious believers of any stripe, are likely to look kindly on ‘revelations’ that suggest corrections to their sacred texts. In 1768, Baron d’Holbach, a major figure in the French Enlightenment, had great fun with this. In his satire Portable Theology (written under the pen name Abbe Bernier, to hide from the censors), d’Holbach defined Religious Doctrine as ‘what every good Christian must believe or else be burned, be it in this world or the next. The dogmas of the Christian religion are immutable decrees of God, who cannot change His mind except when the Church does.’

“By contrast, science not only is open to improvement and modifications but also is to a large extent defined by this openness. Whereas religious practitioners who deviate from their traditions are liable to be derogated — and sometimes killed — for this apostasy …, science thrives on correction and adjustment, aiming not to enshrine received wisdom and tradition but to move its insights closer to correspondence with reality as found in the natural world.”

Attempts to bridge the realms of body and soul end up in pseudo-science, eventually discredited and often silly. Consider for example the ether (or sometimes “aether”) — a term that since Plato and Aristotle has been applied to both the rarefied air only the gods can breathe and the stuff light moves through in interstellar space.[3]

You don’t need to be a poet or a prophet to think the self is inviolate. It’s just so obvious to most of us that there’s a self inside who watches and knows all about us — who in fact is us. We experience it as that never-silent internal voice — observing and commenting, often critiquing, sometimes shaming — that always seems to be accurate. We’ve been hearing it for as long as we can remember:  it’s embedded in our brain’s memory banks, all the way back to when we first started remembering things and using language to describe and record them.

We have always been aware that we are aware:
we don’t just observe, we observe ourselves observing.

Hence the belief that we are body and soul seems not worth challenging. Which is why, in keeping with this blog’s purpose, we’re going to do precisely that.

Cartesian dualism is foundational to self-awareness and to our cultural beliefs and institutions. It guides everything from religious faith to criminal culpability, health and wellbeing to mental illness. And so much more. As a result, taking a closer look will not only challenge our perceptions of what is real, it will shake reality itself. This inquiry won’t be easy. Again from The Self Illusion:

“Understanding that the self could be an illusion is really difficult… Our self seems so convincing, so real to us. But then again, many aspects of our experience are not what they seem.

“Psychologist Susan Blackmore makes the point that the word ‘illusion’ does not mean that it does not exist — rather an illusion is not what it seems. We all certainly experience some form of self, but what we experience is a powerful deception generated by our brains for our own benefit.”

That’s where we’re going. Hang tight.

[1] Bruce Hood is an experimental psychologist at the University of Bristol. He specializes in developmental cognitive neuroscience.

[2] David P. Barash is an evolutionary biologist and professor of psychology emeritus at the University of Washington.

[3] For a useful primer, see The Eternal Quest for Aether, the Cosmic Stuff That Never Was, Popular Mechanics (Oct 19, 2018).