A Talk at the Rock: How to Instantly Polarize a Crowd and End a Discussion

Areopagus (image from Wikipedia)

The Areopagus is a large rock outcropping in Athens, not far from the Acropolis, where in ancient times various legal, economic, and religious issues got a hearing. A Bible story about something that happened there two thousand years ago provides surprising insight on today’s hyper-polarized world.

Backstory:  A Dualistic Worldview

In the 17th Century, Frenchman René Descartes sorted reality into two categories: (1) the natural, physical world and (2) the unseen world of ideas, feelings, and beliefs. This duality was born of the times:

“Toward the end of the Renaissance period, a radical epistemological and metaphysical shift overcame the Western psyche. The advances of Nicolaus Copernicus, Galileo Galilei and Francis Bacon posed a serious problem for Christian dogma and its dominion over the natural world.

“In the 17th century, René Descartes’s dualism of matter and mind was an ingenious solution to the problem this created. ‘The ideas’ that had hitherto been understood as inhering in nature as ‘God’s thoughts’ were rescued from the advancing army of empirical science and withdrawn into the safety of a separate domain, ‘the mind’.

“On the one hand, this maintained a dimension proper to God, and on the other, served to ‘make the intellectual world safe for Copernicus and Galileo’, as the American philosopher Richard Rorty put it in Philosophy and the Mirror of Nature (1979).

“In one fell swoop, God’s substance-divinity was protected, while empirical science was given reign over nature-as-mechanism – something ungodly and therefore free game.”[1]

Descartes articulated this dualistic framework, but it had been around since prehistoric antiquity. It still persists today, and neurological research suggests the human brain comes pre-wired for it. This is from Psychology Today[2]:

“Recent research suggests that our brains may be pre-wired for dichotomized thinking. That’s a fancy name for thinking and perceiving in terms of two – and only two – opposing possibilities.

“Neurologists explored the activity of certain key regions of the human forebrain – the frontal lobe – trying to understand how the brain switches between tasks. Scientists generally accept the idea that the brain can only consciously manage one task at a time….

“However, some researchers are now suggesting that our brains can keep tabs on two tasks at a time, by sending each one to a different side of the brain. Apparently, we toggle back and forth, with one task being primary and the other on standby.

“Add a third task, however, and one of the others has to drop off the to-do list. Scans of brain activity during this task switching have led to the hypothesis that the brain actually likes handling things in pairs. Indeed, the brain itself is subdivided into two distinct half-brains, or hemispheres.

“Some researchers are now extending this reasoning to suggest that the brain has a built-in tendency, when confronted by complex propositions, to selfishly reduce the set of choices to just two.

“The popular vocabulary routinely signals this dichotomizing mental habit: ‘Are you with us, or against us?’ ‘If you’re not part of the solution, you’re part of the problem.’

“These research findings might help explain how and why the public discourse of our culture has become so polarized and rancorous, and how we might be able to replace it with a more intelligent conversation.

“One of our popular clichés is ‘Well, there are two sides to every story.’ Why only two? Maybe the less sophisticated and less rational members of our society are caught up in duplex thinking, because the combination of a polarized brain and unexamined emotional reflexes keep them there.”

“Less sophisticated and less rational” … the author’s ideological bias is showing, but the “unexamined emotional reflexes” finger points at both ends of the polarized spectrum. And because our brains love the status quo and resist change, we hunker down on our assumptions and biases. True, the balance can shift more gradually, over time – the way objectivity ascended during the 18th Century’s Age of Enlightenment, but Romanticism pushed back in the 19th – but usually it takes something drastic like disruptive innovation, tragedy, violence, etc. to knock us off our equilibrium. Absent that, we’re usually not up for the examination required to separate what we objectively know from what we subjectively believe — it’s all just reality, and as long as it’s working, we’re good. If we’re forced to examine and adjust, we’ll most likely take our cues from our cultural context:

“Each of us conducts our lives according to a set of assumptions about how things work: how our society functions, its relationship with the natural world, what’s valuable, and what’s possible. This is our worldview, which often remains unquestioned and unstated but is deeply felt and underlies many of the choices we make in our lives. We form our worldview implicitly as we grow up, from our family, friends, and culture, and, once it’s set, we’re barely aware of it unless we’re presented with a different worldview for comparison. The unconscious origin of our worldview makes it quite inflexible.

“There is [a] potent force shaping the particular patterns we perceive around us. It’s what anthropologists call culture. Just as language shapes the perception of an infant as she listens to the patterns of sounds around her, so the mythic patterns of thought informing the culture a child is born into will literally shape how that child constructs meaning in the world. Every culture holds its own worldview: a complex and comprehensive model of how the universe works and how to act within it. This network of beliefs and values determines the way in which each child in that culture makes sense of the universe.”[3]

Culture has been sculpting the human brain ever since our earliest ancestors began living complex social lives millions of years ago. It’s only when the cultural balance runs off the rails that our brains scramble to reset, and we’re stressed while they’re at it. We would do well not to wait until then, and learn how to embrace both ends of the dualistic spectrum, argues one computational biologist[4]:

“Neuroscience was part of the dinner conversation in my family, often a prerequisite for truth. Want to talk about art? Not without neuroscience. Interested in justice? You can’t judge someone’s sanity without parsing scans of the brain. But though science helps us refine our thinking, we’re hindered by its limits: outside of mathematics, after all, no view of reality can achieve absolute certainty. Progress creates the illusion that we are moving toward deeper knowledge when, in fact, imperfect theories constantly lead us astray.

“The conflict is relevant in this age of anti-science, with far-Right activists questioning climate change, evolution and other current finds. In his book Enlightenment Now (2018), Steven Pinker describes a second assault on science from within mainstream scholarship and the arts. But is that really bad? Nineteenth-century Romanticism was the first movement to take on the Enlightenment – and we still see its effects in such areas as environmentalism, asceticism and the ethical exercise of conscience.

“In our new era of Enlightenment, we need Romanticism again. In his speech ‘Politics and Conscience’ (1984), the Czech dissident Václav Havel, discussing factories and smokestacks on the horizon, explained just why: ‘People thought they could explain and conquer nature – yet … they destroyed it and disinherited themselves from it.’ Havel was not against industry, he was just for labour relations and protection of the environment.

“The issues persist. From use of GMO seeds and aquaculture to assert control over the food chain to military strategies for gene-engineering bioweapons, power is asserted through patents and financial control over basic aspects of life. The French philosopher Michel Foucault in The Will to Knowledge (1976) referred to such advancements as ‘techniques for achieving the subjugation of bodies and the control of populations’. With winners and losers in the new arena, it only makes sense that some folks are going to push back.

“We are now on the verge of a new revolution in control over life through the gene-editing tool Crispr-Cas9, which has given us the ability to tinker with the colour of butterfly wings and alter the heritable genetic code of humans. In this uncharted territory, where ethical issues are rife, we can get blindsided by sinking too much of our faith into science, and losing our sense of humanity or belief in human rights.

“Science should inform values such as vaccine and climate policy, but it must not determine all values…. With science becoming a brutal game of market forces and patent controls, the skeptics and Romantics among us must weigh in, and we already are.”

That’s probably good advice, but we need to push through a lot of cultural status quo to get there. That’s especially true because the 20th Century brought us change at ever-accelerating rates — objective reality went spinning away and we crashed into the extreme belief end of the spectrum:

“Each of us is on a spectrum somewhere between the poles of rational and irrational. We all have hunches we can’t prove and superstitions that make no sense. What’s problematic is going overboard — letting the subjective entirely override the objective; thinking and acting as if opinions and feelings are just as true as facts.

“The American experiment, the original embodiment of the great Enlightenment idea of intellectual freedom, whereby every individual is welcome to believe anything she wishes, has metastasized out of control. In America nowadays, those more exciting parts of the Enlightenment idea have swamped the sober, rational, empirical parts.

“Little by little for centuries, then more and more and faster and faster during the past half century, we Americans have given ourselves over to all kinds of magical thinking, anything-goes relativism, and belief in fanciful explanation—small and large fantasies that console or thrill or terrify us. And most of us haven’t realized how far-reaching our strange new normal has become.”[5]

When we can agree that our conflict is a matter of my data vs. yours, we can debate rationally. But when it’s my beliefs vs. yours, what used to be discourse dissolves into stonewalling and shouting. Belief seeks its own perfection by eliminating doubt, and therefore devolves into fundamentalism, where discussion is a sign of doubt, punishable as heresy. Fundamentalism can be secular or religious – it’s the dynamic, not the content, that matters:

“Fundamentalism is a mind-set. The iconography and language it employs can be either religious or secular or both, but because it dismisses all alternative viewpoints as inferior and unworthy of consideration it is anti-thought. This is part of its attraction. It fills a human desire for self-importance, for hope and the dream of finally attaining paradise. It creates a binary world of absolutes, of good and evil. It provides a comforting emotional certitude. It is used to elevate our cultural, social, and economic systems above others. It is used to justify imperial hubris, war, intolerance and repression as a regrettable necessity in the march of human progress. The fundamentalist murders, plunders and subjugates in the name of humankind’s most exalted ideals. Those who oppose the fundamentalists are dismissed as savages, condemned as lesser breeds of human beings, miscreants led astray by Satan or on the wrong side of Western civilization. The nation is endowed with power and military prowess, fundamentalists argue, because God or our higher form of civilization makes us superior. It is our right to dominate and rule. The core belief systems of these secular and religious antagonists are identical. They are utopian. They will lead us out of the wilderness to the land of milk and honey.”[6]

Fundamentalism is where the open mind goes into lockdown. Objectivity loses its grip and the question “Are you with us, or against us?” gives way to its declarative version, “If you’re not with us, you’re against us.”[7] Dualistic thinking ceases to be merely a source of “popular clichés” and becomes instead a rigid disincentive to public discourse, as competing polarized beliefs dig in for a grinding, maddening war of attrition. What used to be public discourse is lost in a no-man’s land of intellectual wreckage created by each side’s incessant lobbing of ideological bombs at the other’s entrenched subjective positions. Each side is convinced it has a God’s-eye view of reality, therefore God is on its side, which motivates securing its position by all necessary means.

A Talk at the Rock

The Christian scriptures illustrate how all this works in a story from one of the Apostle Paul’s missionary journeys.

“Now while Paul was… at Athens, his spirit was provoked within him as he saw that the city was full of idols. So, he reasoned in the synagogue with the Jews and the devout persons, and in the marketplace every day with those who happened to be there. Some of the Epicurean and Stoic philosophers also conversed with him. And some said, ‘What does this babbler wish to say?’ Others said, ‘He seems to be a preacher of foreign divinities’—because he was preaching Jesus and the resurrection. And they took him and brought him to the Areopagus, saying, ‘May we know what this new teaching is that you are presenting? For you bring some strange things to our ears. We wish to know therefore what these things mean.’”[8]

The Epicureans and Stoics were the materialists of their day – their thinking leaned toward the objective side of the dualism. When Paul came to town advocating ideas (the subjective end of the dualism), their brain patterning couldn’t process Paul’s worldview. They needed time, so they invited Paul to a Talk at the Rock (the Areopagus).

At this point, the author of the story – widely believed to be the same “Luke the beloved physician”[9] who wrote the Gospel of Luke – inserts a biased editorial comment that signals that nothing’s going to come of this because “all the Athenians and the foreigners who lived there would spend their time in nothing except telling or hearing something new.”[10] I.e., reasonable consideration – public discourse – was going to be a waste of time. But Paul had prepared some culturally sensitive opening remarks:

“So Paul, standing in the midst of the Areopagus, said: ‘Men of Athens, I perceive that in every way you are very religious. For as I passed along and observed the objects of your worship, I found also an altar with this inscription: To the unknown god. What therefore you worship as unknown, this I proclaim to you.’”

He then offers up the idea of substituting his ‘foreign god’ for the Athenians’ statuary, altars, and temples:

“The God who made the world and everything in it, being Lord of heaven and earth, does not live in temples made by man, nor is he served by human hands, as though he needed anything, since he himself gives to all mankind life and breath and everything. And he made from one man every nation of mankind to live on all the face of the earth, having determined allotted periods and the boundaries of their dwelling place, that they should seek God, and perhaps feel their way toward him and find him.”

You can sense the crowd’s restless murmuring and shuffling feet, but then Paul goes back to cultural bridge-building:

“Yet he is actually not far from each one of us, for ‘In him we live and move and have our being’ [referring to a passage from Epimenides of Crete], and as even some of your own poets have said, ‘For we are indeed his offspring’ [from Aratus’s poem Phainomena].”

Nice recovery, Paul. So far so good. This feels like discourse, what the Rock is for. But Paul believes that the Athenians’ practice of blending the unseen world of their gods with their physical craftsmanship of statuary, altars, and temples (a practice the church would later perfect) is idolatry, and in his religious culture back home, idolatry had been on the outs since the Golden Calf.[11] At this point, Paul takes off the cultural kid gloves and goes fundamentalist:

“Being then God’s offspring, we ought not to think that the divine being is like gold or silver or stone, an image formed by the art and imagination of man. The times of ignorance God overlooked, but now he commands all people everywhere to repent, because he has fixed a day on which he will judge the world in righteousness by a man whom he has appointed; and of this he has given assurance to all by raising him from the dead.”

That’s precisely the point where he loses the crowd — well, most of them; some were willing to give him another shot, and there were even a couple of fresh converts:

“Now when they heard of the resurrection of the dead, some mocked. But others said, ‘We will hear you again about this.’ So Paul went out from their midst. But some men joined him and believed, among whom also were Dionysius the Areopagite and a woman named Damaris and others with them.”

“Some men joined him and believed….” That’s all there was left for them to do: believe or not believe. You’re either with us or against us.

Paul had violated the cultural ethics of a Talk at the Rock. It was about reasonable discourse; he made it a matter of belief, saying, in effect, “Forget your social customs and ethics; my God is going to hurt you if you keep this up.” With that, the conclave became irretrievably polarized, and the session was over.

Paul triggered this cultural dynamic constantly on his journeys – for example, a few years later, when the Ephesus idol-building guild figured out the economic implications of Paul’s belief system[12]:

“About that time there arose no little disturbance concerning the Way.  For a man named Demetrius, a silversmith, who made silver shrines of Artemis, brought no little business to the craftsmen. These he gathered together, with the workmen in similar trades, and said, ‘Men, you know that from this business we have our wealth. And you see and hear that not only in Ephesus but in almost all of Asia this Paul has persuaded and turned away a great many people, saying that gods made with hands are not gods. And there is danger not only that this trade of ours may come into disrepute but also that the temple of the great goddess Artemis may be counted as nothing, and that she may even be deposed from her magnificence, she whom all Asia and the world worship.’ When they heard this they were enraged and were crying out, ‘Great is Artemis of the Ephesians!’”

Jesus had previously taken a whip to the merchants in the Temple in Jerusalem.[13] Apparently Demetrius and his fellow craftsmen saw the same thing coming to them, and made a preemptive strike. The scene quickly spiraled out of control:

“So the city was filled with the confusion, and they rushed together into the theater, dragging with them Gaius and Aristarchus, Macedonians who were Paul’s companions in travel.  But when Paul wished to go in among the crowd, the disciples would not let him. And even some of the Asiarchs, who were friends of his, sent to him and were urging him not to venture into the theater. Now some cried out one thing, some another, for the assembly was in confusion, and most of them did not know why they had come together.”

A local official finally quelled the riot:

“Some of the crowd prompted Alexander, whom the Jews had put forward. And Alexander, motioning with his hand, wanted to make a defense to the crowd. But when they recognized that he was a Jew, for about two hours they all cried out with one voice, ‘Great is Artemis of the Ephesians!’

“And when the town clerk had quieted the crowd, he said, ‘Men of Ephesus, who is there who does not know that the city of the Ephesians is temple keeper of the great Artemis, and of the sacred stone that fell from the sky? Seeing then that these things cannot be denied, you ought to be quiet and do nothing rash. For you have brought these men here who are neither sacrilegious nor blasphemers of our goddess. If therefore Demetrius and the craftsmen with him have a complaint against anyone, the courts are open, and there are proconsuls. Let them bring charges against one another. But if you seek anything further, it shall be settled in the regular assembly. For we really are in danger of being charged with rioting today, since there is no cause that we can give to justify this commotion.’ And when he had said these things, he dismissed the assembly.”[14]

It Still Happens Today

I spent years in the evangelical church – we were fundamentalists, but didn’t want to admit it – where Paul’s Talk at the Rock was held up as the way not to “share your faith.” Forget the public discourse — you can’t just “spend [your] time in nothing except telling or hearing something new,” you need to lay the truth on them so they can believe or not believe, and if they don’t, you need to “shake the dust off your feet”[15] and get out of there. These days, we see both secular and religious cultural institutions following that advice.

Will we ever learn?

[1] “How the Dualism of Descartes Ruined Our Mental Health,” Medium (May 10, 2019)

[2] Karl Albrecht, “The Tyranny of Two,” Psychology Today (Aug 18, 2010)

[3] Jeremy Lent, The Patterning Instinct: A Cultural History of Humanity’s Search for Meaning (2017)

[4] Jim Kozubek, “The Enlightenment Rationality Is Not Enough: We Need A New Romanticism,” Aeon (Apr. 18, 2018)

[5] Andersen, Kurt, Fantasyland: How America Went Haywire, a 500-Year History (2017)

[6] Hedges, Chris, I Don’t Believe in Atheists: The Dangerous Rise of the Secular Fundamentalist (2008)

[7] The latter came from Jesus himself – see the Gospels of Matthew 12: 30 and Luke 11: 23. Jesus was a belief man through and through. More on that another time.

[8] The Acts of the Apostles 17: 16-20.

[9] Paul’s letter to the Colossians 4: 14.

[10] Acts 17: 21.

[11] Exodus 32.

[12] Acts 19: 23-41

[13] Matthew 21: 12-17; John 2: 13-21

[14] Acts 19: 33-41

[15] Matthew 10:14.

Knowledge, Conviction, and Belief [2]: Cultural Belief and Mass Delusion

We think we have an independent ability to think and believe as we like, to know this or be convinced about that. But that’s not the whole story:  our outlook is also shaped by our cultural context.

As we’ve seen, when enough people agree about what is true — whether they “know” it or are “convinced” of it — their agreement becomes a cultural belief system — for example, as reflected in a religion, country, neighborhood, business, athletic team, or other institution. Cultural belief systems are wired into the neural pathways of individual members, and as the culture coalesces, its belief system takes on a life of its own through a process known as “emergence.” As the emergent belief system is increasingly reflected in and reinforced by cultural institutions, it is increasingly patterned into the neural pathways of the culture’s members, where it defines individual and collective reality and sense of identity. The belief system becomes The Truth, defining what the group and its members know and are convinced of.

Throughout this process, whether the culture’s beliefs are true in any non-subjective sense loses relevance. The result is what physician and author Paul Singh refers to as “mass delusion”:

“[When a conviction moves from an individual to being widely held], its origins are rooted in a belief system rather than in an individual’s pathological condition. It is a mass delusion of the sort that poses no immediate threat to anyone or society. Mass delusions can become belief systems that are passed from generation to generation.”

The Great Illusion:  The Myth of Free Will, Consciousness, and the Self, Paul Singh (2016)

For a dramatic example of this concept in action, consider an experience described by Jesse Jackson:

“There is nothing more painful to me at this stage in my life than to walk down the street and hear footsteps… then turn around and see somebody white and feel relieved.”

Despite his lifetime of civil rights leadership, Jackson was betrayed by his cultural neural conditioning. What he experienced was not just personal to him; it conformed to a cultural belief system. The particular “mass delusion” involved has been confirmed by clinical research.

“Matthew Lieberman, a psychologist at the University of California, recently showed how beliefs help people’s brains categorise others and view objects as good or bad, largely unconsciously. He demonstrated that beliefs (in this case prejudice or fear) are most likely to be learned from the prevailing culture.

“When Lieberman showed a group of people photographs of expressionless black faces, he was surprised to find that the amygdala — the brain’s panic button — was triggered in almost two-thirds of cases. There was no difference in the response between black and white people.”

Where Belief Is Born, The Guardian (June 30, 2005)

When cultural beliefs are not constantly reinforced — by cultural norms of thought, language, practice, etc. — the neural networks that support them can weaken, allowing opportunity for new beliefs.

“‘Beliefs are mental objects in the sense that they are embedded in the brain,’ says [Kathleen Taylor, a neuroscientist at Oxford University] ‘If you challenge [beliefs] by contradiction, or just by cutting them off from the stimuli that make you think about them, then they are going to weaken slightly. If that is combined with very strong reinforcement of new beliefs, then you’re going to get a shift in emphasis from one to the other.’”

Where Belief Is Born

This helps to explain, for example, why religious believers are more likely to “fall away” if they are “out of fellowship.” Or what can happen to a student off to college, a world traveler, or an immigrant. It also helps to explain why leaders and despots alike can manipulate brain networks to create cultural belief systems to fit their desired ends:

“In her book on the history of brainwashing, Taylor describes how everyone from the Chinese thought reform camps of the last century to religious cults have used systematic methods to persuade people to change their ideas, sometimes radically.

“The mechanism Taylor describes is similar to the way the brain learns normally. In brainwashing though, the new beliefs are inserted through a much more intensified version of that process.

“The first step is to isolate a person and control what information they receive. Their former beliefs need to be challenged by creating uncertainty. New messages need to be repeated endlessly. And the whole thing needs to be done in a pressured, emotional environment.

“Stress affects the brain such that it makes people more likely to fall back on things they know well – stereotypes and simple ways of thinking,” says Taylor.

“This manipulation of belief happens every day. Politics is a fertile arena, especially in times of anxiety.”

Where Belief Is Born

More next time.

The Hostilities of Change:  Surprise, Death, and War

Storming of the Bastille

“Ideas that require people to reorganize their picture of the world provoke hostility.”

Science historian James Gleick,
in his bestseller Chaos: The Making of a New Science

We looked last time at neuro-cultural resistance to change, and asked what it takes to overcome it.

It takes a paradigm shift — which, according to Merriam-Webster, is “an important change that happens when the usual way of thinking about or doing something is replaced by a new and different way.” Physicist and philosopher Thomas Kuhn coined the term in a work that was itself a paradigm shift in how we view the dynamics of change.

“The Kuhn Cycle is a simple cycle of progress described by Thomas Kuhn in 1962 in his seminal work The Structure of Scientific Revolutions… Kuhn challenged the world’s current conception of science, which was that it was a steady progression of the accumulation of new ideas. In a brilliant series of reviews of past major scientific advances, Kuhn showed this viewpoint was wrong. Science advanced the most by occasional revolutionary explosions of new knowledge, each revolution triggered by introduction of new ways of thought so large they must be called new paradigms. From Kuhn’s work came the popular use of terms like ‘paradigm,’ ‘paradigm shift,’ and ‘paradigm change.’”


Our cultural point of view determines what we see and don’t see, and blinds us to new awareness and perspective. That’s why our visions of a “new normal” are often little more than uninspiring extrapolations of the past.[1] Paradigm shifts offer something more compelling: they shock our consciousness so much that we never see things the same again; they stun us into abrupt about-faces. Without that, inertia keeps us moving in the direction we’re already going. If we even think of change, cognitive dissonance makes things uncomfortable, and if we go ahead with it anyway, things can get nasty in a hurry.

“People and systems resist change. They change only when forced to or when the change offers a strong advantage. If a person or system is biased toward its present paradigm, then a new paradigm is seen as inferior, even though it may be better. This bias can run so deep that two paradigms are incommensurate. They are incomparable because each side uses their own paradigm’s rules to judge the other paradigm. People talk past each other. Each side can ‘prove’ their paradigm is better.

“Writing in his chapter on The Resolution of Revolutions, Thomas Kuhn states that:

‘If there were but one set of scientific problems, one world within which to work on them, and one set of standards for their solution, paradigm competition might be settled more or less routinely by some process like counting the number of problems solved by each.

‘But in fact these conditions are never met. The proponents of competing paradigms are always at least slightly at cross-purposes. Neither side will grant all the non-empirical assumptions that the other needs in order to make its case.

‘Though each may hope to convert the other to his way of seeing his science and its problems, neither may hope to prove his case. The competition between paradigms is not the sort of battle that can be solved by proofs.’”


What does it take to detonate a logjam-busting “revolutionary explosion of new knowledge”? Three possibilities:

The Element of Surprise.[2] We’re not talking “Oh that’s nice!” surprise. We’re talking blinding flash of inspiration surprise — a eureka moment, moment of truth, defining moment — that changes everything forever, in a moment, in the twinkling of an eye. In religious terms, this is St. Paul’s conversion on the Damascus Road or St. Peter’s vision of extending the gospel to the gentiles. In those moments, both men became future makers, not future takers, embodying the view of another scientist and philosopher:

“The best way to predict the future is to create it.”[3]

A New Generation.  Without the element of surprise, paradigm shifts take a long time, if they happen at all.

“A new scientific truth does not triumph by convincing its opponents
and making them see the light, but rather because its opponents eventually die,
and a new generation grows up that is familiar with it.”[4]

In religious terms, that’s why the Exodus generation had to die off in 40 years in the wilderness, leaving a new generation for whom Moses’ new paradigm was the only one they’d ever known.

Violence.  Or, if the new paradigm’s champions can’t wait, they can resort to violence, brutality, persecution, war… the kinds of power-grabbing that have long polluted religion’s proselytizing legacy.

Surprise, death, violence… three ways to bring about a paradigm shift. That’s true in religion, science, or any other cultural institution.

More next time.

[1] Carl Richards, “There’s No Such Thing as the New Normal,” New York Times (December 20, 2010).

[2] Carl Richards, op. cit.

[3] The quote has been ascribed to a lot of different people, including Peter Drucker and computer scientist Alan Kay. But according to the Quote Investigator, “The earliest evidence appeared in 1963 in the book ‘Inventing the Future’ written by Dennis Gabor who was later awarded a Nobel Prize in Physics for his work in holography.”

[4] Max Planck, founder of quantum theory, in his Scientific Autobiography and Other Papers.

Why Belief Works

Our experience of the “real world” will conform to what we believe. It has to, because our brains insist upon it.

They do that in part through neuro-cultural conditioning — the process by which the neurological wiring of a culture’s individual members is patterned after the culture’s belief system, and vice versa. This is the case with any kind of cultural institution, whether national, religious, scientific, economic, corporate, professional, team, tribal, or otherwise.[1] This post looks at religion as an example.[2]

Tim Crane is a professor of philosophy at the Central European University in Budapest. “I work in the philosophy of mind,” his online CV says. “I have attempted to address questions about the most general nature, or essence, of the human mind, and about the place of the mind in the rest of nature.” In his book The Meaning of Belief: Religion From An Atheist’s Point Of View (2017), he cites William James’ 1902 classic The Varieties of Religious Experience for a definition of what he calls “the religious impulse”:

“Were one asked to characterize the life of religion in the broadest and most general terms, one might say that it consists in the belief that there is an unseen order, and that our supreme good lies in harmoniously adjusting ourselves thereto.”

Christian Smith is a sociology professor and director of the Center for the Study of Religion and Society at the University of Notre Dame. Here’s his definition of religion:

“Religion is a complex of culturally prescribed practices, based on promises about the existence and nature of supernatural powers, whether personal or impersonal, which seek to help practitioners gain access to and communicate or align themselves with these powers, in hopes of realizing human goods and avoiding things bad.”

Religion: What It Is, How It Works, And Why It Matters (Princeton University Press, 2017)

Both authors stress that religious principles and practices need to match in order for religion to be effective. In other words:

“Faith without works is dead.”
The Epistle of James 2:17

As it turns out, “faith without works is dead” is not just scripture; it is also sound neuroscience. When we practice what we preach, we set up a self-sustaining loop in which belief drives thoughts and behavior, which in turn reinforce belief. In that way, religion develops the brain while the brain develops religion:

“Jordan Grafman, head of the cognitive neuroscience laboratory at Rehabilitation Institute of Chicago and neurology professor at Northwestern University, says that neurotheology is important in part because early religious practices helped develop our brains to begin with. ‘Religion has played an incredibly important role in human evolution. It’s funny, people want to separate the two but in fact they’re intertwined.’”

“The Neuroscience Argument That Religion Shaped The Very Structure Of Our Brains,” Quartz (December 3, 2016)

The more widespread and enduring the religious practice, the more the religion develops scriptures, rituals, icons, and institutions to sustain itself. Therefore a Bible passage such as this…

“I was young and now I am old,
yet I have never seen the righteous forsaken
 or their children begging bread.”
Psalm 37:25 NIV

… becomes both community truth and the “testimony” of individual adherents. But what happens when belief and experience don’t align — e.g., when a member of the congregation and her children in fact go begging?

Some religious thinkers, like the writer of this Huffington Post article, reckon with the contradiction by distinguishing belief from faith. Beliefs are products of the mind, she says, and deal with what can be known, while faith is a product of the spirit, which traffics in what cannot be known. Since knowledge is always shifting, belief can and probably will let us down, while faith, grounded in what can’t be known, remains beyond challenge. Faith therefore invites belief to step aside in favor of “trusting beyond all reason and evidence.”

That outlook captures the essential core of the definitions of religion we saw above: that there is an “unseen order” populated with “supernatural powers” that exists alongside but separate from ours. (Of which we have only limited understanding, the belief/faith outlook would add.) Whether this satisfies the brain’s need to align internal patterning with external experience is the kind of issue being taken up by the new discipline of neurotheology, which looks at where religion happens in the brain.

Neurotheology’s inquiries have far-reaching implications for many of our common assumptions about how reality is structured. For example, if faith can be explained in neurological terms, then it could be located — in whole or in part — along with belief on this side of the theoretical divide between human and supernatural existence. This shift would likely have a ripple effect on similar dichotomies, such as known vs. unknown, real vs. imaginary, objective vs. subjective, observed vs. inscrutable, temporal vs. transcendent, etc.

More on neurotheology coming up.

[1] For more on cultural patterning, see the other posts in this blog’s category The Basics of Belief, Culture, and Reality.

[2] I talk about Christianity because it is the only religion I have personal experience with. And I am aware, by the way, that I write this post under the influence of my own neuroscientific cultural bias.