A Talk at the Rock: How to Instantly Polarize a Crowd and End a Discussion

Areopagus (image from Wikipedia)

The Areopagus is a large rock outcropping in Athens, not far from the Acropolis, where in ancient times various legal, economic, and religious issues got a hearing. A Bible story about something that happened there two thousand years ago provides surprising insight into today’s hyper-polarized world.

Backstory: A Dualistic Worldview

In the 17th Century, Frenchman René Descartes sorted reality into two categories: (1) the natural, physical world and (2) the unseen world of ideas, feelings, and beliefs. This duality was born of the times:

“Toward the end of the Renaissance period, a radical epistemological and metaphysical shift overcame the Western psyche. The advances of Nicolaus Copernicus, Galileo Galilei and Francis Bacon posed a serious problem for Christian dogma and its dominion over the natural world.

“In the 17th century, René Descartes’s dualism of matter and mind was an ingenious solution to the problem this created. ‘The ideas’ that had hitherto been understood as inhering in nature as ‘God’s thoughts’ were rescued from the advancing army of empirical science and withdrawn into the safety of a separate domain, ‘the mind’.

“On the one hand, this maintained a dimension proper to God, and on the other, served to ‘make the intellectual world safe for Copernicus and Galileo’, as the American philosopher Richard Rorty put it in Philosophy and the Mirror of Nature (1979).

“In one fell swoop, God’s substance-divinity was protected, while empirical science was given reign over nature-as-mechanism – something ungodly and therefore free game.”[1]

Descartes articulated this dualistic framework, but it had been around since prehistoric antiquity. It still persists today, and neurological research suggests the human brain comes pre-wired for it. This is from Psychology Today[2]:

“Recent research suggests that our brains may be pre-wired for dichotomized thinking. That’s a fancy name for thinking and perceiving in terms of two – and only two – opposing possibilities.

“Neurologists explored the activity of certain key regions of the human forebrain – the frontal lobe – trying to understand how the brain switches between tasks. Scientists generally accept the idea that the brain can only consciously manage one task at a time….

“However, some researchers are now suggesting that our brains can keep tabs on two tasks at a time, by sending each one to a different side of the brain. Apparently, we toggle back and forth, with one task being primary and the other on standby.

“Add a third task, however, and one of the others has to drop off the to-do list. Scans of brain activity during this task switching have led to the hypothesis that the brain actually likes handling things in pairs. Indeed, the brain itself is subdivided into two distinct half-brains, or hemispheres.

“Some researchers are now extending this reasoning to suggest that the brain has a built-in tendency, when confronted by complex propositions, to selfishly reduce the set of choices to just two.

“The popular vocabulary routinely signals this dichotomizing mental habit: ‘Are you with us, or against us?’ ‘If you’re not part of the solution, you’re part of the problem.’

“These research findings might help explain how and why the public discourse of our culture has become so polarized and rancorous, and how we might be able to replace it with a more intelligent conversation.

“One of our popular clichés is ‘Well, there are two sides to every story.’ Why only two? Maybe the less sophisticated and less rational members of our society are caught up in duplex thinking, because the combination of a polarized brain and unexamined emotional reflexes keep them there.”

“Less sophisticated and less rational” … the author’s ideological bias is showing, but the “unexamined emotional reflexes” finger points at both ends of the polarized spectrum. And because our brains love the status quo and resist change, we hunker down on our assumptions and biases. True, the balance can shift more gradually over time — the way objectivity ascended during the 18th Century’s Age of Enlightenment, and Romanticism pushed back in the 19th — but usually it takes something drastic like disruptive innovation, tragedy, or violence to knock us off our equilibrium. Absent that, we’re usually not up for the examination required to separate what we objectively know from what we subjectively believe — it’s all just reality, and as long as it’s working, we’re good. If we’re forced to examine and adjust, we’ll most likely take our cues from our cultural context:

“Each of us conducts our lives according to a set of assumptions about how things work: how our society functions, its relationship with the natural world, what’s valuable, and what’s possible. This is our worldview, which often remains unquestioned and unstated but is deeply felt and underlies many of the choices we make in our lives. We form our worldview implicitly as we grow up, from our family, friends, and culture, and, once it’s set, we’re barely aware of it unless we’re presented with a different worldview for comparison. The unconscious origin of our worldview makes it quite inflexible.

“There is [a] potent force shaping the particular patterns we perceive around us. It’s what anthropologists call culture. Just as language shapes the perception of an infant as she listens to the patterns of sounds around her, so the mythic patterns of thought informing the culture a child is born into will literally shape how that child constructs meaning in the world. Every culture holds its own worldview: a complex and comprehensive model of how the universe works and how to act within it. This network of beliefs and values determines the way in which each child in that culture makes sense of the universe.”[3]

Culture has been sculpting the human brain ever since our earliest ancestors began living complex social lives millions of years ago. It’s only when the cultural balance runs off the rails that our brains scramble to reset, and we’re stressed while they’re at it. We would do well not to wait until then, and learn how to embrace both ends of the dualistic spectrum, argues one computational biologist[4]:

“Neuroscience was part of the dinner conversation in my family, often a prerequisite for truth. Want to talk about art? Not without neuroscience. Interested in justice? You can’t judge someone’s sanity without parsing scans of the brain. But though science helps us refine our thinking, we’re hindered by its limits: outside of mathematics, after all, no view of reality can achieve absolute certainty. Progress creates the illusion that we are moving toward deeper knowledge when, in fact, imperfect theories constantly lead us astray.

“The conflict is relevant in this age of anti-science, with far-Right activists questioning climate change, evolution and other current finds. In his book Enlightenment Now (2018), Steven Pinker describes a second assault on science from within mainstream scholarship and the arts. But is that really bad? Nineteenth-century Romanticism was the first movement to take on the Enlightenment – and we still see its effects in such areas as environmentalism, asceticism and the ethical exercise of conscience.

“In our new era of Enlightenment, we need Romanticism again. In his speech ‘Politics and Conscience’ (1984), the Czech dissident Václav Havel, discussing factories and smokestacks on the horizon, explained just why: ‘People thought they could explain and conquer nature – yet … they destroyed it and disinherited themselves from it.’ Havel was not against industry, he was just for labour relations and protection of the environment.

“The issues persist. From use of GMO seeds and aquaculture to assert control over the food chain to military strategies for gene-engineering bioweapons, power is asserted through patents and financial control over basic aspects of life. The French philosopher Michel Foucault in The Will to Knowledge (1976) referred to such advancements as ‘techniques for achieving the subjugation of bodies and the control of populations’. With winners and losers in the new arena, it only makes sense that some folks are going to push back.

“We are now on the verge of a new revolution in control over life through the gene-editing tool Crispr-Cas9, which has given us the ability to tinker with the colour of butterfly wings and alter the heritable genetic code of humans. In this uncharted territory, where ethical issues are rife, we can get blindsided by sinking too much of our faith into science, and losing our sense of humanity or belief in human rights.

“Science should inform values such as vaccine and climate policy, but it must not determine all values…. With science becoming a brutal game of market forces and patent controls, the skeptics and Romantics among us must weigh in, and we already are.”

That’s probably good advice, but we need to push through a lot of cultural status quo to get there. That’s especially true because the 20th Century brought us change at ever-accelerating rates — objective reality went spinning away and we crashed into the extreme belief end of the spectrum:

“Each of us is on a spectrum somewhere between the poles of rational and irrational. We all have hunches we can’t prove and superstitions that make no sense. What’s problematic is going overboard — letting the subjective entirely override the objective; thinking and acting as if opinions and feelings are just as true as facts.

“The American experiment, the original embodiment of the great Enlightenment idea of intellectual freedom, whereby every individual is welcome to believe anything she wishes, has metastasized out of control. In America nowadays, those more exciting parts of the Enlightenment idea have swamped the sober, rational, empirical parts.

“Little by little for centuries, then more and more and faster and faster during the past half century, we Americans have given ourselves over to all kinds of magical thinking, anything-goes relativism, and belief in fanciful explanation—small and large fantasies that console or thrill or terrify us. And most of us haven’t realized how far-reaching our strange new normal has become.”[5]

When we can agree that our conflict is a matter of my data vs. yours, we can debate rationally. But when it’s my beliefs vs. yours, what used to be discourse dissolves into stonewalling and shouting. Belief seeks its own perfection by eliminating doubt, and therefore devolves into fundamentalism, where discussion is a sign of doubt, punishable as heresy. Fundamentalism can be secular or religious — it’s the dynamic, not the content, that matters:

“Fundamentalism is a mind-set. The iconography and language it employs can be either religious or secular or both, but because it dismisses all alternative viewpoints as inferior and unworthy of consideration it is anti-thought. This is part of its attraction. It fills a human desire for self-importance, for hope and the dream of finally attaining paradise. It creates a binary world of absolutes, of good and evil. It provides a comforting emotional certitude. It is used to elevate our cultural, social, and economic systems above others. It is used to justify imperial hubris, war, intolerance and repression as a regrettable necessity in the march of human progress. The fundamentalist murders, plunders and subjugates in the name of humankind’s most exalted ideals. Those who oppose the fundamentalists are dismissed as savages, condemned as lesser breeds of human beings, miscreants led astray by Satan or on the wrong side of Western civilization. The nation is endowed with power and military prowess, fundamentalists argue, because God or our higher form of civilization makes us superior. It is our right to dominate and rule. The core belief systems of these secular and religious antagonists are identical. They are utopian. They will lead us out of the wilderness to the land of milk and honey.”[6]

Fundamentalism is where the open mind goes into lockdown. Objectivity loses its grip, and the question “Are you with us, or against us?” gives way to its declarative version, “If you’re not with us, you’re against us.”[7] Dualistic thinking ceases to be merely a source of “popular clichés” and becomes instead a rigid disincentive to public discourse, as competing polarized beliefs dig in for a grinding, maddening war of attrition. What used to be public discourse is lost in a no-man’s-land of intellectual wreckage created by each side’s incessant lobbing of ideological bombs at the other’s entrenched subjective positions. Each side is convinced it has a God’s-eye view of reality, therefore God is on its side, which motivates securing its position by all necessary means.

A Talk at the Rock

The Christian scriptures illustrate how all this works in a story from one of the Apostle Paul’s missionary journeys.

“Now while Paul was… at Athens, his spirit was provoked within him as he saw that the city was full of idols. So he reasoned in the synagogue with the Jews and the devout persons, and in the marketplace every day with those who happened to be there. Some of the Epicurean and Stoic philosophers also conversed with him. And some said, ‘What does this babbler wish to say?’ Others said, ‘He seems to be a preacher of foreign divinities’—because he was preaching Jesus and the resurrection. And they took him and brought him to the Areopagus, saying, ‘May we know what this new teaching is that you are presenting? For you bring some strange things to our ears. We wish to know therefore what these things mean.’”[8]

The Epicureans and Stoics were the materialists of their day — their thinking leaned toward the objective side of the dualism. When Paul came to town advocating ideas (the subjective end of the dualism), their brain patterning couldn’t process his worldview. They needed time, so they invited Paul to a Talk at the Rock (the Areopagus).

At this point, the author of the story — widely believed to be the same “Luke the beloved physician”[9] who wrote the Gospel of Luke — inserts a biased editorial comment that signals that nothing’s going to come of this, because “all the Athenians and the foreigners who lived there would spend their time in nothing except telling or hearing something new.”[10] I.e., reasonable consideration — public discourse — was going to be a waste of time. But Paul had prepared some culturally sensitive opening remarks:

“So Paul, standing in the midst of the Areopagus, said: ‘Men of Athens, I perceive that in every way you are very religious. For as I passed along and observed the objects of your worship, I found also an altar with this inscription: To the unknown god. What therefore you worship as unknown, this I proclaim to you.’”

He then offers up the idea of substituting his ‘foreign god’ for the Athenians’ statuary, altars, and temples:

“The God who made the world and everything in it, being Lord of heaven and earth, does not live in temples made by man, nor is he served by human hands, as though he needed anything, since he himself gives to all mankind life and breath and everything. And he made from one man every nation of mankind to live on all the face of the earth, having determined allotted periods and the boundaries of their dwelling place, that they should seek God, and perhaps feel their way toward him and find him.”

You can sense the crowd’s restless murmuring and shuffling feet, but then Paul goes back to cultural bridge-building:

“Yet he is actually not far from each one of us, for ‘In him we live and move and have our being’ [referring to a passage from Epimenides of Crete], and as even some of your own poets have said, ‘For we are indeed his offspring’ [from Aratus’s poem Phainomena].”

Nice recovery, Paul. So far so good. This feels like discourse, what the Rock is for. But Paul believes that the Athenians’ practice of blending the unseen world of their gods with their physical craftsmanship of statuary, altars, and temples (a practice the church would later perfect) is idolatry, and in his religious culture back home, idolatry had been on the outs since the Golden Calf.[11] At this point, Paul takes off the cultural kid gloves and goes fundamentalist:

“Being then God’s offspring, we ought not to think that the divine being is like gold or silver or stone, an image formed by the art and imagination of man. The times of ignorance God overlooked, but now he commands all people everywhere to repent, because he has fixed a day on which he will judge the world in righteousness by a man whom he has appointed; and of this he has given assurance to all by raising him from the dead.”

That’s precisely the point where he loses the crowd — well, most of them; there were some who were willing to give him another shot, and even a couple of fresh converts:

“Now when they heard of the resurrection of the dead, some mocked. But others said, ‘We will hear you again about this.’ So Paul went out from their midst. But some men joined him and believed, among whom also were Dionysius the Areopagite and a woman named Damaris and others with them.”

“Some men joined him and believed….” That’s all there was left for them to do: believe or not believe. You’re either with us or against us.

Paul had violated the cultural ethics of a Talk at the Rock. It was about reasonable discourse; he made it a matter of belief, saying in effect, “Forget your social customs and ethics; my God is going to hurt you if you keep it up.” With that, the conclave became irretrievably polarized, and the session was over.

Paul triggered this cultural dynamic constantly on his journeys — for example, a few years later, when the Ephesus idol-building guild figured out the economic implications of Paul’s belief system[12]:

“About that time there arose no little disturbance concerning the Way.  For a man named Demetrius, a silversmith, who made silver shrines of Artemis, brought no little business to the craftsmen. These he gathered together, with the workmen in similar trades, and said, ‘Men, you know that from this business we have our wealth. And you see and hear that not only in Ephesus but in almost all of Asia this Paul has persuaded and turned away a great many people, saying that gods made with hands are not gods. And there is danger not only that this trade of ours may come into disrepute but also that the temple of the great goddess Artemis may be counted as nothing, and that she may even be deposed from her magnificence, she whom all Asia and the world worship.’ When they heard this they were enraged and were crying out, ‘Great is Artemis of the Ephesians!’”

Jesus had previously taken a whip to the merchants in the Temple in Jerusalem.[13] Apparently Demetrius and his fellow craftsmen saw the same thing coming to them, and made a preemptive strike. The scene quickly spiraled out of control:

“So the city was filled with the confusion, and they rushed together into the theater, dragging with them Gaius and Aristarchus, Macedonians who were Paul’s companions in travel.  But when Paul wished to go in among the crowd, the disciples would not let him. And even some of the Asiarchs, who were friends of his, sent to him and were urging him not to venture into the theater. Now some cried out one thing, some another, for the assembly was in confusion, and most of them did not know why they had come together.”

A local official finally quelled the riot:

“Some of the crowd prompted Alexander, whom the Jews had put forward. And Alexander, motioning with his hand, wanted to make a defense to the crowd. But when they recognized that he was a Jew, for about two hours they all cried out with one voice, ‘Great is Artemis of the Ephesians!’

“And when the town clerk had quieted the crowd, he said, ‘Men of Ephesus, who is there who does not know that the city of the Ephesians is temple keeper of the great Artemis, and of the sacred stone that fell from the sky? Seeing then that these things cannot be denied, you ought to be quiet and do nothing rash. For you have brought these men here who are neither sacrilegious nor blasphemers of our goddess. If therefore Demetrius and the craftsmen with him have a complaint against anyone, the courts are open, and there are proconsuls. Let them bring charges against one another. But if you seek anything further, it shall be settled in the regular assembly. For we really are in danger of being charged with rioting today, since there is no cause that we can give to justify this commotion.’ And when he had said these things, he dismissed the assembly.”[14]

It Still Happens Today

I spent years in the evangelical church – we were fundamentalists, but didn’t want to admit it – where Paul’s Talk at the Rock was held up as the way not to “share your faith.” Forget the public discourse — you can’t just “spend [your] time in nothing except telling or hearing something new,” you need to lay the truth on them so they can believe or not believe, and if they don’t, you need to “shake the dust off your feet”[15] and get out of there. These days, we see both secular and religious cultural institutions following that advice.

Will we ever learn?

[1] “How the Dualism of Descartes Ruined Our Mental Health,” Medium (May 10, 2019)

[2] Karl Albrecht, “The Tyranny of Two,” Psychology Today (Aug 18, 2010)

[3] Jeremy Lent, The Patterning Instinct: A Cultural History of Humanity’s Search for Meaning (2017)

[4] Jim Kozubek, “The Enlightenment Rationality Is Not Enough: We Need A New Romanticism,” Aeon (Apr. 18, 2018)

[5] Kurt Andersen, Fantasyland: How America Went Haywire, a 500-Year History (2017)

[6] Chris Hedges, I Don’t Believe in Atheists: The Dangerous Rise of the Secular Fundamentalist (2008)

[7] The latter came from Jesus himself – see the Gospels of Matthew 12:30 and Luke 11:23. Jesus was a belief man through and through. More on that another time.

[8] The Acts of the Apostles 17: 16-20.

[9] Paul’s letter to the Colossians 4: 14.

[10] Acts 17: 21.

[11] Exodus 32.

[12] Acts 19: 23-41.

[13] Matthew 21: 12-17; John 2: 13-21.

[14] Acts 19: 23-41.

[15] Matthew 10:14.

Knowledge, Conviction, and Belief [5]: Looking For the Self in the Brain

My soul is lost, my friend
Tell me how do I begin again?
My city’s in ruins,
My city’s in ruins.

Bruce Springsteen

Neuroscience looks for the soul in the brain and can’t find it. What it finds instead are the elements of consciousness — sensory perception, language, cognition, memory, etc. — in various neural networks and regions of the brain, and those diverse networks collaborating to generate a composite conscious experience. Meanwhile, the master network — the one that is equivalent to conventional notions of the soul or self — remains elusive.

Prof. Bruce Hood lays out the progression from conventional belief in a separate self to the current brain network theory:

“Psychologist Susan Blackmore makes the point that the word “illusion” does not mean that it does not exist — rather an illusion is not what it seems. We all certainly experience some form of self, but what we experience is a powerful deception generated by our brains for our own benefit.

“Understanding that the self could be an illusion is really difficult… Our self seems so convincing, so real to us. But then again, many aspects of our experience are not what they seem.

“In challenging what is the self, what most people think is the self must first be considered. If you were to ask the average person in the street about their self, they would most likely describe the individual who inhabits their body. They believe they are more than just their bodies. Their bodies are something their selves control. When we look in the mirror, we regard the body as a vessel we occupy.

“This sense that we are individuals inside bodies is sometimes called the ‘ego theory,’ although philosopher Galen Strawson captures it poetically in what he calls the ‘pearl view’ of the self. The pearl view is the common notion that our self is an essential entity at the core of our existence that holds steady throughout our life. The ego experiences life as a conscious, thinking person with a unique historical background that defines who he or she is. This is the ‘I’ that looks back in the bathroom mirror and reflects who is the ‘me.’

“In contrast to this ego view, there is an alternative version of the self, based on the ‘bundle theory’ after the Scottish Enlightenment philosopher David Hume… He tried to describe his inner self and thought that there was no single entity, but rather bundles of sensations, perceptions and thoughts piled on top of each other. He concluded that the self emerged out of the bundling together of these experiences.

“If the self is the sum of our thoughts and actions, then the first inescapable fact is that these depend on brains. Thoughts and actions are not exclusively the brain because we are always thinking about and acting upon things in the world with our bodies, but the brain is primarily responsible for coordinating these activities. In effect, we are our brains or at least, the brain is the most critical body part when it comes to who we are.

“There is no center in the brain where the self is constructed. The brain has many distributed jobs. It processes incoming information from the external world into meaningful patterns that are interpreted and stored for future reference. It generates different levels and types of motivations that are the human drives, emotions, and feelings. It produces all sorts of behavior — some of them automatic while others are acquired through skill, practice, and sheer effort.

“The sense of self that most of us experience is not to be found in any one area. Rather it emerges out of the orchestra of different brain processes.”

The Self Illusion:  How the Social Brain Creates Identity, Bruce Hood (2012)

Princeton neuroscientist Michael Graziano uses an “attention schema theory” to describe this collaboration of neural networks. “The heart of the theory is that awareness is a schematized, descriptive model of attention,” he says, and expands as follows:

“In the present theory, the content of consciousness, the stuff in the conscious mind, is distributed over a large set of brain areas, areas that encode vision, emotion, language, action plans, and so on. The full set of information that is present in consciousness at any one time has been called the ‘global workspace.’ In the present theory, the global workspace spans many diverse areas of the brain. But the specific property of awareness, the essence of awareness added to the global workspace, is constructed by an expert system in a limited part of the brain…. The computed property of awareness can be bound to the larger whole… One could think of awareness as information.”

Consciousness and the Social Brain, Michael S. A. Graziano (2013)

To those who hold fast to the common belief (as most people do) that the soul is something transcendent, noble, unique, special, poetic, and divine, referring to consciousness and the self as “global workspace” and calling awareness “information” lacks a little something. But is that any reason to reject the bundle theory as untrue?

Meanwhile, Prof. Graziano admits that “the attention schema theory does not even seek to answer the question of existential reality but instead tries to describe what is constructed by the brain.” And besides, is science really after truth anyway?

We’ll look at those questions next time.

“Fearfully and Wonderfully Made”

(Image: da Vinci)

We are starting this series on Consciousness and the Self by looking at some of the religious and secular foundations of the belief that humans are a dualist entity consisting of body and soul, and the associated belief that the two elements are best understood by different forms of inquiry — religion and the humanities for the soul, and science for the body. As we’ll see, current neuro-biological thinking defies these beliefs and threatens their ancient intellectual, cultural, and historical dominance.

This article[1] is typical in its conclusion that one of the things that makes human beings unique is our “higher consciousness.”

“[Homo sapiens] sits on top of the food chain, has extended its habitats to the entire planet, and in recent centuries, experienced an explosion of technological, societal, and artistic advancements.

“The very fact that we as human beings can write and read articles like this one and contemplate the unique nature of our mental abilities is awe-inspiring.

“Neuroscientist V.S. Ramachandran said it best: ‘Here is this three-pound mass of jelly you can hold in the palm of your hand…it can contemplate the meaning of infinity, and it can contemplate itself contemplating the meaning of infinity.’

“Such self-reflective consciousness or ‘meta-wondering’ boosts our ability for self-transformation, both as individuals and as a species. It contributes to our abilities for self-monitoring, self-recognition and self-identification.”

The author of the following Biblical passage agrees, and affirms that his “soul knows it very well” — i.e., not only does he know he’s special, but he knows that he knows it:

For you formed my inward parts;
    you knitted me together in my mother’s womb.
I praise you, for I am fearfully and wonderfully made.
Wonderful are your works;
    my soul knows it very well.

Psalm 139: 13-14 (ESV)

Judging from worldwide religious practice, the “I” that is “fearfully and wonderfully made” is limited to the soul, not the body: the former feels the love, while the latter is assaulted with unrelenting, vicious, sometimes horrific verbal and physical abuse. “Mortification of the flesh” indeed — as if the body needs help being mortal.

Science apparently concurs with this dismal assessment. The following is from the book blurb for Through a Glass Brightly: Using Science to See Our Species as We Really Are, by evolutionary biologist and psychologist David P. Barash (2018):

“In Through a Glass Brightly, noted scientist David P. Barash explores the process by which science has, throughout time, cut humanity ‘down to size,’ and how humanity has responded. A good paradigm is a tough thing to lose, especially when its replacement leaves us feeling more vulnerable and less special. And yet, as science has progressed, we find ourselves–like it or not–bereft of many of our most cherished beliefs, confronting an array of paradigms lost.

“Barash models his argument around a set of “old” and “new” paradigms that define humanity’s place in the universe. This new set of paradigms [includes] provocative revelations [such as] whether human beings are well designed… Rather than seeing ourselves through a glass darkly, science enables us to perceive our strengths and weaknesses brightly and accurately at last, so that paradigms lost becomes wisdom gained. The result is a bracing, remarkably hopeful view of who we really are.”

Barash’s old and new paradigms about the body are as follows:

“Old paradigm:  The human body is a wonderfully well constructed thing, testimony to the wisdom of an intelligent designer.

“New paradigm:  Although there is much in our anatomy and physiology to admire, we are in fact jerry-rigged and imperfect, testimony to the limitations of a process that is nothing but natural and that in no way reflects supernatural wisdom or benevolence.”

Okay, so maybe the body has issues, but the old paradigm belief that human-level consciousness justifies lording it over the rest of creation is as old as the first chapter of the Bible:

And God blessed them. And God said to them,
“Be fruitful and multiply and fill the earth and subdue it
and have dominion over the fish of the sea
 and over the birds of the heavens
 and over every living thing that moves on the earth.”

Genesis 1:28  (ESV)

The Biblical mandate to “subdue” the earth explains a lot about how we approach the rest of creation — something people seem to be questioning more and more these days. Psychiatrist, essayist, and Oxford Fellow Neel Burton includes our superiority complex in his list of self-deceptions:

“Most people see themselves in a much more positive light than others do them, and possess an unduly rose-tinted perspective on their attributes, circumstances, and possibilities. Such positive illusions, as they are called, are of three broad kinds, an inflated sense of one’s qualities and abilities, an illusion of control over things that are mostly or entirely out of one’s control, and an unrealistic optimism about the future.” [2]

Humans as the apex of creation? More on that next time.

[1] “What Is It That Makes Humans Unique?” Singularity Hub (Dec. 28, 2017).

[2] Hide and Seek:  The Psychology of Self-Deception (Acheron Press, 2012).

Mirror, Mirror, on the Wall…


“If you were to ask the average person in the street about their self, they would most likely describe the individual who inhabits their body. They believe they are more than just their bodies. Their bodies are something their selves control. When we look in the mirror, we regard the body as a vessel we occupy.

“The common notion [is] that our self is an essential entity at the core of our existence that holds steady throughout our life. The ego experiences life as a conscious, thinking person with a unique historical background that defines who he or she is. This is the ‘I’ that looks back in the bathroom mirror and reflects who is the ‘me.’”

The Self Illusion:  How the Social Brain Creates Identity, Bruce Hood[1] (2012)

The idea that we are a self riding through life in a body is deeply ingrained in Western thinking. Descartes gets most of the credit for it, but its religious and philosophical roots are much more ancient. (The same is true of the Eastern, Buddhist idea that there’s no such thing as a self. We’ll talk origins another time.)

Descartes’ dualism has the curious effect of excusing us from thinking too closely about what we mean by it. It does this by locating the body in the physical, biological, natural world while placing the self in a transcendent realm that parallels the natural world but remains apart from it. The body, along with the rest of the natural world, is the proper subject of scientific inquiry, but the self and its ethereal realm remain inscrutable, the province of faith and metaphor, religion and the humanities. David P. Barash[2] captures the implications of this dichotomy in Through a Glass Brightly:  Using Science to See Our Species as We Really Are (2018):

“Science differs from theology and the humanities in that it is made to be improved on and corrected over time… By contrast, few theologians, or fundamentalist religious believers of any stripe, are likely to look kindly on ‘revelations’ that suggest corrections to their sacred texts. In 1768, Baron d’Holbach, a major figure in the French Enlightenment, had great fun with this. In his satire Portable Theology (written under the pen name Abbe Bernier, to hide from the censors), d’Holbach defined Religious Doctrine as ‘what every good Christian must believe or else be burned, be it in this world or the next. The dogmas of the Christian religion are immutable decrees of God, who cannot change His mind except when the Church does.’

“By contrast, science not only is open to improvement and modifications but also is to a large extent defined by this openness. Whereas religious practitioners who deviate from their traditions are liable to be derogated — and sometimes killed — for this apostasy …, science thrives on correction and adjustment, aiming not to enshrine received wisdom and tradition but to move its insights closer to correspondence with reality as found in the natural world.”

Attempts to bridge the realms of body and soul end up in pseudo-science, eventually discredited and often silly. Consider, for example, the ether (or sometimes “aether”) — a term that since Plato and Aristotle has been applied to both the rarefied air only the gods can breathe and the stuff light moves through in interstellar space.[3]

You don’t need to be a poet or prophet to think the self is inviolate. It’s just so obvious to most of us that there’s a self inside who watches and knows all about us — who in fact is us. We experience it as that never-silent internal voice — observing and commenting, often critiquing, sometimes shaming — that always seems to be accurate. We’ve been hearing it for as long as we can remember:  it’s embedded in our brain’s memory banks, all the way back to when we first started remembering things and using language to describe and record them.

We have always been aware that we are aware:
we don’t just observe, we observe ourselves observing.

Hence the belief that we are body and soul seems not worth challenging. Which is why, in keeping with this blog’s purpose, we’re going to do precisely that.

Cartesian dualism is foundational to self-awareness and to our cultural beliefs and institutions. It guides everything from religious faith to criminal culpability, health and wellbeing to mental illness. And so much more. As a result, taking a closer look will not only challenge our perceptions of what is real, it will shake reality itself. This inquiry won’t be easy. Again from The Self Illusion:

“Understanding that the self could be an illusion is really difficult… Our self seems so convincing, so real to us. But then again, many aspects of our experience are not what they seem.

“Psychologist Susan Blackmore makes the point that the word ‘illusion’ does not mean that it does not exist — rather an illusion is not what it seems. We all certainly experience some form of self, but what we experience is a powerful deception generated by our brains for our own benefit.”

That’s where we’re going. Hang tight.

[1] Bruce Hood is an experimental psychologist at the University of Bristol. He specializes in developmental cognitive neuroscience.

[2] David P. Barash is an evolutionary biologist and professor of psychology emeritus at the University of Washington.

[3] For a useful primer, see “The Eternal Quest for Aether, the Cosmic Stuff That Never Was,” Popular Mechanics (Oct. 19, 2018).

Repent, For the Paradigm Shift is at Hand


We talked last time about the need for radical shifts in outlook — paradigm shifts — if we want to overcome neuro-cultural resistance to change, and mentioned religious conversion as an example. This week, we’ll look at how a paradigm shift gave birth to a church renewal movement in the late ’80s and early ’90s known as “the Vineyard.” I write about it because I was personally involved with it. This is NOT a critique or judgment of the Vineyard or anyone in it; I offer this only to further our examination of the neuro-cultural dynamics of religion.

Vineyard founder John Wimber taught missionary methods and church growth at Fuller Theological Seminary, and often heard reports from foreign fields of conversions and membership growth propelled by “signs and wonders” — gospel-style miracles and personal encounters. Western theology and sensibilities mostly explained away supernatural phenomena, but non-Westerners weren’t scandalized by gospel-era experience.

Wimber formulated a ministry model based on the non-Westerners’ worldview. His message was that the Kingdom of God truly was at hand — in the here and now — a concept explored by theologians such as Fuller’s George Eldon Ladd. To embrace and practice that message, Westerners would need to embrace a new worldview — a new paradigm of practical spirituality — that made sense of signs and wonders.

Wimber catalogued what he called “ministry encounters,” where Jesus and the disciples knew things about people they had not revealed, and where people would fall down, cry out, weep, etc. when engaged. Wimber was a Quaker, and adapted the practice of waiting to be moved by the Spirit to watching for these “manifestations of the Spirit” to occur in gatherings. “Ministry teams” trained in the new paradigm would then advance the encounters through the laying on of hands and other gospel techniques.

Wimber’s model began to draw crowds — not unlike the gospel events that drew crowds from towns and their surrounding regions, and sometimes went on all night. Very soon, the Vineyard’s “ministry training” and “ministry conferences” were all the buzz.  Attendees came with high expectations, and the atmosphere was electric.

Vineyard events began with soft rock music with lyrics that addressed God on familiar and sometimes intimate terms, invoking and inviting God’s presence and expressing devotion. The songs flowed nonstop from one to another. By the time the half hour or so of music was over, the crowd was in a state of high inspiration — they were “in-spirited,” “filled with the spirit,” God had “breathed” on them — all senses the word “inspire” carried when it entered the English language in the 14th century.

After worship, Wimber would offer paradigm-shifting instruction such as describing what a “ministry encounter” looks like — e.g. “manifestations” such as shaking, trembling, emotional release, etc. He was funny and entertaining, as were other Vineyard speakers, and readily kept up the inspired vibe. Each session would then close with a “clinic” of “ministry encounters.”

The model worked. Vineyard conferences became legend, and soon Vineyard renewal teams traveled the world. I took two overseas trips and several around the U.S. Hosting churches sometimes billed our events as “revival meetings” — their attempt to describe the conference in traditional terms. We were in and out, caused a stir over a weekend, and that was the end of it unless the sponsoring church’s leadership and members adopted the requisite new worldview. Before long the Vineyard began to “plant” its own churches and became its own denomination.

Back in the day, I thought the Vineyard was truly the kingdom come. Thirty years later, I view it as one of the most remarkable examples of neuro-cultural conditioning I’ve ever been part of. Neuroscience was nowhere near its current stage of research and popular awareness back then, but what we know now reveals that Vineyard events were the perfect setting for paradigm shifting. As we’ve seen previously, inspiration releases the brain’s “feel good” hormones, activates the same brain areas as sex, drugs, gambling, and other addictive activities, generates sensations of peace and physical warmth, lowers the brain’s defensive allegiance to the status quo, and raises risk tolerance — the perfect neurological setup for adopting a new outlook.[1]

As for what happened to Wimber and the Vineyard, that’s beyond the scope of this post, but easy to find if you’re inclined. Stanford anthropology professor Tanya Marie Luhrmann offers an academic (and sympathetic) analysis in her book When God Talks Back:  Understanding the American Evangelical Relationship With God and her TEDx Stanford talk.

[1] “What Religion Does To Your Brain,” Medical News Today (July 20, 2018). See also this prior post in this series. And for a look at these dynamics in quite another setting — finding work you love — see this post from my other blog.

Why Faith Endures

Jesus replied, “No one who puts a hand to the plow and looks back
is fit for service in the kingdom of God.”

Luke 9:62 NIV

I once told a leader of our campus Christian fellowship about doubts prompted by my religion major classes. “Get your Bible and read Luke 9:62,” he said. I did, and can still see the hardness on his face when I looked up. Religions venerate those who long endure, honoring their moral steadfastness. My character and commitment were suspect. I declared a new major the following quarter.

Religions punish doubt and dissidence through peer pressure, public censure, witch hunts, inquisitions, executions, jihads, war, genocide…. The year before, the dining halls had flown into an uproar the day the college newspaper reported that the fellowship had expelled a member for sleeping with her boyfriend.

Religions also have a curious way of tolerating their leaders’ nonconforming behavior — even as the leaders cry witch hunt.[1]

These things happen in all cultural institutions, not just religion. Neuroculture offers an explanation for all of them that emphasizes group dynamics over individual integrity. It goes like this:

  • When enough people believe something, a culture with a shared belief system emerges.
  • Individual doubt about the culture’s belief system introduces “cognitive dissonance” that makes individuals uneasy and threatens cultural cohesiveness.
  • Cohesiveness is essential to the group’s survival — doubt and nonconformity can’t be tolerated.
  • The culture therefore sanctifies belief and stifles doubt.
  • The culture sometimes bends its own rules to preserve its leadership power structure against larger threats.

“This Article Won’t Change Your Mind,” The Atlantic (March 2017) illustrates this process:

“The theory of cognitive dissonance—the extreme discomfort of simultaneously holding two thoughts that are in conflict—was developed by the social psychologist Leon Festinger in the 1950s. In a famous study, Festinger and his colleagues embedded themselves with a doomsday prophet named Dorothy Martin and her cult of followers who believed that spacemen called the Guardians were coming to collect them in flying saucers, to save them from a coming flood. Needless to say, no spacemen (and no flood) ever came, but Martin just kept revising her predictions. Sure, the spacemen didn’t show up today, but they were sure to come tomorrow, and so on. The researchers watched with fascination as the believers kept on believing, despite all the evidence that they were wrong.

“‘A man with a conviction is a hard man to change,’ Festinger, Henry Riecken, and Stanley Schachter wrote in When Prophecy Fails, their 1957 book about this study. ‘Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point … Suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before.’

“This doubling down in the face of conflicting evidence is a way of reducing the discomfort of dissonance, and is part of a set of behaviors known in the psychology literature as ‘motivated reasoning.’ Motivated reasoning is how people convince themselves or remain convinced of what they want to believe—they seek out agreeable information and learn it more easily; and they avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs.

“Though false beliefs are held by individuals, they are in many ways a social phenomenon. Dorothy Martin’s followers held onto their belief that the spacemen were coming … because those beliefs were tethered to a group they belonged to, a group that was deeply important to their lives and their sense of self.

“[A disciple who ignored mounting evidence of sexual abuse by his guru] describes the motivated reasoning that happens in these groups: ‘You’re in a position of defending your choices no matter what information is presented,’ he says, ‘because if you don’t, it means that you lose your membership in this group that’s become so important to you.’ Though cults are an intense example, … people act the same way with regard to their families or other groups that are important to them.”

“Why Facts Don’t Change Our Minds,” The New Yorker (Feb. 27, 2017) explains why the process seems so perfectly reasonable:

“Humans’ biggest advantage over other species is our ability to coöperate. Coöperation is difficult to establish and almost as difficult to sustain.

“Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.

“‘Reason is an adaptation to the hypersocial niche humans have evolved for themselves,’ [the authors of a seminal study] write. Habits of mind that seem weird or goofy or just plain dumb from an ‘intellectualist’ point of view prove shrewd when seen from a social ‘interactionist’ perspective.”

What does it take for individual dissent or cultural change to prevail in the face of these powerful dynamics? We’ll look at that next time.

[1]  This “bigger bully” theory was remarkably evident when Tony Perkins, leader of the Family Research Council, said evangelicals “kind of gave [Donald Trump] a mulligan” over Stormy Daniels, saying that evangelicals “were tired of being kicked around by Barack Obama and his leftists. And I think they are finally glad that there’s somebody on the playground that’s willing to punch the bully.”

Religion on Demand

god helmet

“Given that the neurological roots of religious experiences can be traced so accurately with the help of the latest neuroscientific technologies, does this mean that we could — in principle — ‘create’ these experiences on demand?”[1]

It’s a good question. And so is the obvious follow up: if technology can create religious experience on demand, how does that affect religion’s claims to authenticity and its status as a cultural institution?

Dr. Michael Persinger[2] created the “God Helmet” (shown in the photo above, taken from this article) for use in neuro-religious research.

This is a device that is able to simulate religious experiences by stimulating an individual’s temporoparietal lobes using magnetic fields. “If the equipment and the experiment produced the presence that was God, then the extrapersonal, unreachable, and independent characteristics of the god definition might be challenged,” [says Dr. Persinger].[3]

The experiences created are not doctrinally specific, but are of a kind widely shared among different religions — for example, sensing a numinous presence, a feeling of being filled with the spirit or overwhelmed or possessed, of being outside of self, out of body, or having died and come back to life, feelings of being one with all things or of peace, awe, fear and dread, etc. All of these states have been measured or induced in the laboratory[4]:

Some recent advances in neuroimaging techniques allow us to understand how our brains ‘create’ a spiritual or mystical experience. What causes the feeling that someone else is present in the room, or that we’ve stepped outside of our bodies and into another dimension?

“In the last few years,” says [Dr. Jeff Anderson of the University of Utah School of Medicine in Salt Lake City], “brain imaging technologies have matured in ways that are letting us approach questions that have been around for millennia.”

Prof. James Giordano, from the Georgetown University Medical Center in Washington, D.C., [says that] “We are able to even understand when a person gets into ‘ecstasy mode’ … and to identify specific brain areas that participate in this process.”

“If ‘beings’ join the mystical experience,” Prof. Giordano goes on, “we can say that the activity of the left and right temporal lobe network (found at the bottom middle part of the cortex) has changed.”

 “When activity in the networks of the superior parietal cortex [which is a region in the upper part of the parietal lobe] or our prefrontal cortex increases or decreases, our bodily boundaries change,” Prof. Giordano explains in an interview for Medium. “These parts of the brain control our sense of self in relation to other objects in the world, as well as our bodily integrity; hence the ‘out of body’ and ‘extended self’ sensations and perceptions many people who have had mystical experiences confess to.”

The parietal lobes are also the areas that [Neuroscientist Andrew Newberg, a pioneer of neurotheology, has] found to have lower brain activity during prayer.

And much more. Research has also helped to explain such things as why people with chronic neurodegenerative diseases often lose their religion:

“We discovered a subgroup who were quite religious but, as the disease progressed, lost some aspects of their religiosity,” [says Patrick McNamara, professor of neurology at Boston University and author of The Neuroscience of Religious Experience (2009)]. Sufferers’ brains lack the neurotransmitter dopamine, making McNamara suspect that religiosity is linked to dopamine activity in the prefrontal lobes. “These areas of the brain handle complexity best, so it may be that people with Parkinson’s find it harder to access complex religious experiences.”

Does this research signal the end of religion any time soon? Probably not, says Dr. Newberg:

Until we gain such answers, however, religion is unlikely to go anywhere. The architecture of our brains won’t allow it, says Dr. Newberg, and religion fulfills needs that our brains are designed to have.[5]

Tim Crane, author of The Meaning of Belief: Religion From An Atheist’s Point Of View (2017), agrees:  religion, he says, is simply “too ingrained as a human instinct.” See also this article[6]’s analysis of the current state of the science vs. religion contention, which concludes that the scale seems to be tipping more to the latter:

Religion is not going away any time soon, and science will not destroy it. If anything, it is science that is subject to increasing threats to its authority and social legitimacy.

There are plenty of contrary opinions, of course, and all the technology and research in the world is unlikely to change anybody’s mind, pro or con. We’ll look at why not next time.

[1] “What Religion Does To Your Brain,” Medical News Today (July 20, 2018).

[2] Dr. Persinger was director of the Neuroscience Department at Laurentian University in Ontario, Canada prior to his death in 2018.

[3] “What God Does To Your Brain:  The controversial science of neurotheology aims to find the answer to an age-old question: why do we believe?” The Telegraph (June 20, 2014).

[4] “What Religion Does To Your Brain,” Medical News Today (July 20, 2018).

[5] Why God Won’t Go Away: Brain Science and the Biology of Belief, Andrew Newberg, Eugene D’Aquili, Vince Rause (2001).

[6] “Why Religion Is Not Going Away And Science Will Not Destroy It,” Aeon Magazine (Sept. 7, 2017).

Emergence

 


One fine autumn afternoon I watched, transfixed, as a gigantic flock of migratory birds swarmed over the woods across the street. I was watching a “complex, self-organizing system” in action — specifically, a “murmuration” of birds, which is created by “swarm behavior,” which in turn falls in the category of emergence.

Emergence explains how the whole becomes greater than the sum of its parts. The term is widely used — in systems theory, philosophy, psychology, chemistry, biology, neurobiology, machine learning — and for purposes of this blog, it also applies to cultural belief systems and the social institutions they generate.

Consider any culture you like — a team, club, company, profession, investor group, religious gathering, political party…. As we’ve seen previously in this series, the group’s cultural sense of reality is patterned in each individual member’s neural wiring and cellular makeup. But no one member can hold it all, and different members have varying affinity for different aspects of the culture. As a result, each member takes what the others bring “on faith”:  the group believes in its communal beliefs. This faith facilitates the emergence of a cohesive, dynamic cultural body that takes on a life of its own, expressed through its institutions.

That’s emergence.

To get a further sense of how this works, see this TED Talk that uses complex systems theory to look at how the structure of the financial industry (a transnational cultural body) helped to bring about the Great Recession of 2007-2008. Systems theorist James B. Glattfelder[1] lays out a couple of key features of self-organizing systems:

“It turns out that what looks like complex behavior from the outside is actually the result of a few simple rules of interaction. This means you can forget about the equations and just start to understand the system by looking at the interactions.

“And it gets even better, because most complex systems have this amazing property called emergence. This means that the system as a whole suddenly starts to show a behavior which cannot be understood or predicted by looking at the components. The whole is literally more than the sum of its parts.”

In the end, he says, there’s an innate simplicity to it all — “an emergent property which depends on the rules of interaction in the system. We could easily reproduce [it] with a few simple rules.”[2] He compares this outcome to the inevitable polarized logjams we get from clashing cultural ideologies:

 “I really hope that this complexity perspective allows for some common ground to be found. It would be really great if it has the power to help end the gridlock created by conflicting ideas, which appears to be paralyzing our globalized world.  Ideas relating to finance, economics, politics, society, are very often tainted by people’s personal ideologies.  Reality is so complex, we need to move away from dogma.”
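The “few simple rules” Glattfelder invokes are the kind popularized by Craig Reynolds’ classic boids model of flocking: separation, alignment, and cohesion, each computed only from a boid’s nearby neighbors. Here is a minimal sketch in Python; the rule weights and neighborhood radius are arbitrary illustrative choices, not anything specified in the talk:

```python
import random

def step(boids, sep=1.0, align=0.05, coh=0.01, radius=5.0):
    """Advance a flock one tick. Each boid is [x, y, vx, vy].
    Every boid applies three purely local rules to its neighbors."""
    updated = []
    for b in boids:
        neighbors = [o for o in boids if o is not b and
                     (o[0] - b[0]) ** 2 + (o[1] - b[1]) ** 2 < radius ** 2]
        vx, vy = b[2], b[3]
        if neighbors:
            n = len(neighbors)
            # Cohesion: steer gently toward the neighbors' center of mass
            cx = sum(o[0] for o in neighbors) / n
            cy = sum(o[1] for o in neighbors) / n
            vx += coh * (cx - b[0])
            vy += coh * (cy - b[1])
            # Alignment: nudge velocity toward the neighbors' average velocity
            avx = sum(o[2] for o in neighbors) / n
            avy = sum(o[3] for o in neighbors) / n
            vx += align * (avx - vx)
            vy += align * (avy - vy)
            # Separation: push away from any neighbor that is too close
            for o in neighbors:
                dx, dy = b[0] - o[0], b[1] - o[1]
                d2 = dx * dx + dy * dy
                if 0 < d2 < (radius / 2) ** 2:
                    vx += sep * dx / d2
                    vy += sep * dy / d2
        updated.append([b[0] + vx, b[1] + vy, vx, vy])
    return updated

# Thirty boids with random positions and headings, run for fifty ticks
random.seed(1)
flock = [[random.uniform(0, 20), random.uniform(0, 20),
          random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(30)]
for _ in range(50):
    flock = step(flock)
```

Notice that no rule mentions a flock, yet running `step` repeatedly tends to produce coordinated, murmuration-like motion: the group-level behavior emerges from purely local interactions, which is exactly the point.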

Trouble is, we seem to be predisposed toward ideological gridlock and dogma. Even if we’ve never heard of emergence, we have a kind of backdoor awareness of it — that there are meta-influences affecting our lives — but we’re inclined to locate their source “out there,” instead of in our bodily selves. “Out there” is where the Big Ideas live, formulated by transcendent realities and personalities — God, gods, Fate, Destiny, Natural Law, etc. — that sometimes enter our lesser existence to reveal their take on how things work. But they have super-intelligence while we have only a lesser version, so once we receive their revelations, we codify them into vast bodies of collected wisdom and knowledge, which we then turn over to our sacred and secular cultural institutions to administer. We and our cultures aren’t perfect like they are, but we do our best to live up to their high standards.

We do all this because, as biocentrism champion Robert Lanza has said, most of us have trouble wrapping our heads around the notion that

“Everything we see and experience is a whirl of information occurring in our head. We are not just objects embedded in some external matrix ticking away ‘out there.’”[3]

In our defense, the kind of systems analysis that James Glattfelder uses in his TED talk requires a lot of machine super-intelligence and brute data-crunching power that the human brain lacks. We’re analog and organic, not digital, and we use our limited outlook to perpetuate more polarization, ideological gridlock, and dogma. Culture may be emergent, but when it emerges, it walks right into a never-ending committee meeting debating whether it has a place on the agenda.

Next time, we’ll look at what happens when emergent cultures clash.

[1] James B. Glattfelder holds a Ph.D. in complex systems from the Swiss Federal Institute of Technology. He began as a physicist, became a researcher at a Swiss hedge fund, and now does quantitative research at Olsen Ltd in Zurich, a foreign exchange investment manager.

[2] Here’s a YouTube explanation of the three simple rules that explain the murmuration I watched that day.

[3] From this article in Aeon Magazine.

It’s An Inside Job

In Brain and Culture:  Neurobiology, Ideology, and Social Change, Yale Medical School professor of psychiatry Bruce E. Wexler declared that “concordance between internal structure and external reality is a fundamental human neurobiological imperative.” “Concordance” is peace of mind, which we all know is an inside job.


But the concordance Wexler is talking about is not the kind reserved for the enlightened few, it’s the kind that’s a brain health necessity. Our brains work unceasingly to maintain harmony between us and our surroundings, including our cultural setting. When internal and external are out of sync, the result is cognitive dissonance which, when left unresolved, leads to physical, mental, and social disease, distress and disorder. Neurological concordance is therefore a surviving and thriving skill, and can be traced to the corresponding part of the brain:

“Thanks to advances in imaging methods, especially functional MRI, researchers have recently identified key brain regions linked to cognitive dissonance. The area implicated most consistently is the posterior part of the medial frontal cortex (pMFC), known to play an important role in avoiding aversive outcomes, a powerful built-in survival instinct. In fMRI studies, when subjects lie to a peer despite knowing that lying is wrong—a task that puts their actions and beliefs in conflict—the pMFC lights up.”[1]


The straightest path to concordance is conformity. Nonconformity, on the other hand, generates both intracultural and intercultural neurological conflict.[2] This potential for conflict was the context for Wexler’s peace of mind declaration — let’s hear it again, the full quote this time:

“This book argues that differences in belief systems can themselves occasion intercultural violence, since concordance between internal structure and external reality is a fundamental human neurobiological imperative.” (Emphasis added.)

Peace of mind therefore requires the alignment of inner and outer belief systems. This article[3] defines the term:

“Belief systems are the stories we tell ourselves to define our personal sense of Reality. Every human being has a belief system that they utilize, and it is through this mechanism that we individually ‘make sense’ of the world around us.

“The species Homo sapiens developed so-called belief systems. These are sets of beliefs reinforced by culture, theology, and experience and training as to how the world works: cultural values, stereotypes, political viewpoints, etc.”

In order for personal (internal) and shared (external) belief systems to align, the culture’s members must share comparable neural pathways, consciousness, perceptions, sensory tastes, physiology, and the like.[4] When they do, the culture becomes recognizable in its members. Think of the Olympics’ opening ceremony parade of athletes:  the Yanks are obviously the Yanks — nobody else has quite their swashbuckling sense of derring-do. Or think of professional cultures — lawyers, accountants, engineers, physicians — meet one, and you can just tell. Or remember what it’s like to visit a foreign culture — it’s not just the signage but it’s… well, everything —  how people look, sound, act, their customs and values….

All of that is the result of biological, chemical, environmental, and other influences, all stored in individual brains and bodies. But how is cultural patterning transmitted from one individual to another? John R. Searle, Professor of Philosophy, University of California, Berkeley, wanted to know, and finding out led to his seminal book The Construction of Social Reality:

“This book is about a problem that has puzzled me for a long time:  there are portions of the real world, objective facts in the world that are only facts by human agreement. In a sense there are things that exist only because we believe them to exist. I am thinking about things like money, property, governments, and marriage.

“If everybody thinks that this sort of thing is money, and they use it as money and treat it as money, then it is money. If nobody ever thinks this sort of thing is money, then it is not money. And what goes for money goes for elections, private property, wars, voting, promises, marriages, buying and selling, political offices, and so on.”

“How can there be an objective world of money, property, marriage, governments, elections, football games, cocktail parties and law courts in a world that consists entirely of physical particles in fields of force, and in which some of these particles are organized into systems that are conscious biological beasts, such as ourselves?”
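Searle's point can be caricatured in a few lines of code. This is an illustrative sketch only, not anything from Searle himself: the function name, the threshold, and the majority rule are all invented here to dramatize the idea that an object counts as money only so long as enough of the community collectively treats it as money.

```python
def counts_as(beliefs, threshold=0.5):
    """Toy model of a Searle-style institutional fact: a thing 'counts as'
    money (or property, or a promise) only while enough of the community
    collectively treats it that way.

    beliefs: list of bools, one per community member; True means that
    member treats the object as money.
    """
    return sum(beliefs) / len(beliefs) > threshold

# everybody (or nearly everybody) treats the paper as money, so it is money
print(counts_as([True, True, True, False]))   # True
# belief collapses, and with it the institutional fact
print(counts_as([False, False, True, False])) # False
```

The point of the toy model is that nothing about the object itself changes between the two calls; only the distribution of beliefs does.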

This article[5] provides a summary answer to Searle’s questions:

“Because mental states cannot be transferred physically, they must be transferred by being re-created in the mind of the receiving individual.

“[W]hat is transmitted is some state of mind that produces behavior.

“[The transmitted state of mind includes] a myriad of… beliefs, values, desires, definitions, attitudes, and emotional states such as fear, regret, or pride.”

Vastly simplified, the process of enculturation looks like this:

  • New members enter via an entry point such as birth, naturalization, initiation, etc.
  • They observe the culture’s members thinking and behaving in the culture’s characteristic ways.
  • Through observation and imitation, they take on the culture’s mindset and become habituated into its belief and behavioral norms.
  • In time, they become recognizable as members of the culture along with its other members.
  • Then, an organizing principle called “emergence” asserts itself, so that the whole culture takes on a life of its own that is bigger than the sum of its individual members.
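The steps above can be caricatured as a toy simulation. Everything in this sketch is invented for illustration (the function, the norm names, and the "observe everyone, imitate the majority" rule are assumptions, not claims from the text), but it captures the observation-and-imitation loop:

```python
from collections import Counter

def enculturate(population):
    """Toy enculturation: a newcomer observes every member's behavior
    for each norm, then imitates the most common variant, i.e. the
    culture's characteristic way of doing things."""
    adopted = {}
    for norm in population[0]:
        observed = Counter(member[norm] for member in population)
        adopted[norm] = observed.most_common(1)[0][0]  # imitate the majority
    return adopted

members = [
    {"greeting": "handshake", "queueing": "orderly"},
    {"greeting": "handshake", "queueing": "orderly"},
    {"greeting": "bow",       "queueing": "orderly"},
]
# the newcomer takes on the majority behaviors and becomes
# recognizable as a member of the culture
print(enculturate(members))  # {'greeting': 'handshake', 'queueing': 'orderly'}
```

What the toy model leaves out, of course, is the last bullet: once enough members imitate one another, the pattern starts to constrain the members themselves, which is where emergence comes in.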

We’ll talk about emergence next time.

[1] “What Happens to the Brain During Cognitive Dissonance?” Scientific American Mind (Nov. 2015).

[2] There’s been a lot of research on conformity and nonconformity in the past ten years. If you’re interested in digging deeper, searching “neuroscience of conformity” and “neuroscience of nonconformity” will turn up several scholarly studies.

[3] “What Are Belief Systems?”, Usó-Doménech and J. Nescolarde-Selva, Department of Applied Mathematics, University of Alicante, Alicante, Spain.

[4] See the prior post, Microbes of Meaning.

[5] “Evolution of Mind, Brain, and Culture”, Philip G. Chase, former Senior Research Scientist and Consulting Scholar at the University of Pennsylvania.

“Be the Change You Want to See” — Why Change MUST Always Begin With Us


In the beginning, somebody…

Told a story. Made something. Made something that made things. Drew a picture. Used their voice melodiously. Moved a certain way and did it again. Took something apart, put it back together, and built another thing like it. Watched how weather and sky and flora and fauna responded to the passage of time. Sprinkled dry leaves on meat and ate it. Drew a line in the sand and beat someone who crossed it. Traded this for that. Resolved a dispute. Helped a sick person feel better. Took something shiny from the earth or sea and wore it. Had an uncanny experience and explained it.

And then somebody else did, too — and then somebody else after that, and more somebodies after that, until the human race had organized itself into families, clans, tribes, city-states, and nations, each with its own take on life in this world. Millennia later a worldwide civilization had emerged, organized around trans-cultural institutions of law, economics, science, religion, industry, commerce, education, medicine, arts and entertainment….

And then you and I were born as new members of a highly evolved human culture of innumerable, impossibly complex, interwoven layers.

From our first breaths we were integrated into site-specific cultural institutions that informed our beliefs about how the world works and our place in it. Those institutions weren’t external to us, they were embodied in us — microbes of meaning lodged in our neural pathways and physical biome. Our brains formed around the beliefs of our culture — our neurons drank them in, and our neural networks were wired up with the necessary assumptions, logic, and leaps of faith.

These cellular structures informed what it meant for us to be alive on the Earth, individually and in community. They shaped our observations and awareness, experiences and interpretations, tastes and sensibilities. They defined what is real and imaginary, set limits around what is true and false, acceptable and taboo. And then they reinforced the rightness of it all with feelings of place and belonging, usefulness and meaning. When that was done, our brains and bodies were overlaid with a foundation for status quo — the way things are, and are supposed to be.

All that happened in an astonishing surge of childhood development. Then came puberty, when our brain and body hormones blasted into overdrive, dredging up our genetic and environmental beginnings and parading them out for reexamination. We kept this and discarded that, activated these genes instead of those. (The process by which we do that is called epigenetics, and it explains why your kids aren’t like you.) We also tried on countercultural beliefs, welcoming some and rejecting others. From there, we entered adult life freshly realigned with a differentiated sense of self, us, and them.

Adult life then mostly reinforces our cultural beginnings, although the nuisances and opportunities of change periodically require us to remake and reaffirm shared agreements in our communities, professions, workplaces, teams, and other groups, each time refining our shared cultural foundations. In doing so, we sometimes flow with the changing times, and sometimes retrench with nostalgic fervor.

Where does all this biological, cognitive, and social development and maintenance happen? In the only place it possibly could:  in the hot wet darkness inside the human body’s largest organ —   our skin. Yes, there is a “real world” out there that we engage with, but the processing and storing of experience happen inside — encoded in our brains and bodies.

Which is why individual and cultural change must always begin with us — literally inside of us, in our physical makeup — because that’s where our world and our experience of it are registered and maintained. Gandhi’s famous words are more than a catchy meme, they describe basic human reality:  if we want things to change, then we must be transformed. Think about it:  we have no belief, perception, experience, or concept of status quo that is not somehow registered in our brains and bodies, so where else could change happen? (Unless there’s something like a humanCloud where it can be uploaded and downloaded — but that’s another issue for another time.)

The implications of locating human experience in our physical selves are far-reaching and fascinating. We’ll be exploring them.

#icons #iconoclast #psychology #philosophy #sociology #neurology #biology #narrative #belief #society #socialstudies #religion #law #economics #work #jobs #science #industry #commerce #education #medicine #arts #entertainment #civilization #evolution #perception #reality #subjective #culture #culturalchange #change #paradigmshift #transformation #growth #personalgrowth #futurism #technology #identity #rational #consciousness #cognition #bias #cognitivebias #brain #development #childdevelopment #puberty #adolescence #hormones #genetics #epigenetics #gandhi #bethechange #bethechangeyouwant #neurons #neuralnetworks