Knowledge, Conviction, and Belief [2]: Cultural Belief and Mass Delusion

We think we have an independent ability to think and believe as we like, to know this or be convinced about that. But that’s not the whole story:  our outlook is also shaped by our cultural context.

As we’ve seen, when enough people agree about what is true — whether they “know” it or are “convinced” of it — their agreement becomes a cultural belief system, reflected, for example, in a religion, country, neighborhood, business, athletic team, or other institution. Cultural belief systems are wired into the neural pathways of individual members, and as the culture coalesces, its belief system takes on a life of its own through a process known as “emergence.” As the emergent belief system is increasingly reflected in and reinforced by cultural institutions, it is increasingly patterned into the neural pathways of the culture’s members, where it defines individual and collective reality and sense of identity. The belief system becomes The Truth, defining what the group and its members know and are convinced of.

Throughout this process, whether the culture’s beliefs are true in any non-subjective sense loses relevance. The result is what physician and author Paul Singh refers to as “mass delusion”:

“[When a conviction moves from an individual to being widely held], its origins are rooted in a belief system rather than in an individual’s pathological condition. It is a mass delusion of the sort that poses no immediate threat to anyone or society. Mass delusions can become belief systems that are passed from generation to generation.”

The Great Illusion:  The Myth of Free Will, Consciousness, and the Self, Paul Singh (2016)

For a dramatic example of this concept in action, consider an experience described by Jesse Jackson:

“There is nothing more painful to me at this stage in my life than to walk down the street and hear footsteps… then turn around and see somebody white and feel relieved.”

Despite a lifetime of civil rights leadership, Jackson was betrayed by his own cultural neural conditioning. What he experienced was not just personal to him; it conformed to a cultural belief system. The particular “mass delusion” involved has been confirmed by clinical research.

“Matthew Lieberman, a psychologist at the University of California, recently showed how beliefs help people’s brains categorise others and view objects as good or bad, largely unconsciously. He demonstrated that beliefs (in this case prejudice or fear) are most likely to be learned from the prevailing culture.

“When Lieberman showed a group of people photographs of expressionless black faces, he was surprised to find that the amygdala — the brain’s panic button — was triggered in almost two-thirds of cases. There was no difference in the response between black and white people.”

Where Belief Is Born, The Guardian (June 30, 2005)

When cultural beliefs are not constantly reinforced — by cultural norms of thought, language, practice, etc. — the neural networks that support them can weaken, allowing opportunity for new beliefs.

“‘Beliefs are mental objects in the sense that they are embedded in the brain,’ says [Kathleen Taylor, a neuroscientist at Oxford University] ‘If you challenge [beliefs] by contradiction, or just by cutting them off from the stimuli that make you think about them, then they are going to weaken slightly. If that is combined with very strong reinforcement of new beliefs, then you’re going to get a shift in emphasis from one to the other.’”

Where Belief Is Born

This helps to explain, for example, why religious believers are more likely to “fall away” if they are “out of fellowship,” and what can happen to a student who goes off to college, a world traveler, or an immigrant. It also helps to explain why leaders and despots alike can manipulate brain networks to create cultural belief systems that fit their desired ends:

“In her book on the history of brainwashing, Taylor describes how everyone from the Chinese thought reform camps of the last century to religious cults have used systematic methods to persuade people to change their ideas, sometimes radically.

“The mechanism Taylor describes is similar to the way the brain learns normally. In brainwashing though, the new beliefs are inserted through a much more intensified version of that process.

“The first step is to isolate a person and control what information they receive. Their former beliefs need to be challenged by creating uncertainty. New messages need to be repeated endlessly. And the whole thing needs to be done in a pressured, emotional environment.

“Stress affects the brain such that it makes people more likely to fall back on things they know well – stereotypes and simple ways of thinking,” says Taylor.

“This manipulation of belief happens every day. Politics is a fertile arena, especially in times of anxiety.”

Where Belief Is Born

More next time.

Repent, For the Paradigm Shift is at Hand

[Image: Vineyard]

We talked last time about the need for radical shifts in outlook — paradigm shifts — if we want to overcome neuro-cultural resistance to change, and mentioned religious conversion as an example. This week, we’ll look at how a paradigm shift gave birth to a church renewal movement of the late ’80s and early ’90s known as “the Vineyard.” I write about it because I was personally involved with it. This is NOT a critique or judgment of the Vineyard or anyone in it; I offer it only to further our examination of the neuro-cultural dynamics of religion.

Vineyard founder John Wimber taught missionary methods and church growth at Fuller Theological Seminary, where he often heard reports from foreign mission fields of conversions and membership growth propelled by “signs and wonders” — gospel-style miracles and personal encounters. Western theology and sensibilities mostly explained such supernatural phenomena away, but non-Westerners weren’t scandalized by gospel-era experience.

Wimber formulated a ministry model based on the non-Westerners’ worldview. His message was that the Kingdom of God truly was at hand — in the here and now — a concept explored by theologians such as Fuller’s George Eldon Ladd. To embrace and practice that message, Westerners would need to embrace a new worldview — a new paradigm of practical spirituality — that made sense of signs and wonders.

Wimber catalogued what he called “ministry encounters,” where Jesus and the disciples knew things about people they had not revealed, and where people would fall down, cry out, weep, etc. when engaged. Wimber was a Quaker, and adapted the practice of waiting to be moved by the Spirit into watching for these “manifestations of the Spirit” to occur in gatherings. “Ministry teams” trained in the new paradigm would then advance the encounters through the laying on of hands and other gospel techniques.

Wimber’s model began to draw crowds — not unlike the gospel events that drew crowds from towns and their surrounding regions, and sometimes went on all night. Very soon, the Vineyard’s “ministry training” and “ministry conferences” were all the buzz.  Attendees came with high expectations, and the atmosphere was electric.

Vineyard events began with soft rock music whose lyrics addressed God in familiar and sometimes intimate terms, invoking and inviting God’s presence and expressing devotion. The songs flowed nonstop from one to another. By the time the half hour or so of music was over, the crowd was in a state of high inspiration — they were “in-spirited,” “filled with the spirit,” God had “breathed” on them — all senses connoted by the word “inspiration” when it entered the English language in the 14th century.

After worship, Wimber would offer paradigm-shifting instruction, such as describing what a “ministry encounter” looks like — e.g., “manifestations” such as shaking, trembling, emotional release, etc. He was funny and entertaining, as were other Vineyard speakers, and readily kept up the inspired vibe. Each session would then close with a “clinic” of “ministry encounters.”

The model worked. Vineyard conferences became legend, and soon Vineyard renewal teams traveled the world. I took two overseas trips and several around the U.S. Hosting churches sometimes billed our events as “revival meetings” — their attempt to describe the conference in traditional terms. We were in and out, caused a stir over a weekend, and that was the end of it unless the sponsoring church’s leadership and members adopted the requisite new worldview. Before long the Vineyard began to “plant” its own churches and became its own denomination.

Back in the day, I thought the Vineyard was truly the kingdom come. Thirty years later, I view it as one of the most remarkable examples of neuro-cultural conditioning I’ve ever been part of. Neuroscience was nowhere near its current stage of research and popular awareness back then, but what we know now reveals that Vineyard events were the perfect setting for paradigm shifting. As we’ve seen previously, inspiration releases the brain’s “feel good” hormones, activates the same brain areas as sex, drugs, gambling, and other addictive activities, generates sensations of peace and physical warmth, lowers the brain’s defensive allegiance to the status quo, and raises risk tolerance — the perfect neurological setup for adopting a new outlook.[1]

As for what happened to Wimber and the Vineyard, that’s beyond the scope of this post, but it’s easy to find if you’re inclined. Stanford anthropology professor Tanya Marie Luhrmann offers an academic (and sympathetic) analysis in her book When God Talks Back: Understanding the American Evangelical Relationship with God and in her TEDx Stanford talk.

[1] “What Religion Does To Your Brain,” Medical News Today (July 20, 2018). See also this prior post in this series. And for a look at these dynamics in quite another setting — finding work you love — see this post from my other blog.

The Hostilities of Change:  Surprise, Death, and War

[Image: Storming of the Bastille]

“Ideas that require people to reorganize their picture of the world provoke hostility.”

Science historian James Gleick,
in his bestseller Chaos: The Making of a New Science

We looked last time at neuro-cultural resistance to change, and asked what it takes to overcome it.

It takes a paradigm shift — which, according to Merriam-Webster, is “an important change that happens when the usual way of thinking about or doing something is replaced by a new and different way.” Physicist and philosopher Thomas Kuhn coined the term in a work that was itself a paradigm shift in how we view the dynamics of change.

“The Kuhn Cycle is a simple cycle of progress described by Thomas Kuhn in 1962 in his seminal work The Structure of Scientific Revolutions… Kuhn challenged the world’s current conception of science, which was that it was a steady progression of the accumulation of new ideas. In a brilliant series of reviews of past major scientific advances, Kuhn showed this viewpoint was wrong. Science advanced the most by occasional revolutionary explosions of new knowledge, each revolution triggered by introduction of new ways of thought so large they must be called new paradigms. From Kuhn’s work came the popular use of terms like ‘paradigm,’ ‘paradigm shift,’ and ‘paradigm change.’”

Thwink.org

Our cultural point of view determines what we see and don’t see; it blinds us to new awareness and perspective. That’s why our visions of a “new normal” are often little more than uninspiring extrapolations of the past.[1] Paradigm shifts offer something more compelling: they shock our consciousness so thoroughly that we never see things the same way again; they stun us into abrupt about-faces. Without that, inertia keeps us moving in the direction we’re already going. If we even think of change, cognitive dissonance makes things uncomfortable, and if we go ahead with it anyway, things can get nasty in a hurry.

“People and systems resist change. They change only when forced to or when the change offers a strong advantage. If a person or system is biased toward its present paradigm, then a new paradigm is seen as inferior, even though it may be better. This bias can run so deep that two paradigms are incommensurate. They are incomparable because each side uses their own paradigm’s rules to judge the other paradigm. People talk past each other. Each side can ‘prove’ their paradigm is better.

“Writing in his chapter on The Resolution of Revolutions, Thomas Kuhn states that:

‘If there were but one set of scientific problems, one world within which to work on them, and one set of standards for their solution, paradigm competition might be settled more or less routinely by some process like counting the number of problems solved by each.

‘But in fact these conditions are never met. The proponents of competing paradigms are always at least slightly at cross-purposes. Neither side will grant all the non-empirical assumptions that the other needs in order to make its case.

‘Though each may hope to convert the other to his way of seeing his science and its problems, neither may hope to prove his case. The competition between paradigms is not the sort of battle that can be solved by proofs.’”

Thwink.org

What does it take to detonate a logjam-busting “revolutionary explosion of new knowledge”? Three possibilities:

The Element of Surprise.[2] We’re not talking “Oh that’s nice!” surprise. We’re talking blinding-flash-of-inspiration surprise — a eureka moment, moment of truth, defining moment — that changes everything forever, in a moment, in the twinkling of an eye. In religious terms, this is St. Paul’s conversion on the Damascus Road or St. Peter’s vision of extending the gospel to the gentiles. In those moments, both men became future makers, not future takers, embodying the view of another scientist and philosopher:

“The best way to predict the future is to create it.”[3]

A New Generation.  Without the element of surprise, paradigm shifts take a long time, if they happen at all.

“A new scientific truth does not triumph by convincing its opponents
and making them see the light, but rather because its opponents eventually die,
and a new generation grows up that is familiar with it.”[4]

In religious terms, that’s why the Exodus generation had to die off in 40 years in the wilderness, leaving a new generation for whom Moses’ new paradigm was the only one they’d ever known.

Violence.  Or, if the new paradigm’s champions can’t wait, they can resort to violence, brutality, persecution, war… the kinds of power-grabbing that have long polluted religion’s proselytizing legacy.

Surprise, death, violence… three ways to bring about a paradigm shift. That’s true in religion, science, or any other cultural institution.

More next time.

[1] Carl Richards, “There’s No Such Thing as the New Normal,” New York Times (December 20, 2010).

[2] Carl Richards, op. cit.

[3] The quote has been ascribed to a lot of different people, including Peter Drucker and computer scientist Alan Kay. But according to the Quote Investigator, “The earliest evidence appeared in 1963 in the book ‘Inventing the Future’ written by Dennis Gabor who was later awarded a Nobel Prize in Physics for his work in holography.”

[4] Max Planck, founder of quantum theory, in his Scientific Autobiography and Other Papers.

Why Faith Endures

Jesus replied, “No one who puts a hand to the plow and looks back
is fit for service in the kingdom of God.”

Luke 9:62 NIV

I once told a leader of our campus Christian fellowship about doubts prompted by my religion major classes. “Get your Bible and read Luke 9:62,” he said. I did, and can still see the hardness on his face when I looked up. Religions venerate those who long endure, honoring their moral steadfastness. My character and commitment were suspect. I declared a new major the following quarter.

[Image: Scarlet letter]

Religions punish doubt and dissidence through peer pressure, public censure, witch hunts, inquisitions, executions, jihads, war, genocide…. The year before, the dining halls had flown into an uproar the day the college newspaper reported that the fellowship had expelled a member for sleeping with her boyfriend.

Religions also have a curious way of tolerating their leaders’ nonconforming behavior — even as the leaders cry witch hunt.[1]

These things happen in all cultural institutions, not just religion. Neuroculture offers an explanation for all of them that emphasizes group dynamics over individual integrity. It goes like this:

  • When enough people believe something, a culture with a shared belief system emerges.
  • Individual doubt about the culture’s belief system introduces “cognitive dissonance” that makes individuals uneasy and threatens cultural cohesiveness.
  • Cohesiveness is essential to the group’s survival — doubt and nonconformity can’t be tolerated.
  • The culture therefore sanctifies belief and stifles doubt.
  • The culture sometimes bends its own rules to preserve its leadership power structure against larger threats.

“This Article Won’t Change Your Mind,” The Atlantic (March 2017), illustrates this process:

“The theory of cognitive dissonance—the extreme discomfort of simultaneously holding two thoughts that are in conflict—was developed by the social psychologist Leon Festinger in the 1950s. In a famous study, Festinger and his colleagues embedded themselves with a doomsday prophet named Dorothy Martin and her cult of followers who believed that spacemen called the Guardians were coming to collect them in flying saucers, to save them from a coming flood. Needless to say, no spacemen (and no flood) ever came, but Martin just kept revising her predictions. Sure, the spacemen didn’t show up today, but they were sure to come tomorrow, and so on. The researchers watched with fascination as the believers kept on believing, despite all the evidence that they were wrong.

“‘A man with a conviction is a hard man to change,’ Festinger, Henry Riecken, and Stanley Schacter wrote in When Prophecy Fails, their 1957 book about this study. ‘Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point … Suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before.’

“This doubling down in the face of conflicting evidence is a way of reducing the discomfort of dissonance, and is part of a set of behaviors known in the psychology literature as ‘motivated reasoning.’ Motivated reasoning is how people convince themselves or remain convinced of what they want to believe—they seek out agreeable information and learn it more easily; and they avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs.

“Though false beliefs are held by individuals, they are in many ways a social phenomenon. Dorothy Martin’s followers held onto their belief that the spacemen were coming … because those beliefs were tethered to a group they belonged to, a group that was deeply important to their lives and their sense of self.

“[A disciple who ignored mounting evidence of sexual abuse by his guru] describes the motivated reasoning that happens in these groups: ‘You’re in a position of defending your choices no matter what information is presented,’ he says, ‘because if you don’t, it means that you lose your membership in this group that’s become so important to you.’ Though cults are an intense example, … people act the same way with regard to their families or other groups that are important to them.”

“Why Facts Don’t Change Our Minds,” The New Yorker (Feb. 27, 2017), explains why the process seems so perfectly reasonable:

“Humans’ biggest advantage over other species is our ability to coöperate. Coöperation is difficult to establish and almost as difficult to sustain.

“Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.

“‘Reason is an adaptation to the hypersocial niche humans have evolved for themselves,’ [the authors of a seminal study] write. Habits of mind that seem weird or goofy or just plain dumb from an ‘intellectualist’ point of view prove shrewd when seen from a social ‘interactionist’ perspective.”

What does it take for individual dissent or cultural change to prevail in the face of these powerful dynamics? We’ll look at that next time.

[1]  This “bigger bully” theory was remarkably evident when Tony Perkins, leader of the Family Research Council, said evangelicals “kind of gave [Donald Trump] a mulligan” over Stormy Daniels, saying that evangelicals “were tired of being kicked around by Barack Obama and his leftists. And I think they are finally glad that there’s somebody on the playground that’s willing to punch the bully.”

Why Belief Works

Our experience of the “real world” will conform to what we believe. It has to, because our brains insist upon it.

They do that in part through neuro-cultural conditioning — the process by which the neurological wiring of a culture’s individual members is patterned after the culture’s belief system, and vice versa. This is the case with any kind of cultural institution, whether national, religious, scientific, economic, corporate, professional, team, tribal, or otherwise.[1] This post looks at religion as an example.[2]

Tim Crane is a professor of philosophy at the Central European University in Budapest. “I work in the philosophy of mind,” his online CV says. “I have attempted to address questions about the most general nature, or essence, of the human mind, and about the place of the mind in the rest of nature.” In his book The Meaning of Belief: Religion From An Atheist’s Point Of View (2017), he cites William James’ 1902 classic The Varieties of Religious Experience for a definition of what he calls “the religious impulse”:

“Were one asked to characterize the life of religion in the broadest and most general terms, one might say that it consists in the belief that there is an unseen order, and that our supreme good lies in harmoniously adjusting ourselves thereto.”

Christian Smith is a sociology professor and director of the Center for the Study of Religion and Society at the University of Notre Dame. Here’s his definition of religion:

“Religion is a complex of culturally prescribed practices, based on promises about the existence and nature of supernatural powers, whether personal or impersonal, which seek to help practitioners gain access to and communicate or align themselves with these powers, in hopes of realizing human goods and avoiding things bad.”

Religion: What It Is, How It Works, And Why It Matters (Princeton University Press, 2017)

Both authors stress that religious principles and practices need to match in order for religion to be effective. In other words:

“Faith without works is dead.”
The Epistle of James 2:17

As it turns out, “faith without works is dead” is not just scripture, but accurate neuroscience as well. When we practice what we preach, we set up a self-sustaining loop in which belief drives thoughts and behavior, which in turn reinforce belief. In that way, religion develops the brain while the brain develops religion:

“Jordan Grafman, head of the cognitive neuroscience laboratory at Rehabilitation Institute of Chicago and neurology professor at Northwestern University, says that neurotheology is important in part because early religious practices helped develop our brains to begin with. ‘Religion has played an incredibly important role in human evolution. It’s funny, people want to separate the two but in fact they’re intertwined.’”

“The Neuroscience Argument That Religion Shaped The Very Structure Of Our Brains,” Quartz (December 3, 2016)

The more widespread and enduring the religious practice, the more the religion develops scriptures, rituals, icons, and institutions to sustain itself. Therefore a Bible passage such as this…

“I was young and now I am old,
yet I have never seen the righteous forsaken
 or their children begging bread.”
Psalm 37:25 NIV

… becomes both community truth and the “testimony” of individual adherents. But what happens when belief and experience don’t align — e.g., when a member of the congregation and her children in fact go begging?

Some religious thinkers, like the writer of this Huffington Post article, reckon with the contradiction by distinguishing belief from faith. Beliefs are products of the mind, she says, and deal with what can be known, while faith is a product of the spirit, which traffics in what cannot be known. Since knowledge is always shifting, belief can and probably will let us down, while faith in what can’t be known remains inscrutable. Faith therefore invites belief to step aside in favor of “trusting beyond all reason and evidence.”

That outlook captures the essential center of the definitions of religion we saw above: that there is an “unseen order” populated with “supernatural powers” that exists alongside but separate from ours. (Of which we have only limited understanding, the belief/faith outlook would add.) Whether this satisfies the brain’s need to align internal patterning with external experience is the kind of issue being taken up by the new discipline of neurotheology, which looks at where religion happens in the brain.

Neurotheology’s inquiries have far-reaching implications for many of our common assumptions about how reality is structured. For example, if faith can be explained in neurological terms, then it could be located — in whole or in part — along with belief on this side of the theoretical divide between human and supernatural existence. This shift would likely have a ripple effect on similar dichotomies, such as known vs. unknown, real vs. imaginary, objective vs. subjective, observed vs. inscrutable, temporal vs. transcendent, etc.

More on neurotheology coming up.

[1] For more on cultural patterning, see the other posts in this blog’s category The Basics of Belief, Culture, and Reality.

[2] I talk about Christianity because it is the only religion I have personal experience with. And I am aware, by the way, that I write this post under the influence of my own neuroscientific cultural bias.