Jesus replied, “No one who puts a hand to the plow and looks back
is fit for service in the kingdom of God.”
Luke 9:62 NIV
I once told a leader of our campus Christian fellowship about doubts prompted by my religion major classes. “Get your Bible and read Luke 9:62,” he said. I did, and can still see the hardness on his face when I looked up. Religions venerate those who long endure, honoring their moral steadfastness. My character and commitment were suspect. I declared a new major the following quarter.
Religions punish doubt and dissidence through peer pressure, public censure, witch hunts, inquisitions, executions, jihads, war, genocide…. The year before, the dining halls had flown into an uproar the day the college newspaper reported that the fellowship had expelled a member for sleeping with her boyfriend.
Religions also have a curious way of tolerating their leaders’ nonconforming behavior — even as the leaders cry witch hunt.
These things happen in all cultural institutions, not just religion. Neuroculture offers an explanation for all of them that emphasizes group dynamics over individual integrity. It goes like this:
- When enough people believe something, a culture with a shared belief system emerges.
- Individual doubt about the culture’s belief system introduces “cognitive dissonance” that makes individuals uneasy and threatens cultural cohesiveness.
- Cohesiveness is essential to the group’s survival — doubt and nonconformity can’t be tolerated.
- The culture therefore sanctifies belief and stifles doubt.
- The culture sometimes bends its own rules to preserve its leadership power structure against larger threats.
“This Article Won’t Change Your Mind,” The Atlantic (March 2017) illustrates this process:
“The theory of cognitive dissonance—the extreme discomfort of simultaneously holding two thoughts that are in conflict—was developed by the social psychologist Leon Festinger in the 1950s. In a famous study, Festinger and his colleagues embedded themselves with a doomsday prophet named Dorothy Martin and her cult of followers who believed that spacemen called the Guardians were coming to collect them in flying saucers, to save them from a coming flood. Needless to say, no spacemen (and no flood) ever came, but Martin just kept revising her predictions. Sure, the spacemen didn’t show up today, but they were sure to come tomorrow, and so on. The researchers watched with fascination as the believers kept on believing, despite all the evidence that they were wrong.
“‘A man with a conviction is a hard man to change,’ Festinger, Henry Riecken, and Stanley Schachter wrote in When Prophecy Fails, their 1957 book about this study. ‘Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point … Suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before.’
“This doubling down in the face of conflicting evidence is a way of reducing the discomfort of dissonance, and is part of a set of behaviors known in the psychology literature as ‘motivated reasoning.’ Motivated reasoning is how people convince themselves or remain convinced of what they want to believe—they seek out agreeable information and learn it more easily; and they avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs.
“Though false beliefs are held by individuals, they are in many ways a social phenomenon. Dorothy Martin’s followers held onto their belief that the spacemen were coming … because those beliefs were tethered to a group they belonged to, a group that was deeply important to their lives and their sense of self.
“[A disciple who ignored mounting evidence of sexual abuse by his guru] describes the motivated reasoning that happens in these groups: ‘You’re in a position of defending your choices no matter what information is presented,’ he says, ‘because if you don’t, it means that you lose your membership in this group that’s become so important to you.’ Though cults are an intense example, … people act the same way with regard to their families or other groups that are important to them.”
“Why Facts Don’t Change Our Minds,” The New Yorker (Feb. 27, 2017) explains why the process seems so perfectly reasonable:
“Humans’ biggest advantage over other species is our ability to coöperate. Coöperation is difficult to establish and almost as difficult to sustain.
“Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.
“‘Reason is an adaptation to the hypersocial niche humans have evolved for themselves,’ [the authors of a seminal study] write. Habits of mind that seem weird or goofy or just plain dumb from an ‘intellectualist’ point of view prove shrewd when seen from a social ‘interactionist’ perspective.”
What does it take for individual dissent or cultural change to prevail in the face of these powerful dynamics? We’ll look at that next time.
This rule-bending dynamic — embracing a “bigger bully” to fight off larger threats — was remarkably evident when Tony Perkins, leader of the Family Research Council, said evangelicals “kind of gave [Donald Trump] a mulligan” over Stormy Daniels because they “were tired of being kicked around by Barack Obama and his leftists. And I think they are finally glad that there’s somebody on the playground that’s willing to punch the bully.”