The Hostilities of Change:  Surprise, Death, and War

Storming of the Bastille

“Ideas that require people to reorganize their picture of the world provoke hostility.”

Science historian James Gleick,
in his bestseller Chaos:  The Making of a New Science

We looked last time at neuro-cultural resistance to change, and asked what it takes to overcome it.

It takes a paradigm shift — which, according to Merriam-Webster, is “an important change that happens when the usual way of thinking about or doing something is replaced by a new and different way.” Physicist and philosopher Thomas Kuhn coined the term in a work that was itself a paradigm shift in how we view the dynamics of change.

“The Kuhn Cycle is a simple cycle of progress described by Thomas Kuhn in 1962 in his seminal work The Structure of Scientific Revolutions… Kuhn challenged the world’s current conception of science, which was that it was a steady progression of the accumulation of new ideas. In a brilliant series of reviews of past major scientific advances, Kuhn showed this viewpoint was wrong. Science advanced the most by occasional revolutionary explosions of new knowledge, each revolution triggered by introduction of new ways of thought so large they must be called new paradigms. From Kuhn’s work came the popular use of terms like ‘paradigm,’ ‘paradigm shift,’ and ‘paradigm change.’”

Our cultural point of view determines what we see and don’t see, blinding us to new awareness and perspectives. That’s why our visions of a “new normal” are often little more than uninspiring extrapolations of the past.[1] Paradigm shifts offer something more compelling:  they shock our consciousness so thoroughly that we never see things the same again; they stun us into abrupt about-faces. Without that shock, inertia keeps us moving in the direction we’re already going. If we even think of change, cognitive dissonance makes things uncomfortable, and if we go ahead with it anyway, things can get nasty in a hurry.

“People and systems resist change. They change only when forced to or when the change offers a strong advantage. If a person or system is biased toward its present paradigm, then a new paradigm is seen as inferior, even though it may be better. This bias can run so deep that two paradigms are incommensurate. They are incomparable because each side uses their own paradigm’s rules to judge the other paradigm. People talk past each other. Each side can ‘prove’ their paradigm is better.

“Writing in his chapter on The Resolution of Revolutions, Thomas Kuhn states that:

‘If there were but one set of scientific problems, one world within which to work on them, and one set of standards for their solution, paradigm competition might be settled more or less routinely by some process like counting the number of problems solved by each.

‘But in fact these conditions are never met. The proponents of competing paradigms are always at least slightly at cross-purposes. Neither side will grant all the non-empirical assumptions that the other needs in order to make its case.

‘Though each may hope to convert the other to his way of seeing his science and its problems, neither may hope to prove his case. The competition between paradigms is not the sort of battle that can be solved by proofs.’”

What does it take to detonate a logjam-busting “revolutionary explosion of new knowledge”? Three possibilities:

The Element of Surprise. [2]  We’re not talking “Oh that’s nice!” surprise. We’re talking blinding flash of inspiration surprise — a eureka moment, moment of truth, defining moment — that changes everything forever, in a moment, in the twinkling of an eye. In religious terms, this is St. Paul’s conversion on the Damascus Road or St. Peter’s vision of extending the gospel to the gentiles. In those moments, both men became future makers, not future takers, embodying the view of another scientist and philosopher:

“The best way to predict the future is to create it.”[3]

A New Generation.  Without the element of surprise, paradigm shifts take a long time, if they happen at all.

“A new scientific truth does not triumph by convincing its opponents
and making them see the light, but rather because its opponents eventually die,
and a new generation grows up that is familiar with it.”[4]

In religious terms, that’s why the Exodus generation had to die off in 40 years in the wilderness, leaving a new generation for whom Moses’ new paradigm was the only one they’d ever known.

Violence.  Or, if the new paradigm’s champions can’t wait, they can resort to violence, brutality, persecution, war… the kinds of power-grabbing that have long polluted religion’s proselytizing legacy.

Surprise, death, violence… three ways to bring about a paradigm shift. That’s true in religion, science, or any other cultural institution.

More next time.

[1] Carl Richards, “There’s No Such Thing as the New Normal,” New York Times (December 20, 2010).

[2] Carl Richards, op. cit.

[3] The quote has been ascribed to a lot of different people, including Peter Drucker and computer scientist Alan Kay. But according to the Quote Investigator, “The earliest evidence appeared in 1963 in the book ‘Inventing the Future’ written by Dennis Gabor who was later awarded a Nobel Prize in Physics for his work in holography.”

[4] Max Planck, founder of quantum theory, in his Scientific Autobiography and Other Papers.

Religion on Demand

The God Helmet

“Given that the neurological roots of religious experiences can be traced so accurately with the help of the latest neuroscientific technologies, does this mean that we could — in principle — ‘create’ these experiences on demand?”[1]

It’s a good question. And so is the obvious follow-up: if technology can create religious experience on demand, how does that affect religion’s claims to authenticity and its status as a cultural institution?

Dr. Michael Persinger[2] created the “God Helmet” (shown in the photo above, taken from this article) for use in neuro-religious research.

The device simulates religious experiences by stimulating an individual’s temporoparietal lobes with magnetic fields. “If the equipment and the experiment produced the presence that was God, then the extrapersonal, unreachable, and independent characteristics of the god definition might be challenged,” [says Dr. Persinger].[3]

The experiences created are not doctrinally specific, but are of a kind widely shared among different religions — for example, sensing a numinous presence, a feeling of being filled with the spirit or overwhelmed or possessed, of being outside of self, out of body, or having died and come back to life, feelings of being one with all things or of peace, awe, fear and dread, etc. All of these states have been measured or induced in the laboratory[4]:

Some recent advances in neuroimaging techniques allow us to understand how our brains ‘create’ a spiritual or mystical experience. What causes the feeling that someone else is present in the room, or that we’ve stepped outside of our bodies and into another dimension?

“In the last few years,” says [Dr. Jeff Anderson of the University of Utah School of Medicine in Salt Lake City], “brain imaging technologies have matured in ways that are letting us approach questions that have been around for millennia.”

Prof. James Giordano, from the Georgetown University Medical Center in Washington, D.C., [says that] “We are able to even understand when a person gets into ‘ecstasy mode’ … and to identify specific brain areas that participate in this process.”

“If ‘beings’ join the mystical experience,” Prof. Giordano goes on, “we can say that the activity of the left and right temporal lobe network (found at the bottom middle part of the cortex) has changed.”

 “When activity in the networks of the superior parietal cortex [which is a region in the upper part of the parietal lobe] or our prefrontal cortex increases or decreases, our bodily boundaries change,” Prof. Giordano explains in an interview for Medium. “These parts of the brain control our sense of self in relation to other objects in the world, as well as our bodily integrity; hence the ‘out of body’ and ‘extended self’ sensations and perceptions many people who have had mystical experiences confess to.”

The parietal lobes are also the areas that [Neuroscientist Andrew Newberg, a pioneer of neurotheology, has] found to have lower brain activity during prayer.

And much more. Research has also helped to explain why people with chronic neurodegenerative diseases often lose their religion:

“We discovered a subgroup who were quite religious but, as the disease progressed, lost some aspects of their religiosity,” [says Patrick McNamara, professor of neurology at Boston University and author of The Neuroscience of Religious Experience (2009)]. Sufferers’ brains lack the neurotransmitter dopamine, making McNamara suspect that religiosity is linked to dopamine activity in the prefrontal lobes. “These areas of the brain handle complexity best, so it may be that people with Parkinson’s find it harder to access complex religious experiences.”

Does this research signal the end of religion any time soon? Probably not, says Dr. Newberg:

Until we gain such answers, however, religion is unlikely to go anywhere. The architecture of our brains won’t allow it, says Dr. Newberg, and religion fulfills needs that our brains are designed to have.[5]

Tim Crane, author of The Meaning of Belief: Religion From An Atheist’s Point Of View (2017), agrees:  religion, he says, is simply “too ingrained as a human instinct.” See also the analysis in this article[6] of the current state of the science vs. religion contention, which concludes that the scale seems to be tipping toward the latter:

Religion is not going away any time soon, and science will not destroy it. If anything, it is science that is subject to increasing threats to its authority and social legitimacy.

There are plenty of contrary opinions, of course, and all the technology and research in the world is unlikely to change anybody’s mind, pro or con. We’ll look at why not next time.

[1] “What Religion Does To Your Brain,” Medical News Today (July 20, 2018).

[2] Dr. Persinger was director of the Neuroscience Department at Laurentian University in Ontario, Canada, prior to his death in 2018.

[3] “What God Does To Your Brain:  The controversial science of neurotheology aims to find the answer to an age-old question: why do we believe?” The Telegraph (June 20, 2014).

[4] “What Religion Does To Your Brain,” Medical News Today (July 20, 2018).

[5] Andrew Newberg, Eugene D’Aquili, and Vince Rause, Why God Won’t Go Away: Brain Science and the Biology of Belief (2001).

[6] “Why Religion Is Not Going Away And Science Will Not Destroy It,” Aeon Magazine (Sept. 7, 2017).

Why Belief Works

Our experience of the “real world” will conform to what we believe. It has to, because our brains insist upon it.

They do that in part through neuro-cultural conditioning — the process by which the neurological wiring of a culture’s individual members is patterned after the culture’s belief system, and vice versa. This is the case with any kind of cultural institution, whether national, religious, scientific, economic, corporate, professional, team, tribal, or otherwise.[1] This post looks at religion as an example.[2]

Tim Crane is a professor of philosophy at the Central European University in Budapest. “I work in the philosophy of mind,” his online CV says, “I have attempted to address questions about the most general nature, or essence, of the human mind, and about the place of the mind in the rest of nature.” In his book The Meaning of Belief: Religion From An Atheist’s Point Of View (2017), he cites William James’ 1902 classic The Varieties of Religious Experience for a definition of what he calls “the religious impulse”:

“Were one asked to characterize the life of religion in the broadest and most general terms, one might say that it consists in the belief that there is an unseen order, and that our supreme good lies in harmoniously adjusting ourselves thereto.”

Christian Smith is a sociology professor and director of the Center for the Study of Religion and Society at the University of Notre Dame. Here’s his definition of religion:

“Religion is a complex of culturally prescribed practices, based on promises about the existence and nature of supernatural powers, whether personal or impersonal, which seek to help practitioners gain access to and communicate or align themselves with these powers, in hopes of realizing human goods and avoiding things bad.”

Religion: What It Is, How It Works, And Why It Matters (Princeton University Press, 2017)

Both authors stress that religious principles and practices need to match in order for religion to be effective. In other words:

“Faith without works is dead.”
The Epistle of James 2:17

As it turns out, “faith without works is dead” is not just scripture, but accurate neuroscience as well. When we practice what we preach, we set up a self-sustaining loop in which belief drives thoughts and behavior, which in turn reinforce belief. In that way, religion develops the brain while the brain develops religion:

“Jordan Grafman, head of the cognitive neuroscience laboratory at Rehabilitation Institute of Chicago and neurology professor at Northwestern University, says that neurotheology is important in part because early religious practices helped develop our brains to begin with. ‘Religion has played an incredibly important role in human evolution. It’s funny, people want to separate the two but in fact they’re intertwined.’”

“The Neuroscience Argument That Religion Shaped The Very Structure Of Our Brains,” Quartz (December 3, 2016)

The more widespread and enduring the religious practice, the more the religion develops scriptures, rituals, icons, and institutions to sustain itself. Therefore a Bible passage such as this…

“I was young and now I am old,
yet I have never seen the righteous forsaken
 or their children begging bread.”
Psalm 37:25 NIV

… becomes both community truth and the “testimony” of individual adherents. But what happens when belief and experience don’t align — e.g., when a member of the congregation and her children in fact go begging?

Some religious thinkers, like the writer of this Huffington Post article, reckon with the contradiction by distinguishing belief from faith. Beliefs are products of the mind, she says, and deal with what can be known, while faith is a product of the spirit, which traffics in what cannot be known. Since knowledge is always shifting, belief can and probably will let us down, while faith in what can’t be known remains inscrutable. Faith therefore invites belief to step aside in favor of “trusting beyond all reason and evidence.”

That outlook captures the essential center of the definitions of religion we saw above:  that there is an “unseen order” populated with “supernatural powers” that exists alongside but separate from ours. (Of which we have only limited understanding, the belief/faith outlook would add.)  Whether this satisfies the brain’s need to align internal patterning with external experience is the kind of issue being taken up by the new discipline of neurotheology, which looks at where religion happens in the brain.

Neurotheology’s inquiries have far-reaching implications for many of our common assumptions about how reality is structured. For example, if faith can be explained in neurological terms, then it could be located — in whole or in part — along with belief on this side of the theoretical divide between human and supernatural existence. This shift would likely have a ripple effect on similar dichotomies, such as known vs. unknown, real vs. imaginary, objective vs. subjective, observed vs. inscrutable, temporal vs. transcendent, etc.

More on neurotheology coming up.

[1] For more on cultural patterning, see the other posts in this blog’s category The Basics of Belief, Culture, and Reality.

[2] I talk about Christianity because it is the only religion I have personal experience with. And I am aware, by the way, that I write this post under the influence of my own neuroscientific cultural bias.

Moral Compass:  How We Know Right From Wrong


Our brains are amoral. They need cultural context to give them a moral compass.

Real and Imaginary

It’s a staple of self-help advice and sports and performance psychology that our brains don’t know the difference between real and imagined, and that we can therefore trick them into getting us what we want. There’s good science to back this up, although recent research suggests that the brain actually does know the difference — that it has specific neurons for that purpose (Science Daily). Plus, although real and imagined input run over the same neural pathway, they move in opposite directions:  input from the outside world runs bottom-up — from lower-level sensory to higher-level cognitive processing — while imagined input runs top-down (Psychology Today, Knowledge Nuts).

Getting Into Our Bodies

Not that Harold Hill’s “think system” is enough — we still need to practice and rehearse effectively. We need to get our bodies involved. We’re out there in the “real world” taking in sensory input and interacting with people, things, and experiences; meanwhile we’re imagining things, throwing in doses of speculation, and making things up. Our brains and bodies need to work together to ground this swirl of information. This article[1] explains how they do that:

“When considering the senses, we tend to think of sight and sound, taste, touch and smell. However, these are classified as exteroceptive senses, that is, they tell us something about the outside world. In contrast, interoception is a sense that informs us about our internal bodily sensations, such as the pounding of our heart, the flutter of butterflies in our stomach or feelings of hunger.

“The brain represents, integrates and prioritises interoceptive information from the internal body. These are communicated through a set of distinct neural and humoural (i.e., blood-borne) pathways. This sensing of internal states of the body is part of the interplay between body and brain: it maintains homeostasis, the physiological stability necessary for survival; it provides key motivational drivers such as hunger and thirst; it explicitly represents bodily sensations, such as bladder distension. But that is not all, and herein lies the beauty of interoception, as our feelings, thoughts and perceptions are also influenced by the dynamic interaction between body and brain.”

University of Sussex cognitive neuroscientist Anil Seth lays all this out in his TED2017 talk “Your Brain Hallucinates Your Conscious Reality.”


“When we agree about our hallucinations,” he says, “we call that reality.” Those agreements blend external (outside world) and internal (imagined) input into shared belief about what the real world is, and how it works. They also add another key cultural component:  a sense of right and wrong.

Why We Need a Moral Compass, and Where We Get It

Humans need community to survive. Community, in turn, needs a shared behavioral code. Our brains are flexible and amoral on issues of right and wrong — they take their cues from cultural context. Cultural moral coding is therefore evolutionary — motivated by the survival instinct.[2] All of that goes a long way toward explaining why activities honored by one group are despicable to another, and why, when confronted with those differences, each group’s first instinct is to point fingers.

This article reviews three prominent books[3] supporting culturally based morality, and concludes as follows:

“…one must come to the conclusion that inside human beings, as Gazzaniga says, ‘there is a moral compass.’ But ‘we have to be smart enough to figure out how it works.’ Across the realm of human experience—personal, collective, historical, and now neuroscientific—it is abundantly clear that we have the capacity to consciously consider consequences and choose our actions… The mind is a physio-spiritual mechanism built for choice, but it must be given direction. We may be endowed with a moral compass, but it does not arrive with prewired direction. Moral calibration is required.”

The article’s source is “the Church of God, an international community,” which according to its website is “a nondenominational organization based in Pasadena, California [which] traces its antecedents to Sabbatarian communities in 17th-century Europe, and before that to the first-century apostolic Church at Jerusalem.” Its tool of choice for the brain’s “moral calibration” is the Bible:

“The Bible, too, is unequivocal in the need for [moral calibration] (see, for example, Proverbs 3:31 and Job 34:2–4), adding that there is a spiritual factor responsible for imparting this ability to the human mind (Job 32:8–9)… The Bible serves as the lodestone that sets our compass’s orientation and helps us establish our moral bearings.”

But of course the Church of God didn’t write the article — an individual or collaboration of individuals wrote it, in furtherance of the Church’s culture and institutional belief system. It’s not surprising that the Bible was its cultural choice for moral calibration. Another culture might have chosen a different tool — Mein Kampf, for instance, or the ISIS Manifesto.

The article closes with reservations about the three authors’ neuro-cultural approach to morality:

“As secularists, of course, these authors cannot be expected to pursue [the Bible] in their search for the source of moral standards, especially when, as Gazzaniga notes, so much of what constitutes religious faith is founded on superstition rather than on truth. And so, as researchers improve drug cocktails to ultimately manipulate and control the brain (as Tancredi believes they will), and as society haltingly accepts science as arbiter of good and evil (as Gazzaniga believes it must), it is not too farfetched to imagine that the moral grammar Hauser describes can be refashioned as well. In fact, if history provides any clue, it seems a done deal. The only question that remains is whether our ongoing recalibrations will be for the better or for the worse.”

Yes — whether “for the better or for the worse” remains to be seen. But according to whose cultural point of view?

[1] “How The Body And Mind Talk To One Another To Understand The World,” Aeon Magazine (Feb. 15, 2019).

[2] Here’s a nice primer on this concept. And here’s another, maybe more grownup version.

[3] Hardwired Behavior:  What Neuroscience Reveals About Morality, by Laurence Tancredi, Clinical Professor of Psychiatry at New York University School of Medicine, a psychiatrist in private practice, and a lawyer who consults on criminal cases involving psychiatric issues. The Ethical Brain:  The Science of Our Moral Dilemmas, by Michael S. Gazzaniga, Professor of Psychology and Director of the SAGE Center for the Study of the Mind at the University of California, Santa Barbara. Moral Minds:  How Nature Designed Our Universal Sense of Right and Wrong, by Marc D. Hauser, Professor of Psychology, Organismic and Evolutionary Biology, and Biological Anthropology at Harvard University, where he is director of the Cognitive Evolution Laboratory and co-director of the Mind, Brain and Behavior Program.

“Be the Change You Want to See” — Why Change MUST Always Begin With Us


In the beginning, somebody…

Told a story. Made something. Made something that made things. Drew a picture. Used their voice melodiously. Moved a certain way and did it again. Took something apart, put it back together, and built another thing like it. Watched how weather and sky and flora and fauna responded to the passage of time. Sprinkled dry leaves on meat and ate it. Drew a line in the sand and beat someone who crossed it. Traded this for that. Resolved a dispute. Helped a sick person feel better. Took something shiny from the earth or sea and wore it. Had an uncanny experience and explained it.

And then somebody else did, too — and then somebody else after that, and more somebodies after that, until the human race had organized itself into families, clans, tribes, city-states, and nations, each with its own take on life in this world. Millennia later a worldwide civilization had emerged, organized around trans-cultural institutions of law, economics, science, religion, industry, commerce, education, medicine, arts and entertainment….

And then you and I were born as new members of a highly evolved human culture of innumerable, impossibly complex, interwoven layers.

From our first breaths we were integrated into site-specific cultural institutions that informed our beliefs about how the world works and our place in it. Those institutions weren’t external to us, they were embodied in us — microbes of meaning lodged in our neural pathways and physical biome. Our brains formed around the beliefs of our culture — our neurons drank them in, and our neural networks were wired up with the necessary assumptions, logic, and leaps of faith.

These cellular structures informed what it meant for us to be alive on the Earth, individually and in community. They shaped our observations and awareness, experiences and interpretations, tastes and sensibilities. They defined what is real and imaginary, set limits around what is true and false, acceptable and taboo. And then they reinforced the rightness of it all with feelings of place and belonging, usefulness and meaning. When that was done, our brains and bodies were overlaid with a foundation for status quo — the way things are, and are supposed to be.

All that happened in an astonishing surge of childhood development. Then came puberty, when our brain and body hormones blasted into overdrive, dredging up our genetic and environmental beginnings and parading them out for reexamination. We kept this and discarded that, activated these genes instead of those. (The process by which we do that is called epigenetics, and it explains why your kids aren’t like you.) We also tried on countercultural beliefs, welcoming some and rejecting others. From there, we entered adult life freshly realigned with a differentiated sense of self, us, and them.

Adult life mostly reinforces our cultural beginnings, although the nuisances and opportunities of change periodically require us to remake and reaffirm shared agreements in our communities, professions, workplaces, teams, and other groups, each time refining our shared cultural foundations. In doing so, we sometimes flow with the changing times, and sometimes retrench with nostalgic fervor.

Where does all this biological, cognitive, and social development and maintenance happen? In the only place it possibly could:  in the hot wet darkness inside the human body’s largest organ — our skin. Yes, there is a “real world” out there that we engage with, but the processing and storing of experience happen inside — encoded in our brains and bodies.

Which is why individual and cultural change must always begin with us — literally inside of us, in our physical makeup — because that’s where our world and our experience of it are registered and maintained. Gandhi’s famous words are more than a catchy meme; they describe basic human reality:  if we want things to change, then we must be transformed. Think about it:  we have no belief, perception, experience, or concept of status quo that is not somehow registered in our brains and bodies, so where else could change happen? (Unless there’s something like a humanCloud where it can be uploaded and downloaded — but that’s another issue for another time.)

The implications of locating human experience in our physical selves are far-reaching and fascinating. We’ll be exploring them.

#icons #iconoclast #psychology #philosophy #sociology #neurology #biology #narrative #belief #society #socialstudies #religion #law #economics #work #jobs #science #industry #commerce #education #medicine #arts #entertainment #civilization #evolution #perception #reality #subjective #culture #culturalchange #change #paradigmshift #transformation #growth #personalgrowth #futurism #technology #identity #rational #consciousness #cognition #bias #cognitivebias #brain #development #childdevelopment #puberty #adolescence #hormones #genetics #epigenetics #gandhi #bethechange #bethechangeyouwant #neurons #neuralnetworks