Knowledge, Conviction, and Belief [3]

Janus

We’ve been talking about dualistic thinking — the kind that leads us to think we live simultaneously in two realities.

Reality A is “life in the flesh” — bound by space and time and all the imperfections of what it means to be human. It is life carried on in our physical bodies, where our impressive but ultimately limited brains are in charge.

Reality B is “life in the spirit” — the eternal, perfect, transcendent, idealized, supernatural, original source that informs, explains, and guides its poorer counterpart.

This dualistic thinking says there’s more to life than meets the eye, that humans are an “eternal soul having a worldly existence.” The dualism sets up a cascade of derivative beliefs, for example:

There’s a difference between the Reality A identity and experience we actually have and the Reality B identity and experience we would have if we could rise above Reality A and live up to the idealized version of Reality B.

Every now and then, somebody gets lucky or gets saved or called, and gets to live out their Reality B destiny, which gives them and their lives a heightened sense of purpose and meaning.

But those are the chosen few, and they’re rare. For most of us, our ordinary selves and mundane lives are only a shadow of our “higher selves” and “greater potential.”

The chosen few can — and often do — provide guidance as to how we can do better, and we do well to find some compatible relation with one or more of them. But sometimes, in the right setting and circumstance, we might discover that we have receptors of our own that can receive signals from Reality B. We call this “enlightenment” or “conversion” or “salvation” or something like that, and it’s wonderful, blissful, and euphoric.

But most of the time, for the vast majority of us, Reality A is guided by a mostly one-way communication with Reality B — a sort of moment-by-moment data upload from A to B, where everything about us and our lives — every conscious and subconscious intent, motive, thought, word, and deed — gets stored in a failsafe beyond-time data bank. When our Reality A lives end, those records determine what happens next — they inform our next trip through Reality A, or set the stage for a Reality B existence we’re really going to like or really going to suffer.

Everybody pretty much agrees it’s useful to have good communication with or awareness of Reality B, because that helps us live better, truer, happier, more productive lives in Reality A, and because it creates a better data record when our Reality A existence ends and we pass over to Reality B.

And on it goes. No, we don’t express any of it that way: our cultural belief systems and institutions — religious doctrines, moral norms, legal codes, academic fields of study, etc. — offer better-dressed versions. But it’s remarkable how some version of those beliefs finds its way into common notions about how life works.

At the heart of it all is our conviction — not knowledge — that this thing we consciously know as “me” is an independent self that remains intact and apart from the biological messiness of human life, able to choose its own beliefs, make its own decisions, and execute its own actions. In other words, we believe in consciousness, free will, and personal responsibility for what we are and do — and what we aren’t and don’t do — during what is only a sojourn — a short-term stay — on Earth.

Those beliefs explain why, for example, it bothers us so much when someone we thought we knew departs from their beginnings and instead displays a changed inner and outer expression of who they were when we thought we knew them. “Look who’s in the big town,” we say. Or we pity them and knock wood and declare thank goodness we’ve been lucky. Or we put them on the prayer chain or call them before the Inquisition… anything but entertain the idea that maybe Reality B isn’t there — along with all the belief it takes to create it — and that instead all we have is Reality A — we’re nothing but flesh and bone.

It’s almost impossible to think that way. To go there, we have to lay aside conviction and embrace knowledge.

Almost impossible.

Almost.

We’ll give it a try in the coming weeks.

“Before You Were Born I Knew You”

The Summoner in Chaucer’s The Canterbury Tales,
Ellesmere MSS, circa 1400

Last time we looked at the common dualistic paradigm of consciousness, which is based on (a) the belief that humans are made in two parts — an ethereal self housed in a physical body — and (b) the corollary belief that religion and the humanities understand the self best, while science is the proper lens for the body.

Current neuroscience theorizes instead that consciousness arises from brain, body, and environment — all part of the physical, natural world, and therefore best understood by scientific inquiry.

We looked at the origins of the dualistic paradigm last time. This week, we’ll look at an example of how it works in the world of jobs and careers — particularly the notion of being “called” to a “vocation.”

According to the Online Etymology Dictionary, the notion of “calling” entered the English language around Chaucer’s time, originating from Old Norse kalla — “to cry loudly, summon in a loud voice; name, call by name.” Being legally summoned wasn’t a happy thing in Chaucer’s day (it still isn’t), and summoners were generally wicked, corrupt, and otherwise worthy of Chaucer’s pillory in The Friar’s Tale.

“Calling” got an image upgrade a century and a half later, in the 1550s, when the term acquired the connotation of “vocation, profession, trade, occupation.” Meanwhile, “vocation” took on the meaning of “spiritual calling,” from Old French vocacio, meaning “call, consecration; calling, profession,” and Latin vocationem — “a calling, a being called” to “one’s occupation or profession.”

“Calling” and “vocation” together support the common dream of being able to do the work we were born to do, and the related belief that this would make our work significant and us happy. The idea of vocational calling is distinctly Biblical:[1]

“Before I formed you in the womb I knew you,
and before you were born I consecrated you;
I appointed you a prophet to the nations.”

Jeremiah 1:5 (ESV)

Something in us — an evolutionary survival instinct, I would guess — wants to be known, especially by those in power. Vocational calling invokes power at the highest level: never mind your parents’ hormones, you were a gleam in God’s eye; and never mind the genes you inherited, God coded vocational identity and purpose into your soul.

2600 years after Jeremiah, we’re still looking for the same kind of affirmation.

“Amy Wrzesniewski, a professor at Yale School of Management and a leading scholar on meaning at work, told me that she senses a great deal of anxiety among her students and clients. ‘They think their calling is under a rock,’ she said, ‘and that if they turn over enough rocks, they will find it.’ If they do not find their one true calling, she went on to say, they feel like something is missing from their lives and that they will never find a job that will satisfy them. And yet only about one third to one half of people whom researchers have surveyed see their work as a calling. Does that mean the rest will not find meaning and purpose in their careers?”

The Power of Meaning: Crafting a Life That Matters, Emily Esfahani Smith

If only one-third to one-half of us feel like we’re living our vocational calling, then why do we hang onto the dream? Maybe the problem is what Romantic Era poet William Wordsworth wrote about in his Ode: Intimations of Immortality:

“Our birth is but a sleep and a forgetting:
The Soul that rises with us, our life’s Star,
Hath had elsewhere its setting,
And cometh from afar:
Not in entire forgetfulness,
And not in utter nakedness,
But trailing clouds of glory do we come
From God, who is our home:
Heaven lies about us in our infancy!

“Shades of the prison-house begin to close
Upon the growing Boy,
But he beholds the light, and whence it flows,
He sees it in his joy;
The Youth, who daily farther from the east
Must travel, still is Nature’s Priest,
And by the vision splendid
Is on his way attended;
At length the Man perceives it die away,
And fade into the light of common day.”

I.e., maybe something tragic happens when an immortal self comes to live in a mortal body. This, too, is a common corollary belief to body/soul dualism — religion’s distrust of “the flesh” is standard issue.

Cognitive neuroscientist Christian Jarrett offers career advice to the afflicted: you might be able to turn the job you already have into a calling if you invest enough in it, or failing that, you might find your source of energy and determination somewhere other than in your work. This Forbes article reaches a similar conclusion:

“Years ago, I read a very thought-provoking article by Michael Lewis … about the difference between a calling and a job. He had some powerful insights. What struck me most were two intriguing concepts:

‘There’s a direct relationship between risk and reward. A fantastically rewarding career usually requires you to take fantastic risks.’

‘A calling is an activity that you find so compelling that you wind up organizing your entire self around it — often to the detriment of your life outside of it.’”

I.e., maybe career satisfaction isn’t heaven-sent; maybe instead it’s developed in the unglamorous daily grind of life in the flesh.

More on historical roots and related beliefs coming up.

[1] For more Biblical examples, see Isaiah 44:24: “Thus says the Lord, your Redeemer, who formed you from the womb”; Galatians 1:15: “But when he who had set me apart before I was born”; Psalm 139:13, 16: “For you formed my inward parts; you knitted me together in my mother’s womb… your eyes saw my unformed substance; in your book were written, every one of them, the days that were formed for me, when as yet there was none of them.”

Mirror, Mirror, on the Wall…


“If you were to ask the average person in the street about their self, they would most likely describe the individual who inhabits their body. They believe they are more than just their bodies. Their bodies are something their selves control. When we look in the mirror, we regard the body as a vessel we occupy.

“The common notion [is] that our self is an essential entity at the core of our existence that holds steady throughout our life. The ego experiences life as a conscious, thinking person with a unique historical background that defines who he or she is. This is the ‘I’ that looks back in the bathroom mirror and reflects who is the ‘me.’”

The Self Illusion:  How the Social Brain Creates Identity, Bruce Hood[1] (2012)

The idea that we are a self riding through life in a body is deeply ingrained in western thinking. Descartes gets most of the credit for it, but its religious and philosophical roots are much more ancient. (The same is true of the eastern, Buddhist idea that there’s no such thing as a self. We’ll talk origins another time.)

Descartes’ dualism has the curious effect of excusing us from thinking too closely about what we mean by it. It does this by locating the body in the physical, biological, natural world while placing the self in a transcendent realm that parallels the natural world but remains apart from it. The body, along with the rest of the natural world, is the proper subject of scientific inquiry, but the self and its ethereal realm remain inscrutable, the province of faith and metaphor, religion and the humanities. David P. Barash[2] captures the implications of this dichotomy in Through a Glass Brightly: Using Science to See Our Species as We Really Are (2018):

“Science differs from theology and the humanities in that it is made to be improved on and corrected over time… By contrast, few theologians, or fundamentalist religious believers of any stripe, are likely to look kindly on ‘revelations’ that suggest corrections to their sacred texts. In 1768, Baron d’Holbach, a major figure in the French Enlightenment, had great fun with this. In his satire Portable Theology (written under the pen name Abbe Bernier, to hide from the censors), d’Holbach defined Religious Doctrine as ‘what every good Christian must believe or else be burned, be it in this world or the next. The dogmas of the Christian religion are immutable decrees of God, who cannot change His mind except when the Church does.’

“By contrast, science not only is open to improvement and modifications but also is to a large extent defined by this openness. Whereas religious practitioners who deviate from their traditions are liable to be derogated — and sometimes killed — for this apostasy …, science thrives on correction and adjustment, aiming not to enshrine received wisdom and tradition but to move its insights closer to correspondence with reality as found in the natural world.”

Attempts to bridge the realms of body and soul end up in pseudo-science, eventually discredited and often silly. Consider for example the ether (or sometimes “aether”) — a term that since Plato and Aristotle has been applied to both the rarefied air only the gods can breathe and the stuff light moves through in interstellar space.[3]

You don’t need to be a poet or prophet to think the self is inviolate. It’s just so obvious to most of us that there’s a self inside who watches and knows all about us — who in fact is us. We experience it as that never-silent internal voice — observing and commenting, often critiquing, sometimes shaming — that always seems to be accurate. We’ve been hearing it for as long as we can remember: it’s embedded in our brain’s memory banks, all the way back to when we first started remembering things and using language to describe and record them.

We have always been aware that we are aware:
we don’t just observe, we observe ourselves observing.

Hence the belief that we are body and soul seems not worth challenging. Which is why, in keeping with this blog’s purpose, we’re going to do precisely that.

Cartesian dualism is foundational to self-awareness and to our cultural beliefs and institutions. It guides everything from religious faith to criminal culpability, health and wellbeing to mental illness. And so much more. As a result, taking a closer look will not only challenge our perceptions of what is real, it will shake reality itself. This inquiry won’t be easy. Again from The Self Illusion:

“Understanding that the self could be an illusion is really difficult… Our self seems so convincing, so real to us. But then again, many aspects of our experience are not what they seem.

“Psychologist Susan Blackmore makes the point that the word ‘illusion’ does not mean that it does not exist — rather an illusion is not what it seems. We all certainly experience some form of self, but what we experience is a powerful deception generated by our brains for our own benefit.”

That’s where we’re going. Hang tight.

[1] Bruce Hood is an experimental psychologist at the University of Bristol. He specializes in developmental cognitive neuroscience.

[2] David P. Barash is an evolutionary biologist and professor of psychology emeritus at the University of Washington.

[3] For a useful primer, see The Eternal Quest for Aether, the Cosmic Stuff That Never Was, Popular Mechanics (Oct 19, 2018).

Why Faith Endures

Jesus replied, “No one who puts a hand to the plow and looks back
is fit for service in the kingdom of God.”

Luke 9:62 NIV

I once told a leader of our campus Christian fellowship about doubts prompted by my religion major classes. “Get your Bible and read Luke 9:62,” he said. I did, and can still see the hardness on his face when I looked up. Religions venerate those who long endure, honoring their moral steadfastness. My character and commitment were suspect. I declared a new major the following quarter.

Religions punish doubt and dissidence through peer pressure, public censure, witch hunts, inquisitions, executions, jihads, war, genocide…. The year before, the dining halls had flown into an uproar the day the college newspaper reported that the fellowship had expelled a member for sleeping with her boyfriend.

Religions also have a curious way of tolerating their leaders’ nonconforming behavior — even as the leaders cry witch hunt.[1]

These things happen in all cultural institutions, not just religion. Neuroculture offers an explanation for all of them that emphasizes group dynamics over individual integrity. It goes like this:

  • When enough people believe something, a culture with a shared belief system emerges.
  • Individual doubt about the culture’s belief system introduces “cognitive dissonance” that makes individuals uneasy and threatens cultural cohesiveness.
  • Cohesiveness is essential to the group’s survival — doubt and nonconformity can’t be tolerated.
  • The culture therefore sanctifies belief and stifles doubt.
  • The culture sometimes bends its own rules to preserve its leadership power structure against larger threats.

“This Article Won’t Change Your Mind,” The Atlantic (March 2017) illustrates this process:

“The theory of cognitive dissonance—the extreme discomfort of simultaneously holding two thoughts that are in conflict—was developed by the social psychologist Leon Festinger in the 1950s. In a famous study, Festinger and his colleagues embedded themselves with a doomsday prophet named Dorothy Martin and her cult of followers who believed that spacemen called the Guardians were coming to collect them in flying saucers, to save them from a coming flood. Needless to say, no spacemen (and no flood) ever came, but Martin just kept revising her predictions. Sure, the spacemen didn’t show up today, but they were sure to come tomorrow, and so on. The researchers watched with fascination as the believers kept on believing, despite all the evidence that they were wrong.

“‘A man with a conviction is a hard man to change,’ Festinger, Henry Riecken, and Stanley Schachter wrote in When Prophecy Fails, their 1957 book about this study. ‘Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point … Suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before.’

“This doubling down in the face of conflicting evidence is a way of reducing the discomfort of dissonance, and is part of a set of behaviors known in the psychology literature as ‘motivated reasoning.’ Motivated reasoning is how people convince themselves or remain convinced of what they want to believe—they seek out agreeable information and learn it more easily; and they avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs.

“Though false beliefs are held by individuals, they are in many ways a social phenomenon. Dorothy Martin’s followers held onto their belief that the spacemen were coming … because those beliefs were tethered to a group they belonged to, a group that was deeply important to their lives and their sense of self.

“[A disciple who ignored mounting evidence of sexual abuse by his guru] describes the motivated reasoning that happens in these groups: ‘You’re in a position of defending your choices no matter what information is presented,’ he says, ‘because if you don’t, it means that you lose your membership in this group that’s become so important to you.’ Though cults are an intense example, … people act the same way with regard to their families or other groups that are important to them.”

“Why Facts Don’t Change Our Minds,” The New Yorker (Feb. 27, 2017) explains why the process seems so perfectly reasonable:

“Humans’ biggest advantage over other species is our ability to coöperate. Coöperation is difficult to establish and almost as difficult to sustain.

“Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.

“‘Reason is an adaptation to the hypersocial niche humans have evolved for themselves,’ [the authors of a seminal study] write. Habits of mind that seem weird or goofy or just plain dumb from an ‘intellectualist’ point of view prove shrewd when seen from a social ‘interactionist’ perspective.”

What does it take for individual dissent or cultural change to prevail in the face of these powerful dynamics? We’ll look at that next time.

[1]  This “bigger bully” theory was remarkably evident when Tony Perkins, leader of the Family Research Council, said evangelicals “kind of gave [Donald Trump] a mulligan” over Stormy Daniels, saying that evangelicals “were tired of being kicked around by Barack Obama and his leftists. And I think they are finally glad that there’s somebody on the playground that’s willing to punch the bully.”

Why Belief Works

Our experience of the “real world” will conform to what we believe. It has to, because our brains insist upon it.

They do that in part through neuro-cultural conditioning — the process by which the neurological wiring of a culture’s individual members is patterned after the culture’s belief system, and vice versa. This is the case with any kind of cultural institution, whether national, religious, scientific, economic, corporate, professional, team, tribal, or otherwise.[1] This post looks at religion as an example.[2]

Tim Crane is a professor of philosophy at the Central European University in Budapest. “I work in the philosophy of mind,” his online CV says, “I have attempted to address questions about the most general nature, or essence, of the human mind, and about the place of the mind in the rest of nature.” In his book The Meaning of Belief: Religion From An Atheist’s Point Of View (2017), he cites William James’ 1902 classic The Varieties of Religious Experience for a definition of what he calls “the religious impulse”:

“Were one asked to characterize the life of religion in the broadest and most general terms, one might say that it consists in the belief that there is an unseen order, and that our supreme good lies in harmoniously adjusting ourselves thereto.”

Christian Smith is a sociology professor and director of the Center for the Study of Religion and Society at the University of Notre Dame. Here’s his definition of religion:

“Religion is a complex of culturally prescribed practices, based on promises about the existence and nature of supernatural powers, whether personal or impersonal, which seek to help practitioners gain access to and communicate or align themselves with these powers, in hopes of realizing human goods and avoiding things bad.”

Religion: What It Is, How It Works, And Why It Matters (Princeton University Press, 2017)

Both authors stress that religious principles and practices need to match in order for religion to be effective. In other words:

“Faith without works is dead.”
The Epistle of James 2:17

As it turns out, “faith without works is dead” is not just scripture, but accurate neuroscience as well. When we practice what we preach, we set up a self-sustaining loop in which belief drives thoughts and behavior, which in turn reinforce belief. In that way, religion develops the brain while the brain develops religion:
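That loop is easy to caricature in code. Here is a deliberately toy positive-feedback sketch — the function name, the 0.01 gain, and every other number are my own illustrative choices, not neuroscience:

```python
import random

# Toy model of the belief-practice feedback loop (all numbers invented
# for illustration; this is not a neuroscience simulation).
# Belief level sets the probability of practicing; each practice nudges
# belief upward and each lapse nudges it downward, so belief and
# behavior reinforce each other in both directions.

def run(belief, steps=1000, gain=0.01, seed=42):
    rng = random.Random(seed)
    for _ in range(steps):
        practiced = rng.random() < belief      # stronger belief -> more practice
        if practiced:
            belief = min(1.0, belief + gain)   # practice reinforces belief
        else:
            belief = max(0.0, belief - gain)   # lapses erode it
    return belief

# Two starting points only slightly apart tend to drift toward opposite
# extremes -- small differences in practice compound over time.
print(run(0.55), run(0.45))
```

The design point is the loop itself: once belief drives behavior and behavior feeds back into belief, the system is self-sustaining at either extreme, which is one way to read “faith without works is dead.”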

“Jordan Grafman, head of the cognitive neuroscience laboratory at Rehabilitation Institute of Chicago and neurology professor at Northwestern University, says that neurotheology is important in part because early religious practices helped develop our brains to begin with. ‘Religion has played an incredibly important role in human evolution. It’s funny, people want to separate the two but in fact they’re intertwined.’”

The Neuroscience Argument That Religion Shaped The Very Structure Of Our Brains,” Quartz (December 3, 2016)

The more widespread and enduring the religious practice, the more the religion develops scriptures, rituals, icons, and institutions to sustain itself. Therefore a Bible passage such as this…

“I was young and now I am old,
yet I have never seen the righteous forsaken
 or their children begging bread.”
Psalm 37: 25 NIV

… becomes both community truth and the “testimony” of individual adherents. But what happens when belief and experience don’t align — e.g., when a member of the congregation and her children in fact go begging?

Some religious thinkers, like the writer of this Huffington Post article, reckon with the contradiction by distinguishing belief from faith. Beliefs are products of the mind, she says, and deal with what can be known, while faith is a product of the spirit, which traffics in what cannot be known. Since knowledge is always shifting, belief can and probably will let us down, while faith in what can’t be known remains inscrutable. Faith therefore invites belief to step aside in favor of “trusting beyond all reason and evidence.”

That outlook captures the essential center of the definitions of religion we saw above: that there is an “unseen order” populated with “supernatural powers” that exists alongside but separate from ours. (Of which we have only limited understanding, the belief/faith outlook would add.) Whether this satisfies the brain’s need to align internal patterning with external experience is the kind of issue being taken up by the new discipline of neurotheology, which looks at where religion happens in the brain.

Neurotheology’s inquiries have far-reaching implications for many of our common assumptions about how reality is structured. For example, if faith can be explained in neurological terms, then it could be located — in whole or in part — along with belief on this side of the theoretical divide between human and supernatural existence. This shift would likely have a ripple effect on similar dichotomies, such as known vs. unknown, real vs. imaginary, objective vs. subjective, observed vs. inscrutable, temporal vs. transcendent, etc.

More on neurotheology coming up.

[1] For more on cultural patterning, see the other posts in this blog’s category The Basics of Belief, Culture, and Reality.

[2] I talk about Christianity because it is the only religion I have personal experience with. And I am aware, by the way, that I write this post under the influence of my own neuroscientific cultural bias.

Emergence


One fine autumn afternoon I watched, transfixed, as a gigantic flock of migratory birds swarmed over the woods across the street. I was watching a “complex, self-organizing system” in action — specifically, a “murmuration” of birds, which is created by “swarm behavior,” which in turn falls in the category of emergence.

Emergence explains how the whole becomes greater than the sum of its parts. The term is widely used — in systems theory, philosophy, psychology, chemistry, biology, neurobiology, machine learning — and for purposes of this blog, it also applies to cultural belief systems and the social institutions they generate.

Consider any culture you like — a team, club, company, profession, investor group, religious gathering, political party…. As we’ve seen previously in this series, the group’s cultural sense of reality is patterned in each individual member’s neural wiring and cellular makeup. But no one member can hold it all, and different members have varying affinity for different aspects of the culture. As a result, each member takes what the others bring “on faith”: the group believes in its communal beliefs. This faith facilitates the emergence of a cohesive, dynamic cultural body that takes on a life of its own, expressed through its institutions.

That’s emergence.
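The murmuration itself can be sketched in a few dozen lines. This is a toy boids-style model — the rule weights, radii, and bird count are my own illustrative guesses, not from any cited source — in which each bird reacts only to nearby neighbors via three local rules, and flock-level order emerges with no central controller:

```python
import math
import random

# Toy boids-style flocking sketch (all parameters are illustrative).
# Each bird follows three purely local rules; the flock-level pattern
# is an emergent property no single rule mentions.

def step(birds, cohesion=0.01, separation=0.05, alignment=0.05, radius=3.0):
    """Advance every bird (x, y, vx, vy) by one tick of the three rules."""
    moved = []
    for x, y, vx, vy in birds:
        nbrs = [b for b in birds
                if b[:2] != (x, y) and math.hypot(b[0] - x, b[1] - y) < radius]
        if nbrs:
            # Rule 1: cohesion -- steer toward the local center of mass.
            cx = sum(b[0] for b in nbrs) / len(nbrs)
            cy = sum(b[1] for b in nbrs) / len(nbrs)
            vx += (cx - x) * cohesion
            vy += (cy - y) * cohesion
            # Rule 2: separation -- veer away from very close neighbors.
            for bx, by, _, _ in nbrs:
                if math.hypot(bx - x, by - y) < 1.0:
                    vx -= (bx - x) * separation
                    vy -= (by - y) * separation
            # Rule 3: alignment -- nudge heading toward the neighbors' average.
            avx = sum(b[2] for b in nbrs) / len(nbrs)
            avy = sum(b[3] for b in nbrs) / len(nbrs)
            vx += (avx - vx) * alignment
            vy += (avy - vy) * alignment
        moved.append((x + vx, y + vy, vx, vy))
    return moved

def polarization(birds):
    """Length of the mean unit-velocity vector: 1.0 means perfect alignment."""
    ux = uy = 0.0
    for _, _, vx, vy in birds:
        speed = math.hypot(vx, vy) or 1.0
        ux += vx / speed
        uy += vy / speed
    return math.hypot(ux, uy) / len(birds)

random.seed(0)
flock = [(random.uniform(0, 10), random.uniform(0, 10),
          random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(30)]
before = polarization(flock)
for _ in range(200):
    flock = step(flock)
after = polarization(flock)
print(round(before, 2), round(after, 2))
```

Nothing in `step()` mentions a flock, yet the polarization statistic typically rises over the run — the “forget the equations and just look at the interactions” point in miniature.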

To get a further sense of how this works, see this TED Talk that uses complex systems theory to look at how the structure of the financial industry (a transnational cultural body) helped to bring about the Great Recession of 2007-2008. Systems theorist James B. Glattfelder[1] lays out a couple of key features of self-organizing systems:

“It turns out that what looks like complex behavior from the outside is actually the result of a few simple rules of interaction. This means you can forget about the equations and just start to understand the system by looking at the interactions.

“And it gets even better, because most complex systems have this amazing property called emergence. This means that the system as a whole suddenly starts to show a behavior which cannot be understood or predicted by looking at the components. The whole is literally more than the sum of its parts.”

In the end, he says, there’s an innate simplicity to it all — “an emergent property which depends on the rules of interaction in the system. We could easily reproduce [it] with a few simple rules.”[2] He compares this outcome to the inevitable polarized logjams we get from clashing cultural ideologies:

“I really hope that this complexity perspective allows for some common ground to be found. It would be really great if it has the power to help end the gridlock created by conflicting ideas, which appears to be paralyzing our globalized world. Ideas relating to finance, economics, politics, society, are very often tainted by people’s personal ideologies. Reality is so complex, we need to move away from dogma.”

Trouble is, we seem to be predisposed toward ideological gridlock and dogma. Even if we’ve never heard of emergence, we have a kind of backdoor awareness of it — that there are meta-influences affecting our lives — but we’re inclined to locate their source “out there,” instead of in our bodily selves. “Out there” is where the Big Ideas live, formulated by transcendent realities and personalities — God, gods, Fate, Destiny, Natural Law, etc. — that sometimes enter our lesser existence to reveal their take on how things work. But they have super-intelligence while we have only a lesser version, so once we receive their revelations, we codify them into vast bodies of collected wisdom and knowledge, which we then turn over to our sacred and secular cultural institutions to administer. We and our cultures aren’t perfect like they are, but we do our best to live up to their high standards.

We do all this because, as biocentrism champion Robert Lanza has said, most of us have trouble wrapping our heads around the notion that

“Everything we see and experience is a whirl of information occurring in our head. We are not just objects embedded in some external matrix ticking away ‘out there.’”[3]

In our defense, the kind of systems analysis that James Glattfelder uses in his TED talk requires a lot of machine super-intelligence and brute data-crunching power that the human brain lacks. We’re analog and organic, not digital, and we use our limited outlook to perpetuate more polarization, ideological gridlock, and dogma. Culture may be emergent, but when it emerges, it walks right into a never-ending committee meeting debating whether it has a place on the agenda.

Next time, we’ll look at what happens when emergent cultures clash.

[1] James B. Glattfelder holds a Ph.D. in complex systems from the Swiss Federal Institute of Technology. He began as a physicist, became a researcher at a Swiss hedge fund, and now does quantitative research at Olsen Ltd in Zurich, a foreign exchange investment manager.

[2] Here’s a YouTube explanation of the three simple rules that explain the murmuration I watched that day.

[3] From this article in Aeon Magazine.

It’s An Inside Job

In Brain and Culture: Neurobiology, Ideology, and Social Change, Yale Medical School professor of psychiatry Bruce E. Wexler declared that “concordance between internal structure and external reality is a fundamental human neurobiological imperative.” “Concordance” is peace of mind, which we all know is an inside job.


But the concordance Wexler is talking about is not the kind reserved for the enlightened few; it’s the kind that’s a brain health necessity. Our brains work unceasingly to maintain harmony between us and our surroundings, including our cultural setting. When internal and external are out of sync, the result is cognitive dissonance which, when left unresolved, leads to physical, mental, and social disease, distress, and disorder. Neurological concordance is therefore a surviving and thriving skill, one that can be traced to specific regions of the brain:

“Thanks to advances in imaging methods, especially functional MRI, researchers have recently identified key brain regions linked to cognitive dissonance. The area implicated most consistently is the posterior part of the medial frontal cortex (pMFC), known to play an important role in avoiding aversive outcomes, a powerful built-in survival instinct. In fMRI studies, when subjects lie to a peer despite knowing that lying is wrong—a task that puts their actions and beliefs in conflict—the pMFC lights up.”[1]


The straightest path to concordance is conformity. Nonconformity, on the other hand, generates both intracultural and intercultural neurological conflict.[2] This potential for conflict was the context for Wexler’s peace of mind declaration — let’s hear it again, the full quote this time:

“This book argues that differences in belief systems can themselves occasion intercultural violence, since concordance between internal structure and external reality is a fundamental human neurobiological imperative.” (Emphasis added.)

Peace of mind therefore requires the alignment of inner and outer belief systems. This article[3] defines the term:

“Belief systems are the stories we tell ourselves to define our personal sense of Reality. Every human being has a belief system that they utilize, and it is through this mechanism that we individually ‘make sense’ of the world around us.

“The species Homo sapiens developed so-called belief systems. These are sets of beliefs, reinforced by culture, theology, experience, and training, as to how the world works: cultural values, stereotypes, political viewpoints, etc.”

In order for personal (internal) and shared (external) belief systems to align, the culture’s members must share comparable neural pathways, consciousness, perceptions, sensory tastes, physiology, and the like.[4] When they do, the culture becomes recognizable in its members. Think of the Olympics’ opening ceremony parade of athletes: the Yanks are obviously the Yanks — nobody else has quite their swashbuckling sense of derring-do. Or think of professional cultures — lawyers, accountants, engineers, physicians — meet one, and you can just tell. Or remember what it’s like to visit a foreign culture — it’s not just the signage but it’s… well, everything — how people look, sound, act, their customs and values….

All of that is the result of biological, chemical, environmental, and other influences, all stored in individual brains and bodies. But how is cultural patterning transmitted from one individual to another? John R. Searle, Professor of Philosophy, University of California, Berkeley, wanted to know, and finding out led to his seminal book The Construction of Social Reality:

“This book is about a problem that has puzzled me for a long time: there are portions of the real world, objective facts in the world that are only facts by human agreement. In a sense there are things that exist only because we believe them to exist. I am thinking about things like money, property, governments, and marriage.

“If everybody thinks that this sort of thing is money, and they use it as money and treat it as money, then it is money. If nobody ever thinks this sort of thing is money, then it is not money. And what goes for money goes for elections, private property, wars, voting, promises, marriages, buying and selling, political offices, and so on.”

“How can there be an objective world of money, property, marriage, governments, elections, football games, cocktail parties and law courts in a world that consists entirely of physical particles in fields of force, and in which some of these particles are organized into systems that are conscious biological beasts, such as ourselves?”

This article[5] provides this summary answer to Searle’s questions:

“Because mental states cannot be transferred physically, they must be transferred by being re-created in the mind of the receiving individual.

“[W]hat is transmitted is some state of mind that produces behavior.

“[The transmitted state of mind includes] a myriad of… beliefs, values, desires, definitions, attitudes, and emotional states such as fear, regret, or pride.”

Vastly simplified, the process of enculturation looks like this:

  • New members enter via an entry point such as birth, naturalization, initiation, etc.
  • They observe the culture’s members thinking and behaving in the culture’s characteristic ways.
  • Through observation and imitation, they take on the culture’s mindset and become habituated into its belief and behavioral norms.
  • In time, they become recognizable as members of the culture along with its other members.
  • Then, an organizing principle called “emergence” asserts itself, so that the whole culture takes on a life of its own that is bigger than the sum of its individual members.

We’ll talk about emergence next time.

[1] “What Happens to the Brain During Cognitive Dissonance?” Scientific American Mind (Nov. 2015).

[2] There’s been a lot of research on conformity and nonconformity in the past ten years. If you’re interested in digging deeper, searching “neuroscience of conformity” and “neuroscience of nonconformity” will turn up several scholarly studies.

[3] “What Are Belief Systems?” Usó-Doménech and J. Nescolarde-Selva, Department of Applied Mathematics, University of Alicante, Alicante, Spain.

[4] See the prior post, Microbes of Meaning.

[5] “Evolution of Mind, Brain, and Culture,” Philip G. Chase, former Senior Research Scientist and Consulting Scholar at the University of Pennsylvania.