Knowledge, Conviction, and Belief [3]

Janus

We’ve been talking about dualistic thinking — the kind that leads us to think we live simultaneously in two realities.

Reality A is “life in the flesh” — bound by space and time and all the imperfections of what it means to be human. It is life carried on in our physical bodies, where our impressive but ultimately limited brains are in charge.

Reality B is “life in the spirit” — the eternal, perfect, transcendent, idealized, supernatural, original source that informs, explains, and guides its poorer counterpart.

This dualistic thinking says there’s more to life than meets the eye, that humans are an “eternal soul having a worldly existence.” The dualism sets up a cascade of derivative beliefs, for example:

There’s a difference between the Reality A identity and experience we actually have and the Reality B identity and experience we would have if we could rise above Reality A and live up to the idealized version of Reality B.

Every now and then, somebody gets lucky or gets saved or called, and gets to live out their Reality B destiny, which gives them and their lives a heightened sense of purpose and meaning.

But those are the chosen few, and they’re rare. For most of us, our ordinary selves and mundane lives are only a shadow of our “higher selves” and “greater potential.”

The chosen few can — and often do — provide guidance as to how we can do better, and we do well to find some compatible relation with one or more of them. But sometimes, in the right setting and circumstance, we might discover that we have receptors of our own that can receive signals from Reality B. We call this “enlightenment” or “conversion” or “salvation” or something like that, and it’s wonderful, blissful, and euphoric.

But most of the time, for the vast majority of us, Reality A is guided by a mostly one-way communication with Reality B — a sort of moment-by-moment data upload from A to B, where everything about us and our lives — every conscious and subconscious intent, motive, thought, word, and deed — gets stored in a failsafe, beyond-time data bank. When our Reality A lives end, those records determine what happens next — they inform our next trip through Reality A, or set the stage for a Reality B existence we’re really going to like or really going to suffer.

Everybody pretty much agrees it’s useful to have good communication with or awareness of Reality B, because that helps us live better, truer, happier, more productive lives in Reality A, and because it creates a better data record when our Reality A existence ends and we pass over to Reality B.

And on it goes. No, we don’t express any of it that way: our cultural belief systems and institutions — religious doctrines, moral norms, legal codes, academic fields of study, etc. — offer better-dressed versions. But it’s remarkable how some version of those beliefs finds its way into common notions about how life works.

At the heart of it all is our conviction — not knowledge — that this thing we consciously know as “me” is an independent self that remains intact and apart from the biological messiness of human life, able to choose its own beliefs, make its own decisions, and execute its own actions. In other words, we believe in consciousness, free will, and personal responsibility for what we are and do — and what we aren’t and don’t do — during what is only a sojourn — a short-term stay — on Earth.

Those beliefs explain why, for example, it bothers us so much when someone we thought we knew departs from their beginnings and displays a changed inner and outer expression of the person we thought they were. “Look who’s in the big town,” we say. Or we pity them and knock wood and declare thank goodness we’ve been lucky. Or we put them on the prayer chain or call them before the Inquisition… anything but entertain the idea that maybe Reality B isn’t there — along with all the belief it takes to create it — and that instead all we have is Reality A: we’re nothing but flesh and bone.

It’s almost impossible to think that way. To go there, we have to lay aside conviction and embrace knowledge.

Almost impossible.

Almost.

We’ll give it a try in the coming weeks.

Who’s In Charge Here?

Edelweiss with blossoms, Wallis, Switzerland.
© Michael Peuckert

Edelweiss, edelweiss
Every morning you greet me

Small and white
Clean and bright
You look happy to meet me

(A little exercise in anthropomorphism
from The Sound of Music)

This hierarchy of consciousness we looked at last time — ours is higher than the rest of creation, angels’ is higher than ours, God’s is highest — is an exercise in what philosophy calls teleology:  “the explanation of phenomena in terms of the purpose they serve.” Teleology is about cause and effect — it looks for design and purpose, and its holy grail is what psychologists call agency:  who or what is causing things we can’t control or explain.

“This agency-detection system is so deeply ingrained that it causes us to attribute agency in all kinds of natural phenomena, such as anger in a thunderclap or voices in the wind, resulting in our universal tendency for anthropomorphism.

“Stewart Guthrie, author of Faces in the Clouds:  A New Theory of Religion, argues that ‘anthropomorphism may best be explained as the result of an attempt to see not what we want to see or what is easy to see, but what is important to see:  what may affect us, for better or worse.’ Because of our powerful anthropomorphic tendency, ‘we search everywhere, involuntarily and unknowingly, for human form and results of human action, and often seem to find them where they do not exist.’”

The Patterning Instinct:  A Cultural History of Humanity’s Search for Meaning, Jeremy Lent (2017)

Teleological thinking is a characteristic feature of religious, magical, and supernatural thinking:

“Academic research shows that religious and supernatural thinking leads people to believe that almost no big life events are accidental or random. As the authors of some recent cognitive-science studies at Yale put it, ‘Individuals’ explicit religious and paranormal beliefs are the best predictors of their perception of purpose in life events’—their tendency ‘to view the world in terms of agency, purpose, and design.’”

“How America Lost Its Mind,” The Atlantic (Sept. 2017)

Psychology prof Clay Routledge describes how science debunks teleology, but also acknowledges why it’s a comfortable way of thinking:

“From a scientific point of view, we were not created or designed but instead are the product of evolution. The natural events that shaped our world and our own existence were not purposeful. In other words, life is objectively meaningless. From this perspective, the only way to find meaning is to create your own, because the universe has no meaning or purpose. The universe just is. Though there are certainly a small percentage of people who appear to accept this notion, much of the world’s population rejects it.

“For most humans, the idea that life is inherently meaningless simply will not do.

“Instead, people latch onto what I call teleological thinking. Teleological thinking is when people perceive phenomena in terms of purpose. When applied to natural phenomena, this type of thinking is generally considered to be flawed because it imposes design where there is no evidence for it.  To impose purpose and design where there is none is what researchers refer to as a teleological error.”

Supernatural: Death, Meaning, and the Power of the Invisible World, Clay Routledge (2018)

It’s one thing to recognize “teleological error,” it’s another to resist it — even for those who pride themselves on their rationality:

“Even atheists who reject the supernatural and scientists who are trained not to rely on teleological explanations of the world do, in fact, engage in teleological thinking.

“Many people who reject the supernatural do so through thoughtful reasoning. … However, when these people are making teleological judgments, they are not fully deploying their rational thinking abilities.

“Teleological meaning comes more from an intuitive feeling than it does from a rational decision-making process.”

Supernatural: Death, Meaning, and the Power of the Invisible World

Teleological thinking may be understandable, but scientist and medical doctor Paul Singh comes down hard on the side of science as the only way to truly “know” something:

“All scientists know that the methods we use to prove or disprove theories are the only dependable methods of understanding our universe. All other methodologies of learning, while appropriate to employ in situations when science cannot guide us, are inherently flawed. Reasoning alone — even the reasoning of great intellects — is not enough. It must be combined with the scientific method if it is to yield genuine knowledge about the universe.”

The Great Illusion:  The Myth of Free Will, Consciousness, and the Self, Paul Singh (2016)

After admitting that “evidence shows that the human brain is universally delusional in many ways,” Singh makes his case that “the use of logic and scientific skepticism is a skill that can be used to overcome the limitations of our own brains.”

Next time, we’ll look more into the differences in how science and religion “know” things to be “true.”

“Fearfully and Wonderfully Made”


We are starting this series on Consciousness and the Self by looking at some of the religious and secular foundations of the belief that humans are a dualist entity consisting of body and soul, and the associated belief that the two elements are best understood by different forms of inquiry — religion and the humanities for the soul, and science for the body. As we’ll see, current neuro-biological thinking defies these beliefs and threatens their ancient intellectual, cultural, and historical dominance.

This article[1] is typical in its conclusion that one of the things that makes human beings unique is our “higher consciousness.”

“[Homo sapiens] sits on top of the food chain, has extended its habitats to the entire planet, and in recent centuries, experienced an explosion of technological, societal, and artistic advancements.

“The very fact that we as human beings can write and read articles like this one and contemplate the unique nature of our mental abilities is awe-inspiring.

“Neuroscientist V.S. Ramachandran said it best: ‘Here is this three-pound mass of jelly you can hold in the palm of your hand…it can contemplate the meaning of infinity, and it can contemplate itself contemplating the meaning of infinity.’

“Such self-reflective consciousness or ‘meta-wondering’ boosts our ability for self-transformation, both as individuals and as a species. It contributes to our abilities for self-monitoring, self-recognition and self-identification.”

The author of the following Biblical passage agrees, and affirms that his “soul knows it very well” — i.e., not only does he know he’s special, but he knows that he knows it:

For you formed my inward parts;
    you knitted me together in my mother’s womb.
I praise you, for I am fearfully and wonderfully made.
Wonderful are your works;
    my soul knows it very well.

Psalm 139: 13-16 (ESV)

Judging from worldwide religious practice, the “I” that is “fearfully and wonderfully made” is limited to the soul, not the body:  the former feels the love, while the latter is assaulted with unrelenting, vicious, sometimes horrific verbal and physical abuse. “Mortification of the flesh” indeed, as if the body needs help being mortal.

Science apparently concurs with this dismal assessment. The following is from the book blurb for Through a Glass Brightly:  Using Science to See Our Species as We Really Are, by evolutionary biologist and psychologist David P. Barash (2018):

“In Through a Glass Brightly, noted scientist David P. Barash explores the process by which science has, throughout time, cut humanity ‘down to size,’ and how humanity has responded. A good paradigm is a tough thing to lose, especially when its replacement leaves us feeling more vulnerable and less special. And yet, as science has progressed, we find ourselves–like it or not–bereft of many of our most cherished beliefs, confronting an array of paradigms lost.

“Barash models his argument around a set of “old” and “new” paradigms that define humanity’s place in the universe. This new set of paradigms [includes] provocative revelations [such as] whether human beings are well designed… Rather than seeing ourselves through a glass darkly, science enables us to perceive our strengths and weaknesses brightly and accurately at last, so that paradigms lost becomes wisdom gained. The result is a bracing, remarkably hopeful view of who we really are.”

Barash’s old and new paradigms about the body are as follows:

“Old paradigm:  The human body is a wonderfully well constructed thing, testimony to the wisdom of an intelligent designer.

“New paradigm:  Although there is much in our anatomy and physiology to admire, we are in fact jerry-rigged and imperfect, testimony to the limitations of a process that is nothing but natural and that in no way reflects supernatural wisdom or benevolence.”

Okay, so maybe the body has issues, but the old paradigm belief that human-level consciousness justifies lording it over the rest of creation is as old as the first chapter of the Bible:

And God blessed them. And God said to them,
“Be fruitful and multiply and fill the earth and subdue it
and have dominion over the fish of the sea
 and over the birds of the heavens
 and over every living thing that moves on the earth.”

Genesis 1:28  (ESV)

The Biblical mandate to “subdue” the earth explains a lot about how we approach the rest of creation — something people seem to be questioning more and more these days. Psychiatrist, essayist, and Oxford Fellow Neel Burton includes our superiority complex in his list of self-deceptions:

“Most people see themselves in a much more positive light than others do them, and possess an unduly rose-tinted perspective on their attributes, circumstances, and possibilities. Such positive illusions, as they are called, are of three broad kinds, an inflated sense of one’s qualities and abilities, an illusion of control over things that are mostly or entirely out of one’s control, and an unrealistic optimism about the future.” [2]

Humans as the apex of creation? More on that next time.

[1] What Is It That Makes Humans Unique? Singularity Hub, Dec. 28, 2017.

[2] Hide and Seek:  The Psychology of Self-Deception (Acheron Press, 2012).

“Before You Were Born I Knew You”

The Summoner in Chaucer’s The Canterbury Tales,
Ellesmere MSS, circa 1400

Last time we looked at the common dualistic paradigm of consciousness, which is based on (a) the belief that humans are made in two parts — an ethereal self housed in a physical body — and (b) the corollary belief that religion and the humanities understand the self best, while science is the proper lens for the body.

Current neuroscience theorizes instead that consciousness arises from brain, body, and environment — all part of the physical, natural world, and therefore best understood by scientific inquiry.

We looked at the origins of the dualistic paradigm last time. This week, we’ll look at an example of how it works in the world of jobs and careers — particularly the notion of being “called” to a “vocation.”

According to the Online Etymology Dictionary, the notion of “calling” entered the English language around Chaucer’s time, originating from Old Norse kalla — “to cry loudly, summon in a loud voice; name, call by name.” Being legally summoned wasn’t a happy thing in Chaucer’s day (it still isn’t), and summoners were generally wicked, corrupt, and otherwise worthy of Chaucer’s pillory in The Friar’s Tale.

“Calling” got an image upgrade a century and a half later, in the 1550s, when the term acquired the connotation of “vocation, profession, trade, occupation.” Meanwhile, “vocation” took on the meaning of “spiritual calling,” from Old French vocacion, meaning “call, consecration; calling, profession,” and Latin vocationem — “a calling, a being called” to “one’s occupation or profession.”

“Calling” and “vocation” together support the common dream of being able to do the work we were born to do, and the related belief that this would make our work significant and us happy. The idea of vocational calling is distinctly Biblical:[1]

“Before I formed you in the womb I knew you,
and before you were born I consecrated you;
I appointed you a prophet to the nations.”

Jeremiah 1:5 (ESV)

Something in us — an evolutionary survival instinct, I would guess — wants to be known, especially by those in power. Vocational calling invokes power at the highest level:  never mind your parents’ hormones, you were a gleam in God’s eye; and never mind the genes you inherited, God coded vocational identity and purpose into your soul.

Some 2,600 years after Jeremiah, we’re still looking for the same kind of affirmation.

“Amy Wrzesniewski, a professor at Yale School of Management and a leading scholar on meaning at work, told me that she senses a great deal of anxiety among her students and clients. ‘They think their calling is under a rock,’ she said, ‘and that if they turn over enough rocks, they will find it.’ If they do not find their one true calling, she went on to say, they feel like something is missing from their lives and that they will never find a job that will satisfy them. And yet only about one third to one half of people whom researchers have surveyed see their work as a calling. Does that mean the rest will not find meaning and purpose in their careers?”

The Power of Meaning:  Crafting a Life That Matters, Emily Esfahani Smith

If only one-third to one-half of us feel like we’re living our vocational calling, then why do we hang onto the dream? Maybe the problem is what Romantic Era poet William Wordsworth wrote about in his Ode:  Intimations of Immortality:

“Our birth is but a sleep and a forgetting:
The Soul that rises with us, our life’s Star,
Hath had elsewhere its setting,
And cometh from afar:
Not in entire forgetfulness,
And not in utter nakedness,
But trailing clouds of glory do we come
From God, who is our home:
Heaven lies about us in our infancy!

“Shades of the prison-house begin to close
Upon the growing Boy,
But he beholds the light, and whence it flows,
He sees it in his joy;
The Youth, who daily farther from the east
Must travel, still is Nature’s Priest,
And by the vision splendid
Is on his way attended;
At length the Man perceives it die away,
And fade into the light of common day.”

I.e., maybe something tragic happens when an immortal self comes to live in a mortal body. This, too, is a common corollary belief to body/soul dualism — religion’s distrust of “the flesh” is standard issue.

Cognitive neuroscientist Christian Jarrett offers career advice to the afflicted:  you might be able to turn the job you already have into a calling if you invest enough in it, or, failing that, you might find your source of energy and determination somewhere other than in your work. This Forbes article reaches a similar conclusion:

“Years ago, I read a very thought-provoking article by Michael Lewis … about the difference between a calling and a job. He had some powerful insights. What struck me most were two intriguing concepts:

‘There’s a direct relationship between risk and reward. A fantastically rewarding career usually requires you to take fantastic risks.’

‘A calling is an activity that you find so compelling that you wind up organizing your entire self around it — often to the detriment of your life outside of it.’”

I.e., maybe career satisfaction isn’t heaven-sent; maybe instead it’s developed in the unglamorous daily grind of life in the flesh.

More on historical roots and related beliefs coming up.

[1] For more Biblical examples, see Isaiah 44:24: “Thus says the Lord, your Redeemer, who formed you from the womb”; Galatians 1:15: “But when he who had set me apart before I was born”; Psalm 139:13, 16: “For you formed my inward parts; you knitted me together in my mother’s womb; your eyes saw my unformed substance; in your book were written, every one of them, the days that were formed for me, when as yet there was none of them.”

Mirror, Mirror, on the Wall…


“If you were to ask the average person in the street about their self, they would most likely describe the individual who inhabits their body. They believe they are more than just their bodies. Their bodies are something their selves control. When we look in the mirror, we regard the body as a vessel we occupy.

“The common notion [is] that our self is an essential entity at the core of our existence that holds steady throughout our life. The ego experiences life as a conscious, thinking person with a unique historical background that defines who he or she is. This is the ‘I’ that looks back in the bathroom mirror and reflects who is the ‘me.’”

The Self Illusion:  How the Social Brain Creates Identity, Bruce Hood[1] (2012)

The idea that we are a self riding through life in a body is deeply ingrained in western thinking. Descartes gets most of the credit for it, but its religious and philosophical roots are much more ancient. (The same is true of the eastern, Buddhist idea that there’s no such thing as a self. We’ll talk origins another time.)

Descartes’ dualism has the curious effect of excusing us from thinking too closely about what we mean by it. It does this by locating the body in the physical, biological, natural world while placing the self in a transcendent realm that parallels the natural world but remains apart from it. The body, along with the rest of the natural world, is the proper subject of scientific inquiry, but the self and its ethereal realm remain inscrutable, the province of faith and metaphor, religion and the humanities. David P. Barash[2] captures the implications of this dichotomy in Through a Glass Brightly:  Using Science to See Our Species as We Really Are (2018):

“Science differs from theology and the humanities in that it is made to be improved on and corrected over time… By contrast, few theologians, or fundamentalist religious believers of any stripe, are likely to look kindly on ‘revelations’ that suggest corrections to their sacred texts. In 1768, Baron d’Holbach, a major figure in the French Enlightenment, had great fun with this. In his satire Portable Theology (written under the pen name Abbe Bernier, to hide from the censors), d’Holbach defined Religious Doctrine as ‘what every good Christian must believe or else be burned, be it in this world or the next. The dogmas of the Christian religion are immutable decrees of God, who cannot change His mind except when the Church does.’

“By contrast, science not only is open to improvement and modifications but also is to a large extent defined by this openness. Whereas religious practitioners who deviate from their traditions are liable to be derogated — and sometimes killed — for this apostasy …, science thrives on correction and adjustment, aiming not to enshrine received wisdom and tradition but to move its insights closer to correspondence with reality as found in the natural world.”

Attempts to bridge the realms of body and soul end up in pseudo-science, eventually discredited and often silly. Consider, for example, the ether (or sometimes “aether”) — a term that since Plato and Aristotle has been applied both to the rarefied air only the gods can breathe and to the stuff light moves through in interstellar space.[3]

You don’t need to be a poet or a prophet to think the self is inviolate. It’s just so obvious to most of us that there’s a self inside who watches and knows all about us — who in fact is us. We experience it as that never-silent internal voice — observing and commenting, often critiquing, sometimes shaming — that always seems to be accurate. We’ve been hearing it for as long as we can remember:  it’s embedded in our brain’s memory banks, all the way back to when we first started remembering things and using language to describe and record them.

We have always been aware that we are aware:
we don’t just observe, we observe ourselves observing.

Hence the belief that we are body and soul seems not worth challenging. Which is why, in keeping with this blog’s purpose, we’re going to do precisely that.

Cartesian dualism is foundational to self-awareness and to our cultural beliefs and institutions. It guides everything from religious faith to criminal culpability, health and wellbeing to mental illness. And so much more. As a result, taking a closer look will not only challenge our perceptions of what is real, it will shake reality itself. This inquiry won’t be easy. Again from The Self Illusion:

“Understanding that the self could be an illusion is really difficult… Our self seems so convincing, so real to us. But then again, many aspects of our experience are not what they seem.

“Psychologist Susan Blackmore makes the point that the word ‘illusion’ does not mean that it does not exist — rather an illusion is not what it seems. We all certainly experience some form of self, but what we experience is a powerful deception generated by our brains for our own benefit.”

That’s where we’re going. Hang tight.

[1] Bruce Hood is an experimental psychologist at the University of Bristol. He specializes in developmental cognitive neuroscience.

[2] David P. Barash is an evolutionary biologist and professor of psychology emeritus at the University of Washington.

[3] For a useful primer, see The Eternal Quest for Aether, the Cosmic Stuff That Never Was, Popular Mechanics (Oct 19, 2018).

Why Faith Endures

Jesus replied, “No one who puts a hand to the plow and looks back
is fit for service in the kingdom of God.”

Luke 9: 62 NIV

I once told a leader of our campus Christian fellowship about doubts prompted by my religion major classes. “Get your Bible and read Luke 9: 62,” he said. I did, and can still see the hardness on his face when I looked up. Religions venerate those who long endure, honoring their moral steadfastness. My character and commitment were suspect. I declared a new major the following quarter.

Religions punish doubt and dissidence through peer pressure, public censure, witch hunts, inquisitions, executions, jihads, war, genocide…. The year before, the dining halls had flown into an uproar the day the college newspaper reported that the fellowship had expelled a member for sleeping with her boyfriend.

Religions also have a curious way of tolerating their leaders’ nonconforming behavior — even as the leaders cry witch hunt.[1]

These things happen in all cultural institutions, not just religion. Neuroculture offers an explanation for all of them that emphasizes group dynamics over individual integrity. It goes like this:

  • When enough people believe something, a culture with a shared belief system emerges.
  • Individual doubt about the culture’s belief system introduces “cognitive dissonance” that makes individuals uneasy and threatens cultural cohesiveness.
  • Cohesiveness is essential to the group’s survival — doubt and nonconformity can’t be tolerated.
  • The culture therefore sanctifies belief and stifles doubt.
  • The culture sometimes bends its own rules to preserve its leadership power structure against larger threats.

“This Article Won’t Change Your Mind,” The Atlantic (March 2017) illustrates this process:

“The theory of cognitive dissonance—the extreme discomfort of simultaneously holding two thoughts that are in conflict—was developed by the social psychologist Leon Festinger in the 1950s. In a famous study, Festinger and his colleagues embedded themselves with a doomsday prophet named Dorothy Martin and her cult of followers who believed that spacemen called the Guardians were coming to collect them in flying saucers, to save them from a coming flood. Needless to say, no spacemen (and no flood) ever came, but Martin just kept revising her predictions. Sure, the spacemen didn’t show up today, but they were sure to come tomorrow, and so on. The researchers watched with fascination as the believers kept on believing, despite all the evidence that they were wrong.

“‘A man with a conviction is a hard man to change,’ Festinger, Henry Riecken, and Stanley Schachter wrote in When Prophecy Fails, their 1957 book about this study. ‘Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point … Suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before.’

“This doubling down in the face of conflicting evidence is a way of reducing the discomfort of dissonance, and is part of a set of behaviors known in the psychology literature as ‘motivated reasoning.’ Motivated reasoning is how people convince themselves or remain convinced of what they want to believe—they seek out agreeable information and learn it more easily; and they avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs.

“Though false beliefs are held by individuals, they are in many ways a social phenomenon. Dorothy Martin’s followers held onto their belief that the spacemen were coming … because those beliefs were tethered to a group they belonged to, a group that was deeply important to their lives and their sense of self.

“[A disciple who ignored mounting evidence of sexual abuse by his guru] describes the motivated reasoning that happens in these groups: ‘You’re in a position of defending your choices no matter what information is presented,’ he says, ‘because if you don’t, it means that you lose your membership in this group that’s become so important to you.’ Though cults are an intense example, … people act the same way with regard to their families or other groups that are important to them.”

“Why Facts Don’t Change Our Minds,” The New Yorker (Feb. 27, 2017) explains why the process seems so perfectly reasonable:

“Humans’ biggest advantage over other species is our ability to coöperate. Coöperation is difficult to establish and almost as difficult to sustain.

“Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.

“‘Reason is an adaptation to the hypersocial niche humans have evolved for themselves,’ [the authors of a seminal study] write. Habits of mind that seem weird or goofy or just plain dumb from an ‘intellectualist’ point of view prove shrewd when seen from a social ‘interactionist’ perspective.”

What does it take for individual dissent or cultural change to prevail in the face of these powerful dynamics? We’ll look at that next time.

[1]  This “bigger bully” theory was remarkably evident when Tony Perkins, leader of the Family Research Council, said evangelicals “kind of gave [Donald Trump] a mulligan” over Stormy Daniels, saying that evangelicals “were tired of being kicked around by Barack Obama and his leftists. And I think they are finally glad that there’s somebody on the playground that’s willing to punch the bully.”

Why Belief Works

Our experience of the “real world” will conform to what we believe. It has to, because our brains insist upon it.

They do that in part through neuro-cultural conditioning — the process by which the neurological wiring of a culture’s individual members is patterned after the culture’s belief system, and vice versa. This is the case with any kind of cultural institution, whether national, religious, scientific, economic, corporate, professional, team, tribal, or otherwise.[1] This post looks at religion as an example.[2]

Tim Crane is a professor of philosophy at the Central European University in Budapest. “I work in the philosophy of mind,” his online CV says, “I have attempted to address questions about the most general nature, or essence, of the human mind, and about the place of the mind in the rest of nature.” In his book The Meaning of Belief: Religion From An Atheist’s Point Of View (2017), he cites William James’ 1902 classic The Varieties of Religious Experience for a definition of what he calls “the religious impulse”:

“Were one asked to characterize the life of religion in the broadest and most general terms, one might say that it consists in the belief that there is an unseen order, and that our supreme good lies in harmoniously adjusting ourselves thereto.”

Christian Smith is a sociology professor and director of the Center for the Study of Religion and Society at the University of Notre Dame. Here’s his definition of religion:

“Religion is a complex of culturally prescribed practices, based on premises about the existence and nature of supernatural powers, whether personal or impersonal, which seek to help practitioners gain access to and communicate or align themselves with these powers, in hopes of realizing human goods and avoiding things bad.”

Religion: What It Is, How It Works, And Why It Matters (Princeton University Press, 2017)

Both authors stress that religious principles and practices need to match in order for religion to be effective. In other words:

“Faith without works is dead.”
The Epistle of James 2: 17

As it turns out, “faith without works is dead” is not just scripture, but accurate neuroscience as well. When we practice what we preach, we set up a self-sustaining loop in which belief drives thoughts and behavior, which in turn reinforce belief. In that way, religion develops the brain while the brain develops religion:

“Jordan Grafman, head of the cognitive neuroscience laboratory at Rehabilitation Institute of Chicago and neurology professor at Northwestern University, says that neurotheology is important in part because early religious practices helped develop our brains to begin with. ‘Religion has played an incredibly important role in human evolution. It’s funny, people want to separate the two but in fact they’re intertwined.’”

The Neuroscience Argument That Religion Shaped The Very Structure Of Our Brains,” Quartz (December 3, 2016)

The more widespread and enduring the religious practice, the more the religion develops scriptures, rituals, icons, and institutions to sustain itself. Therefore a Bible passage such as this…

“I was young and now I am old,
yet I have never seen the righteous forsaken
 or their children begging bread.”
Psalm 37: 25 NIV

… becomes both community truth and the “testimony” of individual adherents. But what happens when belief and experience don’t align — e.g., when a member of the congregation and her children in fact go begging?

Some religious thinkers, like the writer of this Huffington Post article, reckon with the contradiction by distinguishing belief from faith. Beliefs are products of the mind, she says, and deal with what can be known, while faith is a product of the spirit, which traffics in what cannot be known. Since knowledge is always shifting, belief can and probably will let us down, while faith in what can’t be known remains inscrutable. Faith therefore invites belief to step aside in favor of “trusting beyond all reason and evidence.”

That outlook captures the essential center of the definitions of religion we saw above:  that there is an “unseen order” populated with “supernatural powers” that exists alongside but separate from ours. (Of which we have only limited understanding, the belief/faith outlook would add.) Whether this satisfies the brain’s need to align internal patterning with external experience is the kind of issue being taken up by the new discipline of neurotheology, which looks at where religion happens in the brain.

Neurotheology’s inquiries have far-reaching implications for many of our common assumptions about how reality is structured. For example, if faith can be explained in neurological terms, then it could be located — in whole or in part — along with belief on this side of the theoretical divide between human and supernatural existence. This shift would likely have a ripple effect on similar dichotomies, such as known vs. unknown, real vs. imaginary, objective vs. subjective, observed vs. inscrutable, temporal vs. transcendent, etc.

More on neurotheology coming up.

[1] For more on cultural patterning, see the other posts in this blog’s category The Basics of Belief, Culture, and Reality.

[2] I talk about Christianity because it is the only religion I have personal experience with. And I am aware, by the way, that I write this post under the influence of my own neuroscientific cultural bias.

Emergence

 


One fine autumn afternoon I watched, transfixed, as a gigantic flock of migratory birds swarmed over the woods across the street. I was watching a “complex, self-organizing system” in action — specifically, a “murmuration” of birds, which is created by “swarm behavior,” which in turn falls into the category of emergence.

Emergence explains how the whole becomes greater than the sum of its parts. The term is widely used — in systems theory, philosophy, psychology, chemistry, biology, neurobiology, machine learning — and for purposes of this blog, it also applies to cultural belief systems and the social institutions they generate.

Consider any culture you like — a team, club, company, profession, investor group, religious gathering, political party…. As we’ve seen previously in this series, the group’s cultural sense of reality is patterned in each individual member’s neural wiring and cellular makeup. But no one member can hold it all, and different members have varying affinity for different aspects of the culture. As a result, each member takes what the others bring “on faith”:  the group believes in its communal beliefs. This faith facilitates the emergence of a cohesive, dynamic cultural body that takes on a life of its own, expressed through its institutions.

That’s emergence.

To get a further sense of how this works, see this TED Talk that uses complex systems theory to look at how the structure of the financial industry (a transnational cultural body) helped to bring about the Great Recession of 2007-2008. Systems theorist James B. Glattfelder[1] lays out a couple of key features of self-organizing systems:

“It turns out that what looks like complex behavior from the outside is actually the result of a few simple rules of interaction. This means you can forget about the equations and just start to understand the system by looking at the interactions.

“And it gets even better, because most complex systems have this amazing property called emergence. This means that the system as a whole suddenly starts to show a behavior which cannot be understood or predicted by looking at the components. The whole is literally more than the sum of its parts.”

In the end, he says, there’s an innate simplicity to it all — “an emergent property which depends on the rules of interaction in the system. We could easily reproduce [it] with a few simple rules.”[2] He compares this outcome to the inevitable polarized logjams we get from clashing cultural ideologies:

 “I really hope that this complexity perspective allows for some common ground to be found. It would be really great if it has the power to help end the gridlock created by conflicting ideas, which appears to be paralyzing our globalized world.  Ideas relating to finance, economics, politics, society, are very often tainted by people’s personal ideologies.  Reality is so complex, we need to move away from dogma.”

Trouble is, we seem to be predisposed toward ideological gridlock and dogma. Even if we’ve never heard of emergence, we have a kind of backdoor awareness of it — a sense that there are meta-influences affecting our lives — but we’re inclined to locate their source “out there,” instead of in our bodily selves. “Out there” is where the Big Ideas live, formulated by transcendent realities and personalities — God, gods, Fate, Destiny, Natural Law, etc. — that sometimes enter our lesser existence to reveal their take on how things work. The problem is, they have super-intelligence while we have only a lesser version, so once we receive their revelations, we codify them into vast bodies of collected wisdom and knowledge, which we then turn over to our sacred and secular cultural institutions to administer. We and our cultures aren’t perfect like they are, but we do our best to live up to their high standards.

We do all this because, as biocentrism champion Robert Lanza has said, most of us have trouble wrapping our heads around the notion that

“Everything we see and experience is a whirl of information occurring in our head. We are not just objects embedded in some external matrix ticking away ‘out there.’”[3]

In our defense, the kind of systems analysis that James Glattfelder uses in his TED talk requires machine super-intelligence and brute data-crunching power that the human brain lacks. We’re analog and organic, not digital, and we use our limited outlook to perpetuate more polarization, ideological gridlock, and dogma. Culture may be emergent, but when it emerges, it walks right into a never-ending committee meeting debating whether it has a place on the agenda.

Next time, we’ll look at what happens when emergent cultures clash.

[1] James B. Glattfelder holds a Ph.D. in complex systems from the Swiss Federal Institute of Technology. He began as a physicist, became a researcher at a Swiss hedge fund, and now does quantitative research at Olsen Ltd in Zurich, a foreign exchange investment manager.

[2] Here’s a YouTube explanation of the three simple rules that explain the murmuration I watched that day.
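For readers who prefer code to video: below is a minimal, illustrative sketch in Python of the three classic flocking rules in the spirit of Craig Reynolds' "boids" model (separation, alignment, cohesion). It is my own sketch, not the code behind the video or Glattfelder's models, and every constant in it is an arbitrary choice for illustration. The point is simply that each simulated bird reacts only to its nearby neighbors, yet flock-level order emerges that no single bird contains or directs.

import random

NUM_BIRDS = 50
NEIGHBOR_RADIUS = 10.0    # how far a bird "sees" its flockmates (arbitrary)
SEPARATION_RADIUS = 2.0   # closer than this is too close (arbitrary)

class Bird:
    def __init__(self):
        self.x, self.y = random.uniform(0, 100), random.uniform(0, 100)
        self.vx, self.vy = random.uniform(-1, 1), random.uniform(-1, 1)

def step(birds):
    for b in birds:
        neighbors = [o for o in birds if o is not b and
                     (o.x - b.x) ** 2 + (o.y - b.y) ** 2 < NEIGHBOR_RADIUS ** 2]
        if not neighbors:
            continue
        # Rule 1 (cohesion): drift toward the average position of nearby birds
        cx = sum(o.x for o in neighbors) / len(neighbors)
        cy = sum(o.y for o in neighbors) / len(neighbors)
        b.vx += 0.01 * (cx - b.x)
        b.vy += 0.01 * (cy - b.y)
        # Rule 2 (alignment): nudge velocity toward the neighbors' average heading
        avx = sum(o.vx for o in neighbors) / len(neighbors)
        avy = sum(o.vy for o in neighbors) / len(neighbors)
        b.vx += 0.05 * (avx - b.vx)
        b.vy += 0.05 * (avy - b.vy)
        # Rule 3 (separation): veer away from any bird that is crowding in
        for o in neighbors:
            if (o.x - b.x) ** 2 + (o.y - b.y) ** 2 < SEPARATION_RADIUS ** 2:
                b.vx -= 0.05 * (o.x - b.x)
                b.vy -= 0.05 * (o.y - b.y)
    # Move every bird, with a crude speed cap so the sketch stays stable
    for b in birds:
        speed = (b.vx ** 2 + b.vy ** 2) ** 0.5
        if speed > 2.0:
            b.vx, b.vy = 2.0 * b.vx / speed, 2.0 * b.vy / speed
        b.x += b.vx
        b.y += b.vy

flock = [Bird() for _ in range(NUM_BIRDS)]
for _ in range(200):
    step(flock)

Nothing in the code describes a flock; the flock is the emergent part.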

[3] From this article in Aeon Magazine.

“Be the Change You Want to See” — Why Change MUST Always Begin With Us


In the beginning, somebody…

Told a story. Made something. Made something that made things. Drew a picture. Used their voice melodiously. Moved a certain way and did it again. Took something apart, put it back together, and built another thing like it. Watched how weather and sky and flora and fauna responded to the passage of time. Sprinkled dry leaves on meat and ate it. Drew a line in the sand and beat someone who crossed it. Traded this for that. Resolved a dispute. Helped a sick person feel better. Took something shiny from the earth or sea and wore it. Had an uncanny experience and explained it.

And then somebody else did, too — and then somebody else after that, and more somebodies after that, until the human race had organized itself into families, clans, tribes, city-states, and nations, each with its own take on life in this world. Millennia later a worldwide civilization had emerged, organized around trans-cultural institutions of law, economics, science, religion, industry, commerce, education, medicine, arts and entertainment….

And then you and I were born as new members of a highly-evolved human culture of innumerable, impossibly complex, interwoven layers.

From our first breaths we were integrated into site-specific cultural institutions that informed our beliefs about how the world works and our place in it. Those institutions weren’t external to us, they were embodied in us — microbes of meaning lodged in our neural pathways and physical biome. Our brains formed around the beliefs of our culture — our neurons drank them in, and our neural networks were wired up with the necessary assumptions, logic, and leaps of faith.

These cellular structures informed what it meant for us to be alive on the Earth, individually and in community. They shaped our observations and awareness, experiences and interpretations, tastes and sensibilities. They defined what is real and imaginary, set limits around what is true and false, acceptable and taboo. And then they reinforced the rightness of it all with feelings of place and belonging, usefulness and meaning. When that was done, our brains and bodies were overlaid with a foundation for status quo — the way things are, and are supposed to be.

All that happened in an astonishing surge of childhood development. Then came puberty, when our brain and body hormones blasted into overdrive, dredging up our genetic and environmental beginnings and parading them out for reexamination. We kept this and discarded that, activated these genes instead of those. (The process by which we do that is called epigenetics, and it explains why your kids aren’t like you.) We also tried on countercultural beliefs, welcoming some and rejecting others. From there, we entered adult life freshly realigned with a differentiated sense of self, us, and them.

From then on, adult life mostly reinforces our cultural beginnings, although the nuisances and opportunities of change periodically require us to remake shared agreements in our communities, professions, workplaces, teams, and other groups, each time reaffirming and refining our shared cultural foundations. In doing so, we sometimes flow with the changing times, and sometimes retrench with nostalgic fervor.

Where does all this biological, cognitive, and social development and maintenance happen? In the only place it possibly could:  in the hot wet darkness inside the human body’s largest organ — our skin. Yes, there is a “real world” out there that we engage with, but the processing and storing of experience happen inside — encoded in our brains and bodies.

Which is why individual and cultural change must always begin with us — literally inside of us, in our physical makeup — because that’s where our world and our experience of it are registered and maintained. Gandhi’s famous words are more than a catchy meme, they describe basic human reality:  if we want things to change, then we must be transformed. Think about it:  we have no belief, perception, experience, or concept of status quo that is not somehow registered in our brains and bodies, so where else could change happen? (Unless there’s something like a humanCloud where it can be uploaded and downloaded — but that’s another issue for another time.)

The implications of locating human experience in our physical selves are far-reaching and fascinating. We’ll be exploring them.

#icons #iconoclast #psychology #philosophy #sociology #neurology #biology #narrative #belief #society #socialstudies #religion #law #economics #work #jobs #science #industry #commerce #education #medicine #arts #entertainment #civilization #evolution #perception #reality #subjective #culture #culturalchange #change #paradigmshift #transformation #growth #personalgrowth #futurism #technology #identity #rational #consciousness #cognition #bias #cognitivebias #brain #development #childdevelopment #puberty #adolescence #hormones #genetics #epigenetics #gandhi #bethechange #bethechangeyouwant #neurons #neuralnetworks

 

What Iconoclast.blog Is About


I’ve spent the past ten years writing books, blogs, and articles on technology, jobs, economics, law, personal growth, cultural transformation, psychology, neurology, fitness and health… all sprinkled with futurism. In all those seemingly unrelated topics, I’ve been drawn to a common theme:  change. One lesson stands out:

Beliefs create who we are individually and collectively.
The first step of change is to be aware of them.
The second step is to leave them behind.

Beliefs inform personal and collective identity, establish perspective, explain biases, screen out inconsistent information, attract conforming experience, deflect non-conforming information and experience, and make decisions for us that we only rationalize in hindsight.

Those things are useful:  they tame the wild and advance civilization, help us locate our bewildered selves and draw us into protective communities. We need that to survive and thrive.  But they can be too much of a good thing. They make us willfully blind, show us only what we will see and hide what we won’t. They build our silos, sort us into polarities, close our minds, cut us off from compassion, empathy, and meaningful discourse.

Faced with the prospect of change, beliefs guard status quo against the possibility that something else is possible — which is precisely what we have to believe if we’re after change. Trouble is, to believe just that much threatens all our other beliefs. Which means that, if we want something else,

We need to become iconoclasts.

The Online Etymology Dictionary says that “iconoclast” originally meant “breaker or destroyer of images,” referring to religious zealots who vandalized icons in Catholic and Orthodox churches because they were “idols.” Later, the meaning was broadened to “one who attacks orthodox beliefs or cherished institutions.”

Our beliefs are reflected, transmitted, and reinforced in our religious, national, economic, and other cultural institutions. These become our icons, and we cherish them, invest them with great dignity, revere them as divine, respect them as Truth with a capital T, and fear their wrath if we neglect or resist them. We confer otherworldly status on them, treat them as handed down from an untouchable level of reality that supersedes our personal agency and self-efficacy. We devote ourselves to them, grant them unquestioned allegiance, and chastise those who don’t bow to them alongside us.

Doing that, we forget that our icons only exist because they were created out of belief in the first place. In the beginning, we made them up. From there, they evolved with us. To now and then examine, challenge, and reconfigure them and the institutions that sustain them is an act of creative empowerment — one of the highest and most difficult gifts of being human.

Change often begins when that still small voice pipes up and says, “Maybe not. Maybe something else is possible.” We are practiced in ignoring it; to become an iconoclast requires that we listen, and question the icons that warn us not to. From there, thinking back to the word’s origins, I like “challenge” better than “attack.”  I’m not an attacker by nature, I’m an essayist — a reflective, slow thinker who weighs things and tries to make sense of them. I’m especially not a debater or an evangelist — I’m not out to convince or convert anyone, and besides, I lack the quick-thinking mental skill set.

I’m also not an anarchist, libertarian, revolutionary… not even a wannabe Star Wars rebel hero, cool as that sounds. I was old enough in the ’60s to party at the dawning of the Age of Aquarius, but then it failed like all the other botched utopias — exposed as one more bogus roadmap claiming to chart the way back to the Garden.

Sorry, but the Garden has been closed for a long, long time.


A friend used to roll his eyes and say, “Some open minds ought to close for business.” Becoming an iconoclast requires enough open-mindedness to suspend status quo long enough to consider that something else is possible. That’s not easy, but it is the essential beginning of change, and it can be done.

Change needs us to be okay with changing our minds.

All the above is what I had in mind when I created Iconoclast.blog. I am aware of its obvious potential for inviting scoffing on a good day, embarrassment and shaming on a worse, and vituperation, viciousness, trolling, and general spam and nastiness on the worst. (Which is why I disabled comments on the blog, and instead set up a Facebook page that offers ample raving opportunity.) Despite those risks, I plan to pick up some cherished icons and wonder out loud what might be possible in their absence. If you’re inclined to join me, then please click the follow button for email delivery, or follow the blog on Facebook. I would enjoy the company.

#icons #iconoclast #psychology #philosophy #sociology #neurology #biology #narrative #belief #society #socialstudies #religion #law #economics #work #jobs #science #industry #commerce #education #medicine #arts #entertainment #civilization #evolution #perception #reality #subjective #culture #culturalchange #change #paradigmshift #transformation #growth #personalgrowth #futurism #technology #identity #rational #consciousness #cognition #bias #cognitivebias #agency #selfefficacy