Knowledge, Conviction, and Belief [5]

My soul is lost, my friend
Tell me how do I begin again?
My city’s in ruins,
My city’s in ruins.

Bruce Springsteen

Neuroscience looks for the soul in the brain and can’t find it. What it finds instead are the elements of consciousness — sensory perception, language, cognition, memory, etc. — in various neural networks and regions of the brain, and those diverse networks collaborating to generate a composite conscious experience. Meanwhile, the master network — the one that is equivalent to conventional notions of the soul or self — remains elusive.

Prof. Bruce Hood lays out the progression from conventional belief in a separate self to the current brain network theory:

“Psychologist Susan Blackmore makes the point that the word “illusion” does not mean that it does not exist — rather an illusion is not what it seems. We all certainly experience some form of self, but what we experience is a powerful deception generated by our brains for our own benefit.

“Understanding that the self could be an illusion is really difficult… Our self seems so convincing, so real to us. But then again, many aspects of our experience are not what they seem.

“In challenging what is the self, what most people think is the self must first be considered. If you were to ask the average person in the street about their self, they would most likely describe the individual who inhabits their body. They believe they are more than just their bodies. Their bodies are something their selves control. When we look in the mirror, we regard the body as a vessel we occupy.

“This sense that we are individual inside bodies is sometimes called the ‘ego theory,’ although philosopher Galen Strawson captures it poetically in what he calls the ‘pearl view’ of the self. The pearl view is the common notion that our self is an essential entity at the core of our existence that holds steady throughout our life. The ego experiences life as a conscious, thinking person with a unique historical background that defines who he or she is. This is the ‘I’ that looks back in the bathroom mirror and reflects who is the ‘me.’

“In contrast to this ego view, there is an alternative version of the self, based on the ‘bundle theory’ after the Scottish Enlightenment philosopher David Hume… He tried to describe his inner self and thought that there was no single entity, but rather bundles of sensations, perceptions and thoughts piled on top of each other. He concluded that the self emerged out of the bundling together of these experiences.

“If the self is the sum of our thoughts and actions, then the first inescapable fact is that these depend on brains. Thoughts and actions are not exclusively the brain because we are always thinking about and acting upon things in the world with our bodies, but the brain is primarily responsible for coordinating these activities. In effect, we are our brains or at least, the brain is the most critical body part when it comes to who we are.

“There is no center in the brain where the self is constructed. The brain has many distributed jobs. It processes incoming information from the external world into meaningful patterns that are interpreted and stored for future reference. It generates different levels and types of motivations that are the human drives, emotions, and feelings. It produces all sorts of behavior — some of them automatic while others are acquired through skill, practice, and sheer effort.

“The sense of self that most of us experience is not to be found in any one area. Rather it emerges out of the orchestra of different brain processes.”

The Self Illusion:  How the Social Brain Creates Identity, Bruce Hood (2012)

Princeton neuroscientist Michael Graziano uses an “attention schema theory” to describe this collaboration of neural networks. “The heart of the theory is that awareness is a schematized, descriptive model of attention,” he says, and expands as follows:

“In the present theory, the content of consciousness, the stuff in the conscious mind, is distributed over a large set of brain areas, areas that encode vision, emotion, language, action plans, and so on. The full set of information that is present in consciousness at any one time has been called the ‘global workspace.’ In the present theory, the global workspace spans many diverse areas of the brain. But the specific property of awareness, the essence of awareness added to the global workspace, is constructed by an expert system in a limited part of the brain…. The computed property of awareness can be bound to the larger whole… One could think of awareness as information.”

Consciousness and the Social Brain, Michael S. A. Graziano (2013)

To those who hold fast to the common belief (as most people do) that the soul is something transcendent, noble, unique, special, poetic, and divine, referring to consciousness and the self as “global workspace” and calling awareness “information” lacks a little something. But is that any reason to reject the bundle theory as untrue?

Meanwhile, Prof. Graziano admits that “the attention schema theory does not even seek to answer the question of existential reality but instead tries to describe what is constructed by the brain.” And besides, is science really after truth anyway?

We’ll look at those questions next time.

Knowledge, Conviction, and Belief [4]


The period of roughly 2010-2016 apparently was a breakthrough time for neuroscience and the study of consciousness. About then, a scientific consensus began to emerge that the conscious human mind was generated by the brain — or, as some put it, “the mind is what the brain does.”

In a 2016 article[1], University of Sussex professor of cognitive and computational neuroscience Anil K. Seth wrote:

“In my own research, a new picture is taking shape in which conscious experience is seen as deeply grounded in how brains and bodies work together to maintain physiological integrity – to stay alive.”

Since then, other brain researchers have added a third essential component:  our environment, particularly our cultural setting and its institutionalized belief systems.

Brain, body, environment — that’s it, that’s what brain science has come up with. It went looking for the soul and didn’t find it. As physician and researcher Paul Singh also wrote in 2016[2]:

“The idea of a transcendent self is a myth; the truth is that the self is a constructed self.”

Not only that, but when a consciousness based purely in physicality replaces traditional belief in an eternal, transcendent soul temporarily at home in a physical human body, other companion notions about the self, consciousness, and free will also come tumbling down. Singh admitted this wasn’t going to be easy news to swallow:

“I will be the first one to admit that the debates about the nature of free will, consciousness, and the self are far from over. It is not, however, because we don’t know the answers, but because we are not at a stage of human evolution and progress yet for people to accept such radical ideas. Such truths are scary in the sense that they undermine our ordinary and commonsensical beliefs about human nature and seem to threaten values that we hold dearly — one of the most important of which is moral responsibility.”

Difficult, yes, but not impossible if you can suspend allegiance to the things you’re convinced of and convicted about, and instead give scientific knowledge a try. Singh makes his case with impassioned advocacy of science and the scientific method:

“I believe, however, that the truthfulness of a fact should be judged on its own merit rather than based on its social and emotional implications for the well-being of an individual or society. Truth should be acknowledged first and then solutions sought that will be implemented in light of the good and bad that truth has revealed, not the other way around. Truth is about truth and not about convenience or about making us feel good about ourselves.

“We should never believe a claim to be true simply because no one can prove it to be false. Theologians are experts at this kind of nonsense. Are delusional people making things up? Evidence shows that the human brain is universally delusional in many ways and therefore people who promote superstitions are not particularly more delusional than the rest of us. It is just that examples of religious delusions are rather classic examples of how the brain creates illusions and delusions. The use of logic and scientific skepticism is a skill that can be used to overcome the limitations of our own brains. This skill is like any other skill such as learning to play the piano. It involves training in metacognition as well as basic education in basic sciences.”

Frankly, that kind of rhetoric invariably comes across as bombastic and opinionated, and is therefore easy for those convinced otherwise to dismiss. The well-worn neural pathways of our own brains are deeply rutted with their own notions of what is true, and they are not about to shift to a new paradigm just because someone else is convinced that “truth is about truth.”

On the other hand, in my personal experience, I’ve found that the precursor to scientific knowledge — “scientific skepticism” — is in fact “a skill that can be used to overcome the limitations of our own brains.” I’ve been developing the skill gradually for years, without intentionally doing so. I was no scientist; I’d spent a lifetime in the humanities; my allegiance was with Romanticism, not the Enlightenment. I was not out to find or prove truth, or convince anybody of it. But I was looking for new thoughts, and years of reading and reflecting — like water carving sandstone — slowly brought my thinking to a new place.

The first time I read about the “materialist” version of consciousness I thought it was just plain odd, which made me highly skeptical. Ironically, that skepticism eventually sharpened into a practice that brought me to a place where I don’t find the materialist idea odd at all; in fact, it seems odd to think the way I used to. It now seems simple and obvious that everything we experience is processed within the confines of our largest organ — our skin — and that it has to be that way because, as a biological organism, there’s no other place where it can happen. Even if we think about an eternal, transcendent soul, we do so from our ephemeral, fleshly point of view. That’s all the equipment we’ve got.

Continued next time.

[1] The Real Problem:  It looks like scientists and philosophers might have made consciousness far more mysterious than it needs to be, Aeon Magazine (Nov. 2, 2016)

[2] The Great Illusion:  The Myth of Free Will, Consciousness, and the Self, Paul Singh (2016)

Knowledge, Conviction, and Belief [3]


We’ve been talking about dualistic thinking — the kind that leads us to think we live simultaneously in two realities.

Reality A is “life in the flesh” — bound by space and time and all the imperfections of what it means to be human. It is life carried on in our physical bodies, where our impressive but ultimately limited brains are in charge.

Reality B is “life in the spirit” — the eternal, perfect, transcendent, idealized, supernatural, original source that informs, explains, and guides its poorer counterpart.

This dualistic thinking says there’s more to life than meets the eye, that humans are an “eternal soul having a worldly existence.” The dualism sets up a cascade of derivative beliefs, for example:

There’s a difference between the Reality A identity and experience we actually have and the Reality B identity and experience we would have if we could rise above Reality A and live up to the idealized version of Reality B.

Every now and then, somebody gets lucky or gets saved or called, and gets to live out their Reality B destiny, which gives them and their lives a heightened sense of purpose and meaning.

But those are the chosen few, and they’re rare. For most of us, our ordinary selves and mundane lives are only a shadow of our “higher selves” and “greater potential.”

The chosen few can — and often do — provide guidance as to how we can do better, and we do well to find some compatible relation with one or more of them, but sometimes, in the right setting and circumstance, we might discover that we have receptors of our own that can receive signals from Reality B. We call this “enlightenment” or “conversion” or “salvation” or something like that, and it’s wonderful, blissful, and euphoric.

But most of the time, for the vast majority of us, Reality A is guided by a mostly one-way communication with Reality B — a sort of moment-by-moment data upload from A to B, where everything about us and our lives — every conscious and subconscious intent, motive, thought, word, and deed — gets stored in a failsafe beyond-time data bank. When our Reality A lives end, those records determine what happens next — they inform our next trip through Reality A, or set the stage for a Reality B existence we’re really going to like or really going to suffer.

Everybody pretty much agrees it’s useful to have good communication with or awareness of Reality B, because that helps us live better, truer, happier, more productive lives in Reality A, and because it creates a better data record when our Reality A existence ends and we pass over to Reality B.

And on it goes. No, we don’t express any of it that way:  our cultural belief systems and institutions — religious doctrines, moral norms, legal codes, academic fields of study, etc. — offer better-dressed versions. But it’s remarkable how some version of those beliefs finds its way into common notions about how life works.

At the heart of it all is our conviction — not knowledge — that this thing we consciously know as “me” is an independent self that remains intact and apart from the biological messiness of human life, able to choose its own beliefs, make its own decisions, and execute its own actions. In other words, we believe in consciousness, free will, and personal responsibility for what we are and do — and what we aren’t and don’t do — during what is only a sojourn — a short-term stay — on Earth.

Those beliefs explain why, for example, it bothers us so much when someone we thought we knew departs from their beginnings and displays a changed inner and outer expression of who they were when we thought we knew them. “Look who’s in the big town,” we say. Or we pity them and knock wood and declare thank goodness we’ve been lucky. Or we put them on the prayer chain or call them before the Inquisition… anything but entertain the idea that maybe Reality B isn’t there — along with all the belief it takes to create it — and that instead all we have is Reality A — we’re nothing but flesh and bone.

It’s almost impossible to think that way. To go there, we have to lay aside conviction and embrace knowledge.

Almost impossible.

Almost.

We’ll give it a try in the coming weeks.

Knowledge, Conviction, and Belief [2]

We think we have an independent ability to think and believe as we like, to know this or be convinced about that. But that’s not the whole story:  our outlook is also shaped by our cultural context.

As we’ve seen, when enough people agree about what is true — whether they “know” it or are “convinced” of it — their agreement becomes a cultural belief system — for example, as reflected in a religion, country, neighborhood, business, athletic team, or other institution. Cultural belief systems are wired into the neural pathways of individual members, and as the culture coalesces, its belief system takes on a life of its own through a process known as “emergence.” As the emergent belief system is increasingly reflected in and reinforced by cultural institutions, it is increasingly patterned into the neural pathways of the culture’s members, where it defines individual and collective reality and sense of identity. The belief system becomes The Truth, defining what the group and its members know and are convinced of.

Throughout this process, whether the culture’s beliefs are true in any non-subjective sense loses relevance. The result is what physician and author Paul Singh refers to as “mass delusion”:

“[When a conviction moves from an individual to being widely held], its origins are rooted in a belief system rather than in an individual’s pathological condition. It is a mass delusion of the sort that poses no immediate threat to anyone or society. Mass delusions can become belief systems that are passed from generation to generation.”

The Great Illusion:  The Myth of Free Will, Consciousness, and the Self, Paul Singh (2016)

For a dramatic example of this concept in action, consider an experience described by Jesse Jackson:

“There is nothing more painful to me at this stage in my life than to walk down the street and hear footsteps… then turn around and see somebody white and feel relieved.”

Despite a lifetime of civil rights leadership, Jackson’s cultural neural conditioning betrayed him. What he experienced was not just personal to him; it conformed to a cultural belief system. The particular “mass delusion” involved has been confirmed by clinical research.

“Matthew Lieberman, a psychologist at the University of California, recently showed how beliefs help people’s brains categorise others and view objects as good or bad, largely unconsciously. He demonstrated that beliefs (in this case prejudice or fear) are most likely to be learned from the prevailing culture.

“When Lieberman showed a group of people photographs of expressionless black faces, he was surprised to find that the amygdala — the brain’s panic button — was triggered in almost two-thirds of cases. There was no difference in the response between black and white people.”

Where Belief Is Born, The Guardian (June 30, 2005)

When cultural beliefs are not constantly reinforced — by cultural norms of thought, language, practice, etc. — the neural networks that support them can weaken, allowing opportunity for new beliefs.

“‘Beliefs are mental objects in the sense that they are embedded in the brain,’ says [Kathleen Taylor, a neuroscientist at Oxford University] ‘If you challenge [beliefs] by contradiction, or just by cutting them off from the stimuli that make you think about them, then they are going to weaken slightly. If that is combined with very strong reinforcement of new beliefs, then you’re going to get a shift in emphasis from one to the other.’”

Where Belief Is Born

This helps to explain, for example, why religious believers are more likely to “fall away” if they are “out of fellowship.” Or what can happen to a student off to college, a world traveler, or an immigrant. It also helps to explain why leaders and despots alike can manipulate brain networks to create cultural belief systems to fit their desired ends:

“In her book on the history of brainwashing, Taylor describes how everyone from the Chinese thought reform camps of the last century to religious cults have used systematic methods to persuade people to change their ideas, sometimes radically.

“The mechanism Taylor describes is similar to the way the brain learns normally. In brainwashing though, the new beliefs are inserted through a much more intensified version of that process.

“The first step is to isolate a person and control what information they receive. Their former beliefs need to be challenged by creating uncertainty. New messages need to be repeated endlessly. And the whole thing needs to be done in a pressured, emotional environment.

“Stress affects the brain such that it makes people more likely to fall back on things they know well – stereotypes and simple ways of thinking,” says Taylor.

“This manipulation of belief happens every day. Politics is a fertile arena, especially in times of anxiety.”

Where Belief Is Born

More next time.

Knowledge, Conviction, and Belief

“For I am convinced that neither death nor life, neither angels nor demons, neither the present nor the future, nor any powers, neither height nor depth, nor anything else in all creation, will be able to separate us from the love of God that is in Christ Jesus our Lord.”

Paul’s letter to the Romans 8:38-39 (NIV)

How did Paul know that? Why was he so convinced?

According to psychology and neuroscience, he didn’t know it; he was convinced of it. The difference reflects Cartesian dualism:  the belief that we can know things about the natural world through scientific inquiry, but that in the supernatural world, truth is a matter of conviction.

Academics draw distinctions between these and other terms,[1] but in actual experience, the essence seems to be emotional content. Scientific knowledge is thought to be emotionally detached — it wears a lab coat, pores over data, expresses conclusions intellectually. It believes its conclusions, but questioning them is hardwired into scientific inquiry; science therefore must hold its truth in an open hand — all of which establishes a reliable sense of what is “real.” Conviction, on the other hand, comes with heart, with a compelling sense of certainty. The emotional strength of conviction makes questioning its truth — especially religious conviction — something to be discouraged or punished.

Further, while knowledge may come with a Eureka! moment — that satisfying flash of suddenly seeing clearly — conviction often comes with a sense of being overtaken by an authority greater than ourselves — of being apprehended and humbled, left frightened and grateful for a second chance.

Consider the etymologies of conviction and convince:

conviction (n.)

mid-15c., “the proving or finding of guilt of an offense charged,” from Late Latin convictionem (nominative convictio) “proof, refutation,” noun of action from past-participle stem of convincere “to overcome decisively,” from com-, here probably an intensive prefix (see com-), + vincere “to conquer” (from nasalized form of PIE root *weik- (3) “to fight, conquer”).

Meaning “mental state of being convinced or fully persuaded” is from 1690s; that of “firm belief, a belief held as proven” is from 1841. In a religious sense, “state of being convinced one has acted in opposition to conscience, admonition of the conscience,” from 1670s.

convince (v.)

1520s, “to overcome in argument,” from Latin convincere “to overcome decisively,” from assimilated form of com-, here probably an intensive prefix (see com-), + vincere “to conquer” (from nasalized form of PIE root *weik- (3) “to fight, conquer”). Meaning “to firmly persuade or satisfy by argument or evidence” is from c. 1600. Related: convinced, convincing, convincingly.

To convince a person is to satisfy his understanding as to the truth of a certain statement; to persuade him is, by derivation, to affect his will by motives; but it has long been used also for convince, as in Luke xx. 6, “they be persuaded that John was a prophet.” There is a marked tendency now to confine persuade to its own distinctive meaning. [Century Dictionary, 1897]

Both knowledge and conviction, and the needs they serve, are evolutionary survival skills:  we need what they give us to be safe, individually and collectively. Knowledge satisfies our need to be rational, to think clearly and logically, to distinguish this from that, to put things into dependable categories. Conviction satisfies the need to be moved, and also to be justified — to feel as though you are in good standing in the cosmology of how life is organized.

Culturally, conviction is often the source of embarrassment, guilt, and shame, all of which have a key social function — they are part of the glue that holds society together. Becoming aware that we have transgressed societal laws or behavioral norms (the “conviction of sin”) often brings not just chastisement but also remorse and relief — to ourselves and to others in our community:  we’ve been arrested, apprehended, overtaken by a corrective authority, and saved from doing further harm to ourselves and others.

Knowledge and conviction also have something else in common:  both originate in the brain’s complex tangle of neural networks:

“It is unlikely that beliefs as wide-ranging as justice, religion, prejudice or politics are simply waiting to be found in the brain as discrete networks of neurons, each encoding for something different. ‘There’s probably a whole combination of things that go together,’ says [Peter Halligan, a psychologist at Cardiff University].

“And depending on the level of significance of a belief, there could be several networks at play. Someone with strong religious beliefs, for example, might find that they are more emotionally drawn into certain discussions because they have a large number of neural networks feeding into that belief.”

Where Belief Is Born, The Guardian (June 30, 2005).

And thus protected by the knowledge and convictions wired into our neural pathways, we make our way through this precarious thing called “life.”

More next time.

[1] Consider also the differences between terms like conviction and belief, and fact, opinion, belief, and prejudice.

Who’s In Charge Here?


Edelweiss, edelweiss
Every morning you greet me

Small and white
Clean and bright
You look happy to meet me

(A little exercise in anthropomorphism
from The Sound of Music)

This hierarchy of consciousness we looked at last time — ours is higher than the rest of creation, angels’ is higher than ours, God’s is highest — is an exercise in what philosophy calls teleology:  “the explanation of phenomena in terms of the purpose they serve.” Teleology is about cause and effect — it looks for design and purpose, and its holy grail is what psychologists call agency:  who or what is causing things we can’t control or explain.

“This agency-detection system is so deeply ingrained that it causes us to attribute agency in all kinds of natural phenomena, such as anger in a thunderclap or voices in the wind, resulting in our universal tendency for anthropomorphism.

“Stewart Guthrie, author of Faces in the Clouds:  A New Theory of Religion, argues that ‘anthropomorphism may best be explained as the result of an attempt to see not what we want to see or what is easy to see, but what is important to see:  what may affect us, for better or worse.’ Because of our powerful anthropomorphic tendency, ‘we search everywhere, involuntarily and unknowingly, for human form and results of human action, and often seem to find them where they do not exist.’”

The Patterning Instinct:  A Cultural History of Humanity’s Search for Meaning, Jeremy Lent (2017)

Teleological thinking is a characteristic feature of religious, magical, and supernatural thinking:

“Academic research shows that religious and supernatural thinking leads people to believe that almost no big life events are accidental or random. As the authors of some recent cognitive-science studies at Yale put it, ‘Individuals’ explicit religious and paranormal beliefs are the best predictors of their perception of purpose in life events’ — their tendency ‘to view the world in terms of agency, purpose, and design.’”

How America Lost Its Mind, The Atlantic (Sept. 2017)

Psychology prof Clay Routledge describes how science debunks teleology, but also acknowledges why it’s a comfortable way of thinking:

“From a scientific point of view, we were not created or designed but instead are the product of evolution. The natural events that shaped our world and our own existence were not purposeful. In other words, life is objectively meaningless. From this perspective, the only way to find meaning is to create your own, because the universe has no meaning or purpose. The universe just is. Though there are certainly a small percentage of people who appear to accept this notion, much of the world’s population rejects it.

“For most humans, the idea that life is inherently meaningless simply will not do.

“Instead, people latch onto what I call teleological thinking. Teleological thinking is when people perceive phenomena in terms of purpose. When applied to natural phenomena, this type of thinking is generally considered to be flawed because it imposes design where there is no evidence for it.  To impose purpose and design where there is none is what researchers refer to as a teleological error.”

Supernatural: Death, Meaning, and the Power of the Invisible World, Clay Routledge (2018)

It’s one thing to recognize “teleological error,” it’s another to resist it — even for those who pride themselves on their rationality:

“Even atheists who reject the supernatural and scientists who are trained not to rely on teleological explanations of the world do, in fact, engage in teleological thinking.

“Many people who reject the supernatural do so through thoughtful reasoning. … However, when these people are making teleological judgments, they are not fully deploying their rational thinking abilities.

“Teleological meaning comes more from an intuitive feeling than it does from a rational decision-making process.”

Supernatural: Death, Meaning, and the Power of the Invisible World

Teleological thinking may be understandable, but scientist and medical doctor Paul Singh comes down hard on the side of science as the only way to truly “know” something:

“All scientists know that the methods we use to prove or disprove theories are the only dependable methods of understanding our universe. All other methodologies of learning, while appropriate to employ in situations when science cannot guide us, are inherently flawed. Reasoning alone — even the reasoning of great intellects — is not enough. It must be combined with the scientific method if it is to yield genuine knowledge about the universe.”

The Great Illusion:  The Myth of Free Will, Consciousness, and the Self, Paul Singh (2016)

After admitting that “evidence shows that the human brain is universally delusional in many ways,” Singh makes his case that “the use of logic and scientific skepticism is a skill that can be used to overcome the limitations of our own brains.”

Next time, we’ll look more into the differences in how science and religion “know” things to be “true.”

A Little Lower Than the Angels

“When I consider Your heavens, the work of Your fingers,
The moon and the stars, which You have ordained,
What is man that You are mindful of him,
And the son of man that You visit him?
For You have made him a little lower than the angels,
And You have crowned him with glory and honor.
You have made him to have dominion over the works of Your hands;
You have put all things under his feet.”

Psalm 8:3-6 (NKJV)

Anthropocentrism is the belief that humans are the apex of creation. The belief is so common that the mere suggestion we might not be throws us into cognitive dissonance — “the state of having inconsistent thoughts, beliefs, or attitudes, especially as relating to behavioural decisions and attitude change.”

Cognitive dissonance runs especially hot when science threatens religious paradigms like the anthropocentric one in the Biblical passage above.[1] Biologist David Barash wrote his book to bring it on — this is from the Amazon promo:

 “Noted scientist David P. Barash explores the process by which science has, throughout time, cut humanity “down to size,” and how humanity has responded. A good paradigm is a tough thing to lose, especially when its replacement leaves us feeling more vulnerable and less special. And yet, as science has progressed, we find ourselves–like it or not–bereft of many of our most cherished beliefs, confronting an array of paradigms lost… Barash models his argument around a set of ‘old’ and ‘new’ paradigms that define humanity’s place in the universe.”

Through a Glass Brightly:  Using Science to See Our Species as We Really Are

Here’s his old/new paradigm summary re: anthropocentrism:

Old:  Human beings are fundamentally important to the cosmos.
New:  We aren’t.

Old:  We are literally central to the universe, not only astronomically, but in other ways, too.
New:  We occupy a very small and peripheral place in a not terribly consequential galaxy, tucked away in just one insignificant corner of an unimaginably large universe.

Cognitive dissonance is why non-anthropocentric paradigms come across as just plain weird — like Robert Lanza’s biocentrism:

“Every now and then, a simple yet radical idea shakes the very foundations of knowledge. The startling discovery that the world was not flat challenged and ultimately changed the way people perceived themselves and their relationships with the world.

“The whole of Western natural philosophy is undergoing a sea change again, forced upon us by the experimental findings of quantum theory. At the same time, these findings have increased our doubt and uncertainty about traditional physical explanations of the universe’s genesis and structure.

“Biocentrism completes this shift in worldview, turning the planet upside down again with the revolutionary view that life creates the universe instead of the other way around. In this new paradigm, life is not just an accidental byproduct of the laws of physics.

“Biocentrism shatters the reader’s ideas of life, time and space, and even death. At the same time, it releases us from the dull worldview that life is merely the activity of an admixture of carbon and a few other elements; it suggests the exhilarating possibility that life is fundamentally immortal.”

Anthropocentrism works closely with another human-centered belief practice:  “anthropomorphism,” which is “the attribution of human traits, emotions, or intentions to non-human entities” — for example those angels we’re just a little lower than, and God, who put the hierarchy of God, angels, humans, and the rest of creation in place. The human trait we attribute to God and the angels is the same one we believe sets us apart from the rest of creation:  consciousness.

“When our anthropomorphism is applied to religious thought, it’s notably the mind, rather than the body, that’s universally applied to spirits and gods. In the diverse cultures of the world, gods come in all shapes and sizes, but one thing they always share is a mind with the ability to think symbolically just like a human. This makes sense in light of the critical importance of theory of mind in the development of our social intelligence:  if other people have minds like ours, wouldn’t that be true of other agents we perceive to act intentionally in the world?”

The Patterning Instinct:  A Cultural History of Humanity’s Search for Meaning, Jeremy Lent (2017)

Anthropocentrism puts us in charge as far as our consciousness can reach. Anthropomorphism puts beings with higher consciousness in charge of the rest. Both practices are truly anthropo- (human) centered; the beliefs they generate start and end with our own human consciousness. Which means our attempts to think beyond our range are inescapably idolatrous:  we create God and the angels in our image, and they return the favor.

There’s a philosophical term that describes what’s behind all this, called “teleology” — the search for explanation and design, purpose and meaning. We’ll look at that next time.

[1] The case for anthropocentrism starts in the first chapter of the Bible:  “Then God said, ‘Let us make mankind in our image, in our likeness, so that they may rule over the fish in the sea and the birds in the sky, over the livestock and all the wild animals, and over all the creatures that move along the ground.’ So God created mankind in his own image, in the image of God he created them; male and female he created them. God blessed them and said to them, ‘Be fruitful and increase in number; fill the earth and subdue it. Rule over the fish in the sea and the birds in the sky and over every living creature that moves on the ground.’ Then God said, ‘I give you every seed-bearing plant on the face of the whole earth and every tree that has fruit with seed in it. They will be yours for food. And to all the beasts of the earth and all the birds in the sky and all the creatures that move along the ground — everything that has the breath of life in it — I give every green plant for food.’” Genesis 1:26-30. The post-deluge version removed the vegetarian requirement:  “Then God blessed Noah and his sons, saying to them, ‘Be fruitful and increase in number and fill the earth. The fear and dread of you will fall on all the beasts of the earth, and on all the birds in the sky, on every creature that moves along the ground, and on all the fish in the sea; they are given into your hands. Everything that lives and moves about will be food for you. Just as I gave you the green plants, I now give you everything.’” Genesis 9:1-3.

“Fearfully and Wonderfully Made”


We are starting this series on Consciousness and the Self by looking at some of the religious and secular foundations of the belief that humans are a dualist entity consisting of body and soul, and the associated belief that the two elements are best understood by different forms of inquiry — religion and the humanities for the soul, and science for the body. As we’ll see, current neuro-biological thinking defies these beliefs and threatens their ancient intellectual, cultural, and historical dominance.

This article[1] is typical in its conclusion that one of the things that makes human beings unique is our “higher consciousness.”

“[Homo sapiens] sits on top of the food chain, has extended its habitats to the entire planet, and in recent centuries, experienced an explosion of technological, societal, and artistic advancements.

“The very fact that we as human beings can write and read articles like this one and contemplate the unique nature of our mental abilities is awe-inspiring.

“Neuroscientist V.S. Ramachandran said it best: ‘Here is this three-pound mass of jelly you can hold in the palm of your hand…it can contemplate the meaning of infinity, and it can contemplate itself contemplating the meaning of infinity.’

“Such self-reflective consciousness or ‘meta-wondering’ boosts our ability for self-transformation, both as individuals and as a species. It contributes to our abilities for self-monitoring, self-recognition and self-identification.”

The author of the following Biblical passage agrees, and affirms that his “soul knows it very well” — i.e., not only does he know he’s special, but he knows that he knows it:

For you formed my inward parts;
    you knitted me together in my mother’s womb.
I praise you, for I am fearfully and wonderfully made.
Wonderful are your works;
    my soul knows it very well.

Psalm 139:13-14 (ESV)

Judging from worldwide religious practice, the “I” that is “fearfully and wonderfully made” is limited to the soul, not the body:  the former feels the love, while the latter is assaulted with unrelenting, vicious, sometimes horrific verbal and physical abuse. “Mortification of the flesh” indeed — as if the body needs help being mortal.

Science apparently concurs with this dismal assessment. The following is from the book blurb for Through a Glass Brightly:  Using Science to See Our Species as We Really Are, by evolutionary biologist and psychologist David P. Barash (2018):

“In Through a Glass Brightly, noted scientist David P. Barash explores the process by which science has, throughout time, cut humanity ‘down to size,’ and how humanity has responded. A good paradigm is a tough thing to lose, especially when its replacement leaves us feeling more vulnerable and less special. And yet, as science has progressed, we find ourselves–like it or not–bereft of many of our most cherished beliefs, confronting an array of paradigms lost.

“Barash models his argument around a set of “old” and “new” paradigms that define humanity’s place in the universe. This new set of paradigms [includes] provocative revelations [such as] whether human beings are well designed… Rather than seeing ourselves through a glass darkly, science enables us to perceive our strengths and weaknesses brightly and accurately at last, so that paradigms lost becomes wisdom gained. The result is a bracing, remarkably hopeful view of who we really are.”

Barash’s old and new paradigms about the body are as follows:

“Old paradigm:  The human body is a wonderfully well constructed thing, testimony to the wisdom of an intelligent designer.

“New paradigm:  Although there is much in our anatomy and physiology to admire, we are in fact jerry-rigged and imperfect, testimony to the limitations of a process that is nothing but natural and that in no way reflects supernatural wisdom or benevolence.”

Okay, so maybe the body has issues, but the old paradigm belief that human-level consciousness justifies lording it over the rest of creation is as old as the first chapter of the Bible:

And God blessed them. And God said to them,
“Be fruitful and multiply and fill the earth and subdue it
and have dominion over the fish of the sea
 and over the birds of the heavens
 and over every living thing that moves on the earth.”

Genesis 1:28  (ESV)

The Biblical mandate to “subdue” the earth explains a lot about how we approach the rest of creation — something people seem to be questioning more and more these days. Psychiatrist, essayist, and Oxford Fellow Neel Burton includes our superiority complex in his list of self-deceptions:

“Most people see themselves in a much more positive light than others do them, and possess an unduly rose-tinted perspective on their attributes, circumstances, and possibilities. Such positive illusions, as they are called, are of three broad kinds, an inflated sense of one’s qualities and abilities, an illusion of control over things that are mostly or entirely out of one’s control, and an unrealistic optimism about the future.” [2]

Humans as the apex of creation? More on that next time.

[1] What is it That Makes Humans Unique? Singularity Hub, Dec. 28, 2017.

[2] Hide and Seek:  The Psychology of Self-Deception (Acheron Press, 2012).

“Before You Were Born I Knew You”

The Summoner in Chaucer’s The Canterbury Tales,
Ellesmere MSS, circa 1400

Last time we looked at the common dualistic paradigm of consciousness, which is based on (a) the belief that humans are made in two parts — an ethereal self housed in a physical body — and (b) the corollary belief that religion and the humanities understand the self best, while science is the proper lens for the body.

Current neuroscience theorizes instead that consciousness arises from brain, body, and environment — all part of the physical, natural world, and therefore best understood by scientific inquiry.

We looked at the origins of the dualistic paradigm last time. This week, we’ll look at an example of how it works in the world of jobs and careers —  particularly the notion of being “called” to a “vocation.”

According to the Online Etymology Dictionary, the notion of “calling” entered the English language around Chaucer’s time, originating from Old Norse kalla — “to cry loudly, summon in a loud voice; name, call by name.” Being legally summoned wasn’t a happy thing in Chaucer’s day (it still isn’t), and summoners were generally wicked, corrupt, and otherwise worthy of Chaucer’s pillory in The Friar’s Tale.

“Calling” got an image upgrade a century and a half later, in the 1550s, when the term acquired the connotation of “vocation, profession, trade, occupation.” Meanwhile, “vocation” took on the meaning of “spiritual calling,” from Old French vocacion, meaning “call, consecration; calling, profession,” and Latin vocationem — “a calling, a being called” to “one’s occupation or profession.”

“Calling” and “vocation” together support the common dream of being able to do the work we were born to do, and the related belief that this would make our work significant and us happy. The idea of vocational calling is distinctly Biblical:[1]

“Before I formed you in the womb I knew you,
and before you were born I consecrated you;
I appointed you a prophet to the nations.”

Jeremiah 1:5 (ESV)

Something in us — an evolutionary survival instinct, I would guess — wants to be known, especially by those in power. Vocational calling invokes power at the highest level:  never mind your parents’ hormones, you were a gleam in God’s eye; and never mind the genes you inherited, God coded vocational identity and purpose into your soul.

2600 years after Jeremiah, we’re still looking for the same kind of affirmation.

“Amy Wrzesniewski, a professor at Yale School of Management and a leading scholar on meaning at work, told me that she senses a great deal of anxiety among her students and clients. ‘They think their calling is under a rock,’ she said, ‘and that if they turn over enough rocks, they will find it.’ If they do not find their one true calling, she went on to say, they feel like something is missing from their lives and that they will never find a job that will satisfy them. And yet only about one third to one half of people whom researchers have surveyed see their work as a calling. Does that mean the rest will not find meaning and purpose in their careers?”

The Power of Meaning:  Crafting a Life That Matters, Emily Esfahani Smith (2017)

If only one-third to one-half of us feel like we’re living our vocational calling, then why do we hang onto the dream? Maybe the problem is what Romantic Era poet William Wordsworth wrote about in his Ode:  Intimations of Immortality:

“Our birth is but a sleep and a forgetting:
The Soul that rises with us, our life’s Star,
Hath had elsewhere its setting,
And cometh from afar:
Not in entire forgetfulness,
And not in utter nakedness,
But trailing clouds of glory do we come
From God, who is our home:
Heaven lies about us in our infancy!

“Shades of the prison-house begin to close
Upon the growing Boy,
But he beholds the light, and whence it flows,
He sees it in his joy;
The Youth, who daily farther from the east
Must travel, still is Nature’s Priest,
And by the vision splendid
Is on his way attended;
At length the Man perceives it die away,
And fade into the light of common day.”

I.e., maybe something tragic happens when an immortal self comes to live in a mortal body. This, too, is a common corollary belief to body/soul dualism — religion’s distrust of “the flesh” is standard issue.

Cognitive neuroscientist Christian Jarrett offers career advice to the afflicted:  you might be able to turn the job you already have into a calling if you invest enough in it, or failing that, you might find your source of energy and determination somewhere other than in your work. This Forbes article reaches a similar conclusion:

“Years ago, I read a very thought-provoking article by Michael Lewis … about the difference between a calling and a job. He had some powerful insights. What struck me most were two intriguing concepts:

‘There’s a direct relationship between risk and reward. A fantastically rewarding career usually requires you to take fantastic risks.’

‘A calling is an activity that you find so compelling that you wind up organizing your entire self around it — often to the detriment of your life outside of it.’”

I.e., maybe career satisfaction isn’t heaven-sent; maybe instead it’s developed in the unglamorous daily grind of life in the flesh.

More on historical roots and related beliefs coming up.

[1] For more Biblical examples, see Isaiah 44:24:  “Thus says the Lord, your Redeemer, who formed you from the womb”; Galatians 1:15:  “But when he who had set me apart before I was born”; Psalm 139:13, 16:  “For you formed my inward parts; you knitted me together in my mother’s womb; your eyes saw my unformed substance; in your book were written, every one of them, the days that were formed for me, when as yet there was none of them.”

Mirror, Mirror, on the Wall…


“If you were to ask the average person in the street about their self, they would most likely describe the individual who inhabits their body. They believe they are more than just their bodies. Their bodies are something their selves control. When we look in the mirror, we regard the body as a vessel we occupy.

“The common notion [is] that our self is an essential entity at the core of our existence that holds steady throughout our life. The ego experiences life as a conscious, thinking person with a unique historical background that defines who he or she is. This is the ‘I’ that looks back in the bathroom mirror and reflects who is the ‘me.’”

The Self Illusion:  How the Social Brain Creates Identity, Bruce Hood[1] (2012)

The idea that we are a self riding through life in a body is deeply ingrained in western thinking. Descartes gets most of the credit for it, but its religious and philosophical roots are much more ancient. (The same is true of the eastern, Buddhist idea that there’s no such thing as a self. We’ll talk origins another time.)

Descartes’ dualism has the curious effect of excusing us from thinking too closely about what we mean by it. It does this by locating the body in the physical, biological, natural world while placing the self in a transcendent realm that parallels the natural world but remains apart from it. The body, along with the rest of the natural world, is the proper subject of scientific inquiry, but the self and its ethereal realm remain inscrutable, the province of faith and metaphor, religion and the humanities. David P. Barash[2] captures the implications of this dichotomy in Through a Glass Brightly:  Using Science to See Our Species as We Really Are (2018):

“Science differs from theology and the humanities in that it is made to be improved on and corrected over time… By contrast, few theologians, or fundamentalist religious believers of any stripe, are likely to look kindly on ‘revelations’ that suggest corrections to their sacred texts. In 1768, Baron d’Holbach, a major figure in the French Enlightenment, had great fun with this. In his satire Portable Theology (written under the pen name Abbe Bernier, to hide from the censors), d’Holbach defined Religious Doctrine as ‘what every good Christian must believe or else be burned, be it in this world or the next. The dogmas of the Christian religion are immutable decrees of God, who cannot change His mind except when the Church does.’

“By contrast, science not only is open to improvement and modifications but also is to a large extent defined by this openness. Whereas religious practitioners who deviate from their traditions are liable to be derogated — and sometimes killed — for this apostasy …, science thrives on correction and adjustment, aiming not to enshrine received wisdom and tradition but to move its insights closer to correspondence with reality as found in the natural world.”

Attempts to bridge the realms of body and soul end up in pseudo-science, eventually discredited and often silly. Consider, for example, the ether (or sometimes “aether”) — a term that since Plato and Aristotle has been applied to both the rarefied air only the gods can breathe and the stuff light moves through in interstellar space.[3]

You don’t need to be a poet or a prophet to think the self is inviolate. It’s just so obvious to most of us that there’s a self inside who watches and knows all about us — who in fact is us. We experience it as that never-silent internal voice — observing and commenting, often critiquing, sometimes shaming — that always seems to be accurate. We’ve been hearing it for as long as we can remember:  it’s embedded in our brain’s memory banks, all the way back to when we first started remembering things and using language to describe and record them.

We have always been aware that we are aware:
we don’t just observe, we observe ourselves observing.

Hence the belief that we are body and soul seems not worth challenging. Which is why, in keeping with this blog’s purpose, we’re going to do precisely that.

Cartesian dualism is foundational to self-awareness and to our cultural beliefs and institutions. It guides everything from religious faith to criminal culpability, health and wellbeing to mental illness. And so much more. As a result, taking a closer look will not only challenge our perceptions of what is real, it will shake reality itself. This inquiry won’t be easy. Again from The Self Illusion:

“Understanding that the self could be an illusion is really difficult… Our self seems so convincing, so real to us. But then again, many aspects of our experience are not what they seem.

“Psychologist Susan Blackmore makes the point that the word ‘illusion’ does not mean that it does not exist — rather an illusion is not what it seems. We all certainly experience some form of self, but what we experience is a powerful deception generated by our brains for our own benefit.”

That’s where we’re going. Hang tight.

[1] Bruce Hood is an experimental psychologist at the University of Bristol. He specializes in developmental cognitive neuroscience.

[2] David P. Barash is an evolutionary biologist and professor of psychology emeritus at the University of Washington.

[3] For a useful primer, see The Eternal Quest for Aether, the Cosmic Stuff That Never Was, Popular Mechanics (Oct 19, 2018).