Knowledge, Conviction, and Belief [3]


We’ve been talking about dualistic thinking — the kind that leads us to think we live simultaneously in two realities.

Reality A is “life in the flesh” — bound by space and time and all the imperfections of what it means to be human. It is life carried on in our physical bodies, where our impressive but ultimately limited brains are in charge.

Reality B is “life in the spirit” — the eternal, perfect, transcendent, idealized, supernatural, original source that informs, explains, and guides its poorer counterpart.

This dualistic thinking says there’s more to life than meets the eye, that humans are an “eternal soul having a worldly existence.” The dualism sets up a cascade of derivative beliefs, for example:

There’s a difference between the Reality A identity and experience we actually have and the Reality B identity and experience we would have if we could rise above Reality A and live up to the idealized version of Reality B.

Every now and then, somebody gets lucky or gets saved or called, and gets to live out their Reality B destiny, which gives them and their lives a heightened sense of purpose and meaning.

But those are the chosen few, and they’re rare. For most of us, our ordinary selves and mundane lives are only a shadow of our “higher selves” and “greater potential.”

The chosen few can — and often do — provide guidance as to how we can do better, and we do well to find some compatible relation with one or more of them. But sometimes, in the right setting and circumstance, we might discover that we have receptors of our own that can receive signals from Reality B. We call this “enlightenment” or “conversion” or “salvation” or something like that, and it’s wonderful, blissful, and euphoric.

But most of the time, for the vast majority of us, Reality A is guided by a mostly one-way communication with Reality B — a sort of moment-by-moment data upload from A to B, where everything about us and our lives — every conscious and subconscious intent, motive, thought, word, and deed — gets stored in a failsafe beyond-time data bank. When our Reality A lives end, those records determine what happens next — they inform our next trip through Reality A, or set the stage for a Reality B existence we’re really going to like, or one we’re really going to suffer through.

Everybody pretty much agrees it’s useful to have good communication with or awareness of Reality B, because that helps us live better, truer, happier, more productive lives in Reality A, and because it creates a better data record when our Reality A existence ends and we pass over to Reality B.

And on it goes. No, we don’t express any of it that way:  our cultural belief systems and institutions — religious doctrines, moral norms, legal codes, academic fields of study, etc. — offer better-dressed versions. But it’s remarkable how some version of those beliefs finds its way into common notions about how life works.

At the heart of it all is our conviction — not knowledge — that this thing we consciously know as “me” is an independent self that remains intact and apart from the biological messiness of human life, able to choose its own beliefs, make its own decisions, and execute its own actions. In other words, we believe in consciousness, free will, and personal responsibility for what we are and do — and what we aren’t and don’t do — during what is only a sojourn — a short-term stay — on Earth.

Those beliefs explain why, for example, it bothers us so much when someone we thought we knew departs from their beginnings and displays an inner and outer expression of themselves that has changed from the person we thought we knew. “Look who’s in the big town,” we say. Or we pity them and knock wood and declare thank goodness we’ve been lucky. Or we put them on the prayer chain or call them before the Inquisition… anything but entertain the idea that maybe Reality B isn’t there — along with all the belief it takes to create it — and that instead all we have is Reality A — that we’re nothing but flesh and bone.

It’s almost impossible to think that way. To go there, we have to lay aside conviction and embrace knowledge.

Almost impossible.

Almost.

We’ll give it a try in the coming weeks.

Knowledge, Conviction, and Belief

“For I am convinced that neither death nor life, neither angels nor demons, neither the present nor the future, nor any powers, neither height nor depth, nor anything else in all creation, will be able to separate us from the love of God that is in Christ Jesus our Lord.”

Paul’s letter to the Romans 8:38-39 (NIV)

How did Paul know that? Why was he so convinced?

According to psychology and neuroscience, he didn’t know it, he was convinced of it. The difference reflects Cartesian dualism:  the belief that we can know things about the natural world through scientific inquiry, but in the supernatural world, truth is a matter of conviction.

Academics draw distinctions between these and other terms,[1] but in actual experience, the essence seems to be emotional content. Scientific knowledge is thought to be emotionally detached — it wears a lab coat, pores over data, expresses conclusions intellectually. It believes its conclusions, but questioning them is hardwired into scientific inquiry; science therefore must hold its truth in an open hand — all of which establish a reliable sense of what is “real.” Conviction, on the other hand, comes with heart, with a compelling sense of certainty. The emotional strength of conviction makes questioning its truth — especially the truth of religious convictions — something to be discouraged or punished.

Further, while knowledge may come with a Eureka! moment — that satisfying flash of suddenly seeing clearly — conviction often comes with a sense of being overtaken by an authority greater than ourselves — of being apprehended and humbled, left frightened and grateful for a second chance.

Consider the etymologies of conviction and convince:

conviction (n.)

mid-15c., “the proving or finding of guilt of an offense charged,” from Late Latin convictionem (nominative convictio) “proof, refutation,” noun of action from past-participle stem of convincere “to overcome decisively,” from com-, here probably an intensive prefix (see com-), + vincere “to conquer” (from nasalized form of PIE root *weik- (3) “to fight, conquer”).

Meaning “mental state of being convinced or fully persuaded” is from 1690s; that of “firm belief, a belief held as proven” is from 1841. In a religious sense, “state of being convinced one has acted in opposition to conscience, admonition of the conscience,” from 1670s.

convince (v.)

1520s, “to overcome in argument,” from Latin convincere “to overcome decisively,” from assimilated form of com-, here probably an intensive prefix (see com-), + vincere “to conquer” (from nasalized form of PIE root *weik- (3) “to fight, conquer”). Meaning “to firmly persuade or satisfy by argument or evidence” is from c. 1600. Related: Convinced; convincing; convincingly.

To convince a person is to satisfy his understanding as to the truth of a certain statement; to persuade him is, by derivation, to affect his will by motives; but it has long been used also for convince, as in Luke xx. 6, “they be persuaded that John was a prophet.” There is a marked tendency now to confine persuade to its own distinctive meaning. [Century Dictionary, 1897]

Both knowledge and conviction, and the needs they serve, are evolutionary survival skills:  we need what they give us to be safe, individually and collectively. Knowledge satisfies our need to be rational, to think clearly and logically, to distinguish this from that, to put things into dependable categories. Conviction satisfies the need to be moved, and also to be justified — to feel as though you are in good standing in the cosmology of how life is organized.

Culturally, conviction is often the source of embarrassment, guilt, and shame, all of which have a key social function — they are part of the glue that holds society together. Becoming aware that we have transgressed societal laws or behavioral norms (the “conviction of sin”) often brings not just chastisement but also remorse and relief — to ourselves and to others in our community:  we’ve been arrested, apprehended, overtaken by a corrective authority, and saved from doing further harm to ourselves and others.

Knowledge and conviction also have something else in common:  both originate in the brain’s complex tangle of neural networks:

“It is unlikely that beliefs as wide-ranging as justice, religion, prejudice or politics are simply waiting to be found in the brain as discrete networks of neurons, each encoding for something different. ‘There’s probably a whole combination of things that go together,’ says [Peter Halligan, a psychologist at Cardiff University].

“And depending on the level of significance of a belief, there could be several networks at play. Someone with strong religious beliefs, for example, might find that they are more emotionally drawn into certain discussions because they have a large number of neural networks feeding into that belief.”

Where Belief Is Born, The Guardian (June 30, 2005).

And thus protected by the knowledge and convictions wired into our neural pathways, we make our way through this precarious thing called “life.”

More next time.

[1] Consider also the differences between terms like conviction and belief, and fact, opinion, belief, and prejudice.

Who’s In Charge Here?

Edelweiss with blossoms, Wallis, Switzerland.
© Michael Peuckert

Edelweiss, edelweiss
Every morning you greet me

Small and white
Clean and bright
You look happy to meet me

(A little exercise in anthropomorphism
from The Sound of Music)

This hierarchy of consciousness we looked at last time — ours is higher than the rest of creation, angels’ is higher than ours, God’s is highest — is an exercise in what philosophy calls teleology:  “the explanation of phenomena in terms of the purpose they serve.” Teleology is about cause and effect — it looks for design and purpose, and its holy grail is what psychologists call agency:  who or what is causing things we can’t control or explain.

“This agency-detection system is so deeply ingrained that it causes us to attribute agency in all kinds of natural phenomena, such as anger in a thunderclap or voices in the wind, resulting in our universal tendency for anthropomorphism.

“Stewart Guthrie, author of Faces in the Clouds:  A New Theory of Religion, argues that ‘anthropomorphism may best be explained as the result of an attempt to see not what we want to see or what is easy to see, but what is important to see:  what may affect us, for better or worse.’ Because of our powerful anthropomorphic tendency, ‘we search everywhere, involuntarily and unknowingly, for human form and results of human action, and often seem to find them where they do not exist.’”

The Patterning Instinct:  A Cultural History of Humanity’s Search for Meaning, Jeremy Lent (2017)

Teleological thinking is a characteristic feature of religious, magical, and supernatural thinking:

“Academic research shows that religious and supernatural thinking leads people to believe that almost no big life events are accidental or random. As the authors of some recent cognitive-science studies at Yale put it, ‘Individuals’ explicit religious and paranormal beliefs are the best predictors of their perception of purpose in life events’ — their tendency ‘to view the world in terms of agency, purpose, and design.’”

How America Lost Its Mind, The Atlantic (Sept. 2017)

Psychology prof Clay Routledge describes how science debunks teleology, but also acknowledges why it’s a comfortable way of thinking:

“From a scientific point of view, we were not created or designed but instead are the product of evolution. The natural events that shaped our world and our own existence were not purposeful. In other words, life is objectively meaningless. From this perspective, the only way to find meaning is to create your own, because the universe has no meaning or purpose. The universe just is. Though there are certainly a small percentage of people who appear to accept this notion, much of the world’s population rejects it.

“For most humans, the idea that life is inherently meaningless simply will not do.

“Instead, people latch onto what I call teleological thinking. Teleological thinking is when people perceive phenomena in terms of purpose. When applied to natural phenomena, this type of thinking is generally considered to be flawed because it imposes design where there is no evidence for it.  To impose purpose and design where there is none is what researchers refer to as a teleological error.”

Supernatural: Death, Meaning, and the Power of the Invisible World, Clay Routledge (2018)

It’s one thing to recognize “teleological error,” it’s another to resist it — even for those who pride themselves on their rationality:

“Even atheists who reject the supernatural and scientists who are trained not to rely on teleological explanations of the world do, in fact, engage in teleological thinking.

“Many people who reject the supernatural do so through thoughtful reasoning. … However, when these people are making teleological judgments, they are not fully deploying their rational thinking abilities.

“Teleological meaning comes more from an intuitive feeling than it does from a rational decision-making process.”

Supernatural: Death, Meaning, and the Power of the Invisible World

Teleological thinking may be understandable, but scientist and medical doctor Paul Singh comes down hard on the side of science as the only way to truly “know” something:

“All scientists know that the methods we use to prove or disprove theories are the only dependable methods of understanding our universe. All other methodologies of learning, while appropriate to employ in situations when science cannot guide us, are inherently flawed. Reasoning alone — even the reasoning of great intellects — is not enough. It must be combined with the scientific method if it is to yield genuine knowledge about the universe.”

The Great Illusion:  The Myth of Free Will, Consciousness, and the Self, Paul Singh (2016)

After admitting that “evidence shows that the human brain is universally delusional in many ways,” Singh makes his case that “the use of logic and scientific skepticism is a skill that can be used to overcome the limitations of our own brains.”

Next time, we’ll look more into the differences in how science and religion “know” things to be “true.”

A Little Lower Than the Angels

“When I consider Your heavens, the work of Your fingers,
The moon and the stars, which You have ordained,
What is man that You are mindful of him,
And the son of man that You visit him?
For You have made him a little lower than the angels,
And You have crowned him with glory and honor.
You have made him to have dominion over the works of Your hands;
You have put all things under his feet.”

Psalm 8:3-6 (NKJV)

Anthropocentrism is the belief that humans are the apex of creation. The belief is so common that the mere suggestion we might not be throws us into cognitive dissonance — “the state of having inconsistent thoughts, beliefs, or attitudes, especially as relating to behavioural decisions and attitude change.”

Cognitive dissonance runs especially hot when science threatens religious paradigms like the anthropocentric one in the Biblical passage above.[1] Biologist David Barash wrote his book to bring it on — this is from the Amazon promo:

“Noted scientist David P. Barash explores the process by which science has, throughout time, cut humanity ‘down to size,’ and how humanity has responded. A good paradigm is a tough thing to lose, especially when its replacement leaves us feeling more vulnerable and less special. And yet, as science has progressed, we find ourselves — like it or not — bereft of many of our most cherished beliefs, confronting an array of paradigms lost… Barash models his argument around a set of ‘old’ and ‘new’ paradigms that define humanity’s place in the universe.”

Through a Glass Brightly:  Using Science to See Our Species as We Really Are

Here’s his old/new paradigm summary re: anthropocentrism:

Old:  Human beings are fundamentally important to the cosmos.
New:  We aren’t.

Old:  We are literally central to the universe, not only astronomically, but in other ways, too.
New:  We occupy a very small and peripheral place in a not terribly consequential galaxy, tucked away in just one insignificant corner of an unimaginably large universe.

Cognitive dissonance is why non-anthropocentric paradigms come across as just plain weird — like Robert Lanza’s biocentrism:

“Every now and then, a simple yet radical idea shakes the very foundations of knowledge. The startling discovery that the world was not flat challenged and ultimately changed the way people perceived themselves and their relationships with the world.

“The whole of Western natural philosophy is undergoing a sea change again, forced upon us by the experimental findings of quantum theory. At the same time, these findings have increased our doubt and uncertainty about traditional physical explanations of the universe’s genesis and structure.

“Biocentrism completes this shift in worldview, turning the planet upside down again with the revolutionary view that life creates the universe instead of the other way around. In this new paradigm, life is not just an accidental byproduct of the laws of physics.

“Biocentrism shatters the reader’s ideas of life, time and space, and even death. At the same time, it releases us from the dull worldview that life is merely the activity of an admixture of carbon and a few other elements; it suggests the exhilarating possibility that life is fundamentally immortal.”

Anthropocentrism works closely with another human-centered belief practice:  “anthropomorphism,” which is “the attribution of human traits, emotions, or intentions to non-human entities” — for example those angels we’re just a little lower than, and God, who put the hierarchy of God, angels, us, and the rest of creation in place. The human trait we attribute to God and the angels is the same one we believe sets us apart from the rest of creation:  consciousness.

“When our anthropomorphism is applied to religious thought, it’s notably the mind, rather than the body, that’s universally applied to spirits and gods. In the diverse cultures of the world, gods come in all shapes and sizes, but one thing they always share is a mind with the ability to think symbolically just like a human. This makes sense in light of the critical importance of theory of mind in the development of our social intelligence:  if other people have minds like ours, wouldn’t that be true of other agents we perceive to act intentionally in the world?”

The Patterning Instinct:  A Cultural History of Humanity’s Search for Meaning, Jeremy Lent (2017)

Anthropocentrism puts us in charge as far as our consciousness can reach. Anthropomorphism puts beings with higher consciousness in charge of the rest. Both practices are truly anthropo- (human) centered; the beliefs they generate start and end with our own human consciousness. Which means our attempts to think beyond our range are inescapably idolatrous:  we create God and the angels in our image, and they return the favor.

There’s a philosophical term that describes what’s behind all this, called “teleology” — the search for explanation and design, purpose and meaning. We’ll look at that next time.

[1] The case for anthropocentrism starts in the first chapter of the Bible:  “Then God said, ‘Let us make mankind in our image, in our likeness, so that they may rule over the fish in the sea and the birds in the sky, over the livestock and all the wild animals, and over all the creatures that move along the ground.’ So God created mankind in his own image, in the image of God he created them; male and female he created them. God blessed them and said to them, ‘Be fruitful and increase in number; fill the earth and subdue it. Rule over the fish in the sea and the birds in the sky and over every living creature that moves on the ground.’ Then God said, ‘I give you every seed-bearing plant on the face of the whole earth and every tree that has fruit with seed in it. They will be yours for food. And to all the beasts of the earth and all the birds in the sky and all the creatures that move along the ground — everything that has the breath of life in it — I give every green plant for food.’” Genesis 1:26-30. The post-deluge version removed the vegetarian requirement:  “Then God blessed Noah and his sons, saying to them, ‘Be fruitful and increase in number and fill the earth. The fear and dread of you will fall on all the beasts of the earth, and on all the birds in the sky, on every creature that moves along the ground, and on all the fish in the sea; they are given into your hands. Everything that lives and moves about will be food for you. Just as I gave you the green plants, I now give you everything.’” Genesis 9:1-3.

“Fearfully and Wonderfully Made”


We are starting this series on Consciousness and the Self by looking at some of the religious and secular foundations of the belief that humans are a dualist entity consisting of body and soul, and the associated belief that the two elements are best understood by different forms of inquiry — religion and the humanities for the soul, and science for the body. As we’ll see, current neuro-biological thinking defies these beliefs and threatens their ancient intellectual, cultural, and historical dominance.

This article[1] is typical in its conclusion that one of the things that makes human beings unique is our “higher consciousness.”

“[Homo sapiens] sits on top of the food chain, has extended its habitats to the entire planet, and in recent centuries, experienced an explosion of technological, societal, and artistic advancements.

“The very fact that we as human beings can write and read articles like this one and contemplate the unique nature of our mental abilities is awe-inspiring.

“Neuroscientist V.S. Ramachandran said it best: ‘Here is this three-pound mass of jelly you can hold in the palm of your hand…it can contemplate the meaning of infinity, and it can contemplate itself contemplating the meaning of infinity.’

“Such self-reflective consciousness or ‘meta-wondering’ boosts our ability for self-transformation, both as individuals and as a species. It contributes to our abilities for self-monitoring, self-recognition and self-identification.”

The author of the following Biblical passage agrees, and affirms that his “soul knows it very well” — i.e., not only does he know he’s special, but he knows that he knows it:

For you formed my inward parts;
    you knitted me together in my mother’s womb.
I praise you, for I am fearfully and wonderfully made.
Wonderful are your works;
    my soul knows it very well.

Psalm 139:13-14 (ESV)

Judging from worldwide religious practice, the “I” that is “fearfully and wonderfully made” is limited to the soul, not the body:  the former feels the love, while the latter is assaulted with unrelenting, vicious, sometimes horrific verbal and physical abuse. “Mortification of the flesh” indeed — as if the body needs help being mortal.

Science apparently concurs with this dismal assessment. The following is from the book blurb for Through a Glass Brightly:  Using Science to See Our Species as We Really Are, by evolutionary biologist and psychologist David P. Barash (2018):

“In Through a Glass Brightly, noted scientist David P. Barash explores the process by which science has, throughout time, cut humanity ‘down to size,’ and how humanity has responded. A good paradigm is a tough thing to lose, especially when its replacement leaves us feeling more vulnerable and less special. And yet, as science has progressed, we find ourselves — like it or not — bereft of many of our most cherished beliefs, confronting an array of paradigms lost.

“Barash models his argument around a set of ‘old’ and ‘new’ paradigms that define humanity’s place in the universe. This new set of paradigms [includes] provocative revelations [such as] whether human beings are well designed… Rather than seeing ourselves through a glass darkly, science enables us to perceive our strengths and weaknesses brightly and accurately at last, so that paradigms lost becomes wisdom gained. The result is a bracing, remarkably hopeful view of who we really are.”

Barash’s old and new paradigms about the body are as follows:

“Old paradigm:  The human body is a wonderfully well constructed thing, testimony to the wisdom of an intelligent designer.

“New paradigm:  Although there is much in our anatomy and physiology to admire, we are in fact jerry-rigged and imperfect, testimony to the limitations of a process that is nothing but natural and that in no way reflects supernatural wisdom or benevolence.”

Okay, so maybe the body has issues, but the old paradigm belief that human-level consciousness justifies lording it over the rest of creation is as old as the first chapter of the Bible:

And God blessed them. And God said to them,
“Be fruitful and multiply and fill the earth and subdue it
and have dominion over the fish of the sea
 and over the birds of the heavens
 and over every living thing that moves on the earth.”

Genesis 1:28  (ESV)

The Biblical mandate to “subdue” the earth explains a lot about how we approach the rest of creation — something people seem to be questioning more and more these days. Psychiatrist, essayist, and Oxford Fellow Neel Burton includes our superiority complex in his list of self-deceptions:

“Most people see themselves in a much more positive light than others do them, and possess an unduly rose-tinted perspective on their attributes, circumstances, and possibilities. Such positive illusions, as they are called, are of three broad kinds:  an inflated sense of one’s qualities and abilities, an illusion of control over things that are mostly or entirely out of one’s control, and an unrealistic optimism about the future.” [2]

Humans as the apex of creation? More on that next time.

[1] What is it That Makes Humans Unique? Singularity Hub, Dec. 28, 2017.

[2] Hide and Seek:  The Psychology of Self-Deception (Acheron Press, 2012).

“Before You Were Born I Knew You”

The Summoner in Chaucer’s The Canterbury Tales,
Ellesmere MSS, circa 1400

Last time we looked at the common dualistic paradigm of consciousness, which is based on (a) the belief that humans are made in two parts — an ethereal self housed in a physical body — and (b) the corollary belief that religion and the humanities understand the self best, while science is the proper lens for the body.

Current neuroscience theorizes instead that consciousness arises from brain, body, and environment — all part of the physical, natural world, and therefore best understood by scientific inquiry.

We looked at the origins of the dualistic paradigm last time. This week, we’ll look at an example of how it works in the world of jobs and careers — particularly the notion of being “called” to a “vocation.”

According to the Online Etymology Dictionary, the notion of “calling” entered the English language around Chaucer’s time, originating from Old Norse kalla — “to cry loudly, summon in a loud voice; name, call by name.” Being legally summoned wasn’t a happy thing in Chaucer’s day (it still isn’t), and summoners were generally wicked, corrupt, and otherwise worthy of Chaucer’s pillory in The Friar’s Tale.

“Calling” got an image upgrade a century and a half later, in the 1550s, when the term acquired the connotation of “vocation, profession, trade, occupation.” Meanwhile, “vocation” took on the meaning of “spiritual calling,” from Old French vocacion, meaning “call, consecration; calling, profession,” and Latin vocationem — “a calling, a being called” to “one’s occupation or profession.”

“Calling” and “vocation” together support the common dream of being able to do the work we were born to do, and the related belief that this would make our work significant and us happy. The idea of vocational calling is distinctly Biblical:[1]

“Before I formed you in the womb I knew you,
and before you were born I consecrated you;
I appointed you a prophet to the nations.”

Jeremiah 1:5 (ESV)

Something in us — an evolutionary survival instinct, I would guess — wants to be known, especially by those in power. Vocational calling invokes power at the highest level:  never mind your parents’ hormones, you were a gleam in God’s eye; and never mind the genes you inherited, God coded vocational identity and purpose into your soul.

Some 2,600 years after Jeremiah, we’re still looking for the same kind of affirmation.

“Amy Wrzesniewski, a professor at Yale School of Management and a leading scholar on meaning at work, told me that she senses a great deal of anxiety among her students and clients. ‘They think their calling is under a rock,’ she said, ‘and that if they turn over enough rocks, they will find it.’ If they do not find their one true calling, she went on to say, they feel like something is missing from their lives and that they will never find a job that will satisfy them. And yet only about one third to one half of people whom researchers have surveyed see their work as a calling. Does that mean the rest will not find meaning and purpose in their careers?”

The Power of Meaning:  Crafting a Life That Matters, Emily Esfahani Smith

If only one-third to one-half of us feel like we’re living our vocational calling, then why do we hang onto the dream? Maybe the problem is what Romantic Era poet William Wordsworth wrote about in his Ode:  Intimations of Immortality:

“Our birth is but a sleep and a forgetting:
The Soul that rises with us, our life’s Star,
Hath had elsewhere its setting,
And cometh from afar:
Not in entire forgetfulness,
And not in utter nakedness,
But trailing clouds of glory do we come
From God, who is our home:
Heaven lies about us in our infancy!

“Shades of the prison-house begin to close
Upon the growing Boy,
But he beholds the light, and whence it flows,
He sees it in his joy;
The Youth, who daily farther from the east
Must travel, still is Nature’s Priest,
And by the vision splendid
Is on his way attended;
At length the Man perceives it die away,
And fade into the light of common day.”

I.e., maybe something tragic happens when an immortal self comes to live in a mortal body. This, too, is a common corollary belief to body/soul dualism — religion’s distrust of “the flesh” is standard issue.

Cognitive neuroscientist Christian Jarrett offers career advice to the afflicted:  you might be able to turn the job you already have into a calling if you invest enough in it, or failing that, you might find your source of energy and determination somewhere else than in your work. This Forbes article reaches a similar conclusion:

“Years ago, I read a very thought-provoking article by Michael Lewis … about the difference between a calling and a job. He had some powerful insights. What struck me most were two intriguing concepts:

‘There’s a direct relationship between risk and reward. A fantastically rewarding career usually requires you to take fantastic risks.’

‘A calling is an activity that you find so compelling that you wind up organizing your entire self around it — often to the detriment of your life outside of it.’”

I.e., maybe career satisfaction isn’t heaven-sent; maybe instead it’s developed in the unglamorous daily grind of life in the flesh.

More on historical roots and related beliefs coming up.

[1] For more Biblical examples, see Isaiah 44:24:  “Thus says the Lord, your Redeemer, who formed you from the womb”; Galatians 1:15:  “But when he who had set me apart before I was born”; Psalm 139:13, 16:  “For you formed my inward parts; you knitted me together in my mother’s womb; your eyes saw my unformed substance; in your book were written, every one of them, the days that were formed for me, when as yet there was none of them.”

Mirror, Mirror, on the Wall…


“If you were to ask the average person in the street about their self, they would most likely describe the individual who inhabits their body. They believe they are more than just their bodies. Their bodies are something their selves control. When we look in the mirror, we regard the body as a vessel we occupy.

“The common notion [is] that our self is an essential entity at the core of our existence that holds steady throughout our life. The ego experiences life as a conscious, thinking person with a unique historical background that defines who he or she is. This is the ‘I’ that looks back in the bathroom mirror and reflects who is the ‘me.’”

The Self Illusion:  How the Social Brain Creates Identity, Bruce Hood[1] (2012)

The idea that we are a self riding through life in a body is deeply ingrained in western thinking. Descartes gets most of the credit for it, but its religious and philosophical roots are much more ancient. (The same is true of the eastern, Buddhist idea that there’s no such thing as a self. We’ll talk origins another time.)

Descartes’ dualism has the curious effect of excusing us from thinking too closely about what we mean by it. It does this by locating the body in the physical, biological, natural world while placing the self in a transcendent realm that parallels the natural world but remains apart from it. The body, along with the rest of the natural world, is the proper subject of scientific inquiry, but the self and its ethereal realm remain inscrutable, the province of faith and metaphor, religion and the humanities. David P. Barash[2] captures the implications of this dichotomy in Through a Glass Brightly:  Using Science to See Our Species as We Really Are (2018):

“Science differs from theology and the humanities in that it is made to be improved on and corrected over time… By contrast, few theologians, or fundamentalist religious believers of any stripe, are likely to look kindly on ‘revelations’ that suggest corrections to their sacred texts. In 1768, Baron d’Holbach, a major figure in the French Enlightenment, had great fun with this. In his satire Portable Theology (written under the pen name Abbe Bernier, to hide from the censors), d’Holbach defined Religious Doctrine as ‘what every good Christian must believe or else be burned, be it in this world or the next. The dogmas of the Christian religion are immutable decrees of God, who cannot change His mind except when the Church does.’

“By contrast, science not only is open to improvement and modifications but also is to a large extent defined by this openness. Whereas religious practitioners who deviate from their traditions are liable to be derogated — and sometimes killed — for this apostasy …, science thrives on correction and adjustment, aiming not to enshrine received wisdom and tradition but to move its insights closer to correspondence with reality as found in the natural world.”

Attempts to bridge the realms of body and soul end up in pseudo-science, eventually discredited and often silly. Consider, for example, the ether (or sometimes “aether”) — a term that since Plato and Aristotle has been applied to both the rarefied air only the gods can breathe and the stuff light moves through in interstellar space.[3]

You don’t need to be a poet or prophet to think the self is inviolate. It’s just so obvious to most of us that there’s a self inside who watches and knows all about us — who in fact is us. We experience it as that never-silent internal voice — observing and commenting, often critiquing, sometimes shaming — that always seems to be accurate. We’ve been hearing it for as long as we can remember:  it’s embedded in our brain’s memory banks, all the way back to when we first started remembering things and using language to describe and record them.

We have always been aware that we are aware:
we don’t just observe, we observe ourselves observing.

Hence the belief that we are body and soul seems not worth challenging. Which is why, in keeping with this blog’s purpose, we’re going to do precisely that.

Cartesian dualism is foundational to self-awareness and to our cultural beliefs and institutions. It guides everything from religious faith to criminal culpability, health and wellbeing to mental illness. And so much more. As a result, taking a closer look will not only challenge our perceptions of what is real, it will shake reality itself. This inquiry won’t be easy. Again from The Self Illusion:

“Understanding that the self could be an illusion is really difficult… Our self seems so convincing, so real to us. But then again, many aspects of our experience are not what they seem.

“Psychologist Susan Blackmore makes the point that the word ‘illusion’ does not mean that it does not exist — rather an illusion is not what it seems. We all certainly experience some form of self, but what we experience is a powerful deception generated by our brains for our own benefit.”

That’s where we’re going. Hang tight.

[1] Bruce Hood is an experimental psychologist at the University of Bristol. He specializes in developmental cognitive neuroscience.

[2] David P. Barash is an evolutionary biologist and professor of psychology emeritus at the University of Washington.

[3] For a useful primer, see The Eternal Quest for Aether, the Cosmic Stuff That Never Was, Popular Mechanics (Oct 19, 2018).

Paying the Moral Debt of War


War confers on soldiers the right to think and do what is immoral and illegal in peacetime society. This creates a moral debt that must be paid or otherwise ethically discharged in order for the nation at war to return to peacetime business as usual. The returning soldiers bear the burden of this debt, and society must help them be freed of it, although there is little heart for enduring the narcotic withdrawal required:

“However much soldiers regret killing once it is finished, however much they spend their lives trying to cope with the experience, the act itself, fueled by fear, excitement, the pull of the crowd, and the god-like exhilaration of destroying, is often thrilling.”

“Fundamental questions about the meaning, or meaninglessness, of our place on the planet are laid bare when we watch those around us sink to the lowest depths. War exposes the capacity for evil that lurks not far below the surface within all of us. And this is why for many war is so hard to discuss once it is over.”

Chris Hedges, War is a Force That Gives Us Meaning

“How do you talk about morally reprehensible things that have left a bruise on your soul?” asks the author of The Conversation We Refuse to Have About War and Our Veterans, Medium (May 24, 2019). The article is insightful and moving, and I’ll quote it at length:

“The guilt and moral tension many veterans feel is not necessarily post-traumatic stress disorder, but moral injury — the emotional shame and psychological damage soldiers incur when we have to do things that violate our sense of right and wrong. Shooting a woman or child. Killing another human. Watching a friend die. Laughing about situations that would normally disgust us.

“Because so few in America have served, those who have can no longer relate to their peers, friends, and family. We fear being viewed as monsters, or lauded as heroes when we feel the things we’ve done were morally ambiguous or wrong.

“As Amy Amidon, a Navy psychologist, stated in an interview regarding moral injury:

‘Civilians are lucky that we still have a sense of naiveté about what the world is like. The average American means well, but what they need to know is that these [military] men and women are seeing incredible evil, and coming home with that weighing on them and not knowing how to fit back into society.’

“Most of the time… people only want to hear the heroics. They don’t want to know what the war is costing our sons and daughters in regard to mental health, and this only makes the gap wider. In order for our soldiers to heal, society needs to own up to its part in sending us to war. The citizen at home may not have pulled the trigger, but they asked the soldier to go in their place. Citing a 2004 study, David Wood explains that the ‘grief over losing a combat buddy was comparable, more than 30 years later, to that of a bereaved spouse whose partner had died in the previous six months.’ The soul wounds we experience are much greater. Society needs to come alongside us rather than pointing us to the VA.

“Historically, many cultures performed purification rites for soldiers returning home from war. These rites purified a broad spectrum of warriors, from the Roman Centurion to the Navajo to the Medieval Knight. Perhaps most fascinating is that soldiers returning home from the Crusades were instructed to observe a period of purification that involved the Christian church and their community. Though the church had sanctioned the Crusades, they viewed taking another life as morally wrong and damaging to their knights’ souls.

“Today, churches typically put veterans on stage to praise our heroics or speak of a great battle we’ve overcome while drawing spiritual parallels for their congregation. What they don’t do is talk about the moral weight we bear on their behalf.

“Dr. Jonathan Shay, the clinical psychologist who coined the term moral injury, argues that in order for the soldier and society to find healing, we must come together and bear the moral responsibility of what soldiers have done in our name.

“As General Douglas MacArthur eloquently put it:

‘The soldier above all other people prays for peace, for he must suffer and bear the deepest wounds and scars of war.’”

More next time.

Photo by Obed Hernández on Unsplash

All War is Holy War


According to one anthropologist,[1] the Yanomami Amazonian tribe lives in a “chronic state of war”:  violence against outsiders and members alike is a normal way of life. Their culture is the exception — most require a shift from peacetime to wartime culture in order for maiming and murdering to be acceptable. The shift begins with a cause to rally around:

“It is hard, maybe impossible, to fight a war if the cause is viewed as bankrupt. The sanctity of the cause is crucial to the war effort.”

War is a Force That Gives Us Meaning, Chris Hedges (2002).[2]

Most cultures are governed by some version of “Thou shalt not kill,” but God and the gods are not so constrained — they can and do kill, and direct their followers to do so. Therefore, to justify the mayhem, the state must become religious, and its cause must be sacred.

“War celebrates only power — and we come to believe in wartime that it is the only real form of power. It preys on our most primal and savage impulses. It allows us to do what peacetime society forbids or restrains us from doing:  It allows us to kill.”

In wartime, the state is anointed with the requisite elements of religious culture:  dogmas and orthodox language; rites of initiation and passage; songs, symbols, metaphors, and icons; customs and laws to honor heroes, demonize foes, discipline skeptics, and punish nonbelievers.

“Because we in modern society have walked away from institutions that stand outside the state to find moral guidance and spiritual direction, we turn to the state in times of war.

“We believe in the nobility and self-sacrifice demanded by war… We discover in the communal struggle, the shared sense of meaning and purpose, a cause. War fills our spiritual void.”

Religious anointing reverses the secular aversion to killing and death:

“War finds its meaning in death.

“The cause is built on the backs of victims, portrayed always as innocent. Indeed, most conflicts are ignited with martyrs, whether real or created. The death of an innocent, one who is perceived as emblematic of the nation or the group under attack, becomes the initial rallying point for war. These dead become the standard bearers of the cause and all causes feed off the steady supply of corpses.

“The cause, sanctified by the dead, cannot be questioned without dishonoring those who gave up their lives. We become enmeshed in the imposed language.

“There is a constant act of remembering and honoring the fallen during war. These ceremonies sanctify the cause.”

The first death is the most essential:

“Elias Canetti [winner of the Nobel Prize in Literature in 1981] wrote, ‘It is the first death which infects everyone with the feeling of being threatened. It is impossible to overrate the part played by the first dead man in the kindling of war. Rulers who want to unleash war know very well that they must procure or invent a first victim. It need not be anyone of particular importance, and can even be someone quite unknown. Nothing matters except his death, and it must be believed that the enemy is responsible for this. Every possible cause of his death is suppressed except one:  his membership of the group to which one belongs oneself.’”

Dissent has no place in the culture of war. The nation’s institutions and citizens are expected to speak the language of war, which frames and limits public discourse.

“The adoption of the cause means adoption of the language of the cause.

“The state spends tremendous time protecting, explaining, and promoting the cause. And some of the most important cheerleaders of the cause are the reporters. This is true in nearly every war. During the Gulf War, as in the weeks after the September attacks, communities gathered for vigils and worship services. The enterprise of the state became imbued with a religious aura. We, even those in the press, spoke in the collective.

“The official jargon obscures the game of war — the hunters and the hunted. We accept terms imposed on us by the state — for example, the “war on terror” — and these terms set the narrow parameters by which we are able to think and discuss.”

Exaltation of the nation, faith in the cause, honoring of the dead, and conformity to the language of war make doubt and dissent damnable:

“When we speak within the confines of this language we give up our linguistic capacity to question and make moral choices.

“The cause is unassailable, wrapped in the mystery reserved for the divine. Those who attempt to expose the fabrications and to unwrap the contradictions of the cause are left isolated and reviled.

“The state and the institutions of state become, for many, the center of worship in wartime. To expose the holes in the myth is to court excommunication.

“When any contradiction is raised or there is a sense that the cause is not just in an absolute sense, the doubts are attacked as apostasy.”

In war, the state shares dominion with the gods. When war ends, the state’s leaders, intoxicated with power, may not release war’s grip on the culture:

“There is a danger of a growing fusion between those in the state who wage war — both for and against modern states — and those who believe they understand and can act as agents of God.

“The moral certitude of the state in wartime is a kind of fundamentalism… And this dangerous messianic brand of religion, one where self-doubt is minimal, has come increasingly to color the modern world of Christianity, Judaism, and Islam.”

For the state to revert to peacetime culture, the moral shift that supported war must be reversed by both civilians and soldiers. This requires a harrowing withdrawal from addiction to wartime culture. We’ll talk about that next time.

[1] Napoleon Alphonseau Chagnon.

[2] All quotes in this article are from Chris Hedges’ book.

It’s a MAD MAD MAD MAD World

It’s a Mad, Mad, Mad, Mad World (1963) theatrical poster

MAD — Mutually Assured Destruction — might be the most ironic policy acronym ever. The theory behind it seems reasonable:  if everybody knows that nuclear war will end in total destruction no matter who starts it, then nobody will start it.
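To see why the theory sounds airtight, it helps to lay it out as a toy two-player game. The sketch below is only an illustration: the payoff numbers are invented for the example, and its one substantive assumption is assured retaliation, meaning any strike by either side ends in mutual destruction.

```python
# A minimal sketch of MAD as a 2x2 game. The payoff numbers are
# illustrative assumptions only: 0 is the status quo, -100 is destruction.
# The one substantive premise is assured retaliation: every outcome
# involving a strike, by either side, ends in mutual destruction.

PAYOFFS = {
    ("restrain", "restrain"): (0, 0),        # uneasy peace
    ("restrain", "strike"):   (-100, -100),  # they strike, we retaliate
    ("strike",   "restrain"): (-100, -100),  # we strike, they retaliate
    ("strike",   "strike"):   (-100, -100),  # mutual first strike
}
OPTIONS = ("restrain", "strike")

def best_response(their_choice: str) -> str:
    """Our best option given the other side's choice (ties go to 'restrain',
    since max() returns the first of equally good options)."""
    return max(OPTIONS, key=lambda mine: PAYOFFS[(mine, their_choice)][0])

for theirs in OPTIONS:
    print(f"If they {theirs}, our best response is to {best_response(theirs)}")
# If they restrain, our best response is to restrain
# If they strike, our best response is to restrain
```

On that assumption, striking first never improves either side’s payoff, so in theory nobody strikes. As the rest of this post argues, the weak point is not the arithmetic but the assumptions behind it.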

The theory holds if both sides have sufficient firepower and neither has a foolproof defense or survival strategy. President Reagan tried to one-up the latter with his “Star Wars” Strategic Defense Initiative, but it didn’t last. President Putin has made similar claims recently, but nobody seems to be taking him seriously. Thus MAD lives on. But if it’s so airtight, then why aren’t we relieved? Why do we still feel the “assured destruction” shadow?

Well, for one thing, MAD can’t deter everybody. It only takes one nutcase with access to the button, and there’s always been one of those somewhere, whether in charge of a nation that has the bomb or of a religion, revolution, or other powerful institution that might get its hands on one.

“What we can say is that, as of this morning, those with the power to exterminate life have not done so. But this is not altogether comforting, and history is no more reassuring.”

The Deterrence Myth, Aeon Magazine (Jan. 9, 2018). (Except where otherwise noted, the following quotes are also from this source.)

For another thing, “it is not legitimate to argue that nuclear weapons have deterred any sort of war, or that they will do so in the future” — even when there is an imbalance of power:

“Even when possessed by just one side, nuclear weapons have not deterred other forms of war. The Chinese, Cuban, Iranian and Nicaraguan revolutions all took place even though a nuclear-armed US backed the overthrown governments. Similarly, the US lost the Vietnam War, just as the Soviet Union lost in Afghanistan, despite both countries not only possessing nuclear weapons, but also more and better conventional arms than their adversaries. Nor did nuclear weapons aid Russia in its unsuccessful war against Chechen rebels in 1994-96, or in 1999-2000, when Russia’s conventional weapons devastated the suffering Chechen Republic. Nuclear weapons did not help the US achieve its goals in Iraq or Afghanistan, which have become expensive catastrophic failures for the country with the world’s most advanced nuclear weapons. Moreover, despite its nuclear arsenal, the US remains fearful of domestic terrorist attacks, which are more likely to be made with nuclear weapons than be deterred by them.”

Plus, however rational MAD may be in theory, it ignores the impetuous aspects of human nature:

“Deterrence theory assumes optimal rationality on the part of decision-makers. It presumes that those with their fingers on the nuclear triggers are rational actors who will also remain calm and cognitively unimpaired under extremely stressful conditions. It also presumes that leaders will always retain control over their forces and that, moreover, they will always retain control over their emotions as well, making decisions based solely on a cool calculation of strategic costs and benefits.

“Deterrence theory maintains, in short, that each side will scare the pants off the other with the prospect of the most hideous, unimaginable consequences, and will then conduct itself with the utmost deliberate and precise rationality. Virtually everything known about human psychology suggests that this is absurd.

“It requires no arcane wisdom to know that people often act out of misperceptions, anger, despair, insanity, stubbornness, revenge, pride and/or dogmatic conviction. Moreover, in certain situations – as when either side is convinced that war is inevitable, or when the pressures to avoid losing face are especially intense – an irrational act, including a lethal one, can appear appropriate, even unavoidable.”

Further, deterrence requires readiness — another rational-sounding ideal, but where to draw the line between self-defense and aggression is anybody’s guess.

“The military knows its purpose, and that purpose does not end with awareness and deterrence. The commander of Air Force Space Command is clear about the mandate. ‘Our job is to prepare for conflict. We hope this preparation will deter potential adversaries…, but our job is to be ready when and if that day comes.’”

Accessory to War:  The Unspoken Alliance Between Astrophysics and the Military, Neil deGrasse Tyson and Avis Lang

That said, MAD’s fatal flaw might be that it promotes militarism as a shared cultural belief,[1] which feeds the beast known as the “military-industrial complex” — a term usually associated with dissent, which belies its origins. More on that next time.

[1] The author of The Deterrence Myth is David P. Barash, who has written about demilitarization as a preferable strategy. See Strength Through Peace:  How Demilitarization Led to Peace and Happiness in Costa Rica, and What the Rest of the World can Learn From a Tiny, Tropical Nation. See also Through a Glass Brightly: Using Science to See Our Species as We Really Are.