Belief in Belief

ya gotta believe

New York Mets fans at the 1973 World Series
(they lost)

The quest to resolve the consciousness hard problem needs a boost from quantum mechanics to get any further. Either that, or there needs to be a better way to state the issue. As things stand, neuroscience’s inability to locate subjectivity in our brain matter gives the pro-subjectivity camp license to cite quantum mechanics as its go-to scientific justification.

The $12 billion self-help industry and its coaches, speakers, and authors love quantum mechanics: if subjectivity works on a sub-atomic level, the argument goes, then why not apply it on a macro, conscious level? Meanwhile, quantum scientists seem to have resigned themselves to the notion that, if their theories don’t have to be grounded in traditional objective standards like empirical testing and falsifiability, then why not hypothesize about multiverses and call that science?

Thus scientific rationalism continues to be on the wane — in science and as a way of life — especially in the USA, where belief in belief has been an ever-expanding feature of the American Way since we got started. To get the full perspective on America’s belief in belief, you need to read Kurt Andersen’s book, Fantasyland: How America Went Haywire, a 500-Year History (2017), which I quoted at length last time. (Or for the short version, see this Atlantic article.) The book provides a lot of history we never learned, but also reveals that the roots of our belief in belief go back even further than our own founding, and beyond our own shores. Although we weren’t founded as a Christian nation[1] (in the same way, for example, that Pakistan was expressly founded as a Muslim nation), Andersen traces this aspect of our ideological foundations to the Protestant Reformation:

“[Luther] insisted that clergymen have no special access to God or Jesus or truth. Everything a Christian needed to know was in the Bible. So every individual Christian believer could and should read and interpret Scripture for himself or herself. Every believer, Protestants said, was now a priest.

“Apart from devolving religious power to ordinary people — that is, critically expanding individual liberty — Luther’s other big idea was that belief in the Bible’s supernatural stories, especially those concerning Jesus, was the only prerequisite for being a Christian. You couldn’t earn your way into Heaven by performing virtuous deeds. Having a particular set of beliefs was all that mattered.

“However, out of the new Protestant religion, a new proto-American attitude emerged during the 1500s. Millions of ordinary people decided that they, each of them, had the right to decide what was true or untrue, regardless of what fancy experts said. And furthermore, they believed, passionate fantastical belief was the key to everything. The footings for Fantasyland had been cast.”

But even the Protestant Reformation isn’t back far enough. Luther’s insistence that anybody can get all the truth they need from the Bible is the Christian doctrine of sola scriptura, which holds that the Bible is the ultimate source of truth. And the Bible is where we find the original endorsement of the primacy of belief, in the teachings of none other than Jesus himself:

“Truly, I say to you, whoever says to this mountain, ‘Be taken up and thrown into the sea,’ and does not doubt in his heart, but believes that what he says will come to pass, it will be done for him.”

Mark 11:23 (ESV)

Thus, the Christian rationale for belief in belief goes something like this:

  • “We believe the Bible tells the truth;
  • “The Bible says Jesus was God incarnate;
  • “God knows what’s true;
  • “Jesus, as God, spoke truth;
  • “Therefore, what Jesus said about belief is true.”

The rationale begins and ends in belief. Belief is a closed loop — you either buy it by believing, or you don’t. And if you believe, you don’t doubt or question, because if you do, belief won’t work for you, and it will be your own fault — you’ll be guilty of doubting in your heart or some other kind of sabotage. For example,

“If any of you lacks wisdom, let him ask God, who gives generously to all without reproach, and it will be given him. 6 But let him ask in faith, with no doubting, for the one who doubts is like a wave of the sea that is driven and tossed by the wind. 7 For that person must not suppose that he will receive anything from the Lord; 8 he is a double-minded man, unstable in all his ways.”

James 1:5-8 (ESV)

Thus belief disposes of every criticism against it. You’re either in or out, either with us or against us. Or, as a friend of mine used to say, “The Bible says it, I believe it, and that settles it!” And if your doubts persist, there are consequences. When I expressed some of mine back in college, the same friend handed me a Bible and said, “Read Luke 9:62.”

“Jesus said to him, ‘No one who puts his hand to the plow and looks back is fit for the kingdom of God.’”

Luke 9:62 (ESV)

End of discussion.

But not here, not in this blog. Here, our mission is to challenge cherished beliefs and institutions. Here, we’ll look more into what it means to believe in belief, and consider other options. In the meantime, we’ll set aside the hard problem of consciousness while we wait for further developments.

For more on today’s topic, you might take a look at Should We Believe In Belief? (The Guardian, July 17, 2009), and be sure to click the links at the end and read those pieces, too. All the articles are short and instructive.

[1] For a detailed consideration (and ultimate refutation) of the claim that America was founded as a Christian nation, see The Founding Myth, by Andrew L. Seidel (2019).

How Impossible Becomes Possible

[Image: active nerve cell in human neural system]

[Image: network]

Scientific materialism explains a lot about how the brain creates consciousness, but hasn’t yet fully accounted for subjective awareness. As a result, the “hard problem” of consciousness remains unsolved, and we’re alternately urged to either concede that the human brain just isn’t ready to figure itself out, or conclude that reality is ultimately determined subjectively.

Princeton psychology and neuroscience professor Michael S. A. Graziano isn’t ready to do either. He thinks the “hard problem” label is itself the problem, because it cuts off further inquiry:

“Many thinkers are pessimistic about ever finding an explanation of consciousness. The philosopher Chalmers, in 1995, put it in a way that has become particularly popular. He suggested that the challenge of explaining consciousness can be divided into two problems. One, the easy problem, is to explain how the brain computes and stores information. Calling this problem easy is, of course, a euphemism. What is meant is something more like the technically possible problem given a lot of scientific work.

“In contrast, the hard problem is to explain how we become aware of all that stuff going on in the brain. Awareness itself, the essence of awareness, because it is presumed to be nonphysical, because it is by definition private, seems to be scientifically unapproachable. Again, calling it the hard problem is a euphemism, it is the impossible problem.

“The hard-problem view has a pinch of defeatism in it. I suspect that for some people it also has a pinch of religiosity. It is a keep-your-scientific-hands-off-my-mystery perspective. In the hard problem view, rather than try to explain consciousness, we should marvel at its insolubility. We have no choice but to accept it as a mystery.

“One conceptual difficulty with the hard-problem view is that it argues against any explanation of consciousness without knowing what explanations might arise. It is difficult to make a cogent argument against the unknown. Perhaps an explanation exists such that, once we see what it is, once we understand it, we will find that it makes sense and accounts for consciousness.”

Consciousness and the Social Brain, by Michael S. A. Graziano (2013).

I.e., if science is going to explain consciousness, it needs to reframe its inquiry, so that what is now an “impossible,” “scientifically unapproachable” problem becomes a “technically possible problem” that can be solved “given a lot of scientific work.”

Technology and innovation writer Steven Johnson describes how he thinks the impossible becomes possible in Where Good Ideas Come From — available as a TED talk, a book, and an animated whiteboard piece on YouTube. In his TED talk, he contrasted popular subjective notions with what neuroscience has discovered about how the brain actually works:

“[We] have to do away with a lot of the way in which our conventional metaphors and language steer us towards certain concepts of idea-creation. We have this very rich vocabulary to describe moments of inspiration. We have … the flash of insight, the stroke of insight, we have epiphanies, we have ‘eureka!’ moments, we have the lightbulb moments… All of these concepts, as kind of rhetorically florid as they are, share this basic assumption, which is that an idea is a single thing, it’s something that happens often in a wonderful illuminating moment.

“But in fact, what I would argue is … that an idea is a network on the most elemental level. I mean, this is what is happening inside your brain. An idea — a new idea — is a new network of neurons firing in sync with each other inside your brain. It’s a new configuration that has never formed before. And the question is, how do you get your brain into environments where these new networks are going to be more likely to form?”

Johnson expands on the work of biologist and complex-systems researcher Stuart Kauffman, who dubbed this idea the “adjacent possible.” The adjacent possible is where the brain’s neural networks (top picture above) meet data networks (the bottom picture): neither is a static, closed environment; both are dynamic, constantly shifting and re-organizing, with each node representing a new point from which the network can expand. Thus the shift from unknown to known is always a next step away:

“The adjacent possible is a kind of shadow future, hovering on the edges of the present state of things, a map of all the ways in which the present can reinvent itself.”

Vittorio Loreto and his colleagues at Sapienza University of Rome turned the adjacent possible into a mathematical model, which they then submitted to objective, empirical, real-world testing. As he said in his TED talk:

“Experiencing the new means exploring a very peculiar space, the space of what could be, the space of the possible, the space of possibilities.

“We conceived our mathematical formulation for the adjacent possible, 20 years after the original Kauffman proposals.

“We had to work out this theory, and we came up with a certain number of predictions to be tested in real life.”
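Loreto doesn’t spell the mathematics out in the talk, but the published formulation is a Polya urn “with innovation triggering,” and it’s simple enough to sketch. What follows is my own minimal reconstruction under that assumption, not Loreto’s actual code, and the parameter names are illustrative: every time a never-before-drawn “idea” turns up, it seeds brand-new ideas into the urn, so each novelty enlarges the space of what can happen next.

```python
import random

def urn_with_triggering(steps, rho=2, nu=1, seed=0):
    """Toy Polya urn 'with innovation triggering': each time a
    never-before-drawn idea appears, nu brand-new ideas join
    the urn, expanding the adjacent possible."""
    rng = random.Random(seed)
    urn = [0, 1]            # two known "ideas" to start
    next_idea = 2           # ids for ideas not yet in play
    seen = set()
    history = []            # running count of distinct ideas drawn
    for _ in range(steps):
        ball = rng.choice(urn)
        urn.extend([ball] * rho)    # reinforcement: the familiar gets likelier
        if ball not in seen:        # a novelty opens new doors
            seen.add(ball)
            urn.extend(range(next_idea, next_idea + nu))
            next_idea += nu
        history.append(len(seen))
    return history

history = urn_with_triggering(5000)
print(f"{history[-1]} distinct ideas after {len(history)} draws")
```

Run this and the count of distinct ideas grows steadily but ever more slowly: novelty keeps arriving one adjacent step at a time, rather than all at once, which is the behavior Loreto’s group went on to test against real data.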

Their test results suggest that the adjacent possible is good science: the impossible doesn’t step out of the ether; it waits at the edge of expanding neural networks, ready to become possible.[1] As Steven Johnson said above, that’s a far cry from our popular romantic notions of revelations, big ideas, and flashes of brilliance. We’ll look more at those next time.

[1] For a nerdier version, see this Wired piece: The ‘Adjacent Possible’ of Big Data: What Evolution Teaches About Insights Generation.

So Consciousness Has a Hard Problem… Now What?

[Image: god helmet]

We’ve been looking at the “hard problem” of consciousness:

  • Neuroscience can identify the brain circuits that create the elements of consciousness and otherwise parse out how “the meat thinks,” but it can’t quite get its discoveries all the way around the mysteries of subjective experience.
  • That’s a problem because we’re used to thinking along Descartes’ dualistic distinction between scientific knowledge, which is objective, empirical, and invites disproving, and belief-based conviction, which is subjective, can’t be tested and doesn’t want to be.
  • What’s worse, science’s recent work in quantum mechanics, artificial intelligence, and machine learning has blurred those dualistic lines by exposing the primacy of subjectivity even in scientific inquiry.
  • All of which frustrates our evolutionary survival need to know how the world really works.[1]

Some people are ready to declare that subjective belief wins, and science will just have to get over it. That’s what happened with the “God Helmet” (shown in the photo above, taken from this article). Dr. Michael Persinger[2] created the helmet for use in neuro-religious research:

“This is a device that is able to simulate religious experiences by stimulating an individual’s temporoparietal lobes using magnetic fields. ‘If the equipment and the experiment produced the presence that was God, then the extrapersonal, unreachable, and independent characteristics of the god definition might be challenged,’ [says Dr. Persinger].” [3]

The God Helmet creates subjective experiences shared among various religions, such as sensing a numinous presence, a feeling of being filled with the spirit or overwhelmed or possessed, of being outside of self, out of body, or having died and come back to life, feelings of being one with all things or of peace, awe, fear and dread, etc. Since all of these states have been either measured or induced in the laboratory, you’d think that might dampen allegiance to the belief that they are God-given, but not so. Instead, when the God Helmet was tested on a group of meditating nuns, their conclusion was, how wonderful that God equipped the brain in that way, so he could communicate with us. Similarly,

 “Some years ago, I discussed this issue with Father George Coyne, a Jesuit priest and astronomer who was then Director of the Vatican Observatory. I asked him what he thought of the notion that when the 12th‑century Hildegard of Bingen was having her visions of God, perhaps she was having epileptic fits. He had no problem with the fits. Indeed, he thought that when something so powerful was going on in a mind, there would necessarily be neurological correlates. Hildegard might well have been an epileptic, Father Coyne opined; that didn’t mean God wasn’t also talking to her.”

The Mental Block – Consciousness Is The Greatest Mystery In Science, Aeon Magazine (Oct. 9, 2013)

If we’re not willing to concede the primacy of subjectivity, then what? Well, we could give up on the idea that the human race is equipped to figure out everything it would really like to know.

 “It would be poetic – albeit deeply frustrating – were it ultimately to prove that the one thing the human mind is incapable of comprehending is itself. An answer must be out there somewhere. And finding it matters: indeed, one could argue that nothing else could ever matter more – since anything at all that matters, in life, only does so as a consequence of its impact on conscious brains. Yet there’s no reason to assume that our brains will be adequate vessels for the voyage towards that answer. Nor that, were we to stumble on a solution to the Hard Problem, on some distant shore where neuroscience meets philosophy, we would even recognise that we’d found it.”

Why Can’t The World’s Greatest Minds Solve The Mystery Of Consciousness? The Guardian (Jan. 21, 2015)

“Maybe philosophical problems are hard not because they are divine or irreducible or workaday science, but because the mind of Homo sapiens lacks the cognitive equipment to solve them. We are organisms, not angels, and our minds are organs, not pipelines to the truth. Our minds evolved by natural selection to solve problems that were life-and-death matters to our ancestors, not to commune with correctness or to answer any question we are capable of asking. We cannot hold ten thousand words in short-term memory. We cannot see in ultraviolet light. We cannot mentally rotate an object in the fourth dimension. And perhaps we cannot solve conundrums like free will and sentience.”

How the Mind Works, Steven Pinker (1997)

Evolutionary biologist David Barash attributes our inability to understand consciousness to the vastly different paces of biological evolution (what the operative biology of our brains can process) and cultural evolution (what we keep learning and inventing and hypothesizing about). Trouble is, the latter moves way too fast for the former to keep up.

“On the one hand, there is our biological evolution, a relatively slow-moving organic process that can never proceed more rapidly than one generation at a time, and that nearly always requires an enormous number of generations for any appreciable effect to arise.

“On the other hand is cultural evolution, a process that is, by contrast, extraordinary in its speed.

“Whereas biological evolution is Darwinian, moving by the gradual substitution and accumulation of genes, cultural evolution is … powered by a nongenetic ‘inheritance’ of acquired characteristics. During a single generation, people have selectively picked up, discarded, manipulated, and transmitted cultural, social, and technological innovations that have become almost entirely independent of any biological moorings.

“We are, via our cultural evolution, in over our biological heads.”

Through a Glass Brightly:  Using Science to See Our Species as We Really Are, David P. Barash (2018)

Give in to subjectivity, or just give up…. We’ll look at another option next time.

[1] The study of how we know things is Epistemology.

[2] Dr. Persinger was director of the Neuroscience Department at Laurentian University in Ontario, Canada prior to his death in 2018.

[3] “What God Does To Your Brain:  The controversial science of neurotheology aims to find the answer to an age-old question: why do we believe?” The Telegraph (June 20, 2014).

Zombies and the Consciousness Hard Problem

[Images: poster from the 1968 movie Night of the Living Dead; The Walking Dead, via https://comicbook.com/thewalkingdead]

Philosophers and psychologists call subjective human traits like feelings, conscience, and self-awareness “qualia,” and believe that, if zombies can lack them but still look and act like us (on a really bad day), then locating consciousness entirely in human biology (“physicalism”) can’t be right.

“Physicalism allows us to imagine a world without consciousness, a ‘Zombie world’ that looks exactly like our own, peopled with beings who act exactly like us but aren’t conscious. Such Zombies have no feelings, emotions or subjective experience; they live lives without qualia. As [philosopher David Chalmers][1] has noted, there is literally nothing it is like to be a Zombie. And if Zombies can exist in the physicalist account of the world, then, according to Chalmers, that account can’t be a complete description of our world, where feelings do exist: something more is needed, beyond the laws of nature, to account for conscious subjective experience.”

I Feel Therefore I Am, Aeon Magazine (Dec. 1, 2015)

To physicalists, says the article, “those are fighting words, and some scientists are fighting back”:

“In the frontline are the neuroscientists who, with increasing frequency, are proposing theories for how subjective experience might emerge from a matrix of neurons and brain chemistry. A slew of books over the past two decades have proffered solutions to the ‘problem’ of consciousness. Among the best known are Christof Koch’s The Quest for Consciousness: A Neurobiological Approach (2004); Giulio Tononi and Gerald Edelman’s A Universe of Consciousness: How Matter Becomes Imagination (2000); Antonio Damasio’s The Feeling of What Happens: Body and Emotion in the Making of Consciousness (1999); and the philosopher Daniel Dennett’s bluntly titled Consciousness Explained (1991).”

Of particular interest in that battery of academic firepower is Daniel Dennett, who has a unique take on Zombies and the consciousness “hard problem”:

“Not everybody agrees there is a Hard Problem to begin with – making the whole debate kickstarted by Chalmers an exercise in pointlessness. Daniel Dennett, the high-profile atheist and professor at Tufts University outside Boston, argues that consciousness, as we think of it, is an illusion: there just isn’t anything in addition to the spongy stuff of the brain, and that spongy stuff doesn’t actually give rise to something called consciousness.

“Common sense may tell us there’s a subjective world of inner experience – but then common sense told us that the sun orbits the Earth, and that the world was flat. Consciousness, according to Dennett’s theory, is like a conjuring trick: the normal functioning of the brain just makes it look as if there is something non-physical going on.

“To look for a real, substantive thing called consciousness, Dennett argues, is as silly as insisting that characters in novels, such as Sherlock Holmes or Harry Potter, must be made up of a peculiar substance named “fictoplasm”; the idea is absurd and unnecessary, since the characters do not exist to begin with.

“This is the point at which the debate tends to collapse into incredulous laughter and head-shaking: neither camp can quite believe what the other is saying. To Dennett’s opponents, he is simply denying the existence of something everyone knows for certain: their inner experience of sights, smells, emotions and the rest. (Chalmers has speculated, largely in jest, that Dennett himself might be a Zombie.)

“More than one critic of Dennett’s most famous book, Consciousness Explained, has joked that its title ought to be Consciousness Explained Away. Dennett’s reply is characteristically breezy: explaining things away, he insists, is exactly what scientists do… However hard it feels to accept, we should concede that consciousness is just the physical brain, doing what brains do.”

Why Can’t The World’s Greatest Minds Solve The Mystery Of Consciousness? The Guardian (Jan. 21, 2015)

Zombies also appear in another current scientific inquiry: whether artificially intelligent machines can be conscious. “Who’s to say machines don’t already have minds?” asks this article.[2] If they do, then “we need a better way to define and test for consciousness,” but formulating one means you “still face what you might call the Zombie problem.” (Oh great — so a machine could be a Zombie, too, as if there weren’t enough of them already.)

Suppose you create a test to detect human qualia in machines, and weed out the Zombies, but who’s going to believe it if it comes back positive?

“Suppose a test finds that a thermostat is conscious. If you’re inclined to think a thermostat is conscious, you will feel vindicated. If sentient thermostats strike you as silly, you will reject the verdict. In that case, why bother conducting the test at all?”

Consciousness Creep

And if conscious thermostats aren’t enough to make you “collapse into incredulous laughter and head-shaking,” then how about finding consciousness in … insects? Turns out, they, too, have a Zombie problem, according to this article, co-written by a biologist and a philosopher.[3]

What happened to science that it’s tackling these issues, and with a straight face? I promised last time we’d look into that. We’ll do that next.

[1] As we saw last time, David Chalmers defined the “easy” and “hard” problems of consciousness.

[2] Consciousness Creep:  Our machines could become self-aware without our knowing it. Aeon Magazine, February 25, 2016

[3] Bee-Brained;  Are Insects ‘Philosophical Zombies’ With No Inner Life? Close attention to their behaviours and moods suggests otherwise, Aeon Magazine (Sept. 27, 2018)

The Greatest Unsolved Mystery

[Image: Sherlock Holmes]

Academic disciplines take turns being more or less in the public eye — although, as we saw a couple of posts back, metaphysicians think their discipline ought to be the perennial front-runner. After all, it’s about figuring out “the real nature of things”[1] and what could be more important than that?

Figuring out the human mind that’s doing the figuring, that’s what![2] Thus neuroscience’s quest to understand human consciousness finds itself at the front of the line as the greatest unsolved scientific mystery of our time.

“Nearly a quarter of a century ago, Daniel Dennett wrote that: ‘Human consciousness is just about the last surviving mystery.’ A few years later, [David] Chalmers added: ‘[It] may be the largest outstanding obstacle in our quest for a scientific understanding of the universe.’ They were right then and, despite the tremendous scientific advances since, they are still right today.

“I think it is possible that, compared with the hard problem [of consciousness], the rest of science is a sideshow. Until we get a grip on our own minds, our grip on anything else could be suspect. The hard problem is still the toughest kid on the block.”

The Mental Block – Consciousness Is The Greatest Mystery In Science, Aeon Magazine Oct. 9, 2013

“Hard problem” is a term of art in the consciousness quest:

“The philosopher [David] Chalmers … suggested that the challenge of explaining consciousness can be divided into two problems.

“One, the easy problem, is to explain how the brain computes and stores information. Calling this problem easy is, of course, a euphemism. What is meant is something more like the technically possible problem given a lot of scientific work.

“In contrast, the hard problem is to explain how we become aware of all that stuff going on in the brain. Awareness itself, the essence of awareness, because it is presumed to be nonphysical, because it is by definition private, seems to be scientifically unapproachable.”

Consciousness and the Social Brain, by Michael S. A. Graziano (2013).

Solving the “easy” problem requires objective, empirical inquiry into how our brains are organized and wired, what brain areas and neural circuits process which kinds of experience, how they all share relevant information, etc. Armed with MRIs and other technologies, neuroscience has made great progress on all that. What it can’t seem to get its instruments around is the personal and private subjective interpretation of the brain’s objective processing of experience.

“First coined in 1995 by the Australian philosopher David Chalmers, this ‘hard problem’ of consciousness highlights the distinction between registering and actually feeling a phenomenon. Such feelings are what philosophers refer to as qualia: roughly speaking, the properties by which we classify experiences according to ‘what they are like’. In 2008, the French thinker Michel Bitbol nicely parsed the distinction between feeling and registering by pointing to the difference between the subjective statement ‘I feel hot’, and the objective assertion that ‘The temperature of this room is higher than the boiling point of alcohol’ – a statement that is amenable to test by thermometer.”

I Feel Therefore I Am, Aeon Magazine (Dec. 1, 2015)

Neuroscience does objective just fine, but meets its match with subjective.

“The question of how the brain produces the feeling of subjective experience, the so-called ‘hard problem’, is a conundrum so intractable that one scientist I know refuses even to discuss it at the dinner table. Another, the British psychologist Stuart Sutherland, declared in 1989 that ‘nothing worth reading has been written on it’.”

The Mental Block – Consciousness Is The Greatest Mystery In Science.

Recently though, neuroscience has unleashed new urgency on the hard problem:

“For long periods, it is as if science gives up on the subject in disgust. But the hard problem is back in the news, and a growing number of scientists believe that they have consciousness, if not licked, then at least in their sights.

“A triple barrage of neuroscientific, computational and evolutionary artillery promises to reduce the hard problem to a pile of rubble. Today’s consciousness jockeys talk of p‑zombies and Global Workspace Theory, mirror neurons, ego tunnels, and attention schemata. They bow before that deus ex machina of brain science, the functional magnetic resonance imaging (fMRI) machine.”

The Mental Block – Consciousness Is The Greatest Mystery In Science.

Impressive, but are they making progress? Not so much.

“Their work is frequently very impressive and it explains a lot. All the same, it is reasonable to doubt whether it can ever hope to land a blow on the hard problem.”

The Mental Block – Consciousness Is The Greatest Mystery In Science.

The quest to map and measure the “personalized feeling level” of consciousness has taken researchers to some odd places indeed — as we saw in the video featured last time. Zombies also feature prominently:

“All those tests still face what you might call the zombie problem. How do you know your uncle, let alone your computer, isn’t a pod person – a zombie in the philosophical sense, going through the motions but lacking an internal life? He could look, act, and talk like your uncle, but have no experience of being your uncle. None of us can ever enter another mind, so we can never really know whether anyone’s home.”

Consciousness Creep, Aeon Magazine (Feb. 25, 2016)

More about Zombies and other consciousness conundrums coming up, along with a look at what made consciousness shoot to the top of the unsolved scientific mysteries pile.

[1] Encyclopedia Britannica

[2] We’ll see later in this series what made illuminating the human mind so critical to science in general, not just neuroscience in particular.

Knowledge, Conviction, and Belief [9]:  Reckoning With Mystery

[Image: Pontius Pilate]

“What is truth?”
Pontius Pilate
John 18:38 (NIV)

On the science side of Cartesian dualism, truth must be falsifiable: it must be possible, at least in principle, to show it untrue. On the religious side, to falsify is to doubt, doubt becomes heresy, and heresy meets the bad end it deserves.

Neither side likes mystery, because both are trying to satisfy a more primal need:  to know, explain, and be right. It’s a survival skill:  we need to be right about a lot of things to stay alive, and there’s nothing more primal to a mortal being than staying alive. Mystery is nice if you’ve got the time, but at some point it won’t help you eat and avoid being eaten.

Science tackles mysteries with experiments and theories, religion with doctrine and ritual. Both try to nail their truth down to every “jot and tittle,” while mystery bides its time, aloof and unimpressed.

I once heard a street preacher offer his rationale for the existence of God. “Think about how big the universe is,” he said. “It’s too big for me to understand. There has to be a God behind it.” That’s God explained on a street corner: “I don’t get it, so there has to be a higher-up who does. His name is God.” The preacher’s God has the expansive consciousness we lack, and if we don’t always understand, that’s part of the deal:

“For my thoughts are not your thoughts,
neither are your ways my ways,”
declares the Lord.
“As the heavens are higher than the earth,
so are my ways higher than your ways
and my thoughts than your thoughts.”

Isaiah 55:8-9 (NIV)

Compare that to a cognitive neuroscientist’s take on our ability to perceive reality, as explained in this video.

“Many scientists believe that natural selection brought our perception of reality into clearer and deeper focus, reasoning that growing more attuned to the outside world gave our ancestors an evolutionary edge. Donald Hoffman, a cognitive scientist at the University of California, Irvine, thinks that just the opposite is true. Because evolution selects for survival, not accuracy, he proposes that our conscious experience masks reality behind millennia of adaptions for ‘fitness payoffs’ – an argument supported by his work running evolutionary game-theory simulations. In this interview recorded at the HowTheLightGetsIn Festival from the Institute of Arts and Ideas in 2019, Hoffman explains why he believes that perception must necessarily hide reality for conscious agents to survive and reproduce. With that view serving as a springboard, the wide-ranging discussion also touches on Hoffman’s consciousness-centric framework for reality, and its potential implications for our everyday lives.”
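The evolutionary game simulations Hoffman refers to are far more elaborate than anything that fits here, but a toy caricature (my own, with a made-up payoff function, not Hoffman’s model) shows the core move: when fitness payoffs are not monotonic in the underlying quantity, an agent that perceives only payoff out-earns one that perceives the truth.

```python
import random

def payoff(quantity):
    # made-up fitness payoff, not monotonic in the true quantity:
    # too little or too much of the resource is worthless
    return max(0.0, 10 - (quantity - 5) ** 2)

def compete(trials=10000, seed=1):
    rng = random.Random(seed)
    truth_score = fitness_score = 0.0
    for _ in range(trials):
        a, b = rng.uniform(0, 10), rng.uniform(0, 10)
        # "truth" perceiver sees real quantities and grabs the bigger one
        truth_score += payoff(max(a, b))
        # "fitness" perceiver sees only payoffs and grabs the better one
        fitness_score += max(payoff(a), payoff(b))
    return truth_score, fitness_score

truth, fitness = compete()
print(f"truth-seer: {truth:.0f}   fitness-seer: {fitness:.0f}")
```

The caricature’s point: grabbing the world “as it is” (more resource = better) loses to seeing only the fitness interface, which is roughly Hoffman’s claim about what natural selection tunes perception for.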

The video is 40 minutes long, but a few minutes will suffice to make today’s point. Prof. Hoffman admits his theory is counterintuitive and bizarre, but promises he’s still working on it (moving it toward falsifiability). I personally favor scientific materialism’s explanation of consciousness, and I actually get the theory behind Prof. Hoffman’s ideas, but when I watch this I can’t help but think it’s amazing how far science and religion will go to define their versions of how things work. That’s why I quit trying to read philosophy:  all that meticulous logic trying to block all exits and close all loopholes, but sooner or later some mystery leaks out a seam, and when it does the whole thing seems overwrought and silly.
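
Hoffman’s “fitness beats truth” argument can be illustrated with a toy simulation. This is my own minimal sketch, not Hoffman’s actual model — the payoff function, strategies, and parameters are all invented for illustration. The point it demonstrates: when payoff is non-monotonic in a resource (too little or too much is bad), an agent tuned to perceive fitness payoffs outscores one that perceives the true quantities.

```python
import random

# Payoff is non-monotonic in the resource quantity: too little or too much
# is bad (think water or salt); a middling amount pays best.
def payoff(quantity):
    return max(0.0, 1.0 - abs(quantity - 0.5) * 2)

def truth_choice(options):
    # "Truth" strategy: perceives actual quantities, picks the largest.
    return max(options)

def fitness_choice(options):
    # "Fitness" strategy: perceives only the payoffs, picks what pays best.
    return max(options, key=payoff)

def run(trials=10_000, seed=1):
    rng = random.Random(seed)
    truth_score = fitness_score = 0.0
    for _ in range(trials):
        options = [rng.random() for _ in range(3)]  # three resource patches
        truth_score += payoff(truth_choice(options))
        fitness_score += payoff(fitness_choice(options))
    return truth_score / trials, fitness_score / trials

truth_avg, fitness_avg = run()
print(truth_avg < fitness_avg)  # fitness-tuned perception wins
```

Seeing the true quantities only helps if more is always better; once the fitness landscape bends, accurate perception stops being the winning adaptation — which is the intuition behind Hoffman’s simulations.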

The street preacher thinks reality is out there, and we’re given enough brain to both get by and know when to quit trying and trust a higher intelligence that has it all figured out. The scientist starts in here, with the brain (“the meat that thinks”), then tries to describe how it creates a useful enough version of reality to help us get by in the external world.

The preacher likes the eternal human soul; the scientist goes for the bio-neuro-cultural construction we call the self. Positions established, each side trades metaphysical potshots with the other. For example, when science clamors after the non-falsifiable multiverse theory of quantum physics, the intelligent designers gleefully point out that the so-called scientists are leapers of faith just like them:

“Unsurprisingly, the folks at the Discovery Institute, the Seattle-based think-tank for creationism and intelligent design, have been following the unfolding developments in theoretical physics with great interest. The Catholic evangelist Denyse O’Leary, writing for the Institute’s Evolution News blog in 2017, suggests that: ‘Advocates [of the multiverse] do not merely propose that we accept faulty evidence. They want us to abandon evidence as a key criterion for acceptance of their theory.’ The creationists are saying, with some justification: look, you accuse us of pseudoscience, but how is what you’re doing in the name of science any different? They seek to undermine the authority of science as the last word on the rational search for truth.

“And, no matter how much we might want to believe that God designed all life on Earth, we must accept that intelligent design makes no testable predictions of its own. It is simply a conceptual alternative to evolution as the cause of life’s incredible complexity. Intelligent design cannot be falsified, just as nobody can prove the existence or non-existence of a philosopher’s metaphysical God, or a God of religion that ‘moves in mysterious ways’. Intelligent design is not science: as a theory, it is simply overwhelmed by its metaphysical content.”

But Is It Science? Aeon Magazine, Oct. 7, 2019.

And so it goes. But what would be so wrong with letting mystery stay… well, um… mysterious?

We’ll look at that next time.

Knowledge, Conviction, and Belief [5]: Looking For the Self in the Brain

My soul is lost, my friend
Tell me how do I begin again?
My city’s in ruins,
My city’s in ruins.

Bruce Springsteen

Neuroscience looks for the soul in the brain and can’t find it. What it finds instead are the elements of consciousness — sensory perception, language, cognition, memory,  etc. — in various neural networks and regions of the brain, and those diverse networks collaborating to generate a composite conscious experience. Meanwhile, the master network — the one that is equivalent to conventional notions of the soul or self — remains elusive.

Prof. Bruce Hood lays out the progression from conventional belief in a separate self to the current brain network theory:

“Psychologist Susan Blackmore makes the point that the word “illusion” does not mean that it does not exist — rather an illusion is not what it seems. We all certainly experience some form of self, but what we experience is a powerful deception generated by our brains for our own benefit.

“Understanding that the self could be an illusion is really difficult… Our self seems so convincing, so real to us. But then again, many aspects of our experience are not what they seem.

“In challenging what is the self, what most people think is the self must first be considered. If you were to ask the average person in the street about their self, they would most likely describe the individual who inhabits their body. They believe they are more than just their bodies. Their bodies are something their selves control. When we look in the mirror, we regard the body as a vessel we occupy.

“This sense that we are individual inside bodies is sometimes called the ‘ego theory,’ although philosopher Galen Strawson captures it poetically in what he calls the ‘pearl view’ of the self. The pearl view is the common notion that our self is an essential entity at the core of our existence that holds steady throughout our life. The ego experiences life as a conscious, thinking person with a unique historical background that defines who he or she is. This is the ‘I’ that looks back in the bathroom mirror and reflects who is the ‘me.’

“In contrast to this ego view, there is an alternative version of the self, based on the ‘bundle theory’ after the Scottish Enlightenment philosopher David Hume… He tried to describe his inner self and thought that there was no single entity, but rather bundles of sensations, perceptions and thoughts piled on top of each other. He concluded that the self emerged out of the bundling together of these experiences.

“If the self is the sum of our thoughts and actions, then the first inescapable fact is that these depend on brains. Thoughts and actions are not exclusively the brain because we are always thinking about and acting upon things in the world with our bodies, but the brain is primarily responsible for coordinating these activities. In effect, we are our brains or at least, the brain is the most critical body part when it comes to who we are.

“There is no center in the brain where the self is constructed. The brain has many distributed jobs. It processes incoming information from the external world into meaningful patterns that are interpreted and stored for future reference. It generates different levels and types of motivations that are the human drives, emotions, and feelings. It produces all sorts of behavior — some of them automatic while others are acquired through skill, practice, and sheer effort.

“The sense of self that most of us experience is not to be found in any one area. Rather it emerges out of the orchestra of different brain processes.”

The Self Illusion:  How the Social Brain Creates Identity, Bruce Hood (2012)

Princeton neuroscientist Michael Graziano uses an “attention schema theory” to describe this collaboration of neural networks. “The heart of the theory is that awareness is a schematized, descriptive model of attention,” he says, and expands as follows:

“In the present theory, the content of consciousness, the stuff in the conscious mind, is distributed over a large set of brain areas, areas that encode vision, emotion, language, action plans, and so on. The full set of information that is present in consciousness at any one time has been called the ‘global workspace.’ In the present theory, the global workspace spans many diverse areas of the brain. But the specific property of awareness, the essence of awareness added to the global workspace, is constructed by an expert system in a limited part of the brain…. The computed property of awareness can be bound to the larger whole… One could think of awareness as information.”

Consciousness and the Social Brain. Michael S. A. Graziano (2013)
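
The division of labor Graziano describes can be caricatured in a few lines of code. This is an illustrative sketch only — the class names and structure are mine, not Graziano’s: specialized systems contribute contents to a “workspace,” and a separate expert system builds a simplified model of what is being attended to.

```python
from dataclasses import dataclass

@dataclass
class Signal:
    source: str      # which specialized system produced it
    content: str     # what it encodes
    salience: float  # how strongly it competes for attention

def global_workspace(signals):
    """The 'workspace' is just whatever the specialized systems currently
    contribute; nothing at this level is aware of anything."""
    return {s.source: s for s in signals}

def attention_schema(workspace):
    """A separate expert system builds a simplified *model* of what the
    brain is attending to. On this view, that model is the awareness."""
    focus = max(workspace.values(), key=lambda s: s.salience)
    return f"I am aware of {focus.content} (via {focus.source})"

signals = [
    Signal("vision", "a red ball", 0.9),
    Signal("audition", "a distant siren", 0.4),
    Signal("language", "the word 'ball'", 0.6),
]
print(attention_schema(global_workspace(signals)))
# → I am aware of a red ball (via vision)
```

The design choice worth noticing is that awareness here is not a place or a substance but a computed description bound to the rest of the workspace — “awareness as information,” in Graziano’s phrase.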

To those who hold fast to the common belief (as most people do) that the soul is something transcendent, noble, unique, special, poetic, and divine, referring to consciousness and the self as “global workspace” and calling awareness “information” lacks a little something. But is that any reason to reject the bundle theory as untrue?

Meanwhile, Prof. Graziano admits that “the attention schema theory does not even seek to answer the question of existential reality but instead tries to describe what is constructed by the brain.” And besides, is science really after truth anyway?

We’ll look at those questions next time.

Knowledge, Conviction, and Belief [2]: Cultural Belief and Mass Delusion

We think we have an independent ability to think and believe as we like, to know this or be convinced about that. But that’s not the whole story:  our outlook is also shaped by our cultural context.

As we’ve seen, when enough people agree about what is true — whether they “know” it or are “convinced” of it — their agreement becomes a cultural belief system — for example, as reflected in a religion, country, neighborhood, business, athletic team, or other institution. Cultural belief systems are wired into the neural pathways of individual members, and as the culture coalesces, its belief system takes on a life of its own through a process known as “emergence.” As the emergent belief system is increasingly reflected in and reinforced by cultural institutions, it is increasingly patterned into the neural pathways of the culture’s members, where it defines individual and collective reality and sense of identity. The belief system becomes The Truth, defining what the group and its members know and are convinced of.

Throughout this process, whether the culture’s beliefs are true in any non-subjective sense loses relevance. The result is what physician and author Paul Singh refers to as “mass delusion”:

“[When a conviction moves from an individual to being widely held], its origins are rooted in a belief system rather than in an individual’s pathological condition. It is a mass delusion of the sort that poses no immediate threat to anyone or society. Mass delusions can become belief systems that are passed from generation to generation.”

The Great Illusion:  The Myth of Free Will, Consciousness, and the Self, Paul Singh (2016)

For a dramatic example of this concept in action, consider an experience described by Jesse Jackson:

“There is nothing more painful to me at this stage in my life than to walk down the street and hear footsteps… then turn around and see somebody white and feel relieved.”

Despite a lifetime of civil rights leadership, Jackson’s cultural neural conditioning betrayed him. What he experienced was not just personal to him; it conformed to a cultural belief system. The particular “mass delusion” involved has been confirmed by clinical research.

“Matthew Lieberman, a psychologist at the University of California, recently showed how beliefs help people’s brains categorise others and view objects as good or bad, largely unconsciously. He demonstrated that beliefs (in this case prejudice or fear) are most likely to be learned from the prevailing culture.

“When Lieberman showed a group of people photographs of expressionless black faces, he was surprised to find that the amygdala — the brain’s panic button — was triggered in almost two-thirds of cases. There was no difference in the response between black and white people.”

Where Belief Is Born, The Guardian (June 30, 2005)

When cultural beliefs are not constantly reinforced — by cultural norms of thought, language, practice, etc. — the neural networks that support them can weaken, allowing opportunity for new beliefs.

“‘Beliefs are mental objects in the sense that they are embedded in the brain,’ says [Kathleen Taylor, a neuroscientist at Oxford University] ‘If you challenge [beliefs] by contradiction, or just by cutting them off from the stimuli that make you think about them, then they are going to weaken slightly. If that is combined with very strong reinforcement of new beliefs, then you’re going to get a shift in emphasis from one to the other.’”

Where Belief Is Born
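
Taylor’s mechanism — decay of beliefs cut off from their reinforcing stimuli, combined with strong reinforcement of new ones — can be caricatured numerically. This is a toy model with invented parameters, not anything from Taylor’s work; it just shows how the shift in emphasis she describes falls out of simple decay-plus-reinforcement dynamics.

```python
# Belief "strength" in [0, 1]: decays without reinforcing stimuli,
# grows toward 1.0 with repetition (rate and decay are invented).
def update(strength, reinforced, rate=0.2, decay=0.05):
    if reinforced:
        return strength + rate * (1.0 - strength)
    return strength * (1.0 - decay)

old_belief, new_belief = 0.9, 0.1
for _ in range(50):  # cut off the old stimuli, repeat the new message
    old_belief = update(old_belief, reinforced=False)
    new_belief = update(new_belief, reinforced=True)

print(old_belief < 0.1 < 0.9 < new_belief)  # emphasis has shifted
```

Neither belief is ever “refuted” in this sketch; the old one simply starves while the new one is fed — which is why isolation and endless repetition do most of the work in the brainwashing methods Taylor describes.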

This helps to explain, for example, why religious believers are more likely to “fall away” if they are “out of fellowship.” Or what can happen to a student off to college, a world traveler, or an immigrant. It also helps to explain why leaders and despots alike can manipulate brain networks to create cultural belief systems to fit their desired ends:

“In her book on the history of brainwashing, Taylor describes how everyone from the Chinese thought reform camps of the last century to religious cults have used systematic methods to persuade people to change their ideas, sometimes radically.

“The mechanism Taylor describes is similar to the way the brain learns normally. In brainwashing though, the new beliefs are inserted through a much more intensified version of that process.

“The first step is to isolate a person and control what information they receive. Their former beliefs need to be challenged by creating uncertainty. New messages need to be repeated endlessly. And the whole thing needs to be done in a pressured, emotional environment.

“Stress affects the brain such that it makes people more likely to fall back on things they know well – stereotypes and simple ways of thinking,” says Taylor.

“This manipulation of belief happens every day. Politics is a fertile arena, especially in times of anxiety.”

Where Belief Is Born

More next time.

“Before You Were Born I Knew You”

The Summoner in Chaucer’s The Canterbury Tales,
Ellesmere MSS, circa 1400

Last time we looked at the common dualistic paradigm of consciousness, which is based on (a) the belief that humans are made in two parts — an ethereal self housed in a physical body — and (b) the corollary belief that religion and the humanities understand the self best, while science is the proper lens for the body.

Current neuroscience theorizes instead that consciousness arises from brain, body, and environment — all part of the physical, natural world, and therefore best understood by scientific inquiry.

We looked at the origins of the dualistic paradigm last time. This week, we’ll look at an example of how it works in the world of jobs and careers —  particularly the notion of being “called” to a “vocation.”

According to the Online Etymology Dictionary, the notion of “calling” entered the English language around Chaucer’s time, originating from Old Norse kalla — “to cry loudly, summon in a loud voice; name, call by name.” Being legally summoned wasn’t a happy thing in Chaucer’s day (it still isn’t), and summoners were generally wicked, corrupt, and otherwise worthy of Chaucer’s pillory in The Friar’s Tale.

“Calling” got an image upgrade a century and a half later, in the 1550s, when the term acquired the connotation of “vocation, profession, trade, occupation.” Meanwhile, “vocation” took on the meaning of “spiritual calling,” from Old French vocacio, meaning “call, consecration; calling, profession,” and Latin vocationem — “a calling, a being called” to “one’s occupation or profession.”

“Calling” and “vocation” together support the common dream of being able to do the work we were born to do, and the related belief that this would make our work significant and us happy. The idea of vocational calling is distinctly Biblical:[1]

“Before I formed you in the womb I knew you,
and before you were born I consecrated you;
I appointed you a prophet to the nations.”

Jeremiah 1:5 (ESV)

Something in us — an evolutionary survival instinct, I would guess — wants to be known, especially by those in power. Vocational calling invokes power at the highest level:  never mind your parents’ hormones, you were a gleam in God’s eye; and never mind the genes you inherited, God coded vocational identity and purpose into your soul.

2600 years after Jeremiah, we’re still looking for the same kind of affirmation.

“Amy Wrzesniewski, a professor at Yale School of Management and a leading scholar on meaning at work, told me that she senses a great deal of anxiety among her students and clients. ‘They think their calling is under a rock,’ she said, ‘and that if they turn over enough rocks, they will find it.’ If they do not find their one true calling, she went on to say, they feel like something is missing from their lives and that they will never find a job that will satisfy them. And yet only about one third to one half of people whom researchers have surveyed see their work as a calling. Does that mean the rest will not find meaning and purpose in their careers?”

The Power of Meaning:  Crafting a Life That Matters, Emily Esfahani Smith

If only one-third to one-half of us feel like we’re living our vocational calling, then why do we hang onto the dream? Maybe the problem is what Romantic Era poet William Wordsworth wrote about in his Ode:  Intimations of Immortality:

“Our birth is but a sleep and a forgetting:
The Soul that rises with us, our life’s Star,
Hath had elsewhere its setting,
And cometh from afar:
Not in entire forgetfulness,
And not in utter nakedness,
But trailing clouds of glory do we come
From God, who is our home:
Heaven lies about us in our infancy!

“Shades of the prison-house begin to close
Upon the growing Boy,
But he beholds the light, and whence it flows,
He sees it in his joy;
The Youth, who daily farther from the east
Must travel, still is Nature’s Priest,
And by the vision splendid
Is on his way attended;
At length the Man perceives it die away,
And fade into the light of common day.”

I.e., maybe something tragic happens when an immortal self comes to live in a mortal body. This, too, is a common corollary belief to body/soul dualism — religion’s distrust of “the flesh” is standard issue.

Cognitive neuroscientist Christian Jarrett offers career advice to the afflicted:  you might be able to turn the job you already have into a calling if you invest enough in it, or failing that, you might find your source of energy and determination somewhere else than in your work. This Forbes article reaches a similar conclusion:

“Years ago, I read a very thought-provoking article by Michael Lewis … about the difference between a calling and a job. He had some powerful insights. What struck me most were two intriguing concepts:

‘There’s a direct relationship between risk and reward. A fantastically rewarding career usually requires you to take fantastic risks.’

‘A calling is an activity that you find so compelling that you wind up organizing your entire self around it — often to the detriment of your life outside of it.’”

I.e., maybe career satisfaction isn’t heaven-sent; maybe instead it’s developed in the unglamorous daily grind of life in the flesh.

More on historical roots and related beliefs coming up.

[1] For more Biblical examples, see Isaiah 44:24:  “Thus says the Lord, your Redeemer, who formed you from the womb”; Galatians 1:15:  “But when he who had set me apart before I was born”; Psalm 139:13, 16:  “For you formed my inward parts; you knitted me together in my mother’s womb; your eyes saw my unformed substance; in your book were written, every one of them, the days that were formed for me, when as yet there was none of them.”

Mirror, Mirror, on the Wall…


“If you were to ask the average person in the street about their self, they would most likely describe the individual who inhabits their body. They believe they are more than just their bodies. Their bodies are something their selves control. When we look in the mirror, we regard the body as a vessel we occupy.

“The common notion [is] that our self is an essential entity at the core of our existence that holds steady throughout our life. The ego experiences life as a conscious, thinking person with a unique historical background that defines who he or she is. This is the ‘I’ that looks back in the bathroom mirror and reflects who is the ‘me.’”

The Self Illusion:  How the Social Brain Creates Identity, Bruce Hood[1] (2012)

The idea that we are a self riding through life in a body is deeply ingrained in western thinking. Descartes gets most of the credit for it, but its religious and philosophical roots are much more ancient. (The same is true of the eastern, Buddhist idea that there’s no such thing as a self. We’ll talk origins another time.)

Descartes’ dualism has the curious effect of excusing us from thinking too closely about what we mean by it. It does this by locating the body in the physical, biological, natural world while placing the self in a transcendent realm that parallels the natural world but remains apart from it. The body, along with the rest of the natural world, is the proper subject of scientific inquiry, but the self and its ethereal realm remain inscrutable, the province of faith and metaphor, religion and the humanities. David P. Barash[2] captures the implications of this dichotomy in Through a Glass Brightly:  Using Science to See Our Species as We Really Are (2018):

“Science differs from theology and the humanities in that it is made to be improved on and corrected over time… By contrast, few theologians, or fundamentalist religious believers of any stripe, are likely to look kindly on ‘revelations’ that suggest corrections to their sacred texts. In 1768, Baron d’Holbach, a major figure in the French Enlightenment, had great fun with this. In his satire Portable Theology (written under the pen name Abbe Bernier, to hide from the censors), d’Holbach defined Religious Doctrine as ‘what every good Christian must believe or else be burned, be it in this world or the next. The dogmas of the Christian religion are immutable decrees of God, who cannot change His mind except when the Church does.’

“By contrast, science not only is open to improvement and modifications but also is to a large extent defined by this openness. Whereas religious practitioners who deviate from their traditions are liable to be derogated — and sometimes killed — for this apostasy …, science thrives on correction and adjustment, aiming not to enshrine received wisdom and tradition but to move its insights closer to correspondence with reality as found in the natural world.”

Attempts to bridge the realms of body and soul end up in pseudo-science, eventually discredited and often silly. Consider for example the ether (or sometimes “aether”) — a term that since Plato and Aristotle has been applied to both the rarefied air only the gods can breathe and the stuff light moves through in interstellar space.[3]

You don’t need to be a poet or a prophet to think the self is inviolate. It’s just so obvious to most of us that there’s a self inside who watches and knows all about us — who in fact is us. We experience it as that never-silent internal voice — observing and commenting, often critiquing, sometimes shaming — that always seems to be accurate. We’ve been hearing it for as long as we can remember:  it’s embedded in our brain’s memory banks, all the way back to when we first started remembering things and using language to describe and record them.

We have always been aware that we are aware:
we don’t just observe, we observe ourselves observing.

Hence the belief that we are body and soul seems not worth challenging. Which is why, in keeping with this blog’s purpose, we’re going to do precisely that.

Cartesian dualism is foundational to self-awareness and to our cultural beliefs and institutions. It guides everything from religious faith to criminal culpability, health and wellbeing to mental illness. And so much more. As a result, taking a closer look will not only challenge our perceptions of what is real, it will shake reality itself. This inquiry won’t be easy. Again from The Self Illusion:

“Understanding that the self could be an illusion is really difficult… Our self seems so convincing, so real to us. But then again, many aspects of our experience are not what they seem.

“Psychologist Susan Blackmore makes the point that the word ‘illusion’ does not mean that it does not exist — rather an illusion is not what it seems. We all certainly experience some form of self, but what we experience is a powerful deception generated by our brains for our own benefit.”

That’s where we’re going. Hang tight.

[1] Bruce Hood is an experimental psychologist at the University of Bristol. He specializes in developmental cognitive neuroscience.

[2] David P. Barash is an evolutionary biologist and professor of psychology emeritus at the University of Washington.

[3] For a useful primer, see The Eternal Quest for Aether, the Cosmic Stuff That Never Was, Popular Mechanics (Oct 19, 2018).