Debunking the Debunkers

Then I’ll get on my knees and pray
We don’t get fooled again

Meet the new boss
Same as the old boss

The Who[1]

Debunkers believe we’ll be better off without all the bunk. If only it were that simple: the basic premise of debunking might not hold up, the “truth” that lies on the other side of bunk is elusive, and there are strong social forces that oppose it. Plus, once free of it, we tend to replace old bunk with new.

UVA Professor Emily Ogden defines “bunk”:

“‘Bunk’ means baloney, hooey, bullshit. Bunk isn’t just a lie, it’s a manipulative lie, the sort of thing a con man might try to get you to believe in order to gain control of your mind and your bank account. Bunk, then, is the tool of social parasites, and the word ‘debunk’ carries with it the expectation of clearing out something that is foreign to the healthy organism. Just as you can deworm a puppy, you can debunk a religious practice, a pyramid scheme, a quack cure. Get rid of the nonsense, and the polity – just like the puppy – will fare better. Con men will be deprived of their innocent marks, and the world will take one more step in the direction of modernity.”[2]

Sounds great, but can debunking actually deliver?

“Debunk is a story of modernity in one word – but is it a true story? Here’s the way this fable goes. Modernity is when we finally muster the reason and the will to get rid of all the self-interested deceptions that aristocrats and priests had fobbed off on us in the past. Now, the true, healthy condition of human society manifests itself naturally, a state of affairs characterised by democracy, secular values, human rights, a capitalist economy and empowerment for everyone (eventually; soon). All human beings and all human societies are or ought to be headed toward this enviable situation.”

Once somebody calls something a “fable” you know it’s in trouble. Plus, there’s no indisputable “truth” waiting to be found once the bunk is cleared out.

“There is no previously existing or natural secular order that will assert itself when we get the bunk out… There is no neutral, universal goal of progress toward which all peoples are progressing; instead, the claim that such a goal ought to be universal has been a means of exploiting and dispossessing supposedly ‘backward’ peoples.”

The underlying problem with debunking seems to be the assumptions we make — about what’s true and false, what we’ll find when we sort one from the other, and most importantly, who’s qualified to do that. Debunking requires what cultural anthropologist Talal Asad has called “secular agents” – a species that may not actually exist.

“Secular agency is the picture of selfhood that Western secular cultures have often wanted to think is true. It’s more an aspiration than a reality. Secular agents know at any given moment what they do and don’t believe. When they think, their thoughts are their own. The only way that other people’s thoughts could become theirs would be through rational persuasion. Along similar lines, they are the owners of their actions and of their speech. When they speak, they are either telling the truth or lying. When they act, they are either sincere or they are faking it… Modernity, in this picture, is when we take responsibility for ourselves, freeing both society and individuals from comforting lies.”

I.e., secular agency is a high standard we mostly fall short of. Instead, we do our best to conform to social conventions even if we don’t personally buy into them. A sports star points to the sky after a home run, a touchdown, a goal, acknowledging the help of somebody or Somebody up there… a eulogy talks about a deceased loved one “looking down on us”… a friend asks us to “think good thoughts” for a family member going into surgery… We don’t buy the Somebody up there helping us, the “looking down,” or the “good thoughts,” but we don’t speak up. Instead, we figure there’s a time and place for honesty and confrontation, and this isn’t one of them.[3]

“Life includes a great many passages in which we place the demands of social bonds above strict truth…. [In] the context of some of the stories we tell collaboratively in our relationships with others, the question of lying or truth does not arise. We set it aside. We apply a different framework, something more like the framework we apply to fiction: we behave as if it were true.”

So what’s left of debunking? Well, it still has its place, especially when it’s used to call the Bunk Lords to account.

“What then is debunking? It can be a necessary way of setting the record straight. I’m by no means opposed to truth-telling. We need fact-checkers. The more highly placed the con artist, the more his or her deceptions matter. In such cases, it makes sense to insist on hewing to the truth.

“[On the other hand,] the social dynamics of debunking should not be overlooked …, especially when the stakes aren’t particularly high – when the alleged lie in question is not doing a whole lot of harm.”

To Play Along or Not to Play Along

When I was a late adolescent and lurching my way toward the Christian faith, a seminary student advised me, “Sometimes you just need to act as if something is true. You do that long enough, and maybe it will actually become true” – which I took to mean that, even if you’re full of yourself right now, in the long haul you might be happier fitting in.

Maybe, maybe not. You might also feel that, since the things we believe are always in progress anyway, why not be real about what’s up for you right now?

“At these times, what is debunking? It’s a performed refusal to play along.… It’s the announcement that one rejects the as-if mode in which we do what social bonds require.”

Plus, there seems to be a countervailing urge that sometimes prevails over playing nice socially: the feeling that we finally got it figured out, the scales fell from our eyes and we can see clearly now, we can see life for what it really is. Get to that beatific place, and you want to tell everybody, even if it steps on their toes – which it does, but being newly enlightened and detoxed, you can’t help yourself.

Thus the “as if” game becomes a choice: playing along preserves social currency, opting out drains it. Which do you want?

Why Bother?

There’s also the “Why bother?” issue. Debunking is often preaching to the choir while the unconverted stay that way – in fact, they never even hear what you have to say; it never shows up in their feed.

“The theory of cognitive dissonance—the extreme discomfort of simultaneously holding two thoughts that are in conflict—was developed by the social psychologist Leon Festinger in the 1950s. In a famous study, Festinger and his colleagues embedded themselves with a doomsday prophet named Dorothy Martin and her cult of followers who believed that spacemen called the Guardians were coming to collect them in flying saucers, to save them from a coming flood. Needless to say, no spacemen (and no flood) ever came, but Martin just kept revising her predictions. Sure, the spacemen didn’t show up today, but they were sure to come tomorrow, and so on. The researchers watched with fascination as the believers kept on believing, despite all the evidence that they were wrong.

“‘A man with a conviction is a hard man to change,’ Festinger, Henry Riecken, and Stanley Schacter wrote in When Prophecy Fails, their 1957 book about this study. ‘Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point … Suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before.’

“This doubling down in the face of conflicting evidence is a way of reducing the discomfort of dissonance, and is part of a set of behaviors known in the psychology literature as ‘motivated reasoning.’ Motivated reasoning is how people convince themselves or remain convinced of what they want to believe—they seek out agreeable information and learn it more easily; and they avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs.

“Though false beliefs are held by individuals, they are in many ways a social phenomenon. Dorothy Martin’s followers held onto their belief that the spacemen were coming … because those beliefs were tethered to a group they belonged to, a group that was deeply important to their lives and their sense of self.

“[A disciple who ignored mounting evidence of sexual abuse by his guru] describes the motivated reasoning that happens in these groups: ‘You’re in a position of defending your choices no matter what information is presented,’ he says, ‘because if you don’t, it means that you lose your membership in this group that’s become so important to you.’ Though cults are an intense example, … people act the same way with regard to their families or other groups that are important to them.”[4]

In light of all this cognitive self-preservation, not rocking the boat can seem like the more reasonable choice:

“Humans’ biggest advantage over other species is our ability to cooperate. Cooperation is difficult to establish and almost as difficult to sustain.

“Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.

“‘Reason is an adaptation to the hypersocial niche humans have evolved for themselves,’ [the authors of a seminal study] write. Habits of mind that seem weird or goofy or just plain dumb from an ‘intellectualist’ point of view prove shrewd when seen from a social ‘interactionist’ perspective.”[5]

But even if acting as-if is socially acceptable, sometimes you just can’t help but go after it.

Take “magical thinking” for example — a socially acceptable practice and favorite debunking target.

Magical thinking is based on a claim of cause and effect, and therefore offers a sense of predictability and control. It sounds scientific and reasonable, which makes it socially acceptable, but it’s neither: it’s faux science because you can’t test or verify it, and it’s not reasonable because there’s no logic to it — you can only believe it or not. The masquerade makes it a prime target for debunking.

“Magical thinking [is] the belief that one’s ideas, thoughts, actions, words, or use of symbols can influence the course of events in the material world. Magical thinking presumes a causal link between one’s inner, personal experience and the external physical world. Examples include beliefs that the movement of the Sun, Moon, and wind or the occurrence of rain can be influenced by one’s thoughts or by the manipulation of some type of symbolic representation of these physical phenomena.

“Magical thinking became an important topic with the rise of sociology and anthropology in the 19th century. It was argued that magical thinking is an integral feature of most religious beliefs, such that one’s inner experience, often in participation with a higher power, could influence the course of events in the physical world.

“Prominent early theorists suggested that magical thinking characterized traditional, non-Western cultures, which contrasted with the more developmentally advanced rational-scientific thought found in industrialized Western cultures. Magical thinking, then, was tied to religion and ‘primitive’ cultures and considered developmentally inferior to the scientific reasoning found in more ‘advanced’ Western cultures.” [6]

Recent converts are notorious for their intolerance of whatever they just left behind[7] and are therefore the least likely to play along with social convention. So, suppose you’re a recent convert from magical thinking and someone drops one of those refrigerator magnet aphorisms. You’ll weigh a lot of factors in the next instant, but sometimes there are just some things people need to stop believing, so you’ll go ahead and launch, social peace-keeping be damned. You do that in part because you’re aware of your own susceptibility to temptation. This is from Psychology Today[8]:

“How many times a day do you either cross your fingers, knock on wood, or worry that your good luck will turn on you? When two bad things happen to you, do you cringe in fear of an inevitable third unfortunate event? Even those of us who ‘know better’ are readily prone to this type of superstitious thinking.

“Further defying logic, we also readily believe in our own psychic powers: You’re thinking of a friend when all of a sudden your phone beeps to deliver a new text from that very person. It’s proof positive that your thoughts caused your friend to contact you at that very moment! … These are just a few examples of the type of mind tricks to which we so readily fall prey.”

The article provides a list of seven “mind tricks” taken from psychology writer Matthew Hutson’s book The 7 Laws of Magical Thinking, and invites us to “See how long it takes you to recognize some of your own mental foibles.” Here’s the list, with abbreviated commentary from the article:

  1. “Objects carry essences. We attribute special properties to items that belong or once belonged to someone we love, is famous, or has a particular quality we admire… the objects are just objects, and despite their connection with special people in our lives, they have no inherent ability to transmit those people’s powers to us.
  2. “Symbols have power. Humans have a remarkable tendency to impute meaning not only to objects but to abstract entities. We imbue these symbols with the ability to affect actual events in our lives.
  3. “Actions have distant consequences. In our constant search to control the outcomes of events in our unpredictable lives, we build up a personal library of favorite superstitious rituals or thoughts.
  4. “The mind knows no bounds. We are often impressed by the apparent coincidence that occurs when a person we’re thinking about suddenly contacts us. For just that moment, we believe the event ‘proves’ that we’re psychic.
  5. “The soul lives on. [Why] do adults hold on so stubbornly to the belief that the mind can continue even after its seat (the brain) is no longer alive? The answer, in part, comes from the terror that we feel about death.
  6. “The world is alive. We attribute human-like qualities to everything from our pets to our iPhones. We read into the faces of our pets all sorts of human emotions such as humor, disappointment, and guilt. If our latest technological toy misbehaves, we yell at it and assume it has some revenge motive it needs to satisfy.
  7. “Everything happens for a reason. The most insidious form of magical thinking is our tendency to believe that there is a purpose or destiny that guides what happens to us… For the same reason, we believe in luck, fate, and chance.”

Magical thinking is one of my personal bugaboos, therefore my personal list would be longer than seven.[9] Those things make me twitch. You?

And speaking of mortality…

Miracles: Magic Gets Personal

We can (and do) make up all kinds of things about what it’s like “up there,” but we can’t really imagine it any more than we can our own death. There’s a lot of research about why that’s so[10], but as a practical matter we have to imagine death while we’re still alive in the here and now, though to do it properly we’d have to be there and then — a problem that explains the popularity of books some call “heavenly tourism,” about people who go there and come back to tell us about it.[11]

We want our heroes and loved ones looking down on us because we miss them. Losing them makes us feel small, helpless, and powerless — like children. So we draw pictures of clouds and robes and harps and locate them there. Childish? Sure. But preferable to the idea that “they” vanished when their body and brain stopped biologically functioning. Why we prefer one over the other would become clear if we could step back and think about it, but we don’t. Instead we’re so freaked out about the trip across the River Styx that we follow convention.

For the same reasons, praying for a miracle that staves off death persists in the face of little to support it.[12]

“Writing Fingerprints of God, my 2009 book about the science of spirituality, gave me an excuse to ask a question that I never openly considered before leaving Christian Science, one that was unusually freighted: Is there any scientific evidence, anything beyond the realm of anecdote, that prayer heals?

“It turns out, the evidence is mixed. Beginning in the 1980s, we’ve seen a rash of prayer studies. Some seemed to show that patients who were prayed for recovered more quickly from heart attacks. Another study found that prayer physically helped people living with AIDS.

“But for every study suggesting that prayer heals a person’s body, there is another one showing that prayer has no effect — or even makes you worse. Does prayer help people with heart problems in a coronary care unit? Researchers at the Mayo Clinic found no effect. Does it benefit people who needed to clear their arteries using angioplasty? Not according to researchers at Duke. In another study, prayer did not ease the plight of those on kidney dialysis machines. And don’t even mention skin warts: Researchers found that people who received prayer saw the number of warts actually increase slightly, compared with those who received no prayer.

“The most famous study, and probably the most damaging for advocates of healing prayer, was conducted by Harvard researcher Herbert Benson in 2006. He looked at the recovery rates of patients undergoing cardiac bypass surgery. Those patients who knew they were receiving prayer actually did worse than those who did not know they were receiving prayer.

“In the end, there is no conclusive evidence from double-blind, randomized studies that suggests that intercessory prayer works.

“Prayer studies are a ‘wild goose chase that violate everything we know about the universe,’ Richard Sloan, professor of behavioral medicine at Columbia University Medical Center and author of Blind Faith, told me: ‘There are no plausible mechanisms that account for how somebody’s thoughts or prayers can influence the health of another person. None.’”

“And yet,” the author continues, “science has embraced a sliver of my childhood faith, a century after Mary Baker Eddy ‘discovered’ Christian Science in the late 1800s. If scientists don’t buy intercessory prayer, most do agree that there is a mind-body connection.” She also finds some connections in “another new ‘science,’ called ‘neurotheology,’” citing how the stimulation of certain brain areas can deliver the same sensations as meditation, contemplative prayer, spiritual ecstasy, and even out-of-body experiences. As a result, she wonders if the brain might act as a kind of radio: “Is the brain wired to connect with a dimension of reality that our physical senses cannot perceive?”

“Researchers have tried to replicate such out-of-body experiences, which are always after-the-fact anecdotes that cannot be tested. These experiences, they say, suggest that consciousness can exist separate from the brain — in other words, that there may be a transcendent reality that we tap into when brain functioning ceases.

“I am not asking you to believe that consciousness can continue when the brain is not functioning, that there is a God who answers prayer, or that people who pray or meditate connect with another reality. I’m not asking you to believe that all mystical or inexplicable experiences are simply the interaction of chemicals in the brain or firings of the temporal lobe. That’s the point: You don’t have to choose. Because neither side possesses the slam-dunk argument, the dispositive evidence that proves that there is a God, or there isn’t.”

I.e., she’s saying that the impermeable curtain of death means we can’t prove or disprove either the brain-as-a-radio theory or the materialist belief that when your body stops, so do you. Thus we’re free to choose, and one’s as viable as the other. Obviously, unlike the Psychology Today writer, this ex-Christian Scientist is not a committed debunker. On the other hand, her reference to the lack of “dispositive evidence that proves that there is a God, or there isn’t” takes us straight to the ultimate debunking target.

Debunking God (or not)

God is the ultimate debunking target (patriotism is a close second), and the “New Atheists[13]” are the ultimate God debunkers. They’ve also been roundly criticized for being as fundamentalist and evangelical as the fundamentalists and evangelicals they castigate.[14] That’s certainly how I respond to them. I discovered them when I was fresh in my awareness that I’d become an atheist. I put their books on my reading list, read a couple, and deleted the rest. I’d left the fighting fundamentalists behind, and had no desire to rejoin the association. On the other hand, I am grateful to them for making it easier for the rest of us to come out as atheist – something that current social convention makes more difficult than coming out gay.[15]

From what I can tell, there are lots of people like me who didn’t become atheists by being clear-thinking and purposeful[16]; it was just something that happened over time, until one day they checked the “none” box beside “religious affiliation.” Atheism wasn’t an intellectual trophy we tried to win, it was a neighborhood we wandered into one day and were surprised to find we had a home there. As one writer said,

“My belief in God didn’t spontaneously combust—it faded.

“I wasn’t the only kid who stopped believing. A record number of young Americans (35 percent) report no religious affiliation, even though 91 percent of us grew up in religiously affiliated households.

“Our disbelief was gradual. Only 1 percent of Americans raised with religion who no longer believe became unaffiliated through a onetime “crisis of faith.” Instead, 36 percent became disenchanted, and another 7 percent said their views evolved.

“It’s like believing in Santa Claus. Psychologists Thalia Goldstein and Jaqueline Woolley have found that children’s disbelief in Santa Claus is progressive, not instantaneous. First kids think that the Santa in the mall or library is real, then they think he’s not real but still magically communicates with the actual Santa, and so on, until they finally realize that Santa is composed of costumed actors. “Kids don’t just turn [belief] off,” Goldstein says.

“Likewise, losing faith happens in pieces.”[17]

It seems fitting we would exit religion that way, since it’s the way many of us got into it in the first place. Yes, some people seem to have those Damascus Road conversions[18], or maybe a less dramatic “come to Jesus meeting,” as a friend of mine says, but more often religion just kind of seeps into us from the surrounding culture.

“I used to love this illustrated children’s Bible my mom gave me. Long-faced Jonah inside a yawning blue whale felt warm and right. My brain made these feelings. When we enjoy religious or associated experiences, like snuggling up with Mom reading the Bible, our brain’s reward circuits activate. Over time, religious ideas become rewarding in and of themselves. This is a powerful, unconscious motivation to keep believing.

“When I began to see my colorful Bible as boring and childish, those same reward circuits likely became less active. Religious experiences produced less pleasure. This happens involuntarily in people with Parkinson’s disease, which compromises the brain’s reward centers. [That is why] people who develop Parkinson’s are much more likely to lose their faith.”[19]

The New Magic – Or, maybe I’m just skeptical about skepticism.

But then, it’s common that having been debunked of religion, we transfer that same commitment to something else – maybe magical thinking or some other unverifiable belief system. Turns out there’s a neurological reason for that: the neural pathways that ran our old belief system are still there, so we just load them with new content:

“For many years I believed in both creationism, with a God whose hand I could shake, and evolution, a cold, scientific world that cared nothing about me. Because when we lose faith, our brain’s preexisting belief networks don’t dissolve. They’re updated, like a wardrobe. ‘Even if someone abandons or converts [religions], it’s not like they’re throwing out all the clothes they own and now buying a whole new set,’ says Jordan Grafman, director of brain injury research at the Shirley Ryan AbilityLab and a professor at Northwestern University. ‘You pick and choose what you leave and what you keep.’

“New beliefs join the same neurological framework as old ones. It’s even possible that an existing belief network paves the way for additional beliefs. [Another researcher] has found that kids who believe in fantastical beings are more likely to believe in new ones invented by researchers. “I think it’s because they already have this network that [the new belief] kind of fits into,” she explains. Sometimes the new beliefs resemble the old ones; sometimes they don’t.

“Most non-religious people are ‘passionately committed to some ideology or other,’ explains Patrick McNamara, a neurology professor at Boston University School of Medicine. These passions function neurologically as ‘faux religions.’”[20]

And then, having been newly converted to our new faux religion, we’re set up for another eventual round of debunking.

Meet the new boss.

Same as the old boss.

[1] Here’s the original music video of Won’t Get Fooled Again. Watching it draws you all the way back into the turbulent, polarizing ’60s — if you remember them, that is — and the tone feels eerily similar to what we’re living with today. By the way, who said, “If you remember the ’60s, you really weren’t there”? Find out here.

[2] Ogden, Emily, Debunking Debunked, Aeon (Aug. 12, 2019). Ms. Ogden’s Aeon bio says she is “an associate professor of English at the University of Virginia, and an author whose work has appeared in Critical Inquiry, The New York Times and American Literature, among others. Her latest book is Credulity: A Cultural History of US Mesmerism (2018).” All quotes in this section are from this article.


[3] This social convention has been around a long time: like the Bible (something else we might like to debunk) says, “There is a time for everything under heaven … a time to keep silence, and a time to speak.”   Ecclesiastes 3: 7

[4] “This Article Won’t Change Your Mind,” The Atlantic (March 2017).

[5] “Why Facts Don’t Change Our Minds,” The New Yorker (Feb. 27, 2017).

[6] Encyclopedia Britannica.

[7] See Volck, Brian, The Convert’s Zeal, Image Journal (Aug. 22, 2019). See also this Pew Center report.

[8] “7 Ideas We Really Need to Stop Believing,” Psychology Today (May 8, 2012).

[9] Mr. Hutson’s list is based on “a wealth of psychological evidence,” while mine comes from my own anecdotal judgment that magical thinking has led to all kinds of delusional decisions and disasters in my life. The irony of using my own subjective perspective to debunk my own life doesn’t escape me – it ranks right in there with The Who’s resorting to prayer in the hope they won’t be fooled again.

[10] Doubting death: how our brains shield us from mortal truth, The Guardian (Oct. 19, 2019).

[11] Like The Boy Who Came Back from Heaven, by Alex Malarkey. Yes, that’s his real name.

[12] The Science of Miracles, Medium (Feb. 7, 2019).

[13] Wikipedia.

[14] Wikipedia.

[15] What Atheists Can Learn From The Gay Rights Movement, The Washington Post (Apr. 3, 2013). Coming out as atheist is even trickier if you’re in the public eye: ‘I Prefer Non-Religious’: Why So Few US Politicians Come Out As Atheists, The Guardian (Aug. 3, 2019); The Last Taboo: It’s harder in America to come out as an atheist politician than a gay one. Why? Politico Magazine (Dec. 9, 2013)

[16] Such as Andrew L. Seidel, an “out-of-the-closet atheist” and author of The Founding Myth: Why Christian Nationalism Is Un-American (2019).

[17] Beaton, Caroline, “What Happens to Your Brain When You Stop Believing in God: It’s like going off a drug,” Vice (Mar. 28, 2017).

[18] The Acts of the Apostles 9: 1-9.

[19] Beaton, op. cit.

[20] Ibid.

A Talk at the Rock: How to Instantly Polarize a Crowd and End a Discussion

Areopagus (image from Wikipedia)

The Areopagus is a large rock outcropping in Athens, not far from the Acropolis, where in ancient times various legal, economic, and religious issues got a hearing. A Bible story about something that happened there two thousand years ago provides surprising insight on today’s hyper-polarized world.

Backstory:  A Dualistic Worldview

In the 17th Century, Frenchman René Descartes sorted reality into two categories: (1) the natural, physical world and (2) the unseen world of ideas, feelings, and beliefs. This duality was born of the times:

“Toward the end of the Renaissance period, a radical epistemological and metaphysical shift overcame the Western psyche. The advances of Nicolaus Copernicus, Galileo Galilei and Francis Bacon posed a serious problem for Christian dogma and its dominion over the natural world.

“In the 17th century, René Descartes’s dualism of matter and mind was an ingenious solution to the problem this created. ‘The ideas’ that had hitherto been understood as inhering in nature as ‘God’s thoughts’ were rescued from the advancing army of empirical science and withdrawn into the safety of a separate domain, ‘the mind’.

“On the one hand, this maintained a dimension proper to God, and on the other, served to ‘make the intellectual world safe for Copernicus and Galileo’, as the American philosopher Richard Rorty put it in Philosophy and the Mirror of Nature (1979).

“In one fell swoop, God’s substance-divinity was protected, while empirical science was given reign over nature-as-mechanism – something ungodly and therefore free game.”[1]

Descartes articulated this dualistic framework, but it had been around since prehistoric times. It still persists today, and neurological research suggests the human brain comes pre-wired for it. This is from Psychology Today[2]:

“Recent research suggests that our brains may be pre-wired for dichotomized thinking. That’s a fancy name for thinking and perceiving in terms of two – and only two – opposing possibilities.

“Neurologists explored the activity of certain key regions of the human forebrain – the frontal lobe – trying to understand how the brain switches between tasks. Scientists generally accept the idea that the brain can only consciously manage one task at a time….

“However, some researchers are now suggesting that our brains can keep tabs on two tasks at a time, by sending each one to a different side of the brain. Apparently, we toggle back and forth, with one task being primary and the other on standby.

“Add a third task, however, and one of the others has to drop off the to-do list. Scans of brain activity during this task switching have led to the hypothesis that the brain actually likes handling things in pairs. Indeed, the brain itself is subdivided into two distinct half-brains, or hemispheres.

“Some researchers are now extending this reasoning to suggest that the brain has a built-in tendency, when confronted by complex propositions, to selfishly reduce the set of choices to just two.

“The popular vocabulary routinely signals this dichotomizing mental habit: ‘Are you with us, or against us?’ ‘If you’re not part of the solution, you’re part of the problem.’

“These research findings might help explain how and why the public discourse of our culture has become so polarized and rancorous, and how we might be able to replace it with a more intelligent conversation.

“One of our popular clichés is ‘Well, there are two sides to every story.’ Why only two? Maybe the less sophisticated and less rational members of our society are caught up in duplex thinking, because the combination of a polarized brain and unexamined emotional reflexes keep them there.”

“Less sophisticated and less rational” … the author’s ideological bias is showing, but the “unexamined emotional reflexes” finger points at both ends of the polarized spectrum. And because our brains love the status quo and resist change, we hunker down on our assumptions and biases. True, the balance can shift gradually over time — the way objectivity ascended during the 18th Century’s Age of Enlightenment, only to have Romanticism push back in the 19th — but usually it takes something drastic, like disruptive innovation, tragedy, or violence, to knock us off our equilibrium. Absent that, we’re usually not up for the examination required to separate what we objectively know from what we subjectively believe — it’s all just reality, and as long as it’s working, we’re good. If we’re forced to examine and adjust, we’ll most likely take our cues from our cultural context:

“Each of us conducts our lives according to a set of assumptions about how things work: how our society functions, its relationship with the natural world, what’s valuable, and what’s possible. This is our worldview, which often remains unquestioned and unstated but is deeply felt and underlies many of the choices we make in our lives. We form our worldview implicitly as we grow up, from our family, friends, and culture, and, once it’s set, we’re barely aware of it unless we’re presented with a different worldview for comparison. The unconscious origin of our worldview makes it quite inflexible.

“There is [a] potent force shaping the particular patterns we perceive around us. It’s what anthropologists call culture. Just as language shapes the perception of an infant as she listens to the patterns of sounds around her, so the mythic patterns of thought informing the culture a child is born into will literally shape how that child constructs meaning in the world. Every culture holds its own worldview: a complex and comprehensive model of how the universe works and how to act within it. This network of beliefs and values determines the way in which each child in that culture makes sense of the universe.”[3]

Culture has been sculpting the human brain ever since our earliest ancestors began living complex social lives millions of years ago. It’s only when the cultural balance runs off the rails that our brains scramble to reset, and we’re stressed while they’re at it. We would do well not to wait until then, but instead to learn how to embrace both ends of the dualistic spectrum, argues one computational biologist[4]:

“Neuroscience was part of the dinner conversation in my family, often a prerequisite for truth. Want to talk about art? Not without neuroscience. Interested in justice? You can’t judge someone’s sanity without parsing scans of the brain. But though science helps us refine our thinking, we’re hindered by its limits: outside of mathematics, after all, no view of reality can achieve absolute certainty. Progress creates the illusion that we are moving toward deeper knowledge when, in fact, imperfect theories constantly lead us astray.

“The conflict is relevant in this age of anti-science, with far-Right activists questioning climate change, evolution and other current finds. In his book Enlightenment Now (2018), Steven Pinker describes a second assault on science from within mainstream scholarship and the arts. But is that really bad? Nineteenth-century Romanticism was the first movement to take on the Enlightenment – and we still see its effects in such areas as environmentalism, asceticism and the ethical exercise of conscience.

“In our new era of Enlightenment, we need Romanticism again. In his speech ‘Politics and Conscience’ (1984), the Czech dissident Václav Havel, discussing factories and smokestacks on the horizon, explained just why: ‘People thought they could explain and conquer nature – yet … they destroyed it and disinherited themselves from it.’ Havel was not against industry, he was just for labour relations and protection of the environment.

“The issues persist. From use of GMO seeds and aquaculture to assert control over the food chain to military strategies for gene-engineering bioweapons, power is asserted through patents and financial control over basic aspects of life. The French philosopher Michel Foucault in The Will to Knowledge (1976) referred to such advancements as ‘techniques for achieving the subjugation of bodies and the control of populations’. With winners and losers in the new arena, it only makes sense that some folks are going to push back.

“We are now on the verge of a new revolution in control over life through the gene-editing tool Crispr-Cas9, which has given us the ability to tinker with the colour of butterfly wings and alter the heritable genetic code of humans. In this uncharted territory, where ethical issues are rife, we can get blindsided by sinking too much of our faith into science, and losing our sense of humanity or belief in human rights.

“Science should inform values such as vaccine and climate policy, but it must not determine all values…. With science becoming a brutal game of market forces and patent controls, the skeptics and Romantics among us must weigh in, and we already are.”

That’s probably good advice, but we need to push through a lot of cultural status quo to get there. That’s especially true because the 20th Century brought us change at ever-accelerating rates — objective reality went spinning away and we crashed into the extreme belief end of the spectrum:

“Each of us is on a spectrum somewhere between the poles of rational and irrational. We all have hunches we can’t prove and superstitions that make no sense. What’s problematic is going overboard — letting the subjective entirely override the objective; thinking and acting as if opinions and feelings are just as true as facts.

“The American experiment, the original embodiment of the great Enlightenment idea of intellectual freedom, whereby every individual is welcome to believe anything she wishes, has metastasized out of control. In America nowadays, those more exciting parts of the Enlightenment idea have swamped the sober, rational, empirical parts.

“Little by little for centuries, then more and more and faster and faster during the past half century, we Americans have given ourselves over to all kinds of magical thinking, anything-goes relativism, and belief in fanciful explanation—small and large fantasies that console or thrill or terrify us. And most of us haven’t realized how far-reaching our strange new normal has become.”[5]

When we can agree that our conflict is a matter of my data vs. yours, we can debate rationally. But when it’s my beliefs vs. yours, what used to be discourse dissolves into stonewalling and shouting. Belief seeks its own perfection by eliminating doubt, and therefore devolves into fundamentalism, where discussion is itself a sign of doubt, punishable as heresy. Fundamentalism can be secular or religious – it’s the dynamic, not the content, that matters:

“Fundamentalism is a mind-set. The iconography and language it employs can be either religious or secular or both, but because it dismisses all alternative viewpoints as inferior and unworthy of consideration it is anti-thought. This is part of its attraction. It fills a human desire for self-importance, for hope and the dream of finally attaining paradise. It creates a binary world of absolutes, of good and evil. It provides a comforting emotional certitude. It is used to elevate our cultural, social, and economic systems above others. It is used to justify imperial hubris, war, intolerance and repression as a regrettable necessity in the march of human progress. The fundamentalist murders, plunders and subjugates in the name of humankind’s most exalted ideals. Those who oppose the fundamentalists are dismissed as savages, condemned as lesser breeds of human beings, miscreants led astray by Satan or on the wrong side of Western civilization. The nation is endowed with power and military prowess, fundamentalists argue, because God or our higher form of civilization makes us superior. It is our right to dominate and rule. The core belief systems of these secular and religious antagonists are identical. They are utopian. They will lead us out of the wilderness to the land of milk and honey.”[6]

Fundamentalism is where the open mind goes into lockdown. Objectivity loses its grip, and the question “Are you with us, or against us?” gives way to its declarative version, “If you’re not with us, you’re against us.”[7] Dualistic thinking ceases to be merely a source of “popular clichés” and becomes instead a rigid disincentive to public discourse, as competing polarized beliefs dig in for a grinding, maddening war of attrition. What used to be public discourse is lost in a no-man’s-land of intellectual wreckage created by each side’s incessant lobbing of ideological bombs at the other’s entrenched subjective positions. Each side is convinced it has a God’s-eye view of reality, therefore God is on its side, which motivates securing its position by all necessary means.

A Talk at the Rock

The Christian scriptures illustrate how all this works in a story from one of the Apostle Paul’s missionary journeys.

“Now while Paul was… at Athens, his spirit was provoked within him as he saw that the city was full of idols. So he reasoned in the synagogue with the Jews and the devout persons, and in the marketplace every day with those who happened to be there. Some of the Epicurean and Stoic philosophers also conversed with him. And some said, ‘What does this babbler wish to say?’ Others said, ‘He seems to be a preacher of foreign divinities’—because he was preaching Jesus and the resurrection. And they took him and brought him to the Areopagus, saying, ‘May we know what this new teaching is that you are presenting? For you bring some strange things to our ears. We wish to know therefore what these things mean.’”[8]

The Epicureans and Stoics were the materialists of their day – their thinking leaned toward the objective side of the dualism. When Paul came to town advocating ideas (the subjective end of the dualism), their brain patterning couldn’t process Paul’s worldview. They needed time, so they invited Paul to a Talk at the Rock (the Areopagus).

At this point, the author of the story — widely believed to be the same “Luke the beloved physician”[9] who wrote the Gospel of Luke — inserts a biased editorial comment signaling that nothing’s going to come of this, because “all the Athenians and the foreigners who lived there would spend their time in nothing except telling or hearing something new.”[10] I.e., reasonable consideration — public discourse — was going to be a waste of time. But Paul had prepared some culturally sensitive opening remarks:

“So Paul, standing in the midst of the Areopagus, said: ‘Men of Athens, I perceive that in every way you are very religious. For as I passed along and observed the objects of your worship, I found also an altar with this inscription: To the unknown god. What therefore you worship as unknown, this I proclaim to you.’”

He then offers up the idea of substituting his ‘foreign god’ for the Athenians’ statuary, altars, and temples:

“The God who made the world and everything in it, being Lord of heaven and earth, does not live in temples made by man, nor is he served by human hands, as though he needed anything, since he himself gives to all mankind life and breath and everything. And he made from one man every nation of mankind to live on all the face of the earth, having determined allotted periods and the boundaries of their dwelling place, that they should seek God, and perhaps feel their way toward him and find him.”

You can sense the crowd’s restless murmuring and shuffling feet, but then Paul goes back to cultural bridge-building:

“Yet he is actually not far from each one of us, for ‘In him we live and move and have our being’ [referring to a passage from Epimenides of Crete], and as even some of your own poets have said, ‘For we are indeed his offspring’ [from Aratus’s poem Phainomena].”

Nice recovery, Paul. So far so good. This feels like discourse, which is what the Rock is for. But Paul believes that the Athenians’ practice of blending the unseen world of their gods with their physical craftsmanship of statuary, altars, and temples (a practice the church would later perfect) is idolatry, and in his religious culture back home, idolatry had been on the outs since the Golden Calf.[11] At this point, Paul takes off the cultural kid gloves and goes fundamentalist:

“Being then God’s offspring, we ought not to think that the divine being is like gold or silver or stone, an image formed by the art and imagination of man. The times of ignorance God overlooked, but now he commands all people everywhere to repent, because he has fixed a day on which he will judge the world in righteousness by a man whom he has appointed; and of this he has given assurance to all by raising him from the dead.”

That’s precisely the point where he loses the crowd — well, most of it; some were willing to give him another shot, and there were even a couple of fresh converts:

“Now when they heard of the resurrection of the dead, some mocked. But others said, ‘We will hear you again about this.’ So Paul went out from their midst. But some men joined him and believed, among whom also were Dionysius the Areopagite and a woman named Damaris and others with them.”

“Some men joined him and believed….” That’s all there was left for them to do: believe or not believe. You’re either with us or against us.

Paul had violated the cultural ethics of a Talk at the Rock. It was supposed to be reasonable discourse; he made it a matter of belief, saying in effect, “Forget your social customs and ethics; my God is going to hurt you if you keep this up.” With that, the conclave became irretrievably polarized, and the session was over.

Paul triggered this cultural dynamic constantly on his journeys – for example, a few years later, when the Ephesus idol-building guild figured out the economic implications of Paul’s belief system[12]:

“About that time there arose no little disturbance concerning the Way.  For a man named Demetrius, a silversmith, who made silver shrines of Artemis, brought no little business to the craftsmen. These he gathered together, with the workmen in similar trades, and said, ‘Men, you know that from this business we have our wealth. And you see and hear that not only in Ephesus but in almost all of Asia this Paul has persuaded and turned away a great many people, saying that gods made with hands are not gods. And there is danger not only that this trade of ours may come into disrepute but also that the temple of the great goddess Artemis may be counted as nothing, and that she may even be deposed from her magnificence, she whom all Asia and the world worship.’ When they heard this they were enraged and were crying out, ‘Great is Artemis of the Ephesians!’”

Jesus had previously taken a whip to the merchants in the Temple in Jerusalem.[13] Apparently Demetrius and his fellow craftsmen saw the same thing coming to them, and made a preemptive strike. The scene quickly spiraled out of control:

“So the city was filled with the confusion, and they rushed together into the theater, dragging with them Gaius and Aristarchus, Macedonians who were Paul’s companions in travel.  But when Paul wished to go in among the crowd, the disciples would not let him. And even some of the Asiarchs, who were friends of his, sent to him and were urging him not to venture into the theater. Now some cried out one thing, some another, for the assembly was in confusion, and most of them did not know why they had come together.”

A local official finally quelled the riot:

“Some of the crowd prompted Alexander, whom the Jews had put forward. And Alexander, motioning with his hand, wanted to make a defense to the crowd. But when they recognized that he was a Jew, for about two hours they all cried out with one voice, ‘Great is Artemis of the Ephesians!’

“And when the town clerk had quieted the crowd, he said, ‘Men of Ephesus, who is there who does not know that the city of the Ephesians is temple keeper of the great Artemis, and of the sacred stone that fell from the sky? Seeing then that these things cannot be denied, you ought to be quiet and do nothing rash. For you have brought these men here who are neither sacrilegious nor blasphemers of our goddess. If therefore Demetrius and the craftsmen with him have a complaint against anyone, the courts are open, and there are proconsuls. Let them bring charges against one another. But if you seek anything further, it shall be settled in the regular assembly. For we really are in danger of being charged with rioting today, since there is no cause that we can give to justify this commotion.’ And when he had said these things, he dismissed the assembly.”[14]

It Still Happens Today

I spent years in the evangelical church – we were fundamentalists, but didn’t want to admit it – where Paul’s Talk at the Rock was held up as an example of how not to “share your faith.” Forget the public discourse — you can’t just “spend [your] time in nothing except telling or hearing something new”; you need to lay the truth on them so they can believe or not believe, and if they don’t, you need to “shake the dust off your feet”[15] and get out of there. These days, we see both secular and religious cultural institutions following that advice.

Will we ever learn?

[1] “How The Dualism Of Descartes Ruined Our Mental Health,” Medium (May 10, 2019)

[2] Karl Albrecht, “The Tyranny of Two,” Psychology Today (Aug 18, 2010)

[3] Jeremy Lent, The Patterning Instinct: A Cultural History of Humanity’s Search for Meaning (2017)

[4] Jim Kozubek, “The Enlightenment Rationality Is Not Enough: We Need A New Romanticism,” Aeon (Apr. 18, 2018)

[5] Kurt Andersen, Fantasyland: How America Went Haywire, a 500-Year History (2017)

[6] Chris Hedges, I Don’t Believe in Atheists: The Dangerous Rise of the Secular Fundamentalist (2008)

[7] The latter came from Jesus himself – see the Gospels of Matthew 12: 30 and Luke 11: 23. Jesus was a belief man through and through. More on that another time.

[8] The Acts of the Apostles 17: 17-20.

[9] Paul’s letter to the Colossians 4: 14.

[10] Acts 17: 21.

[11] Exodus 32.

[12] Acts 19: 23-41

[13] Matthew 21: 12-17; John 2: 13-21

[14] Acts 19: 33-41

[15] Matthew 10:14.

Belief in Belief

ya gotta believe

New York Mets fans at the 1973 World Series
(they lost)

The quest to resolve the hard problem of consciousness needs a boost from quantum mechanics to get any further. Either that, or there needs to be a better way to state the issue. As things stand, neuroscience’s inability to locate subjectivity in our brain matter gives the pro-subjectivity camp license to cite quantum mechanics as its go-to scientific justification.

The $12 billion self-help industry and its coaches, speakers, and authors love quantum mechanics: if subjectivity works at the subatomic level, the argument goes, then why not apply it at the macro, conscious level? Meanwhile, quantum scientists seem to have resigned themselves to the notion that, if their theories don’t have to be grounded in traditional objective standards like empirical testing and falsifiability, then why not hypothesize about multiverses and call that science?

Thus scientific rationalism continues to be on the wane — in science and as a way of life — especially in the USA, where belief in belief has been an ever-expanding feature of the American Way since we got started. To get the full perspective on America’s belief in belief, you need to read Kurt Andersen’s book, Fantasyland: How America Went Haywire, a 500-Year History (2017), which I quoted at length last time. (Or for the short version, see this Atlantic article.) The book provides a lot of history we never learned, but also reveals that the roots of our belief in belief go back even further than our own founding, and beyond our own shores. Although we weren’t founded as a Christian nation[1] (in the same way, for example, that Pakistan was expressly founded as a Muslim nation), Andersen traces this aspect of our ideological foundations to the Protestant Reformation:

“[Luther] insisted that clergymen have no special access to God or Jesus or truth. Everything a Christian needed to know was in the Bible. So every individual Christian believer could and should read and interpret Scripture for himself or herself. Every believer, Protestants said, was now a priest.

“Apart from devolving religious power to ordinary people — that is, critically expanding individual liberty — Luther’s other big idea was that belief in the Bible’s supernatural stories, especially those concerning Jesus, was the only prerequisite for being a Christian. You couldn’t earn your way into Heaven by performing virtuous deeds. Having a particular set of beliefs was all that mattered.

“However, out of the new Protestant religion, a new proto-American attitude emerged during the 1500s. Millions of ordinary people decided that they, each of them, had the right to decide what was true or untrue, regardless of what fancy experts said. And furthermore, they believed, passionate fantastical belief was the key to everything. The footings for Fantasyland had been cast.”

But even the Protestant Reformation isn’t far enough back. Luther’s insistence that anybody can get all the truth they need from the Bible is the Christian doctrine of sola scriptura, which holds that the Bible is the ultimate source of truth. And the Bible is where we find the original endorsement of the primacy of belief, in the teachings of none other than Jesus himself:

“Truly, I say to you, whoever says to this mountain, ‘Be taken up and thrown into the sea,’ and does not doubt in his heart,  but believes that what he says will come to pass, it will be done for him.”

Mark 11:23 (ESV)

Thus, the Christian rationale for belief in belief goes something like this:

  • “We believe the Bible tells the truth;
  • “The Bible says Jesus was God incarnate;
  • “God knows what’s true;
  • “Jesus, as God, spoke truth;
  • “Therefore, what Jesus said about belief is true.”

The rationale begins and ends in belief. Belief is a closed loop — you either buy it by believing, or you don’t. And if you believe, you don’t doubt or question, because if you do, belief won’t work for you, and it will be your own fault — you’ll be guilty of doubting in your heart or some other kind of sabotage. For example,

“If any of you lacks wisdom, let him ask God, who gives generously to all without reproach, and it will be given him. But let him ask in faith, with no doubting, for the one who doubts is like a wave of the sea that is driven and tossed by the wind. For that person must not suppose that he will receive anything from the Lord; he is a double-minded man, unstable in all his ways.”

James 1:5-8 (ESV)

Thus belief disposes of every criticism against it. You’re either in or out, either with us or against us. Or, as a friend of mine used to say, “The Bible says it, I believe it, and that settles it!” And if your doubts persist, there are consequences. When I expressed some of mine back in college, the same friend handed me a Bible and said, “Read Luke 9: 62.”

“Jesus said to him, ‘No one who puts his hand to the plow and looks back is fit for the kingdom of God.’”

Luke 9: 62  (ESV)

End of discussion.

But not here, not in this blog. Here, our mission is to challenge cherished beliefs and institutions. Here, we’ll look more into what it means to believe in belief, and consider other options. In the meantime, we’ll set aside the hard problem of consciousness while we wait for further developments.

For more on today’s topic, you might take a look at Should We Believe In Belief? (The Guardian, July 17, 2009), and be sure to click the links at the end and read those pieces, too. All the articles are short and instructive.

[1] For a detailed consideration (and ultimate refutation) of the claim that America was founded as a Christian nation, see The Founding Myth, by Andrew L. Seidel (2019).

How Impossible Becomes Possible (2)

While objective, scientific knowledge scrambles to explain consciousness in purely biological terms (“the meat thinks”), subjective belief enjoys cultural and scientific predominance. And no wonder — the allure of subjectivity is freedom and power:  if scientists can control the outcome of their quantum mechanics lab work by what they believe, then surely the rest of us can also believe the results we want into existence. In fact, isn’t it true that we create our own reality, either consciously or not? If so, then consciously is better, because that way we’ll get what we intend instead of something mashed together by our shady, suspect subconscious. And the good news is, we can learn and practice conscious creation. Put that to work, and we can do and have and be whatever we want! Nothing is impossible for us!

I.e., belief in belief is the apex of human consciousness and self-efficacy: it’s what makes the impossible possible. At least, that’s the self-help gospel, which also has deep roots in the New Testament. We’ll be looking deeper into both.

The Music Man lampooned belief in belief as practiced by con man Harold Hill’s “think method.” The show came out in 1957. Five years before, the Reverend Norman Vincent Peale had published The Power of Positive Thinking, and twenty years before that, Napoleon Hill had published Think and Grow Rich, in which he penned its most-quoted aphorism: “Whatever your mind can conceive and believe, it can achieve.”

Americans in particular have had an enduring allegiance to belief in belief, ever since we got started 500 years ago. Since then, we’ve taken it to ever-increasing extremes:

“The American experiment, the original embodiment of the great Enlightenment idea of intellectual freedom, whereby every individual is welcome to believe anything she wishes, has metastasized out of control. In America nowadays, those more exciting parts of the Enlightenment idea have swamped the sober, rational, empirical parts. Little by little for centuries, then more and more and faster and faster during the past half century, we Americans have given ourselves over to all kinds of magical thinking, anything-goes relativism, and belief in fanciful explanation.

“Why are we like this?

“The short answer is because we’re Americans—because being American means we can believe anything we want; that our beliefs are equal or superior to anyone else’s, experts be damned. Once people commit to that approach, the world turns inside out, and no cause-and-effect connection is fixed. The credible becomes incredible and the incredible credible.

“America was created by true believers and passionate dreamers, and by hucksters and their suckers, which made America successful—but also by a people uniquely susceptible to fantasy, as epitomized by everything from Salem’s hunting witches to Joseph Smith’s creating Mormonism, from P. T. Barnum to speaking in tongues, from Hollywood to Scientology to conspiracy theories, from Walt Disney to Billy Graham to Ronald Reagan to Oprah Winfrey to Trump. In other words: Mix epic individualism with extreme religion; mix show business with everything else; let all that ferment for a few centuries; then run it through the anything-goes ’60s and the internet age. The result is the America we inhabit today.

Fantasyland

Belief in belief soared to new heights in mega-bestseller The Secret:

“The Secret takes the American fundamentals, individualism and supernaturalism and belief in belief, and strips away the middlemen and most of the pious packaging…. What’s left is a “law of attraction,” and if you just crave anything hard enough, it will become yours. Belief is all. The Secret’s extreme version of magical thinking goes far beyond its predecessors’. It is staggering. A parody would be almost impossible. It was number one on the Times’s nonfiction list for three years and sold around twenty million copies.”

Fantasyland: How America Went Haywire, a 500-Year History, Kurt Andersen (2017)

American culture’s embrace of belief in belief was supercharged in its earliest days by the Puritans, about whom Kurt Andersen concludes, “In other words, America was founded by a nutty religious cult.” Maybe that’s why The Secret distanced itself from those Christian moorings:

“The closest antecedent to The Secret was The Power of Positive Thinking in the 1950s, back when a mega-bestselling guide to supernatural success still needed an explicit tether to Christianity.

“In The Secret, on the other hand, Rhonda Byrne mentions Jesus only once, as the founder of the prosperity gospel. All the major biblical heroes, including Christ, she claims, ‘were not only prosperity teachers, but also millionaires themselves, with more affluent lifestyles than many present-day millionaires could conceive of.’”

Fantasyland

The Secret also stakes its claim on the side of subjective science:

“‘There isn’t a single thing you cannot do with this knowledge,’ the book promises. ‘It doesn’t matter who you are or where you are. The Secret can give you whatever you want.’ Because it’s ‘a scientific fact.’”

Fantasyland

But The Secret is just one example of the subjective good news. Believe it into existence — that’s how the impossible is done, American, self-help, Christian, subjective-science style. Never mind the objective, empirically verified, scientific “adjacent possible” approach we looked at last time — that’s just too stuffy, too intellectual. Belief in belief is much more inspiring, more of a joyride.

And that’s a problem.

More next time.

How Impossible Becomes Possible


Scientific materialism explains a lot about how the brain creates consciousness, but hasn’t yet fully accounted for subjective awareness. As a result, the “hard problem” of consciousness remains unsolved, and we’re alternately urged to either concede that the human brain just isn’t ready to figure itself out, or conclude that reality is ultimately determined subjectively.

Princeton psychology and neuroscience professor Michael S. A. Graziano isn’t ready to do either. He thinks the “hard problem” label is itself the problem, because it cuts off further inquiry:

“Many thinkers are pessimistic about ever finding an explanation of consciousness. The philosopher Chalmers, in 1995, put it in a way that has become particularly popular. He suggested that the challenge of explaining consciousness can be divided into two problems. One, the easy problem, is to explain how the brain computes and stores information. Calling this problem easy is, of course, a euphemism. What is meant is something more like the technically possible problem, given a lot of scientific work.

“In contrast, the hard problem is to explain how we become aware of all that stuff going on in the brain. Awareness itself, the essence of awareness, because it is presumed to be nonphysical, because it is by definition private, seems to be scientifically unapproachable. Again, calling it the hard problem is a euphemism, it is the impossible problem.

“The hard-problem view has a pinch of defeatism in it. I suspect that for some people it also has a pinch of religiosity. It is a keep-your-scientific-hands-off-my-mystery perspective. In the hard problem view, rather than try to explain consciousness, we should marvel at its insolubility. We have no choice but to accept it as a mystery.

“One conceptual difficulty with the hard-problem view is that it argues against any explanation of consciousness without knowing what explanations might arise. It is difficult to make a cogent argument against the unknown. Perhaps an explanation exists such that, once we see what it is, once we understand it, we will find that it makes sense and accounts for consciousness.”

Consciousness and the Social Brain, Michael S. A. Graziano (2013)

I.e., if science is going to explain consciousness, it needs to reframe its inquiry, so that what is now an “impossible,” “scientifically unapproachable” problem becomes a “technically possible problem” that can be solved “given a lot of scientific work.”

Technology and innovation writer Steven Johnson describes how he thinks the impossible becomes possible in Where Good Ideas Come From — available as a TED talk, a book, and an animated whiteboard piece on YouTube. In his TED talk, he contrasted popular subjective notions with what neuroscience has discovered about how the brain actually works:

“[We] have to do away with a lot of the way in which our conventional metaphors and language steer us towards certain concepts of idea-creation. We have this very rich vocabulary to describe moments of inspiration. We have … the flash of insight, the stroke of insight, we have epiphanies, we have ‘eureka!’ moments, we have the lightbulb moments… All of these concepts, as kind of rhetorically florid as they are, share this basic assumption, which is that an idea is a single thing, it’s something that happens often in a wonderful illuminating moment.

“But in fact, what I would argue is … that an idea is a network on the most elemental level. I mean, this is what is happening inside your brain. An idea — a new idea — is a new network of neurons firing in sync with each other inside your brain. It’s a new configuration that has never formed before. And the question is, how do you get your brain into environments where these new networks are going to be more likely to form?”

Johnson expands on the work of biologist and complex systems researcher Stuart Kauffman, who dubbed this idea the “adjacent possibility.” Adjacent possibility is where the brain’s neural networks (top picture above) meet data networks (the bottom picture):  neither is a static, closed environment; both are dynamic, constantly shifting and re-organizing, with each node representing a new point from which the network can expand. Thus the shift from unknown to known is always a next step away:

“The adjacent possible is a kind of shadow future, hovering on the edges of the present state of things, a map of all the ways in which the present can reinvent itself.”

Vittorio Loreto and his colleagues at Sapienza University of Rome turned adjacent possibility into a mathematical model, which they then submitted to objective, empirical, real-world testing. As he said in his TED talk:

“Experiencing the new means exploring a very peculiar space, the space of what could be, the space of the possible, the space of possibilities.

“We conceived our mathematical formulation for the adjacent possible, 20 years after the original Kauffman proposals.

“We had to work out this theory, and we came up with a certain number of predictions to be tested in real life.”

Their test results suggest that adjacent possibility is good science — that impossible doesn’t step out of the ether, it waits at the edge of expanding neural networks, ready to become possible.[1] As Steven Johnson said above, that’s a far cry from our popular romantic notions of revelations, big ideas, and flashes of brilliance. We look more at those next time.
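Loreto’s formulation belongs to a family of urn models with “triggering.” As a rough illustration — this is my own minimal sketch, not the group’s actual code, and the parameters rho and nu are names I’ve chosen for the reinforcement and triggering settings typical of such models — each novelty drawn from the urn unlocks a handful of brand-new possibilities, and the number of distinct things ever seen keeps growing, but sublinearly, the way real innovation records do:

```python
import random

def urn_with_triggering(steps, rho=4, nu=3, seed=1):
    """Simulate a Polya-style urn with innovation triggering.

    Each draw reinforces the drawn color (rho extra copies go back in);
    the first time a color is ever drawn, nu brand-new colors enter the
    urn -- the 'adjacent possible' expanding one step beyond the known.
    Returns the count of distinct colors seen after each step.
    """
    rng = random.Random(seed)
    next_color = nu + 1
    urn = list(range(next_color))      # the initial palette of known colors
    seen = set()
    distinct_over_time = []
    for _ in range(steps):
        ball = rng.choice(urn)
        urn.extend([ball] * rho)       # the familiar gets more familiar
        if ball not in seen:           # a novelty: unlock adjacent novelties
            seen.add(ball)
            urn.extend(range(next_color, next_color + nu))
            next_color += nu
        distinct_over_time.append(len(seen))
    return distinct_over_time

growth = urn_with_triggering(5000)
# Novelties never stop arriving, but they slow down (Heaps'-law-like growth):
print(growth[999], growth[4999])
```

The point of the sketch is the mechanism: the possible isn’t fixed in advance, it expands from the edge of the actual, one draw at a time.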

[1] For a nerdier version, see this Wired piece: The ‘Adjacent Possible’ of Big Data: What Evolution Teaches About Insights Generation.

So Consciousness Has a Hard Problem… Now What?

god helmet

We’ve been looking at the “hard problem” of consciousness:

  • Neuroscience can identify the brain circuits that create the elements of consciousness and otherwise parse out how “the meat thinks,” but it can’t quite get its discoveries all the way around the mysteries of subjective experience.
  • That’s a problem because we’re used to thinking along Descartes’ dualistic distinction between scientific knowledge, which is objective, empirical, and invites disproving, and belief-based conviction, which is subjective, can’t be tested and doesn’t want to be.
  • What’s worse, science’s recent work in quantum mechanics, artificial intelligence, and machine learning has blurred those dualistic lines by exposing the primacy of subjectivity even in scientific inquiry.
  • All of which frustrates our evolutionary survival need to know how the world really works.[1]

Some people are ready to declare that subjective belief wins, and science will just have to get over it. That’s what happened with the “God Helmet” (shown in the photo above, taken from this article). Dr. Michael Persinger[2] created the helmet for use in neuro-religious research:

“This is a device that is able to simulate religious experiences by stimulating an individual’s temporoparietal lobes using magnetic fields. ‘If the equipment and the experiment produced the presence that was God, then the extrapersonal, unreachable, and independent characteristics of the god definition might be challenged,’ [says Dr. Persinger].” [3]

The God Helmet creates subjective experiences shared among various religions: sensing a numinous presence; feeling filled with the spirit, overwhelmed, or possessed; being outside of self, out of body, or having died and come back to life; feeling one with all things; peace, awe, fear, dread. Since all of these states have been either measured or induced in the laboratory, you’d think that might dampen allegiance to the belief that they are God-given, but not so. Instead, when the God Helmet was tested on a group of meditating nuns, their conclusion was: how wonderful that God equipped the brain that way, so he could communicate with us. Similarly,

 “Some years ago, I discussed this issue with Father George Coyne, a Jesuit priest and astronomer who was then Director of the Vatican Observatory. I asked him what he thought of the notion that when the 12th‑century Hildegard of Bingen was having her visions of God, perhaps she was having epileptic fits. He had no problem with the fits. Indeed, he thought that when something so powerful was going on in a mind, there would necessarily be neurological correlates. Hildegard might well have been an epileptic, Father Coyne opined; that didn’t mean God wasn’t also talking to her.”

The Mental Block – Consciousness Is The Greatest Mystery In Science, Aeon Magazine (Oct. 9, 2013)

If we’re not willing to concede the primacy of subjectivity, then what? Well, we could give up on the idea that the human race is equipped to figure out everything it would really like to know.

 “It would be poetic – albeit deeply frustrating – were it ultimately to prove that the one thing the human mind is incapable of comprehending is itself. An answer must be out there somewhere. And finding it matters: indeed, one could argue that nothing else could ever matter more – since anything at all that matters, in life, only does so as a consequence of its impact on conscious brains. Yet there’s no reason to assume that our brains will be adequate vessels for the voyage towards that answer. Nor that, were we to stumble on a solution to the Hard Problem, on some distant shore where neuroscience meets philosophy, we would even recognise that we’d found it.”

Why Can’t The World’s Greatest Minds Solve The Mystery Of Consciousness? The Guardian (Jan. 21, 2015)

“Maybe philosophical problems are hard not because they are divine or irreducible or workaday science, but because the mind of Homo sapiens lacks the cognitive equipment to solve them. We are organisms, not angels, and our minds are organs, not pipelines to the truth. Our minds evolved by natural selection to solve problems that were life-and-death matters to our ancestors, not to commune with correctness or to answer any question we are capable of asking. We cannot hold ten thousand words in short-term memory. We cannot see in ultraviolet light. We cannot mentally rotate an object in the fourth dimension. And perhaps we cannot solve conundrums like free will and sentience.”

How the Mind Works, Steven Pinker (1997)

Evolutionary biologist David Barash attributes our inability to solve such problems to the vastly different paces of biological evolution (what the operative biology of our brains can process) and cultural evolution (what we keep learning and inventing and hypothesizing about). Trouble is, the latter moves way too fast for the former to keep up.

“On the one hand, there is our biological evolution, a relatively slow-moving organic process that can never proceed more rapidly than one generation at a time, and that nearly always requires an enormous number of generations for any appreciable effect to arise.

“On the other hand is cultural evolution, a process that is, by contrast, extraordinary in its speed.

“Whereas biological evolution is Darwinian, moving by the gradual substitution and accumulation of genes, cultural evolution is … powered by a nongenetic ‘inheritance’ of acquired characteristics. During a single generation, people have selectively picked up, discarded, manipulated, and transmitted cultural, social, and technological innovations that have become almost entirely independent of any biological moorings.

“We are, via our cultural evolution, in over our biological heads.”

Through a Glass Brightly: Using Science to See Our Species as We Really Are, David P. Barash (2018)

Give in to subjectivity, or just give up…. We’ll look at another option next time.

[1] The study of how we know things is epistemology.

[2] Dr. Persinger was director of the Neuroscience Department at Laurentian University in Ontario, Canada prior to his death in 2018.

[3] “What God Does To Your Brain:  The controversial science of neurotheology aims to find the answer to an age-old question: why do we believe?” The Telegraph (June 20, 2014).

Subjective Science

quantum mechanics formula

What happened to spark all the recent scientific interest in looking for consciousness in the brains of humans and animals, in insects, and … well, everywhere? (Including not just the universe, but also the theoretical biocentric universe and quantum multiverses.)

“It has been said that, if the 20th century was the age of physics, the 21st will be the age of the brain. Among scientists today, consciousness is being hailed as one of the prime intellectual challenges. My interest in the subject is not in any particular solution to the origin of consciousness – I believe we’ll be arguing about that for millennia to come – but rather in the question: why is consciousness perceived as a ‘problem’? How exactly did it become a problem? And given that it was off the table of science for so long, why is it now becoming such a hot research subject?”

I Feel Therefore I Am — How Exactly Did Consciousness Become A Problem? And why, after years off the table, is it a hot research subject now?  Aeon Magazine (Dec. 1, 2015)

From what I can tell, two key sparks started the research fire:  (1) the full implications of quantum mechanics finally set in, and (2) machines learned how to learn.

(1) Quantum Mechanics: Science Goes Subjective. Ever since Descartes set up his dualistic reality a few hundred years ago, we’ve been able to trust that science could give us an objective, detached, rational, factual view of the observable universe, while philosophy and religion could explore the invisible universe where subjectivity reigns. But the handy boundary between the two was torn in the early 20th Century, when quantum mechanics found that subjectivity reigns at the sub-atomic level, where reality depends on what researchers decide ahead of time to look for. Scientists tried for the rest of the 20th Century to restore objectivity to their subatomic lab work, but eventually had to concede.

 “Physicists began to realise that consciousness might after all be critical to their own descriptions of the world. With the advent of quantum mechanics they found that, in order to make sense of what their theories were saying about the subatomic world, they had to posit that the scientist-observer was actively involved in constructing reality.

“At the subatomic level, reality appeared to be a subjective flow in which objects sometimes behave like particles and other times like waves. Which facet is manifest depends on how the human observer is looking at the situation.

“Such a view appalled many physicists, who fought desperately to find a way out, and for much of the 20th century it still seemed possible to imagine that, somehow, subjectivity could be squeezed out of the frame, leaving a purely objective description of the world.

“In other words, human subjectivity is drawing forth the world.”

I Feel Therefore I Am

(2)  Machines Learned to Learn. Remember “garbage in, garbage out”? It used to be that computers had to be supervised — they only did what we told them to do, and could only use the information we gave them. But not anymore. Now their “minds” are free to sort through the garbage on their own and make up their own rules about what to keep or throw out. Because of this kind of machine learning, we now have computers practicing law and medicine, handling customer service, writing the news, composing music, writing novels and screenplays, creating art…. all those things we used to think needed human judgment and feelings. Google wizard and overall overachiever Sebastian Thrun[1] explains the new machine learning in this conversation with TED Curator Chris Anderson:

 “Artificial intelligence and machine learning is about 60 years old and has not had a great day in its past until recently. And the reason is that today, we have reached a scale of computing and datasets that was necessary to make machines smart. The new thing now is that computers can find their own rules. So instead of an expert deciphering, step by step, a rule for every contingency, what you do now is you give the computer examples and have it infer its own rules.

“20 years ago the computers were as big as a cockroach brain. Now they are powerful enough to really emulate specialized human thinking. And then the computers take advantage of the fact that they can look at much more data than people can.”
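To make Thrun’s “computers infer their own rules” idea concrete, here’s a toy sketch — entirely my own invention, not anything from the interview, and the study-hours scenario is made up for illustration. Instead of an expert hand-writing the rule “students pass if they study at least five hours,” we hand the program labeled examples and let it find the best threshold rule on its own:

```python
def infer_threshold_rule(examples):
    """Infer a simple 'x >= t' rule from labeled examples alone.

    No expert writes the rule; the program tries every observed value
    as a candidate threshold and keeps whichever one best separates
    the True labels from the False ones.
    """
    candidates = sorted(x for x, _ in examples)
    best_t, best_correct = None, -1
    for t in candidates:
        correct = sum((x >= t) == label for x, label in examples)
        if correct > best_correct:
            best_t, best_correct = t, correct
    return best_t

# Labeled examples: (hours of study, passed?) -- invented data
data = [(1, False), (2, False), (3, False), (5, True), (6, True), (8, True)]
rule = infer_threshold_rule(data)
print(f"learned rule: pass if hours >= {rule}")  # learned rule: pass if hours >= 5
```

Scale that same move up to millions of examples and millions of candidate rules, and you have the shape of what Thrun is describing: the rule comes out of the data, not out of the programmer.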

No wonder science got rattled. Like the rest of us, it was comfortable with all the Cartesian dualisms that kept the world neatly sorted out:  science vs. religion,[2] objective vs. subjective, knowledge vs. belief, humanity vs. technology…. But now all these opposites are blurring together in a subjective vortex while non-human intelligence looks on and comments about it.

Brave New World, indeed. How shall we respond to it?

More next time.

[1] Sebastian Thrun’s TED bio describes him as “an educator, entrepreneur and troublemaker. After a long life as a professor at Stanford University, Thrun resigned from tenure to join Google. At Google, he founded Google X, home to self-driving cars and many other moonshot technologies. Thrun also founded Udacity, an online university with worldwide reach, and Kitty Hawk, a ‘flying car’ company. He has authored 11 books, 400 papers, holds 3 doctorates and has won numerous awards.”

[2] For an alternative to the science-religion dualism, see Science + Religion:  The science-versus-religion opposition is a barrier to thought. Each one is a gift, rather than a threat, to the other, Aeon Magazine (Nov. 21, 2019)

 

Zombies and the Consciousness Hard Problem

Left: poster from the 1968 movie Night of the Living Dead. Right: The Walking Dead (https://comicbook.com/thewalkingdead)

Philosophers and psychologists call human traits like feelings, conscience, and self-awareness “qualia,” and believe that, if zombies can lack them but still look and act like us (on a really bad day), then locating consciousness entirely in human biology (“physicalism”) can’t be right.

“Physicalism allows us to imagine a world without consciousness, a ‘Zombie world’ that looks exactly like our own, peopled with beings who act exactly like us but aren’t conscious. Such Zombies have no feelings, emotions or subjective experience; they live lives without qualia. As [philosopher David Chalmers][1] has noted, there is literally nothing it is like to be a Zombie. And if Zombies can exist in the physicalist account of the world, then, according to Chalmers, that account can’t be a complete description of our world, where feelings do exist: something more is needed, beyond the laws of nature, to account for conscious subjective experience.”

I Feel Therefore I Am, Aeon Magazine (Dec. 1, 2015)

To physicalists, says the article, “those are fighting words, and some scientists are fighting back”:

“In the frontline are the neuroscientists who, with increasing frequency, are proposing theories for how subjective experience might emerge from a matrix of neurons and brain chemistry. A slew of books over the past two decades have proffered solutions to the ‘problem’ of consciousness. Among the best known are Christof Koch’s The Quest for Consciousness: A Neurobiological Approach (2004); Giulio Tononi and Gerald Edelman’s A Universe of Consciousness: How Matter Becomes Imagination (2000); Antonio Damasio’s The Feeling of What Happens: Body and Emotion in the Making of Consciousness (1999); and the philosopher Daniel Dennett’s bluntly titled Consciousness Explained (1991).”

Of particular interest in that battery of academic firepower is Daniel Dennett, who has a unique take on Zombies and the consciousness “hard problem”:

“Not everybody agrees there is a Hard Problem to begin with – making the whole debate kickstarted by Chalmers an exercise in pointlessness. Daniel Dennett, the high-profile atheist and professor at Tufts University outside Boston, argues that consciousness, as we think of it, is an illusion: there just isn’t anything in addition to the spongy stuff of the brain, and that spongy stuff doesn’t actually give rise to something called consciousness.

“Common sense may tell us there’s a subjective world of inner experience – but then common sense told us that the sun orbits the Earth, and that the world was flat. Consciousness, according to Dennett’s theory, is like a conjuring trick: the normal functioning of the brain just makes it look as if there is something non-physical going on.

“To look for a real, substantive thing called consciousness, Dennett argues, is as silly as insisting that characters in novels, such as Sherlock Holmes or Harry Potter, must be made up of a peculiar substance named “fictoplasm”; the idea is absurd and unnecessary, since the characters do not exist to begin with.

“This is the point at which the debate tends to collapse into incredulous laughter and head-shaking: neither camp can quite believe what the other is saying. To Dennett’s opponents, he is simply denying the existence of something everyone knows for certain: their inner experience of sights, smells, emotions and the rest. (Chalmers has speculated, largely in jest, that Dennett himself might be a Zombie.)

“More than one critic of Dennett’s most famous book, Consciousness Explained, has joked that its title ought to be Consciousness Explained Away. Dennett’s reply is characteristically breezy: explaining things away, he insists, is exactly what scientists do… However hard it feels to accept, we should concede that consciousness is just the physical brain, doing what brains do.”

Why Can’t The World’s Greatest Minds Solve The Mystery Of Consciousness? The Guardian (Jan. 21, 2015)

Zombies also appear in another current scientific inquiry: whether artificially intelligent machines can be conscious. “Who’s to say machines don’t already have minds?” asks this article.[2] If they do, then “we need a better way to define and test for consciousness,” but formulating one means you “still face what you might call the Zombie problem.” (Oh great — so a machine could be a Zombie too, as if there weren’t enough of them already.)

Suppose you create a test to detect human qualia in machines, and weed out the Zombies, but who’s going to believe it if it comes back positive?

“Suppose a test finds that a thermostat is conscious. If you’re inclined to think a thermostat is conscious, you will feel vindicated. If sentient thermostats strike you as silly, you will reject the verdict. In that case, why bother conducting the test at all?”

Consciousness Creep

And if conscious thermostats aren’t enough to make you “collapse into incredulous laughter and head-shaking,” then how about finding consciousness in … insects? Turns out, they, too, have a Zombie problem, according to this article, co-written by a biologist and a philosopher.[3]

What happened to science that it’s tackling these issues, and with a straight face? I promised last time we’d look into that. We’ll do that next.

[1] As we saw last time, David Chalmers defined the “easy” and “hard” problems of consciousness.

[2] Consciousness Creep:  Our machines could become self-aware without our knowing it. Aeon Magazine, February 25, 2016

[3] Bee-Brained;  Are Insects ‘Philosophical Zombies’ With No Inner Life? Close attention to their behaviours and moods suggests otherwise, Aeon Magazine (Sept. 27, 2018)

The Greatest Unsolved Mystery

sherlock holmes

Academic disciplines take turns being more or less in the public eye — although, as we saw a couple posts back, metaphysicians think their discipline ought to be the perennial front runner. After all, it’s about figuring out “the real nature of things,”[1] and what could be more important than that?

Figuring out the human mind that’s doing the figuring, that’s what![2] Thus neuroscience’s quest to understand human consciousness finds itself at the front of the line as the greatest unsolved scientific mystery of our time.

“Nearly a quarter of a century ago, Daniel Dennett wrote that: ‘Human consciousness is just about the last surviving mystery.’ A few years later, [David] Chalmers added: ‘[It] may be the largest outstanding obstacle in our quest for a scientific understanding of the universe.’ They were right then and, despite the tremendous scientific advances since, they are still right today.

“I think it is possible that, compared with the hard problem [of consciousness], the rest of science is a sideshow. Until we get a grip on our own minds, our grip on anything else could be suspect. The hard problem is still the toughest kid on the block.”

The Mental Block – Consciousness Is The Greatest Mystery In Science, Aeon Magazine Oct. 9, 2013

“Hard problem” is a term of art in the consciousness quest:

“The philosopher [David] Chalmers … suggested that the challenge of explaining consciousness can be divided into two problems.

“One, the easy problem, is to explain how the brain computes and stores information. Calling this problem easy is, of course, a euphemism. What is meant is something more like the technically possible problem given a lot of scientific work.

“In contrast, the hard problem is to explain how we become aware of all that stuff going on in the brain. Awareness itself, the essence of awareness, because it is presumed to be nonphysical, because it is by definition private, seems to be scientifically unapproachable.”

Consciousness and the Social Brain, Michael S. A. Graziano (2013)

Solving the “easy” problem requires objective, empirical inquiry into how our brains are organized and wired, what brain areas and neural circuits process which kinds of experience, how they all share relevant information, etc. Armed with MRIs and other technologies, neuroscience has made great progress on all that. What it can’t seem to get its instruments around is the personal and private subjective interpretation of the brain’s objective processing of experience.

“First coined in 1995 by the Australian philosopher David Chalmers, this ‘hard problem’ of consciousness highlights the distinction between registering and actually feeling a phenomenon. Such feelings are what philosophers refer to as qualia: roughly speaking, the properties by which we classify experiences according to ‘what they are like’. In 2008, the French thinker Michel Bitbol nicely parsed the distinction between feeling and registering by pointing to the difference between the subjective statement ‘I feel hot’, and the objective assertion that ‘The temperature of this room is higher than the boiling point of alcohol’ – a statement that is amenable to test by thermometer.”

I Feel Therefore I Am, Aeon Magazine (Dec. 1, 2015)

Neuroscience does objective just fine, but meets its match with subjective.

“The question of how the brain produces the feeling of subjective experience, the so-called ‘hard problem’, is a conundrum so intractable that one scientist I know refuses even to discuss it at the dinner table. Another, the British psychologist Stuart Sutherland, declared in 1989 that ‘nothing worth reading has been written on it’.”

The Mental Block – Consciousness Is The Greatest Mystery In Science.

Recently though, neuroscience has unleashed new urgency on the hard problem:

“For long periods, it is as if science gives up on the subject in disgust. But the hard problem is back in the news, and a growing number of scientists believe that they have consciousness, if not licked, then at least in their sights.

“A triple barrage of neuroscientific, computational and evolutionary artillery promises to reduce the hard problem to a pile of rubble. Today’s consciousness jockeys talk of p‑zombies and Global Workspace Theory, mirror neurons, ego tunnels, and attention schemata. They bow before that deus ex machina of brain science, the functional magnetic resonance imaging (fMRI) machine.”

The Mental Block – Consciousness Is The Greatest Mystery In Science.

Impressive, but are they making progress? Not so much.

“Their work is frequently very impressive and it explains a lot. All the same, it is reasonable to doubt whether it can ever hope to land a blow on the hard problem.”

The Mental Block – Consciousness Is The Greatest Mystery In Science.

The quest to map and measure the “personalized feeling level” of consciousness has taken researchers to some odd places indeed — as we saw in the video featured last time. Zombies also feature prominently:

“All those tests still face what you might call the zombie problem. How do you know your uncle, let alone your computer, isn’t a pod person – a zombie in the philosophical sense, going through the motions but lacking an internal life? He could look, act, and talk like your uncle, but have no experience of being your uncle. None of us can ever enter another mind, so we can never really know whether anyone’s home.”

Consciousness Creep, Aeon Magazine (Feb. 25, 2016)

More about Zombies and other consciousness conundrums coming up, along with a look at what made consciousness shoot to the top of the unsolved scientific mysteries pile.

[1] Encyclopaedia Britannica

[2] We’ll see later in this series what made illuminating the human mind so critical to science in general, not just neuroscience in particular.

Knowledge, Conviction, and Belief [9]:  Reckoning With Mystery

pontius pilate

“What is truth?”
Pontius Pilate
John 18:38 (NIV)

On the science side of Cartesian dualism, truth must be falsifiable — we have to be able to prove it’s untrue. On the religious side, to falsify is to doubt, doubt becomes heresy, and heresy meets the bad end it deserves.

Neither side likes mystery, because both are trying to satisfy a more primal need:  to know, explain, and be right. It’s a survival skill:  we need to be right about a lot of things to stay alive, and there’s nothing more primal to a mortal being than staying alive. Mystery is nice if you’ve got the time, but at some point it won’t help you eat and avoid being eaten.

Science tackles mysteries with experiments and theories, religion with doctrine and ritual. Both try to nail their truth down to every “jot and tittle,” while mystery bides its time, aloof and unimpressed.

I once heard a street preacher offer his rationale for the existence of God. “Think about how big the universe is,” he said. “It’s too big for me to understand. There has to be a God behind it.” That’s God explained on a street corner: “I don’t get it, so there has to be a higher-up who does. His name is God.” The preacher’s God has the expansive consciousness we lack, and if we don’t always understand, that’s part of the deal:

“For my thoughts are not your thoughts,
neither are your ways my ways,”
declares the Lord.
“As the heavens are higher than the earth,
so are my ways higher than your ways
and my thoughts than your thoughts.”

Isaiah 55:8-9 (NIV)

Compare that to a cognitive neuroscientist’s take on our ability to perceive reality, as explained in this video.

“Many scientists believe that natural selection brought our perception of reality into clearer and deeper focus, reasoning that growing more attuned to the outside world gave our ancestors an evolutionary edge. Donald Hoffman, a cognitive scientist at the University of California, Irvine, thinks that just the opposite is true. Because evolution selects for survival, not accuracy, he proposes that our conscious experience masks reality behind millennia of adaptions for ‘fitness payoffs’ – an argument supported by his work running evolutionary game-theory simulations. In this interview recorded at the HowTheLightGetsIn Festival from the Institute of Arts and Ideas in 2019, Hoffman explains why he believes that perception must necessarily hide reality for conscious agents to survive and reproduce. With that view serving as a springboard, the wide-ranging discussion also touches on Hoffman’s consciousness-centric framework for reality, and its potential implications for our everyday lives.”
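The flavor of those game-theory simulations can be sketched in a few lines. This is my own toy construction in the spirit of Hoffman’s “fitness beats truth” argument, not his actual model: payoff peaks at a middling resource quantity (too little starves, too much poisons), “truth” agents perceive quantity and grab more of it, “fitness” agents perceive only payoff and grab the higher payoff, and selection redistributes the population in proportion to payoff earned.

```python
import random

def compete(generations=200, pop=1000, seed=7):
    """Toy 'fitness beats truth' tournament (my construction, not
    Hoffman's code). Returns (truth-seers, fitness-seers) remaining."""
    rng = random.Random(seed)

    def payoff(q):                      # peaked at q = 50, zero past +/- 25
        return max(0.0, 10 - abs(q - 50) / 2.5)

    truth, fitness = pop // 2, pop // 2
    for _ in range(generations):
        q1, q2 = rng.uniform(0, 100), rng.uniform(0, 100)
        truth_gain = payoff(max(q1, q2))            # picks larger quantity
        fitness_gain = max(payoff(q1), payoff(q2))  # picks larger payoff
        total = truth * truth_gain + fitness * fitness_gain
        if total > 0:                   # redistribute pop by payoff earned
            truth = round(pop * truth * truth_gain / total)
            fitness = pop - truth
    return truth, fitness

truth_left, fitness_left = compete()
print(truth_left, fitness_left)  # accurate perception steadily loses ground
```

The fitness-perceiver can never do worse than the truth-perceiver here, because it optimizes the thing selection actually rewards; the truth-perceiver pays for its accuracy whenever “more” isn’t “better.” That asymmetry, scaled up, is the intuition behind Hoffman’s claim.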

The video is 40 minutes long, but a few minutes will suffice to make today’s point. Prof. Hoffman admits his theory is counterintuitive and bizarre, but promises he’s still working on it (moving it toward falsifiability). I personally favor scientific materialism’s explanation of consciousness, and I actually get the theory behind Prof. Hoffman’s ideas, but when I watch this I can’t help but think it’s amazing how far science and religion will go to define their versions of how things work. That’s why I quit trying to read philosophy: all that meticulous logic trying to block all exits and close all loopholes, but sooner or later some mystery leaks out a seam, and when it does the whole thing seems overwrought and silly.

The street preacher thinks reality is out there, and we’re given enough brain to both get by and know when to quit trying and trust a higher intelligence that has it all figured out. The scientist starts in here, with the brain (“the meat that thinks”), then tries to describe how it creates a useful enough version of reality to help us get by in the external world.

The preacher likes the eternal human soul; the scientist goes for the bio-neuro-cultural construction we call the self. Positions established, each side trades metaphysical potshots with the other. For example, when science clamors after the non-falsifiable multiverse theory of quantum physics, the intelligent designers gleefully point out that the so-called scientists are leapers of faith just like them:

“Unsurprisingly, the folks at the Discovery Institute, the Seattle-based think-tank for creationism and intelligent design, have been following the unfolding developments in theoretical physics with great interest. The Catholic evangelist Denyse O’Leary, writing for the Institute’s Evolution News blog in 2017, suggests that: ‘Advocates [of the multiverse] do not merely propose that we accept faulty evidence. They want us to abandon evidence as a key criterion for acceptance of their theory.’ The creationists are saying, with some justification: look, you accuse us of pseudoscience, but how is what you’re doing in the name of science any different? They seek to undermine the authority of science as the last word on the rational search for truth.

“And, no matter how much we might want to believe that God designed all life on Earth, we must accept that intelligent design makes no testable predictions of its own. It is simply a conceptual alternative to evolution as the cause of life’s incredible complexity. Intelligent design cannot be falsified, just as nobody can prove the existence or non-existence of a philosopher’s metaphysical God, or a God of religion that ‘moves in mysterious ways’. Intelligent design is not science: as a theory, it is simply overwhelmed by its metaphysical content.”

But Is It Science? Aeon Magazine, Oct. 7, 2019.

And so it goes. But what would be so wrong with letting mystery stay… well, um… mysterious?

We’ll look at that next time.