It Takes a Different Person to be a Christian and Then an Atheist

Not a different kind of person, but a different person, period.

You look in the mirror and don’t recognize yourself.

Other people don’t either. There’s something different about you, hard to say what – a different energy maybe, like your wiring got scrambled.

That kind of different is why I’m not a Christian anymore. The old me didn’t change my mind about God; I became a new person, and God didn’t fit anymore. It wasn’t just a tweak here and there, but the whole ecosystem of me — self, life, world, inside and out – got shifted, zapped, scrambled, rearranged to the point that it’s not that I don’t believe in God anymore, it’s that I can’t. I’m repulsed by the idea. I’m stunned, shocked, and amazed by what I used to believe. I wonder how I could have. What was I thinking?

Now here I am — a nonbeliever, among the godless, the faithless, the backslidden. I never would have believed it. Atheist wasn’t possible – it was never on the life choices list. It still isn’t. I didn’t choose it, I became it. I became a new person in a new place, with no way to get back. It wasn’t change, it was transformation.

“Transformation” has grandiose overtones. It sounds spiritual. We talked about transformation when I was a Christian. It’s right there in the Bible:

“Be transformed by the renewal of your mind.” Romans 12:2 ESV

The context and the rest of the verse are dressed up with pious isn’t-transformation-going-to-be-wonderful language. No it’s not. It’s going to tear you down, and everything else with it. Transformation is destructive and painful, a depressing grind. Try to make big changes and everything comes unglued. I don’t wish it on anybody.

Transformation is inside and outside – the entire ecosystem that is you and your life. Ecological change on the inside is biological, neurological, physiological, chemical, hormonal. On the outside it’s sociological, communal, societal, institutional. When transformation has made a mess of all that it’s just getting warmed up. Now you’ve got to figure out how to carry all that into a new life.

Good luck with that.

Personal ecosystem change is why we take vacations and patronize spas, go to a monastery for a week of silence. It’s why churches sponsor retreats, why corporations lay out five star spreads for off-site strategic planning. It works:  put yourself in a new setting, you think new thoughts, feel new things. What was unthinkable and impossible before becomes your new to-do list.

Personal ecosystem change is why re-entry is so hard – go away and get inspired, then try to take what happened out there back to the shop, and everybody wants to know what you’ve been smoking. Meanwhile you’re scheming to turn no-way-I-can-go-back into the new normal. All that inspiration and those new thoughts while you’re away vs. all that dread and drudgery when you go back to the grind – it’s evidence of ecological change.

Self-help is fraudulent pseudo-religion for a lot of reasons, but its biggest fraud is that it doesn’t tell us about the need for ecosystem change if we want to make big changes in our lives. Self-help makes it sound like we can just paste some new things onto what we already are, have, and do. Nope. Won’t work. The reason we’re not already doing the new thing is because we’re not the kind of person who does the new thing. If we were, we’d already be doing it. Duh. If we want to do the new thing, we need to be transformed.

“Transformed” is change on an ecological/systemic scale. That means nothing left out. Nothing left out means this is going to hurt. A big part of the trouble is that transformation can’t mix old and new — get far enough into the process and the old is out for good. That’s in the Bible, too:

“No one puts a piece of unshrunk cloth on an old garment, for the patch tears away from the garment, and a worse tear is made. Neither is new wine put into old wineskins. If it is, the skins burst and the wine is spilled and the skins are destroyed. But new wine is put into fresh wineskins, and so both are preserved.”

Matthew 9:16-17 ESV

There were a couple popular books about new wine and new wineskins making the rounds in my early Christian years. (The Taste of New Wine and Wineskins.) They were the kind of influential books you could use at ecological change retreats – lots of earnest conversations and strategizing ways to make the new wine flow, like getting people into home Bible study and prayer groups, plus lots of great sermon moments about how very Gospely everything was going to be.

Our understanding of the concept was silly shallow. Every now and then somebody would find out about St. John of the Cross and his “dark night of the soul,” and quote it in a sermon. Nobody actually read what the 16th Century mystic wrote — the poetic phrase stood on its own:  transformation could be a major downer — not something you preach about on Tithing Pledge Sunday. If it got mentioned at all, “dark night” transformation got a makeover into something like a bad case of the flu you could get over.

The real thing?

Not so much.

I once thought it would be cool to be one of those self-help speaker, writer, consultant dudes. I got as far as writing some blog posts and making a few trips to do workshops. I got great reviews – earnest, beautiful “you changed my life” reviews. But then I started to worry that I was actually ruining people’s lives, which is pretty much what had happened to mine when I decided it was time to believe my way into my dreams – just like you’re supposed to. So I started telling audiences that they would suffer if they tried to make big changes. I warned them not to use the material because I knew it would work, and when it did they would regret it. Everything would change, and they’d have to deal with it, and it would be no fun. I think people thought I was doing some kind of reverse psychology number on them. When it was clear they weren’t believing me, I quit doing the workshops. It was unethical to give people a great retreat experience and send them home knowing they would get clobbered and give up.

Who would submit themselves to the kind of transformation that would turn a commando Christian (me) into an atheist?

In a word, nobody. Not even me.

But then I did.

I’m not bragging. You can’t brag about an accident.

We all know we don’t change unless and until we have to. Which means the usual transformation catalyst is…

Trauma.

Me too.

We’ve all seen the major stressors lists. Mine were career, money, health. For starters – when trauma gets rolling, it likes company.

Trauma brings grief. Grief rewires our brains – it puts the stress response (fight or flight) in charge, furloughs the part that makes us feel like at least we’re in control. Memory and strong emotions hog the stage, decision-making and planning move out, fear about how we’re going to live without what’s been lost goes on permanent reruns we can’t shut off. We get disoriented, lose track of time and place. We go wandering, literally and figuratively. Our whacked-out symptoms take up residence. We enter what science and environment writer April Reese calls The Fog of Grief.[1]

“I was a churning maelstrom of emotions: sadness, confusion, anger, disbelief, fear, regret, guilt. At times in those first hours, days and weeks after his death, it was hard to breathe. I couldn’t concentrate. I forgot things. Fatigue was a constant, no matter how much I slept. I came to understand what Joan Didion meant in The Year of Magical Thinking (2005), a chronicle of her grief over the loss of her husband, when she wrote: ‘I realised for the time being I could not trust myself to present a coherent face to the world.’

“This fog of grief, it turns out, is as common as grief itself. When the neurologist Lisa Shulman lost her husband to cancer nine years ago, ‘there was some serious sadness, but that wasn’t the main problem,’ she recalls. ‘It was the disorientation. I felt like I was waking up in a completely alien world. Because the whole infrastructure of my daily life was fundamentally gone.’

“She found herself becoming lost in time, ending up in familiar places without knowing how she got there, she recalls. ‘It’s not simply a matter of discomfort or anxiety. It’s frightening,’ she says. ‘Because you feel like, as Didion said long ago, you feel like you’re going crazy.’

“Grief has such a powerful effect on us, I learned, that it rewires the brain: the limbic system, a primal part of the brain controlling emotions and behaviours that ensure our survival, takes centre stage, while the prefrontal cortex – the centre of reasoning and decision-making – retreats to the wings.

“‘From an evolutionary standpoint, we are strongly hardwired to respond to something that is a threat,’ Shulman says. ‘We oftentimes don’t think of a loss of a loved one as a threat in that way, but, from the perspective of the brain, that’s the way it is literally perceived.’

“That perception of threat means that our survival response – ‘fight or flight’ – kicks in, and stress hormones flood the body. The work of the psychologist Mary-Frances O’Connor at the University of Arizona and others has found heightened levels of the stress hormone cortisol in the bereaved.

“While the cortisol is flowing fast, the brain remakes itself – at least temporarily – to help us endure the trauma of grief. In the weeks after a loss, the brain, like a stern nurse imposing temporary bed rest for itself, suppresses the control centres of higher functions, such as decision-making and planning. At the same time, Shulman says, areas involved in emotion and memory work overtime, gatekeeping which emotions and memories get through. Brain scans of the bereaved show that grief activates parts of the limbic system – sometimes referred to as the ‘emotional brain’. Among the limbic regions impacted are the amygdala, which governs the intensity of emotions and threat perception; the cingulate cortex, involved in the interplay between emotions and memory; and the thalamus, a sort of relay station that conveys sensory signals to the cerebral cortex, the brain’s information-processing centre.

“So my inability to form coherent sentences or remember what I opened the refrigerator to get is nothing to be worried about, Skritskaya assures me; my brain has simply powered down my thinking to enable me to tolerate the loss. The tradeoff is fuzzy cognition – what I’ve come to describe to friends as ‘grief brain’.

“‘Grief takes up a lot of bandwidth in the brain,’ Shulman writes in her book. ‘Odd behaviour and incoherence are expected consequences of the brain’s protective responses following emotional trauma.’”

Trauma and grief stay until the dark night is over. Ecological change catalysts like religious retreats and self-help seminars have the same effect — they suspend our status quo ties to “normal,” heighten emotions, promote reality-bending experiences, warp our risk tolerance, enhance receptivity to new versions of reality. But then the weekend is over and we go back home, where the symptoms quickly fade. We resent it, but it’s better than the alternative, which is trauma and grief staying with us until the job is done.

Trauma and grief are a potent cocktail of transformation. Drink it, and there’s going to be trouble. You’re going to suffer.

You might even lose your faith.


[1] April Reese, “The Fog of Grief: The five stages of grief can’t begin to explain it,” Aeon Magazine (Aug. 10, 2021).

We Seriously Need to Get Over Our Addiction to Ancient Wisdom

Where did we get the idea that Ancient Wisdom is such hot stuff?

You shrug. You don’t know, you never thought about it. I hadn’t either.

An “ancient wisdom” Google search generated the usual 89 million results in 0.65 seconds. The first couple pages were mostly life coaches trying to out-reverence each other.

Lesson learned:  call what you’re peddling ancient wisdom, and you’ll sell more of it. (Remember the opening of The Secret promo movie?)

Not exactly the answer I had in mind.

Ancient wisdom is an assumption:  of course it’s better than anything we might think of on our own — everybody knows that! It’s better because it’s… well, because it’s… um, because it’s really old… it’s so old it’s… ancient.

Sigh.

We assume ancient wisdom will give us an edge – rocket us from clueless to competitive. I mean, those ancients, they had it going. They’re the Who’s Who of Law, Art, Philosophy, Religion, History, Literature… The ancient texts. The ancient ways. The ancient teachings. The ancient books. The ancient heroes. The ancient incarnations of gods walking the Earth. Miles and piles of traditions and holidays and customs. Wars, wars, and more wars. Greed and evil, corruption and cruelty, with a sprinkle of nobility now and then. On and on and on… Ancient this, ancient that.

Ancient is most potent when it’s sacred ancient, which is as close to God as you can get. God is old – really old, older than old, older even than ancient. That means sacred ancient-ness is next to godliness.

Sigh.

We’re so addicted to ancient wisdom that we’re blind to our addiction, which makes it hard to talk about. It seems obvious, like asking why we breathe.

  • We breathe to live.
  • We revere ancient wisdom because we breathe.

Or something like that.

When’s the last time ancient wisdom made your life better? I mean really better, not just “I believe this old stuff will improve my life” better?

Here’s the problem (one of many):  We think those guys (yes, guys – ancient pronouns are definitely male) were just like us, living the same kinds of lives, dealing with the same kinds of issues, so that what they thought about how life works can help us out.

Not so.

This is the time travel problem:  the idea that if we could zap ourselves forward or backward in time we’d still be us, the same as we are now, only with some adjusting to do — so if we time-travelled Socrates into today, the bedsheet clothes would have to go, and he’d need a shower and probably a trip to the dentist, but otherwise with the help of Google Translate he’d fit right in.

Not a chance.

Humans function in context. We feel, don’t feel, think, don’t think, act, don’t act… see, perceive, conclude, decide, and all their opposites… only in context. We happen in the moment because that’s all we’ve got. We have no experience except here and now, and everything about our experience comes from our brains’ processing what we’re experiencing. We take in all the external stimuli – through our senses, through spatial and subliminal biological connections – and our brains process it all internally. The amalgamation becomes “reality.” A little of that happens consciously; most of it doesn’t. To the extent we’re aware, we are conscious only in context.

Ancient context was different. Ancient people and their ancient reality were different. The ancient human consciousness that created ancient reality was different. We and our reality and consciousness are different from theirs. We are not like those guys. They weren’t like us. If we could ever meet – which we can never do, not even metaphorically or intellectually or otherwise – we would barely recognize them as human. They would return the favor. We’d both notice the naked ape resemblance, but common ground would be hard to find. Maybe after some who-knows-how-long acclimation process we might learn to experience a new, shared context together. Until then, things would definitely be awkward.

We give ancient religion special status in our ancient addiction. We re-energize ancient events and teachings, beliefs and practices, by the application of our fervent belief. By our belief, we invest ancient relics and rituals with living virtue — antiquity reconstituted. We think we brought the ancient back to life, but that’s delusional because our belief is also processed in context – our current context. We’re making up the experience in the here and now. We cannot do otherwise.

Which loops us around back to where we started:  if we didn’t believe ancient wisdom is something special, we wouldn’t believe its relevance to us. And no, calling something “sacred” and “holy” and “eternal” and “immortal” doesn’t help — it still has to be processed through our mortal, temporal biology. We’re not creating ancient meaning and experiencing it in its original form — we’re only creating this moment’s version of it.

The best our believing can do is to treat ancient wisdom as what philosophers call a “first cause.” If you trace everything back through some impossibly tangled mega-gigantic cause and effect chain, you eventually get to the place where you can’t trace back anymore, so you need a “first cause” that gets the whole thing started. (Once you find the first cause, you sound like a parent:  “Because I said so, that’s why.”)

God is the first cause of choice. You can’t go further back than God, can’t prove or disprove God, you either believe in Him (yes, God’s pronouns are also male) or you don’t. Full stop. Ancient is the same way:  you either believe it’s good and true and valuable and worth fighting wars and making converts at gunpoint or sword point or on the rack or in the Inquisition or whatever… or you don’t. Belief is what makes ancient relevant, but when it does, it only gets the current version. Even if sacred holy other ancient could get a pass, there’s no sacred holy other compartment in our brains to process it.

Suppose we could break our ancient addiction habit – what would we have to gain?

Ironically, the answer might be what we were after in the first place:  wisdom – the ability to think useful thoughts about what’s going on around us. Consider the following passage from a Pulitzer Prize-winning journalist, prolific author, and generally awesome, intelligent, and articulate human being, taken from I Don’t Believe in Atheists:  The Dangerous Rise of the Secular Fundamentalist, by Chris Hedges (2008).

“Our collective and personal histories — the stories we tell about ourselves to ourselves and others — are used to avoid facing the incoherence and fragmentation of our lives. Chaos, chance and irrational urges, often locked in our unconscious, propel, inform and direct us. Our self is elusive. It is not fixed. It is subject to forces often beyond our control. To be human is to be captive to these forces, forces we cannot always name or understand. We mutate and change. We are not who we were. We are not who we will become. The familiarity of habit and ritual, as well as the narratives we invent to give structure and meaning to our life, helps hide this fragmentation. But human life is fluid and inconsistent. Those who place their faith in a purely rational existence begin from the premise that human beings can have fixed and determined selves governed by reason and knowledge. This is itself an act of faith.

“We can veto a response or check an impulse, reason can direct our actions, but we are just as often hostage to the pulls of the instinctual, the irrational, and the unconscious. We can rationalize our actions later, but this does not make them rational. The social and individual virtues we promote as universal values that must be attained by the rest of the human species are more often narrow, socially conditioned responses hardwired into us for our collective and personal survival and advancements. These values are rarely disinterested. They nearly always justify our right to dominance and power.

“We do not digest every sensation and piece of information we encounter. To do so would leave us paralyzed. The bandwidth of consciousness – our ability to transmit information measured in bits per second — is too narrow to register the enormous mass of external information we receive and act upon. We have conscious access to about a millionth of the information we use to function in life. Much of the information we receive and our subsequent responses do not take place on the level of consciousness. As the philosopher John Gray points out, irrational and subconscious forces, however unacknowledged, are as potent within us as in others.

“To accept the intractable and irrational forces that drive us, to admit that these forces are as entrenched in us as in all human beings, is to relinquish the fantasy that the human species can have total, rational control over human destiny. It is to accept our limitations, to live within the confines of human nature. Ethical, moral, religious, and political systems that do not concede these stark assumptions have nothing to say to us.”

Nicely said.

We’re not such hot stuff, and neither is ancient wisdom. We’re not so in touch and in control as we’d like to think we are — in fact we bounce around and mutate all over the place – and always in context. We do our best to push back the night, still the churning seas, halt the careening clouds, tame the void to make it less awful. It’s worth the try – the effort, however vain, gives us a sense of purpose, meaning, agency. But we’re not going to banish our limitations by latching onto ancient wisdom, because the latching process ultimately takes place only in us. We are what we are in the context of the moment, just like those old guys were.

A bunch of old guys tried to figure things out. So do we.

Chances are they were about as good at it as we are.

Which isn’t saying much.

Narratives of Self, Purpose, and Meaning [Part 1]: Fish Stories

A friend of mine is a Christian, business leader, author, and fisherman. He tells fish stories in each of those roles. At least it feels that way to me, so I take his stories “with a grain of salt.” A Roman luminary named Pliny the Elder[1] used that phrase in a poison antidote in 77 A.D., and he meant it literally. Today, it describes how we respond when it feels like someone’s story – like the fish – just keeps getting bigger.

I don’t care about my friend’s fish, I care about him. When he tells a fish story, he’s sharing his personal narrative. “This is who I am,” he’s saying, “And this is how I believe life works.”

“Each of us constructs and lives a ‘narrative’, wrote the British neurologist Oliver Sacks, ‘this narrative is us’. Likewise the American cognitive psychologist Jerome Bruner: ‘Self is a perpetually rewritten story.’ And: ‘In the end, we become the autobiographical narratives by which we “tell about” our lives.’ Or a fellow American psychologist, Dan P McAdams: ‘We are all storytellers, and we are the stories we tell.’ And here’s the American moral philosopher J David Velleman: ‘We invent ourselves… but we really are the characters we invent.’ And, for good measure, another American philosopher, Daniel Dennett: ‘we are all virtuoso novelists, who find ourselves engaged in all sorts of behaviour… and we always put the best “faces” on it we can. We try to make all of our material cohere into a single good story. And that story is our autobiography. The chief fictional character at the centre of that autobiography is one’s self.’”[2]

“Each of us conducts our lives according to a set of assumptions about how things work: how our society functions, its relationship with the natural world, what’s valuable, and what’s possible. This is our worldview, which often remains unquestioned and unstated but is deeply felt and underlies many of the choices we make in our lives.”[3]

The Self

This kind of narrative assumes the self is an entity all its own, with a purpose also all its own, and that if you get both in hand, you’ll know the meaning of life – at least your own. Current neuro-psychology doesn’t see things that way.

“The idea of there being a single ‘self’, hidden in a place that only maturity and adulthood can illuminate and which, like archaeologists, we might dig and dust away the detritus to find, is to believe that there is some inner essence locked within us – and that unearthing it could be a key to working out how to live the rest of our lives. This comforting notion of coming of age, of unlocking a true ‘self’ endures, even though it is out of step with current thinking in psychology, which denies a singular identity.”[4]

“From a scientific point of view, we were not created or designed but instead are the product of evolution. The natural events that shaped our world and our own existence were not purposeful. In other words, life is objectively meaningless.”[5]

For most people, that scientific outlook is too harsh:

“From this perspective, the only way to find meaning is to create your own, because the universe has no meaning or purpose. The universe just is. Though there are certainly a small percentage of people who appear to accept this notion, much of the world’s population rejects it. For most humans, the idea that life is inherently meaningless simply will not do.”[6]

Self-Actualization

Cultivating a sense of identity, purpose, and meaning sounds good, but who’s got time? Maslow’s iconic “Hierarchy of Needs” pyramid recognizes that adult life puts the basics first.

“Abraham Maslow was the 20th-century American psychologist best-known for explaining motivation through his hierarchy of needs, which he represented in a pyramid. At the base, our physiological needs include food, water, warmth and rest. Moving up the ladder, Maslow mentions safety, love, and self-esteem and accomplishment. But after all those have been satisfied, the motivating factor at the top of the pyramid involves striving to achieve our full potential and satisfy creative goals. As one of the founders of humanistic psychology, Maslow proposed that the path to self-transcendence and, ultimately, greater compassion for all of humanity requires the ‘self-actualisation’ at the top of his pyramid – fulfilling your true potential, and becoming your authentic self.”[7]

Columbia psychologist Scott Barry Kaufman thinks we ought to get self-actualization off the back burner, for the sake of ourselves and our world.

“‘We live in times of increasing divides, selfish concerns, and individualistic pursuits of power,’ Kaufman wrote recently in a blog in Scientific American introducing his new research. He hopes that rediscovering the principles of self-actualisation might be just the tonic that the modern world is crying out for.”[8]

Kaufman’s research suggests that making room for self-awareness and growth helps to develop character traits that the world could use more of:

“Participants’ total scores… correlated with their scores on the main five personality traits (that is, with higher extraversion, agreeableness, emotional stability, openness and conscientiousness) and with the metatrait of ‘stability’, indicative of an ability to avoid impulses in the pursuit of one’s goals.

“Next, Kaufman turned to modern theories of wellbeing, such as self-determination theory, to see if people’s scores on his self-actualisation scale correlated with these contemporary measures. Sure enough, he found that people with more characteristics of self-actualisation also tended to score higher on curiosity, life-satisfaction, self-acceptance, personal growth and autonomy, among other factors.

“A criticism often levelled at Maslow’s notion of self-actualisation is that its pursuit encourages an egocentric focus on one’s own goals and needs. However, Maslow always contended that it is only through becoming our true, authentic selves that we can transcend the self and look outward with compassion to the rest of humanity. Kaufman explored this too, and found that higher scorers on his self-actualisation scale tended also to score higher on feelings of oneness with the world, but not on decreased self-salience, a sense of independence and bias toward information relevant to oneself. (These are the two main factors in a modern measure of self-transcendence developed by the psychologist David Yaden at the University of Pennsylvania.)

“The new test is sure to reinvigorate Maslow’s ideas, but if this is to help heal our divided world, then the characteristics required for self-actualisation, rather than being a permanent feature of our personalities, must be something we can develop deliberately. I put this point to Kaufman and he is optimistic. ‘I think there is significant room to develop these characteristics [by changing your habits],’ he told me. ‘A good way to start with that,’ he added, ‘is by first identifying where you stand on those characteristics and assessing your weakest links. Capitalise on your highest characteristics but also don’t forget to intentionally be mindful about what might be blocking your self-actualisation … Identify your patterns and make a concerted effort to change. I do think it’s possible with conscientiousness and willpower.’”[9]

But What if There’s No Self to Actualize?

If there’s no unified self, then there’s no beneficiary for all that “concerted effort to change” and “conscientiousness and willpower.”

“The idea of there being a single ‘self’, hidden in a place that only maturity and adulthood can illuminate and which, like archaeologists, we might dig and dust away the detritus to find, is to believe that there is some inner essence locked within us – and that unearthing it could be a key to working out how to live the rest of our lives. This comforting notion of coming of age, of unlocking a true ‘self’ endures, even though it is out of step with current thinking in psychology, which denies a singular identity.”[10]

Again, it’s hard for most of us to live with that much existential angst[11]. We prefer instead to think there’s a unique self (soul) packed inside each of us, and to invest it with significance.

“From a scientific point of view, we were not created or designed but instead are the product of evolution. The natural events that shaped our world and our own existence were not purposeful. In other words, life is objectively meaningless. From this perspective, the only way to find meaning is to create your own, because the universe has no meaning or purpose. The universe just is. Though there are certainly a small percentage of people who appear to accept this notion, much of the world’s population rejects it. For most humans, the idea that life is inherently meaningless simply will not do.

“Instead, people latch onto what I call teleological thinking. Teleological thinking is when people perceive phenomena in terms of purpose. When applied to natural phenomena, this type of thinking is generally considered to be flawed because it imposes design where there is no evidence for it. To impose purpose and design where there is none is what researchers refer to as a teleological error.”[12]

Teleological thinking finds design and purpose in the material world[13] to counter the feeling that we’re at the mercy of random pointlessness. We prefer our reality to be by design, so that we have a chance to align ourselves with it – a form of personal empowerment psychologists call “agency.”

“Each of us has a story we tell about our own life, a way of structuring the past and fitting events into a coherent narrative. Real life is chaotic; life narratives give it meaning and structure.”[14]

The Coming of Age Narrative

Further, we look to a specific cultural rite of passage – when we “come of age” in late adolescence — as the time when we first discover and take responsibility for our unique self and its identity and purpose. From there, we carry that sense of who we are and where we fit into responsible adult life.

“The protagonist has the double task of self-integration and integration into society… Take, for instance, the fact that the culminating fight scene in most superhero stories occurs only after the hero has learned his social lesson – what love is, how to work together, or who he’s ‘meant to be’. Romantic stories climax with the ultimate, run-to-the-airport revelation. The family-versus-work story has the protagonist making a final decision to be with his loved ones, but only after almost losing everything. Besides, for their dramatic benefit, the pointedness and singular rush of these scenes stems from the characters’ desire to finally gain control of their self: to ‘grow up’ with one action or ultimate understanding.”[15]

The Redemption Narrative

The coming of age story is a variant of the “redemption” narrative, in which we learn that suffering is purposeful: it shapes and transforms us, so we can take our place in society.

“For the past 15 years, Daniel McAdams, professor of psychology at Northwestern University in Illinois, has explored this story and its five life stages: (1) an early life sense of being somehow different or special, along with (2) a strong feeling of moral steadfastness and determination, ultimately (3) tested by terrible ordeals that are (4) redeemed by a transformation into positive experiences and (5) zeal to improve society.

“This sequence doesn’t necessarily reflect the actual events of the storyteller’s life, of course. It’s about how people interpret what happened – their spin, what they emphasise in the telling and what they discard.” [16]

Redemption narratives make us good citizens, and never mind if there’s some ego involved:

“In his most recent study, the outcome of years of intensive interviews with 157 adults, McAdams has found that those who adopt [redemption narratives] tend to be generative – that is, to be a certain kind of big-hearted, responsible, constructive adult.

“Generative people are deeply concerned about the future; they’re serious mentors, teachers and parents; they might be involved in public service. They think about their legacy, and want to fix the world’s problems.

“But generative people aren’t necessarily mild-mannered do-gooders. Believing that you have a mandate to fix social problems – and that you have the moral authority and the ability to do so – also requires a sense of self-importance, even a touch of arrogance.”[17]

The American Way

Coming of age and redemption stories have been culturally and neurologically sustained in Western and Middle Eastern civilizations since the Abrahamic scriptures first told of the Garden of Eden some 5,500 years ago. Americans, as heirs of this ideological legacy, have perfected it.

“For Americans, the redemption narrative is one of the most common and compelling life stories. In the arc of this life story, adversity is not meaningless suffering to be avoided or endured; it is transformative, a necessary step along the road to personal growth and fulfilment.”[18]

“The coming-of-age tale has become a peculiarly American phenomenon, since self-understanding in the United States is largely predicated on a self-making mythos. Where, in Britain, one might be asked about one’s parents, one’s schooling or one’s background, Americans seem less interested in a person’s past and more interested in his or her future. More cynical observers have claimed, perhaps rightly, that this is because Americans don’t have a clear history and culture; but the coming-of-age tale has also become important in the US because of a constant – maybe optimistic, maybe pig-headed – insistence that one can always remake oneself. The past is nothing; the future is everything.

“This idea of inherent, Adam-and-Eve innocence, and the particularly American interest in it, is perhaps tantamount to a renunciation of history. Such denialism infuses both American stories and narratives of national identity, said Ihab Hassan, the late Arab-American literary theorist. In any case, the American tale of growing up concerns itself with creating a singular, enterprising self out of supposed nothingness: an embrace of the future and its supposedly infinite possibilities.”[19]

American capitalism relies on the redemption narrative as its signature story genre.

“From a more sociological perspective, the American self-creation myth is, inherently, a capitalist one. The French philosopher Michel Foucault theorised that meditating and journaling could help to bring a person inside herself by allowing her, at least temporarily, to escape the world and her relationship to it. But the sociologist Paul du Gay, writing on this subject in 1996, argued that few people treat the self as Foucault proposed. Most people, he said, craft outward-looking ‘enterprising selves’ by which they set out to acquire cultural capital in order to move upwards in the world, gain access to certain social circles, certain jobs, and so on. We decorate ourselves and cultivate interests that reflect our social aspirations. In this way, the self becomes the ultimate capitalist machine, a Pierre Bourdieu-esque nightmare that willingly exploits itself.

“Even the idea that there is a discrete transition from youth into adulthood, either via a life-altering ‘feeling’ or via the culmination of skill acquisition, means that selfhood is a task to be accomplished in the service of social gain, and in which notions of productivity and work can be applied to one’s identity. Many students, for instance, are encouraged to take ‘gap years’ to figure out ‘who they are’ and ‘what they want to do’. (‘Do’, of course, being a not-so-subtle synonym for ‘work’.) Maturation is necessarily related to finances, and the expectation of most young people is that they will become ‘independent’ by entering the workforce. In this way, the emphasis on coming of age reifies the moral importance of work.” [20]

As usual, Silicon Valley is ahead of the game, having already harnessed the power of the redemption story as its own cultural norm:

“In Silicon Valley these days, you haven’t really succeeded until you’ve failed, or at least come very close. Failing – or nearly failing – has become a badge of pride. It’s also a story to be told, a yarn to be unspooled.

“The stories tend to unfold the same way, with the same turning points and the same language: first, a brilliant idea and a plan to conquer the world. Next, hardships that test the mettle of the entrepreneur. Finally, the downfall – usually, because the money runs out. But following that is a coda or epilogue that restores optimism. In this denouement, the founder says that great things have or will come of the tribulations: deeper understanding, new resolve, a better grip on what matters.

“Unconsciously, entrepreneurs have adopted one of the most powerful stories in our culture: the life narrative of adversity and redemption.”[21]

Writing Your Own Story

There’s nothing like a good story to make you rethink your life. A bookseller friend’s slogan for his shop is “Life is a story. Tell a good one.”

“The careers of many great novelists and filmmakers are built on the assumption, conscious or not, that stories can motivate us to re-evaluate the world and our place in it.

“New research is lending texture and credence to what generations of storytellers have known in their bones – that books, poems, movies, and real-life stories can affect the way we think and even, by extension, the way we act.

“Across time and across cultures, stories have proved their worth not just as works of art or entertaining asides, but as agents of personal transformation.”[22]

As a result, some people think we ought to take Michel Foucault’s advice and meditate (practice “mindfulness”) and journal our way to a better self-understanding. As for journaling:

“In truth, so much of what happens to us in life is random – we are pawns at the mercy of Lady Luck. To take ownership of our experiences and exert a feeling of control over our future, we tell stories about ourselves that weave meaning and continuity into our personal identity. Writing in the 1950s, the psychologist Erik Erikson put it this way:

“To be adult means among other things to see one’s own life in continuous perspective, both in retrospect and in prospect … to selectively reconstruct his past in such a way that, step for step, it seems to have planned him, or better, he seems to have planned it.

“Intriguingly, there’s some evidence that prompting people to reflect on and tell their life stories – a process called ‘life review therapy’ – could be psychologically beneficial.”[23]

Consistent with Scott Barry Kaufman’s comments from earlier, the more you can put a coming of age or redemption story spin on your own narrative, the more likely journaling will improve your outlook.

“A relevant factor in this regard is the tone, complexity and mood of the stories that people tell themselves. For instance, it’s been shown that people who tell more positive stories, including referring to more instances of personal redemption, tend to enjoy higher self-esteem and greater ‘self-concept clarity’ (the confidence and lucidity in how you see yourself). Perhaps engaging in writing or talking about one’s past will have immediate benefits only for people whose stories are more positive.

“It remains unclear exactly why the life-chapter task had the self-esteem benefits that it did. It’s possible that the task led participants to consider how they had changed in positive ways. They might also have benefited from expressing and confronting their emotional reactions to these periods of their lives – this would certainly be consistent with the well-documented benefits of expressive writing and ‘affect labelling’ (the calming effect of putting our emotions into words).

“The researchers said: ‘Our findings suggest that the experience of systematically reviewing one’s life and identifying, describing and conceptually linking life chapters may serve to enhance the self, even in the absence of increased self-concept clarity and meaning.’”[24]

An American Life

My friend the storyteller is an exemplar of all the above. He’s an American, a Christian, and a capitalist. And when he starts his day by journaling, he believes he’s writing what he’s hearing from God. I was most of that, too, for the couple of decades he and I shared narratives and a teleological outlook. I’ve since moved on: at this writing, we’ve had no contact for over three years. I wondered if I could still call him a friend – whether the term still applies after your stories diverge as completely as ours have. Yes I can and yes it does, I decided, although I honestly can’t say why.

Religion: Teleological Thinking Perfected

Personal narratives – especially actually writing your own story – aren’t for everyone. They require quiet, solitude, and reflection, and the practice can feel egotistical if you’re not used to it. Religion offers a more common teleological alternative, with beliefs, rituals, and practices designed to put you in touch with an external, transcendent source of identity, purpose, and meaning. “Don’t look inward, look up” is its message.

We’ll look at that next time.

[1] Wikipedia. Pliny the Elder was a naturalist, military leader, friend of the Emperor, and a victim of the Vesuvius eruption.

[2] I Am Not a Story: Some find it comforting to think of life as a story. Others find that absurd. So are you a Narrative or a non-Narrative? Aeon (Sept. 3, 2015)

[3] Lent, Jeremy, The Patterning Instinct: A Cultural History of Humanity’s Search for Meaning (2017)

[4] The Coming-Of-Age Con: How can you go about finding ‘who you really are’ if the whole idea of the one true self is a big fabrication? Aeon (Sept. 8, 2017)

[5] Routledge, Clay, Supernatural: Death, Meaning, and the Power of the Invisible World (2018)

[6] Ibid.

[7] Do You Have A Self-Actualised Personality? Maslow Revisited. Aeon (Mar. 5, 2019)

[8] Ibid.

[9] Ibid.

[10] The Coming-Of-Age Con, op. cit.

[11] Urban Dictionary: existential angst.

[12] Routledge, Clay, Supernatural: Death, Meaning, and the Power of the Invisible World (2018)

[13] Wikipedia.

[14] Silicon Phoenix: A Gifted Child, An Adventure, A Dark Time, And Then … A Pivot? How Silicon Valley Rewrote America’s Redemption Narrative, Aeon (May 2, 2016)

[15] The Coming-Of-Age Con, op. cit.

[16] Silicon Phoenix, op. cit.

[17] Silicon Phoenix, op. cit.

[18] Silicon Phoenix, op. cit.

[19] The Coming-Of-Age Con, op. cit.

[20] Silicon Phoenix, op. cit.

[21] Silicon Phoenix, op. cit.

[22] The Power of Story, op. cit.

[23] To Boost Your Self-Esteem, Write About Chapters of Your Life. Aeon (Apr. 5, 2019)

[24] Ibid.

Belief in Belief

ya gotta believe

New York Mets fans at the 1973 World Series
(they lost)

The quest to resolve the consciousness hard problem needs a boost from quantum mechanics to get any further. Either that, or there needs to be a better way to state the issue. As things stand, neuroscience’s inability to locate subjectivity in our brain matter gives pro-subjectivity the right to cite quantum mechanics as its go-to scientific justification.

The $12 billion self-help industry and its coaches, speakers, and authors love quantum mechanics: if subjectivity works on a sub-atomic level, the argument goes, then why not apply it on a macro, conscious level? Meanwhile, quantum scientists seem to have resigned themselves to the notion that, if their theories don’t have to be grounded in traditional objective standards like empirical testing and falsifiability, then why not hypothesize about multiverses and call that science?

Thus scientific rationalism continues to be on the wane — in science and as a way of life — especially in the USA, where belief in belief has been an ever-expanding feature of the American Way since we got started. To get the full perspective on America’s belief in belief, you need to read Kurt Andersen’s book, Fantasyland: How America Went Haywire, a 500-Year History (2017), which I quoted at length last time. (Or for the short version, see this Atlantic article.) The book provides a lot of history we never learned, but also reveals that the roots of our belief in belief go back even further than our own founding, and beyond our own shores. Although we weren’t founded as a Christian nation[1] (in the same way, for example, that Pakistan was expressly founded as a Muslim nation), Andersen traces this aspect of our ideological foundations to the Protestant Reformation:

“[Luther] insisted that clergymen have no special access to God or Jesus or truth. Everything a Christian needed to know was in the Bible. So every individual Christian believer could and should read and interpret Scripture for himself or herself. Every believer, Protestants said, was now a priest.

“Apart from devolving religious power to ordinary people — that is, critically expanding individual liberty — Luther’s other big idea was that belief in the Bible’s supernatural stories, especially those concerning Jesus, was the only prerequisite for being a Christian. You couldn’t earn your way into Heaven by performing virtuous deeds. Having a particular set of beliefs was all that mattered.

“However, out of the new Protestant religion, a new proto-American attitude emerged during the 1500s. Millions of ordinary people decided that they, each of them, had the right to decide what was true or untrue, regardless of what fancy experts said. And furthermore, they believed, passionate fantastical belief was the key to everything. The footings for Fantasyland had been cast.”

But even the Protestant Reformation isn’t back far enough. Luther’s insistence that anybody can get all the truth they need from the Bible is the Christian doctrine of sola scriptura, which holds that the Bible is the ultimate source of truth. And the Bible is where we find the original endorsement of the primacy of belief, in the teachings of none other than Jesus himself:

“Truly, I say to you, whoever says to this mountain, ‘Be taken up and thrown into the sea,’ and does not doubt in his heart, but believes that what he says will come to pass, it will be done for him.”

Mark 11:23 (ESV)

Thus, the Christian rationale for belief in belief goes something like this:

  • “We believe the Bible tells the truth;
  • “The Bible says Jesus was God incarnate;
  • “God knows what’s true;
  • “Jesus, as God, spoke truth;
  • “Therefore, what Jesus said about belief is true.”

The rationale begins and ends in belief. Belief is a closed loop — you either buy it by believing, or you don’t. And if you believe, you don’t doubt or question, because if you do, belief won’t work for you, and it will be your own fault — you’ll be guilty of doubting in your heart or some other kind of sabotage. For example,

“If any of you lacks wisdom, let him ask God, who gives generously to all without reproach, and it will be given him. 6 But let him ask in faith, with no doubting, for the one who doubts is like a wave of the sea that is driven and tossed by the wind. 7 For that person must not suppose that he will receive anything from the Lord; 8 he is a double-minded man, unstable in all his ways.”

James 1:5-8 (ESV)

Thus belief disposes of every criticism against it. You’re either in or out, either with us or against us. Or, as a friend of mine used to say, “The Bible says it, I believe it, and that settles it!” And if your doubts persist, there are consequences. When I expressed some of mine back in college, the same friend handed me a Bible and said, “Read Luke 9:62.”

“Jesus said to him, ‘No one who puts his hand to the plow and looks back is fit for the kingdom of God.’”

Luke 9:62 (ESV)

End of discussion.

But not here, not in this blog. Here, our mission is to challenge cherished beliefs and institutions. Here, we’ll look more into what it means to believe in belief, and consider other options. In the meantime, we’ll set aside the hard problem of consciousness while we wait for further developments.

For more on today’s topic, you might take a look at Should We Believe In Belief? (The Guardian, July 17, 2009), and be sure to click the links at the end and read those pieces, too. All the articles are short and instructive.

[1] For a detailed consideration (and ultimate refutation) of the claim that America was founded as a Christian nation, see The Founding Myth, by Andrew L. Seidel (2019).

How Impossible Becomes Possible (2)

While objective, scientific knowledge scrambles to explain consciousness in purely biological terms (“the meat thinks”), subjective belief enjoys cultural and scientific predominance. And no wonder — the allure of subjectivity is freedom and power: if scientists can control the outcome of their quantum mechanics lab work by what they believe, then surely the rest of us can also believe the results we want into existence. In fact, isn’t it true that we create our own reality, either consciously or not? If so, then consciously is better, because that way we’ll get what we intend instead of something mashed together by our shady, suspect subconscious. And the good news is, we can learn and practice conscious creation. Put that to work, and we can do and have and be whatever we want! Nothing is impossible for us!

I.e., belief in belief is the apex of human consciousness and self-efficacy: it’s what makes the impossible possible. At least, that’s the self-help gospel, which also has deep roots in the New Testament. We’ll be looking deeper into both.

The Music Man lampooned belief in belief as practiced in con man Harold Hill’s “think method.” The show came out in 1957. Five years before, the Reverend Norman Vincent Peale published The Power of Positive Thinking, and twenty years before, Napoleon Hill published Think and Grow Rich, in which he penned its most-quoted aphorism, “Whatever your mind can conceive and believe, it can achieve.”

Americans in particular have had an enduring allegiance to belief in belief, ever since we got started 500 years ago. Since then, we’ve taken it to ever-increasing extremes:

“The American experiment, the original embodiment of the great Enlightenment idea of intellectual freedom, whereby every individual is welcome to believe anything she wishes, has metastasized out of control. In America nowadays, those more exciting parts of the Enlightenment idea have swamped the sober, rational, empirical parts. Little by little for centuries, then more and more and faster and faster during the past half century, we Americans have given ourselves over to all kinds of magical thinking, anything-goes relativism, and belief in fanciful explanation.

“Why are we like this?

“The short answer is because we’re Americans—because being American means we can believe anything we want; that our beliefs are equal or superior to anyone else’s, experts be damned. Once people commit to that approach, the world turns inside out, and no cause-and-effect connection is fixed. The credible becomes incredible and the incredible credible.

“America was created by true believers and passionate dreamers, and by hucksters and their suckers, which made America successful—but also by a people uniquely susceptible to fantasy, as epitomized by everything from Salem’s hunting witches to Joseph Smith’s creating Mormonism, from P. T. Barnum to speaking in tongues, from Hollywood to Scientology to conspiracy theories, from Walt Disney to Billy Graham to Ronald Reagan to Oprah Winfrey to Trump. In other words: Mix epic individualism with extreme religion; mix show business with everything else; let all that ferment for a few centuries; then run it through the anything-goes ’60s and the internet age. The result is the America we inhabit today.”

Fantasyland

Belief in belief soared to new heights in the mega-bestseller The Secret:

“The Secret takes the American fundamentals, individualism and supernaturalism and belief in belief, and strips away the middlemen and most of the pious packaging…. What’s left is a ‘law of attraction,’ and if you just crave anything hard enough, it will become yours. Belief is all. The Secret’s extreme version of magical thinking goes far beyond its predecessors’. It is staggering. A parody would be almost impossible. It was number one on the Times’s nonfiction list for three years and sold around twenty million copies.”

Fantasyland: How America Went Haywire, a 500-Year History, Kurt Andersen (2017)

American culture’s embrace of belief in belief was supercharged in its earliest days by the Puritans, about whom Kurt Andersen concludes, “In other words, America was founded by a nutty religious cult.” Maybe that’s why The Secret distanced itself from those Christian moorings:

“The closest antecedent to The Secret was The Power of Positive Thinking in the 1950s, back when a mega-bestselling guide to supernatural success still needed an explicit tether to Christianity.

“In The Secret, on the other hand, Rhonda Byrne mentions Jesus only once, as the founder of the prosperity gospel. All the major biblical heroes, including Christ, she claims, ‘were not only prosperity teachers, but also millionaires themselves, with more affluent lifestyles than many present-day millionaires could conceive of.’”

Fantasyland

The Secret also stakes its claim on the side of subjective science:

“‘There isn’t a single thing you cannot do with this knowledge,’ the book promises. ‘It doesn’t matter who you are or where you are. The Secret can give you whatever you want. Because it’s a scientific fact.’”

Fantasyland

But The Secret is just one example of the subjective good news. Believe it into existence — that’s how the impossible is done, American, self-help, Christian, subjective-science style. Never mind the objective, empirically verified, scientific “adjacent possibility” approach we looked at last time — that’s just too stuffy, too intellectual. Belief in belief is much more inspiring, more of a joyride.

And that’s a problem.

More next time.

How Impossible Becomes Possible

active nerve cell in human neural system

network

Scientific materialism explains a lot about how the brain creates consciousness, but hasn’t yet fully accounted for subjective awareness. As a result, the “hard problem” of consciousness remains unsolved, and we’re alternately urged to either concede that the human brain just isn’t ready to figure itself out, or conclude that reality is ultimately determined subjectively.

Princeton psychology and neuroscience professor Michael S. A. Graziano isn’t ready to do either. He thinks the “hard problem” label is itself the problem, because it cuts off further inquiry:

“Many thinkers are pessimistic about ever finding an explanation of consciousness. The philosopher Chalmers, in 1995, put it in a way that has become particularly popular. He suggested that the challenge of explaining consciousness can be divided into two problems. One, the easy problem, is to explain how the brain computes and stores information. Calling this problem easy is, of course, a euphemism. What is meant is something more like the technically possible problem given a lot of scientific work.

“In contrast, the hard problem is to explain how we become aware of all that stuff going on in the brain. Awareness itself, the essence of awareness, because it is presumed to be nonphysical, because it is by definition private, seems to be scientifically unapproachable. Again, calling it the hard problem is a euphemism; it is the impossible problem.

“The hard-problem view has a pinch of defeatism in it. I suspect that for some people it also has a pinch of religiosity. It is a keep-your-scientific-hands-off-my-mystery perspective. In the hard problem view, rather than try to explain consciousness, we should marvel at its insolubility. We have no choice but to accept it as a mystery.

“One conceptual difficulty with the hard-problem view is that it argues against any explanation of consciousness without knowing what explanations might arise. It is difficult to make a cogent argument against the unknown. Perhaps an explanation exists such that, once we see what it is, once we understand it, we will find that it makes sense and accounts for consciousness.”

Consciousness and the Social Brain, by Michael S. A. Graziano (2013)

I.e., if science is going to explain consciousness, it needs to reframe its inquiry, so that what is now an “impossible,” “scientifically unapproachable” problem becomes a “technically possible problem” that can be solved “given a lot of scientific work.”

Technology and innovation writer Steven Johnson describes how he thinks the impossible becomes possible in Where Good Ideas Come From — available as a TED talk, a book, and an animated whiteboard video on YouTube. In his TED talk, he contrasted popular subjective notions with what neuroscience has discovered about how the brain actually works:

“[We] have to do away with a lot of the way in which our conventional metaphors and language steer us towards certain concepts of idea-creation. We have this very rich vocabulary to describe moments of inspiration. We have … the flash of insight, the stroke of insight, we have epiphanies, we have ‘eureka!’ moments, we have the lightbulb moments… All of these concepts, as kind of rhetorically florid as they are, share this basic assumption, which is that an idea is a single thing, it’s something that happens often in a wonderful illuminating moment.

“But in fact, what I would argue is … that an idea is a network on the most elemental level. I mean, this is what is happening inside your brain. An idea — a new idea — is a new network of neurons firing in sync with each other inside your brain. It’s a new configuration that has never formed before. And the question is, how do you get your brain into environments where these new networks are going to be more likely to form?”

Johnson expands on the work of biologist and complex systems researcher Stuart Kauffman, who dubbed this idea the “adjacent possible.” The adjacent possible is where the brain’s neural networks (top picture above) meet data networks (the bottom picture): neither is a static, closed environment; both are dynamic, constantly shifting and re-organizing, with each node representing a new point from which the network can expand. Thus the shift from unknown to known is always a next step away:

“The adjacent possible is a kind of shadow future, hovering on the edges of the present state of things, a map of all the ways in which the present can reinvent itself.”

Vittorio Loreto and his colleagues at Sapienza University of Rome turned the adjacent possible into a mathematical model, which they then submitted to objective, empirical, real-world testing. As he said in his TED talk:

“Experiencing the new means exploring a very peculiar space, the space of what could be, the space of the possible, the space of possibilities.

“We conceived our mathematical formulation for the adjacent possible, 20 years after the original Kauffman proposals.

“We had to work out this theory, and we came up with a certain number of predictions to be tested in real life.”

Their test results suggest that the adjacent possible is good science — that the impossible doesn’t step out of the ether, it waits at the edge of expanding neural networks, ready to become possible.[1] As Steven Johnson said above, that’s a far cry from our popular romantic notions of revelations, big ideas, and flashes of brilliance. We’ll look more at those next time.
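Loreto’s published formulation is more elaborate than anything that fits here, but the flavor of an expanding adjacent possible can be sketched as a toy simulation: an urn model with “triggering,” in which drawing something never seen before enlarges the set of things that can be drawn next. The function name and parameter values below are illustrative assumptions, not Loreto’s actual code:

```python
import random

def urn_with_triggering(steps, rho=2, nu=1, seed=0):
    """Toy sketch of the adjacent possible (illustrative, not Loreto's model).

    Each draw reinforces the familiar; each first-time draw (a novelty)
    adds brand-new, never-drawn elements to the urn -- the adjacent
    possible expanding one step past the edge of the known.
    """
    rng = random.Random(seed)
    urn = [0, 1]            # the initially known elements
    next_id = 2             # ids not yet in the urn: the current frontier
    seen = set()
    novelties = []          # cumulative count of distinct elements drawn
    for _ in range(steps):
        ball = rng.choice(urn)
        urn.extend([ball] * rho)        # reinforcement: the familiar gets likelier
        if ball not in seen:            # a novelty opens new adjacent territory
            seen.add(ball)
            urn.extend(range(next_id, next_id + nu + 1))
            next_id += nu + 1
        novelties.append(len(seen))
    return novelties

counts = urn_with_triggering(2000)
```

Run long enough, the count of distinct elements keeps climbing, but ever more slowly: novelty never stops arriving, yet each new thing becomes reachable only after its neighbors in the space of possibilities have been reached first, which is the general pattern Loreto’s group tested against real-world data.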

[1] For a nerdier version, see this Wired piece: The ‘Adjacent Possible’ of Big Data: What Evolution Teaches About Insights Generation.

So Consciousness Has a Hard Problem… Now What?

god helmet

We’ve been looking at the “hard problem” of consciousness:

  • Neuroscience can identify the brain circuits that create the elements of consciousness and otherwise parse out how “the meat thinks,” but it can’t quite get its discoveries all the way around the mysteries of subjective experience.
  • That’s a problem because we’re used to thinking along Descartes’ dualistic distinction between scientific knowledge, which is objective, empirical, and invites disproving, and belief-based conviction, which is subjective, can’t be tested and doesn’t want to be.
  • What’s worse, science’s recent work in quantum mechanics, artificial intelligence, and machine learning has blurred those dualistic lines by exposing the primacy of subjectivity even in scientific inquiry.
  • All of which frustrates our evolutionary survival need to know how the world really works.[1]

Some people are ready to declare that subjective belief wins, and science will just have to get over it. That’s what happened with the “God Helmet” (shown in the photo above, taken from this article). Dr. Michael Persinger[2] created the helmet for use in neuro-religious research:

“This is a device that is able to simulate religious experiences by stimulating an individual’s tempoparietal lobes using magnetic fields. ‘If the equipment and the experiment produced the presence that was God, then the extrapersonal, unreachable, and independent characteristics of the god definition might be challenged,’ [says Dr. Persinger].” [3]

The God Helmet creates subjective experiences shared among various religions, such as sensing a numinous presence, a feeling of being filled with the spirit or overwhelmed or possessed, of being outside of self, out of body, or having died and come back to life, feelings of being one with all things or of peace, awe, fear and dread, etc. Since all of these states have been either measured or induced in the laboratory, you’d think that might dampen allegiance to the belief that they are God-given, but not so. Instead, when the God Helmet was tested on a group of meditating nuns, their conclusion was, how wonderful that God equipped the brain in that way, so he could communicate with us. Similarly,

 “Some years ago, I discussed this issue with Father George Coyne, a Jesuit priest and astronomer who was then Director of the Vatican Observatory. I asked him what he thought of the notion that when the 12th‑century Hildegard of Bingen was having her visions of God, perhaps she was having epileptic fits. He had no problem with the fits. Indeed, he thought that when something so powerful was going on in a mind, there would necessarily be neurological correlates. Hildegard might well have been an epileptic, Father Coyne opined; that didn’t mean God wasn’t also talking to her.”

The Mental Block – Consciousness Is The Greatest Mystery In Science. Aeon Magazine (Oct. 9, 2013)

If we’re not willing to concede the primacy of subjectivity, then what? Well, we could give up on the idea that the human race is equipped to figure out everything it would really like to know.

 “It would be poetic – albeit deeply frustrating – were it ultimately to prove that the one thing the human mind is incapable of comprehending is itself. An answer must be out there somewhere. And finding it matters: indeed, one could argue that nothing else could ever matter more – since anything at all that matters, in life, only does so as a consequence of its impact on conscious brains. Yet there’s no reason to assume that our brains will be adequate vessels for the voyage towards that answer. Nor that, were we to stumble on a solution to the Hard Problem, on some distant shore where neuroscience meets philosophy, we would even recognise that we’d found it.”

Why Can’t The World’s Greatest Minds Solve The Mystery Of Consciousness? The Guardian (Jan. 21, 2015)

“Maybe philosophical problems are hard not because they are divine or irreducible or workaday science, but because the mind of Homo sapiens lacks the cognitive equipment to solve them. We are organisms, not angels, and our minds are organs, not pipelines to the truth. Our minds evolved by natural selection to solve problems that were life-and-death matters to our ancestors, not to commune with correctness or to answer any question we are capable of asking. We cannot hold ten thousand words in short-term memory. We cannot see in ultraviolet light. We cannot mentally rotate an object in the fourth dimension. And perhaps we cannot solve conundrums like free will and sentience.”

How the Mind Works, Steven Pinker (1997)

Evolutionary biologist David Barash attributes our inability to solve such problems to the vastly different paces of biological evolution (what the operative biology of our brains can process) and cultural evolution (what we keep learning, inventing, and hypothesizing). Trouble is, the latter moves far too fast for the former to keep up.

“On the one hand, there is our biological evolution, a relatively slow-moving organic process that can never proceed more rapidly than one generation at a time, and that nearly always requires an enormous number of generations for any appreciable effect to arise.

“On the other hand is cultural evolution, a process that is, by contrast, extraordinary in its speed.

“Whereas biological evolution is Darwinian, moving by the gradual substitution and accumulation of genes, cultural evolution is … powered by a nongenetic ‘inheritance’ of acquired characteristics. During a single generation, people have selectively picked up, discarded, manipulated, and transmitted cultural, social, and technological innovations that have become almost entirely independent of any biological moorings.

“We are, via our cultural evolution, in over our biological heads.”

Through a Glass Brightly: Using Science to See Our Species as We Really Are, David P. Barash (2018)

Give in to subjectivity, or just give up…. We’ll look at another option next time.

[1] The study of how we know things is Epistemology.

[2] Dr. Persinger was director of the Neuroscience Department at Laurentian University in Ontario, Canada prior to his death in 2018.

[3] “What God Does To Your Brain:  The controversial science of neurotheology aims to find the answer to an age-old question: why do we believe?” The Telegraph (June 20, 2014).

Subjective Science


What happened to spark all the recent scientific interest in looking for consciousness in the brains of humans and animals, in insects, and … well, everywhere? (Including not just the universe, but also the theoretical biocentric universe and quantum multiverses.)

“It has been said that, if the 20th century was the age of physics, the 21st will be the age of the brain. Among scientists today, consciousness is being hailed as one of the prime intellectual challenges. My interest in the subject is not in any particular solution to the origin of consciousness – I believe we’ll be arguing about that for millennia to come – but rather in the question: why is consciousness perceived as a ‘problem’? How exactly did it become a problem? And given that it was off the table of science for so long, why is it now becoming such a hot research subject?”

I Feel Therefore I Am — How Exactly Did Consciousness Become A Problem? And why, after years off the table, is it a hot research subject now? Aeon Magazine (Dec. 1, 2015)

From what I can tell, two key sparks started the research fire:  (1) the full implications of quantum mechanics finally set in, and (2) machines learned how to learn.

(1)  Quantum Mechanics: Science Goes Subjective. Ever since Descartes set up his dualistic reality a few hundred years ago, we’ve been able to trust that science could give us an objective, detached, rational, factual view of the observable universe, while philosophy and religion explored the invisible universe where subjectivity reigns. But the handy boundary between the two was torn in the early 20th Century, when quantum mechanics found that subjectivity reigns at the subatomic level too, where what manifests depends on what researchers decide ahead of time to look for. Scientists spent the rest of the 20th Century trying to restore objectivity to their subatomic lab work, but eventually had to concede defeat.

 “Physicists began to realise that consciousness might after all be critical to their own descriptions of the world. With the advent of quantum mechanics they found that, in order to make sense of what their theories were saying about the subatomic world, they had to posit that the scientist-observer was actively involved in constructing reality.

“At the subatomic level, reality appeared to be a subjective flow in which objects sometimes behave like particles and other times like waves. Which facet is manifest depends on how the human observer is looking at the situation.

“Such a view appalled many physicists, who fought desperately to find a way out, and for much of the 20th century it still seemed possible to imagine that, somehow, subjectivity could be squeezed out of the frame, leaving a purely objective description of the world.

“In other words, human subjectivity is drawing forth the world.”

I Feel Therefore I Am
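The wave-versus-particle dependence on observation that the quote describes can be seen in a toy two-path calculation. This is a generic textbook illustration with made-up numbers, not anything from the article: when the path isn’t observed, complex amplitudes add before squaring (producing interference); when the path is observed, probabilities add separately and the interference vanishes.

```python
import cmath

# Two possible paths to the same detector, equal weight,
# opposite phase (illustrative numbers only).
a1 = cmath.exp(1j * 0.0) / 2 ** 0.5       # amplitude via path 1
a2 = cmath.exp(1j * cmath.pi) / 2 ** 0.5  # amplitude via path 2

# No which-path observation: amplitudes add first, then square.
p_unobserved = abs(a1 + a2) ** 2          # ~0.0, destructive interference

# Which-path observation: each path's probability counts separately.
p_observed = abs(a1) ** 2 + abs(a2) ** 2  # ~1.0, interference gone
```

Same setup, two different answers, depending entirely on how the experimenter decides to look.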

(2)  Machines Learned to Learn. Remember “garbage in, garbage out”? It used to be that computers had to be supervised — they only did what we told them to do, and could only use the information we gave them. Not anymore. Now their “minds” are free to sort through the garbage on their own and make up their own rules about what to keep or throw out. Because of this kind of machine learning, we now have computers practicing law and medicine, handling customer service, writing the news, composing music, writing novels and screenplays, creating art… all the things we used to think needed human judgment and feelings. Google wizard and overall overachiever Sebastian Thrun[1] explains the new machine learning in this conversation with TED Curator Chris Anderson:

 “Artificial intelligence and machine learning is about 60 years old and has not had a great day in its past until recently. And the reason is that today, we have reached a scale of computing and datasets that was necessary to make machines smart. The new thing now is that computers can find their own rules. So instead of an expert deciphering, step by step, a rule for every contingency, what you do now is you give the computer examples and have it infer its own rules.

 “20 years ago the computers were as big as a cockroach brain. Now they are powerful enough to really emulate specialized human thinking. And then the computers take advantage of the fact that they can look at much more data than people can.”
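Thrun’s “give the computer examples and have it infer its own rules” can be sketched in a few lines. This toy perceptron (my illustration, not Thrun’s code) is handed labeled examples of logical AND and never the rule itself; it nudges its weights until its own inferred rule reproduces the labels:

```python
# A minimal sketch of rule inference from examples: nobody hand-codes
# the AND rule below -- the loop discovers weights that reproduce it.

def train_perceptron(examples, epochs=20, lr=0.1):
    """examples: list of ((x1, x2), label) pairs with label in {0, 1}."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in examples:
            pred = 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0
            err = label - pred            # -1, 0, or +1
            w1 += lr * err * x1           # nudge weights toward the answer
            w2 += lr * err * x2
            b += lr * err
    return w1, w2, b

# Only examples are given; the "rule" (logical AND) is never written down.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = train_perceptron(data)

def predict(x1, x2):
    return 1 if (w1 * x1 + w2 * x2 + b) > 0 else 0
```

The same loop, scaled up by many orders of magnitude in data and parameters, is the idea behind the systems Thrun describes.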

No wonder science got rattled. Like the rest of us, it was comfortable with all the Cartesian dualisms that kept the world neatly sorted out: science vs. religion,[2] objective vs. subjective, knowledge vs. belief, humanity vs. technology…. But now all these opposites are blurring together in a subjective vortex while non-human intelligence looks on and comments on it.

Brave New World, indeed. How shall we respond to it?

More next time.

[1] Sebastian Thrun’s TED bio describes him as “an educator, entrepreneur and troublemaker. After a long life as a professor at Stanford University, Thrun resigned from tenure to join Google. At Google, he founded Google X, home to self-driving cars and many other moonshot technologies. Thrun also founded Udacity, an online university with worldwide reach, and Kitty Hawk, a ‘flying car’ company. He has authored 11 books, 400 papers, holds 3 doctorates and has won numerous awards.”

[2] For an alternative to the science-religion dualism, see Science + Religion:  The science-versus-religion opposition is a barrier to thought. Each one is a gift, rather than a threat, to the other, Aeon Magazine (Nov. 21, 2019)

 

Zombies and the Consciousness Hard Problem

[Images: poster from the 1968 movie Night of the Living Dead; The Walking Dead, https://comicbook.com/thewalkingdead]

Philosophers and psychologists use “qualia” for the felt qualities of human experience, such as feelings, conscience, and self-awareness, and believe that, if zombies can lack them but still look and act like us (on a really bad day), then locating consciousness entirely in human biology (“physicalism”) can’t be right.

“Physicalism allows us to imagine a world without consciousness, a ‘Zombie world’ that looks exactly like our own, peopled with beings who act exactly like us but aren’t conscious. Such Zombies have no feelings, emotions or subjective experience; they live lives without qualia. As [philosopher David Chalmers][1] has noted, there is literally nothing it is like to be a Zombie. And if Zombies can exist in the physicalist account of the world, then, according to Chalmers, that account can’t be a complete description of our world, where feelings do exist: something more is needed, beyond the laws of nature, to account for conscious subjective experience.”

I Feel Therefore I Am, Aeon Magazine Dec. 1, 2015

To physicalists, says the article, “those are fighting words, and some scientists are fighting back”:

“In the frontline are the neuroscientists who, with increasing frequency, are proposing theories for how subjective experience might emerge from a matrix of neurons and brain chemistry. A slew of books over the past two decades have proffered solutions to the ‘problem’ of consciousness. Among the best known are Christof Koch’s The Quest for Consciousness: A Neurobiological Approach (2004); Giulio Tononi and Gerald Edelman’s A Universe of Consciousness: How Matter Becomes Imagination (2000); Antonio Damasio’s The Feeling of What Happens: Body and Emotion in the Making of Consciousness (1999); and the philosopher Daniel Dennett’s bluntly titled Consciousness Explained (1991).”

Of particular interest in that battery of academic firepower is Daniel Dennett, who has a unique take on Zombies and the consciousness “hard problem”:

“Not everybody agrees there is a Hard Problem to begin with – making the whole debate kickstarted by Chalmers an exercise in pointlessness. Daniel Dennett, the high-profile atheist and professor at Tufts University outside Boston, argues that consciousness, as we think of it, is an illusion: there just isn’t anything in addition to the spongy stuff of the brain, and that spongy stuff doesn’t actually give rise to something called consciousness.

“Common sense may tell us there’s a subjective world of inner experience – but then common sense told us that the sun orbits the Earth, and that the world was flat. Consciousness, according to Dennett’s theory, is like a conjuring trick: the normal functioning of the brain just makes it look as if there is something non-physical going on.

“To look for a real, substantive thing called consciousness, Dennett argues, is as silly as insisting that characters in novels, such as Sherlock Holmes or Harry Potter, must be made up of a peculiar substance named “fictoplasm”; the idea is absurd and unnecessary, since the characters do not exist to begin with.

“This is the point at which the debate tends to collapse into incredulous laughter and head-shaking: neither camp can quite believe what the other is saying. To Dennett’s opponents, he is simply denying the existence of something everyone knows for certain: their inner experience of sights, smells, emotions and the rest. (Chalmers has speculated, largely in jest, that Dennett himself might be a Zombie.)

“More than one critic of Dennett’s most famous book, Consciousness Explained, has joked that its title ought to be Consciousness Explained Away. Dennett’s reply is characteristically breezy: explaining things away, he insists, is exactly what scientists do… However hard it feels to accept, we should concede that consciousness is just the physical brain, doing what brains do.”

Why Can’t The World’s Greatest Minds Solve The Mystery Of Consciousness? The Guardian (Jan. 21, 2015)

Zombies also appear in another current scientific inquiry: whether artificially intelligent machines can be conscious. “Who’s to say machines don’t already have minds?” asks this article.[2] If they do, then “we need a better way to define and test for consciousness,” but formulating one means you “still face what you might call the Zombie problem.” (Oh great — so a machine could be a Zombie too, as if there weren’t enough of them already.)

Suppose you create a test to detect human qualia in machines and weed out the Zombies. Who’s going to believe it if the test comes back positive?

“Suppose a test finds that a thermostat is conscious. If you’re inclined to think a thermostat is conscious, you will feel vindicated. If sentient thermostats strike you as silly, you will reject the verdict. In that case, why bother conducting the test at all?”

Consciousness Creep

And if conscious thermostats aren’t enough to make you “collapse into incredulous laughter and head-shaking,” then how about finding consciousness in … insects? Turns out, they, too, have a Zombie problem, according to this article, co-written by a biologist and a philosopher.[3]

What happened to science that it’s tackling these issues, and with a straight face? I promised last time we’d look into that. We’ll do that next.

[1] As we saw last time, David Chalmers defined the “easy” and “hard” problems of consciousness.

[2] Consciousness Creep: Our machines could become self-aware without our knowing it, Aeon Magazine (Feb. 25, 2016)

[3] Bee-Brained: Are Insects ‘Philosophical Zombies’ With No Inner Life? Close attention to their behaviours and moods suggests otherwise, Aeon Magazine (Sept. 27, 2018)

The Greatest Unsolved Mystery


Academic disciplines take turns being more or less in the public eye — although, as we saw a couple of posts back, metaphysicians think their discipline ought to be the perennial front runner. After all, it’s about figuring out “the real nature of things,”[1] and what could be more important than that?

Figuring out the human mind that’s doing the figuring, that’s what![2] Thus neuroscience’s quest to understand human consciousness finds itself at the front of the line as the greatest unsolved scientific mystery of our time.

“Nearly a quarter of a century ago, Daniel Dennett wrote that: ‘Human consciousness is just about the last surviving mystery.’ A few years later, [David] Chalmers added: ‘[It] may be the largest outstanding obstacle in our quest for a scientific understanding of the universe.’ They were right then and, despite the tremendous scientific advances since, they are still right today.

“I think it is possible that, compared with the hard problem [of consciousness], the rest of science is a sideshow. Until we get a grip on our own minds, our grip on anything else could be suspect. The hard problem is still the toughest kid on the block.”

The Mental Block – Consciousness Is The Greatest Mystery In Science, Aeon Magazine (Oct. 9, 2013)

“Hard problem” is a term of art in the consciousness quest:

“The philosopher [David] Chalmers … suggested that the challenge of explaining consciousness can be divided into two problems.

“One, the easy problem, is to explain how the brain computes and stores information. Calling this problem easy is, of course, a euphemism. What is meant is something more like the technically possible problem given a lot of scientific work.

“In contrast, the hard problem is to explain how we become aware of all that stuff going on in the brain. Awareness itself, the essence of awareness, because it is presumed to be nonphysical, because it is by definition private, seems to be scientifically unapproachable.”

Consciousness and the Social Brain, Michael S. A. Graziano (2013)

Solving the “easy” problem requires objective, empirical inquiry into how our brains are organized and wired, what brain areas and neural circuits process which kinds of experience, how they all share relevant information, etc. Armed with MRIs and other technologies, neuroscience has made great progress on all that. What it can’t seem to get its instruments around is the personal and private subjective interpretation of the brain’s objective processing of experience.

“First coined in 1995 by the Australian philosopher David Chalmers, this ‘hard problem’ of consciousness highlights the distinction between registering and actually feeling a phenomenon. Such feelings are what philosophers refer to as qualia: roughly speaking, the properties by which we classify experiences according to ‘what they are like’. In 2008, the French thinker Michel Bitbol nicely parsed the distinction between feeling and registering by pointing to the difference between the subjective statement ‘I feel hot’, and the objective assertion that ‘The temperature of this room is higher than the boiling point of alcohol’ – a statement that is amenable to test by thermometer.”

I Feel Therefore I Am, Aeon Magazine (Dec. 1, 2015)

Neuroscience does objective just fine, but meets its match with subjective.

“The question of how the brain produces the feeling of subjective experience, the so-called ‘hard problem’, is a conundrum so intractable that one scientist I know refuses even to discuss it at the dinner table. Another, the British psychologist Stuart Sutherland, declared in 1989 that ‘nothing worth reading has been written on it’.”

The Mental Block – Consciousness Is The Greatest Mystery In Science.

Recently, though, neuroscience has brought new urgency to the hard problem:

“For long periods, it is as if science gives up on the subject in disgust. But the hard problem is back in the news, and a growing number of scientists believe that they have consciousness, if not licked, then at least in their sights.

“A triple barrage of neuroscientific, computational and evolutionary artillery promises to reduce the hard problem to a pile of rubble. Today’s consciousness jockeys talk of p‑zombies and Global Workspace Theory, mirror neurons, ego tunnels, and attention schemata. They bow before that deus ex machina of brain science, the functional magnetic resonance imaging (fMRI) machine.”

The Mental Block – Consciousness Is The Greatest Mystery In Science.

Impressive, but are they making progress? Not so much.

“Their work is frequently very impressive and it explains a lot. All the same, it is reasonable to doubt whether it can ever hope to land a blow on the hard problem.”

The Mental Block – Consciousness Is The Greatest Mystery In Science.

The quest to map and measure the “personalized feeling level” of consciousness has taken researchers to some odd places indeed — as we saw in the video featured last time. Zombies also feature prominently:

“All those tests still face what you might call the zombie problem. How do you know your uncle, let alone your computer, isn’t a pod person – a zombie in the philosophical sense, going through the motions but lacking an internal life? He could look, act, and talk like your uncle, but have no experience of being your uncle. None of us can ever enter another mind, so we can never really know whether anyone’s home.”

Consciousness Creep, Aeon Magazine (Feb. 25, 2016)

More about Zombies and other consciousness conundrums coming up, along with a look at what made consciousness shoot to the top of the unsolved scientific mysteries pile.

[1] Encyclopedia Britannica

[2] We’ll see later in this series what made illuminating the human mind so critical to science in general, not just neuroscience in particular.