Mirror, Mirror, on the Wall…


“If you were to ask the average person in the street about their self, they would most likely describe the individual who inhabits their body. They believe they are more than just their bodies. Their bodies are something their selves control. When we look in the mirror, we regard the body as a vessel we occupy.

“The common notion [is] that our self is an essential entity at the core of our existence that holds steady throughout our life. The ego experiences life as a conscious, thinking person with a unique historical background that defines who he or she is. This is the ‘I’ that looks back in the bathroom mirror and reflects who is the ‘me.’”

The Self Illusion:  How the Social Brain Creates Identity, Bruce Hood[1] (2012)

The idea that we are a self riding through life in a body is deeply ingrained in western thinking. Descartes gets most of the credit for it, but its religious and philosophical roots are much more ancient. (The same is true of the eastern, Buddhist idea that there’s no such thing as a self. We’ll talk origins another time.)

Descartes’ dualism has the curious effect of excusing us from thinking too closely about what we mean by it. It does this by locating the body in the physical, biological, natural world while placing the self in a transcendent realm that parallels the natural world but remains apart from it. The body, along with the rest of the natural world, is the proper subject of scientific inquiry, but the self and its ethereal realm remain inscrutable, the province of faith and metaphor, religion and the humanities. David P. Barash[2] captures the implications of this dichotomy in Through a Glass Brightly:  Using Science to See Our Species as We Really Are (2018):

“Science differs from theology and the humanities in that it is made to be improved on and corrected over time… By contrast, few theologians, or fundamentalist religious believers of any stripe, are likely to look kindly on ‘revelations’ that suggest corrections to their sacred texts. In 1768, Baron d’Holbach, a major figure in the French Enlightenment, had great fun with this. In his satire Portable Theology (written under the pen name Abbe Bernier, to hide from the censors), d’Holbach defined Religious Doctrine as ‘what every good Christian must believe or else be burned, be it in this world or the next. The dogmas of the Christian religion are immutable decrees of God, who cannot change His mind except when the Church does.’

“By contrast, science not only is open to improvement and modifications but also is to a large extent defined by this openness. Whereas religious practitioners who deviate from their traditions are liable to be derogated — and sometimes killed — for this apostasy …, science thrives on correction and adjustment, aiming not to enshrine received wisdom and tradition but to move its insights closer to correspondence with reality as found in the natural world.”

Attempts to bridge the realms of body and soul end up in pseudo-science, eventually discredited and often silly. Consider, for example, the ether (or sometimes “aether”) — a term that, since Plato and Aristotle, has been applied both to the rarefied air only the gods can breathe and to the stuff light moves through in interstellar space.[3]

You don’t need to be a poet or a prophet to think the self is inviolate. It’s just so obvious to most of us that there’s a self inside who watches and knows all about us — who in fact is us. We experience it as that never-silent internal voice — observing and commenting, often critiquing, sometimes shaming — that always seems to be accurate. We’ve been hearing it for as long as we can remember:  it’s embedded in our brain’s memory banks, all the way back to when we first started remembering things and using language to describe and record them.

We have always been aware that we are aware:
we don’t just observe, we observe ourselves observing.

Hence the belief that we are body and soul seems not worth challenging. Which is why, in keeping with this blog’s purpose, we’re going to do precisely that.

Cartesian dualism is foundational to self-awareness and to our cultural beliefs and institutions. It guides everything from religious faith to criminal culpability, health and wellbeing to mental illness. And so much more. As a result, taking a closer look will not only challenge our perceptions of what is real, it will shake reality itself. This inquiry won’t be easy. Again from The Self Illusion:

“Understanding that the self could be an illusion is really difficult… Our self seems so convincing, so real to us. But then again, many aspects of our experience are not what they seem.

“Psychologist Susan Blackmore makes the point that the word ‘illusion’ does not mean that it does not exist — rather an illusion is not what it seems. We all certainly experience some form of self, but what we experience is a powerful deception generated by our brains for our own benefit.”

That’s where we’re going. Hang tight.

[1] Bruce Hood is an experimental psychologist at the University of Bristol. He specializes in developmental cognitive neuroscience.

[2] David P. Barash is an evolutionary biologist and professor of psychology emeritus at the University of Washington.

[3] For a useful primer, see “The Eternal Quest for Aether, the Cosmic Stuff That Never Was,” Popular Mechanics (Oct. 19, 2018).

War – What is it Good For?



War, huh, yeah
What is it good for
Absolutely nothing

You might know the song — either the original 1970 Edwin Starr version or the Springsteen cover — so good it made the infamous Clear Channel post-9/11 no-play list. Click the links and have a listen — it’ll put you in the mood.

Owlcation provides the textbook explanation that wars are fought for economic or territorial gain; to further religious or nationalist interests; for self-defense or revenge; because of civil strife; or to bring about revolution. Those are rationalizations — things politicians and academicians say after the fact — but whether war is good for any of that is another issue. And it takes rare honesty to say we need war because it’s good for medicine, science, technology innovation, the economy, and the advance of civilization generally — all of which has been said.

Medicine

“For some historians, the Great War and the Second World War together form an ‘age of catastrophe’ or even one single war with a long break. The First World War also inaugurated a profound change beneath politics, in a realm largely hidden from journalism or military and political history. The Great War remade the human body itself.

“The doctors who identified this new human body saw an organism that organises itself, regulates itself, integrates itself, yet was extremely brittle. It was marked by fragility buried under the skin. It shattered easily, even worked against itself. The great number of injured and maimed bodies enabled doctors to create new kinds of medicine, physiology and psychiatry.

“Hints of this new conception of the body were present before the war, but when tens of thousands of soldiers returned with visible and invisible injuries, disordered hearts and broken psyches, it forced medicine to change too. Triage efforts on the battlefield had been sped up and regularised, and the entire front had become something of a giant medical laboratory for testing ideas and therapies. Many soldiers who, just a few decades earlier, would have died of their wounds now survived them. All of this changed the nature of the relationship between surgeons, physicians and psychiatrists, and patients. With survival, previously unknown pathologies emerged. The way in which medical scientists talked about the patient changed: they now described the patient’s body as an integral whole….”

“The Maimed And The Healing:  The Casualties Of The First World War Brought A New Understanding Of Human Fragility And Wholeness,” Aeon Magazine (Dec. 13, 2018)

Science, Innovation, And The Advancement Of Civilization

“Were he alive today, the seventeenth-century Dutch astronomer and mathematician Christiaan Huygens might tell us we’d be fools to think that ambitious undertakings in space can be achieved without massive military support. Back in the 1690s, as Huygens thought about life on Mars and the other planets then known to populate the night sky, he pondered how best to foster inventiveness. For him and his era, profit was a powerful incentive (capitalism was as yet unnamed) and conflict was a divinely endorsed stimulation of creativity:

It has so pleased God to order the Earth… that this mixture of bad Men with good, and Consequences of such a mixture as Misfortunes, Wars, Afflictions, Poverty, and the like, were given us for this very good end, viz. the exercising our Wits and Sharpening our Inventions, by forcing us to provide for our own necessary defenses against our Enemies.

“Yes, waging war requires clever thinking and promotes technical innovation. Not controversial. But Huygens can’t resist linking the absence of armed conflict with intellectual stagnation:

And if Men were to lead their whole Lives in an undisturbed continual Peace, in no fear of Poverty, no danger of War, I don’t doubt they would live little better than Brutes, without all knowledge and enjoyment of those Advantages that make our Lives pass on with pleasure and profit. We should want the wonderful Art of Writing if its great use and necessity in Commerce and war had not forc’d out the Invention. ‘Tis to these we owe our Art of Sailing, our Art of Sowing, and most of those Discoveries of which we are Masters; and almost all the secrets in experimental Knowledge.

“So it’s simple:  no war equals no intellectual ferment. Arm in arm with trade, says Huygens, war has served as the catalyst for literacy, exploration, agriculture, and science.”

Accessory to War:  The Unspoken Alliance Between Astrophysics and the Military, Neil deGrasse Tyson and Avis Lang (2018)

The Economy

“[In February 2009, just after the Great Recession of 2007-2008,] an international group of economists, officials, and academics met under the auspices of Columbia University’s Center on Capitalism and Society to discuss how the world might manage to emerge from its worse-than-usual financial crisis. The Center’s director, Nobel Laureate in economics Edmund Phelps, argued that some financial regulation was called for but stressed that it must not “discourag[e] funding for investment in innovation in the non-financial business sector, which has been the main source of dynamism in the U.S. economy.” What’s the non-financial business sector? Military spending, medical equipment, aerospace, computers, Hollywood films, music, and more military spending. For Phelps, dynamism and innovation go hand in hand with capitalism — and with war. Asked by a BBC interviewer for a “big thought” on the crisis and whether it constituted “a permanent indictment of capitalism,” he responded, “My big thought is, we desperately need capitalism in order to create interesting work to be done, for ordinary people — unless maybe we can go to war against Mars or something as an alternative.”

“A vibrant economy, in other words, depends on at least one of the following:  the profit motive, war on the ground, or war in space.”

Accessory to War, Tyson and Lang

Personally, I’m with the song’s last stanza —

Oh no, there’s got to be a better way
Say it again, there’s got to be a better way.

More coming up.

The Hostilities of Change:  Surprise, Death, and War

[Image: Storming of the Bastille]

“Ideas that require people to reorganize their picture of the world provoke hostility.”

Science historian James Gleick,
in his bestseller Chaos:  The Making of a New Science.

We looked last time at neuro-cultural resistance to change, and asked what it takes to overcome it.

It takes a paradigm shift — which, according to Merriam-Webster, is “an important change that happens when the usual way of thinking about or doing something is replaced by a new and different way.” Physicist and philosopher Thomas Kuhn coined the term in a work that was itself a paradigm shift in how we view the dynamics of change.

“The Kuhn Cycle is a simple cycle of progress described by Thomas Kuhn in 1962 in his seminal work The Structure of Scientific Revolutions… Kuhn challenged the world’s current conception of science, which was that it was a steady progression of the accumulation of new ideas. In a brilliant series of reviews of past major scientific advances, Kuhn showed this viewpoint was wrong. Science advanced the most by occasional revolutionary explosions of new knowledge, each revolution triggered by introduction of new ways of thought so large they must be called new paradigms. From Kuhn’s work came the popular use of terms like ‘paradigm,’ ‘paradigm shift,’ and ‘paradigm change.’”

Thwink.org

Our cultural point of view determines what we see and don’t see; it blinds us to new awareness and perspective. That’s why our visions of a “new normal” are often little more than uninspiring extrapolations of the past.[1] Paradigm shifts offer something more compelling:  they shock our consciousness so much that we never see things the same again; they stun us into abrupt about-faces. Without that shock, inertia keeps us moving in the direction we’re already going. If we even think of change, cognitive dissonance makes things uncomfortable, and if we go ahead with it anyway, things can get nasty in a hurry.

“People and systems resist change. They change only when forced to or when the change offers a strong advantage. If a person or system is biased toward its present paradigm, then a new paradigm is seen as inferior, even though it may be better. This bias can run so deep that two paradigms are incommensurate. They are incomparable because each side uses their own paradigm’s rules to judge the other paradigm. People talk past each other. Each side can ‘prove’ their paradigm is better.

“Writing in his chapter on The Resolution of Revolutions, Thomas Kuhn states that:

‘If there were but one set of scientific problems, one world within which to work on them, and one set of standards for their solution, paradigm competition might be settled more or less routinely by some process like counting the number of problems solved by each.

‘But in fact these conditions are never met. The proponents of competing paradigms are always at least slightly at cross-purposes. Neither side will grant all the non-empirical assumptions that the other needs in order to make its case.

‘Though each may hope to convert the other to his way of seeing his science and its problems, neither may hope to prove his case. The competition between paradigms is not the sort of battle that can be solved by proofs.’”

Thwink.org

What does it take to detonate a logjam-busting “revolutionary explosion of new knowledge”? Three possibilities:

The Element of Surprise.[2]  We’re not talking “Oh, that’s nice!” surprise. We’re talking blinding-flash-of-inspiration surprise — a eureka moment, a moment of truth, a defining moment — that changes everything forever, in a moment, in the twinkling of an eye. In religious terms, this is St. Paul’s conversion on the Damascus Road or St. Peter’s vision of extending the gospel to the gentiles. In those moments, both men became future makers, not future takers, embodying the view of another scientist and philosopher:

“The best way to predict the future is to create it.”[3]

A New Generation.  Without the element of surprise, paradigm shifts take a long time, if they happen at all.

“A new scientific truth does not triumph by convincing its opponents
and making them see the light, but rather because its opponents eventually die,
and a new generation grows up that is familiar with it.”[4]

In religious terms, that’s why the Exodus generation had to die off during 40 years in the wilderness, leaving a new generation for whom Moses’ new paradigm was the only one they’d ever known.

Violence.  Or, if the new paradigm’s champions can’t wait, they can resort to violence, brutality, persecution, war… the kinds of power-grabbing that have long polluted religion’s proselytizing legacy.

Surprise, death, violence… three ways to bring about a paradigm shift. That’s true in religion, science, or any other cultural institution.

More next time.

[1] Carl Richards, “There’s No Such Thing as the New Normal,” New York Times (December 20, 2010).

[2] Carl Richards, op. cit.

[3] The quote has been ascribed to a lot of different people, including Peter Drucker and computer scientist Alan Kay. But according to the Quote Investigator, “The earliest evidence appeared in 1963 in the book ‘Inventing the Future’ written by Dennis Gabor who was later awarded a Nobel Prize in Physics for his work in holography.”

[4] Max Planck, founder of quantum theory, in his Scientific Autobiography and Other Papers.

Religion on Demand

[Image: the “God Helmet”]

“Given that the neurological roots of religious experiences can be traced so accurately with the help of the latest neuroscientific technologies, does this mean that we could — in principle — ‘create’ these experiences on demand?”[1]

It’s a good question. And so is the obvious follow up: if technology can create religious experience on demand, how does that affect religion’s claims to authenticity and its status as a cultural institution?

Dr. Michael Persinger[2] created the “God Helmet” (shown in the photo above, taken from this article) for use in neuro-religious research.

This is a device that simulates religious experiences by stimulating an individual’s temporoparietal lobes with magnetic fields. “If the equipment and the experiment produced the presence that was God, then the extrapersonal, unreachable, and independent characteristics of the god definition might be challenged,” [says Dr. Persinger].[3]

The experiences created are not doctrinally specific, but are of a kind widely shared among different religions — for example, sensing a numinous presence, a feeling of being filled with the spirit or overwhelmed or possessed, of being outside of self, out of body, or having died and come back to life, feelings of being one with all things or of peace, awe, fear and dread, etc. All of these states have been measured or induced in the laboratory[4]:

Some recent advances in neuroimaging techniques allow us to understand how our brains ‘create’ a spiritual or mystical experience. What causes the feeling that someone else is present in the room, or that we’ve stepped outside of our bodies and into another dimension?

“In the last few years,” says [Dr. Jeff Anderson of the University of Utah School of Medicine in Salt Lake City], “brain imaging technologies have matured in ways that are letting us approach questions that have been around for millennia.”

Prof. James Giordano, from the Georgetown University Medical Center in Washington, D.C., [says that] “We are able to even understand when a person gets into ‘ecstasy mode’ … and to identify specific brain areas that participate in this process.”

“If ‘beings’ join the mystical experience,” Prof. Giordano goes on, “we can say that the activity of the left and right temporal lobe network (found at the bottom middle part of the cortex) has changed.”

 “When activity in the networks of the superior parietal cortex [which is a region in the upper part of the parietal lobe] or our prefrontal cortex increases or decreases, our bodily boundaries change,” Prof. Giordano explains in an interview for Medium. “These parts of the brain control our sense of self in relation to other objects in the world, as well as our bodily integrity; hence the ‘out of body’ and ‘extended self’ sensations and perceptions many people who have had mystical experiences confess to.”

The parietal lobes are also the areas that [Neuroscientist Andrew Newberg, a pioneer of neurotheology, has] found to have lower brain activity during prayer.

And much more. Research has also helped explain why people with chronic neurodegenerative diseases often lose their religion:

“We discovered a subgroup who were quite religious but, as the disease progressed, lost some aspects of their religiosity,” [says Patrick McNamara, professor of neurology at Boston University and author of The Neuroscience of Religious Experience (2009)]. Sufferers’ brains lack the neurotransmitter dopamine, making McNamara suspect that religiosity is linked to dopamine activity in the prefrontal lobes. “These areas of the brain handle complexity best, so it may be that people with Parkinson’s find it harder to access complex religious experiences.”

Does this research signal the end of religion any time soon? Probably not, says Dr. Newberg:

Until we gain such answers, however, religion is unlikely to go anywhere. The architecture of our brains won’t allow it, says Dr. Newberg, and religion fulfills needs that our brains are designed to have.[5]

Tim Crane, author of The Meaning of Belief: Religion From An Atheist’s Point Of View (2017), agrees:  religion, he says, is simply “too ingrained as a human instinct.” See also this article’s[6] analysis of the current state of the science-vs.-religion contention, which concludes that the scale seems to be tipping toward the latter:

Religion is not going away any time soon, and science will not destroy it. If anything, it is science that is subject to increasing threats to its authority and social legitimacy.

There are plenty of contrary opinions, of course, and all the technology and research in the world is unlikely to change anybody’s mind, pro or con. We’ll look at why not next time.

[1] “What Religion Does To Your Brain,” Medical News Today (July 20, 2018).

[2] Dr. Persinger was director of the Neuroscience Department at Laurentian University in Ontario, Canada prior to his death in 2018.

[3] “What God Does To Your Brain:  The controversial science of neurotheology aims to find the answer to an age-old question: why do we believe?” The Telegraph (June 20, 2014).

[4] “What Religion Does To Your Brain,” Medical News Today (July 20, 2018).

[5] Why God Won’t Go Away: Brain Science and the Biology of Belief, Andrew Newberg, Eugene D’Aquili, Vince Rause (2001).

[6] “Why Religion Is Not Going Away And Science Will Not Destroy It,” Aeon Magazine (Sept. 7, 2017).

Why Belief Works

Our experience of the “real world” will conform to what we believe. It has to, because our brains insist upon it.

They do that in part through neuro-cultural conditioning — the process by which the neurological wiring of a culture’s individual members is patterned after the culture’s belief system, and vice versa. This is the case with any kind of cultural institution, whether national, religious, scientific, economic, corporate, professional, team, tribal, or otherwise.[1] This post looks at religion as an example.[2]

Tim Crane is a professor of philosophy at the Central European University in Budapest. “I work in the philosophy of mind,” his online CV says. “I have attempted to address questions about the most general nature, or essence, of the human mind, and about the place of the mind in the rest of nature.” In his book The Meaning of Belief: Religion From An Atheist’s Point Of View (2017), he cites William James’ 1902 classic The Varieties of Religious Experience for a definition of what he calls “the religious impulse”:

“Were one asked to characterize the life of religion in the broadest and most general terms, one might say that it consists in the belief that there is an unseen order, and that our supreme good lies in harmoniously adjusting ourselves thereto.”

Christian Smith is a sociology professor and director of the Center for the Study of Religion and Society at the University of Notre Dame. Here’s his definition of religion:

“Religion is a complex of culturally prescribed practices, based on promises about the existence and nature of supernatural powers, whether personal or impersonal, which seek to help practitioners gain access to and communicate or align themselves with these powers, in hopes of realizing human goods and avoiding things bad.”

Religion: What It Is, How It Works, And Why It Matters (Princeton University Press, 2017)

Both authors stress that religious principles and practices need to match in order for religion to be effective. In other words:

“Faith without works is dead.”
The Epistle of James 2: 17

As it turns out, “faith without works is dead” is not just scripture, but accurate neuroscience as well. When we practice what we preach, we set up a self-sustaining loop in which belief drives thoughts and behavior, which in turn reinforce belief. In that way, religion develops the brain while the brain develops religion:

“Jordan Grafman, head of the cognitive neuroscience laboratory at Rehabilitation Institute of Chicago and neurology professor at Northwestern University, says that neurotheology is important in part because early religious practices helped develop our brains to begin with. ‘Religion has played an incredibly important role in human evolution. It’s funny, people want to separate the two but in fact they’re intertwined.’”

“The Neuroscience Argument That Religion Shaped The Very Structure Of Our Brains,” Quartz (December 3, 2016)

The more widespread and enduring the religious practice, the more the religion develops scriptures, rituals, icons, and institutions to sustain itself. Therefore a Bible passage such as this…

“I was young and now I am old,
yet I have never seen the righteous forsaken
 or their children begging bread.”
Psalm 37: 25 NIV

… becomes both community truth and the “testimony” of individual adherents. But what happens when belief and experience don’t align — e.g., when a member of the congregation and her children in fact go begging?

Some religious thinkers, like the writer of this Huffington Post article, reckon with the contradiction by distinguishing belief from faith. Beliefs are products of the mind, she says, and deal with what can be known, while faith is a product of the spirit, which traffics in what cannot be known. Since knowledge is always shifting, belief can and probably will let us down, while faith in what can’t be known remains inscrutable. Faith therefore invites belief to step aside in favor of “trusting beyond all reason and evidence.”

That outlook captures the essential center of the definitions of religion we saw above:  that there is an “unseen order” populated with “supernatural powers” that exists alongside but separate from ours. (Of which we have only limited understanding, the belief/faith outlook would add.)  Whether this satisfies the brain’s need to align internal patterning with external experience is the kind of issue being taken up by the new discipline of neurotheology, which looks at where religion happens in the brain.

Neurotheology’s inquiries have far-reaching implications for many of our common assumptions about how reality is structured. For example, if faith can be explained in neurological terms, then it could be located — in whole or in part — along with belief on this side of the theoretical divide between human and supernatural existence.  This shift would likely have a ripple effect on similar dichotomies, such as known vs. unknown, real vs. imaginary, objective vs. subjective, observed vs. inscrutable, temporal vs. transcendent, etc.

More on neurotheology coming up.

[1] For more on cultural patterning, see the other posts in this blog’s category The Basics of Belief, Culture, and Reality.

[2] I talk about Christianity because it is the only religion I have personal experience with. And I am aware, by the way, that I write this post under the influence of my own neuroscientific cultural bias.

Moral Compass:  How We Know Right From Wrong


Our brains are amoral. They need cultural context to give them a moral compass.

Real and Imaginary

It’s a staple of self-help advice and of sports and performance psychology that our brains don’t know the difference between real and imagined, and that we can therefore trick them into getting us what we want. There’s good science to back this up, although recent research suggests that the brain actually does know the difference — that it has specific neurons for that purpose (Science Daily). Plus, although real and imagined input run over the same neural pathway, they move in opposite directions:  input from the outside world runs bottom up — from lower-level sensory to higher-level cognitive processing — while imagined input runs top down (Psychology Today, Knowledge Nuts).

Getting Into Our Bodies

Not that Harold Hill’s “think system” is enough — we still need to practice and rehearse effectively. We need to get our bodies involved. We’re out there in the “real world” taking in sensory input, interacting with people, things, and experiences; meanwhile we’re imagining things, throwing in doses of speculation, and making things up. Our brains and bodies need to work together to ground this swirl of information. This article[1] explains how they do that:

“When considering the senses, we tend to think of sight and sound, taste, touch and smell. However, these are classified as exteroceptive senses, that is, they tell us something about the outside world. In contrast, interoception is a sense that informs us about our internal bodily sensations, such as the pounding of our heart, the flutter of butterflies in our stomach or feelings of hunger.

“The brain represents, integrates and prioritises interoceptive information from the internal body. These are communicated through a set of distinct neural and humoural (i.e., blood-borne) pathways. This sensing of internal states of the body is part of the interplay between body and brain: it maintains homeostasis, the physiological stability necessary for survival; it provides key motivational drivers such as hunger and thirst; it explicitly represents bodily sensations, such as bladder distension. But that is not all, and herein lies the beauty of interoception, as our feelings, thoughts and perceptions are also influenced by the dynamic interaction between body and brain.”

University of Sussex cognitive neuroscientist Anil Seth lays all this out in his TED2017 talk “Your Brain Hallucinates Your Conscious Reality.”


“When we agree about our hallucinations,” he says, “we call that reality.” Those agreements blend external (outside world) and internal (imagined) input into shared belief about what the real world is, and how it works. They also add another key cultural component:  a sense of right and wrong.

Why We Need a Moral Compass, and Where We Get It

Humans need community to survive. Community, in turn, needs a shared behavioral code. Our brains are flexible and amoral on issues of right and wrong — they take their cues from cultural context. Cultural moral coding is therefore evolutionary — motivated by the survival instinct.[2] All of that goes a long way toward explaining why activities honored by one group are despicable to another, and why, when confronted with those differences, each group’s first instinct is to point fingers.

This article reviews three prominent books[3] supporting culturally based morality, and concludes as follows:

“…one must come to the conclusion that inside human beings, as Gazzaniga says, ‘there is a moral compass.’ But ‘we have to be smart enough to figure out how it works.’ Across the realm of human experience—personal, collective, historical, and now neuroscientific—it is abundantly clear that we have the capacity to consciously consider consequences and choose our actions… The mind is a physio-spiritual mechanism built for choice, but it must be given direction. We may be endowed with a moral compass, but it does not arrive with prewired direction. Moral calibration is required.”

The article’s source is “the Church of God, an international community,” which according to its website is “a nondenominational organization based in Pasadena, California [which] traces its antecedents to Sabbatarian communities in 17th-century Europe, and before that to the first-century apostolic Church at Jerusalem.” Its tool of choice for the brain’s “moral calibration” is the Bible:

“The Bible, too, is unequivocal in the need for [moral calibration] (see, for example, Proverbs 3:31 and Job 34:2–4), adding that there is a spiritual factor responsible for imparting this ability to the human mind (Job 32:8–9)… The Bible serves as the lodestone that sets our compass’s orientation and helps us establish our moral bearings.”

But of course the Church of God didn’t write the article — an individual or collaboration of individuals wrote it, in furtherance of the Church’s culture and institutional belief system. It’s not surprising that the Bible was its cultural choice for moral calibration. Another culture might have chosen a different tool — Mein Kampf, for instance, or the ISIS Manifesto.

The article closes with reservations about the three authors’ neuro-cultural approach to morality:

“As secularists, of course, these authors cannot be expected to pursue [the Bible] in their search for the source of moral standards, especially when, as Gazzaniga notes, so much of what constitutes religious faith is founded on superstition rather than on truth. And so, as researchers improve drug cocktails to ultimately manipulate and control the brain (as Tancredi believes they will), and as society haltingly accepts science as arbiter of good and evil (as Gazzaniga believes it must), it is not too farfetched to imagine that the moral grammar Hauser describes can be refashioned as well. In fact, if history provides any clue, it seems a done deal. The only question that remains is whether our ongoing recalibrations will be for the better or for the worse.”

Yes — whether “for the better or for the worse” remains to be seen…. But according to whose cultural point of view?

[1] “How The Body And Mind Talk To One Another To Understand The World,” Aeon Magazine (Feb. 15, 2019).

[2] Here’s a nice primer on this concept. And here’s another, maybe more grownup version.

[3] Hardwired Behavior:  What Neuroscience Reveals About Morality, by Laurence Tancredi, Clinical Professor of Psychiatry at New York University School of Medicine, a psychiatrist in private practice, and a lawyer who consults on criminal cases involving psychiatric issues. The Ethical Brain:  The Science of Our Moral Dilemmas, by Michael S. Gazzaniga, Professor of Psychology and Director of the SAGE Center for the Study of the Mind at the University of California, Santa Barbara. Moral Minds:  How Nature Designed Our Universal Sense of Right and Wrong, by Marc D. Hauser, Professor of Psychology, Organismic and Evolutionary Biology, and Biological Anthropology at Harvard University, where he is director of the Cognitive Evolution Laboratory and co-director of the Mind, Brain and Behavior Program.

“Be the Change You Want to See” — Why Change MUST Always Begin With Us


In the beginning, somebody…

Told a story. Made something. Made something that made things. Drew a picture. Used their voice melodiously. Moved a certain way and did it again. Took something apart, put it back together, and built another thing like it. Watched how weather and sky and flora and fauna responded to the passage of time. Sprinkled dry leaves on meat and ate it. Drew a line in the sand and beat someone who crossed it. Traded this for that. Resolved a dispute. Helped a sick person feel better. Took something shiny from the earth or sea and wore it. Had an uncanny experience and explained it.

And then somebody else did, too — and then somebody else after that, and more somebodies after that, until the human race had organized itself into families, clans, tribes, city-states, and nations, each with its own take on life in this world. Millennia later a worldwide civilization had emerged, organized around trans-cultural institutions of law, economics, science, religion, industry, commerce, education, medicine, arts and entertainment….

And then you and I were born as new members of a highly evolved human culture of innumerable, impossibly complex, interwoven layers.

From our first breaths we were integrated into site-specific cultural institutions that informed our beliefs about how the world works and our place in it. Those institutions weren’t external to us, they were embodied in us — microbes of meaning lodged in our neural pathways and physical biome. Our brains formed around the beliefs of our culture — our neurons drank them in, and our neural networks were wired up with the necessary assumptions, logic, and leaps of faith.

These cellular structures informed what it meant for us to be alive on the Earth, individually and in community. They shaped our observations and awareness, experiences and interpretations, tastes and sensibilities. They defined what is real and imaginary, set limits around what is true and false, acceptable and taboo. And then they reinforced the rightness of it all with feelings of place and belonging, usefulness and meaning. When that was done, our brains and bodies were overlaid with a foundation for the status quo — the way things are, and are supposed to be.

All that happened in an astonishing surge of childhood development. Then came puberty, when our brain and body hormones blasted into overdrive, dredging up our genetic and environmental beginnings and parading them out for reexamination. We kept this and discarded that, activated these genes instead of those. (The process by which we do that is called epigenetics, and it explains why your kids aren’t like you.) We also tried on countercultural beliefs, welcoming some and rejecting others. From there, we entered adult life freshly realigned with a differentiated sense of self, us, and them.

After that, adult life mostly reinforces our cultural beginnings, although the nuisances and opportunities of change periodically require us to remake shared agreements in our communities, professions, workplaces, teams, and other groups, each time reaffirming and refining our shared cultural foundations. In doing so, we sometimes flow with the changing times, and sometimes retrench with nostalgic fervor.

Where does all this biological, cognitive, and social development and maintenance happen? In the only place it possibly could:  in the hot wet darkness inside the human body’s largest organ — our skin. Yes, there is a “real world” out there that we engage with, but the processing and storing of experience happen inside — encoded in our brains and bodies.

Which is why individual and cultural change must always begin with us — literally inside of us, in our physical makeup — because that’s where our world and our experience of it are registered and maintained. Gandhi’s famous words are more than a catchy meme; they describe basic human reality:  if we want things to change, then we must be transformed. Think about it:  we have no belief, perception, experience, or concept of status quo that is not somehow registered in our brains and bodies, so where else could change happen? (Unless there’s something like a humanCloud where it can be uploaded and downloaded — but that’s another issue for another time.)

The implications of locating human experience in our physical selves are far-reaching and fascinating. We’ll be exploring them.

#icons #iconoclast #psychology #philosophy #sociology #neurology #biology #narrative #belief #society #socialstudies #religion #law #economics #work #jobs #science #industry #commerce #education #medicine #arts #entertainment #civilization #evolution #perception #reality #subjective #culture #culturalchange #change #paradigmshift #transformation #growth #personalgrowth #futurism #technology #identity #rational #consciousness #cognition #bias #cognitivebias #brain #development #childdevelopment #puberty #adolescence #hormones #genetics #epigenetics #gandhi #bethechange #bethechangeyouwant #neurons #neuralnetworks