Debunking the Debunkers

Then I’ll get on my knees and pray
We don’t get fooled again

Meet the new boss
Same as the old boss

The Who[1]

Debunkers believe we’ll be better off without all the bunk. If only it were that simple: the basic premise of debunking might not hold up, the “truth” that lies on the other side of bunk is elusive, and there are strong social forces that oppose it. Plus, once free of it, we tend to replace old bunk with new.

UVA Professor Emily Ogden defines “bunk”:

“‘Bunk’ means baloney, hooey, bullshit. Bunk isn’t just a lie, it’s a manipulative lie, the sort of thing a con man might try to get you to believe in order to gain control of your mind and your bank account. Bunk, then, is the tool of social parasites, and the word ‘debunk’ carries with it the expectation of clearing out something that is foreign to the healthy organism. Just as you can deworm a puppy, you can debunk a religious practice, a pyramid scheme, a quack cure. Get rid of the nonsense, and the polity – just like the puppy – will fare better. Con men will be deprived of their innocent marks, and the world will take one more step in the direction of modernity.”[2]

Sounds great, but can debunking actually deliver?

“Debunk is a story of modernity in one word – but is it a true story? Here’s the way this fable goes. Modernity is when we finally muster the reason and the will to get rid of all the self-interested deceptions that aristocrats and priests had fobbed off on us in the past. Now, the true, healthy condition of human society manifests itself naturally, a state of affairs characterised by democracy, secular values, human rights, a capitalist economy and empowerment for everyone (eventually; soon). All human beings and all human societies are or ought to be headed toward this enviable situation.”

Once somebody calls something a “fable” you know it’s in trouble. Plus, there’s no indisputable “truth” waiting to be found once the bunk is cleared out.

“There is no previously existing or natural secular order that will assert itself when we get the bunk out… There is no neutral, universal goal of progress toward which all peoples are progressing; instead, the claim that such a goal ought to be universal has been a means of exploiting and dispossessing supposedly ‘backward’ peoples.”

The underlying problem with debunking seems to be the assumptions we make — about what’s true and false, what we’ll find when we sort one from the other, and most importantly, who’s qualified to do that. Debunking requires what cultural anthropologist Talal Asad has called “secular agents” – a species that may not actually exist.

“Secular agency is the picture of selfhood that Western secular cultures have often wanted to think is true. It’s more an aspiration than a reality. Secular agents know at any given moment what they do and don’t believe. When they think, their thoughts are their own. The only way that other people’s thoughts could become theirs would be through rational persuasion. Along similar lines, they are the owners of their actions and of their speech. When they speak, they are either telling the truth or lying. When they act, they are either sincere or they are faking it… Modernity, in this picture, is when we take responsibility for ourselves, freeing both society and individuals from comforting lies.”

I.e., secular agency is a high standard we mostly fall short of. Instead, we do our best to conform to social conventions even if we don’t personally buy into them. A sports star points to the sky after a home run, a touchdown, a goal, acknowledging the help of somebody or Somebody up there… a eulogy talks about a deceased loved one “looking down on us”… a friend asks us to “think good thoughts” for a family member going into surgery… We don’t buy the Somebody up there helping us, the “looking down,” or the “good thoughts,” but we don’t speak up. Instead, we figure there’s a time and place for honesty and confrontation, and this isn’t one of them.[3]

“Life includes a great many passages in which we place the demands of social bonds above strict truth…. [In] the context of some of the stories we tell collaboratively in our relationships with others, the question of lying or truth does not arise. We set it aside. We apply a different framework, something more like the framework we apply to fiction: we behave as if it were true.”

So what’s left of debunking? Well, it still has its place, especially when it’s used to call the Bunk Lords to account.

“What then is debunking? It can be a necessary way of setting the record straight. I’m by no means opposed to truth-telling. We need fact-checkers. The more highly placed the con artist, the more his or her deceptions matter. In such cases, it makes sense to insist on hewing to the truth.

“[On the other hand,] the social dynamics of debunking should not be overlooked …, especially when the stakes aren’t particularly high – when the alleged lie in question is not doing a whole lot of harm.”

To Play Along or Not to Play Along

When I was a late adolescent and lurching my way toward the Christian faith, a seminary student advised me that, “Sometimes you just need to act as if something is true. You do that long enough, and maybe it will actually become true” – which I took to mean that, even if you’re full of yourself right now, in the long haul you might be happier fitting in.

Maybe, maybe not. You might also feel that, since the things we believe are always in progress anyway, why not be real about what’s up for you right now?

“At these times, what is debunking? It’s a performed refusal to play along.… It’s the announcement that one rejects the as-if mode in which we do what social bonds require.”

Plus, there seems to be a countervailing urge that sometimes prevails over socially playing nice: when we feel like we’ve finally got it figured out – the scales have fallen from our eyes and we can see clearly now, we can see life for what it really is. Get to that beatific place, and you want to tell everybody, even if it steps on their toes – which it does, but being newly enlightened and detoxed, you can’t help yourself.

Thus the “as if” game becomes a choice: playing along preserves social currency; opting out drains it. Which do you want?

Why Bother?

There’s also the “Why bother?” issue. Debunking is often preaching to the choir while the unconverted stay that way – in fact, they never even hear what you have to say; it never shows up in their feed.

“The theory of cognitive dissonance—the extreme discomfort of simultaneously holding two thoughts that are in conflict—was developed by the social psychologist Leon Festinger in the 1950s. In a famous study, Festinger and his colleagues embedded themselves with a doomsday prophet named Dorothy Martin and her cult of followers who believed that spacemen called the Guardians were coming to collect them in flying saucers, to save them from a coming flood. Needless to say, no spacemen (and no flood) ever came, but Martin just kept revising her predictions. Sure, the spacemen didn’t show up today, but they were sure to come tomorrow, and so on. The researchers watched with fascination as the believers kept on believing, despite all the evidence that they were wrong.

“‘A man with a conviction is a hard man to change,’ Festinger, Henry Riecken, and Stanley Schacter wrote in When Prophecy Fails, their 1957 book about this study. ‘Tell him you disagree and he turns away. Show him facts or figures and he questions your sources. Appeal to logic and he fails to see your point … Suppose that he is presented with evidence, unequivocal and undeniable evidence, that his belief is wrong: what will happen? The individual will frequently emerge, not only unshaken, but even more convinced of the truth of his beliefs than ever before.’

“This doubling down in the face of conflicting evidence is a way of reducing the discomfort of dissonance, and is part of a set of behaviors known in the psychology literature as ‘motivated reasoning.’ Motivated reasoning is how people convince themselves or remain convinced of what they want to believe—they seek out agreeable information and learn it more easily; and they avoid, ignore, devalue, forget, or argue against information that contradicts their beliefs.

“Though false beliefs are held by individuals, they are in many ways a social phenomenon. Dorothy Martin’s followers held onto their belief that the spacemen were coming … because those beliefs were tethered to a group they belonged to, a group that was deeply important to their lives and their sense of self.

“[A disciple who ignored mounting evidence of sexual abuse by his guru] describes the motivated reasoning that happens in these groups: ‘You’re in a position of defending your choices no matter what information is presented,’ he says, ‘because if you don’t, it means that you lose your membership in this group that’s become so important to you.’ Though cults are an intense example, … people act the same way with regard to their families or other groups that are important to them.”[4]

In light of all this cognitive self-preservation, not rocking the boat can seem like the more reasonable choice:

“Humans’ biggest advantage over other species is our ability to cooperate. Cooperation is difficult to establish and almost as difficult to sustain.

“Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.

“‘Reason is an adaptation to the hypersocial niche humans have evolved for themselves,’ [the authors of a seminal study] write. Habits of mind that seem weird or goofy or just plain dumb from an ‘intellectualist’ point of view prove shrewd when seen from a social ‘interactionist’ perspective.”[5]

But even if acting as-if is socially acceptable, sometimes you just can’t help but go after it.

Take “magical thinking” for example — a socially acceptable practice and favorite debunking target.

Magical thinking is based on a claim of cause and effect, and therefore offers a sense of predictability and control. It sounds scientific and reasonable, which makes it socially acceptable, but it’s neither: it’s faux science because you can’t test or verify it, and it’s not reasonable because there’s no logic to it – you can only believe it or not. The masquerade makes it a prime target for debunking.

“Magical thinking [is] the belief that one’s ideas, thoughts, actions, words, or use of symbols can influence the course of events in the material world. Magical thinking presumes a causal link between one’s inner, personal experience and the external physical world. Examples include beliefs that the movement of the Sun, Moon, and wind or the occurrence of rain can be influenced by one’s thoughts or by the manipulation of some type of symbolic representation of these physical phenomena.

“Magical thinking became an important topic with the rise of sociology and anthropology in the 19th century. It was argued that magical thinking is an integral feature of most religious beliefs, such that one’s inner experience, often in participation with a higher power, could influence the course of events in the physical world.

“Prominent early theorists suggested that magical thinking characterized traditional, non-Western cultures, which contrasted with the more developmentally advanced rational-scientific thought found in industrialized Western cultures. Magical thinking, then, was tied to religion and ‘primitive’ cultures and considered developmentally inferior to the scientific reasoning found in more ‘advanced’ Western cultures.”[6]

Recent converts are notorious for their intolerance of whatever they just left behind[7] and are therefore the least likely to play along with social convention. So, suppose you’re a recent convert from magical thinking and someone drops one of those refrigerator-magnet aphorisms. You’ll weigh a lot of factors in the next instant, but sometimes there are just some things people need to stop believing, so you’ll go ahead and launch, and social peace-keeping be damned. You do that in part because you’re aware of your own susceptibility to temptation. This is from Psychology Today[8]:

“How many times a day do you either cross your fingers, knock on wood, or worry that your good luck will turn on you? When two bad things happen to you, do you cringe in fear of an inevitable third unfortunate event? Even those of us who ‘know better’ are readily prone to this type of superstitious thinking.

“Further defying logic, we also readily believe in our own psychic powers: You’re thinking of a friend when all of a sudden your phone beeps to deliver a new text from that very person. It’s proof positive that your thoughts caused your friend to contact you at that very moment! … These are just a few examples of the type of mind tricks to which we so readily fall prey.”

The article provides a list of seven “mind tricks” taken from psychology writer Matthew Hutson’s book The 7 Laws of Magical Thinking, and invites us to “See how long it takes you to recognize some of your own mental foibles.” Here’s the list, with abbreviated commentary from the article:

  1. “Objects carry essences. We attribute special properties to items that belong or once belonged to someone we love, is famous, or has a particular quality we admire… the objects are just objects, and despite their connection with special people in our lives, they have no inherent ability to transmit those people’s powers to us.
  2. “Symbols have power. Humans have a remarkable tendency to impute meaning not only to objects but to abstract entities. We imbue these symbols with the ability to affect actual events in our lives.
  3. “Actions have distant consequences. In our constant search to control the outcomes of events in our unpredictable lives, we build up a personal library of favorite superstitious rituals or thoughts.
  4. “The mind knows no bounds. We are often impressed by the apparent coincidence that occurs when a person we’re thinking about suddenly contacts us. For just that moment, we believe the event “proves” that we’re psychic.
  5. “The soul lives on. [Why] do adults hold on so stubbornly to the belief that the mind can continue even after its seat (the brain) is no longer alive? The answer, in part, comes from the terror that we feel about death.
  6. “The world is alive. We attribute human-like qualities to everything from our pets to our iPhones. We read into the faces of our pets all sorts of human emotions such as humor, disappointment, and guilt. If our latest technological toy misbehaves, we yell at it and assume it has some revenge motive it needs to satisfy.
  7. “Everything happens for a reason. The most insidious form of magical thinking is our tendency to believe that there is a purpose or destiny that guides what happens to us… For the same reason, we believe in luck, fate, and chance.”

Magical thinking is one of my personal bugaboos, so my own list would be longer than seven.[9] Those things make me twitch. You?

And speaking of mortality…

Miracles: Magic Gets Personal

We can (and do) make up all kinds of things about what it’s like “up there,” but we can’t really imagine it any more than we can our own death. There’s a lot of research about why that’s so.[10] As a practical matter we have to imagine death while we’re still alive in the here and now, but to do it properly we’d have to be there and then — a problem that explains the popularity of books some call “heavenly tourism,” about people who go there and come back to tell us about it.[11]

We want our heroes and loved ones looking down on us because we miss them. Losing them makes us feel small, helpless, and powerless — like children. So we draw pictures of clouds and robes and harps and locate them there. Childish? Sure. But preferable to the idea that “they” vanished when their body and brain stopped biologically functioning. Why we prefer one over the other might become clear if we could step back and think about it, but we don’t. Instead we’re so freaked about the trip down the River Styx that we follow convention.

For the same reasons, praying for a miracle that staves off death persists in the face of little to support it.[12]

“Writing Fingerprints of God, my 2009 book about the science of spirituality, gave me an excuse to ask a question that I never openly considered before leaving Christian Science, one that was unusually freighted: Is there any scientific evidence, anything beyond the realm of anecdote, that prayer heals?

“It turns out, the evidence is mixed. Beginning in the 1980s, we’ve seen a rash of prayer studies. Some seemed to show that patients who were prayed for recovered more quickly from heart attacks. Another study found that prayer physically helped people living with AIDS.

“But for every study suggesting that prayer heals a person’s body, there is another one showing that prayer has no effect — or even makes you worse. Does prayer help people with heart problems in a coronary care unit? Researchers at the Mayo Clinic found no effect. Does it benefit people who needed to clear their arteries using angioplasty? Not according to researchers at Duke. In another study, prayer did not ease the plight of those on kidney dialysis machines. And don’t even mention skin warts: Researchers found that people who received prayer saw the number of warts actually increase slightly, compared with those who received no prayer.

“The most famous study, and probably the most damaging for advocates of healing prayer, was conducted by Harvard researcher Herbert Benson in 2006. He looked at the recovery rates of patients undergoing cardiac bypass surgery. Those patients who knew they were receiving prayer actually did worse than those who did not know they were receiving prayer.

“In the end, there is no conclusive evidence from double-blind, randomized studies that suggests that intercessory prayer works.

“Prayer studies are a ‘wild goose chase that violate everything we know about the universe,’ Richard Sloan, professor of behavioral medicine at Columbia University Medical Center and author of Blind Faith, told me: ‘There are no plausible mechanisms that account for how somebody’s thoughts or prayers can influence the health of another person. None.’”

“And yet,” the author continues, “science has embraced a sliver of my childhood faith, a century after Mary Baker Eddy ‘discovered’ Christian Science in the late 1800s. If scientists don’t buy intercessory prayer, most do agree that there is a mind-body connection.” She also finds some connections in “another new ‘science,’ called ‘neurotheology,’” citing how the stimulation of certain brain areas can deliver the same sensations as meditation, contemplative prayer, spiritual ecstasy, and even out-of-body experiences. As a result, she wonders if the brain might act as a kind of radio: “Is the brain wired to connect with a dimension of reality that our physical senses cannot perceive?”

“Researchers have tried to replicate such out-of-body experiences, which are always after-the-fact anecdotes that cannot be tested. These experiences, they say, suggest that consciousness can exist separate from the brain — in other words, that there may be a transcendent reality that we tap into when brain functioning ceases.

“I am not asking you to believe that consciousness can continue when the brain is not functioning, that there is a God who answers prayer, or that people who pray or meditate connect with another reality. I’m not asking you to believe that all mystical or inexplicable experiences are simply the interaction of chemicals in the brain or firings of the temporal lobe. That’s the point: You don’t have to choose. Because neither side possesses the slam-dunk argument, the dispositive evidence that proves that there is a God, or there isn’t.”

I.e., she’s saying that the impermeable curtain of death means we can’t prove or disprove either the brain-as-a-radio theory or the materialist belief that when your body stops, so do you. Thus we’re free to choose, and one’s as viable as the other. Obviously, unlike the Psychology Today writer, this ex-Christian Scientist is not a committed debunker. On the other hand, her reference to the lack of “dispositive evidence that proves that there is a God, or there isn’t” takes us straight to the ultimate debunking target.

Debunking God (or not)

God is the ultimate debunking target (patriotism is a close second), and the “New Atheists[13]” are the ultimate God debunkers. They’ve also been roundly criticized for being as fundamentalist and evangelical as the fundamentalists and evangelicals they castigate.[14] That’s certainly how I respond to them. I discovered them when I was fresh in my awareness that I’d become an atheist. I put their books on my reading list, read a couple, and deleted the rest. I’d left the fighting fundamentalists behind, and had no desire to rejoin the association. On the other hand, I am grateful to them for making it easier for the rest of us to come out as atheist – something that current social convention makes more difficult than coming out gay.[15]

From what I can tell, there are lots of people like me who didn’t become atheists by being clear-thinking and purposeful[16]; it was just something that happened over time, until one day they checked the “none” box beside “religious affiliation.” Atheism wasn’t an intellectual trophy we tried to win; it was a neighborhood we wandered into one day and were surprised to find we had a home there. As one writer said,

“My belief in God didn’t spontaneously combust—it faded.

“I wasn’t the only kid who stopped believing. A record number of young Americans (35 percent) report no religious affiliation, even though 91 percent of us grew up in religiously affiliated households.

“Our disbelief was gradual. Only 1 percent of Americans raised with religion who no longer believe became unaffiliated through a onetime “crisis of faith.” Instead, 36 percent became disenchanted, and another 7 percent said their views evolved.

“It’s like believing in Santa Claus. Psychologists Thalia Goldstein and Jaqueline Woolley have found that children’s disbelief in Santa Claus is progressive, not instantaneous. First kids think that the Santa in the mall or library is real, then they think he’s not real but still magically communicates with the actual Santa, and so on, until they finally realize that Santa is composed of costumed actors. “Kids don’t just turn [belief] off,” Goldstein says.

“Likewise, losing faith happens in pieces.”[17]

It seems fitting we would exit religion that way, since it’s the way many of us got into it in the first place. Yes, some people seem to have those Damascus Road conversions[18], or maybe a less dramatic “come to Jesus meeting,” as a friend of mine says, but more often religion just kind of seeps into us from the surrounding culture.

“I used to love this illustrated children’s Bible my mom gave me. Long-faced Jonah inside a yawning blue whale felt warm and right. My brain made these feelings. When we enjoy religious or associated experiences, like snuggling up with Mom reading the Bible, our brain’s reward circuits activate. Over time, religious ideas become rewarding in and of themselves. This is a powerful, unconscious motivation to keep believing.

“When I began to see my colorful Bible as boring and childish, those same reward circuits likely became less active. Religious experiences produced less pleasure. This happens involuntarily in people with Parkinson’s disease, which compromises the brain’s reward centers. [That is why] people who develop Parkinson’s are much more likely to lose their faith.”[19]

The New Magic – Or, maybe I’m just skeptical about skepticism.

But then, it’s common that, having been debunked of religion, we transfer the same commitment to something else – maybe magical thinking or some other unverifiable belief system. Turns out there’s a neurological reason for that: the neural pathways that ran our old belief system are still there, so we just load them with new content:

“For many years I believed in both creationism, with a God whose hand I could shake, and evolution, a cold, scientific world that cared nothing about me. Because when we lose faith, our brain’s preexisting belief networks don’t dissolve. They’re updated, like a wardrobe. ‘Even if someone abandons or converts [religions], it’s not like they’re throwing out all the clothes they own and now buying a whole new set,’ says Jordan Grafman, director of brain injury research at the Shirley Ryan AbilityLab and a professor at Northwestern University. ‘You pick and choose what you leave and what you keep.’

“New beliefs join the same neurological framework as old ones. It’s even possible that an existing belief network paves the way for additional beliefs. [Another researcher] has found that kids who believe in fantastical beings are more likely to believe in new ones invented by researchers. “I think it’s because they already have this network that [the new belief] kind of fits into,” she explains. Sometimes the new beliefs resemble the old ones; sometimes they don’t.

“Most non-religious people are ‘passionately committed to some ideology or other,’ explains Patrick McNamara, a neurology professor at Boston University School of Medicine. These passions function neurologically as ‘faux religions.’”[20]

And then, converted to our new faux religion, we’re set up for another eventual round of debunking.

Meet the new boss.

Same as the old boss.

[1] Here’s the original music video of Won’t Get Fooled Again. Watching it draws you all the way back into the turbulent, polarizing ’60s — if you remember them, that is — and the tone feels eerily similar to what we’re living with today. By the way, who said, “If you remember the ’60s, you really weren’t there”? Find out here.

[2] Ogden, Emily, Debunking Debunked, Aeon (Aug. 12, 2019). Ms. Ogden’s Aeon bio says she is “an associate professor of English at the University of Virginia, and an author whose work has appeared in Critical Inquiry, The New York Times and American Literature, among others. Her latest book is Credulity: A Cultural History of US Mesmerism (2018).” All quotes in this section are from this article.
[3] This social convention has been around a long time: as the Bible (something else we might like to debunk) says, “There is a time for everything under heaven … a time to keep silence, and a time to speak.” Ecclesiastes 3:7

[4]This Article Won’t Change Your Mind,” The Atlantic (March 2017):

[5]Why Facts Don’t Change Our Minds,The New Yorker (Feb. 27, 2017).

[6] Encyclopedia Britannica.

[7] See Volck, Brian, The Convert’s Zeal, Image Journal (Aug. 22, 2019). See also this Pew Center report.

[8] 7 Ideas We Really Need to Stop Believing. Psychology Today (May 08, 2012).

[9] Mr. Hutson’s list is based on “a wealth of psychological evidence,” while mine comes from my own anecdotal judgment that magical thinking has led to all kinds of delusional decisions and disasters in my life. The irony of using my own subjective perspective to debunk my own life doesn’t escape me – it ranks right in there with The Who’s resorting to prayer in the hope they won’t be fooled again.

[10] Doubting death: how our brains shield us from mortal truth, The Guardian (Oct. 19, 2019).

[11] Like The Boy Who Came Back from Heaven, by Alex Malarkey. Yes, that’s his real name.

[12] The Science of Miracles, Medium (Feb. 7, 2019).

[13] Wikipedia.

[14] Wikipedia.

[15] What Atheists Can Learn From The Gay Rights Movement, The Washington Post (Apr. 3, 2013). Coming out as atheist is even trickier if you’re in the public eye: ‘I Prefer Non-Religious’: Why So Few US Politicians Come Out As Atheists, The Guardian (Aug. 3, 2019); The Last Taboo: It’s harder in America to come out as an atheist politician than a gay one. Why? Politico Magazine (Dec. 9, 2013)

[16] Such as Andrew L. Seidel, an “out-of-the-closet atheist” and author of The Founding Myth: Why Christian Nationalism Is Un-American (2019).

[17] Beaton, Caroline, What Happens to Your Brain When You Stop Believing in God: It’s Like Going Off a Drug, Vice (Mar. 28, 2017).

[18] The Acts of the Apostles 9: 1-9.

[19] Beaton, op. cit.

[20] Ibid.

All War is Holy War

According to one anthropologist,[1] the Yanomami Amazonian tribe lives in a “chronic state of war”:  violence against outsiders and members alike is a normal way of life. Their culture is the exception — most cultures require a shift from peacetime to wartime culture before maiming and murdering become acceptable. The shift begins with a cause to rally around:

“It is hard, maybe impossible, to fight a war if the cause is viewed as bankrupt. The sanctity of the cause is crucial to the war effort.”

War is a Force That Gives Us Meaning, Chris Hedges (2002).[2]

Most cultures are governed by some version of “Thou shalt not kill,” but God and the gods are not so constrained — they can and do kill, and direct their followers to do so. Therefore, to justify the mayhem, the state must become religious, and its cause must be sacred.

“War celebrates only power — and we come to believe in wartime that it is the only real form of power. It preys on our most primal and savage impulses. It allows us  to do what peacetime society forbids or restrains us from doing:  It allows us to kill.”

In wartime, the state is anointed with the requisite elements of religious culture:  dogmas and orthodox language; rites of initiation and passage; songs, symbols, metaphors, and icons; customs and laws to honor heroes, demonize foes, discipline skeptics, and punish nonbelievers.

“Because we in modern society have walked away from institutions that stand outside the state to find moral guidance and spiritual direction, we turn to the state in times of war.

“We believe in the nobility and self-sacrifice demanded by war… We discover in the communal struggle, the shared sense of meaning and purpose, a cause. War fills our spiritual void.”

Religious anointing reverses the secular aversion to killing and death:

“War finds its meaning in death.

“The cause is built on the backs of victims, portrayed always as innocent. Indeed, most conflicts are ignited with martyrs, whether real or created. The death of an innocent, one who is perceived as emblematic of the nation or the group under attack, becomes the initial rallying point for war. These dead become the standard bearers of the cause and all causes feed off the steady supply of corpses.

“The cause, sanctified by the dead, cannot be questioned without dishonoring those who gave up their lives. We become enmeshed in the imposed language.

“There is a constant act of remembering and honoring the fallen during war. These ceremonies sanctify the cause.

The first death is the most essential:

“Elias Canetti [winner of the Nobel Prize in Literature in 1981] wrote, “it is the first death which infects everyone with the feeling of being threatened. It is impossible to overrate the part played  by the first dead man in the kindling of war. Rulers who want to unleash war know very well that they must procure or invent a first victim. It need not be anyone of particular importance, and can even be someone quite unknown. Nothing matters except his death, and it must be believed that the enemy is responsible for this. Every possible cause of his death is suppressed except one:  his membership of the group to which one belongs oneself.”

Dissent has no place in the culture of war. The nation’s institutions and citizens are expected to speak the language of war, which frames and limits public discourse.

“The adoption of the cause means adoption of the language of the cause.

“The state spends tremendous time protecting, explaining, and promoting the cause. And some of the most important cheerleaders of the cause are the reporters. This is true in nearly every war. During the Gulf War, as in the weeks after the September attacks, communities gathered for vigils and worship services. The enterprise of the state became imbued with a religious aura. We, even those in the press, spoke in the collective.

“The official jargon obscures the game of war — the hunters and the hunted. We accept terms imposed on us by the state — for example, the “war on terror” — and these terms set the narrow parameters by which we are able to think and discuss.”

Exaltation of the nation, faith in the cause, honoring of the dead, and conformity to the language of war make doubt and dissent damnable:

“When we speak within the confines of this language we give up our linguistic capacity to question and make moral choices.

“The cause is unassailable, wrapped in the mystery reserved for the divine. Those who attempt to expose the fabrications and to unwrap the contradictions of the cause are left isolated and reviled.

“The state and the institutions of state become, for many, the center of worship in wartime. To expose the holes in the myth is to court excommunication.

“When any contradiction is raised or there is a sense that the cause is not just in an absolute sense, the doubts are attacked as apostasy.”

In war, the state shares dominion with the gods. When war ends, the state’s leaders, intoxicated with power, may not release war’s grip on the culture:

“There is a danger of a growing fusion between those in the state who wage war — both for and against modern states — and those who believe they understand and can act as agents of God.

“The moral certitude of the state in wartime is a kind of fundamentalism… And this dangerous messianic brand of religion, one where self-doubt is minimal, has come increasingly to color the modern world of Christianity, Judaism, and Islam.”

For the state to revert to peacetime culture, the moral shift that supported war must be reversed by both civilians and soldiers. This requires a harrowing withdrawal from addiction to wartime culture. We’ll talk about that next time.

[1] Napoleon Alphonseau Chagnon.

[2] All quotes in this article are from Chris Hedges’ book.