
Posts tagged ‘Religion’

What Believers and Atheists Can Learn From Each Other (co-written with Rabbi Geoff Mitelman)

Here’s a forthcoming article for the Huffington Post religion blog that I’ve written with Rabbi Geoff Mitelman, a friend and fellow cognitive science enthusiast. We discuss atheism and the psychology of belief. Check out his blog, Sinai and Synapses.

Rabbi Geoffrey Mitelman: It’s inherently challenging for believers and atheists to have productive conversations. Discussing topics such as belief and nonbelief, the potential irrationality of religion, or the limits of scientific knowledge is difficult because each side often ends up more firmly entrenched in its own worldview.

But one bright person interested in broadening the conversation is Sam McNerney, a science writer who focuses on cognitive science and an atheist interested in religion from a psychological point of view.

I found Sam through his writing on ScientificAmerican.com, and started reading his blog Why We Reason and his posts on BigThink.com. We discovered that even though we approached religion from different perspectives, we had great respect for each other.

So, as two people with different religious outlooks, we wondered: what can we learn from each other?

Sam McNerney: There are many things we can learn. Let’s take one: the role of authority.

A recent New York Times article points out that secular liberal atheists tend to conflate authority, loyalty and sanctity with racism, sexism and homophobia. It’s not difficult to see why. Societies suffer when authority figures, motivated by sacred values and religious beliefs, forbid their citizens from challenging the status quo. But some degree of respect for authority and the principles it upholds is necessary if societies are to maintain order and justice and function properly. The primatologist Frans de Waal explains it this way: “Without agreement on rank and a certain respect for authority there can be no great sensitivity to social rules, as anyone who has tried to teach simple house rules to a cat will agree.” (Haidt, 106)

Ironically, atheists’ steadfast allegiance to rationality, secular thinking and open-mindedness blinds them to important religious values, including respect for authority. As a result, atheists tend to confuse authority with exploitation and evil, and to undervalue the vital role authority plays in a healthy society.

Geoff: You accurately bring up one aspect of why organized religion can be so complicated: it is intertwined with power. And I’m glad you note that authority and power are not inherently bad when it comes to religion. In fact, as you also say, a certain degree of authority is necessary.

To me, the real problem arises when religion adds another element into the mix: certainty. It’s a toxic combination to have religious authorities with the power to influence others claiming to “know” with 100% certainty that they’re right and everyone else is wrong.

One thing I learned from several atheists is the importance of skepticism and doubt. Indeed, while certainty leads to arrogance, uncertainty leads to humility. We open up the conversation and value diverse experiences when we approach the world with a perspective of “I’m not sure” or “I could be wrong.”

Recently, astrophysicist Adam Frank wrote a beautiful piece on NPR’s blog 13.7 about how valuable uncertainty can be:

Dig around in most of the world’s great religious traditions and you find people finding their sense of grace by embracing uncertainty rather than trying to bury it in codified dogmas…

Though I am an atheist, some of the wisest people I have met are those whose spiritual lives (some explicitly religious, some not) have forced them to continually confront uncertainty. This daily act has made them patient and forgiving, generous and inclusive. Likewise, the atheists I have met who most embody the ideals of free inquiry seem to best understand the limitations of every perspective, including their own. They encounter the ever shifting ground of their lives with humor, good will and compassion.

Certainty can be seductive, but it hurts our ability to engage with others in constructive ways. Thus when religious people talk about God, belief or faith, we have to approach the conversation with a little humility and recognize that we don’t have a monopoly on the truth. In the words of Rabbi Brad Hirschfield, we need to realize that another person doesn’t have to be wrong for us to be right.

This doesn’t mean believers and atheists will agree on the role of religion in society, the validity of a particular belief system, or even the very existence of God. In fact, believers and atheists will almost certainly continue to vehemently disagree about these questions. But we have to remember that not all disagreements are bad. Some arguments are quite beneficial because they help us gain a deeper understanding of reality, encourage clearer thinking, and broaden people’s perspectives.

The Rabbis even draw a distinction between two different kinds of arguments. Arguments they call “for the sake of Heaven” will always be valuable, while arguments that are only for self-aggrandizement will never be productive (Avot 5:20). So I’m not interested in arguments that devolve into mocking, ridicule, name-calling or one-upmanship. But I’d gladly participate in any discussion if we are arguing about how we make ourselves and this world better, and would actively strive to involve whoever wants to be part of that endeavor, regardless of what they may or may not believe.

Sam: You are right to point out that both atheists and believers, under the illusion of certainty, smother potentially productive dialogue with disrespectful rhetoric. What’s alarming is that atheism in the United States is now more than non-belief. It’s an intense and widely shared sentiment that belief in God is not only false but ridiculous. Pointing out how irrational religion can be has become entertainment for too many.

There’s no doubt that religious beliefs can drive harmful behavior, so atheists are right to criticize many of religion’s epistemological claims. But I’ve learned from believers and from my background in cognitive psychology that faith-based beliefs are not necessarily irrational.

Consider a clever study recently conducted by Kevin Rounding of Queen’s University in Ontario that demonstrates how religion helps increase self-control. In two experiments participants (many of whom identified as atheists) were primed with a religious mindset – they unscrambled short sentences containing words such as “God,” “divine” and “Bible.” Compared to a control group, they were able to drink more sour juice and were more willing to accept $6 in a week instead of $5 immediately. Similar lines of research show that religious people are less likely to develop unhealthy habits like drinking, taking drugs, smoking and engaging in risky sex.

Studies also suggest that religious and spiritual people, especially those living in the developing world, are happier and live longer, on average, than non-believers. Religious people also tend to feel more connected to something beyond themselves, a sentiment that contributes significantly to well-being.

It’s unclear whether these findings are correlational or causal – it’s likely that many of the benefits of believing in God arise not from the beliefs per se but from the strong social ties that religious communities do such a good job of fostering. Whatever the case, this research should make atheists pause before they dismiss all religious beliefs as irrational or ridiculous.

Geoff: It’s interesting — that actually leads to another area where atheists have pushed believers in important ways, namely, to focus less on the beliefs themselves, and more on how those beliefs manifest themselves in actions. And to paraphrase Steven Pinker, the actions that religious people need to focus on are less about “saving souls,” and more about “improving lives.”

For much of human history the goal of religion was to get people to believe a certain ideology or join a certain community. “Being religious” was a value in and of itself, and was often simply a given. Today, however, we live in a world where people are free to choose what they believe. So now the goal of religion should be to help people find more fulfillment in their own lives and make a positive impact on others’ lives.

It’s important to note that people certainly do not need religion to act morally or find fulfillment. But as Jonathan Haidt writes in his new book The Righteous Mind, religion can certainly make it easier.

Haidt compares the mind to a rider sitting atop an elephant to suggest that our moral deliberations (the rider) are post-hoc rationalizations of our moral intuitions (the elephant). The key to his metaphor is that intuitions come first (and are much more powerful) and strategic reasoning comes afterwards.

We need our rider because it allows us to think critically. But our elephant is also important because it motivates us to connect with others who share a moral vision. Ultimately, if we are striving to build communities and strengthen our morals, we cannot rely exclusively on either the rider or the elephant; we need both. As Haidt explains:

If you live in a religious community, you are enmeshed in a set of norms, institutions and relationships that work primarily on the elephant to influence your behavior. But if you are an atheist living in a looser community with a less binding moral matrix, you might have to rely somewhat more on an internal moral compass, read by the rider. That might sound appealing to rationalists, but it is also a recipe for…a society that no longer has a shared moral order. [And w]e evolved to live, trade and trust within shared moral matrices. (Haidt, 269)

Since religion is a human construct, with its “norms, institutions and relationships,” it can be used in a variety of different ways. It can obviously be used to shut down critical thinking and oppress others. But as you mention, religion has positive effects on well-being, and religious beliefs correlate with a sense of fulfillment. Perhaps the job of religion, then, should be giving us a common language, rituals, and communities that reinforce and strengthen our ability to become better human beings and find joy and meaning in our lives.

Ultimately, we don’t have to agree with someone in order to learn from them. As Ben Zoma, a 2nd century Jewish sage, reminds us: “Who is wise? The person who learns from all people.” (Avot 4:1) When we are willing to open ourselves up to others, we open ourselves up to new ideas and different perspectives.

Indeed, I have come to believe that our purpose as human beings – whether we identify as a believer, an atheist, or anything in between – is to better ourselves and our world. And any source of knowledge that leads us to that goal is worth pursuing.

Religion, Evolution & What The New Atheists Overlook

Lancet flukes (Dicrocoelium dendriticum) are clever little parasites. To reproduce, they find their way into the stomach of a sheep or cow by commandeering an ant’s brain. Once this happens, ants exhibit strange behavior: they climb up the nearest blade of grass until it falls, then climb it again, and again. If the flukes are lucky, a grazing farm animal eats the grass along with the ant – a sure win for the flukes, but a sad and unfortunate loss for the six-legged insect.

Does anything like this happen with human beings? Daniel Dennett thinks so. At the beginning of his book Breaking the Spell, Dennett uses the fluke to suggest that religions survive because they influence their hosts (e.g., people) to do things that are bad for themselves (e.g., suicide bombing) but good for the parasite (e.g., Islam). Implicit in Dennett’s example is the idea that religions are like viruses, and that people and societies are better off without them.

Dennett’s position is akin to that of the other New Atheists: religion is a nasty and irrational byproduct of natural selection. This means that religious beliefs were not directly selected for by evolution any more than our noses evolved to keep our glasses from sliding off our faces. In the words of Pascal Boyer, “religious concepts and activities hijack our cognitive resources.” The question is: which cognitive resources influenced religion?

Most cognitive scientists agree that the Hypersensitive Agency Detection Device (HADD) played an important role. In brief, the HADD explains why we see faces in the clouds, but never clouds in faces. Neuroscientist Dean Buonomano puts it this way: “We are inherently comfortable assigning a mind to other entities. Whether the other entity is your brother, a cat, or a malfunctioning computer, we are not averse to engaging it in conversation.” This ability attributes will and intention to other people, animals and inanimate objects. The HADD produces a lot of false positives (e.g., seeing the Virgin Mary in a piece of toast), and God might be one of them.

Another feature of the human mind that religion might have co-opted is a natural propensity toward a dualistic theory of mind. Dualism is our tendency to believe that people are made up of physical matter (e.g., lungs, DNA, and atoms) as well as an underlying, internal essence. Even the strictest materialist cannot escape this sentiment; we all feel that there is a “me” resting somewhere in our cortices. A belief in disembodied spirits could have given rise to beliefs in supernatural entities that exist independent of matter. Yale psychologist Paul Bloom is a proponent of this view and supports his conclusions with experimental evidence highlighted in his book Descartes’ Baby.

Although the by-product hypothesis, as it is known, is incomplete, it points to the same logic: “a bit of mental machinery evolved because it conferred a real benefit, but the machinery sometimes misfires, producing accidental cognitive effects that make people prone to believing in gods.”

This is an important piece of the puzzle for the New Atheists. If religion is the offshoot of a diverse set of cognitive modules that evolved for a variety of problems, then religious beliefs are nothing more than a series of neural misfires that are “correctable” with secular Enlightenment thinking.

Not everyone agrees. The evolutionary biologists David Sloan Wilson and Edward O. Wilson propose that religiosity is a biological adaptation that created communities by instilling a “one for all, all for one” mentality in their members. This is important because it allowed group members to function as a superorganism, which gave them an advantage on the African savannah. “An unshakable sense of unity among… warriors,” Buonomano says, “along with certainty that the spirits are on their side, and assured eternity, were as likely then, as they are now, to improve the chances of victory in battle.” The binding power of religion would also have helped communities form objective moral codes – do unto others as you would have others do unto you – and protected them against free riders.

Jonathan Haidt is making a name for himself by advocating this point. In addition to the group selection hypothesis, Haidt points to our species’ ability to experience moments of self-transcendence. The world’s religions, he believes, are successful because they found a way to facilitate such experiences. Here’s how he explained it in a recent TED talk:

If the human capacity for self-transcendence is an evolutionary adaptation, then the implications are profound. It suggests that religiosity may be a deep part of human nature. I don’t mean that we evolved to join gigantic organized religions — that kind of religion came along too recently. I mean that we evolved to see sacredness all around us and to join with others into teams that circle around sacred objects, people and ideas. This is why politics is so tribal. Politics is partly profane, it’s partly about self-interest. But politics is also about sacredness. It’s about joining with others to pursue moral ideals. It’s about the eternal struggle between good and evil, and we all believe we’re on the side of the good.

What’s interesting about Haidt’s angle is that it casts an unflattering light on the Enlightenment and secular ideals that Western civilization was founded on. We exalt liberty, individualism and the right to pursue our self-interest. But are we ignoring our innate desire to be part of something greater? Are we denying our groupish mentalities? The modern world gives us fixes – think big football games or raves – but I think some atheists are deprived.

And this brings me back to the fluke and the New Atheists. If Haidt is right, and our religiosity is an evolutionary adaptation, then religious beliefs are a feature of, not a poison to, our cognition. The fluke, therefore, is not a parasite but an evolutionary blessing that facilitated the creation of communities and societies. This is not to deny all the bloodshed carried out on behalf of religion. But if religion is an adaptation and not a byproduct, then “we cannot expect people to abandon [it] so easily.”

The Future Of Religion

Religious people, that is, people who say that religion is important in their lives, have, on average, higher subjective well-being. They find a greater sense of purpose or meaning, are connected to stronger social circles, and live longer, healthier lives. Why, then, are so many dropping out of organized religion?

Last year a team of researchers led by Ed Diener tried to answer this question. They found that economically developed nations are much less likely to be religious. On the other hand, religion is widespread in countries with more difficult circumstances. “Thus,” the authors conclude, “it appears that the benefits of religion for social relationships and subjective well-being depend on the characteristics of the society.” People of developed nations are dropping out of organized religion, then, because they are finding meaning and wellness elsewhere.

The real paradox is America, where Nietzsche’s anti-theistic proclamation went unheard. Eighty-three percent of Americans identify with a religious denomination, most say that religion is “very important” in their lives and, according to Sam Harris, 44 percent “of the American population is convinced that Jesus will return to judge the living and the dead sometime in the next fifty years.” In fact, a recent study even showed that atheists are largely seen as untrustworthy compared to Christians and Muslims.

Why does the United States, one of the most economically developed countries in the world, deviate from the correlation between religion and wealth? One answer is that trends always contain outliers. As Nigel Barber explains in an article: “The connection between affluence and the decline of religious belief is as well-established as any such finding in the social sciences…. [and] no researcher ever expects every case to fit exactly on the line… If they did, something would be seriously wrong.”

Whatever the reasons, a recent article by David Campbell and Robert Putnam suggests that Americans are catching up to their non-believing European counterparts. According to Campbell and Putnam, the number of “nones” – those who report no religious affiliation – has dramatically increased in the last two decades. “Historically,” Campbell and Putnam explain, “this category made up a constant 5-7 percent of the American population… in the early 1990s, however, just as the God gap widened in politics, the percentage of nones began to shoot up. By the mid-1990s, nones made up 12 percent of the population. By 2011, they were 19 percent. In demographic terms, this shift was huge.”

A study by Daniel Mochon, Michael Norton and Dan Ariely fits well with this observation. They discovered that, “while fervent believers benefit from their involvement, those with weaker beliefs are actually less happy than those who do not ascribe to any religion – atheists and agnostics.” It’s possible the “nones” Campbell and Putnam speak of are motivated to abandon their belief by a desire to be happier and less conflicted about their lives. This might be too speculative, but there are plenty of stories, especially in the wake of the New Atheist movement, of people who describe their change of faith as a dramatic improvement in their emotional lives. In a recent interview with Sam Harris, for example, Tim Prowse, a United Methodist pastor for almost 20 years, described leaving his faith as a great relief. “The lie was over, I was free,” he said, “…I’m healthier now than I’ve been in years and tomorrow looks bright.”

What does this say about the future of atheism? Hitchens and others suggest that a standoff between believers and non-believers may be inevitable. “It’s going to be a choice between civilization and religion,” he says. However, grandiose predictions about the future of the human race are almost always off the mark, and it’s likely that the decline in religion will remain slow and steady. It’s important to keep in mind that this decline is a recent phenomenon. It wasn’t until the 17th century, the so-called Age of Reason, that writers, thinkers and some politicians began to insist that societies are better off when they give their citizens the political right to communicate their ideas. This was a key intellectual development, and in the context of the history of civilization, a very recent one.

To be sure, radical ideologies will always exist; religion, Marx suggested, is the opiate of the people. But the trend towards empiricism, logic and reason is undeniable and unavoidable. Titles including God Is Not Great and The God Delusion are bestsellers for a reason. And if Prowse’s testimony as well as Campbell and Putnam’s data are indicative, there is a clear shift in the zeitgeist.

What Motivates A Suicide Bomber?

Suicide terrorism is a peculiar business. As a means of killing civilians it is hugely efficient. Steven Pinker explains that, “it combines the ultimate in surgical weapon delivery – the precision manipulators and locomotors called hands and feet, controlled by the human eyes and brain – with the ultimate in stealth – a person who looks just like millions of other people.” The most sophisticated drone doesn’t come close.

Relative to the past few decades, it is on the rise. During the 1980s the world saw an average of about five suicide attacks per year. Between 2000 and 2005 that number skyrocketed to 180. The targets have been diverse. Israel, Iraq and Afghanistan get all the media attention, but Somalia and Sri Lanka have experienced their share of self-destruction over the past five years.

What’s peculiar about suicide terrorism is that it is especially difficult to understand from a psychological point of view. Most people find it impossible to empathize with someone who walks into a crowded Jerusalem market wearing an overcoat filled with nails, ball bearings and rat poison, intending to detonate the bomb strapped to his waist (99 percent of suicide terrorists are male). How do we make sense of this?

Secular westerners tend to understand suicide terrorists as unfortunate products of undeveloped, undereducated and economically devastated environments. This isn’t true. All the 9/11 hijackers were college educated and suffered “no discernible experience of political oppression.” As Sam Harris explains:

Economic advantages and education, in and of themselves, are insufficient remedies for the cause of religious violence. There is no doubt that many well-educated, middle-class fundamentalists are ready to kill and die for God…. Religious fundamentalism in the developing world is not, principally, a movement of the poor and uneducated.

What is a sufficient explanation? In the case of Islam, why are so many of its followers eager to turn themselves into bombs? Harris believes that it is “because the Koran makes this activity seem like a career opportunity… Subtract the Muslim belief in martyrdom and jihad, and the actions of suicide bombers become completely unintelligible.” However you interpret the Koran, Harris’ position is that faith motivates Muslim suicide terrorists and that beliefs are the key to understanding the psychology of suicide terrorism. When nineteen Muslim terrorists woke up on the morning of September 11th they believed that 72 virgins awaited them in Heaven; they believed they would be remembered as heroes; they believed that self-destruction in the name of their God was glorious. It does not take a stretch of the imagination to correctly guess what they were saying (I should say, praying) moments before their doom.

Epistemology isn’t the whole story. Action requires belief but belief is not created in a vacuum. Understanding the motives of suicide bombers demands knowledge of the community they grew up in. You need context.

This is precisely what anthropologist Scott Atran attempted to dissect. After interviewing failed and prospective suicide bombers, he published several articles outlining their psychological profile and concluded that the call to martyrdom is appealing because it offers an opportunity to join a cohesive and supportive community of like-minded people. Here’s Atran’s testimony to a U.S. Senate subcommittee:

When you look at whom [suicide terrorists] idolize, how they organize, what bonds them and what drives them; then you see that what inspires the most lethal terrorists in the world today is not so much the Koran or religious teachings as a thrilling cause and call to action that promises glory and esteem in the eyes of friends, and through friends, eternal respect and remembrance in the wider world that they will never live to enjoy.

The work of anthropologist Richard Sosis suggests that Atran is correct. Sosis studied the history of communes in the United States in the nineteenth century. He found that twenty years after their founding, 6 percent of the secular communes still existed, compared to 39 percent of the religious communes. He also discovered that the more costly the sacrifices a religious commune demanded, the better it functioned. By requiring members to abstain from things like alcohol and to conform to dress codes, the religious communes quickly and effectively bound their members together. This is why, if the West wants to minimize suicide terrorism, Atran recommends, it should “[learn] how to minimize the receptivity of mostly ordinary people to recruiting organizations.”

Thankfully, the number of suicide bombings has declined in the last few years. In Iraq, vehicle and suicide attacks dropped from 21 a day in 2007 to about 8 a day in 2010. Along with the surge of American soldiers, the decline can be attributed to an attitude shift within the Islamic community. In Pinker’s latest book he explains that, “in the North-West Frontier Province in Pakistan, support for Al Qaeda plummeted from 70 percent to 4 percent in just five months in late 2007… In a 2007 ABC/BBC poll in Afghanistan, support for jihadist militants nosedived to one percent.” If Atran is correct in suggesting that suicide terrorism is fueled by an appeal to community and an opportunity to gain esteem, then this is good news.

Individual beliefs and the communities they arise from help us understand the psyche of suicide bombers. But even a sufficient explanation would leave me wondering. Our DNA has one goal: replication. That natural selection has given us the means to stop this process might be one of Nature’s great ironies.


Why Atheists Should Be Allowed To Cherry Pick From Religion

Ever since Darwin published the Origin of Species, Nietzsche declared the death of God and Hitchens argued that religion poisons everything, atheists have struggled with atheism. Some deny the supernatural but are “spiritual”; some deny the historical credibility of the Bible, the Torah or the Quran but value their principles; some don’t believe in anything that cannot be explained by science yet maintain that humans possess an intangible essence or that there is an afterlife. I’ve even met folks who call themselves “atheists who believe in God.”

It’s easy to see such beliefs as inconsistent or incompatible; how can someone both believe and not believe in God? Be scientific and religious? But this attitude ignores a truth that doesn’t get said enough: atheism is diverse.

The repetitive, attention-grabbing debates between fundamentalists and non-believers are one reason this is forgotten. It’s easy to assume that only two opinions exist when searching “atheism” on YouTube or Google returns talks and articles only from the likes of William Lane Craig or Christopher Hitchens.

But most atheists know that the worldviews of the fundamentalist and the staunch non-believer inaccurately portray religious belief as black and white. These more mainstream atheists know that there is a fairly large middle ground where religion and atheism can coexist to promote human flourishing. Religious people can believe in natural selection and be pro-choice even though many texts suggest otherwise, while atheists have no problem being moral and giving to charity even though they never went to Sunday school.

When it comes to scientific claims, Hitchens and Dawkins are right: the world wasn’t created in a few days; natural selection is an observable phenomenon; God probably doesn’t exist; one can be moral without religion. But when it comes to how we ought to behave and what we ought to value, the great religious texts got a few things correct. The problem is that hardcore atheists don’t let the mainstream cherry-pick the good parts of religion without criticizing them for being inconsistent or intellectually lazy. We have to allow atheism to incorporate those religious practices and principles that we know contribute to human flourishing.

My conviction is not only a reminder that atheism is more diverse than some make it out to be, but also that atheism can be improved if it considers the right religious themes.

In a recent TED lecture Alain de Botton assumes a similar position. He explains:

I am interested in a kind of constituency that thinks something along these lines… I can’t believe in any of this stuff. I can’t believe in the doctrines… but – and this is a very important but – I love Christmas carols! I really like the art of Mantegna, I really like looking at old churches and I really like turning the pages of the Old Testament. Whatever it may be, you know the kind of thing I am talking about: people who are attracted to the ritualistic side, the moralistic, communal side of religion but can’t bear the doctrine. Until now these people have faced an unpleasant choice: either accept the doctrine and have all the nice stuff, or reject the doctrine and live in a spiritual wasteland… I don’t think we have to make that choice… there’s nothing wrong with picking and mixing, with taking out the best sides of religion. To me atheism 2.0 is about a respectful and impious way of going through religions and saying, what could we use? The secular world is full of holes… a thorough study of religion can give us all sorts of insights into areas of life that are not going too well.

The good news is, I think, most people agree. The problem is that they don’t get the coverage.

At the risk of stating the obvious, let’s remember that knowing how to live the best possible life requires both humanistic ideals as well as ideals from many of the great religions. As Jonathan Haidt concludes his enjoyable book The Happiness Hypothesis, “by drawing on wisdom that is balanced – ancient and new, Eastern and Western, even liberal and conservative – we can choose directions in life that will lead to satisfaction, happiness, and a sense of meaning.”

Does Pinker’s “Better Angels” Undermine Religious Morality?

Pinker at Strand book store in Manhattan last week

It is often argued that religion makes individuals and the world more just and moral, that it builds character and provides a foundation from which we understand right from wrong and good from evil; if it weren’t for religion, apologists say, the world would fall into a Hobbesian state of nature where violence prevails and moral codes fail. To reinforce this contention, they point out that Stalin, Hitler and Mao were atheists, forcing an illogical causal connection between what those men did and what they believed.

One way to answer the question of whether religion makes people and the world more moral and better off is to look at the history books. For that, I draw upon Steven Pinker’s latest, The Better Angels of Our Nature, an 800-page giant that examines the decline of violence from prehistoric hunter-gatherer societies to the present. Pinker opens his book with the following: “Believe it or not – and I know that most people do not – violence has declined over long stretches of time, and today we may be living in the most peaceable era in our species’ existence. The decline, to be sure, has not been smooth; it has not brought violence down to zero; and it is not guaranteed to continue. But it is an unmistakable development, visible on scales from millennia to years, from the waging of wars to the spanking of children.” Whether or not you’re familiar with Better Angels, it’s worth reviewing its arguments about why violence declined. Let’s run through three sections of Pinker’s book – The Pacification Process, The Civilizing Process, and The Humanitarian Revolution – to see how violence declined. Doing so will allow us to judge whether history has anything to say about religion being a credible source of moral good at the individual and global level.

The Pacification Process describes the shift from hunter-gatherer societies to state-run societies. Comparing data from hunter-gatherer societies to modern states reveals two different worlds. For example, the percentage of deaths due to violent trauma (which we know from archaeological studies) in hunter-gatherer societies was on average about 15 percent, with the Crow Creek Native Americans of South Dakota (circa 1325 CE) topping the list at just below 65 percent and Nubia (circa 12,000-10,000 BCE) at the bottom at just below 10 percent. By comparison, in 2005 the figure was less than a tenth of one percent; people just aren’t killing each other like they used to, in other words. Another way to compare hunter-gatherer societies to state-run societies is war deaths per 100,000 people per year. In hunter-gatherer societies the average was 524. In contrast, consider the two most violent state-run societies of the modern era: Germany in the 20th century, which fought two world wars, comes in at 135, and Russia, which fought two world wars and underwent a major revolution, at 130. The whole world in the 20th century averaged around 60 war deaths per 100,000 people per year. Taken together, then, Hobbes got it right when he said that life in the state of nature was “solitary, poor, nasty, brutish and short.”

The Civilizing Process describes the decline of violence in Europe from the Middle Ages, beginning around 1200, to the modern era. One way to make the comparison is to look at homicides per 100,000 people per year in England over the course of the last 800 years. Between 1200 and 1400, roughly 20 to 30 of every 100,000 English people were murdered each year. Compare this to the year 2000, when the figure was less than one. This means, as Pinker says, that “a contemporary Englishman has a 50-fold less chance of being murdered than his compatriot in the Middle Ages.” The same story holds across Europe, where murder rates declined in a nearly identical fashion. In Italy, for example, the rate dropped from about 90 homicides per 100,000 per year in 1300 to between one and two in 2000, and in the Netherlands it dropped from about 80 to between one and two over the same period. Indeed, as Pinker remarks, “from the 14th century on, the European homicide rate sank steadily.” The United States saw similar trends, though obviously not over the same period of time. Here’s one example: homicides per 100,000 per year in California fell from a bit over a hundred in 1850 to less than ten in 1910; it truly was the Wild West.

The Humanitarian Revolution describes the rise of human rights, individualism, and liberal ideals over the last few centuries. There are several ways to examine this; one is the abolition of judicial torture. From just before 1700 to just after 1850, every major European country officially abolished every form of judicial torture, including “breaking at the wheel, burning at the stake, sawing in half, impalement, and clawing.” In addition, England abolished the death penalty for non-lethal crimes including “poaching, counterfeiting, robbing a rabbit warren and being in the company of Gypsies.” By the end of the 20th century, the death penalty had been abolished outright in nearly every European country (Russia and Belarus excepted). The United States saw similar trends. In the 17th and 18th centuries, it abolished capital punishment for crimes including “theft, sodomy, bestiality, adultery, witchcraft, concealing birth, burglary, slave revolt, and counterfeiting.” Capital punishment remains legal, however, though only about 50 people per year are executed. Describing the Humanitarian Revolution would be incomplete without mentioning the abolition of slavery, which swept through many countries around the world over the course of the 19th century. Mauritania was the last country to abolish slavery, in 1981. It is also worth noting that the number of countries with policies that discriminate against ethnic minorities fell from 44 in 1950 to under 20 in 2003; the number of peacekeepers rose from zero just after World War Two to somewhere in the tens of thousands; and over 90 countries in the world are now democratic, compared to fewer than 20 autocracies.

Pinker describes two more processes – The Long Peace and The New Peace – which trace similar trends in the 20th century. In brief, pick your metric of violence and it’s a safe bet it has gone down in the last century. There are, however, a few details regarding social issues in the United States worth mentioning. First, we saw a reduction in hate crimes and domestic violence: lynchings dropped from about 150 per year in 1880 to zero in 1960, and assaults by intimate partners fell from about 1,000 (female victims) and about 200 (male victims) to about 400 and about 50, respectively. We also saw changes in sentiments toward minorities and women. The percentage of white people who “would move if a black family moved in next door” fell over the past six decades from 50 percent to nearly zero; the percentage of white people who believed that “black and white students should go to separate schools” fell similarly; and approval of a husband slapping his wife dropped steadily throughout the second half of the 20th century. In addition, gay rights have expanded dramatically, animal rights have increased and hate crimes have declined.

By now, the decline of violence should be clear (if you’re not sold, read Pinker’s book). What remains uncertain are its causes. This brings me back to religion and its claim to provide a necessary moral foundation for individuals and society. My contention is that, considering the data Pinker assembles, there is little evidence to support this assertion. That is, religion is not responsible for the moral progress of the last few centuries or for humanity pulling itself out of its former Hobbesian state. As Pinker himself asserts, “the theory that religion is a force for peace, often heard among the religious right and its allies today, does not fit the facts of history.”

If not religion, then what? The more accurate picture is that humans are inclined toward both violence and peace. Douglas Kenrick’s research, which Pinker cites, shows that most people (male and female) occasionally fantasize about killing another person, and a trip to the movies or a hockey game will probably confirm the sentiment. Paul Bloom’s research, on the other hand, shows that babies as young as six months have a moral sense of good and bad. It’s therefore much more fruitful to ask which historical circumstances bring out what Abraham Lincoln called our “better angels.”

Pinker identifies four “better angels” – self-control, empathy, a moral sense and reason – and four historical circumstances or “pacifying forces” that favor them over our “inner demons.” The first is the “Leviathan,” or the state. As the Pacification Process illustrated, state-run societies are much more peaceful than hunter-gatherer societies. There are a number of reasons for this. Most obvious is the fact that it is impossible to enforce laws under anarchy. It is only in a state-run society that laws against physical abuse or murder can be enforced. In addition, whereas hunter-gatherers were often forced to fight over food and territory, citizens of states tend to be more secure.

The second is “gentle commerce.” This describes the process by which individuals realize that trade can be a win-win. It’s the lesson of Adam Smith’s Wealth of Nations: a society benefits when its citizens are free to trade with one another and form their own businesses. The McDonald’s theory – the observation that no two countries with McDonald’s have ever gone to war with each other – highlights how gentle commerce benefits society on a global scale.

The third is the idea of the “expanding circle,” which describes our growing tendency to be kind and empathetic toward strangers. Whereas hunter-gatherers and citizens of early states cared only for their kin, people today are much more helpful, forgiving, and caring toward strangers. This helps explain why we often give money to people we’ve never met even when there is no return, as with charities or tipping (in the famous Ultimatum experiment, in which people are given $20 and the choice to keep all of it, split it $18/$2, or split it $10/$10, most split it evenly). Indeed, institutions like the Red Cross and UNICEF are predicated on the idea that humans are willing to give to others more in need. What expanded the circle? Pinker points to increased cosmopolitanism, which research shows encourages people to adopt the perspective of others.

The fourth is the “escalator of reason.” Pinker says it best: “As literacy and education and the intensity of public discourse increase, people are encouraged to think more abstractly and more universally. That will inevitably push in the direction of a reduction of violence. People will be tempted to rise above their parochial vantage points – that makes it harder to privilege one’s own interests over others’. It replaces a morality based on tribalism, authority and puritanism with a morality based on fairness and universal rules. It encourages people to recognize the futility of cycles of violence and to see it as a problem rather than a contest to be won.” It shouldn’t come as a surprise, then, that the rise of published books and literacy rates preceded the Enlightenment, an era that was vital to the rise of human rights.

These are the four pacifying forces that favor our “better angels.” Reviewing them calls into question the claims that religion is a necessary moral foundation and that the world is better because of religion. If those two claims were true, it would be difficult to explain why the decline of violence and the rise of humanitarian rights occurred so many years after the inception of the Abrahamic religions. If religion does bring out our better angels, it was late to the game. While apologists were busy trying to prove the existence of God and justify scriptures that preach “genocide, rape, slavery and the execution of nonconformists,” the Age of Reason allowed Europeans to realize that understanding what is morally right and what contributes most to human flourishing does not require religious texts.

This is not to ignore the fact that good things have happened on behalf of religion. The Quakers, to their credit, supported the abolition of slavery in the United States long before most; figures like Desmond Tutu have been instrumental in reducing global and national conflicts; and positive psychology research tells us that religion is a significant source of personal happiness. But it is to deny the claims that religion is a necessary moral foundation and that the world would fall into moral anarchy without it. People assume that a moral sense or code, an understanding of right and wrong, requires religion. Is this true? Reviewing the data outlined in The Better Angels of Our Nature, it is apparent that religion played at best a minimal role. It seems more plausible to explain the decline of violence through the other historical circumstances and forces I’ve outlined here.

Taken together, then, it’s probably most accurate to say that religion has been along for the ride but certainly hasn’t been in the driver’s seat. Waves of violence have come and gone – thankfully, mostly gone – and humanitarian rights are at an all-time high at the hands of other historical forces. People who believe that religion provides a necessary moral foundation are merely paying “lip service [to the Bible] as a symbol of morality, while getting their actual morality from more modern principles.”

A Case Against Religious Moderation

Imagine that instead of “In God We Trust,” dollar bills in the United States read “In Zeus We Trust.” Or think what it would be like if Barack Obama ended his speeches with “Apollo bless the United States of America.” And consider how strange it would sound if one of your friends told you that they recently found deep comfort in Poseidon. What’s absurd about such statements is not the mention of Zeus, Apollo or Poseidon; it is that these Gods have the same ontological status as the Judeo-Christian God that our money, presidents and friends take seriously. That is, there is zero scientific evidence to suggest any of these Gods are real, though most people overwhelmingly favor one.

Yet, for no real reason, we are quick to call someone who professes a deep faith in Poseidon crazy, while we would never challenge someone who professes a deep faith in the Judeo-Christian God. This is a double standard. We should challenge both, be less politically correct, and be able to scrutinize all beliefs against what we know about the natural world. But we don’t, because we are too religiously moderate – our propensity to give people who believe in God a free pass from legitimate criticism is too strong. This is deeply problematic, and I’d like to outline three reasons why, which I draw from religious critic and neuroscientist Sam Harris.

The first problem with religious moderation is that it is “intellectually bankrupt.” When it comes to any legitimate academic subject, we evaluate its findings rationally. That is, we carefully examine its reasoning, test its hypotheses and try to replicate its findings. This is how academic progress happens. Psychology, for example, shifted from Freudian psychoanalysis, to Skinnerian behaviorism, to cognitivism over the last hundred-plus years by using the scientific method. Now we know more about the brain and behavior. The same story holds in any other academic subject – knowledge increases when old beliefs are challenged. The same cannot be said of virtually any religion, because religions are set up so that challenging their tenets is sinful, usually paid for with eternal damnation. If religious beliefs are not subjected to the same analyses as scientific ones, they will remain dogmatic, static and “intellectually bankrupt.”

The second problem with religious moderation is that it causes people (liberal Westerners mainly) to misunderstand things like suicide bombing, the mistreatment of women or homosexuals, and honor killings. As Harris explains, when religious moderates see a jihadist say “we love death more than the infidel loves life” and then blow himself (or herself) up, they tend to think that religion didn’t have a lot to do with it, citing socioeconomic, educational and societal reasons instead. This is incorrect. As scary as it is, there are many well-educated people living in well-established communities who believe that blowing themselves up in the name of God is a good idea. I’m not denying that difficult cultural circumstances can play a role in fundamentalism, but consider this paragraph from a New York Times article a few years ago:

We examined the educational backgrounds of 75 terrorists behind some of the most significant recent terrorist attacks against Westerners. We found that a majority of them are college-educated, often in technical subjects like engineering. In the four attacks for which the most complete information about the perpetrators’ educational levels is available – the World Trade Center bombing in 1993, the attacks on the American embassies in Kenya and Tanzania in 1998, the 9/11 attacks, and the Bali bombings in 2002 – 53 percent of the terrorists had either attended college or had received a college degree. As a point of reference, only 52 percent of Americans have been to college. The terrorists in our study thus appear, on average, to be as well-educated as many Americans.

Religious moderates must realize that well-educated and well situated people have strong, sometimes deadly beliefs.

The third problem with religious moderation is that it gives cover to fundamentalists. If people insist that everyone has the right to believe what they want and practice their own religion, they are also allowing people to believe that anyone who isn’t a _____ (fill in your religion) will be damned to hell. In other words, they are opening the door to any number of harmful, ungrounded beliefs. For example, religious dogmas are holding back stem-cell research, an incredibly promising field that would surely benefit human beings. Likewise, they forbid the use of condoms in sub-Saharan Africa, where extraordinarily high numbers of people are dying from AIDS. Religious tolerance is a good idea, as is respecting other people’s beliefs, but there should be a limit.

To review, I believe that religious moderation is bad for three reasons: It is intellectually dishonest – we should scrutinize religious beliefs in the same way we scrutinize academic beliefs; it causes us to be blind to how powerful beliefs can be; and it allows cover for fundamentalists. Hopefully, we can be less politically correct and begin to criticize God in the same way we criticize any other idea.

