
Posts tagged ‘Christopher Hitchens’

The Future Of Religion

Religious people, that is, people who say that religion is important in their lives, have, on average, higher subjective well-being. They find a greater sense of purpose or meaning, are connected to stronger social circles, and live longer, healthier lives. Why, then, are so many dropping out of organized religion?

Last year a team of researchers led by Ed Diener tried to answer this question. They found that economically developed nations are much less likely to be religious. On the other hand, religion is widespread in countries with more difficult circumstances. “Thus,” the authors conclude, “it appears that the benefits of religion for social relationships and subjective well-being depend on the characteristics of the society.” People in developed nations are dropping out of organized religion, then, because they are finding meaning and wellness elsewhere.

The real paradox is America, where Nietzsche’s anti-theistic proclamation went unheard. Eighty-three percent of Americans identify with a religious denomination, most say that religion is “very important” in their lives, and, according to Sam Harris, 44 percent “of the American population is convinced that Jesus will return to judge the living and the dead sometime in the next fifty years.” In fact, a recent study even showed that atheists are largely seen as untrustworthy compared to Christians and Muslims.

Why does the United States, one of the most economically developed countries in the world, deviate from the correlation between religion and wealth? One answer is that trends always contain outliers. As Nigel Barber explains in an article: “The connection between affluence and the decline of religious belief is as well-established as any such finding in the social sciences…. [and] no researcher ever expects every case to fit exactly on the line… If they did, something would be seriously wrong.”

Whatever the reasons, a recent article by David Campbell and Robert Putnam suggests that Americans are catching up to their non-believing European counterparts. According to Campbell and Putnam, the number of “nones” – those who report no religious affiliation – has dramatically increased in the last two decades. “Historically,” Campbell and Putnam explain, “this category made up a constant 5-7 percent of the American population… in the early 1990s, however, just as the God gap widened in politics, the percentage of nones began to shoot up. By the mid-1990s, nones made up 12 percent of the population. By 2011, they were 19 percent. In demographic terms, this shift was huge.”

A study by Daniel Mochon, Michael Norton and Dan Ariely fits well with this observation. They discovered that, “while fervent believers benefit from their involvement, those with weaker beliefs are actually less happy than those who do not ascribe to any religion: atheists and agnostics.” It’s possible the “nones” Campbell and Putnam speak of are motivated to abandon their belief by a desire to be happier and less conflicted about their lives. This might be too speculative, but there are plenty of stories, especially in the wake of the New Atheist movement, of people who describe their change of faith as a dramatic improvement in their emotional lives. In a recent interview with Sam Harris, for example, Tim Prowse, a United Methodist pastor for almost 20 years, described leaving his faith as a great relief. “The lie was over, I was free,” he said, “…I’m healthier now than I’ve been in years and tomorrow looks bright.”

What does this say about the future of atheism? Hitchens and others suggest that a standoff between believers and non-believers may be inevitable. “It’s going to be a choice between civilization and religion,” he says. However, grandiose predictions about the future of the human race are almost always off the mark, and it’s likely that the decline in religion will remain slow and steady. It’s important to keep in mind that this decline is a recent phenomenon. It wasn’t until the 17th century, the so-called Age of Reason, that writers, thinkers and some politicians began to insist that societies are better off when they give their citizens the political right to communicate their ideas. This was a key intellectual development, and, in the context of the history of civilization, a very recent one.

To be sure, radical ideologies will always exist; religion, Marx suggested, is the opiate of the people. But the trend towards empiricism, logic and reason is undeniable and unavoidable. Titles including God Is Not Great and The God Delusion are bestsellers for a reason. And if Prowse’s testimony as well as Campbell and Putnam’s data are indicative, there is a clear shift in the zeitgeist.

What Conspiracy Theories Teach Us About Reason

Conspiracy theories are tempting. There is something especially charming about a forged moon landing or government-backed assassination. Christopher Hitchens called them “the exhaust fumes of democracy.” Maybe he’s right: cognitive biases, after all, feast on easy access to information and free speech.

Leon Festinger carried out the first empirical study of conspiracy theorists. In 1954 the social psychologist infiltrated a UFO cult that was convinced the world would end on December 20th. In his book When Prophecy Fails, Festinger recounts how, after midnight came and went, the leader of the cult, Marian Keech, explained to her members that she had received a message through automatic writing telling her that the God of Earth had decided to spare the planet from destruction. Relieved, the cult members continued to spread their doomsday ideology.

Festinger coined the term cognitive dissonance to describe the psychological consequences of disconfirmed expectations. As two authors describe it, it is a “state of tension that occurs whenever a person holds two cognitions that are psychologically inconsistent,” and “the more committed we are to a belief, the harder it is to relinquish, even in the face of overwhelming contradictory evidence.”

Smokers are another good example; they smoke even though they know it kills. And after trying and failing to quit, they tend to say that “smoking isn’t that bad” or that “it’s worth the risk.” In a related example, doctors who performed placebo surgeries on patients with osteoarthritis of the knee “found that patients who had ‘sham’ arthroscopic surgery reported as much relief… as patients who actually underwent the procedure.” Many patients continued to report dramatic improvement even after surgeons told them the truth.

A recent experiment by Michael J. Wood, Karen M. Douglas and Robbie M. Sutton reminds us that holding inconsistent beliefs is more the norm than the exception. The researchers found that “mutually incompatible conspiracy theories are positively correlated in endorsement.” Many subjects, for example, believed both that Princess Diana faked her own death and that she was killed by a rogue cell of British Intelligence, or both that the death of Osama bin Laden was a cover-up and that he is still alive. The authors conclude that many participants showed “a willingness to consider and even endorse mutually contradictory accounts as long as they stand in opposition to the officially sanctioned narrative.”

The pervasiveness of cognitive dissonance helps us explain why it sometimes takes societies several generations to adopt new beliefs. People do not simply change their minds, especially when there is a lot on the line. It took several centuries for slavery to be universally banned (Mauritania was the last country to do so, in 1981). In the United States, civil rights movements for women and African-Americans lasted decades. Same-sex marriage probably won’t be legal in all 50 states for several more years. Our propensity to hold onto cherished beliefs also pervades science. As Max Planck said, “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

Are there ways to dilute the negative effects of cognitive dissonance? I’m afraid the Internet is part of the problem: Google makes it easy to find something that confirms a belief. But it is also part of the solution. History tells us that cooperation and empathy between individuals, institutions and governments increase as the exchange of information becomes easier. From the printing press to Uncle Tom’s Cabin to the present day, when social networks are the primary means of communication for so many, the more people are exposed to other perspectives, the more they tend to consider points of view other than their own.

Steven Pinker captures this point well in his latest book: “As literacy and education and the intensity of public discourse increase, people are encouraged to think more abstractly and more universally. That will inevitably push in the direction of a reduction of violence. People will be tempted to rise above their parochial vantage points – that makes it harder to privilege one’s own interest over others.” It shouldn’t come as a surprise, then, that growth in book publishing and literacy preceded the Enlightenment, an era that was vital to the rise of human rights.

This brings me back to Hitchens’s quote. Indeed, a byproduct of democracy is the tendency for some people to believe whatever they want, even in the face of overwhelming contradictory evidence. However, Pinker reminds us that democracy is helping to counteract our hardwired propensity to look only for what confirms our beliefs. That our confirmation biases are innate suggests they will never disappear, but the capacity to reason, facilitated by the exchange of information, paints an optimistic future.

Why Atheists Should Be Allowed To Cherry Pick From Religion

Ever since Darwin published On the Origin of Species, Nietzsche declared the death of God and Hitchens argued that religion poisons everything, atheists have struggled with atheism. Some deny the supernatural but are “spiritual;” some deny the historical credibility of the Bible, Torah or Quran but value their principles; some don’t believe in anything that cannot be explained by science yet maintain that humans possess an intangible essence or that there is an afterlife. I’ve even met folks who call themselves “atheists who believe in God.”

It’s easy to see such beliefs as inconsistent or incompatible; how can someone both believe and not believe in God? Be scientific and religious? This attitude ignores a truth that doesn’t get said enough: atheism is diverse.

The repetitive and attention-grabbing debates between fundamentalists and non-believers are one reason this is forgotten. It’s easy to assume that only two opinions exist when a search for “atheism” on YouTube or Google returns only talks and articles from William Lane Craig or Christopher Hitchens.

But most atheists know that the worldviews of the fundamentalist and the staunch non-believer inaccurately portray religious belief as black and white. These more mainstream atheists know that there is a fairly large middle ground where religion and atheism can coexist to promote human flourishing. Religious people can believe in natural selection and be pro-choice even though many texts suggest otherwise, while atheists have no problem being moral and giving to charity even though they never went to Sunday school.

When it comes to scientific claims, Hitchens and Dawkins are right: the world wasn’t created in a few days; natural selection is an observable phenomenon; God probably doesn’t exist; one can be moral without religion. But when it comes to how we ought to behave and what we ought to value, the great religious texts got a few things correct. The problem is that hardcore atheists don’t let the mainstream cherry pick the good parts of religion without criticizing them for being inconsistent or intellectually lazy. We have to allow atheism to incorporate those religious practices and principles that we know contribute to human flourishing.

My conviction is not only that atheism is more diverse than some make it out to be, but also that atheism can be improved if it considers the right religious themes.

In a recent TED lecture Alain de Botton assumes a similar position. He explains:

I am interested in a kind of constituency that thinks something along these lines… I can’t believe in any of this stuff. I can’t believe in the doctrines… but – and this is a very important but – I love Christmas carols! I really like the art of Mantegna, I really like looking at old churches and I really like turning the pages of the Old Testament. Whatever it may be, you know the kind of thing I am talking about: people who are attracted to the ritualistic side, the moralistic, communal side of religion but can’t bear the doctrine. Until now these people have faced an unpleasant choice: either accept the doctrine and have all the nice stuff or reject the doctrine and live in a spiritual wasteland… I don’t think we have to make that choice… there’s nothing wrong with picking and mixing, with taking out the best sides of religion. To me atheism 2.0 is about a respectful and impious way of going through religions and saying what could we use. The secular world is full of holes… a thorough study of religion can give us all sorts of insights into areas of life that are not going too well.

The good news is, I think, most people agree. The problem is that they don’t get the coverage.

At the risk of stating the obvious, let’s remember that knowing how to live the best possible life requires both humanistic ideals and ideals from many of the great religions. As Jonathan Haidt concludes in his enjoyable book The Happiness Hypothesis, “by drawing on wisdom that is balanced – ancient and new, Eastern and Western, even liberal and conservative – we can choose directions in life that will lead to satisfaction, happiness, and a sense of meaning.”

What Made Christopher Hitchens Great

Christopher Hitchens was a man who called Mother Teresa a “lying, thieving Albanian dwarf” when most of the world worshipped her as a saint. He said it was a “shame that there is no hell for [Jerry] Falwell to go to” days after Falwell’s death. He labeled Henry Kissinger a “murder conspirator and war criminal,” Sarah Palin a “national disgrace,” and back in the 1980s said that “Reagan is doing to the country what he can no longer do to his wife.” He might be most remembered for believing that organized religion is “the main source of hatred in the world.” Calling him provocative would be an understatement.

Yet Hitchens was not provocative just for the heck of it; his deliberations maintained a steadfast allegiance to rational thought. Nor was he an ideological talking head; he was motivated by the truth and staunchly defended the values of the Enlightenment. This is why it is incorrect to label Hitchens the atheist’s version of a religious fundamentalist. Unlike the fundamentalists, the claims Hitchens made were researched, grounded in facts and science, and remarkably informed by a deep understanding of history and literature. His intellectual range far exceeded that of religious demagogues.

One reason the United States Congress and presidency have such low approval ratings is that the individuals who comprise them are either unwilling or unable to participate in discussions that are honest in their content, clear and logical in their presentation, and true to the facts. In other words, they are decidedly un-Hitchensian. A large part of the atheist movement, of which Hitchens was the face, is the advocacy of reason at the individual and institutional level. Along with rightfully dispelling irrational thinking motivated by ungrounded religious beliefs, God Is Not Great is a book that will continue to further this process. Hopefully lawmakers and laypeople will read it as a call for society to be more reasoned and anchored by what is true.

I hope that we will remember Hitchens as we remember great artists. Like Stravinsky, who induced a riot with his paradigm-shifting Rite of Spring, Dylan, who was called “Judas” for going electric, Picasso, who portrayed women the way he did, or Warhol, who treated art the way Ford built the Model T, Hitchens’s opinions and rhetoric replaced the expected with the unexpected while maintaining a coherent structure and clear vision. He didn’t change the medium of journalism, just as Stravinsky, Dylan, Picasso and Warhol didn’t change music or visual art per se, but he did offer ways of understanding the world that nobody else had offered before. Like the groundbreaking artists of the past, the initial rejection he met and his ongoing quarrel with mainstream thought eventually gave rise to new definitions of what is good or normal and what ought to be valued.

This is what made Hitchens great: like the Parisians who were so taken aback by the dissonance of Stravinsky’s ballet, Hitchens’s critics so desperately wanted his inharmonious rhetoric and writing to resolve, but knew, at the same time, that the tension he created and the chords he struck were why he will be remembered as one of the best.
