
What Conspiracy Theories Teach Us About Reason

Conspiracy theories are tempting. There is something especially charming about a faked moon landing or a government-backed assassination. Christopher Hitchens called them “the exhaust fumes of democracy.” Maybe he’s right: cognitive biases, after all, feast on easy access to information and free speech.

Leon Festinger carried out the first empirical study of conspiracy theorists. In 1954 the social psychologist infiltrated a UFO cult that was convinced the world would end on December 21st. In his book When Prophecy Fails, Festinger recounts how, after midnight came and went, the cult’s leader, Marian Keech, told her followers that she had received a message through automatic writing: the God of Earth had decided to spare the planet from destruction. Relieved, the cult members continued to spread their doomsday ideology.

Festinger coined the term cognitive dissonance to describe the psychological consequences of disconfirmed expectations. It is a “state of tension that occurs whenever a person holds two cognitions that are psychologically inconsistent,” as two authors describe it, and “the more committed we are to a belief, the harder it is to relinquish, even in the face of overwhelming contradictory evidence.”

Smokers are another good example: they keep smoking even though they know it kills. And after unsuccessfully quitting, they tend to say that “smoking isn’t that bad” or that “it’s worth the risk.” In a related example, doctors who performed placebo surgeries on patients with osteoarthritis of the knee “found that patients who had ‘sham’ arthroscopic surgery reported as much relief… as patients who actually underwent the procedure.” Many patients continued to report dramatic improvement even after surgeons told them the truth.

A recent experiment by Michael J. Wood, Karen M. Douglas and Robbie M. Sutton reminds us that holding inconsistent beliefs is more the norm than the exception. The researchers found that “mutually incompatible conspiracy theories are positively correlated in endorsement.” Many participants, for example, believed both that Princess Diana faked her own death and that she was killed by a rogue cell of British Intelligence, or both that Osama bin Laden was already dead when U.S. forces raided his compound and that he is still alive. The authors conclude that many participants showed “a willingness to consider and even endorse mutually contradictory accounts as long as they stand in opposition to the officially sanctioned narrative.”

The pervasiveness of cognitive dissonance helps explain why it sometimes takes societies several generations to adopt new beliefs. People do not simply change their minds, especially when there is a lot on the line. It took several centuries for slavery to be universally banned (Mauritania was the last country to abolish it, in 1981). In the United States, the civil rights movements for women and African-Americans lasted decades. Same-sex marriage probably won’t be legal in all 50 states for several more years. Our propensity to hold onto cherished beliefs also pervades science. As Max Planck said, “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

Are there ways to dilute the negative effects of cognitive dissonance? I’m afraid the Internet is part of the problem: Google makes it easy to find something that confirms almost any belief. But it is also part of the solution. History tells us that cooperation and empathy between individuals, institutions and governments increase as the exchange of information becomes easier. From the printing press to Uncle Tom’s Cabin and through the present day, when social networks are the primary means of communication for so many, people become more willing to consider points of view other than their own the more they are exposed to other perspectives.

Steven Pinker captures this point well in his latest book: “As literacy and education and the intensity of public discourse increase, people are encouraged to think more abstractly and more universally. That will inevitably push in the direction of a reduction of violence. People will be tempted to rise above their parochial vantage points – that makes it harder to privilege one’s own interest over others.” It shouldn’t come as a surprise, then, that the rise in book publishing and literacy rates preceded the Enlightenment, an era that was vital to the rise of human rights.

This brings me back to Hitchens’s quote. Indeed, a byproduct of democracy is the tendency of some people to believe whatever they want, even in the face of overwhelming contradictory evidence. But Pinker reminds us that democracy also helps relieve our hardwired propensity to look only for what confirms our beliefs. That our confirmation biases are innate suggests they will never disappear, but the capacity to reason, facilitated by the exchange of information, paints an optimistic future.