
Posts tagged ‘cognitive dissonance’

What Conspiracy Theories Teach Us About Reason

Conspiracy theories are tempting. There is something especially charming about a forged moon landing or government-backed assassination. Christopher Hitchens called them “the exhaust fumes of democracy.” Maybe he’s right: cognitive biases, after all, feast on easy access to information and free speech.

Leon Festinger carried out the first empirical study of conspiracy theorists. In 1954 the social psychologist infiltrated a UFO cult that was convinced the world would end on December 21st. In his book When Prophecy Fails, Festinger recounts how, after midnight came and went, the leader of the cult, Marian Keech, explained to her members that she had received a message via automatic writing telling her that the God of Earth had decided to spare the planet from destruction. Relieved, the cult members continued to spread their doomsday ideology.

Festinger coined the term cognitive dissonance to describe the psychological consequences of disconfirmed expectations. It is a “state of tension that occurs whenever a person holds two cognitions that are psychologically inconsistent,” as two authors describe it, and “the more committed we are to a belief, the harder it is to relinquish, even in the face of overwhelming contradictory evidence.”

Smokers are another good example; they smoke even though they know it kills. And after unsuccessfully quitting, they tend to say that “smoking isn’t that bad,” or that “it’s worth the risk.” In a related example, doctors who performed placebo surgeries on patients with osteoarthritis of the knee “found that patients who had ‘sham’ arthroscopic surgery reported as much relief… as patients who actually underwent the procedure.” Many patients continued to report dramatic improvement even after surgeons told them the truth.

A recent experiment by Michael J. Wood, Karen M. Douglas and Robbie M. Sutton reminds us that holding inconsistent beliefs is more the norm than the exception. The researchers found that “mutually incompatible conspiracy theories are positively correlated in endorsement.” Many subjects, for example, believed both that Princess Diana faked her own death and that she was killed by a rogue cell of British Intelligence, or both that the death of Osama bin Laden was a cover-up and that he is still alive. The authors conclude that many participants showed “a willingness to consider and even endorse mutually contradictory accounts as long as they stand in opposition to the officially sanctioned narrative.”
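To make the statistic concrete, here is a minimal sketch in Python of what “positively correlated in endorsement” looks like. The ratings below are hypothetical stand-ins I made up for illustration, not data from the study.

```python
# Minimal sketch with hypothetical ratings (not the study's data): a positive
# Pearson correlation between endorsements of two contradictory claims.
from statistics import correlation  # available in Python 3.10+

# Each position is one respondent's 1-7 agreement with each claim.
diana_faked_her_death = [1, 2, 5, 6, 3, 7, 2, 6, 4, 5]
diana_was_murdered    = [1, 3, 4, 7, 2, 6, 1, 7, 4, 6]

# A positive r means people who endorse one theory tend to endorse the
# other as well, even though both cannot be true at once.
r = correlation(diana_faked_her_death, diana_was_murdered)
print(f"Pearson r = {r:.2f}")  # roughly +0.9 for these made-up numbers
```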

The pervasiveness of cognitive dissonance helps explain why it sometimes takes societies several generations to adopt new beliefs. People do not simply change their minds, especially when there is a lot on the line. It took several centuries for slavery to be universally banned (Mauritania was the last country to do so, in 1981). In the United States, civil rights movements for women and African-Americans lasted decades. Same-sex marriage probably won’t be legal in all 50 states for several more years. Our propensity to hold onto cherished beliefs also pervades science. As Max Planck said, “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

Are there ways to dilute the negative effects of cognitive dissonance? I’m afraid the Internet is part of the problem. Google makes it easy to find something that confirms a belief. But it is also part of the solution. History tells us that cooperation and empathy between individuals, institutions and governments increase as the exchange of information becomes easier. From the printing press to Uncle Tom’s Cabin and through the present day (when social networks are the primary means of communication for so many), people tend to consider points of view other than their own the more they are exposed to other perspectives.

Steven Pinker captures this point well in his latest book: “As literacy and education and the intensity of public discourse increase, people are encouraged to think more abstractly and more universally. That will inevitably push in the direction of a reduction of violence. People will be tempted to rise above their parochial vantage points – that makes it harder to privilege one’s own interest over others.” It shouldn’t come as a surprise, then, that the rise of published books and literacy rates preceded the Enlightenment, an era that was vital to the rise of human rights.

This brings me back to Hitchens’s quote. Indeed, a byproduct of democracy is the tendency for some people to believe whatever they want, even in the face of overwhelming contradictory evidence. However, Pinker reminds us that democracy is helping to relieve our hardwired propensity to look only for what confirms our beliefs. That our confirmation biases are innate suggests they will never disappear, but the capacity to reason, facilitated by the exchange of information, paints an optimistic future.

Psychology’s Treacherous Trio: Confirmation Bias, Cognitive Dissonance, and Motivated Reasoning

In 2009, a nine-year-old Brazilian girl became pregnant with twins after being raped by her stepfather. On the advice of doctors, her mother opted for her to have an abortion. Brazil outlaws abortion except when the mother’s life is in danger or when she has been raped, and after the mother pleaded her daughter’s case, the girl was granted one. Then things got really ugly. When the Archbishop of the city of Recife heard the news he invoked Canon law and excommunicated the mother, the daughter, and the members of the medical team who performed the abortion; the stepfather, meanwhile, remained a loyal and accepted member of the church.

Was it right for the girl to have an abortion? Was the Archbishop correct to condemn her, the mother, and the medical team? And what of Brazil’s stance on the matter?

We’ve heard these debates fleshed out countless times, and almost always to no avail. Far more interesting (and quantifiable) are the psychological forces that fuel these conversations. While many like to believe that they have special access to the truth, the reality is that we all see the world not as it is, but as we want it to be: Republicans watch Fox while Democrats watch MSNBC; creationists see fossils as evidence of God while evolutionary biologists see fossils as evidence of evolution; a mother sees abortion as the best thing for her daughter while the church sees it as unholy and sinful. You get the point – our beliefs dictate what we see and how we see.

The question is: why do humans remain so steadfast to their beliefs, sometimes even in the face of overwhelming opposing evidence?

The answer rests in a few psychological tendencies that, when mixed together, form a potent recipe for ignorance. The first is confirmation bias, which I wrote about last month over at Scientificamerican.com. Confirmation bias is exactly what it sounds like – the propensity for people to look for what confirms their beliefs and ignore what contradicts them, with little concern for the truth.

The classic confirmation bias study comes from Stanford back in the late 1970s. Researchers brought in two groups of participants, one that supported capital punishment and one that opposed it. Both groups read two studies, “one seemingly confirming and one seemingly disconfirming their existing beliefs about the deterrent efficacy of the death penalty.” After reading the studies and other commentary, all of which were fake, researchers found that the proponents and opponents of capital punishment rated the studies that confirmed their point of view more highly than the studies that disconfirmed it. Sadly, as the authors conclude, “people of opposing views can each find support for those views in the same body of evidence.”

Then there’s cognitive dissonance, which describes a “state of tension that occurs whenever a person holds two cognitions that are psychologically inconsistent.” Leon Festinger introduced it in 1957 after he infiltrated and studied a UFO cult convinced the world would end at midnight on December 21st, 1954. In his book When Prophecy Fails, Festinger recounts how after midnight came and went, cult members began to look for reasons why the end of the world had not come. Eventually the leader of the cult, Marian Keech, explained to her members that she had received a message via automatic writing, which told her that the God of Earth had decided to spare the planet from destruction. Relieved, the cult members continued to spread their doomsday ideology to non-believers. Although Festinger’s example is extreme, all of us do this every day. Take unhealthy food; we all know that pizza is bad for us, but we still eat it. And after finishing a few slices we say “it was worth it,” or “I’ll run it off tomorrow.” Or take smokers; they know that smoking kills but continue to smoke. And after unsuccessfully quitting, they justify their failures by claiming that “smoking isn’t that bad” or that “it is worth the risk.” Whether it’s UFOs, food, or smoking, we all hold inconsistent beliefs and almost always side with what is most comfortable instead of what is true.

Finally, there’s motivated reasoning, which describes our tendency to accept what we want to believe with much more ease and much less analysis than what we don’t want to believe. In one study by Ziva Kunda, participants were brought into a room and told that they would be playing a game. Before the game started, they were instructed to watch someone else play it – someone who would later compete either with them or against them. However, Kunda rigged the study; the participants actually watched a confederate, who played the game perfectly, answering every question correctly. Kunda found that the participants who were lined up to play against the confederate were dismissive and tended to attribute his accuracy to luck, whereas the participants who were lined up to play with the confederate praised his “skills.” Both groups saw the same performance yet came to exactly opposite conclusions. Clearly, we scrutinize much less when things go our way.

So what’s the difference between confirmation bias, cognitive dissonance, and motivated reasoning? The short answer is that there really aren’t any fundamental differences. Generally speaking, they serve the same purpose: to frame the world so it makes sense to us. But there are a few nuances worth mentioning. For one, motivated reasoning is something of an evil twin to cognitive dissonance in that it works to avoid it. And for another, to quote NYU psychologist Gary Marcus, who says it perfectly, “whereas confirmation bias is an automatic tendency to notice data that fit with our beliefs, motivated reasoning is the complementary tendency to scrutinize ideas more carefully if we don’t like them than if we do.”

Back to Brazil.

People don’t change their minds – just the opposite, in fact. Brains are designed to filter the world so we don’t have to question it. While this helps us survive, it’s a subjective trap; by only seeing the world as we want to, our minds narrow and it becomes difficult to understand opposing opinions. This helps explain the conflict in Recife. When we only look for what confirms our beliefs (confirmation bias), only side with what is most comfortable (cognitive dissonance) and don’t scrutinize contrary ideas (motivated reasoning), we impede social, economic, and academic progress. I am not sympathetic to the Archbishop in the least, but when we consider how effortless it is for people to latch onto ideas, it is easier to understand why he took such a harsh and unchanging stance.


How to Explain the Disaster at Tenerife

On March 27, 1977 the deadliest disaster in aviation history took place on the Spanish island of Tenerife. In the midst of takeoff, going approximately 160 mph, KLM flight 4805 collided with Pan Am flight 1736 halfway down the runway, killing 583 people. The KLM captain was Jacob van Zanten, KLM’s chief flight instructor, who had just returned from a six-month safety course for commercial pilots. The subsequent investigation concluded that Van Zanten took off without clearance, thereby causing the crash. How could such a credentialed and experienced pilot make such a catastrophic mistake?

The events that preceded the accident were a recipe for disaster. A terrorist bomb had exploded at Gran Canaria International Airport, forcing several planes to divert to Tenerife, a small airport not used to handling large commercial jets. The control tower was understaffed, the controllers’ English was weak, and a heavy fog had set in that prevented Van Zanten and his crew from seeing more than 300 meters. All of these inputs contributed to Van Zanten’s fateful decision to take off without permission from the control tower.

The accident was also preventable. Van Zanten could have double-checked with the control tower or waited for the fog to lift. However, his emotions got the best of him, and his lack of patience cost him his life and the lives of others. An expert with years of experience made a rookie mistake and turned out to be flat-out wrong. Why?

It turns out that there are a lot of answers to this question (mistakes and errors, especially those having to do with aviation, are hot topics in the popular psychology literature), and I have seen the Tenerife disaster come up in three books: The Invisible Gorilla (p. 20), Being Wrong (p. 303), and Sway (p. 10-24). While The Invisible Gorilla and Being Wrong mention Tenerife anecdotally, Sway spends several pages explaining the anatomy of the disaster with three principles:

Loss aversion (our tendency to go to great lengths to avoid possible losses), value attribution (our inclination to imbue a person or thing with certain qualities based on initial perceived value), and the diagnosis bias (our blindness to all evidence that contradicts our initial assessment of a person or situation).

Sway’s explanations seem good enough, but it bothers me to see something like an airline disaster explained by a few psychological principles. In isolation, each of the three principles makes sense and has been empirically demonstrated a number of times. However, when it comes to something much more complex, like an airline disaster involving a huge number of inputs, I am skeptical of the explanatory power of a few psychological tendencies. In other words, aren’t there more forces at work than loss aversion, value attribution, and the diagnosis bias?

What about confirmation bias – the tendency to look for what confirms our beliefs and to ignore what contradicts them, with little regard for the truth? You could say that in the minutes before Van Zanten took off he only looked for indications of a safe takeoff and ignored indications of a dangerous one.

Then there is cognitive dissonance – the tendency to hold on to an erroneous belief in the face of overwhelming contradictory evidence (think doomsdayers). You could also say that as Van Zanten became more committed to taking off, it became increasingly difficult for him to change his mind.

Could there be more? Or are we missing something?

The point I am driving at here is similar to the one I made a few posts ago regarding Joshua Bell. That is, what does it mean for psychology to explain real-world phenomena? Put differently, what does it mean for something to be “explained” or “understood”? (And keep in mind that Van Zanten wouldn’t be able to help us nearly as much as you might think; self-reports are almost never accurate.) I don’t know, but it is important that the popular psychology literature doesn’t get too gung-ho with its psychological explanations.
