

A Brief History of Popular Psychology: An Essay

It is unclear when the popular psychology movement started (perhaps with Malcolm Gladwell’s The Tipping Point or Steven Levitt and Stephen Dubner’s Freakonomics) or how exactly it should be defined, but it can be broadly described as the public’s growing interest in understanding people and events from a sociological, economic, psychological, or neurological point of view.

Over the last decade the New York Times bestseller list has seen a number of these books: Ariely’s Predictably Irrational (2008) and The Upside of Irrationality (2010), Gilbert’s Stumbling on Happiness (2006), Haidt’s The Happiness Hypothesis (2006), Lehrer’s How We Decide (2009), and Thaler and Sunstein’s Nudge (2008). What unites them is their attempt to “explore the hidden side of everything” by synthesizing numerous academic studies in a relatable way, drawing upon interesting real-world examples, and providing appealing suggestions for how one can better understand the world, and one’s decisions and behaviors within it.

The popular psychology movement is the result of a massive paradigm shift, what many call the cognitive revolution, that took place in the second half of the 20th century. Although its starting point is unclear, George A. Miller’s 1956 paper “The Magical Number Seven, Plus or Minus Two” and Noam Chomsky’s 1959 review of B. F. Skinner’s Verbal Behavior were, among others, important publications that pushed psychology in an increasingly cognitive direction. Whereas behaviorists – who represented the previous paradigm – considered only the external, those involved in the cognitive revolution sought to explain behavior by studying the internal; the cause of behavior was therefore understood as being dictated by the brain rather than the environment.

The cognitive revolution naturally gave rise to the cognitive sciences – neuroscience, linguistics, artificial intelligence, and anthropology – all of which began to study how human brains process information. A central part of the revolution was the work of psychologists Daniel Kahneman and Amos Tversky, who in the early 1970s developed the heuristics and biases program, which changed the way human judgment was understood. The program had two goals. First, it demonstrated that the mind relies on a series of mental shortcuts, or heuristics, that “provide subjectively compelling and often quite serviceable solutions to… judgmental problems.” And second, it suggested that underlying these heuristics were biases that “[departed from] normative rational theory.”

Kahneman and Tversky’s work was vital because it questioned the notion that judgment was an extensive exercise based on algorithmic processes. Instead, it suggested that people’s decisions and behaviors are actually influenced by “simple and efficient… [and] highly sophisticated… computations that the mind had evolved to make.”

Their work was complemented by Richard Nisbett and Lee Ross’s 1980 book Human Inference: Strategies and Shortcomings of Social Judgment, which outlined how people’s “attempts to understand, predict, and control events in their social sphere are seriously compromised by specific inferential shortcomings.” From this, a list of cognitive biases began to accumulate, including attentional bias, confirmation bias, the endowment effect, status quo bias, the gambler’s fallacy, the primacy effect, and more.

The heuristics and biases program was just one part of the cognitive revolution, however. Other equally important aspects came a bit later, when psychologists began to empirically study how unconscious processing influences behavior and conscious thought. These studies stemmed from the 1977 paper “Telling More Than We Can Know: Verbal Reports on Mental Processes” by Richard Nisbett and Timothy Wilson. Nisbett and Wilson argued that “there may be little or no direct introspective access to higher order cognitive processes,” thereby introducing the idea that most cognition takes place automatically at the unconscious level.

Wilson continued his research in the 80s and 90s, eventually developing the concept of the “adaptive unconscious,” a term he uses to describe our ability to “size up our environments, disambiguate them, interpret them, and initiate behavior quickly and non-consciously.” He argued that the adaptive unconscious is an evolutionary adaptation for navigating the world with limited attention. This is why we are able to drive a car, type on a computer, or walk without having to think about it.

Complementing Wilson was Yale psychologist John Bargh, who contributed significantly to the study of how certain stimuli influence people’s implicit memory and behavior. In numerous experiments, Bargh demonstrated that people’s decisions and behaviors are greatly influenced by how they are “primed.” In one case, Bargh showed that people primed with rude words such as “aggressively,” “bold,” and “intrude” were on average about four minutes quicker to interrupt an experimenter than participants primed with polite words such as “polite,” “yield,” and “sensitively.”

Also in the 80s and 90s, neuroscientists began to understand the role of emotion in our decisions. In the 1995 book Descartes’ Error, Antonio Damasio explicates the “somatic marker hypothesis” to suggest that, contrary to traditional Western thought, a “reduction in emotion may constitute an equally important source of irrational behavior.” NYU professor Joseph LeDoux was also instrumental in studying emotion. Like Wilson, Nisbett, and Bargh, LeDoux argued that an understanding of conscious emotional states requires an understanding of “underlying emotional mechanisms.”

Along with emotion and the unconscious, intuition was another topic heavily researched over the past few decades. It was identified and studied both as a way of thinking and as a talent. As a way of thinking, intuition more or less corresponds to Wilson’s adaptive unconscious; it is an evolved ability that helps people effortlessly and unconsciously disambiguate the world, e.g., the ability to easily distinguish males from females, one’s own language from another, or danger from safety.

Intuition as a talent was found to be responsible for a number of remarkable human capabilities, most notably those of experts. As Malcolm Gladwell says in his 2005 bestseller Blink, intuitive judgments “don’t logically and systemically compare all available options.” Instead, they act off of gut feelings and first impressions that cannot be explained rationally. And most of the time, he continues, acting on these initial feelings is just as valuable as acting on more “thought out” feelings.

By the 1990s, when the “revolution in the theory of rationality… [was] in full development,” the line between rational and irrational behavior became blurred as more and more studies made it difficult to determine what constituted rational behavior. On one hand, some (mainly economists) maintained rationality as the norm even though they knew that people deviated from it. On the other hand, individuals like Herbert Simon and Gerd Gigerenzer argued that the standards for rational behavior should be grounded in ecological and evolutionary considerations. In either case, though, rational choice theory remained the standard being debated. Because of this, the 1990s saw books such as Stuart Sutherland’s Irrationality (1994), Massimo Piattelli-Palmarini’s Inevitable Illusions: How Mistakes of Reason Rule Our Minds (1996), and Thomas Gilovich’s How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life (1991). Each perpetuated the idea that behavior and decision-making are to be judged against a certain standard or norm (in this case, rational choice theory), as the titles imply.

However, when all of the facets of the cognitive revolution – cognitive biases and heuristics, the unconscious, emotion, and intuition – are considered, the idea that we act rationally begins to look extremely weak; this observation has heavily influenced the popular psychology movement. Pick up any popular psychology book and you will find Kahneman, Tversky, Nisbett, Wilson, Bargh, Damasio, LeDoux, and others heavily cited in arguments that run contrary to rational actor theory.

What’s interesting, and my last post touched on this, is that each popular psychology author has something different to say: Dan Ariely pushes behavioral economics to argue that we are all predictably irrational; Damasio argues that reason requires emotion; Gladwell, David Myers, and Wilson suggest that most thought is unconscious and that our intuitive abilities are just as valuable as our rational ones; Daniel Gilbert and Jonathan Haidt illustrate how our cognitive limitations affect our well-being; Barry Schwartz shows how too much choice can actually hurt us; and Jonah Lehrer draws upon neuroscience to show the relationship between emotion and reason in our decision-making.

As a result of all these assertions, the human condition has become seriously complicated!

If there is something to conclude from what I have outlined, it is this. Implicit in any evaluation of behavior is the assumption that human beings have a nature or norm, and that their behavior deviates from this nature or norm. However, the popular psychology movement shows that our brains are not big enough to understand human behavior, and our tendency to summarize it so simplistically is a reflection of this. We aren’t rational, irrational, or intuitive; we are, in the words of Ke$ha, who we are.

Why We Reason

Last year Hugo Mercier and Dan Sperber published a paper in Behavioral and Brain Sciences that was recently featured in the New York Times and Newsweek. It has since spurred a lot of discussion in the cognitive science blogosphere among psychologists and science writers alike. What’s all the hype about?

For thousands of years human rationality was seen as a means to truth, knowledge, and better decision-making. However, Mercier and Sperber are saying something different: reason is meant to persuade others and win arguments.

Our hypothesis is that the function of reasoning is argumentative. It is to devise and evaluate arguments intended to persuade…. reasoning does exactly what can be expected of an argumentative device: Look for arguments that support a given conclusion, and, ceteris paribus, favor conclusions for which arguments can be found (2010).

Though Mercier and Sperber’s theory is novel, it is not entirely original. In the western tradition, similar theories of human rationality date back to at least ancient Greece with the Sophists.

Akin to modern-day lawyers, the Sophists believed that reason was a tool for convincing others of certain opinions regardless of whether they were true. They were paid to teach young Athenians rhetoric so they could have, as Plato says in Gorgias, “the ability to use the spoken word to persuade the jurors in the courts, the members of the Council, the citizens attending the Assembly – in short, to win over any and every form of public meeting.”

So why is Mercier and Sperber’s paper seen as groundbreaking if its central idea is thousands of years old? Unlike the ancient Greeks, Mercier and Sperber have a heap of psychological data to support their claims. At the heart of this data is what psychologists call confirmation bias. As the name indicates, confirmation bias is the tendency for people to favor information that conforms to their ideologies regardless of whether it is true. It explains why Democrats would prefer to listen to Bill Clinton over Ronald Reagan, why proponents of gun control are not NRA members, and why individuals who are pro-choice only listen to or read sources that are also pro-choice. In addition, confirmation bias greatly distorts our self-perceptions; namely, it describes why “95% of professors report that they are above average teachers, 96% of college students say that they have above average social skills… [and why] 19% of Americans say that they are in the top 10% of earners.”

If we are to think of rationality as having to do with knowledge or truth, as Socrates, Plato, and Descartes did, confirmation bias is a huge problem. If rationality really were about discovering objective truths, then confirmation bias seems like something natural selection should have weeded out; imagine how smart we would be if we actually listened to opposing opinions and considered how they might be better than ours. Put differently, if the goal of reasoning were really to improve our decisions and beliefs, and to find the truth, then there would be no reason for confirmation bias to exist.

Under the Sophist paradigm, however, confirmation bias makes much more sense, as do similar cognitive hiccups such as hindsight bias, anchoring, representativeness, the narrative fallacy, and many more. These biases, which began to appear in the psychological literature of the 1960s, provide “evidence, [which] shows that reasoning often leads to epistemic distortions and poor decisions.” And it is from this point that Mercier and Sperber build their ideas. Instead of thinking of faulty cognition as irrational, as many have, we can now see that these biases are tools that are actually helpful. In a word, with as many opinions as there are people, our argument-oriented reasoning does a fine job of making us seem credible and legitimate. In this light, thank God for confirmation bias!

Of course, there is a downside to confirmation bias and to our rationality being oriented toward winning arguments. It causes us to get so married to some ideas – WMDs in Iraq and doomsday events, for example – that we end up hurting ourselves in the long run. But at the end of the day, I think it is a good thing that our reasoning is so self-confirming. Without confirmation bias, we wouldn’t have a sense of right or wrong, which seems to be a necessary component of good things like laws against murder.

Finally, if you were looking to Mercier and Sperber’s thesis to improve your reasoning, you would be missing the point. Inherent in their argument is the idea that our rationality will forever be self-confirming and unconcerned with truth and knowledge. And for better or for worse, this is something we all have to deal with.
