
A Brief History of Popular Psychology: An Essay

It is unclear when the popular psychology movement started — perhaps with Malcolm Gladwell’s The Tipping Point or Steven Levitt and Stephen Dubner’s Freakonomics — or even how it should be defined, but it can be broadly described as the public’s growing interest in understanding people and events from a sociological, economic, psychological, or neurological point of view.

Over the last decade the New York Times bestseller list has seen a number of these books: Ariely’s Predictably Irrational (2008) and The Upside of Irrationality (2010), Gilbert’s Stumbling on Happiness (2006), Haidt’s The Happiness Hypothesis (2006), Lehrer’s How We Decide (2009), and Thaler & Sunstein’s Nudge (2008). What unites them is their attempt to “explore the hidden side of everything” by synthesizing numerous academic studies in a relatable way, drawing upon interesting real-world examples, and providing appealing suggestions for how readers can better understand the world, and their decisions and behaviors within it.

The popular psychology movement is the result of a massive paradigm shift, what many call the cognitive revolution, that took place in the second half of the 20th century. Although its starting point is unclear, George A. Miller’s 1956 “The Magical Number Seven, Plus or Minus Two” and Noam Chomsky’s 1959 review of B. F. Skinner’s Verbal Behavior were, among others, important publications that forced psychology to become increasingly cognitive. Whereas behaviorists – who represented the previous paradigm – considered only the external, those involved in the cognitive revolution sought to explain behavior by studying the internal; behavior was therefore thought of as dictated by the brain, not the environment.

The cognitive revolution naturally gave rise to the cognitive sciences – neuroscience, linguistics, artificial intelligence, and anthropology – all of which began to study how human brains process information. A big part of the revolution revolved around the work of psychologists Daniel Kahneman and Amos Tversky, who developed the heuristics and biases program in the early 1970s and changed the way human judgment was understood. The program had two goals. First, it demonstrated that the mind relies on a series of mental shortcuts, or heuristics, that “provide subjectively compelling and often quite serviceable solutions to… judgmental problems.” Second, it suggested that these heuristics give rise to biases that “[depart from] normative rational theory.”

Kahneman and Tversky’s work was vital because it questioned the notion that judgment is an extensive exercise based on algorithmic processes. Instead, it suggested that people’s decisions and behaviors are actually influenced by “simple and efficient… [and] highly sophisticated… computations that the mind had evolved to make.”

Their work was complemented by Richard Nisbett and Lee Ross’s 1980 book Human Inference: Strategies and Shortcomings of Social Judgment, which outlined how people’s “attempts to understand, predict, and control events in their social sphere are seriously compromised by specific inferential shortcomings.” From this, a list of cognitive biases began to accumulate: attentional bias, confirmation bias, the endowment effect, status quo bias, the gambler’s fallacy, the primacy effect, and more.

The heuristics and biases program was just one part of the cognitive revolution, however. Another equally important aspect came a bit later, when psychologists began to empirically study how unconscious processing influences behavior and conscious thought. These studies stemmed from the 1977 paper “Telling More Than We Can Know: Verbal Reports on Mental Processes,” by Richard Nisbett and Timothy Wilson. Nisbett and Wilson argued that “there may be little or no direct introspective access to higher order cognitive processes,” thereby introducing the idea that most cognition takes place automatically at the unconscious level.

Wilson continued his research in the 80s and 90s, eventually developing the concept of the “adaptive unconscious,” a term he uses to describe our ability to “size up our environments, disambiguate them, interpret them, and initiate behavior quickly and non-consciously.” He argued that the adaptive unconscious is an evolutionary adaptation for navigating the world with limited attention. This is why we are able to drive a car, type on a computer, or walk without having to think about it.

Complementing Wilson was Yale psychologist John Bargh, who significantly contributed to the study of how certain stimuli influence people’s implicit memory and behavior. In numerous experiments, Bargh demonstrated that people’s decisions and behaviors are greatly influenced by how they are “primed.” In one case, Bargh showed that people primed with rude words, such as “aggressively,” “bold,” and “intrude,” were on average about four minutes quicker to interrupt an experimenter than participants primed with polite words such as “polite,” “yield,” and “sensitively.”

Also in the 80s and 90s, neuroscientists began to understand the role of emotion in our decisions. In his 1994 book Descartes’ Error, Antonio Damasio explicates the “somatic marker hypothesis” to suggest that, contrary to traditional Western thought, a “reduction in emotion may constitute an equally important source of irrational behavior.” NYU professor Joseph LeDoux was also instrumental in studying emotion. Like Wilson, Nisbett, and Bargh, LeDoux argued that an understanding of conscious emotional states requires an understanding of “underlying emotional mechanisms.”

Along with emotion and the unconscious, intuition was another topic heavily researched in the past few decades. It was identified and studied both as a way of thinking and as a talent. As a way of thinking, intuition more or less corresponds to Wilson’s adaptive unconscious: it is an evolved ability that helps people effortlessly and unconsciously disambiguate the world – to easily distinguish males from females, their language from another, or danger from safety.

Intuition as a talent was found to be responsible for a number of remarkable human capabilities, most notably those of experts. As Malcolm Gladwell says in his 2005 bestseller Blink, intuitive judgments “don’t logically and systemically compare all available options.” Instead, they act on gut feelings and first impressions that cannot be explained rationally. And most of the time, he continues, acting on these initial feelings is just as valuable as acting on more “thought out” feelings.

By the 1990s, when the “revolution in the theory of rationality… [was] in full development,” the line between rational and irrational behavior became blurred, as more and more studies made it difficult to determine what constituted rational behavior. On one hand, some (mainly economists) maintained rationality as the norm even though they knew that people deviated from it. On the other hand, individuals like Herbert Simon and Gerd Gigerenzer argued that the standards for rational behavior should be grounded in ecological and evolutionary considerations. In either case, though, rational choice theory remained the benchmark. Because of this, the 1990s saw books such as Stuart Sutherland’s Irrationality (1994), Massimo Piattelli-Palmarini’s Inevitable Illusions: How Mistakes of Reason Rule Our Minds (1996), and Thomas Gilovich’s How We Know What Isn’t So: The Fallibility of Human Reason in Everyday Life (1991). As the titles imply, each perpetuated the idea that behavior or decision-making was to be judged by a certain standard or norm – in this case, rational choice theory.

However, when all of the facets of the cognitive revolution – cognitive biases and heuristics, the unconscious, emotion, and intuition – are considered, the idea that we act rationally begins to look extremely weak; this observation has heavily influenced the popular psychology movement. Pick up any popular psychology book and you will find Kahneman, Tversky, Nisbett, Wilson, Bargh, Damasio, LeDoux, and others heavily cited in arguments that run contrary to rational actor theory.

What’s interesting, and my last post touched on this, is that each popular psychology author has something different to say: Dan Ariely pushes behavioral economics to argue that we are all predictably irrational; Damasio argues that reason requires emotion; Gladwell, David Myers, and Wilson suggest that most thought is unconscious and that our intuitive abilities are just as valuable as our rational ones; Daniel Gilbert and Jonathan Haidt illustrate how our cognitive limitations affect our well-being; Barry Schwartz shows how too much choice can actually hurt us; and Jonah Lehrer draws upon neuroscience to show the relationship between emotion and reason in our decision-making.

As a result of all these assertions, the human condition has become seriously complicated!

If there is something to conclude from what I have outlined, it is this: implicit in any evaluation of behavior is the assumption that human beings have a nature or norm, and that their behavior deviates from it. However, the popular psychology movement shows that our brains are not big enough to understand human behavior, and our tendency to summarize it so simplistically is a reflection of this. We aren’t rational, irrational, or intuitive; we are, in the words of Ke$ha, who we are.

The Evolution of Water

In a recent TED talk, MIT cognitive scientist Deb Roy gave a presentation entitled “The Birth of a Word,” based on a remarkable study he conducted over the last three years. Roy’s study was motivated by his interest in language acquisition. Specifically, he wanted to know how infants learn individual words over the course of their development. To do so, Roy wired his house with audio/video equipment and recorded 90,000 hours of video and 140,000 hours of audio (200 terabytes) to track how his son acquired English. Here is a clip that compiles all the instances of his son trying to say the word “water” over a six-month stretch (listen for its gradual yet sudden transition from “gaga” to “water”).

Roy’s work is part of a growing body of research that tries to understand language acquisition, a hotly debated topic still in its infancy. In my last post I briefly touched on it by explaining why language is so metaphoric and interconnected. However, I realize that if we want to understand how language is acquired, it is fruitless to study syntax alone. It is like trying to understand planetary motion without mathematical equations: it’s easy to say that planets travel around the sun, but it is entirely different to explain why they do so.

Unfortunately, there isn’t a Kepler for language acquisition, so I can’t offer anything novel here. However, it is worth contextualizing two contemporary theories from which the language acquisition debate has spawned. The first comes to us from Steven Pinker, who, in his 1994 book The Language Instinct, suggests that the ability to acquire language is genetic. He doesn’t explicitly state that there is a gene for language – “for any grammar gene that exists in every human being, there is currently no way to verify its existence” (322) – but he does say that it is “a distinct piece of the biological makeup of our brains” (18). So he is advocating that language is genetic in the same way that puberty or yawning is; there certainly isn’t “a gene” for either, but they are part of our genetic code nonetheless.

Part of Pinker’s project aims to figure out whether all human languages are unified by some sort of “universal grammar” (as Chomsky calls it), which holds that all “children must innately be equipped with a plan common to the grammars of all languages… that tells them how to distill the syntactic patterns out of the speech of their parents” (Pinker, 22). The universal grammar debate is key for the language instinct hypothesis because if language really is instinctual, then you would expect it to manifest similarly regardless of culture. Chomsky and Pinker have gone to great lengths to prove just that, but I will not get into the details for the sake of space (read Pinker’s book if you’re really interested).

In contrast, Stephen Jay Gould believed that thinking was an exaptation (a trait evolved for one purpose that later serves another) that led to language. In his words, “natural selection made the human brain big, but most of our mental properties and potentials may be spandrels – that is, nonadaptive side consequences of building a device with such structural complexity” (The Pleasures of Pluralism, p. 11). Evolutionarily speaking, Gould’s theory fits a bit better than Pinker’s, which faces the Wallace problem: how could the Neolithic Revolution have happened if the brain had achieved its modern size roughly a million years earlier? Simply put, language is too recent a phenomenon to be explained by natural selection, which is a much more gradual process. Gould accounts for this by saying that language is a byproduct of cognitive abilities that already existed.

Who is correct? Like so many things, the Pinker–Gould debate falls victim to dichotomy fever, i.e., our tendency to categorize things as being this or that. But that is not to say it is not helpful. As V. S. Ramachandran explains, “neither of them is [correct], although there is a grain of truth to each… the competence to acquire [language] rules is innate, but exposure is needed to pick up the actual rules” (171). Like others, Ramachandran believes that brains are language-ready but do not have language built in. This capacity has been referred to as the still unidentified “language acquisition device,” and Ramachandran uses it to resolve Pinker and Gould’s contrasting views.

The next challenge is to understand exactly how genetics and experience work together in language acquisition. This is part of Deb Roy’s project, which I think will turn out to be vitally important in the years to come. It is also a goal of many neuroscience labs today. And because they are all showing the brain to be more and more dynamic, it appears that an understanding of language acquisition will require an understanding of the brain from a bottom-up perspective. Unfortunately, that is a long way off.
