
Posts tagged ‘Scientific American’

The Irrationality of Irrationality: The Paradox of Popular Psychology

Here’s my latest on ScientificAmerican.com:

In 1996, Lyle Brenner, Derek Koehler and Amos Tversky conducted a study involving students from San Jose State University and Stanford University. The researchers were interested in how people jump to conclusions based on limited information. Previous work by Tversky, Daniel Kahneman and other psychologists found that people are “radically insensitive to both the quantity and quality of information that gives rise to impressions and intuitions,” so the researchers knew, of course, that we humans don’t do a particularly good job of weighing the pros and cons. But to what degree? Just how bad are we at assessing all the facts?

To find out, Brenner and his team presented the students with legal scenarios. In one, the plaintiff, a union organizer named Mr. Thompson, pays a routine union visit to a drug store. The store manager informs him that, according to the union’s contract with the drug store, union representatives cannot speak with employees on the floor. After a brief deliberation, the manager calls the police and Mr. Thompson is handcuffed for trespassing. The charges are later dropped, but Mr. Thompson sues the store for false arrest.

All participants received this background information. Then they heard from one of the two sides’ lawyers: the lawyer for the union organizer framed the arrest as an attempt at intimidation, while the lawyer for the store argued that the conversation that took place in the store was disruptive. Another group of participants – essentially a mock jury – heard both sides.

The key part of the experiment was that the participants were fully aware of the setup; they knew whether they were hearing only one side of the story or the whole thing. But this didn’t stop the subjects who heard one-sided evidence from being more confident and more biased in their judgments than those who heard both sides. That is, even when people had all the underlying facts, they jumped to conclusions after hearing only one side of the story.

The good news is that Brenner, Koehler and Tversky found that simply prompting participants to consider the other side’s story reduced their bias (instructions to consider the missing information were introduced as a manipulation in a later study), though it certainly did not eliminate it. Their study shows that people are not only willing to jump to conclusions after hearing only one side’s story; even when they have additional information at their disposal that suggests a different conclusion, they are still surprisingly likely to do so. The scientists conclude on a somewhat pessimistic note: “People do not compensate sufficiently for missing information even when it is painfully obvious that the information available to them is incomplete.”

In Brenner’s study, participants were dealing with a limited universe of information – the facts of the case and of the two sides’ arguments. But in reality – especially in the Internet era – people have access to a limitless amount of information that they could consider. As a result, we rely on rules of thumb, or heuristics, to take in information and make decisions. These mental shortcuts are necessary because they lessen the cognitive load and help us organize the world – we would be overwhelmed if we were truly rational.

This is one of the reasons we humans love narratives; they summarize the important information in a form that’s familiar and easy to digest. It’s much easier to understand events in the world as instances of good versus evil, or of any one of the seven basic story types. As Daniel Kahneman explains, “[we] build the best possible story from the information available… and if it is a good story, [we] believe it.” The implication is that it’s how good the story is, not necessarily its accuracy, that matters.

But narratives are also irrational because they sacrifice the whole story for the one side of it that conforms to one’s worldview. Relying on them often leads to inaccuracies and stereotypes. This is what the participants in Brenner’s study illustrate: people who take in narratives are often blinded to the whole story – rarely do we ask, “What more would I need to know before I can have a more informed and complete opinion?”

The last several years have seen many popular psychology books that touch on this line of research. There’s Ori and Rom Brafman’s Sway, Dan Ariely’s Predictably Irrational and, naturally, Daniel Kahneman’s Thinking, Fast and Slow. If you could sum up the popular literature on cognitive biases and our so-called irrationalities, it would go something like this: we require only a small amount of information, oftentimes a single factoid, to confidently form conclusions, generate new narratives and take on worldviews that seem objective but are almost entirely subjective and inaccurate.

The shortcomings of our rationality have been thoroughly exposed to the lay audience. But there’s a peculiar inconsistency about this trend. People seem to absorb these books uncritically, ironically falling prey to some of the very biases they should be on the lookout for: incomplete information and seductive stories. That is, when people learn how we irrationally jump to conclusions, they form new opinions about how the brain works from the little information they have just acquired. They jump to conclusions about how the brain jumps to conclusions, and they fit their newfound knowledge into a larger story that romantically and naively describes personal enlightenment.

Tyler Cowen made a similar point in a TED lecture a few months ago. He explained it this way:

There’s the Nudge book, the Sway book, the Blink book… [they are] all about the ways in which we screw up. And there are so many ways, but what I find interesting is that none of these books identify what, to me, is the single, central, most important way we screw up, and that is, we tell ourselves too many stories, or we are too easily seduced by stories. And why don’t these books tell us that? It’s because the books themselves are all about stories. The more of these books you read, you’re learning about some of your biases, but you’re making some of your other biases essentially worse. So the books themselves are part of your cognitive bias.

The crux of the problem, as Cowen points out, is that it’s nearly impossible to understand irrationalities without taking advantage of them. And, paradoxically, we rely on stories to understand why they can be harmful.

To be sure, there’s an important difference between the bias that comes from hearing one side of an argument and (most) narratives. A corrective like “consider the other side” is unlikely to work for narratives because it’s not always clear what the opposite would even be. So it’s useful to avoid jumping to conclusions not only by questioning narratives (after all, just about everything is plausibly a narrative, so avoiding them altogether would be overwhelming), but also by exposing yourself to multiple narratives and trying to integrate them as well as you can.

At the beginning of the recently released book The Righteous Mind, social psychologist Jonathan Haidt explains how some books (his included) make the case that one particular thing (in Haidt’s case, morality) is the key to understanding everything. Haidt’s point is that you shouldn’t read his book and jump to overarching conclusions about human nature. Instead, he encourages readers to integrate his point of view – that morality is the most important thing to consider – with other perspectives. I think this is a good strategy for overcoming a narrow-minded view of human cognition.

It’s natural for us to reduce the complexity of our rationality into convenient bite-sized ideas. As the trader turned epistemologist Nassim Taleb says: “We humans, facing limits of knowledge, and things we do not observe, the unseen and the unknown, resolve the tension by squeezing life and the world into crisp commoditized ideas.” But readers of popular psychology books on rationality must recognize that there’s a lot they don’t know, and they must beware of how seductive stories are. The popular literature on cognitive biases is enlightening, but let’s not be irrational about irrationality: exposure to X is not knowledge and control of X. Reading about cognitive biases, after all, does not free anybody from their nasty epistemological pitfalls.

Moving forward, my suggestion is to remember the lesson from Brenner, Koehler and Tversky: they reduced conclusion-jumping by getting people to consider the other information at their disposal. So let’s remember that the next book on rationality isn’t a tell-all – it’s merely another piece of the puzzle. The same approach can also help correct the problem of being too swayed by narratives – there are, after all, multiple sides to every story.

Ultimately, we need to remember what philosophers get right. Listen and read carefully; logically analyze arguments; try to avoid jumping to conclusions; don’t rely on stories too much. The Greek playwright Euripides was right: Question everything, learn something, answer nothing.

The Science of the New Musician: How N.Y.U. Professor Gary Marcus Became a Guitar Hero

My latest post at the ScientificAmerican.com blog network reviews NYU professor Gary Marcus’ book Guitar Zero, which is out today.

Gary Marcus is a professor of psychology at NYU, an MIT graduate and a juggler, unicyclist and photographer. A few years ago he set out to conquer one field that had eluded him his whole life: music. “I had no musical talent whatsoever,” he told me from his office, which sits a few blocks east of Manhattan’s Washington Square Park, “and was at one point gently told to stop taking recorder lessons when I was younger.” With a sabbatical coming up, and a growing interest in whether people can pick up an instrument in adult life, Marcus did what anyone else would do: he picked up a guitar. Not just any guitar, though – a Guitar Hero guitar.

As someone who spent most of high school and college playing this beloved game, I found this to be music to my ears.

His latest book, Guitar Zero, now available, is the culmination of his work as a student of guitar, music enthusiast and researcher of learning. It joins the ranks of some excellent psychology of music books including Oliver Sacks’ Musicophilia, Daniel Levitin’s This Is Your Brain on Music and John Ortiz’ The Tao of Music.

But Guitar Zero is different. Yes, Marcus delves into the academic side of things, but he also gets personal. He devotes several chapters to his struggles with congenital arrhythmia, learning music theory and playing instruments, and he shares wonderful stories from his adventures at Day Jams, “a summer camp where kids ages eight to fifteen learn to play and compose rock and roll,” with his band “Rush Hour.” What comes out is a lighthearted memoir filled with wonderful insights about music and the human mind. Compared with popular psychology books written from the expert’s point of view, Guitar Zero is a refreshing glimpse into the mind of the amateur.

Jonathan Haidt and the Moral Matrix: Breaking Out of Our Righteous Minds

My latest at the Scientific American guest blog:

Meet Jonathan Haidt, a professor of social psychology at the University of Virginia who studies morality and emotion. If social psychology were a sport, Haidt would be a Phil Mickelson or Roger Federer – likable, fun to watch and one of the best. But what makes Haidt one of a kind in academia is his sincere attempt to study and understand human morality from points of view other than his own.

Morality is difficult. As Haidt writes on his website, “It binds people together into teams that seek victory, not truth. It closes hearts and minds to opponents even as it makes cooperation and decency possible within groups.” And while many of us understand this only at a superficial level, Haidt takes it to heart. He strives to understand our inherent self-righteousness – and morality as a collection of diverse mental modules – with the ultimate aim of making society better off.

I had the pleasure of visiting him at his office, currently in Tisch Hall at NYU (Haidt is a visiting professor at the Stern School of Business), to speak about his background and how he came to write his forthcoming book, The Righteous Mind: Why Good People Are Divided by Politics and Religion.

A Brief Guide to Embodied Cognition: Why You Aren’t Your Brain

My latest over at the Scientific American guest blog. I was fortunate enough to interview professors George Lakoff (Berkeley) and Joshua Davis (Barnard) who were very helpful and took me through what it means for the mind to be “embodied.”

Embodied cognition, the idea that the mind is not only connected to the body but that the body influences the mind, is one of the more counterintuitive ideas in cognitive science. In sharp contrast is dualism, a theory of mind famously put forth by René Descartes in the 17th century, when he claimed that “there is a great difference between mind and body, inasmuch as body is by nature always divisible, and the mind is entirely indivisible… the mind or soul of man is entirely different from the body.” In the succeeding centuries, the notion of the disembodied mind flourished. From it, western thought developed two basic ideas: that reason is disembodied because the mind is disembodied, and that reason is transcendent and universal. However, as George Lakoff and Rafael Núñez explain:

“Cognitive science calls this entire philosophical worldview into serious question on empirical grounds… [the mind] arises from the nature of our brains, bodies, and bodily experiences. This is not just the innocuous and obvious claim that we need a body to reason; rather, it is the striking claim that the very structure of reason itself comes from the details of our embodiment… Thus, to understand reason we must understand the details of our visual system, our motor system, and the general mechanism of neural binding.”

What exactly does this mean? It means that our cognition isn’t confined to our cortices. That is, our cognition is influenced, perhaps even determined, by our experiences in the physical world. This is why we say that something is “over our heads” to express the idea that we do not understand: we are drawing on the physical inability to see something over our heads and the accompanying mental feeling of uncertainty. It is also why we associate warmth with affection: as infants and children, the subjective judgment of affection almost always corresponded with the sensation of warmth, giving rise to metaphors such as “I’m warming up to her.”

Cognitive Biases Abound in Sports: Guest Post @ Sciam

Coaches, managers, commentators and fans are cognitive disasters. When it comes to sports, we are idiot decision-makers. Read more in my latest post over at the Scientific American blog.

The gist:

When it comes to sports, data > intuition. Yet coaches, managers, fans and commentators alike continue to go with their guts, especially when it comes to a team or player close to their hearts. If they want to decide optimally, or to speak about their beloved team or player with slightly more intelligence, they should turn off their cognitive biases and look at the data. Trying to persuade someone to change a strategy, root for another team or consider why the sports team from your area is superior to the sports team from their area is not unlike trying to persuade a Republican that Obama is a good president or an atheist that God exists. It’s just not going to happen.

