
Posts tagged ‘Narratives’

The Irrationality of Irrationality: The Paradox of Popular Psychology

Here’s my latest on ScientificAmerican.com 

In 1996, Lyle Brenner, Derek Koehler and Amos Tversky conducted a study involving students from San Jose State University and Stanford University. The researchers were interested in how people jump to conclusions based on limited information. Previous work by Tversky, Daniel Kahneman and other psychologists found that people are “radically insensitive to both the quantity and quality of information that gives rise to impressions and intuitions,” so the researchers knew, of course, that we humans don’t do a particularly good job of weighing the pros and cons. But to what degree? Just how bad are we at assessing all the facts?

To find out, Brenner and his team exposed the students to legal scenarios. In one, a union organizer named Mr. Thompson visits a drug store on a routine union visit. The store manager informs him that, according to the union’s contract with the drug store, organizers may not speak with employees on the floor. After a brief deliberation, the manager calls the police and Mr. Thompson is handcuffed for trespassing. The charges were later dropped, but Mr. Thompson is suing the store for false arrest.

All participants got this background information. Then, they heard from one of the two sides’ lawyers; the lawyer for the union organizer framed the arrest as an attempt to intimidate, while the lawyer for the store argued that the conversation that took place in the store was disruptive. Another group of participants – essentially a mock jury – heard both sides.

The key part of the experiment was that the participants were fully aware of the setup; they knew whether they were hearing only one side of the story or the entire story. But this didn’t stop the subjects who heard one-sided evidence from being more confident and more biased in their judgments than those who heard both sides. That is, even when people had all the underlying facts, they jumped to conclusions after hearing only one side of the story.

The good news is that Brenner, Koehler and Tversky found that simply prompting participants to consider the other side’s story reduced their bias – instructions to consider the missing information were a manipulation in a later study – but it certainly did not eliminate it. Their study shows that people are not only willing to jump to conclusions after hearing only one side’s story, but that they are surprisingly likely to do so even when the additional information at their disposal would suggest a different conclusion. The scientists conclude on a somewhat pessimistic note: “People do not compensate sufficiently for missing information even when it is painfully obvious that the information available to them is incomplete.”

In Brenner’s study, participants were dealing with a limited universe of information – the facts of the case and of the two sides’ arguments. But in reality – especially in the Internet era – people have access to a limitless amount of information that they could consider. As a result, we rely on rules of thumb, or heuristics, to take in information and make decisions. These mental shortcuts are necessary because they lessen the cognitive load and help us organize the world – we would be overwhelmed if we were truly rational.

This is one of the reasons we humans love narratives; they summarize the important information in a form that’s familiar and easy to digest. It’s much easier to understand events in the world as instances of good versus evil, or as any one of the seven basic story types. As Daniel Kahneman explains, “[we] build the best possible story from the information available… and if it is a good story, [we] believe it.” The implication here is that it’s how good the story is, not necessarily its accuracy, that matters.

But narratives are also irrational because they sacrifice the whole story for one side of a story that conforms to one’s worldview. Relying on them often leads to inaccuracies and stereotypes. This is what the participants in Brenner’s study illustrate: people who take in narratives are often blinded to the whole story – rarely do we ask, “What more would I need to know before I can have a more informed and complete opinion?”

The last several years have seen many popular psychology books that touch on this line of research. There’s Ori and Rom Brafman’s Sway, Dan Ariely’s Predictably Irrational and, naturally, Daniel Kahneman’s Thinking, Fast and Slow. If you could sum up the popular literature on cognitive biases and our so-called irrationalities, it would go something like this: we require only a small amount of information, oftentimes a single factoid, to confidently form conclusions and generate new narratives – taking on worldviews that seem objective but are almost entirely subjective and inaccurate.

The shortcomings of our rationality have been thoroughly exposed to the lay audience. But there’s a peculiar inconsistency about this trend. People seem to absorb these books uncritically, ironically falling prey to some of the very biases they should be on the lookout for: incomplete information and seductive stories. That is, when people learn about how we irrationally jump to conclusions they form new opinions about how the brain works from the little information they recently acquired. They jump to conclusions about how the brain jumps to conclusions and fit their newfound knowledge into a larger story that romantically and naively describes personal enlightenment.

Tyler Cowen made a similar point in a TED lecture a few months ago. He explained it this way:

There’s the Nudge book, the Sway book, the Blink book… [they are] all about the ways in which we screw up. And there are so many ways, but what I find interesting is that none of these books identify what, to me, is the single, central, most important way we screw up, and that is, we tell ourselves too many stories, or we are too easily seduced by stories. And why don’t these books tell us that? It’s because the books themselves are all about stories. The more of these books you read, you’re learning about some of your biases, but you’re making some of your other biases essentially worse. So the books themselves are part of your cognitive bias.

The crux of the problem, as Cowen points out, is that it’s nearly impossible to understand irrationalities without taking advantage of them. And, paradoxically, we rely on stories to understand why they can be harmful.

To be sure, there’s an important difference between the bias that comes from hearing one side of an argument and (most) narratives. A corrective like “consider the other side” is unlikely to work for narratives because it’s not always clear what the opposite would even be. So it’s useful to avoid jumping to conclusions not only by questioning narratives (after all, just about everything is plausibly a narrative, so avoiding them can be pretty overwhelming), but by exposing yourself to multiple narratives and trying to integrate them as well as you can.

At the beginning of the recently released book The Righteous Mind, social psychologist Jonathan Haidt explains how some books (his included) make the case that one certain thing (in Haidt’s case, morality) is the key to understanding everything. Haidt’s point is that you shouldn’t read his book and jump to overarching conclusions about human nature. Instead, he encourages readers to integrate his point of view (e.g., that morality is the most important thing to consider) with other perspectives. I think this is a good strategy for overcoming a narrow-minded view of human cognition.

It’s natural for us to reduce the complexity of our rationality into convenient bite-sized ideas. As the trader turned epistemologist Nassim Taleb says: “We humans, facing limits of knowledge, and things we do not observe, the unseen and the unknown, resolve the tension by squeezing life and the world into crisp commoditized ideas.” But readers of popular psychology books on rationality must recognize that there’s a lot they don’t know, and they must beware of how seductive stories are. The popular literature on cognitive biases is enlightening, but let’s not be irrational about irrationality; exposure to X is not knowledge and control of X. Reading about cognitive biases, after all, does not free anybody from their nasty epistemological pitfalls.

Moving forward, my suggestion is to remember the lesson from Brenner, Koehler and Tversky: they reduced conclusion jumping by getting people to consider the other information at their disposal. So let’s remember that the next book on rationality isn’t a tell-all – it’s merely another piece of the puzzle. The same approach could also help correct the problem of being too swayed by narratives – there are, after all, always multiple sides to a story.

Ultimately, we need to remember what philosophers get right. Listen and read carefully; logically analyze arguments; try to avoid jumping to conclusions; don’t rely on stories too much. The Greek playwright Euripides was right: Question everything, learn something, answer nothing.

The Illusion of Understanding Success

In December of 1993, J.K. Rowling was living in poverty, depressed, and at times, contemplating suicide. She resided in a small apartment in Edinburgh, Scotland, with her only daughter. A recent divorce had made her a single mom. Reflecting on the situation many years later, Rowling described herself as “the biggest failure I knew.”

By 1995 she had finished the first manuscript of Harry Potter and the Philosopher’s Stone, a story about a young wizard that she had begun writing years before. The Christopher Little Literary Agency, a small firm of literary agents based in Fulham, agreed to represent Rowling. The manuscript found its way to the chairman of Bloomsbury, who handed it to his eight-year-old daughter, Alice Newton. She read it and immediately demanded more; like so many children and adults after her, she was hooked. Scholastic Inc. bought the rights to Harry Potter in the United States in the spring of 1997 for $105,000. The rest is history.

Rowling’s story, which includes financial and emotional shortcomings followed by success and popularity, is the rags-to-riches narrative in a nutshell. It’s the story of an ordinary person, dismissed by the world, who emerges out of adversity onto the center stage. It’s the sword in the stone; it’s the ugly duckling; it’s a story that gets played out time and time again throughout history. Kafka captures it nicely in The Castle: “Though for the moment K. was wretched and looked down on, yet in an almost unimaginable and distant future he would excel everybody.”

The reality of Rowling’s story, however, is just that: it’s a story. It’s a sequence of facts strung together by an artificial narrative. It didn’t necessarily have to have a happy ending and it certainly was not predictable back in 1993. Rowling did not follow a predetermined path. Her life before Harry Potter was complex and convoluted, and, most importantly, luck played a significant role in her eventual success. These variables are always forgotten in hindsight.

Yet we humans, facing limits of knowledge, to paraphrase one author, resolve the myriad unknown events that defined Rowling’s life before Harry Potter by squeezing them into crisp commoditized ideas and packaging them to fit a heartwarming narrative. We have, in other words, a limited ability to look at sequences of facts without weaving an explanation into them.

The same problem occurs in science. It’s always the story of invention, the tale of discovery or the history of innovation. These narratives manifest themselves in the form of a quest: A scientist is stuck on a problem, he or she is surrounded by doubt, but after years of hard work an insight prevails that changes the world forever.

In The Seven Basic Plots, Christopher Booker summarizes The Quest, which sounds as much like Darwin on the Beagle, Magellan aboard the Trinidad or Marco Polo traveling across Asia as it does Frodo traversing Middle-earth. As Booker explains:

Far away, we learn, there is some priceless goal, worth any effort to achieve… From the moment the hero learns of this prize, the need to set out on the long hazardous journey to reach it becomes the most important thing to him in the world. Whatever perils and diversions lie in wait on the way, the story is shaped by that one overriding imperative; and the story remains unresolved until the objective has been finally, triumphantly secured.

Unfortunately, Frodo’s triumph at Mount Doom is more real than natural selection to some. Kahneman is right: “It is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle.”

Our propensity for storytelling is also fueled by the survivorship bias, which describes our tendency to believe that successful people possess some special property. For Steve Jobs it was his assertive leadership and vision, for Bob Dylan it was his poetry and willingness to challenge the norm, and for Rowling it was her creativity and imagination. But these attributes are post-hoc explanations; there are plenty of people of Dylan’s musical and lyrical caliber who will never match his success. Likewise, many creative geniuses of Rowling’s stature will never sell tens of millions of books. Luck, at the end of the day, might be the best explanation.

When trying to answer the question of what makes people successful, the best response might be that it’s impossible to know. Indeed, hard work, intelligence and good genes certainly play a role. But the reality of Rowling’s story is that it was highly unlikely. Twelve out of twelve publishing houses rejected the book. In the years leading up to Harry Potter, any number of things could have prevented Scholastic from purchasing the rights to her book. If it weren’t for little Alice Newton, the book may never have seen the light of day.

The true test of an explanation, as Kahneman also says, is whether it would have made the event predictable in advance. No story of Rowling’s unlikely success will meet that test, because no story can include all the events that would have caused a different outcome. That being said, we will continue to explain Rowling’s story as if it were inevitable and predictable. We will always be obsessed with happy endings.

The takeaway is twofold: first, be suspicious of narratives, especially if they are charming; second, be humble about what you think it takes to be successful. There is good reason to believe that what you think is an illusion perpetuated by a narrative in which everybody lives happily ever after.

The Irrationality Of Irrationality

Reason has fallen on hard times. After decades of research psychologists have spoken: we humans are led by our emotions, we rarely (if ever) decide optimally and we would be better off if we just went with our guts. Our moral deliberations and intuitions are mere post-hoc rationalizations; classical economic models are a joke; Hume was right, we are the slaves of our passions. We should give up and just let the emotional horse do all the work.

Maybe. But sometimes it seems like the other way around. For every book that explores the power of the unconscious, another book explains how predictably irrational we are when we think without thinking; our intuitions deceive us and we are fooled by randomness, but sometimes it is better to trust our instincts. Indeed, if a Martian briefly compared the subtitles of the most popular psychology books of the last decade, he would quickly become confused. Reading the introductions wouldn’t help him either; keeping track of the number of straw men would be difficult for our celestial friend. So, he might ask: over the course of history, have humans thought that intelligence is deliberate or automatic?

When it comes to thinking things through or going with your gut there is a straightforward answer: It depends on the situation and the person. I would also add a few caveats. Expert intuition cannot be trusted in the absence of stable regularities in the environment, as Kahneman argues in his latest book, and it seems like everyone is equally irrational when it comes to economic decisions. Metacognition, in addition, is a good idea but seems impossible to consistently execute.

However, unlike our Martian friend, who tries hard to understand what our books say about our brains, we Earthlings find the reason-intuition debate largely irrelevant. Yes, many have a sincere interest in understanding the brain better. But while the lay reader might improve his decision-making a tad and be able to explain the difference between the prefrontal cortex and the amygdala, the real reason millions have read these books is that they are very good.

The Gladwells, Haidts and Kahnemans of the world know how to captivate and entertain the reader because, like any great author, they prey on our propensity to be seduced by narratives. By using agents or systems to explain certain cognitive capacities, the brain becomes much easier to understand. However, positioning the latest psychology or neuroscience findings in terms of a story with characters tends to encourage a naïve understanding of the so-called most complex entity in the known universe. The authors know this, of course. Kahneman repeatedly makes it clear that “system 1” and “system 2” are literary devices, not real parts of the brain. But I can’t help but wonder, as Tyler Cowen did, whether deploying these devices makes the books themselves part of our cognitive biases.

The brain is also easily persuaded by small amounts of information. If one could sum up judgment and decision-making research, it would go something like this: we require only a tiny piece of information to confidently form a conclusion and take on a new worldview. Kahneman’s acronym WYSIATI – what you see is all there is – captures this well. This is precisely what happens the moment readers finish the latest book on intuition or irrationality; they remember just the sound bite and understand brains only through it. Whereas the hypothetical Martian remains confused, the rest of us humans happily walk out of our local Barnes and Noble, or even worse, finish watching the latest TED talk, with the deluded feeling that now we’ve “got it.”

Many times, to be sure, this process is a great thing. Reading these books and watching highbrow lectures is hugely beneficial, intellectually speaking. But let’s not forget that exposure to X is not knowledge of X. The brain is messy; let’s embrace that view, not a subtitle.
