The Evil of Irrelevant Information: Anchoring & The Conjunction Fallacy
I want to think that people are rational consumers, but it’s hard to ignore the overwhelming evidence that says they’re not. You don’t even have to read the academic literature to realize this – just go to the grocery store! As you walk down the aisle and see a delicious bag of chips with “50 percent less calories,” ask yourself this: would you have bought it if it said “with 50 percent as many calories”? Or how about the medication over in the pharmaceutical section that works “99 percent of the time” – would you buy it if it were “ineffective 1 percent of the time”? In both cases the answer is probably not.
We succumb to these silly things because our brains are easily fooled by numerical manipulations. As psychologist Barry Schwartz explains, “when we see outdoor gas grills on the market for $8,000, it seems quite reasonable to buy one for $1,200. When a wristwatch that is no more accurate than the one you can buy for $50 sells for $20,000, it seems reasonable to buy one for $2,000.” Whether you like it or not, your decisions are easily swayed.
Let’s look at some more examples.
Imagine you’re at an auction bidding on a bottle of Côtes du Rhône, a bottle of Hermitage Jaboulet La Chapelle, a cordless keyboard and mouse, a design book and a one-pound box of Belgian chocolates. Before the auction starts, the auctioneer asks you to jot down the last two digits of your social security number, indicate whether you would be willing to pay that amount for any of the products, and then write down the maximum amount you would be willing to bid for each product. When Dan Ariely, Drazen Prelec and George Loewenstein conducted this auction with a group of MIT undergrads, they found that the social security number greatly influenced the students’ bids. In Ariely’s words:
The top 20 percent (in terms of the value of their social security numbers), for instance, bid an average of $56 for the cordless keyboard; the bottom 20 percent bid an average of $16. In the end, we could see that students with social security numbers ending in the upper 20 percent placed bids that were 216 to 346 percent higher than those of the students with social security numbers ending in the lowest 20 percent.
Ariely’s experiment illustrates a cognitive bias known as anchoring, which reflects our inability to ignore irrelevant information and assess things at face value. The classic anchoring experiment comes from Daniel Kahneman and Amos Tversky. Two groups were asked whether the percentage of African countries in the United Nations was higher or lower than a given value: 10 percent for one group and 65 percent for the other. They found that “the median estimates of the percentage of African countries in the United Nations were 25 and 45 for groups that received 10 and 65, respectively, as starting points.” Put differently, those who received 10 percent estimated the percentage of African countries in the UN to be 25, whereas those who received 65 percent estimated the percentage of African countries in the UN to be 45. As one author puts it, “the brain isn’t good at disregarding facts, even when it knows those facts are useless.”
Along the same lines is the “conjunction fallacy,” which highlights our propensity to misunderstand probability. Here is a simple example. Which description of my friend Brent is more likely: 1) he is the CEO of Bank of America, or 2) he is the CEO of Bank of America and his annual salary is at least $1,000? Though your intuition may favor option two, option one is at least as likely because it carries fewer contingencies. In other words, even though it is virtually certain that he makes more than $1,000 a year as the CEO of Bank of America, the probability of option two can never exceed that of option one. As UCLA psychologist Dean Buonomano says, “the probability of any event A and any other event B occurring together has to be less likely than (or equal to) the probability of event A by itself.”
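Buonomano’s rule is easy to check for yourself. Here is a minimal simulation sketch, with made-up probabilities (the specific numbers are illustrative assumptions, not data from the example above): event A is rare, and event B is almost certain whenever A happens – yet the conjunction “A and B” still can never come out more frequent than A alone.

```python
import random

# Sketch of the conjunction rule: P(A and B) <= P(A).
# The probabilities below are illustrative assumptions.
random.seed(0)

TRIALS = 100_000
count_a = 0        # trials where A occurs (e.g., "Brent is the CEO")
count_a_and_b = 0  # trials where both A and B occur

for _ in range(TRIALS):
    a = random.random() < 0.001           # A is a rare event
    b = a and (random.random() < 0.99)    # B is almost certain given A
    if a:
        count_a += 1
    if a and b:
        count_a_and_b += 1

p_a = count_a / TRIALS
p_a_and_b = count_a_and_b / TRIALS
# The conjunction can never be more probable than either event alone.
assert p_a_and_b <= p_a
```

No matter how the two probabilities are chosen, the final assertion holds: adding a condition can only remove outcomes, never add them.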
The point I am driving at is that we are easily manipulated by irrelevant information. Why? There is a fairly simple explanation.
For most of human history our species survived in a simple world where there wasn’t TV, the internet, fast food, birth control pills, or economic meltdowns. There was just one thing – survival. This is what our psychologies evolved for. Unfortunately, there is a significant mismatch between the world our psychologies were built for and the world as it is today. Food illustrates this disconnect. In hunter-gatherer societies, where food was scarce, it was smart to load up on as many fatty and salty foods as possible. Now it would be stupid, or at least bad for your health, to visit your local McDonald’s every day – a business that relentlessly takes advantage of our primitive appetites.
Here’s the kicker: the same is true for anchoring and the conjunction fallacy. In hunter-gatherer societies humans didn’t have to decide between differently priced gas grills or bags of chips, or figure out probabilities. They just had to understand how to get food, build shelter and exist long enough to pass on their genes. Because of this, our poor judgment is “not a reflection of the fact that [our brains were] poorly designed, but… that [they were] designed for a time and place very different from the world we now inhabit,” as Buonomano says.
Unfortunately, this means that unless natural selection speeds up, we won’t be getting better any time soon.