
Posts tagged ‘Anchoring’

The Price of Framing & Anchoring

I love Wikipedia, which is why I am more than willing to donate some money. But I was a little taken aback when I saw that its initial asking price is $20. I was thinking a few bucks at most, certainly not $20… that’s four Bud Lights in NYC! What’s interesting is that after seeing the initial price of $20, giving five, six, or seven dollars as opposed to one or two didn’t seem that bad. But then my knowledge of cognitive biases reminded me that Jimmy Wales was playing me.

The green box on the right illustrates a cognitive bias known as anchoring, which “describes the common human tendency to rely too heavily… on one trait or piece of information when making decisions” (I took this quote, appropriately, from Wikipedia). A good bargainer uses anchoring to set the initial price high and give the buyer the illusion that he or she is getting a good deal. Likewise, Wales set the smallest donation at $20 to make, say, ten dollars seem like not that much. I mentioned anchoring about a month ago; now I want to turn to its evil twin, the framing effect, which also distracts us with irrelevant information.

To get a sense of the power of framing, consider Dan Ariely’s example, which appears in the first chapter of Predictably Irrational. Below are three subscription plans offered by Economist.com. Which would you choose?

  1. Economist.com subscription – US $59.00 One-year subscription to Economist.com. Includes online access to all articles from The Economist since 1997.
  2. Print subscription – US $125.00 One-year subscription to the print edition of The Economist.
  3. Print & web subscription – US $125.00 One-year subscription to the print edition of The Economist and online access to all articles from The Economist since 1997.

If you read closely, something strange should have jumped out at you. Who would, as Ariely says, “want to buy the print option alone… when both the Internet and the print subscriptions were offered for the same price?” At first it seems as if someone at The Economist made a mistake; after all, how could a one-year print subscription cost the same as a one-year print subscription plus online access to every article since 1997? But after thinking for a second, you may realize that the people at The Economist are not all that stupid; they may in fact know a thing or two about human behavior.

To see just how influential the “framing” of the Economist’s subscription plans is, Ariely conducted the following experiment. First, he presented his MIT Sloan School of Management students with the options as they appeared on Economist.com and had them choose a subscription. Here were the results.

  • Internet-only subscription for $59 – 16 students
  • Print-only subscription for $125 – 0 students
  • Print-and-Internet subscription for $125 – 84 students

It makes sense – who would choose option two given option three? But the question is: how much did option two influence the students’ decision making? Ariely conducted a second experiment to find the answer. He gave the same subscription plans, this time without the second option, to a second group of students and had them pick one. Here were the results.

  • Internet-only subscription for $59 – 68 students
  • Print-and-Internet subscription for $125 – 32 students

As you can see, simply removing the second option shifted the students’ preferences dramatically. Without the print-only option, 68 students chose the Internet-only subscription while only 32 chose the print-and-Internet subscription. How significant is this? Well, let’s say that instead of running this experiment with 100 graduate students, you ran it with 10,000 customers in the real world, and that all 10,000 signed up for a subscription. In the first scenario, where all three options are presented, 8,400 people would have chosen the print-and-Internet subscription, 1,600 would have chosen the Internet-only subscription, none would have chosen the print-only subscription, and The Economist would have made $1,144,400 in revenue. Compare this to the second scenario: 6,800 choose the Internet-only subscription, 3,200 choose the print-and-Internet subscription, and The Economist would have made $801,200 in revenue. Simply by including a decoy option, The Economist would have made $343,200 more.
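If you want to check the arithmetic yourself, here is a minimal Python sketch of that back-of-the-envelope calculation, using the student counts above and the same assumption that each of the 100 students stands in for 100 real customers (the plan names are just labels for this illustration):

    # Back-of-the-envelope revenue comparison for the two subscription menus.
    PRICES = {"web": 59, "print": 125, "print_and_web": 125}  # US dollars per year
    SCALE = 100  # 100 students in the experiment -> 10,000 hypothetical customers

    def revenue(student_counts):
        """Total yearly revenue if every student represented SCALE real customers."""
        return sum(SCALE * n * PRICES[plan] for plan, n in student_counts.items())

    with_decoy = revenue({"web": 16, "print": 0, "print_and_web": 84})
    without_decoy = revenue({"web": 68, "print_and_web": 32})

    print(with_decoy)                   # 1144400
    print(without_decoy)                # 801200
    print(with_decoy - without_decoy)   # 343200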

So what’s the lesson? When you go out this weekend to restaurants or bars, remember that all those gimmicks are just waiting to feast on your cognitive biases. Maintain rationality!

The Evil of Irrelevant Information: Anchoring & The Conjunction Fallacy

I want to think that people are rational consumers, but it’s hard to ignore the overwhelming evidence that says they’re not. You don’t even have to read the academic literature to realize this, just go to the grocery store! As you walk down the aisle and see a delicious bag of chips with “50 percent less calories,” ask yourself this: would you have bought it if it said “with 50 percent as many calories”? Or how about the medication over in the pharmacy section that works “99 percent of the time”: would you buy it if it were “ineffective 1 percent of the time”? In both cases the answer is probably not.

We succumb to these silly things because our brains are easily fooled by numerical manipulations. As psychologist Barry Schwartz explains, “when we see outdoor gas grills on the market for $8,000, it seems quite reasonable to buy one for $1,200. When a wristwatch that is no more accurate than the one you can buy for $50 sells for $20,000, it seems reasonable to buy one for $2,000.” Whether you like it or not, your decisions are easily swayed.

Let’s look at some more examples.

Imagine you’re at an auction bidding on a bottle of Côtes du Rhône, a bottle of Hermitage Jaboulet La Chapelle, a cordless keyboard and mouse, a design book and a one-pound box of Belgian chocolates. Before the auction starts, the auctioneer asks you to jot down the last two digits of your social security number, indicate whether you would be willing to pay that amount for each of the products, and then write down the maximum amount you would be willing to bid for each one. When Dan Ariely, Drazen Prelec and George Loewenstein ran this auction with a group of MIT undergrads, they found that the social security numbers greatly influenced the students’ bids. In Ariely’s words:

The top 20 percent (in terms of the value of their social security numbers), for instance, bid an average of $56 for the cordless keyboard; the bottom 20 percent bid an average of $16. In the end, we could see that students with social security numbers ending in the upper 20 percent placed bids that were 216 to 346 percent higher than those of the students with social security numbers ending in the lowest 20 percent.

Ariely’s experiment illustrates a cognitive bias known as anchoring, which describes our inability to ignore irrelevant information and assess things at face value. The classic anchoring experiment comes from Daniel Kahneman and Amos Tversky. Two groups were asked whether the percentage of African countries in the United Nations was higher or lower than a given value: 10 percent for one group and 65 percent for the other. They found that “the median estimates of the percentage of African countries in the United Nations were 25 and 45 for groups that received 10 and 65, respectively, as starting points.” Put differently, those who received 10 percent estimated the percentage of African countries in the UN to be 25, whereas those who received 65 percent estimated it to be 45. As one author puts it, “the brain isn’t good at disregarding facts, even when it knows those facts are useless.”

Along the same lines is the “conjunction fallacy,” which highlights our propensity to misunderstand probability. Here is a simple example. Which description of my friend Brent is more likely: 1) he is the CEO of Bank of America, or 2) he is the CEO of Bank of America and his annual salary is at least $1,000? Though your intuition strongly favors option two, option one is more likely because it involves fewer contingencies. In other words, even though it is virtually certain that he makes more than $1,000 a year as the CEO of Bank of America, the probability of option one is still at least as high. As UCLA psychologist Dean Buonomano says, “the probability of any event A and any other event B occurring together has to be less likely than (or equal to) the probability of event A by itself.”
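In symbols, the rule Buonomano is describing is just P(A and B) = P(A) × P(B given A), and since P(B given A) can never exceed 1, the conjunction can never beat A on its own. Here is a tiny Python sketch of that reasoning; the probabilities are made up purely for illustration:

    # Conjunction rule: P(A and B) = P(A) * P(B | A), and P(B | A) <= 1,
    # so the conjunction can never be more probable than A by itself.
    p_ceo = 1e-8                 # made-up probability that Brent is the CEO of Bank of America
    p_salary_given_ceo = 0.9999  # made-up: a CEO almost certainly earns at least $1,000 a year

    p_ceo_and_salary = p_ceo * p_salary_given_ceo

    print(p_ceo_and_salary <= p_ceo)  # True, no matter what numbers you plug in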

The point I am driving at is that we are easily manipulated by irrelevant information. Why? There is a fairly simple explanation.

For most of human history our species survived in a simple world with no TV, internet, fast food, birth control pills, or economic meltdowns. There was just one thing – survival. That is what our psychologies evolved for. Unfortunately, there is a significant mismatch between the world our psychologies were built for and the world as it is today. Food illustrates this disconnect. In hunter-gatherer societies, where food was scarce, it would have been smart to load up on as many fatty and salty foods as possible. Now it would be stupid, or at least bad for your health, to visit your local McDonald’s every day; the chain relentlessly takes advantage of our primitive appetites.

Here’s the kicker: the same is true for anchoring and the conjunction fallacy. In hunter-gatherer societies humans didn’t have to decide between differently priced gas grills or bags of chips, or figure out probabilities. They just had to understand how to get food, build shelter and live long enough to pass on their genes. Because of this, our poor judgement is “not a reflection of the fact that [our brains were] poorly designed, but… that [they were] designed for a time and place very different from the world we now inhabit,” as Buonomano says.

Unfortunately, this means that unless natural selection speeds up, we won’t be getting better any time soon.

