Confirmation bias – The bias that supports all your other biases

[Note: I shared this mental model with my email subscribers on Nov 27, 2016. If you want to receive a new mental model every week, join the club.]


Faced with the choice between changing one’s mind and proving there is no need to do so, almost everyone gets busy on the proof. – J. K. Galbraith

What it is:

Confirmation bias is our tendency to seek information that confirms our prior beliefs, and to ignore evidence to the contrary. This happens in a few ways:

  • When we see evidence that confirms our beliefs, we accept it with ease. But when we see contrary evidence, the bar suddenly becomes much higher, and we look for ways to dismiss the new facts, as this delightfully funny comic shows.
  • We interpret new information in a way that suits our beliefs.

As Sherlock Holmes warned, we begin “to twist facts to suit theories, instead of theories to suit facts”.

Or, in Warren Buffett’s words:

“the human being is best at … interpreting all new information so that their prior conclusions remain intact”.


The Rorschach Test: What pattern do you see?

Examples in business:

  • We choose performance metrics that suit our conclusion. Call it entrepreneurial optimism, but we invariably pick the metric that shows the most positive performance. “So what if the overall retention numbers are down? At least the <enter ridiculous random metric here> is going up.”
  • We assume our competitors are stupid and evil. In general, we attribute the worst characteristics to people we dislike. So any strategic choice my competitor makes is either (a) a bad idea that is definitely going to fail, or (b) copied from me.
  • We see patterns where they don’t exist. As an investor at OperatorVC, I have to be extra careful about this. It’s very easy to find examples of failed or successful startups that are similar to the one I’m evaluating now (depending on what I want to find, of course). And analogies are the worst. It’s hard to avoid what Scott Adams calls bumper-sticker thinking.
  • “Hammer looking for a nail.” When you’ve come up with a cool product or concept, you start looking for an application for it. You start with a solution and then look for a problem. The trouble is, you’ll find problems aplenty; everything will look like it fits your concept.

At a meta-level, this goes for mental models too. After last week’s issue, I kept seeing situations where people were mistaking “the map for the territory”. They weren’t. That was confirmation bias at work.

Chris Anderson made the same mistake, applying his Long Tail mental model to everything. He even called Al-Qaeda a “supercharged niche supplier” in “the long tail of national security” (!). Read Tim Wu’s hilarious account of this in The Wrong Tail.


Rules to protect yourself:

  1. When you have a hypothesis, look for disproving evidence first. Follow Charles Darwin’s golden rule: whenever he came across an observation that contradicted his theory, he wrote it down immediately, knowing his mind would otherwise conveniently forget it.
  2. Don’t rationalize in hindsight. Make a prediction first, and then see how things match up. In the performance-metrics example above, choose one North Star metric up front and track that. Don’t switch to other, more favorable metrics after the fact.
  3. Don’t use one or two mental models for everything. Seriously, the world is not that simple. You need a hammer, but you also need scalpels and spanners. [Moral: Keep checking out my mental models section every week]


TL;DR: Be careful what you look for. You’ll find it.

Want to get new mental models straight to your inbox? The next one arrives this Sunday – don’t miss it!
