Cognitive bias

Introduction#

We all exhibit cognitive biases when making judgements and decisions. They are impossible to avoid, but by understanding how they come about we can counteract them to some extent.

This page describes the two systems model for understanding the origin of biases and outlines some of the common biases to be aware of.

Key points
  • Cognitive biases are a natural and unavoidable side effect of our evolved mental processes.
  • They can lead us to make decisions and judgements that are illogical and objectively poor.
  • A better awareness of cognitive biases can help us reduce their impact.

Two systems#

In his book Thinking, Fast and Slow, Daniel Kahneman gives us a mental model for two very different modes of thinking that give rise to biases. He calls these System 1 and System 2, characterises each, and provides examples to illustrate the differences. Quotes are from that book unless otherwise stated.

System 1 operates automatically and quickly with little or no effort and no sense of voluntary control.

System 1 is fast, intuitive, automatic, unconscious, effortless and biased. Examples of System 1 thinking include:

  • Reading clear text in your native language.
  • Driving a car on an empty road.
  • Thinking of a good chess move (if you’re a chess master).

System 2 is in charge of doubting and unbelieving, but System 2 is sometimes busy, and often lazy.

System 2 is slow, logical, deliberate, conscious, effortful and lazy. Examples of System 2 thinking include:

  • Reading barely legible text, or text in a language you do not know well.
  • Parking in a tight parking space.
  • Looking for a woman wearing a red coat.

The automatic operations of System 1 generate surprisingly complex patterns of ideas, but only the slower System 2 can construct thoughts in an orderly series of steps.

We have evolved to use our quick, efficient System 1 as much as possible. It often guides us well, providing ready answers to many of life’s challenges with seemingly no effort. When faced with new challenges, or ones it knows are outside its capability, it will recruit System 2 to provide deep, logical analysis. But it can be over-confident and give us quick answers that are wrong.

We can be blind to the obvious, and we are also blind to our blindness.

These intuitive answers can be so convincing that even when we know them to be wrong, we cannot intuitively see them as wrong. A simple example is the Müller-Lyer illusion. Even when we know the lines are the same length, we cannot see them as such.

Müller-Lyer illusion. Even when we know the lines are the same length, we cannot see them as the same.

Common biases#

Confirmation Bias#

When people believe a conclusion is true, they are also very likely to believe arguments that appear to support it, even when these arguments are unsound.

This is confirmation bias: the tendency to seek out, interpret and recall evidence that supports what we already believe, and to underweight or ignore evidence that contradicts it. System 1 favours a coherent story over an accurate one, so uncritical acceptance of confirming evidence comes naturally.

Halo Effect#

The tendency for our overall impression of a person (or product, or company) to colour our judgement of their individual traits. If we like someone, we are inclined to rate everything about them favourably, including qualities we have never actually observed.

Availability Heuristic#

We judge how likely or frequent something is by how easily examples come to mind. Vivid, recent or widely reported events (plane crashes, shark attacks) therefore feel far more common than they really are, while mundane risks are underestimated.

Optimism Bias#

The tendency to overestimate the likelihood of good outcomes and to underestimate risks, costs and timescales. A familiar instance is the planning fallacy: we consistently underestimate how long our own projects will take, even when we know that similar projects have overrun.

Hindsight Bias#

Once an event has happened, we find it hard to recall how uncertain it seemed beforehand; the outcome feels as though it was predictable all along. This “knew it all along” effect leads us to judge past decisions by their outcomes rather than by the information available at the time.

Anchoring Bias#

Our estimates are pulled towards whatever value we considered first, even when that anchor is arbitrary or irrelevant. A high asking price anchors how much a house seems to be worth; an opening offer anchors the rest of a negotiation.

Quotes#

A reliable way to make people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth.

If a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier and will answer it. I call the operation of answering one question in place of another substitution.

Thinking, Fast and Slow is, in a sense, a tour of the blind spots of human cognition. Kahneman shows how the intuitions that guide everyday thinking can fail—sometimes in mundane ways, but sometimes quite spectacularly. A recurring theme in the book is the limits of human intuition when confronted with rare or unexpected phenomena. These blind spots are significant because modern life often requires judging probabilities and assessing statistical trends—tasks in which intuition can be more hindrance than help.

(From https://www.coursehero.com/lit/Thinking-Fast-and-Slow/quotes/)