
The Shortcuts of Thinking: Heuristics and Biases

To be more efficient, human beings take mental shortcuts that allow us to make decisions while saving time, effort and resources. Shortcuts often lead us to our destination, but sometimes they cause us to get lost.

Throughout the day, we make many decisions. We decide what to eat for breakfast, where to invest our money, which route to take to the office, and how to respond to an email, among many other things.

Most of our decisions are simple and intuitive, but others are much more complicated and require our full attention. In each case, we think in different ways.

1. The two systems of thinking

Following in the footsteps of psychologists Keith Stanovich and Richard West, Nobel Prize-winning economist Daniel Kahneman distinguished two systems of thought.

System 1 operates quickly and automatically and does not require voluntary control or much effort. It is a spontaneous part of our thinking.
System 2, on the other hand, is much slower and more reflective. It works consciously and requires considerable effort. It is our most rational facet.

Both systems of thinking govern our mental life and decision-making processes by dealing with different tasks.

In our day-to-day lives, we rely on system 1 to think about simple things and act in contexts we know well. This system detects hostility in a tone of voice, completes the operation 2+2=?, reads words on billboards, drives a car on a quiet road, and generally takes care of making easy decisions.

System 2, on the other hand, takes care of things that demand attention and effort. It is the system that answers questions on an exam, parks in a tight space, compares the price and performance of two computers, searches for a particular person in a crowd and, in principle, takes responsibility for making the most difficult decisions.

Ideally, our system 1 should focus only on the less demanding tasks and whenever we have to deal with a difficult issue, system 2 should be activated. In other words, it would be appropriate for us to think fast about simple issues and slow about complicated ones. However, this is not what happens in all cases, as system 2 needs time, effort and resources that it does not always have at its disposal. In such cases, heuristic strategies help us to deal with complex issues.

2. What are the heuristic strategies?

In circumstances where we have to answer a complex question, but our system 2 does not have enough time, will or resources to do so, we tend to take a mental shortcut: we substitute simpler questions for the complex question, answer the simple questions and act accordingly. This substitution strategy is called a “heuristic strategy” or simply “heuristics”.

There are different heuristic strategies that allow us to solve complex problems in simple ways, but they also cause numerous errors. Let’s look at a couple of examples.

3. Anchoring and adjustment heuristics

In their book Nudge, Richard Thaler and Cass Sunstein explain anchoring and adjustment heuristics as follows.

“Suppose we are asked to guess the population of Milwaukee, a city two hours north of Chicago, where we live. None of us know much about Milwaukee, but we think it is the largest city in Wisconsin. How should we proceed? One thing we could do is start with something we know, which is the population of Chicago, about three million. So, we would think, Milwaukee is a big city, but certainly not as big as Chicago, so, mmm, maybe it has a third of its population, so a million.”

In this case, the complex question is “What is the population of Milwaukee?” and the simple questions we ask ourselves to answer it are “What is the population of Chicago? Is Milwaukee as big as Chicago? Is it much smaller?”. By asking ourselves the simple questions, the answers to which we know, we find it much easier to venture an answer to the complex question, “One million.” However, we are likely to make mistakes.

“Now imagine someone from Green Bay, Wisconsin, is asked the same question. He also doesn’t know the answer [to what Milwaukee’s population is], but he knows that Green Bay has about a hundred thousand people and Milwaukee is bigger, so he assumes, for example, that it has three times the population, three hundred thousand people.

This process is called anchoring and adjustment. You start with an anchor, the known figure, and adjust in the direction you think is appropriate. Bias occurs because the adjustments are often insufficient. Experiments have repeatedly shown that, in problems similar to our example, Chicagoans tend to make guesses that are too high (based on their high anchor), while Green Bay residents’ guesses are too low (based on their low anchor). In reality, Milwaukee has about 580,000 residents.”
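The arithmetic in the Nudge example can be sketched in a few lines of code. This is only an illustration of the anchor-and-adjust pattern: the anchors, adjustment factors, and Milwaukee’s actual population come from the text, while the helper name `estimate` is our own invention.

```python
# Illustrative sketch of anchoring and adjustment (Nudge's Milwaukee example).
# Anchors and adjustment factors are taken from the quoted passage.

def estimate(anchor: int, adjustment_factor: float) -> int:
    """Start from a known anchor and adjust in the direction that seems right."""
    return round(anchor * adjustment_factor)

MILWAUKEE_ACTUAL = 580_000

# A Chicagoan anchors high (Chicago, ~3,000,000) and adjusts down to a third.
chicago_guess = estimate(3_000_000, 1 / 3)    # 1,000,000

# A Green Bay resident anchors low (~100,000) and adjusts up threefold.
green_bay_guess = estimate(100_000, 3)        # 300,000

# The adjustments are insufficient: each guess errs toward its own anchor.
print(chicago_guess > MILWAUKEE_ACTUAL)       # high anchor -> overestimate
print(green_bay_guess < MILWAUKEE_ACTUAL)     # low anchor -> underestimate
```

Both guesses miss in the direction of the figure each person started from, which is exactly the systematic pattern the anchoring bias describes.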

In general, when we make estimates, initial anchors induce systematic errors: high anchors favor high estimates, and low anchors favor low estimates. This phenomenon can lead us to make bad decisions, for example, when negotiating a salary in a job interview. If we come from a modest background, our reference salaries are likely to be low, and because of that initial anchor we will ask for less than people whose upbringing in wealthier circles gave them high reference salaries.

4. Availability heuristics

A friend is thinking of moving to the city where you live and asks you if it is a safe place. To answer the question correctly, you should have information that you do not have. You do not know the number of crimes committed in your city and in other similar cities, so you cannot make appropriate comparisons. How do you answer the question?

What you can do is try to recall crimes that have occurred in your city. You can replace the complex question “Is my city safe?” with the simple question “Do I remember many crimes in my city?”.

If you can remember crimes easily, you will think that they happen often and that the city is not very safe. If you have trouble remembering them, you will think that crimes happen very infrequently and that the city is very safe.

In general, our brain transforms the availability of memories into the probability of events. In other words: we consider what we can easily remember to be more probable.

The strategy of converting the availability of memories into the probability of events is called the “availability heuristic”. It is often very useful, because it spares us complex calculations, but it also causes us to make mistakes. If, for example, the media constantly report crimes in our city but not in other similar cities, we will think that we live in a less safe place than average, even though it is like any other city of the same size and condition. This error constitutes a bias.
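The recall-to-probability substitution above can be sketched as a toy model. Everything here is an illustrative assumption, not something from the article: the function name `perceived_risk`, the recall ceiling, and the incident counts are all made up to show how identical true rates can produce different risk judgments.

```python
# A toy model of the availability heuristic: we judge how common an event is
# by how many instances we can recall, not by its true frequency.

def perceived_risk(recalled_incidents: int, recall_ceiling: int = 10) -> float:
    """Map ease of recall to a subjective probability in [0, 1]."""
    return min(recalled_incidents, recall_ceiling) / recall_ceiling

# By assumption, the true crime rate is identical in both cities.
true_crime_rate = 0.05

# Heavy local media coverage makes local crimes easy to recall...
risk_home_city = perceived_risk(recalled_incidents=8)   # 0.8
# ...while identical crimes elsewhere go unreported and unremembered.
risk_other_city = perceived_risk(recalled_incidents=1)  # 0.1

# Same underlying rate, very different perceived risk: an availability bias.
print(risk_home_city > risk_other_city)
```

The point of the sketch is that nothing about the cities differs except how retrievable the memories are, yet the judgments diverge sharply.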

5. What is a bias?

In general, heuristics are advantageous to us, as they save us time, effort and resources and bring us closer to the truth and our goals. However, they also lead to systematic errors: biases.

A bias is a recurrent, systematic deviation in thought and action. We have seen that anchoring heuristics cause an anchoring bias and that availability heuristics cause an availability bias. In general, biases occur when system 2 does not operate with the time, effort and resources it needs. Precisely for that reason, if we want to avoid falling into biases, we must think much more slowly and carefully.

Other biases that affect our decision-making processes are:

– Confirmation bias: Information that reinforces our beliefs and attitudes seems more plausible to us. For that reason, we give more weight to it than to information that questions what we think. This tendency is reasonable, given that we cannot continually question our ideas. However, it makes it difficult for us to discard false beliefs and feeds dogmatism.
– Status quo bias: We prefer to keep things as they are, even in circumstances where we can improve our situation. Most of the time, we humans embody the adage that “A bird in the hand is better than two in the bush”. Thus, we avoid taking unnecessary risks, but we also discard opportunities for improvement.
– Sunk cost bias: We tend to continue projects in which we have invested time, effort, or resources. For example, after ten years with a partner, we are more likely to continue with that partner. This bias favors commitment and the consolidation of habits, but it also causes us to insist on keeping alive projects (personal, professional or otherwise) that are doomed to failure.

These biases (and many others!) affect all of us daily, causing harm in our personal and professional lives. Fortunately, their systematic nature allows us to anticipate and correct them by carrying out behavioral change interventions.

Conclusion

– Humans take mental shortcuts to solve complex problems in simple ways. In this way, we save time, effort and resources.
– Mental shortcuts (heuristic strategies) consist of transforming complex issues into simple issues, solving the simple issues and acting accordingly.
– Heuristics produce the systematic deviations that we call biases. Biases are recurrent failures in the way we think and act.
– As they recur, we can anticipate and correct biases by making behavioral change interventions, both in our daily lives and in our companies and institutions.

If you are interested in how to conduct behavioral change interventions to prevent biases and other forms of irrationality from hurting your business, look at this article or contact us.

Are you interested?

Get in touch!


Ángel Longueira Monelos

angellongueira@beway.com
