
How to Get Better at Decision-Making by Thinking, Fast and Slow

Want to get better at decision-making? Daniel Kahneman’s groundbreaking Thinking, Fast and Slow can help you learn to work with, not against, your brain.
by Carrie M. King | Sep 29 2020

It’s a hallway scene from a high school movie.

“Think fast!”

A football leaves the hands of a student who is most likely wearing a varsity jacket. It’s flying over the heads of teenagers who are moving along the corridor to class. What happens next?

If the receiver is sporty, they’ll jump or dive and somehow catch the ball. However, if the receiver is nerdy, the football will likely hit them square in the face.

They didn’t think fast.

These two caricatures represent, more or less, how thinking works in the human mind. The sporty kid is System 1 thinking, which kicks into gear when you need a split-second, reflexive action or judgement call. System 1 is fast, impulsive, and responsive to in-the-moment situations.

The nerdy kid is System 2 thinking, which may not react at the speed of light but excels at figuring out problems that require deeper consideration. The two systems usually work well in tandem, but systematic errors arise when System 1 is so on the ball, so to speak, that it reacts before System 2 has a chance to kick in.

If you don’t have time to read on, check out the video above, which explains System 1 and System 2 thinking in about a minute.

Daniel Kahneman and The Science of Decision-Making

These two modes of thinking were first described by Israeli-American psychologist and behavioral economist, Daniel Kahneman, who won a Nobel prize in 2002. In 2011, he published Thinking, Fast and Slow, a book based on his groundbreaking, Nobel prize-winning research. Selected by The New York Times Book Review as one of the ten best books of 2011, it has since become—and remained—an international bestseller.

The book summarizes decades of analysis Kahneman conducted with his research partner, Amos Tversky. Digging deep into cognitive psychology and economics, it explains the nature of our biases, how we make decisions based on risk, familiarity, and probability, and what lies at the core of our search for happiness.

To get a full sense of the depth and breadth of his work, you’ll have to read Thinking, Fast and Slow, but you can get a taste of the key insights on Blinkist. If you have less than 15 minutes, here are a few of his main ideas.

Lazy Brain Shortcuts

In the video above, Page and Turner introduce the famous bat-and-ball problem. A bat and ball cost $1.10. If the bat costs one dollar more than the ball, how much does the ball cost?

If you guessed $0.10, I’m sorry to tell you you’re wrong, but don’t worry, that’s also the conclusion most people jump to. Feel free to take a minute and do the math, or watch the video to have the concept explained.
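If you’d rather let your careful System 2 check the algebra, here’s a quick sketch in Python (the variable names are ours, not Kahneman’s): since the bat costs $1.00 more than the ball, the two prices are ball and ball + 1.00, and together they must sum to $1.10.

```python
# Bat-and-ball problem:
#   bat + ball = 1.10  and  bat = ball + 1.00
# Substituting: ball + (ball + 1.00) = 1.10  ->  2 * ball = 0.10
ball = (1.10 - 1.00) / 2   # the ball costs 5 cents, not 10
bat = ball + 1.00          # so the bat costs $1.05

print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")  # ball = $0.05, bat = $1.05
```

The intuitive $0.10 answer fails the check: a $0.10 ball plus a $1.10 bat would total $1.20, not $1.10.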

“Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it. It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.”
Daniel Kahneman, Thinking, Fast and Slow

What happened is that your automatic, intuitive, “fast” System 1 thinking jumped in and answered the question based on what seemed to be the most obvious answer. When you stopped to think about it, System 1 stepped back to let your “slow” System 2 work out the arithmetic.

The bat-and-ball problem shows how our brain takes shortcuts when we perceive a problem to be easier than it actually is. Our brains are energy-efficient, which also makes them extremely lazy in certain cases. This is known as the law of least effort.

Pausing to check the right answer with System 2 requires more energy than pawning the problem off on our speedy System 1. That has no major impact in the case of a math problem like this, but you can easily see how your speedy System 1 thinking could get you into scrapes that your more thoughtful System 2 could avoid.

Relying on System 1 thinking also means our mind is limiting the depth of our intelligence and capacity to understand complex situations.

So, next time the answer seems obvious, why not pause for a second and engage your savvy System 2 thinking to make sure your lazy brain doesn’t just take the easy way out?

Snapping Judgment and Cutting Bias

To explore why your System 1 thinking might cause you to make biased snap judgments, let’s consider a different situation. Imagine you’re in a bar and you end up having a really nice chat with someone called Susan. Because you found Susan really easy to talk to, you will most likely assume positive things about the rest of her character.

We often make decisions about whether we like or dislike someone based on very little information. This is because of our mind’s tendency to oversimplify situations due to what is known as exaggerated emotional coherence.

“A reliable way of making people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth.”
Daniel Kahneman, Thinking, Fast and Slow

One bias you may have heard of before is confirmation bias. It’s another example of System 1 taking shortcuts without engaging the analytical powers of System 2.

Confirmation bias leads people to agree with information that supports what they already know or believe. It also means that we’re very open to suggestion, and that leading questions can influence our assumptions.

For example, if I ask you, “Is Hawaii amazing?”, you’re very likely to think Hawaii is amazing, whether or not you have any existing information about Hawaii and its levels of amazingness.

Try to develop an awareness of your own biases—we all have them!—because otherwise, your brain will just continue to identify with System 1 thinking and make snap judgments without you realizing. That can lead to questionable choices and poor decision-making.

It’s also a good idea to research topics and people that you need to make decisions about—like job applicants or political candidates, not people you meet in bars, please!—and to analyze the language used in questions, surveys, and studies, to make sure you’re not being encouraged to answer in a particular way.

Why ‘Availability Heuristics’ Make You Believe Fake Facts

We now know that the brain takes shortcuts and jumps to conclusions with limited information in order to help us immediately understand what’s happening around us. These shortcuts are called heuristics, and help us to avoid a huge amount of daily cognitive strain. If we had to re-process every single situation afresh, we would have very, very tired brains.

System 1 helps us to conserve our mental energy for the tasks that really require it. However, as mentioned above, this can lead us into trouble. There are many types of heuristics, but let’s focus on two to explain how they work: the substitution heuristic and the availability heuristic.

“This is the essence of intuitive heuristics: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.”
Daniel Kahneman, Thinking, Fast and Slow

The substitution heuristic is where we answer an easier question than the one we’ve been asked. For example, if you’re asked, “Will this woman running for mayor be successful in the role?”, you’re likely to substitute that question with one that’s easier to answer like, “Does this woman look like a successful mayor?”.

Instead of researching her record, the substitution heuristic will mean you will make an assessment based on your existing idea of what a successful mayor looks like. If the two images don’t measure up, you’re unlikely to support her, even if she’s an excellent candidate.

With the availability heuristic, you overestimate the probability of something you hear about often or find easy to remember. A good example is how people assess the ways they are most likely to die.

Statistically, strokes cause more deaths than accidents, but one study found that 80% of respondents considered it more likely that they’d die in an accident than from a stroke. Why? We simply don’t hear about stroke deaths on the news, whereas accidents are reported all the time in all their gory detail. This makes us remember them more clearly and believe that horrific accidents are far more common than they actually are.

System 1 thinking makes our lives easier and faster, but being aware of how it works might give us a richer understanding of how to get better at smart decision-making and enable us to question our go-to assumptions. To discover more about how your System 1 and System 2 thinking work, watch the video above and pick up a copy of the fascinating Thinking, Fast and Slow today. You can also read the award-winning psychologist’s key ideas on Blinkist.
