Of Two Minds: How to Get Better at Decision-Making by Thinking Fast and Slow
“Think fast!” It’s a hallway scene from any clichéd high school movie. A football has left the hands of a student most likely wearing a varsity jacket and is flying over the heads of teenagers swanning or skulking their way to class along a locker-flanked corridor. What happens next?
If the receiver is similarly sporty, they’ll jump or dive and somehow seamlessly catch the ball to much applause and backslapping. If, however, the person on the receiving end is your standard nerdy archetype, the football will almost certainly hit them square in the face, sending glasses askew and a stack of books to the ground. They simply didn’t think fast.
These two characters represent, more or less, how you think. The sporty kid is your System 1 thinking, which kicks into gear when you require split-second, reflexive action or judgment calls. The nerdy one is System 2, which may not react at the speed of light but excels at figuring out problems that require deeper consideration. These two systems usually work very well in tandem, but problems arise when System 1 is so on the ball, so to speak, that it reacts before System 2 has a chance to kick in. If you don’t have time to read on to learn more, check out the video above, which explains System 1 and System 2 thinking in about a minute.
Daniel Kahneman and The Science of Decision-Making
These two modes of thinking were famously described by Israeli-American psychologist and behavioral economist Daniel Kahneman, who won the Nobel Memorial Prize in Economic Sciences in 2002. In 2011, he published Thinking, Fast and Slow, a book based on his groundbreaking research, which has since become an international bestseller and one of the most recognizable nonfiction books in the world.
Thinking, Fast and Slow
The book summarizes decades of analysis Kahneman conducted with his research partner, Amos Tversky, and helps us understand the nature of our biases, how we make decisions based on risk, familiarity, and probability, and what lies at the core of our search for happiness. To get a full sense of the depth and breadth of his work, and of how System 1 and System 2 thinking play off one another, you’ll have to read Thinking, Fast and Slow, but you can get a taste of the key insights on Blinkist. If you have even less than 15 minutes, here are a few of his main ideas.
Lazy Brain Shortcuts
In the video above, Page and Turner introduce the famous bat-and-ball problem, which goes like this: A bat and a ball cost $1.10 in total. The bat costs one dollar more than the ball. How much does the ball cost? If you guessed $0.10, I’m sorry to tell you you’re wrong, but don’t worry, that’s also the conclusion most people jump to. Feel free to take a minute and do the math, or watch the video to have the concept explained.
“Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it. It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.”
What happened is that your automatic, intuitive, “fast” System 1 thinking jumped in and answered the question with what seemed to be the most obvious answer. When you stopped to think about it, System 1 stepped back to let your “slow” System 2 work out the arithmetic. The bat-and-ball problem neatly shows how our brain takes shortcuts when we perceive a problem to be easier than it actually is. Our brains are profoundly energy-efficient, which also makes them extremely lazy in certain cases. This is known as the law of least effort: pausing to check the right answer with System 2 requires more energy, so our minds try to palm the problem off on speedy System 1. Getting it wrong has no major consequences in a math puzzle like this, but you can easily see how your speedy System 1 thinking could get you into scrapes that your more thoughtful System 2 could avoid. Over-relying on System 1 also limits the depth of our thinking and our capacity to understand complex situations.
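If you’d like to see the slow, System 2 arithmetic spelled out, here’s a minimal sketch in Python (the variable names are ours, purely for illustration):

```python
# Let the ball cost x dollars. The bat costs one dollar MORE than the ball,
# so bat = x + 1.00, and together they total $1.10:
#   x + (x + 1.00) = 1.10  ->  2x = 0.10  ->  x = 0.05

ball = 0.05        # the correct answer: five cents
bat = ball + 1.00  # the bat costs $1.05

# The correct answer really does total $1.10...
assert round(ball + bat, 2) == 1.10

# ...while the intuitive System 1 answer ($0.10) would make the
# total $1.20, because the bat would then have to cost $1.10.
intuitive_ball = 0.10
assert round(intuitive_ball + (intuitive_ball + 1.00), 2) == 1.20
```

The System 1 trap is reading “one dollar more than the ball” as simply “one dollar”: the ten cents left over feels like it must belong to the ball, but the difference between $1.00 and $0.10 is only ninety cents, not a dollar.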
So, next time the answer seems obvious, why not pause for a second and engage your savvy System 2 thinking to make sure your lazy brain doesn’t just take the easy way out?
Snapping Judgment and Cutting Bias
To explore why your System 1 thinking might cause you to make biased snap judgments, let’s consider a different situation. Imagine you’re in a bar and you end up having a really nice chat with someone called Susan. Because you found Susan really easy to talk to, you will most likely assume positive things about the rest of her character. We often make decisions about whether we like or dislike someone based on very little information. This is because of our mind’s tendency to oversimplify situations due to what is known as exaggerated emotional coherence, or the halo effect. The halo effect is a kind of cognitive bias, and it’s far from the only one.
“A reliable way of making people believe in falsehoods is frequent repetition, because familiarity is not easily distinguished from truth.”
One you may have heard of before is confirmation bias. This is another example of System 1 taking shortcuts without engaging the analytical powers of System 2. Confirmation bias makes people tend to agree with information that supports what they already know or believe. It also means that we’re very open to suggestion, and that leading questions can influence our assumptions. For example, if I ask you, “Is Hawaii amazing?”, you’re very likely to think Hawaii is amazing, whether or not you have any existing information about Hawaii and its levels of amazingness.
The halo effect and confirmation bias lead us to quick judgments which are more energy-efficient for the brain, but are often inaccurate due to our lack of data. We automatically fill in gaps in the information to make a quick decision, but this can often result in wrong conclusions.
Try to develop an awareness of your own biases—we all have them!—because otherwise, your brain will just continue to make snap judgments without you realizing, and that can lead to questionable choices and poor decision-making. It’s also a good idea to research topics and people that you need to make decisions about—like job applicants or political candidates, not people you meet in bars, please!—and to analyze the language used in questions, surveys, and studies, to make sure you’re not being encouraged to answer in a particular way.
Why ‘Availability Heuristics’ Make You Believe Fake Facts
We now know that the brain takes shortcuts and jumps to conclusions with limited information in order to help us immediately understand what’s happening around us. These shortcuts are called heuristics, and help us to avoid a huge amount of daily cognitive strain. If we had to re-process every single situation afresh, we would have very, very tired brains. System 1 helps us to conserve our mental energy for the tasks that really require it. However, as mentioned above, this can lead us into trouble. There are many types of heuristics, but let’s focus on two to explain how they work: the substitution heuristic and the availability heuristic.
“This is the essence of intuitive heuristics: when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.”
The substitution heuristic is where we answer an easier question than the one we’ve been asked. For example, if you’re asked, “Will this woman running for mayor be successful in the role?”, you’re likely to substitute a question that’s easier to answer, like, “Does this woman look like a successful mayor?”. Instead of researching her record, the substitution heuristic means you’ll make an assessment based on your existing idea of what a successful mayor looks like. If she doesn’t match that image, you’re unlikely to support her, even if she’s an excellent candidate.
With the availability heuristic, you overestimate the probability of something you hear about often or find easy to remember. A good example is how people estimate what they’re most likely to die from. Statistically, strokes cause more deaths than accidents, but one study found that 80% of respondents considered it more likely they’d die in an accident than from a stroke. Why? We simply don’t hear about stroke deaths on the news, whereas accidents are reported all the time in all their gory detail. This makes us remember them more vividly and believe that horrific accidents are far more common, and far more likely, than they actually are.
System 1 thinking makes our lives easier and faster, but being aware of how it works might give us a richer understanding of how to get better at smart decision-making and enable us to question our go-to assumptions. To discover more about how your System 1 and System 2 thinking work, watch the video above and pick up a copy of the fascinating Thinking, Fast and Slow today. You can also read the key ideas from the book on Blinkist.