Thinking, Fast and Slow Book Summary

Thinking, Fast and Slow is quite a weighty text (well, a long listen). The 16-hour audiobook is so dense with insights that I had to drop from my standard 1.5x/2x playback and listen at normal speed. As this is such a huge book, my summary only skims the surface.

The book is more impactful if you participate in the exercises as you read or listen along. These exercises highlight how your brain is just as fallible as the brains of the people Kahneman studied. The audiobook has a PDF attachment so listeners don’t miss out.

Daniel Kahneman, a recipient of the Nobel Prize in Economic Sciences for his seminal work in psychology challenging the rational model of judgment and decision-making, is one of the world’s most important thinkers. His ideas have had a profound impact on many fields including business, medicine, and politics. This book brings that research together and makes it accessible to the layperson.

Part 1: Two Systems

Kahneman puts forward a view that our brains have two distinct ways of operating. He calls these System 1 and System 2.
System 1 thinking is intuitive thinking – fast, automatic and emotional. It is based on simple mental rules of thumb (“heuristics”) and thinking biases (cognitive biases) that result in impressions, feelings and inclinations.
System 2 thinking is rational thinking – slow, deliberate and systematic – and based on considered evaluation that results in logical conclusions.

This concept is covered in The Chimp Paradox which is a more accessible (but less detailed) explanation of two mental modes of operation.

System 1 generally runs the show. It is fast and efficient and probably conferred a biological advantage back in caveman days. In modern times it is System 1 in control when an experienced driver swerves to avoid a hazard before System 2 has even worked out what the hazard is. We think fast to accomplish routine tasks, and we need to think slow to manage complicated tasks.

System 2 kicks in when System 1 can’t find a suitable answer – when filling in a complex form, for example. System 2 requires more mental energy and is slower and more deliberate. System 2 also monitors System 1; however, it doesn’t always do a good job, which gives rise to the many cognitive biases the book describes.

We are not objective, rational thinkers. Our judgment, attitudes and behaviour are influenced by things we are not even aware of. We may stop walking when trying to solve a difficult problem. We turn down the car stereo when nearing the end of a journey. We find jokes funnier if we are smiling.

Even when we think we are being rational and thoughtful, we may be falling for a cognitive shortcut. When asked a difficult question such as “how happy are you with your life?”, System 1 can jump in with the answer to a similar but easier question: “what is my mood right now?”. This is called substitution, and it often evades System 2’s checks.

Part 2: Heuristics and Biases

System 1 suppresses doubt by constructing coherent stories from the data it has. System 2, our inner sceptic, is capable of doubt. System 2 can maintain incompatible possibilities at the same time to enable evaluation. However, sustaining doubt is harder work than sliding into certainty. This creates a bias for believing.

The “Anchoring Effect” is another powerful bias. If I say 10 and then ask you to guess how old Gandhi was when he died, you will guess younger than if I had first said 65. The number that precedes the question is unrelated and should have no impact, yet it does. This is a powerful tool used in sales and marketing.

People can make reasonably accurate assessments when presented with statistical data alone. But when presented with both a story and the statistics, people favour the story over the data: they make general inferences from particular cases rather than particular inferences from general cases.

Intuitive conclusions (from System 1) feed overconfidence because they feel right. Extreme predictions and a willingness to predict rare events from weak evidence are both manifestations of System 1. It is natural for the associative machinery to match the extremeness of predictions to the perceived extremeness of the evidence on which they are based.

The idea of regression to the mean is alien and difficult to communicate and comprehend. Matching predictions to the evidence is not only something we do intuitively; it also seems a reasonable thing to do. We will not learn to understand regression from experience.
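
A minimal sketch of why regression happens: if an observed score is stable skill plus random luck, then the stand-out performers on one occasion were, on average, lucky, and their next score falls back towards the mean. The model and numbers below are illustrative assumptions, not figures from the book.

```python
# Regression to the mean: score = stable skill + random luck.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
skill = rng.normal(100, 10, n)         # stable ability
test1 = skill + rng.normal(0, 10, n)   # observed score, attempt 1
test2 = skill + rng.normal(0, 10, n)   # same skill, fresh luck

stars = test1 > np.percentile(test1, 90)   # top decile on attempt 1
print(f"Stars on test 1: {test1[stars].mean():.1f}")      # well above 100
print(f"Same people, test 2: {test2[stars].mean():.1f}")  # closer to 100
```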

Part 3: Overconfidence

In order to understand the world around us, we create stories, which gives rise to the narrative fallacy. We think we understand the past, which implies the future should be knowable, but in fact we understand the past less than we believe we do. This in turn leads to hindsight bias.

Hindsight bias is helped along by the curse of knowledge: once an event takes place, we forget what we believed prior to that event. Consequently, the tendency to revise the history of one’s beliefs in light of what actually happened produces a robust cognitive illusion. This impacts decision makers, who after the fact get blamed for decisions that did not turn out well with comments of “…should have known that…”.

Part 4: Choices

Most people dislike risk. Given the choice between a lower-value sure thing and a gamble, most people will opt for the guaranteed return. This risk aversion can be reduced by thinking more broadly: looking at risk over many potential “bets” allows an assessment of aggregate wins, while looking at a single instance in isolation can paralyse us with fear of loss.
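
To see why broad framing helps, consider a hypothetical 50/50 bet: lose $100 or win $150 (the stakes are my illustrative assumption, not the book’s). A single play leaves a 50% chance of a net loss, but aggregating many independent plays makes an overall loss increasingly unlikely.

```python
# Broad vs narrow framing: probability of a net loss over n plays
# of a 50/50 bet that loses $100 or wins $150 (illustrative stakes).
from math import comb

def prob_net_loss(n_bets: int) -> float:
    """Exact binomial probability that total winnings end up negative."""
    total = 0.0
    for wins in range(n_bets + 1):
        payoff = wins * 150 - (n_bets - wins) * 100
        if payoff < 0:
            total += comb(n_bets, wins) * 0.5 ** n_bets
    return total

for n in (1, 10, 100):
    print(f"{n:>3} bets: P(net loss) = {prob_net_loss(n):.3f}")
# 1 bet ~ 0.500, 10 bets ~ 0.172, 100 bets ~ 0.018
```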

In the 18th century the mathematician Daniel Bernoulli devised utility theory, upon which expected utility calculations are built. However, these calculations do not factor in subjectivity: a million dollars is worth more to a poor person than to a rich person (presumably the same holds true for the great British pound).
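
Bernoulli’s specific proposal was that utility grows logarithmically with wealth, so the same absolute gain delivers far less utility the richer you are. A minimal sketch, with wealth levels chosen purely for illustration:

```python
# Bernoulli-style log utility: the same gain matters less to the rich.
import math

def utility_gain(wealth: float, gain: float) -> float:
    """Change in log-utility from adding `gain` to existing `wealth`."""
    return math.log(wealth + gain) - math.log(wealth)

million = 1_000_000
print(f"Modest wealth ($10k): {utility_gain(10_000, million):.2f}")       # ~4.62
print(f"Great wealth ($100M): {utility_gain(100_000_000, million):.4f}")  # ~0.01
```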

Utility theory can create theory-induced blindness. Once you have accepted a theory and used it as a mental model or thinking aid, it becomes difficult to see its flaws. Observations that don’t fit the model are explained away rather than taken as a suggestion that the model is incorrect or incomplete. Thus the model is not challenged.

Kahneman (with Amos Tversky) devised Prospect Theory – for which he won the Nobel Prize in Economic Sciences. It is based on three things, sketched numerically below:
1) The value of money is less important than the subjective experience of changes in one’s wealth
2) We experience diminishing sensitivity to changes in wealth
3) We are loath to lose money (loss aversion)
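
A small numeric sketch of these three points. The value function below uses the parameter estimates commonly cited from Tversky and Kahneman’s later experimental work (exponent ≈ 0.88, loss-aversion coefficient ≈ 2.25); treat the exact figures as illustrative.

```python
# Prospect-theory value function: outcomes are gains/losses relative
# to a reference point, not levels of total wealth.
def value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Subjective value of a change in wealth of x."""
    if x >= 0:
        return x ** alpha               # diminishing sensitivity to gains
    return -lam * (-x) ** alpha         # losses loom larger than gains

print(value(100))                # a $100 gain: ~57.5
print(value(-100))               # a $100 loss: ~-129.5 (hurts ~2.25x more)
print(value(200) - value(100))   # the second $100 adds less: ~48.4
```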

Loss aversion is a peculiar thing. People will work harder to avoid losses than to secure gains, which is not rational economic decision making. Golfers putt better when going for par (to avoid dropping a stroke) than when going for a birdie (to gain one). People work harder to avoid pain than to seek pleasure. This extends to pursuing goals: people will work hard to meet a goal, but then fail to sustain that effort once the goal is reached.

Part 5: Two Selves

We have an experiencing self and a remembering self, and the latter takes precedence. How an experience ends lingers in our memory and shapes our recollection of the whole: an amazing holiday with a terrible flight home will be remembered as negative. Our tastes and decisions are shaped by our memories, and our memories can be wrong.

We have strong preferences about our experiences of pain and pleasure: we want pain to be brief and pleasure to linger. In reality, however, we neglect duration in our assessments, and the mind plays tricks on us. Thanks to the peak-end rule and duration neglect, we are likely to make decisions counter to our expressed preferences.

It seems mad that people would willingly choose a longer episode of greater total pain that ends relatively pleasantly over a shorter episode of less overall discomfort that ends badly. Yet that is exactly the decision making that the peak-end rule and duration neglect induce.
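
A toy illustration of that trade-off, with made-up per-minute pain ratings loosely modelled on the cold-hand experiment described in the book:

```python
# Peak-end rule vs total experienced pain (ratings are assumptions).
short_episode = [8, 8, 8]         # intense pain that ends abruptly
long_episode = [8, 8, 8, 5, 4]    # the same pain plus a milder tail

def remembered(pain: list[int]) -> float:
    """Peak-end rule: memory ~ average of the worst and final moments."""
    return (max(pain) + pain[-1]) / 2

def experienced(pain: list[int]) -> int:
    """Total pain actually endured; here duration does matter."""
    return sum(pain)

for name, ep in [("short", short_episode), ("long", long_episode)]:
    print(f"{name}: experienced={experienced(ep)}, remembered={remembered(ep)}")
# The long episode contains strictly more pain yet is remembered as
# less bad, so people choose to repeat it.
```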
