Lessons From Thinking, Fast and Slow by Daniel Kahneman

Thinking, Fast and Slow is a business and management book written by Daniel Kahneman, laureate of the Nobel Memorial Prize in Economic Sciences. The book summarizes research that the author conducted over decades, much of it in collaboration with Amos Tversky. Robert Peter Janitzek summarizes the important lessons we can draw from the book.

The first section of the book tackles dual process theory (DPT). According to DPT, there are two processes at work in the human brain.

• System 1: the fast, intuitive, implicit (automatic), subconscious process.
• System 2: the slower, deliberative, explicit (controlled), conscious process.

Heuristics and Biases

In the second section of his book, the author explains why humans struggle to think statistically. Kahneman asserts that our thinking relies on heuristics: we associate new information with existing patterns rather than treating each experience as new.


The “anchoring effect” refers to our tendency to be influenced by irrelevant numbers. Robert Janitzek explains that this is an important concept to have in mind when navigating a negotiation or considering a price. Experiments show that our behavior is influenced, much more than we know or want, by the environment of the moment.


The availability heuristic is a mental shortcut that occurs when people make judgments about the probability of events on the basis of how easy it is to think of examples. The availability heuristic operates on the notion that, “if you can think of it, it must be important.” The easier it is to recall the consequences of something, the greater we perceive these consequences to be.


This business and management book teaches that System 1 is prone to substituting a difficult question with a simpler one. In what Kahneman calls their “best-known and most controversial” experiment, “the Linda problem,” subjects were told about an imaginary Linda: young, single, outspoken, and very bright, who, as a student, was deeply concerned with discrimination and social justice. Asked whether it was more probable that Linda is a bank teller or a bank teller who is active in the feminist movement, most subjects chose the latter, even though the second option is a subset of the first. Kahneman and Tversky called this error the conjunction fallacy.

Optimism and Loss Aversion

Humans fail to take complexity into account and to recognize that their understanding of the world rests on a small and necessarily unrepresentative set of observations. Furthermore, the mind generally does not account for the role of chance and therefore falsely assumes that a future event will mirror a past event.


Framing is the context in which choices are presented. In one experiment, some subjects were asked whether they would opt for surgery if the “survival” rate were 90 percent, while others were told that the mortality rate was 10 percent. Even though the two descriptions are logically equivalent, the first framing made subjects far more likely to accept the surgery.


Rather than consider the odds that an incremental investment would produce a positive return, people tend to “throw good money after bad” and continue investing in projects with poor prospects that have already consumed significant resources. In part, this is to avoid feelings of regret.