Thinking, Fast and Slow
by Daniel Kahneman
Farrar, Straus & Giroux, 2011
The setting Daniel Kahneman has in mind for readers, he explains at the start of Thinking, Fast and Slow, is the proverbial watercooler. This, he hopes, is where his book will be useful. “The expectation of intelligent gossip is a powerful motive for serious self-criticism,” writes the psychologist and winner of the Nobel Memorial Prize in Economic Sciences, “more powerful than New Year resolutions to improve one’s decision making at work and at home.”
How’s that for a disarming invitation to a demanding subject? But then Kahneman, of all people, understands the importance of framing: If you want to elicit the right response, you need to set things up properly.
In this case, only one response is possible: unqualified delight. It’s thrilling to find that the world’s leading authority on how people make decisions is such a skillful and charming writer. The narrative in Thinking, Fast and Slow is simple and lucid: no Malcolm Gladwell–esque feints and swerves, no labored “writing for story.” Kahneman merely tells you about his work, surveys decades of psychological research on decision making in uncertain conditions, conveys a true understanding of his discipline, and keeps you engaged and enchanted throughout.
Kahneman’s main line of business — in partnership with the late Amos Tversky, to whom he pays moving tribute — has been to discover, test, and document systematic errors of judgment, or biases. (See “Daniel Kahneman: The Thought Leader Interview,” by Michael Schrage, s+b, Winter 2003.) There are more kinds than you might suppose, including the anchoring effect: If you put a number into somebody’s mind and then ask him or her a question that calls for a numerical answer, the initial number, however irrelevant, will bias the response.
Kahneman and Tversky demonstrated the anchoring effect with an experiment that used a wheel of fortune, marked from 0 to 100 but rigged to stop at either 10 or 65. They spun the wheel and asked the subjects, a group of students, to write down the number (10 or 65) and then answer two questions. First: “Is the percentage of African nations among U.N. members larger or smaller than the number you just wrote?” This forced the students to think about the number. Then: “What is your best guess of the percentage of African nations in the U.N.?” The students who saw 10 guessed 25 percent on average; those who saw 65 guessed 45 percent. (The correct figure is actually 28 percent.) This startling result has been confirmed in numerous subsequent psychological experiments.
The list of biases is long. To name just a few: confirmation, halo, resemblance, nonregression, availability, what-you-see-is-all-there-is, loss aversion, overconfidence, hindsight, duration neglect, base-rate neglect, and framing, which is the bias imparted by the way a statement is phrased. (“This ground beef is 90 percent fat-free”: It sounds good for you. “No, it’s 10 percent fat”: You wouldn’t want to eat that.) Just to see the human propensity for error mapped out would be fascinating in itself, but the book does more. It provides an encompassing framework for understanding these errors, and perhaps even for reducing them.
Kahneman urges us to think of the mind as run by two systems. System 1 is fast, intuitive, automatic, endlessly striving to build a coherent picture of reality, usually without our noticing. System 1 is miraculous — without it, you couldn’t walk and chew gum at the same time — but it plays fast and loose with the facts, and is the source of the heuristics, or rules of thumb, that lead us astray. System 2 is slow, effortful, deliberate, and can sometimes override System 1. But System 2, Kahneman explains, is lazier than we might wish, and System 1 can be very persuasive. The book’s larger theme is the interaction of the two systems.