To Err Is Human
A review of Thinking, Fast and Slow, by Daniel Kahneman.
Thinking, Fast and Slow
by Daniel Kahneman
Farrar, Straus and Giroux, 2011
The setting Daniel Kahneman has in mind for readers, he explains at the start of Thinking, Fast and Slow, is the proverbial watercooler. This, he hopes, is where his book will be useful. “The expectation of intelligent gossip is a powerful motive for serious self-criticism,” writes the psychologist and winner of a Nobel Prize in economic sciences, “more powerful than New Year resolutions to improve one’s decision making at work and at home.”
How’s that for a disarming invitation to a demanding subject? But then Kahneman, of all people, understands the importance of framing: If you want to elicit the right response, you need to set things up properly.
In this case, only one response is possible: unqualified delight. It’s thrilling to find that the world’s leading authority on how people make decisions is such a skillful and charming writer. The narrative in Thinking, Fast and Slow is simple and lucid: no Malcolm Gladwell–esque feints and swerves, no labored “writing for story.” Kahneman merely tells you about his work, surveys decades of psychological research on decision making in uncertain conditions, conveys a true understanding of his discipline, and keeps you engaged and enchanted throughout.
Kahneman’s main line of business — in partnership with the late Amos Tversky, to whom he pays moving tribute — has been to discover, test, and document systematic errors of judgment, or biases. (See “Daniel Kahneman: The Thought Leader Interview,” by Michael Schrage, s+b, Winter 2003.) There are more kinds than you might suppose, including the anchoring effect: If you put a number into somebody’s mind and then ask him or her a question that calls for a numerical answer, the initial number, however irrelevant, will bias the response.
Kahneman and Tversky demonstrated the anchoring effect with an experiment that used a wheel of fortune, marked from 0 to 100 but rigged to stop at either 10 or 65. They spun the wheel and asked their subjects, students, to write down the number and then answer two questions. First, “Is the percentage of African nations among U.N. members larger or smaller than the number you just wrote?” This forced the students to think about the number. Then, “What is your best guess of the percentage of African nations in the U.N.?” Students who saw 10 guessed 25 percent on average; students who saw 65 guessed 45 percent. (The correct figure is about 28 percent.) This startling result has been confirmed in numerous subsequent psychological experiments.
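The mechanics are easy to mimic. Here is a toy simulation, mine rather than anything from the book: each simulated respondent holds a noisy belief centered on the true value but reports a guess dragged partway toward whatever anchor the wheel showed. The anchor weight and noise level are invented for illustration, not fitted to Kahneman and Tversky’s data.

```python
import random

TRUE_VALUE = 28        # approximate percentage of African nations among U.N. members
ANCHOR_WEIGHT = 0.45   # hypothetical strength of the anchor's pull (invented)

def average_guess(anchor, n=10_000, seed=1):
    """Mean estimate across n simulated respondents shown the same anchor."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        belief = rng.gauss(TRUE_VALUE, 10)               # noisy private belief
        guess = (1 - ANCHOR_WEIGHT) * belief + ANCHOR_WEIGHT * anchor
        total += max(0.0, min(100.0, guess))             # clamp to a valid percentage
    return total / n

for anchor in (10, 65):
    print(f"anchor {anchor}: average guess ~ {average_guess(anchor):.0f}%")
```

Even this crude model reproduces the qualitative result: two groups asked the same question, differing only in an irrelevant number, end up with averages far apart.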
The list of biases is long. To name just a few: confirmation, halo, resemblance, nonregression, availability, what-you-see-is-all-there-is, loss aversion, overconfidence, hindsight, duration neglect, base-rate neglect, and framing, which is the bias imparted by the way a statement is phrased. (“This ground beef is 90 percent fat-free”: It’s good for you. “No, it’s 10 percent fat”: You wouldn’t want to eat that.) Just to see the human propensity for error mapped out would be fascinating in itself, but the book does more. It provides an encompassing framework for understanding these errors, and perhaps even for reducing them.
Kahneman urges us to think of the mind as run by two systems. System 1 is fast, intuitive, automatic, endlessly striving to build a coherent picture of reality, usually without our noticing. System 1 is miraculous — without it, you couldn’t walk and chew gum at the same time — but it plays fast and loose with the facts, and is the source of the heuristics, or rules of thumb, that lead us astray. System 2 is slow, effortful, deliberate, and can sometimes override System 1. But System 2, Kahneman explains, is lazier than we might wish, and System 1 can be very persuasive. The book’s larger theme is the interaction of the two systems.
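For readers who think in code, a loose analogy may help; it is mine, not Kahneman’s, though it borrows his own example of a System 2 task, the multiplication 17 × 24. System 1 always returns a fast, plausible-feeling answer; System 2 gets it right, but only when deliberately engaged.

```python
def system1_estimate(a, b):
    """Fast, automatic, good enough: round aggressively, then multiply."""
    return round(a, -1) * round(b, -1)

def system2_compute(a, b):
    """Slow and effortful: do the actual arithmetic."""
    return a * b

def answer(a, b, engage_system2=False):
    # System 1 fires by default; System 2 overrides only when engaged.
    return system2_compute(a, b) if engage_system2 else system1_estimate(a, b)

print(answer(17, 24))                       # 400: instant, and feels right
print(answer(17, 24, engage_system2=True))  # 408: correct, but it took work
```

The analogy is crude, but it captures the book’s central dynamic: the cheap path is the default, and accuracy has to be asked for.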
When I said that this book is an unqualified delight, that was coming from System 1. System 2 tells me to lodge a couple of reservations. Kahneman claims too much for the practical utility of this endlessly fascinating work. Yes, in theory there are opportunities for policy intervention to improve decision making — but remember that if citizens are subject to these biases, so are their rulers. He briefly asserts that organizations are less susceptible to biases than individuals, but doesn’t go into detail. I’m not so sure. In my own experience as a company (and civil service) man, for every bias corrected by thinking in groups, another gets invented or amplified by groupthink.
Always affable and open-minded, Kahneman is nonetheless unfair, I think, to orthodox economists and their “rational agent” models of behavior. He accuses economists of believing, despite all the evidence he and others have gathered, that people are unfailingly rational. Surely few economists actually believe that. Perfect rationality is not an empirical claim; it is a simplifying assumption whose justification is that it yields useful models. Theories with more sophisticated psychological underpinnings would be great, so long as they can meet that same test. Whether behavioral economics — the field Kahneman and Tversky opened up — can do this is still an open question.
But System 2 always wants to spoil things. Thinking, Fast and Slow is a simply marvelous book. Let’s leave it at that.
Author profile:
- Clive Crook is a senior editor of the Atlantic and a columnist and member of the editorial board at Bloomberg View. His essay on the best business books about the 2008 financial meltdown appeared in the Winter 2009 issue of s+b.