
Daniel Kahneman: The Thought Leader interview

The Nobel Prize–winning psychologist parses the roles of emotion, cognition, and perception in the understanding of business risk.

(originally published by Booz & Company)

 

A bat and a ball cost $1.10 in total. The bat costs $1 more than the ball. How much does the ball cost?
Almost everyone feels the temptation to answer “10 cents” because the sum $1.10 so neatly separates into $1 and 10 cents, and 10 cents seems the right price for a ball (small and light) relative to a bat (big and heavy). In fact, more than half of a group of students at Princeton and at the University of Michigan gave precisely that answer — that wrong answer.

The right answer is: The ball costs a nickel.
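A one-line check shows why. If the ball costs x, the bat costs x + $1, so x + (x + $1) = $1.10 and x = $0.05. The short snippet below is our illustration, not Professor Kahneman's; it spells out the System 2 computation that the intuitive answer skips:

```python
# A minimal sketch of the bat-and-ball check: the intuitive answer
# fails the constraint; the correct one satisfies it.

def total_is_right(ball: float) -> bool:
    bat = ball + 1.00                        # the bat costs $1 more than the ball
    return abs((bat + ball) - 1.10) < 1e-9   # do the two total $1.10?

print(total_is_right(0.10))  # False: $0.10 + $1.10 = $1.20
print(total_is_right(0.05))  # True:  $0.05 + $1.05 = $1.10
```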

“Clearly, these respondents offered their responses without first checking,” observes Daniel Kahneman, the Eugene Higgins Professor of Psychology and a professor of public affairs in the Woodrow Wilson School of Public and International Affairs at Princeton University, and the winner of the 2002 Nobel Memorial Prize in Economics. “People are not accustomed to thinking hard and are often content to trust a plausible judgment that comes quickly to mind.”

You might choose to dismiss the baseball query as a trick question. But the pathological mistakes and persistent miscalculations smart people make when they're making up their minds are at the core of Professor Kahneman's path-breaking research. With his late collaborator Amos Tversky of Stanford University, Professor Kahneman completely reframed how economics and finance define and measure rational behavior. Their provocative thinking about thinking, and their simple yet remarkably powerful experiments, have revealed the quirks, logical inconsistencies, and flaws in human decision making that represent the rule rather than the exception in cognitive processing.

Prospect Theory — the researchers’ empirical exploration of risk assessment, loss aversion, and reference dependence — explains why individuals consistently behave in ways that traditional economic theory, predicated on the optimization of individual self-interest, would not predict. This work directly spawned the controversial and exciting field of behavioral finance. Championed by economists such as the University of Chicago’s Richard Thaler and Yale’s Robert Shiller, author of Irrational Exuberance (Princeton University Press, 2000), behavioral finance defies the rational investor/random walk algorithms of market analysis in favor of models of judgment under uncertainty.

Research undertaken decades ago by Professor Kahneman, Professor Tversky, and their intellectual allies now influences hundreds of billions of dollars put into corporate investments worldwide. Their insights into the nature of human judgment have prompted fundamental reevaluation of how individuals spend their time, their money, and their thought. It’s hard for an intellectually honest person to read a paper by Professor Kahneman without feeling a shock of recognition, self-consciousness, and concern for the dysfunctions in his or her own thought processes. Professor Kahneman’s work invites — and occasionally demands — serious introspection by executives who profess to care about the quality and consistency of their decisions.

While graciously crediting collaborators and colleagues, Professor Kahneman has strong personal perspectives about how the discipline he helped create has evolved. The challenge of how — and where — psychologists should draw the lines between intuition, cognition, and emotion is one that clearly haunts his thoughts about thinking.

Professor Kahneman talked with strategy+business over coffee in Cambridge, Mass.

S+B: In your classic work on inconsistencies in individual decision making, the focus seemed to be on the fact that people make irrational choices even when they have pretty good information.

KAHNEMAN: When you are interpreting old results or old thoughts, you have to think what was in the background of the scientific conversation at the time. And at that time, in the 1970s, irrationality was really identified with emotionality. It was also obvious that a lot of explicit reasoning goes on: It was absolutely clear to us that people can compute their way out of some things. But we were interested in what comes to mind spontaneously. That led to the two-system theory.

S+B: Can you describe the two-system theory?

KAHNEMAN: Many of us who study the subject think that there are two thinking systems, which actually have two very different characteristics. You can call them intuition and reasoning, although some of us label them System 1 and System 2. There are some thoughts that come to mind on their own; most thinking is really like that, most of the time. That's System 1. It's not like we're on automatic pilot, but we respond to the world in ways that we're not conscious of, that we don't control. The operations of System 1 are fast, effortless, associative, and often emotionally charged; they're also governed by habit, so they're difficult either to modify or to control.

There is another system, System 2, which is the reasoning system. It’s conscious, it’s deliberate; it’s slower, serial, effortful, and deliberately controlled, but it can follow rules. The difference in effort provides the most useful indicator of whether a given mental process should be assigned to System 1 or System 2.

S+B: How did you begin your research into the two systems?

KAHNEMAN: In our first paper, Tversky and I did a study of the statistical thinking of professional statisticians when they're thinking informally. We found what we called the Law of Small Numbers, a term we coined in 1971 to describe how people exaggerate the degree to which the probability distribution in a small sample will closely resemble the probability distribution in the overall population. We also found that people, even experienced statisticians, do not apply rules they're aware of when guessing the probability of statistical outcomes.
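The effect is easy to demonstrate. The simulation below is our illustration of the idea, not an experiment from the 1971 paper; it shows how often small samples of a fair coin look strongly biased:

```python
# A minimal illustration (not from the 1971 paper): small samples
# stray from the population far more often than intuition expects.
import random

def extreme_share(n: int, trials: int = 10_000) -> float:
    """Fraction of n-flip samples of a fair coin showing 70%+ heads."""
    hits = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(n))
        if heads / n >= 0.7:
            hits += 1
    return hits / trials

for n in (10, 100, 1000):
    print(n, extreme_share(n))
# Typical output: roughly 17% of 10-flip samples look heavily biased,
# while 100- and 1,000-flip samples essentially never do.
```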

S+B: So even “good” statisticians can be “bad” statisticians.

KAHNEMAN: That’s right. When they’re not computing seriously in System 2 mode, they rely on their intuitions for the kind of simple problems we gave them. We were hoping that, where things really mattered, they would replace their intuitions with computations. Yet what was striking to us was that even people who should know better were making those mistakes.

What’s Risky

S+B: Has your perception of risk and the meaning of risk evolved or changed since you began doing this work?

KAHNEMAN: The perception of and reaction to risk previously had been seen as emotional.

S+B: Not just seen as emotional; dismissed as emotional.

KAHNEMAN: Yes, exactly right. Our innovation was that we identified some categories of risk that were the result of certain cognitive illusions. That was a novelty and that got people excited. But it's only part of the picture. There is an alternative way of looking at this that is becoming much more fashionable. There's a paper that I really like a lot. The title of it says the whole story: "Risk as Feelings." The idea is that the first thing that happens to you is you're afraid, and from your fear you feel risk. So the view of risk is becoming less cognitive.

S+B: So it’s not that generalized emotion influences decision making. It’s that one emotion — fear — distorts the perception of risk and introduces error into decision making.

KAHNEMAN: What actually happens with fear is that probability doesn’t matter very much. That is, once I have raised the possibility that something terrible can happen to your child, even though the possibility is remote, you may find it very difficult to think of anything else.

S+B: It’s like a Lorenzian imprinting of goslings: The phenomenon of fear imprints on a decision maker.

KAHNEMAN: Emotion becomes dominant. And emotion is dominated primarily by the possibility, by what might happen, and not so much by the probability. The more emotional the event is, the less sensible people are. So there is a big gap.

S+B: You’re saying that the shadow cast by a worst case overwhelms probabilistic assessment?

KAHNEMAN: We say that people overweight the low probability. But the prospect of the worst case has so much more emotional oomph behind it.
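Prospect Theory later made this overweighting precise with a probability weighting function. The functional form and the parameter in the sketch below come from Tversky and Kahneman's 1992 cumulative prospect theory paper; the code itself is ours, for illustration:

```python
# Probability weighting from cumulative prospect theory
# (Tversky & Kahneman, 1992): w(p) = p^g / (p^g + (1-p)^g)^(1/g),
# where g ~= 0.61 is their median estimate for gains.

def weight(p: float, g: float = 0.61) -> float:
    return p**g / (p**g + (1 - p)**g) ** (1 / g)

print(round(weight(0.01), 3))  # ~0.055: a 1% chance "feels" like 5.5%
print(round(weight(0.99), 3))  # ~0.912: near-certainty is underweighted
```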

S+B: So even experts make cognitive mistakes. But experts and executives in organizations don’t make decisions in isolation. They make decisions in meetings and committees and groups. Do we have the counterpart of System 1 and System 2 thinking in groups as well as individuals?

KAHNEMAN: We know a lot about the conditions under which groups work well and work poorly. It’s really clear that groups are superior to individuals in recognizing an answer as correct when it comes up. But when everybody in a group is susceptible to similar biases, groups are inferior to individuals, because groups tend to be more extreme than individuals.

S+B: So it’s a positive feedback loop. In a group, you get an amplification of the extremes.

KAHNEMAN: You get polarization in groups. In many situations you have a risk-taking phenomenon called the risky shift. That is, groups tend to take on more risk than individuals.

We looked at similar phenomena in juries. You collect judgments from the individual members, then you have them deliberate, and then you compare the group's verdict to the median judgment of the individuals. It's straightforward what happens: The group becomes more extreme than the individuals.

S+B: Why does this occur?

KAHNEMAN: One of the major biases in risky decision making is optimism. Optimism is a source of high-risk thinking. Groups tend to be quite optimistic. Furthermore, doubts are suppressed by groups. You can imagine the White House deciding on Iraq. That’s a situation where it’s easy for somebody in the administration to think, “This is terrible.” It’s equally easy to understand how someone like that would suppress himself. There is a tendency and the incentive to support the group. That underlies the whole class of phenomena that go by the label of groupthink.

S+B: What about decisions relating to asset allocation and financial investments? That strikes me as a perfect playground for some of these ideas on the relative sobriety of the individual versus the extremism of the group — such as the corporate investment committee.

KAHNEMAN: No, no. They are not the same dynamics. When you’re looking at the market and investment committees, you’re really talking about the dynamics of competing individuals. We really should separate those cases.

S+B: But I’m looking at the management committee, I’m looking at the jury, and I’m looking at the investment committee, and they all seem to be weighing evidence and evaluating risk. Their similarities seem to outweigh their differences.

KAHNEMAN: I’m a psychologist, so I start at the individual level and I look at individual-level biases or errors. Then I look at the group and I say, What happens in the group? How is the group structured? What are the incentives? What do people do to each other in the group situation that would either mitigate or exacerbate risks? Then there are market things where people respond to each other.

S+B: What you’re describing is an internal marketplace where groups come to a consensus about — or at least some sort of agreement about — the risks and rewards associated with their decisions.

KAHNEMAN: That’s correct. But remember that the internal incentives that shape how the group perceives risks and rewards may be very different from the reality of the risks and rewards in the external marketplace. Those incentives can distort risk perception.

S+B: Do you think the dysfunctions of group decision making are worse than the cognitive dysfunctions of individual decision makers?

KAHNEMAN: That depends on the nature of the decision, the individuals, and the groups. But I strongly believe that both individuals and groups need mechanisms to review how their decisions are made. This is particularly important for organizations that have to make many significant decisions in a short amount of time.

Business Decisions

S+B: How much interaction have you had with business leaders about this? Do you get senior executives asking, Look, we make a lot of decisions, we have to assess risk. What insights can you give us into how to do better risk assessment as individuals and as groups? Are there ways for us to become aware of our biases, either by setting up checklists or learning how to frame things better?

KAHNEMAN: I’m very impressed, actually, by the combination of curiosity and resistance that I encounter. The thing that astonishes me when I talk to businesspeople in the context of decision analysis is that you have an organization that’s making lots of decisions and they’re not keeping track. They’re not trying to learn from their own mistakes; they’re not investing the smallest amount in trying to actually figure out what they’ve done wrong. And that’s not an accident: They don’t want to know.

So there is a lot of curiosity, and I get invited to give lots of talks. But the idea that you might want to appoint somebody to keep statistics on the decisions that you made and a few years later evaluate the biases, the errors, the forecasts that were wrong, the factors that were misjudged, in order to make the process more rational — they won’t want to do it.
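What might such bookkeeping look like in practice? A minimal sketch, with invented decisions and numbers, would record each forecast alongside its eventual outcome and score the gap:

```python
# A hypothetical sketch of the decision ledger Kahneman describes:
# record each forecast with a probability, then score it once the
# outcome is known. All names and numbers here are invented.
from dataclasses import dataclass

@dataclass
class Forecast:
    decision: str
    p_success: float   # probability assigned at decision time
    succeeded: bool    # filled in when the outcome is known

ledger = [
    Forecast("acquire supplier", 0.80, False),
    Forecast("launch in Brazil", 0.60, True),
    Forecast("cut product line", 0.90, True),
]

# Brier score: mean squared gap between forecast and outcome.
# 0 is perfect; always guessing 50% would score 0.25.
brier = sum((f.p_success - f.succeeded) ** 2 for f in ledger) / len(ledger)
print(round(brier, 3))  # 0.27 for the invented entries above
```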

S+B: Are people introspection-averse, or are they risk-averse? You're a psychologist; you say your unit of analysis is the individual. Why don't individuals want to know? People look at mirrors.

KAHNEMAN: But when they have made a decision, people don't even keep track of having made the decision or forecast. I mean, the thing that is absolutely the most striking is how seldom people change their minds. First, we're not aware of changing our minds even when we do change them. And most people, after they change their minds, reconstruct their past opinion: once persuaded of something, they believe they always thought that, and they underestimate how much their minds have changed. There's very good research on that.

S+B: We’ve just lived through one of the biggest bubbles in history. We both know people who said, “I put money in dot-coms or telecoms at their peak. What was I thinking?”

KAHNEMAN: Oh, many people will admit that they made a mistake. But that doesn't mean that they've changed their mind about anything in particular. It doesn't mean that they are now able to avoid that mistake.

S+B: So your bet, based on your study of how individuals and groups make decisions, is that the stock market bust is not going to fundamentally change how people think about risk.

KAHNEMAN: For a long time it’s going to have the effect of people getting burned by a stove. There’s going to be an effect at the emotional level, and it could last for a while.

S+B: But their minds haven't changed. So you think it's an emotional phenomenon, a System 1 response?

KAHNEMAN: I think that is entirely based on emotion.

S+B: Do we want to use Freudian, self-destructive explanations for why people rely on flawed intuitions in making decisions, rather than on their statistical expertise?

KAHNEMAN: Oh, no, God forbid!

S+B: Well, how about using evolutionary psychology? Maybe it makes sense that humans have evolved a cognitive bias toward drawing inferences from small numbers.

KAHNEMAN: You can always find an evolutionary explanation for anything. But the question is whether it's functional, which is not the same as being evolutionary. There might be some environment in which it's dysfunctional, but mainly it's inevitable.

But, you know, there's also the issue of perception, which links to intuition. Perception evolved differently from either intuition or cognition.

S+B: Now it seems like we’re dividing decision making into three systems: there’s the emotional stuff; there’s the rational-computational system; but there’s also a perceptual system.

KAHNEMAN: Yes, I think of three systems. In my current perspective, the question I ask is, What makes thoughts come to mind? And some thoughts come to mind much more easily than others; some really take hard work; some come to mind when you don’t want them.

Decision Analysis

S+B: When you began your research in the psychology of decision making, the business world was intent on making the managerial decision process as rational as humanly possible.

KAHNEMAN: The rational model is one in which the beliefs and the desires are supposed to be determined. We were real believers in decision analysis 30 years ago, and now we must admit that decision analysis hasn’t held up.

S+B: Didn’t your own research help kill it? The essence of your work seems to be the ongoing tensions and contradictions between System 1 and System 2 thinking. That makes it almost impossible for rational System 2 thinking to win out.

KAHNEMAN: That’s not quite true. Our research doesn’t say that decision makers can’t be rational or won’t be rational. It says that even people who are explicitly trained to bring System 2 thinking to problems don’t do so, even when they know they should.

S+B: Howard Raiffa, a father of formal decision analysis, basically recanted his original work in the 50th-anniversary issue of Operations Research. He argued that decision analysis didn't have nearly the impact he felt it could have had on managerial thinking.

KAHNEMAN: And I think it’s very clear why that happened, but it was not clear then.

S+B: Does this obviate all the decision analysis courses — all the drawing of decision trees — that students take in graduate business programs?

KAHNEMAN: It doesn’t mean you shouldn’t take decision analysis. It just means that decision analysts are not going to control the world, because the decision makers, the people who are in charge, do not want to relinquish the intelligence function to somebody else. After all, in principle, under decision analysis, there would be somebody generating probabilities, and the decision makers would look at the trade-offs and decide about the assignment of utilities. In addition, the decision maker would have a managerial function, to ensure that the whole thing is done right. And that is absolutely not the way it is. Decision makers don’t like decision analysis because it is based on that idea that decision making is a choice between gambles.

S+B: That’s a wonderful phrase, “choice between gambles.” Is it more important to influence the choice between gambles, or to make a choice between gambles?

KAHNEMAN: I think decision makers, in business and elsewhere, just reject the metaphor altogether. Managers think of themselves as captains of a ship on a stormy sea. Risk for them is danger, but they are fighting it, very controlled. The idea that you are gambling is an admission that at a certain point you have lost control, and you have no control beyond a certain point. This is abhorrent to decision makers; they reject that. And that's why they reject decision analysis.
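To see what is being rejected, consider the gamble framing in miniature. The sketch below, with invented payoffs and probabilities, reduces a choice to exactly the comparison decision analysis prescribes:

```python
# A hypothetical sketch of decision analysis's core move: each option
# is a gamble, i.e., outcomes with probabilities, and the analyst
# compares expected values. The numbers are invented for illustration.

launch = [(0.3, 50.0), (0.7, -10.0)]  # 30% chance of +50, 70% chance of -10
hold   = [(1.0, 2.0)]                 # the safe alternative: +2 for sure

def expected_value(gamble):
    return sum(p * payoff for p, payoff in gamble)

for name, gamble in [("launch", launch), ("hold", hold)]:
    print(name, expected_value(gamble))
# launch: 0.3*50 + 0.7*(-10) = 8.0; hold: 2.0. The analysis favors the
# launch gamble, and it is precisely this framing of the decision as a
# gamble that Kahneman says managers find abhorrent.
```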

S+B: So what should we do instead?

KAHNEMAN: That’s why you ought to think of systems. There are ways of thinking about a problem that are better than others. But I admit I’m less optimistic than I was before.

S+B: Because?

KAHNEMAN: Because I don’t think that System 1 is very educable. And System 2 is slow and laborious, and just basically less significant, less in control than it thinks it is.

S+B: What is it that you would most like senior managers who have influence over people’s lives and money to understand about your work?

KAHNEMAN: If I had one wish, it is to see organizations dedicating some effort to study their own decision processes and their own mistakes, and to keep track so as to learn from those mistakes. I think this isn’t happening. I can see a lot of factors acting against the possibility of that happening. But if I had to pick one thing, that would be it.

Reprint No. 03409

Author profile:


Michael Schrage (schrage@media.mit.edu) is codirector of the MIT Media Lab’s e-Markets Initiative and a senior adviser to the MIT Security Studies program. Mr. Schrage is the author of Serious Play: How the World’s Best Companies Simulate to Innovate (Harvard Business School Press, 2000).

 
