Group dynamics can pose a different sort of challenge in bringing a team together; people vary in their styles and assertiveness. The most vocal or most senior person, rather than the person with the keenest sense of possibilities, might dominate the discussion and overly influence the consensus. This pattern appears in a host of classroom simulations based on wildfires, plane crashes, and boat wrecks, all of which place teams in a simulated high-pressure situation where collective insight should help. Typically, a dominant personality steps forward and drives the process toward his or her predetermined view, making little or no use of the wisdom of the crowd. In The Drunkard’s Walk: How Randomness Rules Our Lives (Pantheon, 2008), physicist and writer Leonard Mlodinow describes a number of research studies showing that most people place too much confidence in the most senior or highest-paid person. Does that sound like your executive team?
Culture and Capability
To become proficient at forecasting, a company must develop capabilities both for achieving insight and for converting that insight into effective decision making. The firm need not seek out a star forecaster; instead, it should invest in cultivating an open atmosphere of dialogue and scrutiny about uncertainty, one that brings to the fore a more complete picture of the expert knowledge that already resides in many of its existing employees.
The resulting culture will be one in which managers recognize and deal with uncertainty more easily; they won’t feel they have to resort to the extreme of either throwing up their hands in despair or pretending that they have all the answers.
In the end, overcoming the problems and traps in forecasting probably requires using all of these approaches together, within a supportive culture. An example of how difficult this is can be found at the U.S. National Aeronautics and Space Administration (NASA), which probably contains as analytically rigorous a set of people as can be found in any single organization.
The disintegration of the space shuttle Columbia on reentry in 2003, during its 28th mission, demonstrates how culture can overrule capability. After problems during the shuttle’s launch, NASA engineers developed extensive models for a wide range of scenarios, including the possibility that pieces of foam had struck the wing, the event ultimately deemed responsible for the accident. But rather than focusing on contingency plans for dealing with an issue that was known but whose impact was not, NASA officials placed too much faith in their mathematical models, which suggested that the wing had not sustained a dangerous degree of damage. The results were catastrophic.
Less than a month after the Columbia disaster, this pervasive cultural problem at NASA was described in an article in the New York Times that quoted Carnegie Mellon University professor Paul Fischbeck. (Fischbeck, an expert on decision making and public policy, had also been the coauthor of a 1990 NASA study on the 1986 Challenger explosion caused by an O-ring failure at cold temperatures.) “They had a model that predicted how much damage would be done,” he said, “but they discounted it, so they didn’t look beyond it. They didn’t seriously consider any of the outcomes beyond minor tile damage.” In other words, even NASA’s brilliant rocket scientists couldn’t outsmart their own inherent biases. They needed processes and practices to force them to do so.
And so, probably, does your company. Too many managers dismiss the inherent uncertainty in the world and therefore fail to consider improbable outcomes or to invest sufficient effort in contingency plans. The world is full of unknowns, including rare and difficult-to-predict “black swan” events, to use the term coined by trader, professor, and best-selling writer Nassim Nicholas Taleb. Overreliant on either their intuition or their mathematical models, companies can become complacent about the future.