A new book argues that the dangers of risk aversion often outweigh the risk of making mistakes.
In 1934, Max Wertheimer, a pioneer of Gestalt psychology, decided to see if he could stump his pen pal Albert Einstein with a math problem. In a letter, he wrote:
“An old clattery auto is to drive a stretch of two miles, up and down a hill, /\. Because it is old, it cannot drive the first mile—the ascent—faster than with an average speed of 15 miles per hour. Question: How fast does it have to drive the second mile—on going down, it can, of course, go faster—in order to obtain an average speed (for the whole distance) of 30 miles an hour?”
Being a math wizard who can cipher at lightning speed, I immediately had the answer: the downhill mile would need to be driven at 45 miles per hour (mph). That’s wrong, of course. Averaging 30 mph over the entire two-mile run allows just four minutes, but the car has already spent four minutes traveling the first mile at 15 mph. By the time it reaches the top of the hill, averaging 30 mph over the whole run is impossible, unless some of Einstein’s time-warping ideas have been embedded in the car’s design.
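For readers who want to see the arithmetic laid bare, here is a minimal sketch of the time budget (the variable names are my own, purely illustrative):

```python
# Wertheimer's puzzle: average 30 mph over a 2-mile course,
# when the first (uphill) mile is driven at only 15 mph.

total_distance = 2.0   # miles, up and down
target_avg = 30.0      # mph desired for the whole trip
uphill_speed = 15.0    # mph on the first mile

time_budget = total_distance / target_avg   # hours allowed in total
time_uphill = 1.0 / uphill_speed            # hours spent on the ascent
time_left = time_budget - time_uphill       # hours left for the descent

print(time_budget * 60)   # 4.0 minutes allowed for the whole run
print(time_uphill * 60)   # 4.0 minutes already spent going up
print(time_left * 60)     # 0.0 -- no time left for the way down
```

With zero time remaining, no finite downhill speed can rescue the 30 mph average, which is exactly what Einstein noticed only after calculating.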
But I feel a little better knowing that the problem initially stumped Einstein, too. As Gerd Gigerenzer, director of the Max Planck Institute for Human Development, tells the story in his new book, Risk Savvy: How to Make Good Decisions (Viking, 2014), “[Einstein] confessed to having fallen for this problem to his friend: ‘Not until calculating did I notice that there was no time left for the way down!’” Gigerenzer uses the anecdote to illustrate an undeniable reality: We all make mistakes, even bona fide geniuses.
Indeed, when it comes to decision making and risk, the real problem isn’t so much making mistakes, but rather the fear of making mistakes. “Risk aversion is closely tied to the anxiety of making errors,” Gigerenzer writes.
When that anxiety is embedded in an organization’s culture, it promotes “defensive decision making”—decisions that seem to offer protection against negative consequences, but can result in suboptimal outcomes and greater risk exposure. A common example offered by Gigerenzer is hiring a large national vendor with a well-known name even though a smaller, local vendor would provide better prices and better service. Just because “nobody ever got fired for buying IBM” (as the old IT axiom went), doesn’t necessarily mean that buying IBM was the best decision for the buyer.
I asked Gigerenzer how you can tell if your company has a “negative error culture” that’s spawning defensive decision making. “If the leadership in an organization pretends that errors will never occur; if it tries to hide mistakes when they do occur; or if it looks for someone to blame when they can’t hide mistakes, you can bet that you’ve found a negative error culture,” he replied.
Echoing what several other business book authors—including Tim Harford, Megan McArdle, and Ralph Heath—have argued in recent years, Gigerenzer recommends that companies, as well as individual professionals, reframe how they view errors. He points to the commercial aviation sector as a case in point. The large-scale tragedies that can result when mistakes are made in-flight have forced the industry, and its regulators, to thoroughly examine every error, using a rigorous and transparent process of analysis and response, often in full public view. Increasingly, the industry is also working proactively to identify potential errors and prevent them. This is a major reason why air travel is the safest form of transportation.
Not all industries require such an intense focus on mistakes. But every company can benefit from what Gigerenzer calls a “positive error culture.” Such a culture doesn’t set out to make mistakes, or even welcome them. But when errors do occur, they aren’t swept under the rug. Instead, they’re treated as valuable learning opportunities that help companies avoid repeating similar mistakes in the future.