Just about every human decision about the future is tainted by a gap — the difference between what we think we know and what we actually know. The more expert we are, the wider the gap is likely to be. The story below, an excerpt from the book Dance with Chance, is a classic example of an expert-busting enterprise. The experts in this case are highly sophisticated statisticians and professors who are using up-to-date models. Most of us assume that sophistication helps us understand the future. But in fact, sophistication makes things worse; it invites misplaced focus on the complicated. In the end, if we expect to make better decisions, we would do well to internalize the lesson in this passage. That doesn’t mean relying on common sense. It might mean looking for experts who understand the limits of what they know and the relative value of simpler methods.
— Nassim Nicholas Taleb
Excerpted from chapter 9 of Dance with Chance: Making Luck Work for You
As an expert in statistics, working in a business school during the 1970s, one of the authors…couldn’t fail to notice that executives were deeply preoccupied with forecasting. Their main interest lay in various types of business and economic data: the sales of their firm, its profits, exports, exchange rates, house prices, industrial output…and a host of other figures. It bugged the professor greatly that practitioners were making these predictions without recourse to the latest, most theoretically sophisticated methods developed by statisticians like himself. Instead, they preferred simpler techniques which — they said — allowed them to explain their forecasts more easily to senior management. The outraged author decided to teach them a lesson. He embarked on a research project that would demonstrate the superiority of the latest statistical techniques. Even if he couldn’t persuade business people to adopt his methods, at least he’d be able to prove the precise cost of their attempts to please the boss.
Every decent statistician knows the value of a good example, so the professor and his research assistant collected many sets of data recorded over time from a wide range of economic and business sources. In fact they hunted down 111 different time series, which they analyzed and used to make forecasts — a pretty impressive achievement given the computational requirements of the task back in the days when computers were no faster than today’s calculators. They decided to use their trawl of data to mimic, as far as possible, the real process of forecasting. To do so, each series was split into two parts: earlier data and later data. The researchers pretended that the later part hadn’t happened yet and proceeded to fit various forecasting techniques, both simple and statistically sophisticated, to the earlier data. Treating this earlier data as “the past,” they then used each technique to predict “the future,” whereupon they sat back and compared their “predictions” with what had actually happened.
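To make that protocol concrete, here is a minimal sketch in Python of the holdout exercise. The function names, and the choice of mean absolute percentage error as the scoring rule, are illustrative assumptions rather than details taken from the study:

```python
import numpy as np

def mape(actual, forecast):
    """Mean absolute percentage error: the average size of the forecast
    errors, expressed as a percentage of the actual values."""
    actual = np.asarray(actual, dtype=float)
    forecast = np.asarray(forecast, dtype=float)
    return 100.0 * np.mean(np.abs((actual - forecast) / actual))

def holdout_accuracy(series, horizon, forecaster):
    """Mimic the researchers' procedure on one time series: treat the
    last `horizon` observations as an unseen "future", fit the given
    forecasting method to the remaining "past", and score its forecasts
    against what actually happened."""
    past, future = series[:-horizon], series[-horizon:]
    forecasts = forecaster(past, horizon)
    return mape(future, forecasts)
```

Running every candidate method through the same routine on all 111 series, then averaging the scores, yields the kind of head-to-head comparison the chapter describes.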
Horror of horrors, the practitioners’ simple, boss-pleasing techniques turned out to be more accurate than the statisticians’ clever, statistically sophisticated methods. To be honest, neither was particularly great, but there was no doubt that the statisticians had served themselves a large portion of humble pie.
One of the simplest methods, known as “single exponential smoothing,” in fact turned out to be one of the most accurate. Indeed, it beat the so-called Box-Jenkins technique 61.8% of the time, even though Box-Jenkins represented the pinnacle of theoretically grounded statistical forecasting in the 1970s. Studies in the academic journals of the day had shown the Box-Jenkins method to be more accurate than large econometric models whose predictions were based on hundreds of equations and impressive volumes of data. So, by extension, single exponential smoothing was also more accurate than the grand-scale econometric models that cost hundreds of thousands of dollars to develop and use!
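For readers who have not met it, single exponential smoothing keeps a single “level” that is updated as a weighted average of the newest observation and the previous level, so older data are discounted geometrically; the forecast for every future period is simply the latest level. Below is a minimal sketch, where the smoothing weight alpha is an arbitrary illustration (in practice it would be tuned on the “past” portion of each series):

```python
def single_exponential_smoothing(past, horizon, alpha=0.3):
    """Forecast `horizon` steps ahead by single exponential smoothing.

    The level follows l_t = alpha * y_t + (1 - alpha) * l_{t-1}; because
    the method models neither trend nor seasonality, the forecast for
    every future step is simply the final level."""
    level = past[0]                  # initialize the level at the first observation
    for y in past[1:]:
        level = alpha * y + (1 - alpha) * level
    return [level] * horizon         # a flat forecast line
```

Plugged into the holdout sketch above, as in holdout_accuracy(series, 12, single_exponential_smoothing) with an arbitrary 12-step horizon, this simple method can be pitted against any more elaborate rival on exactly the same terms.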