Bottom Line: When it comes to major projects, managers should consider taking the outside view, in which project forecasts are based on objective, historical data about similar initiatives, rather than the traditional insider mind-set of viewing complex ventures through their own unique, highly specific lens.
Ah, humanity—always looking on the bright side of things. In pivotal work published in 1979, Daniel Kahneman and Amos Tversky—whose research later earned Kahneman the Nobel Memorial Prize in Economic Sciences—argued that people systematically underestimate the costs, completion time, and risks of their actions when making decisions and predictions. According to this so-called planning fallacy, people naturally adopt an inside view that stresses the specific challenges associated with a given project, rather than drawing on past experience of similar projects—whether a term paper or a DIY assembly job—to reach a realistic estimate of the time and resources needed to finish.
Although the theory was not developed with business leaders in mind, most people and companies—including project managers, cost engineers, and risk-assessment departments—use this traditional, intuitive approach. As a result, previous studies have observed that the cost-benefit analyses, business models, and environmental impact assessments that typically support big decisions are often highly skewed compared with the actual costs and benefits incurred.
But there is a solution. Kahneman and Tversky advised taking the outside view, which consists of using the average outcomes of previous similar ventures to inform forecasts and decisions. Now, a new study applies this practice to a recent multibillion-dollar rail project—dubbed the “A-Train” to keep things confidential—to demonstrate how gathering simple statistics about similar ventures provides a wildly different assessment from the official forecast. And in the case of the A-Train, the yawning discrepancy between the inside and outside perspectives was enough to turn off a major investor.
Forecasting is especially crucial, but rarely accurate, for complex projects. Consider Boston’s Big Dig, a project to reroute a highway and construct a tunnel under the city. It was finished nearly a decade late, with cost overruns of almost US$12 billion, not including interest. Consider also the excessively enthusiastic prediction for Bangkok’s $2 billion Skytrain, which led to the construction of an oversized system: fewer than half the expected passengers, station platforms and terminals larger than needed, and unused trains cluttering the garages. Accordingly, the author of this paper—responding to a request for an appraisal from a potential investor with similar concerns about the public–private A-Train project—applied the outside view to the demand forecast, although he notes that the underlying issues and methodology could apply to a wide range of projects.
The author looked at 475 transportation infrastructure ventures, 62 of which were rail projects comparable to the A-Train. In forecasts for 53 of these 62 rail projects, the prediction for demand was significantly overestimated. On average, rider demand forecasts were 69 percent higher than the number of riders who actually hopped on the trains.
Applying the average rail case as a benchmark against the A-Train’s plan, the author calculated that demand would be about 59 percent of the official forecast (if forecasts in the benchmark class ran 69 percent above actual ridership, actual ridership averages roughly 1/1.69, or 59 percent, of forecast). He therefore expected 8.3 million passengers to ride the train in the first year instead of the 14.1 million projected by the project’s leaders. Having 5.8 million fewer passengers would result in excess capacity on the trains and huge losses on the balance sheet, the author writes.
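The benchmark adjustment above is simple arithmetic, and can be sketched as follows. The figures (a 69 percent average overestimate, a 14.1 million-rider official forecast) come from the article; the function name and structure are illustrative, not taken from the paper itself.

```python
# Hedged sketch of the outside-view (reference-class) adjustment described
# in the article. Figures are from the text; names are illustrative.

def outside_view_demand(inside_forecast: float, avg_overestimate: float) -> float:
    """Deflate an inside-view forecast by the benchmark class's average
    overestimation (0.69 means forecasts averaged 69% above actual)."""
    return inside_forecast / (1 + avg_overestimate)

inside_forecast_m = 14.1   # official first-year forecast, millions of riders
avg_overestimate = 0.69    # comparable rail projects: forecasts 69% too high

adjusted = outside_view_demand(inside_forecast_m, avg_overestimate)
ratio = adjusted / inside_forecast_m

print(f"Outside-view estimate: {adjusted:.1f}M riders ({ratio:.0%} of forecast)")
# 14.1 / 1.69 ≈ 8.3 million riders, about 59% of the official forecast
```

This reproduces the article’s numbers: an outside-view estimate of roughly 8.3 million first-year riders, about 59 percent of the 14.1 million officially forecast.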
Delving deeper, the author calculated that the risk of a ridership shortfall of 15 percent or more was 16 times higher in his analysis based on the benchmark cases than in the A-Train’s official forecast. He notes that although the consultants, analysts, and business executives overseeing the A-Train proposal pointed to a handful of forecasts from previous similar projects used in their projections, they failed to provide proof that those projects had been accurately forecast. As a result, the investor who requested the outside view analysis declined to fund the A-Train.
And although new projects generally have some unique characteristics, the author points to the high level of statistical significance across the many projects in his sample. Indeed, he notes that the comparative advantage of taking the outside view will be most valuable for non-routine ventures, such as building green infrastructure or plants, and catering to new sources of demand. The key is harnessing project managers’ imagination to analyze historically applicable, similar projects rather than to conjure up hypothetical scenarios.
“To be sure, choosing the right class of comparative past projects would become more difficult when planners are forecasting initiatives for which precedents are not easily found, for instance the introduction of new and unfamiliar technologies,” the author writes. “However, most projects, and especially major ones, are both non-routine locally and use well-known technologies that have been tried out elsewhere.”
There are signs that the outside view is catching on. In the U.K., the Treasury and the Department for Transport have, in recent years, mandated that “data from past projects and similar projects elsewhere” be used in infrastructure cost forecasts. The Danish and Swiss governments have enacted similar measures. And in a historic first, 700 investors in 2012 filed an AUD$150 million (US$142 million) class-action lawsuit against the forecaster who produced “woefully inaccurate” traffic and revenue estimates for the multibillion-dollar Clem Jones Tunnel toll road in Brisbane, Australia, which went bankrupt within a year.
“In the outside view project planners and forecasters are not required to make scenarios, imagine events, or gauge their own and others’ levels of ability and control, so they cannot get all these things wrong,” the author concludes. “Human bias is bypassed.” Ah, statistics.
Source: Quality Control and Due Diligence in Project Management: Getting Decisions Right by Taking the Outside View, by Bent Flyvbjerg (University of Oxford), International Journal of Project Management, July 2013, vol. 31, no. 5