But quantitative analyses are never neutral. To be useful, any data, including economic data, must be considered in the context of the decision that is being made. Also, no matter how clever the mathematics, certain key inputs in a cost-benefit analysis cannot be translated into economic value. Security and safety, the preservation of wildlife and open spaces, the reduction of fear in a community, and scientific uncertainty in fields that spawn technological innovation are all economic intangibles — and omitting them when they are clearly important factors should invalidate the analysis. But it never does.
Perhaps most important, analysis is an act performed by human beings. Experts in complex systems and modeling have long asserted that human beings operating in isolation, or within the mental models of their professional training, simply cannot be objective enough to arrive at the right answers to the kinds of problems that cost-benefit analysis addresses. Certainly, within tightly defined boundaries — when they practice the scientific method, for instance — people can approximate objectivity. Yet even then, they cannot help making value judgments at every step. Those judgments include, for example, choosing what assumptions they use to create their models, what data they include in and leave out of their calculations, what rules they use to compute critical values such as the cost of a human life, what populations they attribute costs and benefits to, and how they adjust for imperfections in market prices.
Some of these concerns have been voiced in studies of cost-benefit analysis itself, including those conducted by some of the most highly respected anti-regulation scholars. Although the method has come to be seen as the methodology of choice by people who balk at government intervention, studies by members of that very group show its effectiveness to be questionable at best. A 2007 study by Robert W. Hahn of the AEI-Brookings Joint Center for Regulatory Studies and Paul C. Tetlock of the Yale School of Management and the Red McCombs School of Business at the University of Texas suggests that although economic analyses have probably influenced the outcome of particular regulations, “there is little evidence that such analysis has had a large overall impact” on the total cost or volume of regulation.
On the pro-regulation side, a February 2004 analysis by Ruth Ruttenberg & Associates for the Public Citizen Foundation concluded that over 30 years of federal regulatory activity, the U.S. government had consistently inflated cost estimates for health, safety, and environmental protections. Rarely, if ever, did actual compliance costs reach the estimates provided by the regulating agency — and they never reached the levels estimated by the private sector.
“Cost benefit studies still may provide useful information to policymakers, but [their] practical application...involves a significant number of controversial value judgments...that have become embedded in the practice of economics as we know it,” wrote Tyler Cowen, an economist at George Mason University and at the Center for Study of Public Choice, in 1998.
As any social scientist can tell you, this tension between data and values is deep and wide, and perpetuating the divide is a mainstay of many disciplines. But I believe the gap itself lies at the heart of the problem. As long as regulators refuse to acknowledge the need for a methodological bridge between data and values, cost-benefit analysis provides perfect cover for a biased assessment.
Realistic Data Is Elusive
“Most cost-benefit analysis is hokum,” says Alan Roberts, vice president of the Dangerous Goods Advisory Council, a nonprofit organization that works with state and federal regulators. From 1975 until his retirement in 1999 as the manager of the U.S. Department of Transportation’s hazardous materials program, Roberts handled more than 100 rule-making projects and, he says, “For every one of them, I had to make a declaration of cost and impact” — even though the most relevant data for calculating those costs and impacts was often in short supply.