March 30, 2017

RALEIGH — Most economic impact studies fail to show whether a proposed taxpayer-funded project is worth the projected costs. A new John Locke Foundation Spotlight report explains why.

“Whether it’s to support a new highway project, special tax breaks for solar energy, the building of a civic center or sports complex, or subsidies for Hollywood film producers, you usually can find an economic impact study, often touting how great the project will be for the state or local economy,” said report author Dr. Roy Cordato, JLF Senior Economist and Resident Scholar.

“Yet all of these studies ignore basic principles of economics and do not meaningfully measure what they claim to measure — the economic impact of the public policies and projects they are assessing,” Cordato added.

Policymakers would make better decisions if they scrapped economic impact studies and relied instead on the type of cost-benefit analysis that private businesses use when making their investments, Cordato said.

He points to a “simple, predictable, and effective” formula behind most economic impact studies. “A special-interest group that stands to benefit from a proposed project funds a study,” Cordato explained. “It purports to provide hard numbers about jobs and wages. It makes grandiose claims about economic benefits. The special-interest group then highlights the study in press releases picked up by a largely uncritical media. Stories about the study then give political cover to decision makers who will determine the project’s fate.”

That formula faces significant problems. “These studies typically use proprietary, off-the-shelf models with names such as IMPLAN, CUM, or REMI, but the people who carry out the work rarely have expertise in economics,” said Cordato, a Ph.D. economist. “Instead their expertise tends to be limited to the models they use.”

Real economic analysis should account for a significant factor that every model misses, Cordato said. “While each project will produce directly observable economic activity, some activity never occurs because the project has moved forward,” he said. “Economists call the value of the activity that doesn’t occur the project’s ‘opportunity cost.’”

“Any economic impact study that does not attempt to assess these opportunity costs cannot legitimately be called economic analysis,” Cordato said. “In fact, not attempting to take account of opportunity costs is considered to be the biggest mistake that noneconomists make when thinking about economic issues.”

The report cites the example of a proposed $20 million taxpayer-financed convention center. Both the economic impact study and real economic analysis would attempt to gauge the effect of the $20 million on the construction industry, material and equipment suppliers, the labor market, and other affected sectors.

“A real economic analysis also asks: What economic activities would have occurred if the money had remained in the hands of the taxpayer?” Cordato explained. “The money would have been spent on various goods and services or saved in local banks. It would have had an impact that must be subtracted from the visible effects of the convention center’s construction.”

Economic impact studies don’t even attempt to get that calculation right, Cordato said. “The possibility that a new project could cause a net reduction in income, output, or employment is ruled out of the study models by their methodology,” he said. “The opportunity costs go unexamined. The models do not account for them.”
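
To make that arithmetic concrete, here is a minimal sketch of the netting Cordato describes. Only the $20 million project cost comes from the report’s convention center example; the gross-impact and foregone-activity figures are hypothetical placeholders, not numbers from the report.

    # Minimal sketch of netting out opportunity costs. Only the $20 million
    # project cost is from the report's example; the other figures are hypothetical.
    project_cost = 20_000_000               # taxpayer-financed convention center
    gross_impact = 1.3 * project_cost       # hypothetical: activity the impact study counts
    foregone_activity = 1.4 * project_cost  # hypothetical: what taxpayers would have
                                            # spent or saved with the same money
    net_impact = gross_impact - foregone_activity
    # The net can be negative, a possibility the impact-study models rule out.
    print(f"Net impact: ${net_impact:,.0f}")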

A related problem involves a factor called the “multiplier effect.” The idea is that the initial spending’s impact is multiplied as money works its way through industries affected by the proposed project.

“These models do not account for the fact that dollars saved or spent in ways other than the government project also would have a multiplier effect,” he said. “The failure to account for that opportunity cost can lead to absurd claims.”
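
The release does not spell out how a multiplier is computed. As a purely expository assumption (a textbook-style re-spending series, not the proprietary models the report discusses), the same arithmetic that multiplies the project’s spending also multiplies whatever the taxpayers would otherwise have done with the money:

    # Textbook-style spending multiplier: successive rounds of re-spending form
    # a geometric series. An expository assumption, not the models the report discusses.
    def total_impact(initial_spending, respend_share, rounds=100):
        """Sum the re-spending rounds: initial + initial*s + initial*s**2 + ..."""
        return sum(initial_spending * respend_share ** t for t in range(rounds))

    project = total_impact(20_000_000, 0.6)    # multiplied impact counted by the study
    foregone = total_impact(20_000_000, 0.6)   # the taxpayers' own spending multiplies too
    # The study reports only the first figure; with the same hypothetical re-spend
    # share on both sides, netting out the second leaves nothing.
    print(f"{project - foregone:,.0f}")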

For instance, one study using the IMPLAN model argued that $72 million in N.C. government subsidies for renewable energy generated $1.4 billion in new economic output. “That amounts to a fantastical claim that every dollar invested in renewable energy yields $19 in economic benefits,” Cordato said. “If that were true, government would be silly not to shift every available dollar away from any other spending that yields a multiplier of less than 19. No one would make that argument with a straight face.”
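
The implied multiplier follows directly from the two figures quoted above; the alternative-use multiplier in the second step is a hypothetical value added here to show the reductio Cordato draws, not a number from the study.

    # Figures quoted in the release: $72 million in subsidies, $1.4 billion in claimed output.
    subsidy = 72_000_000
    claimed_output = 1_400_000_000
    implied_multiplier = claimed_output / subsidy
    print(round(implied_multiplier, 1))   # about 19.4 dollars of output per subsidy dollar

    # If the claim were true, shifting a dollar from any use with a smaller multiplier
    # would look like a net gain; the 1.5 below is hypothetical, purely for illustration.
    alternative_multiplier = 1.5
    print(round(implied_multiplier - alternative_multiplier, 1))   # roughly 17.9 per shifted dollar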

A true economic analysis must account for the opportunity costs associated with renewable energy subsidies, Cordato said. “The failure to make that calculation demonstrates the absence of a basic understanding of what constitutes economic analysis.”

Rather than real economic analysis, economic impact studies do little more than measure “spending flows” among various sectors in the economy, Cordato said. “They tell us nothing about the relative efficiency of those spending flows,” he said. “In fact, if a government subsidy is responsible for the spending flow, the spending actually moves resources from more efficient uses to less efficient ones.”

Only a legitimate cost-benefit analysis would give decision makers the information they need when deciding whether to move forward with projects like the $20 million convention center, Cordato said.

“This is a complicated task — there are no off-the-shelf models that allow you to input numbers and spit out results,” he said. “It would require hiring highly specialized economists. But that is no excuse for relying instead on flawed economic impact studies.”
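
The report is not quoted prescribing a specific formula, but the private-sector test Cordato points to is typically a discounted comparison of costs and benefits. The sketch below is a generic net-present-value calculation with hypothetical benefit figures and discount rate; as the quote above notes, the hard part is estimating those inputs, not the arithmetic.

    # Generic net-present-value test of the kind private investors apply.
    # The $20 million cost echoes the report's example; the yearly benefits
    # and the 5 percent discount rate are hypothetical.
    def npv(rate, cash_flows):
        """Discount yearly cash flows (year 0 first) to the present and sum them."""
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

    flows = [-20_000_000] + [1_200_000] * 25   # build now, collect net benefits for 25 years
    print(f"{npv(0.05, flows):,.0f}")          # negative here, so the project fails the test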

“In the final analysis, economic impact studies tell us nothing about the social value of government spending projects,” Cordato added. “This is one case in which incomplete information is worse than no information at all, particularly if it is manipulated to mislead the public.”

Dr. Roy Cordato’s Spotlight report, “Economic Impact Studies: The Missing Ingredient Is Economics,” is available at the JLF website. For more information, please contact Cordato at (919) 828-3876 or [email protected]. To arrange an interview, contact Mitch Kokai at (919) 306-8736 or [email protected].