• Socialist systems distrust the spontaneous order in market distribution of goods and replace it with central planners — with notoriously worse results
  • A new paper tested the hypothesis that this problem could be solved with supercomputers
  • Even with impossibly generous simplifications, researchers found it would take well over 100,000 years to perform the necessary computations

At the center of socialist critiques of capitalism is the claim that market systems fail to distribute goods fairly, a critique grounded in perceived unequal outcomes. Implicit in socialist systems is the belief that government planning would distribute goods more equitably.

Unfortunately, the socialist answer to inequality of outcomes tends to force everyone into the lowest common outcome (equality in poverty), which it then drives even lower over time (equality in privation and squalor) — everyone, that is, who is not in the ruling class enjoying significantly greater wealth from their political power.

Also implicit in socialism is a rejection of the spontaneous social order at work in the economy, recognized by, among others, Adam Smith, Friedrich A. Hayek, and James Buchanan, as well as a presumption that a central planner or planning committee could possess all the knowledge that goes into every economic decision.

It is this second aspect — what Hayek called “the pretense of knowledge” — that concerns the new research paper discussed here.

In “Revisiting the Computation Problem,” published in the Fall 2023 issue of The Quarterly Journal of Austrian Economics, Paul F. Cwik, the BB&T Professor of Economics and Finance at the University of Mount Olive, and Lucas M. Engelhardt, associate professor of economics at Kent State University at Stark, addressed the sheer infeasibility of computing an optimal distribution of goods to consumers. More on that in a moment.

The Importance of Spontaneous Social Order

First, let us discuss spontaneous social order, which socialists reject in favor of central planning. In 1992, on Hayek’s passing, economist David Rehr wrote about spontaneous social order for the Federal Reserve Bank of Minneapolis. Rehr explained Hayek’s and Smith’s idea that spontaneous order in society “was the result of human action but not human design” (emphasis added). Rehr wrote:

Hayek, expanding on arguments advanced by the Scots, wrote that society developed through tradition and reason, concurrently. Both logical and practical, everyday experience influenced man’s advancement. The use of reason, however, was not limitless, being bounded by bias held by an individual or group. This meant that society was too complex to be created piece by piece in a strictly rational, logical manner.

Hayek argued that those who misunderstood or disregarded the notion of spontaneous order did so because they incorrectly divided the world into two categories: “planned” (which implicitly means order and purpose) and “unplanned” (which connotes disorder, randomness and chaos). Hayek argued that society, and its most advanced institution — the market economy — fit into neither category and, therefore, belonged in a third group. Each member of this third group would be bounded by rules, have its own order and increase in complexity in a way that would not be fully understood. (Emphasis added.)

Spontaneous social order is not confined to the economy. Hayek cited it also in the development of our language:

No single individual or group thought it up. It has its own rules of grammar, and language continues to evolve as mankind advances. Language could not be described in complete detail even if every computer was dispatched to this use.

Contrast that spontaneous order with central planning. Because central planning disrupts, distorts, and sometimes eliminates the untold number of market signals that guide producers’ and consumers’ independent decisions, it tends toward bad outcomes and dictatorship. As Rehr wrote:

Tyranny results from government’s attempts to plan the workings of daily life.

The implications [of spontaneous order] for the economy are even more striking. First, the very role of government economic planning comes into question, whether the issue is federal funding for highways or the use of taxes to influence investment decisions.

It is only the unabashedly “free” market that can generate the signals for producers and consumers to trade. Any attempt by government to regulate prices, impose interstate or intrastate tariffs, or impose quality standards sends conflicting, inaccurate messages that have a discoordination effect in the market. This equally applies to a society like the former Soviet Union that desired complete planning or the local community housing authority that plans what type of homes will be built.

Could Central Planners Armed with Supercomputing and AI Outdo Spontaneous Order?

Could supercomputing produce optimal central planning? It is a question that has entertained some thinkers going back, surprisingly, to 1908, when Enrico Barone first framed collectivist planning as a system of equations to be solved.

Responding to earlier studies, Cwik and Engelhardt addressed this “computational question” with fresh calculations and up-to-date computing speeds. Global population is about 8 billion; the number of discrete consumer goods is unknowable, but Amazon.com alone sells “373 million unique products”; and “[t]he Frontier supercomputer at the Oak Ridge National Laboratory is rated at 1.1 exaflops” (meaning it can perform 1.1 quintillion floating-point operations per second).

A key to the paper is the realization that “computation time increases linearly for consumers and by the square of the number of goods.” In other words, adding more people doesn’t affect computation time nearly as much as adding more goods.
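
To see what that scaling means in practice, consider a stylized cost model in which total work is proportional to the number of consumers times the square of the number of goods. The proportionality constant, and the paper’s exact formula, are not reproduced here, so the sketch below (in Python) illustrates only the scaling behavior, not the absolute figures:

```python
# Stylized cost model, assumed for illustration (not the paper's exact
# formula): work grows linearly in consumers, quadratically in goods.
def relative_work(consumers: float, goods: float) -> float:
    return consumers * goods ** 2

base = relative_work(8.2e9, 3.73e8)             # today's rough figures
print(relative_work(2 * 8.2e9, 3.73e8) / base)  # double the people -> 2.0x work
print(relative_work(8.2e9, 2 * 3.73e8) / base)  # double the goods  -> 4.0x work
```

Doubling the population doubles the work, while doubling the product catalog quadruples it; the size of the catalog, not the population, dominates.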

Strictly from the mathematical side (laying aside the issue of individual rights), the computational problem constantly evolves. As Cwik and Engelhardt explained, “As populations grow and new products are introduced, the problem gets larger; however, as computer processing technology and algorithms improve, the problem gets smaller.”
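
As a rough illustration of that tug-of-war, the sketch below plugs assumed growth rates into the stylized model from earlier. None of these rates comes from the paper; they are stand-ins chosen only to show how the race between a growing problem and improving hardware plays out:

```python
# All three rates are assumptions for illustration, not figures from the paper.
POP_GROWTH   = 1.01            # population grows 1% per year
GOODS_GROWTH = 1.05            # distinct goods grow 5% per year
SPEEDUP      = 2 ** (1 / 2.5)  # computing power doubles every 2.5 years

# Under the work ~ consumers * goods^2 model, compute time changes by
# this factor every calendar year:
net = POP_GROWTH * GOODS_GROWTH ** 2 / SPEEDUP
print(f"annual change in compute time: {net:.3f}x")  # ~0.844x with these rates
```

A factor below 1 means hardware is winning the race; above 1, the growing catalog of goods is winning. Which way it goes hinges entirely on the assumed rates, which is exactly why the problem constantly evolves.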

It is instructive how much simplification the computational problem requires before one can even begin to address it. Cwik and Engelhardt must make impossible assumptions just to make estimation possible. These include:

  • Planners are “angels working in the best interest of society rather than their own”
  • “[A]ll humans are already completely selfless and perform assigned tasks and duties to the best of their abilities”
  • Every person’s “subjective tastes and preferences” can be known and “fed into a computer algorithm”
  • All goods are consumer goods and are listed by Amazon

Even then, “assuming 8.2 billion consumers, 373 million goods, and a one-exaflop machine, the calculation would require over three quintillion equations and take approximately 108,529 years to compute.”
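
That headline equation count can at least be sanity-checked. One plausible reading, and it is only an assumption here since the paper’s derivation is not reproduced in this article, is that each consumer contributes one equation per good, so the count is consumers times goods:

```python
consumers = 8.2e9   # world population figure used in the paper
goods     = 3.73e8  # Amazon's unique-product count, used as a proxy
flops     = 1.0e18  # a one-exaflop machine

equations = consumers * goods
print(f"{equations:.3e} equations")   # 3.059e+18: over three quintillion

# Touching each equation once, at one floating-point operation apiece,
# would take only about three seconds; the 108,529-year figure comes
# from solving the system, not from merely listing it.
print(f"{equations / flops:.1f} seconds just to enumerate")
```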

From there, Cwik and Engelhardt show that the obvious workarounds fail as well. Limit the total number of goods? That is tyranny, and it defeats the purpose of optimization. Assign more supercomputers to the problem? They would require so much electricity that you would have to cover the oceans completely with wind turbines, or blanket an area larger than Russia with solar panels.

The solution they reach is one we already had: “to avoid computation’s central planning problem, one must abandon central planning.”