The self-regulating market is a utopian project in Polanyi's original sense: a place that does not exist and cannot exist. The attempt to create a market that governs production and distribution without interference from social institutions fails structurally because markets require social institutions to function — laws that enforce contracts, norms that constrain fraud, educational systems that produce skilled workers, political stability that protects property, cultural institutions that sustain trust. The market depends on these institutions but does not produce them; when market logic extends without constraint, it destroys the conditions on which market activity depends. The self-regulating AI market reproduces this structural impossibility with particular clarity: the deployment of AI tools is destroying the educational institutions, professional communities, and mentorship relationships that produce the skilled workers the market will continue to require.
The impossibility has empirical precedents. The factory system of nineteenth-century England destroyed the communities that had produced skilled, healthy, socially embedded workers, creating a pauperized urban working class that could not sustain the labor force the market required. The enclosure movement destroyed the subsistence economy that had sustained rural populations, producing the displacement that eventually provoked the protective counter-movement the market had been designed to prevent. In each case, the market's logic destroyed its own foundations.
The AI market is repeating this pattern. The boardroom conversation recounted in The Orange Pill — if five people using AI tools can do the work of a hundred, why keep a hundred? — crystallizes the impossibility. The arithmetic is sound within the market's own logic; the structural consequence is the elimination of the developmental infrastructure through which the next generation of judgment, expertise, and wisdom would have been produced.
The junior developer who never debugs by hand never develops architectural intuition. The lawyer who never reads cases never develops legal judgment. The student who never struggles with an essay never develops analytical capacity. The market produces today's output more cheaply and destroys the process by which tomorrow's capability would have been developed. This destruction is invisible to market metrics because the market prices current output; developmental processes are externalities that do not enter any market calculation.
The recursive dimension makes the AI case uniquely severe. When labor was commodified, workers retained the capacity for political organization through which they eventually built protective institutions. The commodification of intelligence threatens the capacity required to conceive such institutions. The judgment that would design protective frameworks, the attention that would sustain democratic engagement, the questioning that would imagine institutional alternatives — each is now being commodified in turn, leaving no protected faculty from which a counter-movement could be organized.
The structural impossibility argument is developed across The Great Transformation, particularly in Polanyi's analysis of why the nineteenth-century attempt to construct a self-regulating labor market produced not equilibrium but catastrophe. Polanyi showed that the market's defenders presented their project as natural and self-correcting when it was in fact dependent on continuous state intervention to create and maintain the conditions commodification required.
Application to AI markets has been developed in recent scholarship drawing on Polanyi's framework. Jeremy Shapiro's work on the geopolitics of AI, analyses of the limits of market-driven AI governance in the tradition of Dani Rodrik and Mariana Mazzucato, and the critical political economy of data and platforms all converge on the same recognition: the self-regulating AI market reproduces, in a new domain, the structural impossibility Polanyi identified.
Markets depend on institutions they do not produce. Contracts require law; trust requires cultural traditions; skilled labor requires educational investment; monetary exchange requires political stability.
Unconstrained market logic destroys its own foundations. Extending commodification to institutions on which the market depends eliminates the conditions for market functioning.
The boardroom arithmetic is structurally self-defeating. Converting productivity gains into headcount reduction eliminates the developmental infrastructure through which the next generation of capability would have been produced.
The impossibility is temporal. The market measures current output; developmental processes operate on generational timescales; the misalignment produces inevitable crisis.
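The temporal misalignment can be made concrete with a toy simulation. This is not a model from the text; every parameter (hiring levels, promotion and retirement rates, the adoption year) is a hypothetical illustration of the claim that converting productivity gains into headcount reduction looks costless on current-output metrics while hollowing out the pipeline that replaces senior capability:

```python
# Toy capability-pipeline model. All numbers are illustrative assumptions,
# not data: the point is the shape of the curve, not its magnitudes.

def simulate(years=15, juniors_hired=20, promote_rate=0.2,
             retire_rate=0.1, seniors=100, juniors=100,
             ai_adopted_at=3, post_ai_hiring=1):
    """Track senior headcount when junior hiring collapses after AI adoption."""
    history = []
    for year in range(years):
        # After adoption, "five can do the work of a hundred" logic
        # cuts entry-level hiring to near zero.
        hires = juniors_hired if year < ai_adopted_at else post_ai_hiring
        promoted = juniors * promote_rate               # juniors maturing into seniors
        juniors = juniors - promoted + hires
        seniors = seniors * (1 - retire_rate) + promoted  # retirements offset by promotions
        history.append(round(seniors))
    return history

print(simulate())
```

Under these assumptions, senior headcount keeps rising for several years after adoption (the pipeline is still full of pre-adoption juniors) and only then declines steadily: the crisis arrives on a generational timescale, long after the quarterly metrics that motivated the cut have registered their gains.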
Market optimists argue that the market will produce its own solutions — new educational institutions, new developmental pathways, new forms of professional community — as economic incentives align with the need for sustained capability. The Polanyian response is historical: the original self-regulating market did not produce its own solutions; it took a century of political struggle to construct the protective institutions that eventually made industrial capitalism compatible with human life. There is no basis for expecting the AI market to perform a function it is structurally incapable of performing and has never performed in any prior instance.