The eighteenth-century enclosure of the agricultural commons converted shared stewardship of productive land into private property subject to market logic. The immediate yield was impressive; the long-term consequence was the exhaustion of soil that had been sustained for centuries through communal practice. The commodification of intelligence follows the same structural logic. The accumulated cognitive production of human civilization — the text, code, analysis, and creative work produced through millennia of educational institutions, communities of practice, and mentorship relationships — is being enclosed through AI training and subjected to commodity pricing. The immediate output is impressive: AI tools produce competent cognitive work at a fraction of the cost of human expertise. But the market does not maintain the cognitive soil. It extracts the accumulated harvest without sustaining the developmental processes that produced it.
The parallel to agricultural enclosure is structural, not metaphorical. The commons sustained rural communities for centuries through stewardship practices that maintained soil fertility, biodiversity, and ecological balance. Enclosure converted the commons into private property, subjected it to market logic, and produced immediate gains in agricultural output. Within a generation, the most productive enclosed lands were depleted, and the communities that had sustained them were destroyed. The pattern — extraction without replenishment, short-term yield without long-term stewardship — recurs wherever market logic governs the use of resources that were previously sustained by communal practice.
AI training reproduces the pattern with structural precision. The training corpus is the accumulated output of human cognitive activity: writing, coding, scientific analysis, creative production. This output was generated by institutions and communities that had purposes other than providing training data — universities, professional communities, cultural traditions, apprenticeship systems. The accumulated corpus is the crystallized product of these institutional purposes, sustained through ongoing investment in the developmental processes that produce individual practitioners capable of contributing to it.
The AI market extracts this accumulated production and reprocesses it into commodity form — outputs priced by token and sold by subscription — without sustaining the institutional infrastructure that produced the corpus. The educational institutions that trained the writers, the research traditions that shaped the analysts, and the professional communities that developed the coders are treated as externalities. The market harvests the crop without tending the field.
The stripping of output from process is the specific mechanism of this enclosure. AI systems reproduce the crystallized products of human intelligence without reproducing the developmental trajectories, mentorship relationships, or communities of practice that produced them. The long-term consequence — invisible in present market metrics but structurally predictable — is the exhaustion of the cognitive soil: declining depth of professional expertise, atrophying tolerance for cognitive friction, dissolving bonds of professional community.
The framework draws directly on Polanyi's analysis of the original enclosure movement in The Great Transformation, chapters 3–5. Polanyi documented that the enclosure of common land was presented to contemporaries as an improvement in agricultural efficiency and a triumph of market rationality, but in fact produced mass displacement, the destruction of rural communities, and the creation of a pauperized urban working class whose conditions constituted the central social crisis of the nineteenth century.
The extension to the cognitive commons has been developed by contemporary scholars working at the intersection of Polanyian analysis and digital political economy. Yochai Benkler's work on the wealth of networks, James Boyle's on the cultural commons, and recent scholarship on data governance all converge on the recognition that digital capitalism is enclosing resources that had previously been sustained through non-market institutional arrangements. The AI transition makes this enclosure uniquely visible because it operates on the accumulated output of human intellectual life itself.
Extraction without replenishment. AI training harvests the accumulated cognitive production of civilization without sustaining the institutional infrastructure that produced it.
The soil metaphor is structural. The cognitive capacity of a civilization, like agricultural soil, is a long-accumulated resource sustained through specific institutional practices; its exhaustion through extractive use follows predictable patterns.
Invisibility to market metrics. The depletion is invisible in current output measures because it manifests in the slow degradation of developmental infrastructure rather than the immediate reduction of harvest volume.
Re-embedding requires institutional investment. Protecting the cognitive commons requires constructing institutions that sustain the developmental processes the market cannot price — educational reform, labor protections, communities of practice, mentorship structures.
Defenders of current AI training practices argue that the training corpus was already public, that derivative works have always drawn on accumulated human production, and that the volume of new cognitive production enabled by AI tools will more than compensate for any decline in traditional expertise. The Polanyian response is not that training is theft but that the institutional frameworks for sustaining the sources of the training corpus are being undermined — that the question is not what is extracted but whether what is extracted can be replenished. The current trajectory suggests not.