In England between the fifteenth and nineteenth centuries, common lands that had been used by entire communities for grazing, gathering, and subsistence were progressively enclosed — fenced off, privatized, converted from shared resource to individual property. Enclosure increased agricultural productivity. It also displaced millions of people who had depended on the commons for survival, concentrating wealth in the hands of landowners. The training of AI models on copyrighted works without compensation is a form of enclosure. The commons being enclosed is not physical land but the accumulated textual heritage of human civilization — the corpus of writing that represents millennia of intellectual labor, creative effort, and cultural investment. The models extract enormous value from this corpus. The value accrues primarily to the companies that build and deploy the models. The creators whose works constitute the corpus receive nothing.
The enclosure analogy is structural rather than metaphorical. The core features map closely: a shared resource sustaining a community, a privatization enabled by shifts in legal and technological capability, a redistribution of value toward those who control the privatization apparatus, and a displacement of those who previously depended on the resource.
The productivity gains from enclosure were real. Agricultural output rose substantially. The productivity gains from AI training are also real; the models enabled by the training produce substantial economic and intellectual value. But as with the original enclosure, the distributional consequences can be devastating, and the legal instruments that enable the enclosure are not neutral adjudications of competing interests but interventions that serve those with the political power to shape the law.
The existing copyright framework cannot capture the injustice because it was designed for a different kind of use. Copyright regulates reproduction, not pattern extraction. The training corpus's value is aggregate — a property of the collection, not of any individual element — and the individual-rights framework has no category for aggregate value.
Responses adequate to the enclosure require mechanisms that operate at the level of the commons rather than at the level of the individual work. Collective licensing schemes, training data royalties, data trusts, and statutory compulsory licenses all aim at this level. None is a complete solution; all of them presuppose institutional construction that the Romantic framework did not require and that current institutions are slow to build.
The analogy between AI training and historical enclosure emerged in critical legal scholarship and policy writing in 2022–2024, as the first major AI copyright lawsuits were filed and the distributional stakes of the training corpus question became widely legible. The framing draws on earlier scholarship on the enclosure of scientific and digital commons, notably James Boyle's work on the second enclosure movement affecting information more broadly.
Woodmansee's own work does not explicitly develop the enclosure analogy, but her framework makes the analogy legible. The Construction of Authorship (1994) warns that copyright may be inapposite to the realities of cultural production, anticipating the structural inadequacy that the enclosure framing makes visible.
Structural, not metaphorical. The analogy is about shared patterns — privatization of a common resource, redistribution of value, displacement of prior users — not about literal similarity between land and text.
Aggregate value outside individual rights. The training corpus's value emerges from its aggregation across millions of works. Individual-rights frameworks cannot capture aggregate value and so cannot adequately respond to the enclosure.
Distributional dimension. The enclosure frame foregrounds who captures value and who absorbs costs — questions the Romantic framework's celebration of individual authorship routinely obscures.
Productivity gains and distributional consequences coexist. The enclosure analogy does not deny that AI produces value; it insists that productivity and justice are distinct questions and that productivity gains do not automatically justify the distributional pattern through which they are produced.
Institution-building required. Responses adequate to the enclosure require institutional construction — collective licensing, training royalties, data trusts — that the Romantic framework's individual-rights logic cannot supply and that replacement mechanisms must be built to provide.
Defenders of the current AI training regime argue that the enclosure analogy overstates the injury — that individual works are not removed from their creators by being used as training data, and that the analogy to land (a rival good) fails for information (a non-rival good). Critics respond that the relevant injury is not removal but value extraction without compensation, and that the non-rivalry of information makes the extraction easier, not more legitimate. The debate will not be resolved by analogy alone; it will be resolved by the legal and institutional responses that emerge over the coming decade.