The enclosure operates at a structural level that existing intellectual property frameworks were not designed to address. Open-source licenses govern the use of specific software artifacts but say nothing about using those artifacts as training data. Creative Commons licenses govern reproduction and adaptation of specific works but do not govern the statistical patterns extracted from millions of such works and embedded in neural network weights. The legal vacuum has allowed AI companies to consume the commons without the reciprocity obligations that commons governance traditionally imposed — no requirement to contribute improvements back, no community voice in governance, no sharing of the value generated by the models.
The dynamic is circular and self-reinforcing. AI systems trained on commons data enable individual direct production. Individual producers, who can meet their needs through solitary AI conversation, contribute less to the commons. The commons receives less data and less community engagement. Future AI models train on a degraded commons, supplemented by proprietary data or AI-generated synthetic data. The commons becomes peripheral to the AI ecosystem, further reducing the incentive to maintain it. This is not the tragedy of overgrazing — the commons is not depleted by use. It is the tragedy of underfeeding — the commons degrades because contributions decline when the social context motivating contribution (community recognition, collaborative governance) is eliminated by a technology that makes collaboration unnecessary.
Benkler's institutional response to enclosure has always emphasized legal frameworks that protect the commons: open licenses, robust fair use, resistance to copyright expansion. The AI moment requires extending this institutional repertoire. Possible frameworks include copyleft-for-AI licenses (requiring that models trained on commons data be released openly), commons-governed AI (open-source models developed and maintained by communities), and compensation mechanisms (channeling revenue from commercial AI to commons maintenance). None exist at scale, and the institutional vacuum is the governance crisis of the transition.
The concept draws on the historical analysis of enclosure — the 18th- and 19th-century privatization of English common lands that displaced rural communities — which Benkler invoked in The Wealth of Networks to describe the danger facing the digital commons. The AI-specific application emerged in the 2023–2024 legal battles (the Authors Guild suing OpenAI, artists suing Stability AI) and in the recognition that the training data fueling AI capabilities was overwhelmingly drawn from commons-produced knowledge that received no compensation, credit, or governance rights.
Commons as substrate. AI capability rests overwhelmingly on the accumulated output of commons-based peer production, making the commons the unacknowledged foundation of the AI economy.
Extraction without reciprocity. AI companies consumed commons data and produced proprietary tools that compete with the commons for contributors, disrupting the ecology of motivation that sustained collaborative production.
Model collapse risk. If AI-generated content floods the training data of future models, the diversity and human grounding that characterized the original commons will degrade, reducing the quality of subsequent AI generations (a toy simulation after this list sketches the dynamic).
Institutional gap. Existing intellectual property frameworks do not govern the use of works as training data, creating a legal vacuum that must be filled through new licensing frameworks, governance structures, or compensation mechanisms.
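The mechanism behind model collapse can be made concrete with a toy simulation. The sketch below is a hypothetical illustration, not a reproduction of any published experiment: each "generation" of a model is fit only to synthetic samples from the previous generation, with no fresh commons data mixed back in. The Zipf-like starting distribution, the vocabulary size, and the sample sizes are all illustrative assumptions.

```python
# Toy sketch of the model-collapse dynamic: each model generation is trained
# only on synthetic output of the previous generation. A "topic" that happens
# to receive zero samples in one generation can never reappear in the next
# model, so the distribution's diversity shrinks over time.
#
# All quantities here (vocabulary size, sample size, the long-tailed starting
# distribution) are illustrative assumptions, not measurements of any corpus.

import random
from collections import Counter


def fit_from_samples(samples):
    """'Train' a model: estimate topic probabilities from empirical counts."""
    counts = Counter(samples)
    total = sum(counts.values())
    return {topic: n / total for topic, n in counts.items()}


def generate(model, n, rng):
    """'Generate' synthetic data by sampling from the fitted model."""
    topics = list(model)
    weights = [model[t] for t in topics]
    return rng.choices(topics, weights=weights, k=n)


def simulate(generations=30, sample_size=500, vocab=200, seed=0):
    rng = random.Random(seed)
    # Generation 0: a long-tailed, Zipf-like "commons" distribution over topics.
    raw = [1.0 / (rank + 1) for rank in range(vocab)]
    total = sum(raw)
    model = {f"topic_{i}": w / total for i, w in enumerate(raw)}

    for gen in range(generations + 1):
        if gen % 5 == 0:
            print(f"generation {gen:2d}: {len(model):3d} of {vocab} topics survive")
        synthetic = generate(model, sample_size, rng)  # output of generation `gen`
        model = fit_from_samples(synthetic)            # training set for generation `gen + 1`
    return model


if __name__ == "__main__":
    simulate()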