Gorz argued that cognitive capitalism perpetuates itself by converting a structurally abundant resource — human intelligence and accumulated knowledge — into scarcity through mechanisms of enclosure: intellectual property rights, proprietary platforms, data monopolies, and corporate control of the infrastructure through which knowledge circulates. Knowledge that could in principle be shared freely is enclosed, commodified, and sold back to the people who produced it, yielding profits for those who control the enclosures rather than benefits for those who generated the knowledge. The AI economy reproduces this pattern with remarkable fidelity: models trained on the collective knowledge of humanity are owned by a handful of corporations whose pricing decisions and terms of service determine who has access.
There is a parallel reading that begins not with knowledge's inherent abundance but with the material substrate required to make that knowledge productive. The servers, data centers, undersea cables, rare earth minerals, and electrical grids that enable AI are decidedly rival and excludable — and their concentration reflects not artificial enclosure but genuine physical scarcity. When we examine who controls these material foundations, we find the same corporations Gorz critiques, but their power derives from managing actual resource constraints, not manufactured ones.
This material reading suggests that the "cognitive capitalism" frame obscures more than it reveals. The workers maintaining data centers in Virginia, the miners extracting lithium in Chile, the factory workers assembling GPUs in Taiwan — these are the labor relations that determine AI's political economy, not the abstracted "cognitive work" of prompt engineering. The knowledge may be collectively produced, but its computational activation requires massive capital investment that only certain actors can marshal. OpenAI didn't capture common knowledge; it invested $100 million in compute to transform that knowledge into something useful. The scarcity isn't artificial — it's the thermodynamic reality of computation at scale. From this vantage, calls for "democratic governance of cognitive infrastructure" ring hollow without addressing who owns the power plants, who controls the chip fabs, and who can afford to run models that cost millions in electricity alone. The political project isn't reclaiming enclosed knowledge but building alternative material infrastructures — a far more daunting challenge than Gorz's framework suggests.
The large language models powering contemporary AI tools were trained on billions of texts, millions of code repositories, and the distilled output of centuries of scientific, literary, artistic, and technical production. This knowledge was produced collectively, by countless individuals working across generations, and belongs, in any morally coherent sense, to the commons. But the models trained on this collective knowledge are proprietary, owned by a small number of corporations whose strategic direction determines access conditions.
This creates the defining contradiction of the AI economy: the common origin of the knowledge versus the private capture of its value. The developer whose expanded capability depends on Claude is, in Gorz's framework, a cognitive worker dependent on corporate infrastructure for access to resources produced by her own civilization. The autonomy she experiences is mediated by a commercial relationship she did not negotiate and cannot influence.
Gorz proposed that cognitive capitalism was structurally unstable because the knowledge on which it depended was inherently non-rival and non-excludable. The attempt to enclose knowledge through intellectual property and proprietary platforms is both economically inefficient and socially destructive. The instability creates political opportunity: collective claims on the productive surplus generated by accumulated knowledge can be advanced as matters of right, not requests for generosity.
The resolution of this contradiction will determine whether the AI transition produces a civilization of freedom or a new feudalism of cognitive enclosure. The political stakes extend beyond economic distribution to the question of whether the collective inheritance of human knowledge remains collective or is permanently captured by corporate platforms.
Gorz developed the analysis of cognitive capitalism through engagement with the French and Italian autonomist traditions — particularly the work of Yann Moulier-Boutang, Antonio Negri, and Maurizio Lazzarato — in his final major work, L'Immatériel (2003), published four years before his death.
Abundance produces scarcity. Knowledge is naturally non-rival, but capitalism requires scarcity — hence enclosure through intellectual property.
The platform is the new factory. Corporate control of infrastructure replaces direct ownership of production as the locus of exploitation.
Collective origin, private capture. Knowledge produced collectively across generations is commodified and sold back to its producers.
Structural instability. The non-rival character of knowledge makes enclosure permanently contested and economically inefficient.
The commons is the alternative. Democratic governance of cognitive infrastructure is the political answer to corporate enclosure.
Neoliberal economists argue that intellectual property rights are necessary to incentivize the production of knowledge. Gorzian and post-autonomist thinkers respond that most foundational knowledge was produced without such incentives, and that enclosure captures existing knowledge rather than incentivizing its creation.
The tension between Gorz's cognitive capitalism and the material infrastructure thesis resolves differently depending on which layer of the AI stack we examine. At the knowledge layer, Gorz is essentially correct (90/10): the training data for large language models does represent enclosed common heritage, rendered artificially scarce through legal and technical mechanisms. Wikipedia, academic papers, open-source code — these were indeed produced collectively, and their privatization within proprietary models represents genuine enclosure.
But at the computational layer, the material critique dominates (20/80): the scarcity of compute, energy, and specialized hardware is fundamentally physical, not artificially imposed. The $100 million required to train GPT-4 reflects real resource consumption, not rent-seeking. Here, the question isn't whether to enclose or liberate knowledge, but who controls the means of computation — a classic question of industrial capitalism that Gorz's framework undersells. The political implications similarly bifurcate: democratizing access to trained models (Gorz's concern) is technically trivial compared to democratizing access to training capacity (the materialist concern).
The synthetic frame that emerges recognizes AI as operating through dual scarcity regimes: artificial scarcity at the knowledge layer sitting atop genuine scarcity at the computational layer. This suggests a two-pronged political strategy — fighting enclosure where it's artificial (through open models, public datasets, commons-based licensing) while building alternative infrastructures where scarcity is real (through public compute, energy democracy, cooperative platforms). The mistake is treating either layer as the whole story. Gorz illuminates the knowledge enclosure but underestimates material constraints; the materialist critique reveals infrastructure dependencies but misses how artificial scarcity multiplies natural scarcity's effects. The AI economy's contradictions operate at both levels simultaneously.