An externality is a cost or benefit that falls on parties not involved in the transaction that produced it. Environmental pollution is the canonical example: a factory produces widgets priced to the buyer and smoke imposed on the community. The smoke does not appear on the factory's balance sheet, does not factor into the price of the widgets, and persists until institutional intervention forces the producer to internalize it. Stiglitz's career-long engagement with externalities has demonstrated that the gap between private cost and social cost is where markets produce their most destructive outcomes. The AI economy produces externalities novel in kind but familiar in structure: cognitive erosion at civilizational scale, informational ecosystem degradation through recursive training, and work intensification whose health consequences workers absorb individually while employers capture the productivity gains in aggregate.
The first externality class is cognitive. When a developer uses AI to produce code she does not fully understand, she captures a private benefit (faster output, expanded capability) and imposes an invisible social cost: a marginal reduction in the collective stock of deep understanding that the development community possesses. The antibiotics analogy is precise. A patient taking an antibiotic captures a private benefit (recovery) and imposes a social cost (marginal contribution to antibiotic resistance). Each individual decision is rational; the aggregate is a public-health crisis that no individual decision caused and no individual decision can solve. Millions of developers producing code they do not deeply understand are consuming a shared cognitive resource — the collective capacity to diagnose, maintain, and evolve the systems society depends on — without replenishing it. The consumption is invisible until the systems fail in contexts where depth atrophy matters.
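The private-versus-social arithmetic in this paragraph can be made concrete with a toy calculation. All of the numbers below are hypothetical, chosen only to illustrate the structure of the commons problem, not drawn from any empirical source:

```python
# Toy model of a cognitive commons: each decision to ship code
# without deep understanding yields a private benefit to the
# developer and spreads a small external cost across the whole
# community. All numbers are illustrative assumptions.

N = 1_000_000           # developers sharing the commons
private_benefit = 10.0  # value to the individual of faster output
external_cost = 50.0    # total cost one shallow decision imposes on the commons

# Cost as experienced by any single other developer:
cost_per_person = external_cost / N  # 0.00005 -- effectively invisible

# Each decision is individually rational...
individually_rational = private_benefit > cost_per_person

# ...yet each decision destroys net value in aggregate:
net_social_value = private_benefit - external_cost

print(individually_rational)  # True
print(net_social_value)       # -40.0
```

The point of the sketch is that no parameter choice rescues the individual decision-maker: as long as the external cost is spread widely enough, the per-person cost rounds to zero and the rational calculus never registers the aggregate loss.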
The second externality class is informational. Large language models are trained on the accumulated output of human knowledge production — journalism, research, literature, professional practice. The AI economy simultaneously consumes this knowledge base and undermines the institutions that produce it. News organizations lose revenue as AI summaries replace direct readership. Research faces competition from AI-generated analysis that mimics scholarship's form without its substance. Stiglitz's warning is structural: the feedback loop in which AI output enters training data for future models, producing outputs that enter subsequent training data, progressively dilutes the proportion of human-verified, institutionally produced knowledge in the data supply. Users in 2030 will have no way to assess what proportion of the system's training corpus was produced by humans with genuine expertise versus generated by prior AI systems. The confidence of the output will continue to rise. The grounding will continue to thin.
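The dilution dynamic described above can be sketched as a toy recursion. Suppose each model generation's training corpus mixes a fraction m of fresh human-produced material with a fraction 1 - m of the previous generation's output, and that model-generated text retains only a fraction r of its sources' grounding. Both parameters are hypothetical, introduced here purely for illustration:

```python
# Toy model of recursive training: h is the "human-grounded"
# fraction of the corpus at each generation. Each generation mixes
# fresh human data (share m) with prior model output (share 1 - m)
# that retains only fraction r of its own grounding.
# Parameters are illustrative assumptions, not measurements.

m = 0.3  # share of fresh human-produced data per generation
r = 0.8  # grounding retained by model-generated text

h = 1.0  # generation 0: fully human-produced corpus
history = [h]
for _ in range(10):
    h = m + (1 - m) * r * h
    history.append(h)

# The recursion converges to a fixed point well below 1:
fixed_point = m / (1 - (1 - m) * r)
print(round(history[-1], 3), round(fixed_point, 3))
```

Under these assumptions the grounded fraction declines monotonically toward the fixed point m / (1 - (1 - m)r), and nothing in the recursion slows the model's output confidence as the grounding thins, which is the asymmetry the paragraph describes.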
The third externality class is labor-market. The Berkeley research documented work intensification as the empirical consequence of AI adoption. Workers did not get more rest; they got more work. They expanded scope, colonized breaks, and experienced the specific burnout pattern that sustained cognitive overextension produces. The employer captured the productivity gain. The worker absorbed the cost — in health, in relationships, in the cognitive depletion that compounds over time. This is a classical labor-market externality with historical precedent: the early factory system produced the same dynamic at physical scale, and the institutional response (the eight-hour day, workplace safety regulations, workers' compensation) forced producers to internalize the costs they had been imposing. No equivalent regulatory framework yet exists for cognitive externalities.
The policy prescription follows the environmental-economics template: pricing mechanisms that force producers to internalize costs they impose on society. Mandatory contribution by AI companies to the knowledge institutions whose output they train on — a licensing framework that compensates training-data producers rather than appropriating their work. Regulatory standards for AI-assisted professional work that maintain the market value of depth against the lemons dynamic. Cognitive safety regulations analogous to physical safety regulations, addressing the intensification documented by the Berkeley researchers. And public investment in the goods the AI economy depends on but does not produce: education, research, journalism, and the infrastructure allowing these public goods to function.
Stiglitz's work on externalities extends across his career, from early papers on pollution pricing in the 1970s through his role as Chair of the Council of Economic Advisers during the Clinton administration, where he advocated for environmental regulation. His public support for the 2006 Stern Review on the economics of climate change, authored by Nicholas Stern, endorsed a framework for addressing externalities at civilizational scale. The application to AI represents a direct extension: the same structural gap between private cost and social cost, the same requirement for institutional intervention to close it, the same political obstacles from the industries generating the externalities.
Cognitive externalities are real but invisible. Unlike physical pollution, they cannot be seen, measured with simple instruments, or traced to specific sources — which is why they compound longer before institutional response arrives.
The information ecosystem is a public good. Journalism, research, and professional practice produce knowledge that benefits everyone but is funded by specific institutions whose business models AI is undermining.
Recursive training degrades the corpus. Each generation of AI trained partly on prior AI output dilutes the human-verified foundation without reducing the confidence of the output.
Intensification is a health externality. Burnout patterns documented in AI-augmented workplaces are occupational hazards that, absent regulation, are borne privately by workers while the productivity gains are captured by employers.
Pigouvian pricing supplies the framework. Tax or regulate producers so they bear the cost of the externalities they create: the same logic that justifies carbon pricing applies to cognitive and informational externalities.
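The Pigouvian logic can be shown with a standard textbook calculation on a hypothetical linear market. None of the numbers below come from the source; they exist only to show how a per-unit tax equal to the external cost restores the efficient quantity:

```python
# Standard Pigouvian tax arithmetic on a toy linear market.
# Inverse demand: P = a - b*Q; constant private marginal cost c;
# constant external cost e per unit. All values hypothetical.

a, b = 100.0, 1.0  # demand intercept and slope
c = 20.0           # private marginal cost
e = 30.0           # external cost per unit, unpriced by the market

# Market outcome: producers expand until price equals private MC.
q_market = (a - c) / b         # 80 units

# Efficient outcome: price should equal full social MC (c + e).
q_efficient = (a - c - e) / b  # 50 units

# A Pigouvian tax equal to the external cost closes the gap:
tax = e
q_taxed = (a - c - tax) / b    # market now reproduces the efficient quantity

print(q_market, q_efficient, q_taxed)
```

The design point carries over regardless of the numbers: the tax does not need to measure harm perfectly to improve on a price of zero, which is the response to the measurement objection taken up below.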
Critics of externality-based AI regulation argue that the costs are speculative, difficult to measure, and likely to be corrected by market innovation. Stiglitz's response: the measurement difficulty is a feature of externalities, not an argument against pricing them. Environmental economists spent decades developing methodologies for pricing difficult-to-measure costs, and the resulting institutional frameworks, however imperfect, prevented the far worse outcomes that unpriced pollution would have produced. The AI case is structurally identical.