Henry Farrell is an Irish-born political scientist whose work sits at the intersection of political economy, political theory, and the social effects of information technologies. His collaboration with Alison Gopnik, Cosma Shalizi, and James Evans on 'Large AI Models Are Cultural and Social Technologies' (Science, 2025) is among the most influential pieces of interdisciplinary AI scholarship of the decade. It reframes LLMs not as intelligent agents but as transmission technologies whose effects should be analyzed through the lenses of media history, political economy, and the governance of cultural infrastructure, rather than through AI-safety frameworks that assume agentic intelligence.
Farrell is the SNF Agora Institute Professor at the Johns Hopkins School of Advanced International Studies, where he works on the political economy of information, democratic theory, and the institutional consequences of digital technologies. His earlier books, including The Political Economy of Trust (2009) and, with Abraham Newman, Of Privacy and Power (2019) and Underground Empire (2023), established him as one of the leading social scientists working on how information infrastructures structure political and economic power.
The collaboration brought Gopnik's developmental psychology, Shalizi's statistics, and Evans's sociology together with Farrell's political-economic perspective. The combination produced a framework that no single discipline could have constructed: a case that LLMs are best understood through the analytical apparatus societies have developed for previous cultural technologies such as the printing press, the newspaper, and the internet, an apparatus that includes attention to access, distribution, institutional disruption, and the reshaping of cognitive habits at scale.
Farrell's contribution to the book and to the broader AI discourse has been to keep the political-economy questions central. If LLMs are cultural technologies, then the questions that matter most are not about alignment in the AI-safety sense but about governance: who controls the infrastructure, who gets access, how the benefits and costs are distributed, what institutions emerge to manage the new medium, and how existing institutions (schools, courts, publishers, democracies) adapt to a world in which producing plausible text has become essentially free.
Farrell has also been a persistent critic of the agent framing that dominates AI safety discourse, arguing in a series of essays and op-eds that it imports science-fiction assumptions into policy conversations that should be grounded in empirical analysis of how current systems operate and what effects they are actually having. The cultural-technology thesis is thus both a descriptive claim about what LLMs are and a prescriptive argument about which questions should dominate public reasoning about them.
Henry Farrell was born in Ireland and educated at Trinity College Dublin and Georgetown University, where he received his Ph.D. in political science. He taught at George Washington University before joining Johns Hopkins SAIS. His collaboration with Gopnik began in the early 2020s through the Santa Fe Institute and the broader network of scholars working on the social implications of AI.
Cultural technology as political economy. Farrell insists that LLMs must be analyzed as infrastructures with distributional consequences, not just as technical systems.
Governance, not alignment. The questions that matter most are about who controls access, how benefits and costs are distributed, and what institutions emerge.
Historical analogy is the method. Previous cultural technologies — printing, telegraphy, broadcasting, the internet — provide the analytical apparatus for understanding LLMs.
Critique of the agent framing. Treating LLMs as minds-in-progress imports science-fiction assumptions that obscure current empirical realities.
Interdisciplinary synthesis. The cultural-technology thesis emerged from combining developmental psychology, statistics, sociology, and political economy — no single discipline could have produced it.