This page lists every Orange Pill Wiki entry hyperlinked from Albert Hirschman — On AI. 34 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
The architecture of contemporary public conversation — engagement-optimized platforms that reward clarity and confidence while attenuating the nuanced voice the AI transition most needs.
The Orange Pill's thesis that AI does not eliminate difficulty but relocates it to a higher cognitive floor — the engineer who no longer struggles with syntax struggles instead with architecture.
The Orange Pill claim — that AI tools lower the floor of who can build — submitted to Sen's framework, which asks the harder question: does formal access convert into substantive capability expansion?
The individual's departure from a deteriorating system — information-poor, irreversible, and, for the AI transition, concentrated among the most knowledgeable practitioners whose departure the system can least afford.
Hirschman's 1970 triad — the three possible responses to institutional deterioration. Exit punishes, voice informs, loyalty delays. The framework that explains why the AI discourse is failing.
Exit without alternative — the retreat of senior practitioners to lower-cost regions and simpler lives when the technology industry no longer offers a path in which their expertise is rewarded.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
Segal's term for the gap between what a person can conceive and what they can produce — which AI collapsed to approximately the length of a conversation, and which Gopnik's framework reveals to be an exploitation metric that leaves the exp…
The systemic price paid when departing members carry away the specific diagnostic knowledge that would have enabled correction — the informational exit tax that the exiter does not bear.
The quietest and most misunderstood of the three responses — the active force that holds members inside a deteriorating system. At best, the commitment that sustains voice; at worst, the mechanism by which decline becomes invisible.
Hirschman's methodological commitment to taking seriously outcomes that conventional analysis dismisses as improbable — the disciplined refusal to treat pessimistic structural forecasts as conclusive.
The AI-augmented pathology of compulsive engagement with tools that generate real value — the collapse of the passions-interests distinction that the Hirschmanian reading identifies as structural, not personal.
Hirschman's intellectual discipline of questioning his own previous conclusions — the habit of discovering that apparently settled analyses conceal surprises. Largely absent from the AI discourse on both sides.
The Orange Pill's metaphor for the institutional work of redirecting the river of AI capability — not to stop the current but to shape what grows around it.
A fourth response beyond the classical triad — structural action that embodies the argument. The beaver builds the dam; the founder keeps the team; the curriculum designer preserves formative struggle.
Byung-Chul Han's 2010 diagnosis of the achievement-driven self-exploitation that has replaced disciplinary control as the dominant mode of power — and, in cybernetic terms, a social system operating in positive feedback.
The uncomfortable fact that AI's benefits and costs do not distribute evenly across the population of affected workers — a Smithian question about institutions, not a technical question about tools.
The cohort mourning what the AI transition is eliminating — senior practitioners whose diagnoses of lost depth are often precise but whose voice fails to produce institutional response because they can name the illness without prescribing the cure.
The compounding dynamic in which individually rational exit by the system's most knowledgeable members destroys the transmission mechanism through which their replacements would have been produced.
Voice at its most precarious — private, unamplified, spoken to a single listener with no institutional structure to carry it further. The canary in the coal mine of institutional deafness.
Hirschman's 1967 principle that ambitious projects conceal their true difficulty until the builder is committed — productive self-deception that AI's early transparency is eliminating, along with the resilience the hiding used to build.
Smith's metaphor for the unintended coordination of self-interested individual action toward collective benefit — now applied to the strange collaboration between builders and AI systems that have no interest at all.
The political and emotional reaction against transformative technology on behalf of the workers and ways of life it displaces — historically vilified, increasingly reconsidered, and directly relevant to the AI transition.
The threshold crossing after which the AI-augmented worker cannot return to the previous regime — The Orange Pill's central metaphor for the qualitative, irreversible shift in what a single person can build.
The largest and most diagnostically valuable cohort in the AI transition — practitioners who hold gain and loss in simultaneous awareness, silenced by a discourse architecture that rewards clarity and punishes ambivalence.
AI's early enthusiasts — the builders posting productivity metrics, shipping solo products, experiencing genuine creative release. Partly right, structurally blind, and the largest obstacle to the voice the transition needs.
Hirschman's 1973 metaphor for the psychology of tolerating inequality — why patience with rising disparity holds as long as progress appears imminent, and why it inverts into fury compounded by betrayal when the signal fails.
The most demanding of the three responses — the exercise of complaint from inside an institution with the expectation of being heard. Requires an audience, an adequate language, and institutional capacity to convert feedback into change.
Xingqi Maggie Ye and Aruna Ranganathan's 2026 Harvard Business Review ethnography of an AI-augmented workplace — the most rigorous empirical documentation to date of positive feedback dynamics in human-machine loops.
Hirschman's 1977 history of how commercial society was morally justified by reframing dangerous passions as manageable interests — a distinction the AI transition has collapsed through productive addiction.
Hirschman's 1991 catalog of the three recurring argumentative structures used to oppose progressive reform — perversity, futility, jeopardy — and their mirror-image progressive fallacies, all of which run through the AI discourse.
German-born American economist and political theorist (1915–2012) whose work on exit, voice, and loyalty, the hiding hand, and the rhetoric of reform crossed disciplinary boundaries for half a century.
MIT economist (b. 1967), Nobel laureate (2024), and among the most influential contemporary analysts of AI's institutional effects — whose inaugural UNESCO Hirschman Lecture framed the automation-augmentation choice as institutional rather than technological.