This page lists every Orange Pill Wiki entry hyperlinked from John Kenneth Galbraith — On AI. 31 entries total. Each is a deeper dive on a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words marked with the Wikipedia symbol link to Wikipedia.
The problem of making a powerful AI system reliably pursue goals that its designers and users actually endorse — the central unsolved problem of contemporary AI.
Hunter Lewis's 2024 Galbraithian observation that AI is a product whose production is the justification for its use — the system generates the demand it exists to satisfy.
Galbraith's 1952 thesis that concentrated economic power generates its own counterweight through organized opposition on the other side of its market — the mechanism whose speed the AI transition has outrun.
The Orange Pill claim — that AI tools lower the floor of who can build — submitted to Sen's framework, which asks the harder question: does formal access convert into substantive capability expansion?
The structural calculation that converts productivity multipliers into staffing reductions — and the paradigmatic demonstration of how the technical imperative operates in contemporary institutions without any individual explicitly endorsing it.
Segal's term for the gap between what a person can conceive and what they can produce — which AI collapsed to approximately the length of a conversation, and which Gopnik's framework reveals to be an exploitation metric that leaves the exp…
The economic mechanism by which voluntary adoption becomes involuntary dependence through the accumulation of platform-specific investments — the subject of Shapiro's career-long investigation and the force now operating at unprecedented sp…
Galbraith's most famous image — the gleaming automobile on the crumbling road — describing the structural tendency of private economies to produce magnificent private consumption alongside systematically degraded public services.
The specific behavioral signature of AI-augmented work: compulsive engagement that the organism experiences as voluntary choice, with an output the culture cannot classify as problematic because it is productive.
The structural pathology by which affluent economies systematically underfund the shared institutions that would make prosperity broadly beneficial — operating at compressed timescale in the AI transition.
Galbraith's 1958 diagnosis of a society that had solved the problem of production and discovered, to its considerable discomfort, that solving production did not solve living.
Galbraith's term for the collective of specialists whose knowledge actually directs a large organization — adapted to the AI age as the priesthood of researchers, alignment scientists, and infrastructure engineers whose expertise is both i…
The device that increases the magnitude of whatever passes through it without evaluating the content — Wiener's framework for understanding AI as a tool that carries human signal, or human noise, with equal power and no judgment.
The economic system in which human attention is harvested, packaged, and sold to advertisers — the infrastructure that drives the algorithmic pathologies Gore calls artificial insanity.
The canonical example of allogenic ecosystem engineering — a structure that modulates rather than blocks the flow of its environment, creating the habitat pool in which diverse community life becomes possible.
Segal's epilogue to the Galbraith volume — the quarterly conversation where someone asks, "if five people can do what a hundred did, why are we paying a hundred?" — and the mechanism through which private capture of AI productivity gains becomes structural.
Galbraith's term for beliefs that persist not because they survive scrutiny but because they survive social acceptability — the residue of approval rather than the residue of experience.
Galbraith's 1958 argument that modern producers create the desires they then satisfy, undermining the orthodox assumption of consumer sovereignty — now operating at the level of cognitive tools rather than consumer goods.
The composite figure at the center of the AI democratization debate — a builder with intelligence, tools, and ambition whose capability has expanded dramatically while the institutional infrastructure that would convert capability into capi…
The population mourning what the AI transition eliminates — senior practitioners whose recognition demand is systematically truncated: their diagnosis acknowledged, their claim to institutional response denied.
The AI-age successor to consumer sovereignty — the flattering doctrine that the AI-empowered builder freely chooses what to build, structurally inaccurate in the same way for the same reasons.
Galbraith's term for the large organizations whose size and market power allow them to plan their own environment rather than respond to market signals — applied to AI companies whose scale is not incidental but structural.
The structural principle — drawn from microprocessor history — that a productivity multiplier of twenty is not an improvement but a phase transition: a qualitative change the organizational structures of the previous regime cannot accommodate.
Galbraith's 1967 framework for the industrial economy transposed onto the AI age: from manufacturing goods to producing inferences, with the structural logic preserved intact.
Galbraith's inversion of the orthodox flow from consumer to producer — the observation that in the modern economy, production drives consumer demand more than consumer demand drives production.
Maslow's reading of The Orange Pill's central question: worthiness is not a moral endowment but the developmental achievement of a person whose signal is shaped by B-values.
The European Union's 2024 regulatory framework for artificial intelligence — the most comprehensive formal institutional response to the AI transition, whose risk-based classification system and uncertain adaptive efficiency represent on…
Edo Segal's 2026 book on the Claude Code moment — the empirical and narrative ground on which this Galbraith volume builds its philosophical reading.
The early 2026 repricing event in which a trillion dollars of market value vanished from SaaS companies — the critical-stage moment when AI's displacement of software's code value became visible to markets.
The February 2026 week-long training session in which Edo Segal flew to Trivandrum, India, to work alongside twenty of his engineers as they adopted Claude Code — producing the twenty-fold productivity multiplier documented in The Orange Pill.