This page lists every Orange Pill Wiki entry hyperlinked from Alan Kay — On AI. 31 entries total. Each is a deeper dive on a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored in orange link to other Orange Pill Wiki entries, while orange-underlined words with the Wikipedia mark link to Wikipedia.
Byung-Chul Han's diagnosis, engaged in both The Orange Pill and this book, of the cultural trajectory toward frictionlessness that conceals the labor, struggle, and developmental process that gave work its depth.
The Orange Pill's thesis that AI does not eliminate difficulty but relocates it to a higher cognitive floor — the engineer who no longer struggles with syntax struggles instead with architecture.
The quality of subjective experience — being aware, there being something it is like to be a given creature — and the single deepest unanswered question in both philosophy of mind and AI.
The pattern by which AI tools lower the floor of who can build — enabling production by individuals whose stock consists of an idea, a subscription, and the capacity for sustained attention.
Alan Kay's term for the trajectory by which the personal computer — designed as a medium for active creative engagement — became a medium for passive consumption, and the pattern the AI moment threatens to complete.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
The gap between what a person can conceive and what they can produce — a ratio that has been collapsing since the Neolithic and that the language model reduced to approximately the length of a conversation.
Kay's proposed companion metric to the imagination-to-artifact ratio: the distance between what a user can produce and what they can comprehend. The ratio the AI moment has left unchanged even as the other collapsed.
Kay's most quoted dictum — "The best way to predict the future is to invent it" — reframed as a design obligation: the future we must invent is the future of maximum understanding, not maximum production.
The programming paradigm Alan Kay named and formalized in Smalltalk — code organized as communities of objects that communicate through messages, modeled on biological cells.
The peculiar pathology of AI-augmented work — compulsive engagement with a tool that is genuinely producing valuable output, indistinguishable from flow externally and catastrophically different internally.
Segal's metaphor — given thermodynamic grounding by Wiener's framework — for the 13.8-billion-year trajectory of anti-entropic pattern-creation through increasingly sophisticated channels, of which AI is the latest.
The device that increases the magnitude of whatever passes through it without evaluating the content — Wiener's framework for understanding AI as a tool that carries human signal, or human noise, with equal power and no judgment.
The Orange Pill's metaphor for the institutional work of redirecting the river of AI capability — not to stop the current but to shape what grows around it.
The uncomfortable fact that AI's benefits and costs do not distribute evenly across the population of affected workers — a Smithian question about institutions, not a technical question about tools.
Alan Kay's 1972 proposal for a portable personal computer for children — never about hardware, always about creating a medium for thought that would amplify understanding rather than merely deliver answers.
The Orange Pill's image for the set of professional and cultural assumptions so familiar they have become invisible — the water one breathes, the glass that shapes what one sees. A modern rendering of Smith's worry about the narrowing effe…
The economic regime that emerges when the cost of execution approaches zero and the premium on deciding what to execute rises correspondingly — the Smithian reading of the Orange Pill moment.
The political and emotional reaction against transformative technology on behalf of the workers and ways of life it displaces — historically vilified, increasingly reconsidered, and directly relevant to the AI transition.
Kay's insistence that the purpose of a computing medium is to transform the user's thinking, not to maximize production — and his charge that the AI industry has confused the two.
The threshold crossing after which the AI-augmented worker cannot return to the previous regime — The Orange Pill's central metaphor for the qualitative, irreversible shift in what a single person can build.
The question "what is a human being for?" — which Clarke predicted intelligent machines would force humanity to ask, and which arrived in 2022–2025 with more force and less philosophical preparation than he expected.
The tax every previous computer interface levied on every user — the cognitive overhead of converting human intention into machine-acceptable form. The tax natural language interfaces have abolished.
Maslow's reading of The Orange Pill's central question: worthiness is not a moral endowment but the developmental achievement of a person whose signal is shaped by B-values.
Neural networks trained on internet-scale text that have, since 2020, proven capable of producing human-like responses across nearly every written domain — the technology at the center of the Orange Pill Cycle.
The interface paradigm — inaugurated at scale by large language models in 2022–2025 — in which the user addresses the machine in unmodified human language and the machine responds in kind. The paradigm that abolished the translation cost.
Jeff Koons's mirror-polished stainless steel sculptures — five editions made between 1994 and 2000, one of which sold for $58.4 million in 2013 — invoked by Byung-Chul Han and The Orange Pill as the paradigmatic artifact of the aesthetics o…
Xingqi Maggie Ye and Aruna Ranganathan's 2026 Harvard Business Review ethnography of an AI-augmented workplace — the most rigorous empirical documentation to date of positive feedback dynamics in human-machine loops.