This page lists every Orange Pill Wiki entry hyperlinked from Christopher Alexander — On AI. 39 entries total. Each is a deeper dive on a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
Byung-Chul Han's diagnosis, taken up in both The Orange Pill and this book, of the cultural trajectory toward frictionlessness: a smoothness that conceals the labor, struggle, and developmental process that gave work its depth.
The Orange Pill's thesis that AI does not eliminate difficulty but relocates it to a higher cognitive floor — the engineer who no longer struggles with syntax struggles instead with architecture.
The principle — defended by Wiener at considerable personal cost — that the creators of powerful systems bear moral responsibility for what those systems do after deployment, and that the claim of value-neutral research is a fiction that tr…
The collapse of the skill-obsolescence cycle from decades to months — and the resulting breakdown of the sequential grief-learning-rebuilding process that the human psyche requires to adapt.
Ericsson's empirically established mechanism for building expertise — effortful, targeted engagement at the boundary of capability, guided by specific feedback and sustained over thousands of hours.
The Orange Pill claim — that AI tools lower the floor of who can build — submitted to Sen's framework, which asks the harder question: does formal access convert into substantive capability expansion?
Leopold's term for the cultivated capacity to read a landscape — to perceive pattern, relationship, and symptom in the specific configuration of what is present and what is absent. The meta-skill that all other stewardship skills depend on.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
Toffler's 1970 diagnosis of the psychophysiological stress produced when human beings encounter more change than they can process — not the content of any particular change, but the pace itself.
The layered, embodied form of knowledge that accumulates in a practitioner through years of focal engagement with her material — too slow to notice day-to-day, too deep to transmit by documentation, and invisible to every metric the device …
Segal's term for the gap between what a person can conceive and what they can produce — which AI collapsed to approximately the length of a conversation, and which Gopnik's framework reveals to be an exploitation metric that leaves the exp…
The widening gap between the speed at which an institution can adapt and the speed at which its environment is changing — the mechanism through which individual future shock compounds into systemic disorientation.
Alexander's theoretical term for the class of structures — physical, social, computational — that exhibit the properties of life: coherence, adaptation, wholeness, and the capacity to support flourishing.
Alexander's lifelong thesis that the people who inhabit a space should be the ones who shape it — a claim that frames AI's collapse of the expert-layperson gap as the realization of a fifty-year argument.
Alexander's 1977 framework of 253 interconnected generative patterns that allow ordinary inhabitants — not credentialed professionals — to design living environments.
The Orange Pill's term for compulsive engagement with generative tools — re-specified by the Skinner volume not as metaphor but as the precise behavioral signature of a continuous reinforcement schedule without an extinction point.
Segal's metaphor — given thermodynamic grounding by Wiener's framework — for the 13.8-billion-year trajectory of anti-entropic pattern-creation through increasingly sophisticated channels, of which AI is the latest.
Alexander's diagnostic category for artifacts that are structurally sound, aesthetically competent, and yet lack the property of life — the condition AI-generated output exhibits at unprecedented scale.
The device that increases the magnitude of whatever passes through it without evaluating the content — Wiener's framework for understanding AI as a tool that carries human signal, or human noise, with equal power and no judgment.
The Orange Pill's metaphor for the institutional work of redirecting the river of AI capability — not to stop the current but to shape what grows around it.
Consciousness as a small flame in an infinite darkness — fragile, improbable, illuminating only a few inches beyond itself, and burning as the founding act of revolt.
The uncomfortable fact that AI's benefits and costs do not distribute evenly across the population of affected workers — a Smithian question about institutions, not a technical question about tools.
Alexander's catalogue of the structural features — from levels of scale to not-separateness — whose joint presence distinguishes living structure from dead form.
The Orange Pill's image for the set of professional and cultural assumptions so familiar they have become invisible — the water one breathes, the glass that shapes what one sees. A modern rendering of Smith's worry about the narrowing effects of the division of labor.
The characteristic rhetorical move by which established professions defend their jurisdiction against new entrants: the insistence that legitimate practice requires the specific knowledge the profession has historically gated.
The economic regime that emerges when the cost of execution approaches zero and the premium on deciding what to execute rises correspondingly — the Smithian reading of the Orange Pill moment.
The political and emotional reaction against transformative technology on behalf of the workers and ways of life it displaces — historically vilified, increasingly reconsidered, and directly relevant to the AI transition.
The threshold crossing after which the AI-augmented worker cannot return to the previous regime — The Orange Pill's central metaphor for the qualitative, irreversible shift in what a single person can build.
The question "what is a human being for?" — which Clarke predicted intelligent machines would force humanity to ask, and which arrived in 2022–2025 with more force and less philosophical preparation than he expected.
Alexander's name for the ineffable property that makes spaces, objects, and systems feel alive rather than merely functional — the quality AI tools can approximate but not originate.
The vast majority experiencing the full emotional complexity of the AI transition without a clean narrative to organize it — most accurate in perception, least audible in discourse.
The tax every previous computer interface levied on every user — the cognitive overhead of converting human intention into machine-acceptable form. The tax natural language interfaces have abolished.
Alexander's name for the step-by-step generative procedure through which living structure emerges — each step preserving and enhancing the wholeness of the whole.
Alexander's technical term for the class of structural changes that maintain and enhance the existing life of a system rather than destroying it — the discipline the AI builder must import into fast-generation workflows.
Maslow's reading of The Orange Pill's central question: worthiness is not a moral endowment but the developmental achievement of a person whose signal is shaped by B-values.
The interface paradigm — inaugurated at scale by large language models in 2022–2025 — in which the user addresses the machine in unmodified human language and the machine responds in kind. The paradigm that abolished the translation cost.
The 15th-century invention — Gutenberg's movable type — that Gopnik, Farrell, Shalizi, and Evans identify as the single most illuminating historical analog for understanding what large language models actually are.
The early 2026 repricing event in which a trillion dollars of market value vanished from SaaS companies — the critical-stage moment when AI's displacement of software's code value became visible to markets.
The February 2026 training session in which Edo Segal's twenty engineers in Trivandrum crossed the orange pill threshold and emerged as AI-augmented builders producing twenty-fold productivity gains — the founding empirical moment of The Orange Pill.