This page lists every Orange Pill Wiki entry hyperlinked from Joel Spolsky — On AI. 20 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
The class of software produced when a developer describes intent in natural language and a language model returns a working implementation across the full technology stack — the most powerful abstraction ever built, and the one whose structur…
The name Spolsky's framework gives to the confidence that abstraction provides — real while the abstraction holds, owed with interest when it fails — and the specific form of professional fluency that AI-era developers are accumulating without the…
The deliberate introduction of friction-rich, AI-free work into an otherwise AI-augmented workflow — not as Luddite theater but as training, modeled on aviation's mandatory hand-flying hours, designed to maintain the diagnostic strata that …
The practice of building mechanisms to transfer tacit diagnostic knowledge from retiring senior engineers to subsequent generations before retirement takes that knowledge with it — modeled on the nuclear weapons complex's response to the same…
Testing regimes designed specifically to find the places where AI-generated code is most likely to fail — concurrency, integration boundaries, failure injection, current-threat security scanning — before production conditions force the disc…
Diane Vaughan's four-phase institutional process — observation, assessment, normalization, baseline shift — by which anomalies become routine and the standards that would have caught them erode incrementally, invisibly, and without any sing…
Michael Polanyi's name for knowledge the knower cannot fully articulate — distributed across cognitive, perceptual, and motor systems — and precisely what no external technology can carry.
The layered history of computing understood as a staircase of successive concealments — machine code hidden by assembly, assembly by high-level languages, storage mechanics by SQL, HTTP by frameworks, servers by the cloud — each step liftin…
The distance between what a practitioner understands about a system and what the system requires her to understand when it fails — a gap that abstraction widens invisibly, that AI-generated code has made the widest in computing history, and…
The load-bearing metaphor through which Spolsky's framework reads AI-generated code: the elevator is the natural-language interface that carries the developer from lobby to penthouse in a single ride, magnificent until it stops between floo…
The most consequential and hardest-to-diagnose class of failure in AI-generated systems — not a bug in any component but a mismatch between assumptions that components make about each other, embedded in generated code, implicit and unrecove…
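The failure mode this entry names can be sketched with a hypothetical pair of components (the `cents` vs. `dollars` mismatch below is an illustration, not an example from the entry): each unit is individually correct, and the defect exists only in the unstated assumption one makes about the other.

```python
# Component A (imagine it was generated in one session):
# emits monetary amounts as integer cents.
def build_invoice(total_cents: int) -> dict:
    return {"amount": total_cents, "currency": "USD"}

# Component B (imagine it was generated in a separate session):
# silently assumes "amount" is a dollar figure.
def format_charge(invoice: dict) -> str:
    return f"${invoice['amount']:.2f}"

# Both functions pass their unit tests; the system is wrong by 100x.
print(format_charge(build_invoice(1999)))  # "$1999.00", not "$19.99"
```

No line of this code contains a bug a reviewer could point to; the mismatch lives between the components, which is why the entry calls it the hardest class of failure to diagnose.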
Spolsky's 2000 twelve-question checklist for evaluating software team quality — blunt, binary, deliberately oversimplified — and the five-question AI-era successor this volume proposes, designed to make visible the diagnostic capability tha…
Spolsky's 2002 thesis that all non-trivial abstractions, to some degree, are leaky — the structural observation that every layer designed to hide complexity will eventually fail to hide it, forcing the user to understand the very thing the …
The governance regime change in which the accumulated textual, visual, and computational output of millions of individuals was appropriated for AI training under terms their original contribution did not contemplate — the paradigmatic case …
Serial entrepreneur and technologist whose book The Orange Pill (2026) provides the phenomenological account — the confession over the Atlantic — that Pang's framework diagnoses and treats.
American software developer, writer, and entrepreneur (b. 1965) whose Joel on Software blog, the Law of Leaky Abstractions, and the co-founding of Stack Overflow produced a body of practitioner-driven software criticism whose frameworks a…
The paradigmatic integration leak case of this volume — a three-person payment processing startup whose fully AI-generated backend ran flawlessly for eight months before a race condition in webhook processing began producing duplicate charg…
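The webhook race the entry describes follows a classic check-then-act pattern: two concurrent deliveries of the same event both pass the "already processed?" check before either records itself. A minimal sketch of the fix (an in-memory idempotency set guarded by a lock; a real system would use a database uniqueness constraint):

```python
import threading

charges = []        # the side effect that must not be duplicated
_processed = set()  # idempotency record
_lock = threading.Lock()

def handle_webhook(event_id: str) -> None:
    """Idempotent handler: the check-and-charge step is atomic, so
    redelivered or concurrently delivered events charge exactly once."""
    with _lock:
        if event_id in _processed:
            return              # duplicate delivery: ignore
        _processed.add(event_id)
        charges.append(event_id)  # "charge the customer"

# Simulate the payment provider redelivering one event 20 times at once.
threads = [threading.Thread(target=handle_webhook, args=("evt_1",))
           for _ in range(20)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(charges))  # 1
```

Remove the lock and the duplicate-charge window reopens, but only under concurrent delivery — which is why, as the entry notes, the flaw can sit dormant in production for months.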
The February 2026 week-long training session in which Edo Segal flew to Trivandrum, India, to work alongside twenty of his engineers as they adopted Claude Code — producing the twenty-fold productivity multiplier documented in The Orange Pill…
The $300 billion global effort to fix two-digit year fields in software whose original architects had retired, died, or forgotten — the canonical demonstration that diagnostic capability is an organizational resource that atrophies invisibl…
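The underlying defect is easy to reproduce: a two-digit year field is ambiguous, so any consumer must guess the century, and raw string comparison orders the years wrongly across the boundary. A small illustration using Python's parser, which resolves the ambiguity by the documented POSIX convention (69–99 → 19xx, 00–68 → 20xx):

```python
from datetime import datetime

# The parser must pick a century for a two-digit year.
print(datetime.strptime("99-12-31", "%y-%m-%d").year)  # 1999
print(datetime.strptime("00-01-01", "%y-%m-%d").year)  # 2000

# Naive comparison of the raw two-digit fields gets the order backwards:
print("00" > "99")  # False, although year 2000 follows year 1999
```

Code like the second comparison, written decades earlier by authors no longer available to explain it, is what made the remediation a diagnostic problem rather than a mere search-and-replace.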