Joel Spolsky — On AI — Wiki Companion

Joel Spolsky — On AI

A reading-companion catalog of the 20 Orange Pill Wiki entries linked from this book — the people, ideas, works, and events that Joel Spolsky — On AI uses as stepping stones for thinking through the AI revolution.

Each entry is a deeper dive on a person, concept, work, event, or technology from the book. Within an entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.

Concept (14)
AI-Generated Code as Ultimate Abstraction
Concept

The class of software produced when a developer describes intent in natural language and a language model returns a working implementation across the full technology stack — the most powerful abstraction ever built, and the one whose structur…

Borrowed Competence
Concept

The name Spolsky's framework gives to the confidence that abstraction provides — real while the abstraction holds, owed with interest when it fails — and the specific form of professional fluency that AI-era developers are accumulating without the…

Controlled Friction as Engineering Practice
Concept

The deliberate introduction of friction-rich, AI-free work into an otherwise AI-augmented workflow — not as Luddite theater but as training, modeled on aviation's mandatory hand-flying hours, designed to maintain the diagnostic strata that …

Institutional Memory Preservation
Concept

The practice of building mechanisms to transfer tacit diagnostic knowledge from retiring senior engineers to subsequent generations before the retirement consumes the knowledge — modeled on the nuclear weapons complex's response to the same…

Leak Detection Testing
Concept

Testing regimes designed specifically to find the places where AI-generated code is most likely to fail — concurrency, integration boundaries, failure injection, current-threat security scanning — before production conditions force the disc…
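The concurrency failure class named here can be made concrete with a toy example. The sketch below is illustrative only (the account class, function names, and thread counts are invented for this page, not taken from the book): a leak-detection style test hammers a non-atomic read-modify-write from several threads at once, so that lost updates can surface as a violated invariant under load instead of waiting for production traffic to reveal them.

```python
import threading

# Hypothetical generated handler: a non-atomic read-modify-write.
# A thread switch between the read and the write loses an update.
class Account:
    def __init__(self):
        self.balance = 0

    def credit(self, amount):
        current = self.balance            # read
        self.balance = current + amount   # write (race window in between)

def _hammer(account, iterations):
    for _ in range(iterations):
        account.credit(1)

def detect_lost_updates(workers=8, per_worker=10_000):
    """Drive the handler concurrently and return (expected, observed) totals.

    A leak-detection test asserts the invariant expected == observed;
    a shortfall is the signature of the race that unit tests never see.
    """
    account = Account()
    threads = [threading.Thread(target=_hammer, args=(account, per_worker))
               for _ in range(workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return workers * per_worker, account.balance
```

Note the observed total can equal the expected total on a lucky run; such tests are typically run repeatedly or under a thread-switch stressor, since the point is to make the rare interleaving likely rather than to prove its absence.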

Normalization of Deviance
Concept

Diane Vaughan's four-phase institutional process — observation, assessment, normalization, baseline shift — by which anomalies become routine and the standards that would have caught them erode incrementally, invisibly, and without any sing…

Tacit Knowledge
Concept

Michael Polanyi's name for knowledge the knower cannot fully articulate — distributed across cognitive, perceptual, and motor systems — and precisely what no external technology can carry.

The Abstraction Sequence of Computing
Concept

The layered history of computing understood as a staircase of successive concealments — machine code hidden by assembly, assembly by high-level languages, storage mechanics by SQL, HTTP by frameworks, servers by the cloud — each step liftin…

The Diagnostic Gap
Concept

The distance between what a practitioner understands about a system and what the system requires her to understand when it fails — a gap that abstraction widens invisibly, that AI-generated code has made the widest in computing history, and…

The Elevator and the Staircase
Concept

The load-bearing metaphor through which Spolsky's framework reads AI-generated code: the elevator is the natural-language interface that carries the developer from lobby to penthouse in a single ride, magnificent until it stops between floo…

The Integration Leak
Concept

The most consequential and hardest-to-diagnose class of failure in AI-generated systems — not a bug in any component but a mismatch between assumptions that components make about each other, embedded in generated code, implicit and unrecove…

The Joel Test and Its AI-Era Successor
Concept

Spolsky's 2000 twelve-question checklist for evaluating software team quality — blunt, binary, deliberately oversimplified — and the five-question AI-era successor this volume proposes, designed to make visible the diagnostic capability tha…

The Law of Leaky Abstractions
Concept

Spolsky's 2002 thesis that all non-trivial abstractions, to some degree, are leaky — the structural observation that every layer designed to hide complexity will eventually fail to hide it, forcing the user to understand the very thing the …

The Training Data Question
Concept

The governance regime change in which the accumulated textual, visual, and computational output of millions of individuals was appropriated for AI training under terms their original contribution did not contemplate — the paradigmatic case …

Person (2)
Edo Segal
Person

Serial entrepreneur and technologist whose book The Orange Pill (2026) provides the phenomenological account — the confession over the Atlantic — that Pang's framework diagnoses and treats.

Joel Spolsky
Person

American software developer, writer, and entrepreneur (b. 1965) whose Joel on Software blog, the Law of Leaky Abstractions, and the co-founding of Stack Overflow produced a body of practitioner-driven software criticism whose frameworks a…

Event (3)
The Eight-Month Fintech Leak
Event

The paradigmatic integration leak case of this volume — a three-person payment processing startup whose fully AI-generated backend ran flawlessly for eight months before a race condition in webhook processing began producing duplicate charg…
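The duplicate-charge failure described here is the classic idempotency race. As an illustrative sketch only (the event IDs, function names, and in-memory store below are invented for this page, not drawn from the book's case), a webhook handler can record processed event IDs and ignore redeliveries; the check-and-record step must itself be atomic, or the same race condition the case describes simply reappears one layer down.

```python
import threading

# Illustrative sketch: guard a webhook handler against duplicate delivery.
# Payment providers may deliver one event several times; without an
# idempotency check, each redelivery triggers another charge.
_processed = set()          # a production system would use a persistent store
_lock = threading.Lock()    # check-and-record must be atomic, or the race returns

def handle_payment_webhook(event_id, charge_fn):
    """Run charge_fn at most once per event_id, even under redelivery."""
    with _lock:
        if event_id in _processed:
            return "duplicate-ignored"
        _processed.add(event_id)
    charge_fn()
    return "charged"
```

The design choice worth noting: the lock covers only the membership test and the insert, not the charge itself, so slow payment calls do not serialize all webhook traffic while the deduplication stays race-free.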

The Trivandrum Training
Event

The February 2026 week-long training session in which Edo Segal flew to Trivandrum, India, to work alongside twenty of his engineers as they adopted Claude Code — producing the twenty-fold productivity multiplier documented in The Orange Pill…

Y2K Remediation as Diagnostic Precedent
Event

The $300 billion global effort to fix two-digit year fields in software whose original architects had retired, died, or forgotten — the canonical demonstration that diagnostic capability is an organizational resource that atrophies invisibl…

Organization (1)
Stack Overflow as Collective Diagnostic Memory
Organization

The programming Q&A platform Spolsky co-founded with Jeff Atwood in 2008 — the largest repository of programming knowledge in history, which became the collective diagnostic memory of the software profession and whose absorption into AI t…

Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.