This page lists every Orange Pill Wiki entry hyperlinked from Fred Brooks — On AI. 42 entries total. Each is a deeper dive on a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
Brooks's term for the complexity that arises not from the problem being solved but from the tools used to solve it — syntax, configuration, dependency management, build systems. The complexity AI eliminates more thoroughly than any previou…
Byung-Chul Han's diagnosis — extended through Dissanayake's biological framework — of the cultural dominance of frictionless surfaces and the specific reason the smooth feels biologically wrong.
The Orange Pill's thesis that AI does not eliminate difficulty but relocates it to a higher cognitive floor — the engineer who no longer struggles with syntax struggles instead with architecture.
Engelbart's foundational distinction: automation removes the human from the loop; augmentation redesigns the loop so the human's participation becomes more powerful. The most consequential design decision of the AI decade.
Fred Brooks's 1975 empirical observation that adding people to a late software project makes it later — because communication overhead scales as n(n-1)/2 while productive capacity grows linearly.
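The blurb above cites Brooks's formula for communication overhead. A minimal, illustrative sketch of the arithmetic (not from the source) shows why adding people helps linearly but costs quadratically:

```python
# Brooks's Law in numbers: pairwise communication channels grow as
# n(n-1)/2, while productive capacity grows only linearly with n.

def channels(n: int) -> int:
    """Pairwise communication channels in a team of n people."""
    return n * (n - 1) // 2

for n in (2, 5, 10, 20):
    # Capacity is idealized as one unit per person.
    print(f"team of {n:2d}: capacity {n:2d}, channels {channels(n):3d}")
```

Doubling a team of 10 to 20 doubles idealized capacity (10 to 20) but more than quadruples coordination channels (45 to 190), which is the structural reason a late project gets later.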
Brooks's claim that the most important quality of a software system is the coherence of a single design vision — what committees cannot produce, what teams compromise, and what AI-augmented solo building unexpectedly restores.
The Orange Pill claim — that AI tools lower the floor of who can build — submitted to Sen's framework, which asks the harder question: does formal access convert into substantive capability expansion?
Brooks's term for the complexity inherent in the problem — deciding what to build, understanding user needs, balancing requirements, ensuring correctness — which no tool can eliminate because it is not an artifact of the tools.
Aristotle's word for human flourishing — activity of the soul in accordance with virtue — and the standard against which the achievement society's confusion of productivity with the good life must be measured.
Segal's term for the gap between what a person can conceive and what they can produce — which AI collapsed to approximately the length of a conversation, and which Gopnik's framework reveals to be an exploitation metric that leaves the exp…
The maximum learning instrument, not the minimum product — a discipline of epistemic restraint now stripped of its economic rationale by AI and revealed in its essential form.
The productive cognitive resistance that arises when agents with different training and different frameworks must negotiate a shared understanding — irritating, slow, socially costly, and the primary mechanism through which distributed sy…
Brooks's 1975 prescription — 'plan to throw one away; you will, anyhow' — recognizing that the first implementation of any system is exploratory, a way of discovering what the system should actually do. AI has made throwing one away triviall…
The specific behavioral signature of AI-augmented work: compulsive engagement that the organism experiences as voluntary choice, with an output the culture cannot classify as problematic because it is productive.
Segal's metaphor — given thermodynamic grounding by Wiener's framework — for the 13.8-billion-year trajectory of anti-entropic pattern-creation through increasingly sophisticated channels, of which AI is the latest.
The Brooks volume's proposed corollary to the mythical man-month — the assumption that an AI tool substitutes for a human developer on a month-for-month basis, which is as mythical as the original man-month and for structurally analogous re…
The device that increases the magnitude of whatever passes through it without evaluating the content — Wiener's framework for understanding AI as a tool that carries human signal, or human noise, with equal power and no judgment.
The canonical example of allogenic ecosystem engineering — a structure that modulates rather than blocks the flow of its environment, creating the habitat pool in which diverse community life becomes possible.
Consciousness as a small flame in an infinite darkness — fragile, improbable, illuminating only a few inches beyond itself, and burning as the founding act of revolt.
The figure at the intersection of Segal's democratization narrative and Prahalad's access analysis — the builder whose capability has expanded dramatically and whose value-capture remains bounded by the institutional geography surrounding …
The uncomfortable fact that AI's benefits and costs do not distribute evenly across the population of affected workers — a Smithian question about institutions, not a technical question about tools.
The Orange Pill's image for the set of professional and cultural assumptions so familiar they have become invisible — the water one breathes, the glass that shapes what one sees. A modern rendering of Smith's worry about the narrowing effe…
The cycle of describe, generate, evaluate, refine that characterizes AI-augmented building — read through Tufte's framework as a small-multiples workflow operating in the temporal dimension.
Brooks's closing meditation in The Mythical Man-Month — the pleasures of making things, the fascination of complex structures, weighed against the obligation to meet others' specifications and the discovery of obsolescence upon completion.…
The threshold crossing after which the AI-augmented worker cannot return to the previous regime — The Orange Pill's central metaphor for the qualitative, irreversible shift in what a single person can build.
The claim — central to this book's reading of the Orange Pill — that the collapse of techne's cost reveals a deeper barrier that was always the harder problem: deciding what deserves to be built.
The question "what is a human being for?" — which Clarke predicted intelligent machines would force humanity to ask, and which arrived in 2022–2025 with more force and less philosophical preparation than he expected.
Brooks's name for the tendency of architects who have successfully designed one system to over-engineer the second, adding every feature they wished they had included the first time — producing a bloated design that collapses under its own …
The vast majority experiencing the full emotional complexity of the AI transition without a clean narrative to organize it — most accurate in perception, least audible in discourse.
The AI builder's experience of independence resting on structural dependence — the tenant-farmer of the knowledge economy, sovereign within conditions she does not own.
Brooks's 1975 organizational proposal for software development — a small team built around a chief programmer (the surgeon) supported by specialists — preserving the conceptual integrity of the single mind while providing the support that m…
Brooks's metaphor for large-system development — a project that seemed simple at the start becoming progressively more difficult as essential complexity reveals itself through implementation. AI has made the tar pit easier to step into with…
The tax every previous computer interface levied on every user — the cognitive overhead of converting human intention into machine-acceptable form. The tax natural language interfaces have abolished.
Anthropic's command-line coding agent — the specific product through which the coordination constraint shattered in the winter of 2025, reaching $2.5B run-rate revenue within months.
Neural networks trained on internet-scale text that have, since 2020, demonstrated emergent linguistic and reasoning capabilities — in Whitehead's vocabulary, computational systems whose prehensions of the textual corpus vastly exceed any i…
The interface paradigm — inaugurated at scale by large language models in 2022–2025 — in which the user addresses the machine in unmodified human language and the machine responds in kind. The paradigm that abolished the translation cost.
Brooks's 1986 essay arguing that no single technology would deliver order-of-magnitude improvements in software productivity within a decade — a prediction AI has partially refuted in letter but confirmed in spirit.
The methodological frame of the Fred Brooks — On AI volume — an Opus 4.6 attempt to simulate Brooks's pattern of thought after his 2022 death, applied to a transformation he did not live to analyze.
Fred Brooks's 1975 collection of essays distilling hard-won lessons from the IBM System/360 project into the foundational principles of software engineering — Brooks's Law, conceptual integrity, the tar pit, and the mythical man-month itse…
Edo Segal's 2026 book on the Claude Code moment — the empirical and narrative ground on which this Whitehead volume builds its philosophical reading.