This page lists every Orange Pill Wiki entry hyperlinked from Rob Nixon — On AI. 29 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
The governing metaphor of The Orange Pill — AI as a signal-amplifier that carries whatever is fed into it further, with terrifying fidelity. Buber's framework extends the metaphor: the amplifier clarifies what was already there, which makes…
Disasters that unfold so slowly they are experienced as normal conditions rather than emergencies—lacking the temporal profile of crisis.
The condition in which the subject exploits herself and calls it freedom — the signature of the enterprise of the self, where the overseer's function is internalized as motivation.
The novel form of value capture operating at the heart of the AI economy — user interactions become training data that improve models owned by the center, replicating colonial extraction patterns in cognitive rather than material form.
The paradoxical condition in which sustained creative output is produced through mechanisms structurally identical to addiction—excellence that costs more than metrics measure.
The progressive decay of the capacity for sustained, unaided concentration that occurs when practitioners rely continuously on AI assistance — incremental, imperceptible, and grounded in the neuroscience of synaptic pruning.
The systematic reduction of worker skill requirements through technological design — not a side effect of automation but frequently its central purpose, documented by Noble across industrial automation and extended by this volume to knowle…
The form of understanding that lives in the body — deposited through habitual engagement with resistant materials, irreducible to propositional content, and constitutive of genuine expertise.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
The resistance AI tools eliminate from knowledge work — a category whose composition (wolf or parasite?) determines whether its elimination is liberation or erosion.
The reproduction of colonial geography in the AI transformation: protective institutions in the center, unconstrained commodification at the periphery — with peripheral economies absorbing the withdrawal of offshored cognitive work withou…
The widening gap between the speed at which an institution can adapt and the speed at which its environment is changing — the mechanism through which individual future shock compounds into systemic disorientation.
Harm that unfolds gradually and out of sight, dispersed across time and space, whose attritional character renders it invisible to systems calibrated for spectacular events.
The Berkeley researchers' term for the colonization of previously protected temporal spaces by AI-accelerated work — the mechanism through which the recovery windows of pre-AI workflows disappear.
The forcing of biological, ecological, or cognitive processes into tempos structurally incompatible with their developmental requirements—speed itself as a form of harm.
The act of bearing witness to slow violence—creating a counter-record preserving what dominant narratives cannot represent and institutional systems cannot measure.
The shared conditions—deep expertise, sustained attention, embodied knowledge—under which human understanding develops, now degraded by extraction without maintenance.
The distance between what a practitioner understands about a system and what the system requires her to understand when it fails — a gap that abstraction widens invisibly, that AI-generated code has made the widest in computing history, and…
The research tradition in the AI discourse organized around depth preservation — measuring progress by the maintenance of craft, embodied knowledge, and the formative friction of struggle, and identifying AI as a threat to the conditions …
Environmental politics rooted not in wilderness preservation but in defense of communities against extraction—Nixon's reframing of whose nature matters.
Nakamura's empirical finding that the transmission of standards — not knowledge, not technique — is the single most important function the mentor provides, and the function AI most thoroughly fails to replicate.
Robert Solow's 1987 observation — "You can see the computer age everywhere but in the productivity statistics" — which Brynjolfsson spent his career resolving into three distinct problems: timing, measurement, and organization.
The dual role of those documenting AI's effects from within—testifying to harm while producing it, a position both uniquely valuable and structurally compromised.
American Marxist economist (1920–1976) whose Labor and Monopoly Capital gave deskilling its canonical formulation and provided the theoretical foundation Noble built on.
Nigerian writer and activist (1941–1995) whose testimony against oil industry devastation of Ogoniland—and whose execution—exemplifies Nixon's writer-activist paradigm.
The 1960s–1970s global agricultural transformation — the closest historical parallel to the AI transition, demonstrating that transferable technology without institutional development concentrates benefits among the already-advantaged.
The February 2026 week-long training session in which Edo Segal flew to Trivandrum, India, to work alongside twenty of his engineers as they adopted Claude Code — producing the twenty-fold productivity multiplier documented in The Orange Pill…