This page lists every Orange Pill Wiki entry hyperlinked from Deirdre Barrett — On AI. 14 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; in each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words with the Wikipedia mark link to Wikipedia.
The engineering condition in which a measurement system produces inaccurate readings because its inputs fall outside its design range — applied to the builder's satisfaction system encountering AI-augmented work.
The sensitive-period process by which a developing nervous system establishes the baseline thresholds that will govern its reward evaluations for life — and the specific vulnerability of children growing up among supernormal stimuli.
Barrett's core prescription for supernormal-stimulus exploitation: restructure the stimulus landscape rather than demand individual willpower, because the regulatory mechanisms are outmatched by the stimulus they would need to override.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
Amotz Zahavi's evolutionary-biology principle that a communication is reliable precisely because it is costly to produce — the handicap that guarantees the signal cannot be cheaply faked.
The specific behavioral pattern — predicted by Barrett's framework and documented in viral partner-testimony of early 2026 — in which the builder's motivational system orients toward AI-augmented work while domestic life cools on the sand.
Wolfram Schultz's discovery that dopamine neurons encode the difference between expected and actual reward, not reward itself — the architecture that explains why AI-augmented work produces continuous anticipatory surges.
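The prediction-error idea this card describes has a standard computational form: the error is the actual reward minus the expected reward, and expectation adapts toward recent outcomes. A minimal sketch, assuming a Rescorla–Wagner-style update with illustrative values (the learning rate and reward numbers are not from the book):

```python
# Reward prediction error (RPE): delta = actual reward - expected reward.
# Rescorla-Wagner-style update; alpha and the reward sequences are illustrative.

def rpe_updates(rewards, alpha=0.3):
    """Return the prediction error at each step as expectation adapts."""
    expected = 0.0
    errors = []
    for r in rewards:
        delta = r - expected       # the dopamine-like error signal
        errors.append(delta)
        expected += alpha * delta  # expectation drifts toward the reward
    return errors

# A fully predictable reward drives the error toward zero over time...
steady = rpe_updates([1.0] * 20)
# ...while an unpredictable reward keeps producing nonzero surges,
# which is the pattern the card attributes to AI-augmented work.
variable = rpe_updates([0.0, 2.0, 0.0, 2.0, 0.0, 2.0])
```

The point of the contrast: with a constant reward the errors decay geometrically, so the signal habituates; with a variable reward the errors never settle, so the anticipatory signal keeps firing.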
Tinbergen's term for an artificial signal that exaggerates the features an evolved response system tracks, triggering a response more intense than any natural stimulus could produce.
The specific dopaminergic architecture — calibrated by hundreds of thousands of years of ancestral problem-solving — that AI-augmented work activates at a frequency the system was never designed to sustain.
Tinbergen's 1951 oystercatcher experiment — the canonical demonstration that an organism will abandon its viable offspring to incubate a supernormal object, and the founding image of Barrett's framework.
The specific neurological event — not a metaphor — by which the regulatory mechanisms that terminate appetitive behavior are outcompeted by a supernormal reward signal, producing the builder's inability to stop.
The dorsolateral prefrontal cortex's real-time suppression of emergent cognitive outputs that deviate from stored expectations — the mechanism that enables structured performance and that must temporarily relax for creative insight to surface.
The specific behavioral configuration — compulsive AI-augmented engagement experienced as exhilaration from within and pathology from without — produced by a reinforcing loop without a balancing counterpart.
Kent Berridge and Terry Robinson's neurological distinction between motivational drive and hedonic enjoyment — the dissociation that defines compulsion and that AI-augmented work produces at predictable intervals.