This page lists every Orange Pill Wiki entry hyperlinked from Terrence Deacon — On AI. 31 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words marked with the Wikipedia symbol link to Wikipedia.
The defining features of life and mind—function, purpose, meaning—constituted by orientation toward what is not present, what is missing, what matters.
Byung-Chul Han's diagnosis of the cultural trajectory toward frictionlessness — a smoothness that conceals the labor and struggle that gave previous work its depth.
The governing metaphor of The Orange Pill — AI as a signal-amplifier that carries whatever is fed into it further, with terrifying fidelity. Buber's framework extends the metaphor: the amplifier clarifies what was already there, which makes…
The Orange Pill's thesis that AI does not eliminate difficulty but relocates it to a higher cognitive floor — the engineer who no longer struggles with syntax struggles instead with architecture.

The evolutionary mechanism by which the enormous cognitive effort of early symbolic communication became automatic as the brain reorganized across generations to support it efficiently.
The reciprocal shaping process—language selected for neural reorganization, reorganized brains enabled complex language—that built the symbolic species across hundreds of thousands of years.
The quality of subjective experience — being aware, being something it is like to be — and the single deepest unanswered question in both philosophy of mind and AI.
The productive struggle—applying rules to resistant situations, feeling the weight of judgment—through which practitioners build the embodied, perceptual expertise that AI's efficiency eliminates.
The research tradition — converging from neuroscience, philosophy, and robotics — holding that mind is not separable from body, and whose empirical maturity over four decades has made the computational theory of mind increasingly hard to defend.
The phenomenon by which complex properties arise from the interaction of simpler components and cannot be predicted from or reduced to those components alone — Sawyer's core explanatory mechanism for collaborative creativity, and the con…
The second law of thermodynamics' universal tendency toward disorder — Wiener's fundamental antagonist, the force against which every act of intelligence is a local and temporary resistance.
Peirce's three-level semiotic hierarchy—resemblance, correlation, convention—that Deacon extended into a theory of cognitive phase transitions and the architecture of meaning.
The foundational thesis of Wittgenstein's later philosophy — for a large class of cases, the meaning of a word is its use in the language — and the philosophical axis on which the AI language moment turns.
The second level of Deacon's emergent hierarchy—pattern and regularity arising from thermodynamic dissipation—the whirlpool, the snowflake, the convection cell.
The physicist's concept for discontinuous system reorganization — water to ice, coordination to judgment — that the Goldratt simulation uses to describe the AI moment's character.
The brain's longest-running construction project — continuing into the mid-twenties — during which the regulatory architecture that governs impulse, judgment, and sustained effort is built.
Segal's metaphor — given thermodynamic grounding by Wiener's framework — for the 13.8-billion-year trajectory of anti-entropic pattern-creation through increasingly sophisticated channels, of which AI is the latest.
The spontaneous emergence of order in systems operating at the edge of chaos — neither so ordered that nothing can change nor so random that nothing can persist, but in the narrow zone where complex patterns hold just long enough to build …
The erosion of meaning's layered architecture when AI-mediated workflows bypass the indexical stratum—producing symbols that float without grounding.
Harnad's 1990 challenge—how do symbols acquire meaning?—that Deacon extended: grounding requires the full semiotic hierarchy, not just sensorimotor association.
Deacon's term for self-maintaining, purposively oriented systems—the third level of emergent dynamics—that exhibit genuine function, meaning, and the orientation toward what is absent.
The structural inversion the AI transition produces — when building becomes easy, scarcity migrates from execution to the capacity to decide what deserves to be built.
The structural reframing that reads the large language model's training corpus through the lens of Spivak's analysis of the colonial archive — an apparently comprehensive record whose categories enact the exclusions they claim to overcome.
The first leg of Goldberg's executive tripod — the capacity to hold multiple elements in active consciousness simultaneously, whose four-to-five-item limit constitutes the fundamental bottleneck on creative coordination.
Anthropic's command-line coding agent — the specific product through which the coordination constraint shattered in the winter of 2025, reaching $2.5B run-rate revenue within months.
Neural networks trained on internet-scale text that have, since 2020, demonstrated emergent linguistic and reasoning capabilities — in Whitehead's vocabulary, computational systems whose prehensions of the textual corpus vastly exceed any i…
Deacon's 2012 theory of emergence—how mind emerged from matter through hierarchical constraints, and why the most important properties are defined by absence.
Xingqi Maggie Ye and Aruna Ranganathan's 2026 Harvard Business Review ethnography of an AI-augmented workplace — the most rigorous empirical documentation to date of positive feedback dynamics in human-machine loops.
Korean-German philosopher (b. 1959) whose diagnoses of the smoothness society and the burnout society anticipated the pathologies of AI-augmented work with unsettling precision.
Serial entrepreneur and technologist whose The Orange Pill (2026) provides the phenomenological account — the confession over the Atlantic — that Pang's framework diagnoses and treats.
American biological anthropologist and neuroscientist (b. 1950) whose The Symbolic Species inverted the standard story of language origins—and whose semiotic framework diagnoses what AI does to human cognition.