This page lists every Orange Pill Wiki entry hyperlinked from Ilya Prigogine — On AI. 39 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored in orange link to other Orange Pill Wiki entries, while orange-underlined words with the Wikipedia mark link to Wikipedia.
The discipline of predicting when specific AI capabilities will arrive — a domain where Clarke's First Law applies cleanly: the distinguished elderly scientist who says X is impossible is, on the historical pattern, very probably wrong.
The Berkeley researchers' prescription for the AI-augmented workplace — structured pauses, sequenced workflows, protected human-only time, behavioral training alongside technical training — the operational counterpart to Maslach's fix-the-…
The Orange Pill's thesis that AI does not eliminate difficulty but relocates it to a higher cognitive floor — the engineer who no longer struggles with syntax struggles instead with architecture.
Prigogine's philosophical distinction between the classical worldview of static states and the creative worldview of irreversible processes — the deepest reframing of what the AI-collaborative builder is actually doing.
The specific thresholds at which far-from-equilibrium systems must choose between qualitatively different futures — the moments where determinism fails and small fluctuations determine macroscopic outcomes.
The institutional structures required to direct the AI surplus toward broadly shared welfare — infrastructure, education, labor market policy, governance of AI development, international coordination — built at the speed the transition dema…
Prigogine's Nobel-winning concept for open systems that maintain complex order by continuously processing energy flows far from equilibrium — the flame, the cell, the hurricane, and the builder at the terminal.
The discovery — which nobody predicted and no one fully explains — that large language models acquire qualitatively new abilities at particular scale thresholds. Reasoning, translation, code generation, in-context learning: none were traine…
The thermodynamic law that every dissipative structure maintains internal order only by exporting disorder to its environment — and the physics beneath the cost of creative work.
The thermodynamic regime beyond the critical threshold where linear dynamics fail and genuine novelty becomes possible — the only regime in which the interesting parts of the universe happen.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
The thermodynamic principle that near a bifurcation point, small perturbations produce disproportionate effects — the physics underneath why individual choices matter most at moments of maximum system instability.
The operational frame in which a human and an AI system share a workflow as partners with complementary capabilities — the alternative to both "AI as tool" and "AI as replacement."
Segal's term for the gap between what a person can conceive and what they can produce — which AI collapsed to approximately the length of a conversation, and which Gopnik's framework reveals to be an exploitation metric that leaves the exp…
Edo Segal's name for the simultaneous experience of exhilaration and fragility that accompanies the orange pill moment — grounded by Prigogine's physics in the structural properties of far-from-equilibrium systems.
Segal's metaphor — given thermodynamic grounding by Wiener's framework — for the 13.8-billion-year trajectory of anti-entropic pattern-creation through increasingly sophisticated channels, of which AI is the latest.
The spontaneous emergence of complex global order from local interactions in systems driven far from equilibrium — the mechanism that produced cells, brains, civilizations, and now the internal representations of large language models.
The thermodynamic translation of Segal's beaver metaphor — the ongoing practice of building robust structures rather than optimal ones, maintained through continuous attention rather than one-time construction.
The Berkeley researchers' term for the colonization of previously protected temporal spaces by AI-accelerated work — the mechanism through which the recovery windows of pre-AI workflows disappear.
Prigogine's insistence that irreversibility is not a subjective illusion produced by macroscopic coarseness but a fundamental feature of physical reality — and the diagnostic that distinguishes human creativity from machine computation.
The canonical example of allogenic ecosystem engineering — a structure that modulates rather than blocks the flow of its environment, creating the habitat pool in which diverse community life becomes possible.
Byung-Chul Han's 2010 diagnosis of the achievement-driven self-exploitation that has replaced disciplinary control as the dominant mode of power — and, in cybernetic terms, a social system operating in positive feedback.
Prigogine's radical argument that the future of complex systems is not merely unknown but unknowable — and the thermodynamic challenge to every confident prediction about AI's trajectory.
The Orange Pill's image for the set of professional and cultural assumptions so familiar they have become invisible — the water one breathes, the glass that shapes what one sees. A modern rendering of Smith's worry about the narrowing effe…
The political and emotional reaction against transformative technology on behalf of the workers and ways of life it displaces — historically vilified, increasingly reconsidered, and directly relevant to the AI transition.
The ecological principle — foundational to Jones's framework and routinely ignored by organizational AI deployment — that the engineer's obligation is not discharged by construction; it persists as long as the community depends on the engi…
The threshold crossing after which the AI-augmented worker cannot return to the previous regime — The Orange Pill's central metaphor for the qualitative, irreversible shift in what a single person can build.
Edo Segal's twenty-fold multiplier from Trivandrum — received by the culture with the reverence a quantitative civilization reserves for quantitative claims, and the archetypal thin description of a transformation whose meaning lives elsew…
The critical rate above which a far-from-equilibrium system's energy throughput overwhelms its organizational capacity — the thermodynamic name for the transition from productive intensity to destructive chaos.
The physical impossibility of returning a far-from-equilibrium system to its pre-transition state — the principle that makes orange pill recognition permanent rather than reversible.
Anthropic's command-line coding agent — the specific product through which the coordination constraint shattered in the winter of 2025, reaching $2.5B run-rate revenue within months.
Neural networks trained on internet-scale text that have, since 2020, demonstrated emergent linguistic and reasoning capabilities — in Whitehead's vocabulary, computational systems whose prehensions of the textual corpus vastly exceed any i…
The 15th-century invention — Gutenberg's movable type — that Gopnik, Farrell, Shalizi, and Evans identify as the single most illuminating historical analog for understanding what large language models actually are.
Ye and Ranganathan's 2026 Harvard Business Review ethnography of AI in an organization — the empirical documentation of task seepage and work intensification that prospect theory predicts.
Edo Segal's 2026 book on the Claude Code moment — the empirical and narrative ground on which this Whitehead volume builds its philosophical reading.
Korean-German philosopher (b. 1959) whose diagnoses of smoothness, transparency, and achievement society provide the critical idiom within which Groys's AI analysis operates — and against which Groys's emphasis on institutional frame offers…
Builder, entrepreneur, and author of The Orange Pill — whose human-AI collaboration with Claude, described in that book and extended in this volume, provides the empirical ground for the Whiteheadian reading.
The early 2026 repricing event in which a trillion dollars of market value vanished from SaaS companies — the critical-stage moment when AI's displacement of software's code value became visible to markets.
The February 2026 week-long training session in which Edo Segal flew to Trivandrum, India, to work alongside twenty of his engineers as they adopted Claude Code — producing the twenty-fold productivity multiplier documented in The Orange Pill…