This page lists every Orange Pill Wiki entry hyperlinked from Andreas Wagner — On AI. 32 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored in orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
The problem Darwin could not solve: selection preserves the fit once it appears, but cannot explain how novel functional forms emerge from possibility spaces larger than the observable universe.
The Orange Pill claim — that AI tools lower the floor of who can build — submitted to Sen's framework, which asks the harder question: does formal access convert into substantive capability expansion?
The structural property of genotype networks by which different positions are adjacent to different alternative phenotypes — the mechanism that converts dispersal through possibility space into expanding access to innovation.
The discovery — which nobody predicted and no one fully explains — that large language models acquire qualitatively new abilities at particular scale thresholds. Reasoning, translation, code generation, in-context learning: none were trained…
A category of risk whose realization would either annihilate humanity or permanently and drastically curtail its potential. AI joined this category in mainstream academic usage in 2014.
Regions of a neural network's loss landscape where small perturbations to parameters do not significantly affect performance — the computational realization of Wagner's biological robustness, and the topological signature of exploratory po…
Wagner's discovery that biological possibility space is organized into vast, interconnected webs of functionally equivalent sequences — the architecture that makes innovation structurally accessible rather than improbable.
The directed optimization algorithm that navigates neural network parameter space by following the slope of the loss function — structurally unlike biological mutation, but whose trajectories nonetheless traverse the same kind of topological…
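The mechanism this card names can be stated in a few lines. The sketch below is illustrative only, not from the book: gradient descent on a one-dimensional quadratic loss, where each step moves the parameter downhill along the slope.

```python
# Minimal illustrative sketch (not from the book): gradient descent on the
# toy loss L(w) = (w - 3)^2, whose minimum sits at w = 3.
def grad(w):
    return 2 * (w - 3)  # derivative dL/dw

w = 0.0       # starting parameter value (arbitrary)
lr = 0.1      # learning rate (step size)
for _ in range(100):
    w -= lr * grad(w)  # step against the gradient, i.e. downhill

print(w)  # converges toward the minimum at w = 3
```

In a real network the same update runs over millions of parameters at once, and the loss surface is the high-dimensional landscape described in the neighboring cards.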
Wagner's mathematical claim that exploration through a sufficiently structured possibility space guarantees the encounter with novel phenotypes — innovation is not the exception but the rule.
The application of Wagner's biological robustness framework to human institutions — the capacity of an organization, profession, or civilization to absorb the perturbation of transformative technology without losing the capacities that make…
The high-dimensional surface defined by a neural network's training objective — the computational analog of biological fitness landscapes, whose topology determines which configurations are accessible through gradient descent.
The empirical discovery that distinct optima of a neural network are connected by continuous paths of low loss — the computational demonstration that parameter space has the same architecture Wagner mapped in biological sequence space.
The process by which organisms — or populations of explorers in any structured space — wander through functionally equivalent configurations, accumulating positional diversity that places them adjacent to innovations they could not reach from…
The historical pattern by which the same innovation emerges from multiple independent explorers in narrow time windows — the empirical signature of topology, demonstrating that possibility spaces channel exploration toward accessible innovations…
Segal's metaphor — given thermodynamic grounding by Wiener's framework — for the 13.8-billion-year trajectory of anti-entropic pattern-creation through increasingly sophisticated channels, of which AI is the latest.
Wagner's resolution of the engineering paradox that stability and flexibility seem to preclude each other — in structured possibility spaces, the architecture that enables one is the architecture that enables the other.
The empirical relationships that predict how a language model's loss decreases with training compute, parameters, and data — the most reliable quantitative instrument the AI field has, and the reason investors have been willing to fund ten-…
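The relationship this card describes is conventionally modeled as a power law. The sketch below is a hedged illustration with hypothetical constants (the functional form is the standard one in the scaling-law literature; the specific numbers are invented for the example, not taken from the book):

```python
# Hedged illustration (hypothetical constants): scaling laws model a language
# model's loss as a power law in training compute C,
#     L(C) = L_inf + A * C**(-alpha),
# where L_inf is an irreducible floor and alpha sets how fast loss falls.
def predicted_loss(C, L_inf=1.7, A=10.0, alpha=0.05):
    return L_inf + A * C ** (-alpha)

# Doubling compute buys a small but predictable loss reduction:
print(predicted_loss(1e21))
print(predicted_loss(2e21))
```

The predictive reliability of this curve, rather than any single constant in it, is what the card credits with unlocking large-scale investment.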
Wagner's 2023 framework for innovations that arrive before their environment is ready to receive them — functional capabilities that lie dormant, sometimes for decades or centuries, until the conditions for their activation converge.
The threshold crossing after which the AI-augmented worker cannot return to the previous regime — The Orange Pill's central metaphor for the qualitative, irreversible shift in what a single person can build.
The largest and most informed population in any AI-transitioning organization — the people who hold ambivalence accurately and are systematically silenced by environments that reward only the extremes.
The civilization-scale neutral network of citizens holding contradictory assessments of AI simultaneously — the largest, most diverse, and most consequential adaptive substrate in any technological transition.
The irreducibly human labor that the topology of possibility space cannot perform — the evaluation of which innovations to pursue, shelter, or constrain, and the construction of institutions capable of applying judgment at the pace that ac…
The structural geometry of any sufficiently large possibility space — the shape of the landscape through which exploration moves, which determines which innovations are accessible and in what order they will be encountered.
The specific, identifiable points at which Wagner's biological framework fails to map cleanly onto artificial intelligence — the disanalogies that constrain the framework's transfer and identify where biological insight must be supplemented…
Maslow's reading of The Orange Pill's central question: worthiness is not a moral endowment but the developmental achievement of a person whose signal is shaped by B-values.
Neural networks trained on internet-scale text that have, since 2020, demonstrated emergent linguistic and reasoning capabilities — in Whitehead's vocabulary, computational systems whose prehensions of the textual corpus vastly exceed any i…
The class of machine-learning architectures loosely modeled on biological neurons — the substrate of the current AI revolution and the opposite of Asimov's designed-then-programmed positronic brain.
The 2017 neural network architecture, built around self-attention, that replaced recurrent networks for sequence modeling and became the substrate of every large language model since.
The 2025–2026 phase transition in which AI-assisted software production costs crossed below the costs of maintaining legacy code, triggering a trillion-dollar repricing of the SaaS industry in months.
The February 2026 training session in which Edo Segal's twenty engineers in Trivandrum crossed the orange pill threshold and emerged as AI-augmented builders producing twenty-fold productivity gains — the founding empirical moment of The Orange Pill…