This page lists every Orange Pill Wiki entry hyperlinked from W. Brian Arthur — On AI. 28 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
The progressive shortening of the interval between a technology's introduction and its mass adoption — from 75 years for the telephone to two months for ChatGPT — and the step function AI represents in that pattern.
The regulatory, institutional, and normative arrangements governing AI development and deployment — reframed through Ostrom's framework as a polycentric governance challenge requiring coordination across multiple scales rather than the mark…
The gravitational well of increasing returns holding a paradigm in place—accumulated advantages creating switching costs so high that marginal improvements from challengers cannot escape the basin, requiring categorical disruption.
Arthur's thesis that technologies arise from combinations of existing technologies in a recursive process—more components enable more combinations, accelerating innovation through self-amplifying dynamics.
The competitive advantage that emerges when accumulated investments in data, integrations, talent, and process make switching prohibitively expensive — the durable moat that AI cannot replicate because it was built through time.
The narrow dynamical regime between rigid order and dissolving chaos where complex systems are most adaptive—ordered enough to maintain stable structures, fluid enough to reorganize when conditions demand, discovered through Santa Fe Instit…
The phenomenon by which complex properties arise from the interaction of simpler components and cannot be predicted from or reduced to those components alone — Sawyer's core explanatory mechanism for collaborative creativity, and the con…

Arthur's term for AI as cognitive capability residing outside human minds—'not housed internally in human workers but externally in the virtual economy's algorithms,' available on-demand, reshaping institutional dependence.
Arthur's foundational thesis that technology markets are governed not by diminishing returns but by positive feedback loops in which success breeds success—small early advantages compound into dominant, often irreversible, market positions.
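Arthur famously illustrated this dynamic with Polya-urn processes, in which each adoption raises the probability of the next adoption of the same technology. A minimal sketch (parameters illustrative, not drawn from the book):

```python
import random

def polya_urn(steps=10_000, seed=0):
    """Polya urn as a toy model of increasing returns: draw a ball,
    put it back along with one more of the same color. Each 'adoption'
    of technology A or B makes the next adoption of it more likely."""
    rng = random.Random(seed)
    a, b = 1, 1  # one early adopter of each technology
    for _ in range(steps):
        if rng.random() < a / (a + b):
            a += 1  # success breeds success for A
        else:
            b += 1  # ...or for B
    return a / (a + b)  # final market share of A

# Different random histories lock in to different, essentially permanent
# market shares: small early advantages compound rather than wash out.
shares = [round(polya_urn(seed=s), 3) for s in range(5)]
print(shares)
```

Within any single run the share stabilizes early and then barely moves, which is the "often irreversible" part of the thesis: the outcome is determined less by intrinsic merit than by the accidents of early adoption.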
The durability of physical and institutional substrate that outlasts the technologies it was designed to serve—conduits outlasting cables, pathways outlasting conduits, embedded assumptions outlasting everything.
The economic mechanism by which voluntary adoption becomes involuntary dependence through the accumulation of platform-specific investments — the subject of Shapiro's career-long investigation and the force now operating at unprecedented sp…
The architectural prescription — drawn from Perrow's later work and extended by AI safety researchers — that systems designed as loosely coupled modules with limited interaction pathways absorb failures that tightly integrated systems tran…
The economic phenomenon by which a good becomes more valuable as more people use it — formalized by Katz and Shapiro in 1985 and now the single most important concept for understanding AI platform market structure.
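In the Katz–Shapiro style of model, a consumer adopts when standalone value plus the network benefit from other adopters exceeds the price, so the market can settle into very different equilibria depending on expectations. A hedged sketch with made-up parameters (the function, values, and thresholds are illustrative, not from the 1985 paper):

```python
def adoption_equilibrium(n_seed, N=1000, v=0.003, price=1.2):
    """Iterate adoption best-responses to a fixed point.
    Consumer i has standalone taste r_i = i/N and gains network
    benefit v * adopters; they adopt iff r_i + v * adopters >= price.
    The equilibrium reached depends on the initial expected network."""
    adopters = n_seed
    while True:
        new = sum(1 for i in range(N) if i / N + v * adopters >= price)
        if new == adopters:
            return adopters
        adopters = new

low = adoption_equilibrium(n_seed=0)    # nobody expects a network -> nobody adopts
high = adoption_equilibrium(n_seed=200) # a modest expected base tips the whole market
print(low, high)  # prints: 0 1000
```

The same good supports both a zero-adoption equilibrium and a full-adoption one; which obtains is a coordination outcome, which is why expectations, seeding, and early momentum matter so much in AI platform competition.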
The principle that where you are constrains where you can go—the sequence of decisions already made narrows future options, producing outcomes rational actors would not choose if they could see the full trajectory.
The physicist's concept for discontinuous system reorganization — water to ice, coordination to judgment — that the Goldratt simulation uses to describe the AI moment's character.
The runaway dynamic in which a system's output feeds back as input and amplifies — the screech of the microphone, the cascade of hemorrhage, the grinding compulsion of the AI-augmented builder who cannot stop.
Arthur's framework for how technologies evolve from simple tools into civilizational substrates—acquiring layers, subsystems, institutional infrastructure, transforming from replacement to foundation for entire ways of living.
The total cost — financial, technical, cognitive, and relational — that a user must bear to move from one platform to another, and the specific economic quantity that converts competitive markets into platform-dependent ones.
The uncomfortable fact that AI's benefits and costs do not distribute evenly across the population of affected workers — a Smithian question about institutions, not a technical question about tools.
Robert Solow's 1987 observation — you can see the computer age everywhere except in the productivity statistics — which Brynjolfsson spent his career resolving into three distinct problems: timing, measurement, and organization.
Arthur's 2011 diagnosis of a vast digital substrate forming beneath the physical economy—'remotely executing and global, always on, endlessly configurable'—providing external intelligence that would rival physical production in scale.
The irreversible threshold in a positive-feedback system where the balance between competing alternatives snaps—not gradual shift but phase transition, after which the outcome is locked in.
Small cross-functional groups whose job is deciding what to build, not building it — Segal's organizational response to the separation of judgment from execution.
Markets where small performance differences produce disproportionate reward differences—characteristic of increasing-returns systems where positive feedback concentrates gains among a few participants while the rest compete for scraps.