WIKI COMPANION

W. Brian Arthur — On AI

A reading-companion catalog of the 28 Orange Pill Wiki entries linked from this book — the people, ideas, works, and events that W. Brian Arthur — On AI uses as stepping stones for thinking through the AI revolution.

Each entry is a deeper dive on a person, concept, work, event, or technology that the book treats as a stepping stone. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.

Concept (24)
Adoption Curve Compression

The progressive shortening of the interval between a technology's introduction and its mass adoption — from 75 years for the telephone to two months for ChatGPT — and the step function AI represents in the pattern.

AI Governance (Ostromian Reading)

The regulatory, institutional, and normative arrangements governing AI development and deployment — reframed through Ostrom's framework as a polycentric governance challenge requiring coordination across multiple scales rather than the mark…

Basin of Attraction

The gravitational well of increasing returns holding a paradigm in place—accumulated advantages creating switching costs so high that marginal improvements from challengers cannot escape the basin, requiring categorical disruption.

Combinatorial Innovation

Arthur's thesis that technologies arise from combinations of existing technologies in a recursive process—more components enable more combinations, accelerating innovation through self-amplifying dynamics.

Ecosystem Lock-In

The competitive advantage that emerges when accumulated investments in data, integrations, talent, and process make switching prohibitively expensive — the durable moat that AI cannot replicate because it was built through time.

Edge of Chaos

The narrow dynamical regime between rigid order and dissolving chaos where complex systems are most adaptive—ordered enough to maintain stable structures, fluid enough to reorganize when conditions demand, discovered through Santa Fe Instit…

Emergence

The phenomenon by which complex properties arise from the interaction of simpler components and cannot be predicted from or reduced to those components alone — Sawyer's core explanatory mechanism for collaborative creativity, and the con…

External Intelligence

Arthur's term for AI as cognitive capability residing outside human minds—'not housed internally in human workers but externally in the virtual economy's algorithms,' available on-demand, reshaping institutional dependence.

Increasing Returns

Arthur's foundational thesis that technology markets are governed not by diminishing returns but by positive feedback loops in which success breeds success—small early advantages compound into dominant, often irreversible, market positions.

Infrastructure Persistence

The durability of physical and institutional substrate that outlasts the technologies it was designed to serve—conduits outlasting cables, pathways outlasting conduits, embedded assumptions outlasting everything.

Lock-In (Shapiro Framework)

The economic mechanism by which voluntary adoption becomes involuntary dependence through the accumulation of platform-specific investments — the subject of Shapiro's career-long investigation and the force now operating at unprecedented sp…

Modularity Principle

The architectural prescription — drawn from Perrow's later work and extended by AI safety researchers — that systems designed as loosely coupled modules with limited interaction pathways absorb failures that tightly integrated systems tran…

Network Effects

The economic phenomenon by which a good becomes more valuable as more people use it — formalized by Katz and Shapiro in 1985 and now the single most important concept for understanding AI platform market structure.

Path Dependence

The principle that where you are constrains where you can go—the sequence of decisions already made narrows future options, producing outcomes rational actors would not choose if they could see the full trajectory.

Phase Transition

The physicist's concept for discontinuous system reorganization — water to ice, coordination to judgment — that the Goldratt simulation uses to describe the AI moment's character.

Positive Feedback

The runaway dynamic in which a system's output feeds back as input and amplifies — the screech of the microphone, the cascade of hemorrhage, the grinding compulsion of the AI-augmented builder who cannot stop.

Structural Deepening

Arthur's framework for how technologies evolve from simple tools into civilizational substrates—acquiring layers, subsystems, and institutional infrastructure, and transforming from replacement to foundation for entire ways of living.

Switching Costs

The total cost — financial, technical, cognitive, and relational — that a user must bear to move from one platform to another, and the specific economic quantity that converts competitive markets into platform-dependent ones.

The Distribution Problem

The uncomfortable fact that AI's benefits and costs do not distribute evenly across the population of affected workers — a Smithian question about institutions, not a technical question about tools.

The Productivity Paradox

Robert Solow's 1987 observation — you can see the computer age everywhere except in the productivity statistics — which Brynjolfsson spent his career resolving into three distinct problems: timing, measurement, and organization.

The Second Economy

Arthur's 2011 diagnosis of a vast digital substrate forming beneath the physical economy—'remotely executing and global, always on, endlessly configurable'—providing external intelligence that would rival physical production in scale.

The Tipping Point (Arthur's Framework)

The irreversible threshold in a positive-feedback system where the balance between competing alternatives snaps—not gradual shift but phase transition, after which the outcome is locked in.

Vector Pods

Small cross-functional groups whose job is deciding what to build, not building it — Segal's organizational response to the separation of judgment from execution.

Winner-Take-All Markets

Markets where small performance differences produce disproportionate reward differences—characteristic of increasing-returns systems where positive feedback concentrates gains among a few participants while the rest compete for scraps.

Technology (1)
Claude Code

Anthropic's command-line coding agent — the specific product through which the coordination constraint shattered in the winter of 2025, reaching $2.5B run-rate revenue within months.

Person (1)
Stuart Kauffman

American theoretical biologist (b. 1939) whose 'order for free,' 'adjacent possible,' and 'edge of chaos' frameworks reshaped understanding of how complexity emerges spontaneously in nature.

Event (1)
The Death Cross

The 2025–2026 trillion-dollar repricing of the software industry — when AI market capitalization overtook SaaS capitalization — read through Nye's framework as a geopolitical repricing of what constitutes strategic advantage, not merely a …

Organization (1)
Santa Fe Institute

The New Mexico research center founded in 1984 as the disciplinary home of complexity science—where Arthur, Kauffman, Holland, and Gell-Mann developed the frameworks for understanding systems operating at the edge of chaos.

Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.