This page lists every Orange Pill Wiki entry hyperlinked from C. S. Holling — On AI. 33 entries total. Each is a deeper dive on a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
Governance designed for learning rather than compliance — polycentric, cross-scale, diversity-maintaining, adequate to the dynamics of the AI transition.
Holling's learning-oriented management discipline — treat every intervention as an experiment, monitor outcomes, adjust course based on evidence.
The regulatory and institutional frameworks adequate to govern a technology that evolves faster than legislative processes and operates across every national boundary simultaneously.
The Orange Pill's thesis that AI does not eliminate difficulty but relocates it to a higher cognitive floor — the engineer who no longer struggles with syntax struggles instead with architecture.
The configurations toward which systems tend to evolve once they enter a basin's domain — multiple possible futures available to the AI reorganization.

Holling's final public warning — rising global connectivity increases the risk of deep collapse that cascades across adaptive cycles.
The phase of accumulation, specialization, and tight coupling — peak efficiency under stable conditions, and the configuration that makes release catastrophic.
The Orange Pill claim — that AI tools lower the floor of who can build — submitted to Sen's framework, which asks the harder question: does formal access convert into substantive capability expansion?
The magnitude of disturbance a system can absorb before shifting to a qualitatively different regime — Holling's foundational alternative to engineering resilience.
The speed of return to equilibrium after perturbation — the dominant conception of resilience in mechanical and computational systems, and the wrong conception for the AI transition.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
Segal's term for the gap between what a person can conceive and what they can produce — which AI collapsed to approximately the length of a conversation, and which Gopnik's framework reveals to be an exploitation metric that leaves the exp…
The nested set of adaptive cycles operating at different scales, connected by upward revolt and downward remember dynamics.
The traps — rigidity and poverty — in which systems lock into states that are stable but impoverished, or rigid but brittle, and resistant to normal cycle dynamics.
The irreversible subtractions that transformation entails — specific forms of knowledge, satisfaction, and connection that do not return.
The first pattern of reorganization — fast-growing colonizers arrive first and establish initial structure but do not determine long-term character.
A pathological configuration, stable but impoverished — the system cycles between exploitation and release without accumulating enough capital to develop complex structure.
The short, violent, liberating collapse of accumulated structure — the crown fire, the financial crash, the SaaSpocalypse.
The decisive phase when resources liberated by collapse are assembled into the configurations that define the next cycle. The window is brief. The choices persist.
The structural purchase: every efficiency gain is paid for with a resilience loss, and the invoice arrives during the disturbance you did not plan for.
The two cross-scale dynamics of panarchy — upward cascade of disturbance and downward provision of stability and accumulated wisdom.
A pathological conservation-phase configuration so tightly coupled it cannot release when release is necessary — accumulating suppressed disturbance until collapse is catastrophic.
The reservoir of dormant capacities from which reorganization draws — its richness determines the richness of the recovery.
The second candidate basin — a two-tier system where a small population develops judgment and a large population cycles through tool adoptions without upward mobility.
Holling's four-phase model — exploitation, conservation, release, reorganization — describing how complex systems grow, rigidify, collapse, and renew.
The third candidate basin — a diverse, modular configuration that maintains a portfolio of approaches and invests in cross-scale interactions. The resilient future.
The path from release through reorganization — the short, violent, creative phase during which the next cycle's architecture is determined.
The paradigmatic release event — a crown fire that consumes decades of accumulation in days and clears the ground for reorganization.
The path from exploitation through conservation — the phase of growth, accumulation, and optimization that feels like progress and prepares its own destruction.
The first of three candidate basins for the AI reorganization — maximum output, minimum input, structurally shallow, catastrophically fragile.