This page lists every Orange Pill Wiki entry hyperlinked from C.S. Holling — On AI. 31 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
The governance and decision-making posture that treats interventions as experiments, monitors outcomes, and adjusts course — designed for systems whose dynamics cannot be predicted in advance.
The emerging institutional arrangements governing AI development and deployment — a public good whose production faces all the structural challenges Olson's framework identifies.
The Orange Pill's thesis that AI does not eliminate difficulty but relocates it to a higher cognitive floor — the engineer who no longer struggles with syntax struggles instead with architecture.
The configurations toward which a system tends to evolve once inside a given domain — stable states separated by thresholds that, once crossed, may prove difficult or impossible to reverse.
The structural feature of the AI economy — paralleling semiconductor industry consolidation but compressed onto a faster timeline — by which training data, model weights, compute infrastructure, and foundation model development are controlled…
The Orange Pill claim — that AI tools lower the floor of who can build — submitted to Sen's framework, which asks the harder question: does formal access convert into substantive capability expansion?
Holling's 1973 distinction — the magnitude of disturbance a system can absorb before flipping to a qualitatively different regime, as opposed to the speed of return to a single equilibrium.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
Segal's term for the gap between what a person can conceive and what they can produce — which AI collapsed to approximately the length of a conversation, and which Gopnik's framework reveals to be an exploitation metric that leaves the exp…
The nested architecture of adaptive cycles operating at different scales, connected through the dynamics of revolt and remember that govern how disturbance cascades.
The honest ecological accounting that some things destroyed during release do not return — specific forms of embodied knowledge, satisfaction, and collegial bond that the new configuration cannot replicate.
The fast-growing, resource-capturing structures that first occupy a post-disturbance landscape — productive in the short term, structurally simple, and dangerous if they prevent slower configurations from establishing.
The multi-scale, multi-node institutional architecture that adaptive governance requires — multiple overlapping bodies operating at different scales, connected through feedback and learning channels.
The three mechanisms — loss of connectedness, liberation of capital, and emergence of radical uncertainty — that characterize the omega phase of the adaptive cycle.
The compromised state of the downward remember dynamic in the AI panarchy — cultural values, educational institutions, and regulatory frameworks failing to provide stability to faster-moving scales.
The two-tier AI-era basin — a small population develops judgment and commands a premium while a much larger population operates at tool competence and cycles through retraining without upward mobility.
The mechanism — documented in the Berkeley study of AI workplace adoption — by which AI-accelerated work colonizes previously protected temporal spaces, converting every pause into an opportunity for productive engagement.
Holling's four-phase model — exploitation, conservation, release, reorganization — describing the structural dynamics of every complex adaptive system under stress.
The resilient basin of attraction — a system characterized by diversity of approaches, modular structure, and cross-scale interactions that preserve adaptive capacity across multiple possible futures.
The release-to-reorganization path — the short, violent, creative phase during which the next cycle's character is determined.
The exploitation-to-conservation path — the phase of growth, accumulation, and optimization that feels like progress while preparing the conditions for release.
The AI-era basin of attraction in which the knowledge economy converges on a single model of maximum output with minimum input — productive, competitive, structurally simple, and systemically fragile.
A pathological configuration in which a system cycles between exploitation and release without ever accumulating enough structure to support complexity — stable in its impoverishment.
The alpha phase — the brief window of maximum fluidity after release, during which liberated resources recombine into configurations that will define the next cycle.
The structural tension at the core of Holling's framework — the qualities making a system productive under stable conditions are the qualities making it brittle when conditions change.
A pathological conservation-phase configuration so optimized and interconnected that it cannot release — accumulating vulnerability until the eventual collapse is catastrophic.
The reservoir of capacities, institutions, and cultural values from which post-disturbance reorganization draws its raw material — determining what configurations can grow.
Canadian ecologist (1930–2019) whose adaptive cycle, panarchy, and resilience framework reshaped how complex adaptive systems are understood across ecology, economics, and governance.
Serial entrepreneur and technologist whose The Orange Pill (2026) provides the phenomenological account — the confession over the Atlantic — that Holling's framework diagnoses and treats.
The canonical case of conservation-phase optimization that succeeded on its own engineering metrics while slowly destroying the system — and the subsequent shift to adaptive management as a response.
The eight-week repricing of early 2026 in which a trillion dollars of software-company valuation evaporated — not because revenues collapsed but because the narrative protecting them did.