The view of technological transition under the aspect of eternity reveals a five-stage pattern: threshold, exhilaration, resistance, adaptation, expansion. The adaptation stage is the window during which culture constructs institutions, norms, and practices that redirect the new technology's force toward conditions that support human flourishing. Adaptation is not automatic. It requires deliberate construction of structures — labor protections, educational reforms, governance frameworks, cultural norms — that channel the new capability without attempting to suppress it. The Luddites were destroyed because the adaptation structures were not built in time. The monks who copied manuscripts were displaced because structures adapting their skills to the new technology did not exist. Every failed transition is one in which adaptation produced structures too weak or poorly designed to redirect the flow. The AI transition is in its adaptation stage now; the outcome depends on what gets built during this window.
There is a parallel reading that begins from the physical infrastructure AI requires: vast server farms consuming the energy of small nations, rare earth minerals extracted under conditions of ecological devastation, cooling systems draining aquifers in drought-stricken regions. The adaptation stage, viewed from this substrate dependency, is not a window of opportunity but a narrowing corridor of ecological viability. Every LLM training run burns carbon budgets that should sustain communities for years. Every new datacenter requires materials whose extraction destroys the landscapes where actual humans must continue living. The adaptation structures being built — governance frameworks, educational reforms, cultural norms — are superstructural arrangements atop a material base that is literally cooking the planet.
The people most affected by this transition are not those debating governance gaps in conference rooms but those whose water tables drop as datacenters proliferate, whose children mine cobalt for the chips that power the models, whose islands disappear beneath rising seas while we optimize for artificial general intelligence. The adaptation they require is not better frameworks for channeling AI's force but organized resistance to its material demands. Read from their position, the urgency is not that we might fail to build adequate channeling structures but that we are building them at all — that we are normalizing and institutionalizing a technology whose substrate requirements are fundamentally incompatible with the biological conditions for human flourishing. The real governance gap is not between capability and institutional response but between the technology's material needs and the planet's carrying capacity. The window that is closing is not for adaptation but for prevention.
The stage's specific character is urgency without permanence. The window closes. A technology that has crossed the threshold and produced exhilaration and provoked resistance is already reshaping the conditions within which adaptation can occur. The longer adaptation is delayed, the more the prior conditions are eroded — the institutions that could have channeled the technology, the skills that could have been reorganized, the cultural frameworks that could have mediated the transition. Late adaptation is harder than early adaptation not only because there is less time but because there is less material to work with.
The adaptation required for the AI transition operates at multiple levels simultaneously. At the individual level, it means the cultivation of the disciplines described in the freedom entry: cause-tracing, structural awareness, self-knowledge. At the organizational level, it means work structures designed to produce adequate ideas rather than maximize volume of output. At the educational level, it means restructuring curricula around the cultivation of the second and third kinds of knowledge. At the governance level, it means attention to the demand side of AI — what citizens, workers, and families need to navigate the transition — and not only the supply side of what AI companies may build.
Current adaptation is inadequate. The governance gap — the widening distance between the speed of capability and the speed of institutional response — is the defining failure mode of democratic governance in the exponential era. Corporate AI governance frameworks arrive eighteen months after the tools they are meant to govern have reshaped the workforce. Educational reforms are discussed at a pace that makes irrelevance their primary product. The people in the gap are building their own improvised dams from whatever they can find.
The Spinozist point about adaptation is that it is determined by prior causes but not predetermined toward any particular outcome. The presence or absence of specific causes — education that cultivates adequate understanding, institutions that protect time for reflection, cultural norms that value depth over speed — produces specific effects. These causes can be cultivated. Their absence can be repaired. But repair takes time the adaptation window may not provide, which is why the stage's urgency is its defining feature.
The five-stage pattern is developed in Chapter 17 of Edo Segal's The Orange Pill and extended in this volume's reading through Spinoza's sub specie aeternitatis. Antecedents include Joseph Schumpeter's creative destruction cycles, Thomas Kuhn's paradigm shifts, and Carlota Perez's techno-economic paradigm framework.
The Spinozist contribution is the metaphysical grounding: the stages are not contingent historical patterns but necessary consequences of the way organized modes of substance respond to the introduction of new capabilities. This grounding gives the pattern predictive force and reveals the adaptation stage's specific causal leverage.
Fourth of five stages. After threshold, exhilaration, and resistance, adaptation is the window during which channeling structures are built.
Not automatic. Adaptation requires deliberate construction; where construction is absent, the transition fails, with the catastrophic outcomes that failed transitions have historically produced.
Multi-level requirement. Adaptation must occur simultaneously at individual, organizational, educational, and governance levels.
Current inadequacy. The governance gap indicates that current adaptation is failing to keep pace with capability; the adaptation window is closing faster than structures are being built.
Causal leverage. The stage's specific feature under the Spinozist reading is its high causal leverage; adequate action here determines outcomes that become unreachable later.
The question of weighting depends entirely on which temporal and spatial scale we examine. At the immediate institutional level — how organizations and governments respond to AI capabilities already deployed — Edo's framing captures 90% of the relevant dynamics. The governance gap is real, the need for channeling structures is urgent, and the window for building them is closing. His multi-level analysis of required adaptations (individual, organizational, educational, governmental) accurately maps the intervention points available to us now.
But zoom out to the substrate level — the material and energetic requirements of AI systems — and the contrarian view commands 80% of the analytical weight. The ecological costs are not externalities but central constraints that will determine whether any adaptation structures can function long-term. The communities bearing these costs have no voice in the adaptation frameworks being constructed. This is not a governance gap but a governance exclusion, and it suggests that our adaptation structures are being built on foundations that cannot hold.
The synthetic frame that holds both views is one of nested urgencies. We face an immediate urgency (building structures to channel AI's current impacts on work and meaning) embedded within a deeper urgency (the ecological and political sustainability of AI's material substrate). Successful adaptation requires what we might call "provisional construction" — building the channeling structures Edo identifies while simultaneously working to transform AI's substrate dependencies. This means designing governance frameworks that include ecological constraints as first-order considerations, not afterthoughts. It means educational reforms that teach both how to work with AI and how to question its costs. The adaptation stage, properly understood, is not about choosing between these urgencies but about building structures sophisticated enough to address both simultaneously.