The Kindleberger Trap names the specific risk that a transition in global leadership leaves no one providing the public goods that international stability requires. Joseph Nye developed the concept from Kindleberger's analysis of the 1930s — when Britain could no longer play the hegemonic role and the United States was not yet willing to — and applied it to the twenty-first-century transition. Unlike the Thucydides Trap, which focuses on the risk of war between rising and established powers, the Kindleberger Trap focuses on the risk of collective action failure: not direct conflict but the absence of the coordination the international system requires.
The AI case exhibits Kindleberger Trap dynamics with unusual clarity. The United States, as the established technology hegemon, is simultaneously driving AI development and declining to exercise the governance leadership that historical hegemonic stability required. China, as the rising technology power, is building parallel AI infrastructure and declining to accept governance frameworks designed in Washington. The European Union is attempting to regulate without the technology base that would make its regulation binding. The resulting governance vacuum affects everyone but benefits no one in the long run.
The trap is particularly acute for AI because the technology's cross-border effects require cross-border governance. Data flows across jurisdictions. Models trained in one country are deployed in another. Safety failures in one jurisdiction produce consequences in all jurisdictions. Surveillance capabilities developed in one state can be deployed against populations in another. The scope of the technology exceeds the scope of any national regulatory framework, and in the absence of an effective international framework the gap is filled by bilateral negotiation, corporate self-regulation, and ad hoc arrangements that the speed of the displacement renders inadequate.
Segal's concluding chapter in The Orange Pill references Asimov's Foundation series and Hari Seldon's attempt to compress a civilizational dark age from thirty thousand years to one thousand. The parallel is not accidental. The Kindleberger Trap identifies the specific mechanism through which civilizational transitions produce catastrophe in the absence of institutional foresight, and the AI transition is producing exactly this mechanism on a compressed timescale.
Nye introduced the term in 2017 in Project Syndicate, explicitly building on Kindleberger's analysis of the interwar period. The concept has since been applied to technology governance, climate policy, and financial regulation as instances of the same structural problem.
- Collective action failure, not direct conflict. The trap produces instability through coordination failure rather than war.
- Rising and declining hegemons. The trap is most acute during transitions, when neither power provides adequate public goods.
- Cross-border externalities. Problems whose scope exceeds any national jurisdiction require international coordination.
- AI as exemplary case. The technology's structure makes national regulation insufficient.