Every major expansion of information-processing capacity in human history has been followed by a period in which institutions could not keep up. The printing press produced a century of religious warfare before it produced the Enlightenment. The telegraph produced a succession of financial panics before markets developed the circuit breakers to absorb them. The pattern is not accidental; it follows from the information-theoretic structure of institutional processing. Institutions receive, evaluate, integrate, and act on information at rates determined by their structure, staffing, and accumulated tacit knowledge. When information growth exceeds processing capacity, decisions are made on partial information and institutional output degrades. AI is compressing this pattern into the shortest timescale in human history, and the gap between capability and institutional response is the defining contest of the current moment.
The printing press did not produce the Enlightenment directly. It produced a century of religious warfare first. Information flooded into populations that had no institutional infrastructure for evaluating it. Doctrinal chaos, political upheaval, wars that killed millions — the lag between the technology's arrival and the institutions' adaptation was measured in generations, and the cost was borne by the people who lived inside it. The Enlightenment came later, after universities developed the capacity to evaluate printed claims, libraries organized accumulated knowledge, and scientific societies established standards of evidence.
The telegraph compressed the same pattern. Information that had previously traveled at horse-speed suddenly traveled at wire-speed. Financial markets, which had evolved to process information arriving at horse-speed, were flooded with information arriving at wire-speed. Markets crashed faster because bad news traveled faster. Institutional mechanisms for absorbing shock — circuit breakers, regulatory pauses, coordinated central bank responses — did not yet exist. The panics of 1857, 1873, and 1893 each arrived with unprecedented speed because the processing infrastructure had not caught up with the information infrastructure.
The AI case shows the same structure at a compressed timescale. Segal's Orange Pill documents the specific institutional failures: educational systems optimizing for the transmission of knowledge that is becoming abundant while failing to develop the capabilities that are becoming scarce; retraining programs teaching prompt engineering rather than judgment; regulatory frameworks addressing the supply side while leaving citizens, workers, students, and parents largely exposed on the demand side. The EU AI Act, American executive orders, emerging frameworks in Singapore and Brazil and Japan — real structures, but addressing what AI companies may build rather than what people need to navigate the moment wisely.
The lesson is not that AI should be restricted. The lesson is that the adaptation stage is where the outcome is determined. The threshold has been crossed. The exhilaration has been felt. The resistance is underway. The expansion — whether the transition produces broad-based development or concentrated benefit with distributed cost — depends entirely on whether the institutional adaptation is adequate. Adequate does not mean perfect. It means fast enough and good enough to prevent the information gap from widening to the point of systemic disruption.
The pattern was identified piecemeal by different disciplines — communications scholars studying print culture, economic historians studying financial panics, development economists studying technology transfer — before Hidalgo's information-theoretic framework allowed the observations to be unified. The unifying claim is that institutions are information-processing structures with finite processing capacity, and mismatches between environmental information growth and institutional capacity produce predictable pathologies regardless of domain.
Institutions are information-processing structures. They have finite capacity, determined by structure, staffing, and accumulated tacit knowledge.
Mismatch produces pathology. When information grows faster than institutional processing, decisions degrade and institutional output suffers.
The lag has predictable costs. Financial panics, political upheaval, social disruption — each mapped to the gap between expanded information flow and institutional capacity.
Adaptation determines outcome. The technology is not the problem; the absence of institutional structures to manage the expanded flow is.
AI compresses the timescale. The adaptation that took centuries for print and decades for telegraph must now occur in years, under pressure no previous institution has faced.
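The capacity-mismatch claim above can be made concrete with a toy queueing sketch. This is an illustration, not a model from the source: information items arrive at some rate per step, the institution can process at most a fixed number per step, and whatever it cannot process accumulates as backlog. The function name and the rates are invented for the example.

```python
# Toy model of an institution as a bounded-rate information processor.
# Items arrive at `arrival` per step; the institution evaluates at most
# `capacity` per step; the rest queue as backlog. When arrival exceeds
# capacity, the backlog grows without bound — the widening "information
# gap" between environment and institution.

def backlog_over_time(arrival: float, capacity: float, steps: int) -> list[float]:
    """Return the unprocessed backlog after each step."""
    backlog = 0.0
    history = []
    for _ in range(steps):
        # Net change per step is arrival minus capacity, floored at zero.
        backlog = max(0.0, backlog + arrival - capacity)
        history.append(backlog)
    return history

# While capacity keeps up, the backlog stays at zero.
print(backlog_over_time(arrival=3, capacity=5, steps=4))  # [0.0, 0.0, 0.0, 0.0]

# When information growth exceeds processing capacity, the gap compounds
# linearly — every step of lag adds to the pile of unevaluated information.
print(backlog_over_time(arrival=8, capacity=5, steps=4))  # [3.0, 6.0, 9.0, 12.0]
```

The point of the sketch is the asymmetry: adaptation that merely narrows the rate gap slows the growth of the backlog, but only capacity at or above the arrival rate stops it from compounding.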
Some argue that the historical pattern does not extrapolate cleanly to the AI case because AI itself can be used to accelerate institutional adaptation — that institutions augmented by AI may be able to process information at rates earlier institutions could not. The counter-argument is that the institutions most in need of adaptation (educational systems, regulatory bodies, cultural norms) are precisely those least able to deploy AI for their own augmentation, making the asymmetry between technical capability and institutional response more severe, not less.