The law-lag narrative is the conventional story that technology moves fast, institutions move slowly, and the gap between them is where harm occurs. This narrative treats the speed differential as a fact of nature — technology develops according to its own logic, governance scrambles to catch up, always arriving after damage is done. Jasanoff has spent decades arguing that this framing is misleading in ways that serve specific interests. It makes regulation feel futile, deference to technologists feel rational, and the governance gap itself appear natural and ungovernable.

The historical record contradicts the narrative: law does not merely react to technology but co-produces it. Patent systems shaped industrial innovation. Environmental regulation restructured chemistry. Securities law constituted financial markets. In each case, law was present at creation, shaping the technology's trajectory from inside. The AI moment appears to confirm the law-lag story only if analysis begins in 2022 and treats everything afterward as catch-up. But AI developed within regulatory environments that shaped every dimension of its emergence — data law, intellectual property, market structure, export controls.
Jasanoff's critique of the law-lag narrative appears most explicitly in The Ethics of Invention (2016), where she argues that the framing serves a delegitimizing function. When a society accepts that governance inevitably lags technology, it has already conceded that the period between deployment and regulation — the gap itself — will be governed by whoever built the technology, according to whatever principles they happen to hold. The law-lag narrative is not a description of governance failure; it is a mechanism producing governance failure by making the failure appear natural.
The historical evidence Jasanoff marshals is compelling. The pharmaceutical industry did not develop first and then get regulated afterward. The regulatory framework co-evolved with the industry: the 1906 Pure Food and Drug Act shaped what could be sold, the 1938 Food, Drug, and Cosmetic Act required safety demonstration before marketing, the 1962 Kefauver-Harris amendments required efficacy proof. Each regulatory intervention restructured the industry — changing what companies researched, how they tested products, what they could claim. The industry and its regulation were co-produced across a century.
AI's development followed the same pattern, though the governance was less visible. The computational infrastructure underlying large language models was built within telecommunications regulation that determined data transmission costs and privacy expectations. The training data was shaped by copyright law, which defined what text could be freely used. The venture capital that funded AI companies operated within securities regulation and tax policy. The talent pool was shaped by immigration law and educational policy. AI did not develop in a state of nature and then arrive for governance. It was produced within a governance framework that made specific choices — about data, about intellectual property, about market structure — whose consequences are manifesting now.
The law-lag narrative's political function is to naturalize the governance gap. If the gap is inevitable, then demanding that it close is unrealistic, and the realistic position is to trust builders to govern themselves during the lag. This trust is precisely what Jasanoff's framework interrogates. The builders possess genuine expertise about what they have built. They do not possess, and structurally cannot possess, the knowledge of what their building costs the people downstream — because that knowledge is experiential, distributed, and emerges slowly in registers the builder's instruments do not detect. Trusting builders to govern themselves is not realism but abdication disguised as pragmatism.
The law-lag narrative has been a persistent theme in technology discourse since at least the 1960s, when technological acceleration became a subject of popular and scholarly concern. Alvin Toffler's Future Shock (1970) popularized the image of institutions struggling to keep pace with change. Jasanoff's critique synthesizes science and technology studies' rejection of technological determinism with legal scholarship on law's constitutive functions, showing that the narrative is not merely wrong but politically consequential.
Law co-produces technology; it does not merely react to it. Legal frameworks shape what gets built by defining property rights, liability standards, permissible uses, and market structure — making law constitutive of technology's development, not merely responsive to it.
The gap is designed, not natural. The governance gap is not an inevitable consequence of differential speeds but a product of choices about what to regulate early (rarely) and what to leave unregulated (commonly).
The narrative serves power. Treating the gap as natural makes democratic governance feel futile and delegated governance (to builders, to markets, to experts) feel pragmatic — a political outcome produced through an apparently apolitical description.
Closing the gap requires redesign. The gap will not narrow by speeding up regulation or slowing down innovation but by rebuilding governance institutions to operate inside co-production rather than chasing after it.