The car extends mobility and amputates walking — not metaphorically but literally, as muscles atrophy and cities rebuild around driving. The law operates with the indifference of gravity. Applied to AI, the extension is generative thought at unprecedented scale. The amputations are four: the slow work of absorption through which embodied expertise is built; the discipline of solitary wrestling with resistant material; the capacity for sustained attention across incubation periods; and the tolerance for not-knowing itself. Each amputation makes the next more likely. The cascade is invisible from inside the extension because the amputation eliminates the very capacity that would have detected it.
The law is not evaluative. Extensions are not bad. Amputations are not always catastrophic. What matters is that they are structural — produced by extension itself, as its necessary companion. The energy that builds the extension is drawn from the capacity being amputated. You cannot extend one capacity without diminishing another. The question is never whether to accept the trade but how to manage it — which requires first naming it.
The AI amputations form a self-reinforcing cascade. Loss of embodied understanding weakens the foundation for sustained attention. Diminished sustained attention reduces tolerance for productive confusion. Reduced confusion-tolerance accelerates the atrophy of the capacity for not-knowing. Each amputation compromises the cognitive infrastructure required to detect subsequent amputations.
The social scale matters equally. When a medium restructures capacities collectively, it restructures the institutions built around those capacities. The software death cross is not merely a market event; it is the amputation of a social structure — the specialist profession, the credentialing pipeline, the organizational hierarchy — built when generative thought was scarce. The structures cannot survive abundance.
The law's indifference is what makes it politically difficult. The automobile amputated walking whether the walking was joyful exercise or grinding commute. AI amputates structures of scarcity whether they were unjust barriers to entry or legitimate repositories of tacit knowledge. The preservation of what matters must be deliberate and partial — dams built against a flow that cannot be stopped.
The law is articulated throughout Understanding Media (1964), drawing on McLuhan's reading of Harold Innis and the phenomenological tradition. The phrase captures McLuhan's insistence that technologies are extensions of the human body and nervous system, not external accessories — and that every extension operates according to a physiological law connecting amplification in one register to suppression in another.
Structural, not incidental. Amputation is not a side effect of extension but its necessary companion — the trade cannot be avoided.
Four AI amputations. Embodied absorption, solitary wrestling, sustained attention across gaps, tolerance for not-knowing.
Cascade dynamics. Each amputation compromises the capacity to detect the next, producing self-reinforcing invisibility.
Social amputation. When media restructure collective capacities, they amputate institutions built around the previous arrangement.
Preservation is partial. Dams maintain amputated capacities as supplements — never as replacements for what the medium has swept away.
Critics argue that the amputation framing romanticizes prior capacities — that the walking lost to automobiles was often miserable labor, not noble exercise. The McLuhan defense concedes the point at the level of evaluation while holding firm at the level of description: whether or not the lost capacity was valuable, its loss is real and has consequences that should be named rather than dismissed.