Compression operates differently from distance and diffusion. Distance and diffusion are spatial — they concern the relationships among agents and affected persons. Compression is temporal — it concerns the sequence within which moral cognition unfolds. Moral cognition, Glover insisted, is not instantaneous. It requires time. The flicker of discomfort that becomes a question that becomes a revision that becomes a refusal — this sequence unfolds over days, sometimes weeks, sometimes years. When the production sequence compresses to hours, the moral sequence cannot keep up.
The point is not that moral cognition is slow. The point is that moral cognition has a characteristic tempo, shaped by the biology of human attention and the structure of the default mode network. Moral reflection occurs during unstructured cognitive time — in the shower, on the walk home, in the third hour of implementation when the mind has relaxed from the initial focus and begun to wander toward the implications of what it is building. AI-compressed workflows eliminate this unstructured time not by suppressing it directly but by colonizing the gaps where it used to occur.
The Berkeley study documented the colonization empirically: workers using AI tools filled previously protected pauses with additional prompts, treating every gap as production time to be optimized. The researchers measured output. What they did not measure, and could not within their methodology, was what happened to the moral reflection that those gaps had hosted. Segal's observation in On AI is exactly to the point: the gaps were not waste. They were where conscience lived. Their elimination by the efficiency premise of AI adoption eliminates the substrate on which moral formation depends.
Glover's framework suggests a specific remedy: the deliberate construction of temporal friction, that is, mandatory delays between conception and deployment, structured pauses for review, institutional requirements that compression cannot override. These are not productivity drags. They are the temporal equivalent of the proximity that must be deliberately constructed when distance is structural. The gaps that no longer occur incidentally must occur intentionally.
The concept of temporal compression is On AI's extension of Glover's framework. Glover himself did not emphasize it; the technologies of his era worked primarily through spatial distance and organizational diffusion, and the specific form of erosion that AI produces through speed was not available for him to diagnose. The extension applies his method (identify the specific mechanism, trace its operation, propose structural remedies) to a mechanism he did not live to see.
The temporal dimension has been noted elsewhere. Hartmut Rosa's work on social acceleration describes the general phenomenon. Shoshana Zuboff's analysis of surveillance capitalism notes the speed-of-iteration advantage that AI grants to platforms. The specifically moral dimension — what compression does to the formation of conscience — is the contribution of On AI, rooted in Glover's framework for how moral cognition unfolds in time.
Moral cognition has a tempo. It is not instantaneous; it requires characteristic intervals. Compression below those intervals does not speed it up — it prevents it.
Production time is moral time. The hours of implementation were not merely production overhead. They were the substrate of moral reflection.
Gaps are not waste. The unstructured cognitive time between focused tasks is where conscience lives. Its colonization by AI-enabled production is the colonization of conscience itself.
Remediable by structural delay. Mandatory pauses, scheduled review, institutional requirements that cannot be compressed — these are the temporal dams that preserve moral time against the efficiency gradient.
Third mechanism, not replacement. Compression joins distance and diffusion as a distinct erosion mechanism. It operates alongside them and amplifies their effects.