Glover's career turned on a refusal: the refusal to explain twentieth-century atrocity through a theory of evil. The perpetrators, he found, were rarely wicked. They were eroded. The moral restraints that had governed their earlier lives — the reluctance to harm, the recognition of others as persons, the sense of themselves as people who would not do certain things — had worn down through identifiable mechanisms operating across identifiable periods. The erosion was not mysterious. It was structural. It proceeded through distance, diffusion of responsibility, the reclassification of persons into categories, the incremental escalation of demands, and the institutional suppression of the human response. The taxonomy matters for AI not because AI produces atrocity but because the mechanisms are substrate-independent. They operate wherever the conditions allow — in a Rwandan village, in a Nazi bureaucracy, in a Silicon Valley sprint.
The taxonomy's power lies in its specificity. Glover did not write about evil in general. He wrote about the specific practice of dehumanizing language, the specific effect of physical distance on moral response, the specific dynamics of incremental escalation. Each mechanism could be observed, measured in its effects, and — crucially — prevented by institutional design.
The erosion framework reframes moral failure as an engineering problem. If the mechanisms are structural, then so are the remedies. The dams of Segal's beaver metaphor in The Orange Pill are, in Glover's language, the institutional structures that maintain the conditions under which moral resources can operate. Without them, the river of production carries everything away.
Applied to AI, the framework identifies specific features of AI-assisted work that function as erosion mechanisms. The compression of moral time eliminates the spaces where restraint could form. The distance between builder and user, already mediated by organizational structure and digital interface, gains a new layer through the generative tool. The distribution of agency between human and machine creates a novel form of the ancient problem: when everyone and no one built the thing, when the draft was generated and the human merely approved, where does responsibility live?
Glover was characteristically clear that the erosion is not moral weakness. Moral weakness is the failure to act on convictions one holds. Erosion is different — it is the gradual disappearance of the convictions themselves, the loss of the inner voice that knows what it thinks, independent of what the environment provides. The morally weak person knows and fails. The eroded person no longer knows, because the machinery of self-construction has been bypassed.
The framework emerged from Glover's patient reading of Holocaust testimony, Cultural Revolution memoir, Rwandan genocide documentation, and Stalinist confession. He was looking for the moment of decision — the point at which an ordinary person became a perpetrator. He did not find it. What he found was a gradient: a series of small adjustments, each one manageable, each one slightly more extreme than the last, each one made possible by the sediment of the previous one.
This finding was not new to Glover — Hannah Arendt's banality of evil reached toward something similar — but Glover's contribution was to make the process mechanistic: to identify the specific pressures, the specific institutional structures, the specific psychological dynamics that produced the gradient, and therefore the specific interventions that could interrupt it.
Not evil but erosion. The frame that dissolves the comforting distance between ordinary people and perpetrators, and replaces it with a diagnostic map of how ordinary people become perpetrators.
Mechanisms, not motivations. Erosion is produced by structures, not by wickedness. This is what makes it both more frightening (it can happen to anyone) and more tractable (structures can be redesigned).
Substrate-independent. The mechanisms operate wherever the conditions allow — in bureaucracies, in markets, in research labs, in the architecture of a feed algorithm optimizing for engagement.
Invisible from inside. The gradient cannot be seen by those on it. The perception of moral stability persists while the ground shifts. This is what makes deliberate interruption necessary — and rare.
Remediable by design. The mechanisms that produce erosion can be countered by the institutional equivalents of friction: mandatory pauses, structural proximity, and authority to stop distributed widely rather than reserved — the inverse of the usual arrangement, in which proceeding is easy and halting requires permission.
The erosion framework has been criticized — notably by virtue ethicists in the MacIntyrean tradition — for understating the role of positive character formation. Glover's taxonomy is largely negative: it explains what breaks moral restraint, not what builds it. The On AI volume argues that this asymmetry is itself diagnostic: in the age of amplification, the mechanisms of erosion operate at machine speed while the mechanisms of formation operate at human speed, and Glover's framework is useful precisely because it identifies the asymmetry and refuses to pretend it isn't real.