CONCEPT

Diffusion of Responsibility

Glover's second erosion mechanism: the distribution of action across enough agents — or, more recently, between humans and tools — that each participant can reasonably claim his contribution was insufficient to produce the harm, allowing the harm to occur while responsibility disappears into the architecture of the system.

The most reliable finding from Glover's study of institutional atrocity was that perpetrators rarely experienced themselves as perpetrators. The camp administrator who processed transport manifests experienced himself as a bureaucrat. The engineer who built the gas chambers experienced himself as an engineer. The guard who escorted prisoners to the trains experienced himself as a guard. Each could point to some other agent in the system — the one who gave the order, the one who wrote the policy, the one who actually killed — as the locus of responsibility. The diffusion was the mechanism. No one did it. The system did it. And the system belongs to no one.

Glover traced this mechanism across contexts: the firing squad where one rifle holds blanks and no one knows whose, the committee decision where every member voted yes but none feels accountable, the bureaucratic chain where every link can honestly say "I only forwarded the paperwork." AI introduces a new form of this ancient problem by distributing agency between humans and tools in ways that create novel diffusions.

In the AI Story

[Hedcut illustration: Diffusion of Responsibility]

Classical diffusion operates among humans: the more people involved, the less any single person feels responsible. The mechanism is well documented in social psychology — the bystander effect, the continuum of diffused authority in Milgram's obedience experiments, the Stanford prison experiment's production of cruelty through role structure. Glover integrated these findings with an analysis of institutions: certain structures are designed (not always intentionally) to produce maximum diffusion, and those structures are the ones that produce harm at scale.

The AI version is novel. It does not require multiple humans. It requires distributed agency between a single human and a tool whose contributions are substantial enough to shift the relationship from creator-to-creation into editor-to-draft. The developer who uses a compiler is still the author of the code; the compiler translates, but the developer made every logical decision. The developer who uses Claude Code is in a different relationship: she made high-level decisions, the tool made thousands of lower-level decisions, and the gap between them is the space where responsibility diffuses.

This diffusion is not a metaphor. It has operational consequences. When an AI-generated system produces harm — the engagement loop that captures a teenager's attention beyond any healthy threshold, the recommendation that radicalizes, the code that contains the bug that costs lives — the question "Who did this?" has no clean answer. The developer can honestly say she did not write those specific lines. The tool vendor can honestly say the tool did what it was designed to do. The organization can honestly say no single decision produced the outcome. The harm exists; the responsibility has evaporated into the architecture.

Glover's framework insists that this evaporation is not metaphysical. Someone is responsible. The inability to locate the responsibility is a feature of the institutional design, not a fact about the moral universe. The design can be changed. Accountability can be restored by structural choices: review requirements that reassign authorship, audit trails that reconstruct the distribution of decisions, norms that hold the human accountable for what the tool produced in her name.
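What such an audit trail might look like is easy to sketch. The fragment below is illustrative only, not a proposal from Glover or a feature of any existing tool; the record type, field names, and agent labels are hypothetical. Its point is narrow: if every decision in an AI-assisted change carries a named agent and an optional human sign-off, the distribution of decisions can be reconstructed after the fact instead of evaporating.

  # Illustrative sketch only. DecisionRecord and responsibility_map are
  # hypothetical names, not drawn from any real tool or library.
  from dataclasses import dataclass, field
  from datetime import datetime, timezone
  from collections import Counter

  @dataclass
  class DecisionRecord:
      """One decision in an AI-assisted change, attributed to a named agent."""
      agent: str                      # who or what decided, e.g. "dev:alice" or "tool:model-x"
      kind: str                       # e.g. "design", "generated code", "review sign-off"
      summary: str                    # one-line description of what was decided
      approved_by: str | None = None  # the human who accepted this decision, if anyone did
      timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

  def responsibility_map(trail: list[DecisionRecord]) -> Counter:
      """Count decisions per accountable agent: a tool decision approved by a
      human is attributed to that human; an unapproved tool decision stays with
      the tool, so the accountability gap is visible instead of evaporating."""
      owners: Counter = Counter()
      for record in trail:
          owners[record.approved_by or record.agent] += 1
      return owners

  # A trail in which one tool decision was reviewed and one was not.
  trail = [
      DecisionRecord("dev:alice", "design", "retry failed jobs with a queue"),
      DecisionRecord("tool:model-x", "generated code", "wrote the retry loop", approved_by="dev:alice"),
      DecisionRecord("tool:model-x", "generated code", "chose the backoff constants"),  # never reviewed
  ]
  print(responsibility_map(trail))
  # Counter({'dev:alice': 2, 'tool:model-x': 1}); the unreviewed decision stays attributed to the tool.

The field that matters in this sketch is the sign-off: a human who approves a tool's decision re-concentrates authorship at an identifiable point, which is exactly the structural counter-mechanism described under Key Ideas below.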

Origin

The concept traces through social psychology — Darley and Latané's 1968 bystander effect experiments, Milgram's demonstration that distance between decision and action increases compliance — into Glover's integration with institutional ethics. Glover's version is distinctive because it treats diffusion not as a psychological bias to be corrected but as an institutional design feature that can be either suppressed or exploited.

The AI-mediated version of the problem has been developing in technology ethics since the 1980s — the "problem of many hands" that Dennis Thompson named and that Helen Nissenbaum and others carried into computing contexts. The arrival of generative AI sharpens it: the tool's contributions are now substantial enough that the ancient question of how to assign responsibility for collaborative harm acquires a new difficulty, and the institutional structures that would answer it have not been built.

Key Ideas

Responsibility does not disappear. It is dispersed. The architecture that disperses it can be redesigned.

Scales with complexity. The more agents involved — or the more capable the tool — the greater the diffusion. Mass-scale systems require structural work to maintain accountability against the gradient.

Novel with AI. The human-machine distribution of agency creates a new form: not diffusion among humans but diffusion between human and non-human contributions, where the non-human contribution cannot be held accountable in any conventional sense.

Editor, not author. The builder who reviews AI output is in an editorial relationship to the work. Editorial responsibility is real but different from authorial responsibility, and the difference matters.

Remediable by design. Audit trails, authorship conventions, review structures that force re-concentration of decision at identifiable points — all are institutional counter-mechanisms that can prevent diffusion from producing the unaccountable harm the diffusion would otherwise enable.

Appears in the Orange Pill Cycle

Further reading

  1. Jonathan Glover, Humanity: A Moral History of the Twentieth Century (1999)
  2. John Darley and Bibb Latané, "Bystander Intervention in Emergencies" (1968)
  3. Helen Nissenbaum, "Accountability in a Computerized Society" (1996)
  4. Hannah Arendt, Eichmann in Jerusalem (1963)
  5. Dennis Thompson, "Moral Responsibility of Public Officials" (1980)
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.