You On AI Encyclopedia · The Human Response
CONCEPT

The Human Response

Glover's name for the involuntary, pre-deliberative recognition of another person's humanity that makes cruelty psychologically difficult — the catch in the throat, the flinch, the sudden awareness that the person at the other end of one's action is as real as oneself.
The human response is Glover's most original contribution to moral psychology: the observation that moral restraint, when it operates, almost never operates through philosophical deliberation. It operates through involuntary recognition. A guard looked into a prisoner's eyes and could not continue. A soldier saw a specific face and his finger froze on the trigger. A bureaucrat processing deportation orders encountered a familiar name and felt something shift in his chest. The response is not summoned by willpower or produced by moral education. It arises when the psychological distance between agent and affected person is small enough for recognition to operate. Every technology of mediation — the longbow, the telegraph, the factory, the bureaucracy — has altered this distance. AI introduces a new species of it: functional distance, the separation between intention and consequence produced by a tool that handles implementation, compressing into a single conversation the hours of hands-on building during which the response could surface.

In The You On AI Encyclopedia

Glover traced the human response through dozens of cases where atrocity was resisted. The pattern was always the same: resistance was triggered not by the application of principle but by an encounter with the particular. A specific face. A specific name. A specific person whose individuality broke through the abstractions that institutional machinery had imposed. When the particular was never encountered — when the agent operated entirely within categories, numbers, reports, dashboards — the response failed.

The critical word is involuntary. This is not moral virtue, cultivated and deployed. It is a psychological capacity that operates when conditions allow and goes dormant when they don't. Distance kills it. Abstraction kills it. The categorization of persons into groups kills it. Any mechanism that interposes a conceptual or physical barrier between agent and affected person suppresses the response that would otherwise make harm difficult.

Moral Identity (Glover)

AI compresses the imagination-to-artifact ratio in ways that matter for the human response. The developer who once spent hours coding a notification timing algorithm inhabited the logic of interruption at a granular level — each decision a small encounter with the system's eventual effect on a person. In the third hour of hand-coding, a flicker of discomfort might surface. Claude Code does not provide the third hour. The cycle from intention to deployment compresses to minutes, and minutes do not contain the same density of moral encounter.

The Berkeley study documented in You On AI records the empirical shape of this compression: workers using AI tools filled previously protected pauses with additional prompts, expanded scope, ran parallel tasks. The researchers measured behavior. What they did not measure — because it falls outside their methodology — is what happened to workers' relationships with the persons affected by their expanded output. Glover's framework permits the inference: when work intensifies without a corresponding deepening of connection, the result is production without sympathy.

Origin

Glover developed the concept from his study of the moments when atrocity was resisted — and noticed that resistance was almost never the product of ethical training. It was the product of proximity. The cases he cited included a German officer who refused to execute partisans because he recognized a boy as the same age as his son; a Rwandan woman who hid Tutsi neighbors because she had shared meals with them; the American helicopter pilot at My Lai who placed his aircraft between the killers and the villagers because he saw, from the air, specific human beings running.

In each case, the institutional machinery had succeeded in producing the category — partisan, Tutsi, enemy — but failed to prevent the encounter with the particular. The particular broke the category. The response fired. What Glover called the human response was a name for whatever it is in human beings that makes the particular harder to harm than the categorical — and the name for what institutions must suppress to produce harm at scale.

Key Ideas

Pre-deliberative. The response operates before reasoning, not through it. It is the catch, the flinch, the involuntary registration of shared humanity.

Proximity-dependent. Distance suppresses it. Not only physical distance — conceptual, linguistic, bureaucratic distance all function the same way.

Triggered by the particular. Categories do not trigger it. Specific persons do. Institutional machinery that converts persons into categories is designed, whether intentionally or not, to prevent the triggering.

A new species of distance. AI introduces functional distance — the elimination of the implementation journey where moral encounters might have occurred.

Must be deliberately cultivated. In the AI era, the proximity that once arose incidentally from friction must be constructed through intentional practice — user observation, direct encounter, the refusal to let the tool mediate every relationship with the people downstream.

Further Reading

  1. Jonathan Glover, Humanity: A Moral History of the Twentieth Century (1999), Part IV
  2. Emmanuel Levinas, Totality and Infinity (1961)
  3. Simone Weil, "Human Personality" (1943)
  4. Iris Murdoch, The Sovereignty of Good (1970)
  5. Stanley Milgram, Obedience to Authority (1974)