The three resources operate in concert but can fail independently. A person can have strong sympathy for immediate others yet fail entirely at respect for persons at a distance. A person can have a robust moral identity around certain practices and complete indifference to others that are structurally equivalent. The resources are not unified; they are a toolkit, and moral life requires all three.
Sympathy depends on proximity. It is the human response in its activated form — not the involuntary flinch but the sustained capacity to attend to what another is experiencing. It is suppressed by the same mechanisms that suppress the involuntary response: distance, abstraction, the conversion of persons into categories.
Respect for persons is what Kant described as treating persons as ends in themselves, never merely as means. Glover treated it less as a principle to apply than as a perceptual capacity to cultivate — the capacity to see another person as having her own life, her own projects, her own perspective that is not reducible to her function in your plans. This capacity is what the reduction of users to engagement metrics most directly damages.
Moral identity is the most architectural of the three. It is the self one is constructing through choice, and it determines what choices become available. A person with strong moral identity will resist certain actions because doing them would violate who she is. A person whose moral identity has been eroded will find the same actions thinkable, not because her beliefs have changed but because the self that would have resisted has been worn away.
The AI-augmented workplace operates through what Segal calls the fishbowl — a set of assumptions so familiar they have become invisible. Glover's framework reveals the fishbowl as a moral atmosphere: a set of premises that determine which resources are exercised and which are starved. The premise of the moral neutrality of tools suppresses the builder's obligation to examine what she builds. The premise of the inherent goodness of efficiency suppresses the recognition that some friction is morally constitutive. The premise of the sovereignty of user choice distributes responsibility so widely it effectively disappears.
The three-resource framework developed across Glover's career but received its fullest articulation in Humanity, where he used it to explain the uneven geography of resistance in twentieth-century atrocities. In the same institutions, some individuals resisted and others complied. The difference was not information — all had access to the same facts. The difference was which resources were available to each individual at the moment of choice, and that availability depended on the individual's prior history of exercising the resources and on the institutional context's support for that exercise.
Glover's resources have antecedents in Hume's theory of sympathy, in Kant's respect for persons, in Aristotle's account of character. His contribution was integrative: to treat these as distinct, interacting capacities rather than as philosophical abstractions, and to insist they function as muscles rather than as principles.
Three capacities, not one. Sympathy, respect, and identity are distinct. They fail independently. An institution can exercise some while starving others.
Muscles, not principles. The resources are trained by use and atrophied by neglect. They are not held; they are exercised.
Atmosphere-dependent. The institutional context determines which resources are called upon. A well-designed institution is one whose moral atmosphere exercises the full set.
Suppressible by design. Every technology of distance, abstraction, or categorization suppresses at least one of the three. AI suppresses all three at once — sympathy through compression of time, respect through metric-based abstraction, identity through the bypass of self-construction in favor of tool-generated output.
Cultivable by deliberate practice. The suppression is not destiny. The resources can be strengthened — but only by practices that introduce the friction the tool has removed: direct encounter with users, refusal of the first plausible output, the slow work of asking whether the thing being built deserves to exist.