Glover rejected the idea that moral behavior is produced by correct beliefs. The people who committed atrocities often held correct beliefs about right and wrong. What they lacked were the operational capacities that convert belief into action. Glover identified three: sympathy, the capacity to feel something of what another person feels; respect for persons, the recognition of others as beings with their own perspectives and dignity; and moral identity, the sense of oneself as a particular kind of person who would not do certain things. These are not abstract principles. They are psychological muscles — real, measurable, trainable, depletable. Exercised, they strengthen; neglected, they atrophy; subjected to institutional pressure that rewards their suppression, they wither. The moral atmosphere of an institution determines which resources are exercised and which are suppressed. The AI-assisted workplace has a specific atmosphere, and On AI uses Glover's framework to map what it exercises and what it starves.
There is a parallel reading that begins not with the resources themselves but with the material conditions required to maintain them. Glover's framework treats sympathy, respect, and moral identity as psychological muscles that atrophy under institutional pressure. But the muscle metaphor locates the problem in individual practice and institutional design, a voluntarist picture in which the right exercises and atmospheres can restore what has been lost. It overlooks the question of whether the substrate those resources require still exists.
Sympathy depends on proximity, Glover says. But proximity is not a design choice; it is a material fact of how labor is organized. The AI-assisted workplace does not suppress sympathy through poor atmosphere—it eliminates the conditions under which sympathy could operate. When work becomes the coordination of remote contractors reviewed through dashboards, when users become data exhaust, when colleagues become Slack handles, there is no proximity left to exercise sympathy within. The resource has not atrophied from neglect; the ground it grew in has been paved over. Respect for persons requires seeing others as having their own projects. But when every interaction is intermediated by systems designed to extract behavioral surplus, when the economic logic of the platform requires converting persons into addressable segments, the capacity to see otherwise is not merely starved—it is structurally opposed by the material basis of the work itself. Glover's framework names what is lost. It does not address whether restoration is possible once the substrate is gone.
The three resources operate in concert but independently. A person can have strong sympathy for immediate others and fail entirely at respect for persons at a distance. A person can have robust moral identity about certain practices and complete indifference to others that are structurally equivalent. The resources are not unified; they are a toolkit, and moral life requires all three.
Sympathy depends on proximity. It is the human response in its activated form — not the involuntary flinch but the sustained capacity to attend to what another is experiencing. It is suppressed by the same mechanisms that suppress the involuntary response: distance, abstraction, the conversion of persons into categories.
Respect for persons is what Kant called treating persons as ends rather than means. Glover treated it less as a principle to apply than as a perceptual capacity to cultivate — the capacity to see another person as having her own life, her own projects, her own perspective that is not reducible to her function in your plans. This capacity is what the reduction of users to engagement metrics most directly damages.
Moral identity is the most architectural of the three. It is the self one is constructing through choice, and it determines what choices become available. A person with strong moral identity will resist certain actions because doing them would violate who she is. A person whose moral identity has been eroded will find the same actions thinkable, not because her beliefs have changed but because the self that would have resisted has been worn away.
The AI-augmented workplace operates through what Segal calls the fishbowl — a set of assumptions so familiar they have become invisible. Glover's framework reveals the fishbowl as a moral atmosphere: a set of premises that determine which resources are exercised and which are starved. The premise of the moral neutrality of tools suppresses the builder's obligation to examine what she builds. The premise of the inherent goodness of efficiency suppresses the recognition that some friction is morally constitutive. The premise of the sovereignty of user choice distributes responsibility so widely it effectively disappears.
The three-resource framework developed across Glover's career but received its fullest articulation in Humanity, where he used it to explain the uneven geography of resistance in twentieth-century atrocities. In the same institutions, some individuals resisted and others complied. The difference was not information — all had access to the same facts. The difference was which resources were available to each individual at the moment of choice, and that availability depended on the individual's prior history of exercising the resources and on the institutional context's support for that exercise.
Glover's resources have antecedents in Hume's theory of sympathy, in Kant's respect for persons, in Aristotle's account of character. His contribution was integrative: to treat these as distinct, interacting capacities rather than as philosophical abstractions, and to insist they function as muscles rather than as principles.
Three capacities, not one. Sympathy, respect, and identity are distinct. They fail independently. An institution can exercise some while starving others.
Muscles, not principles. The resources are strengthened by use and atrophy with neglect. They are not held; they are exercised.
Atmosphere-dependent. The institutional context determines which resources are called upon. A well-designed institution is one whose moral atmosphere exercises the full set.
Suppressible by design. Every technology of distance, abstraction, or categorization suppresses at least one of the three. AI suppresses all three at once — sympathy through compression of time, respect through metric-based abstraction, identity through the bypass of self-construction in favor of tool-generated output.
Cultivable by deliberate practice. The suppression is not destiny. The resources can be strengthened — but only by practices that introduce the friction the tool has removed: direct encounter with users, refusal of the first plausible output, the slow work of asking whether the thing being built deserves to exist.
The question is not whether Glover's framework is right—it clearly names real psychological capacities—but which parts of the moral landscape it illuminates and which it leaves dark. On the mechanisms of individual moral action, Glover is fully right (100%): sympathy, respect, and identity do function as muscles, they do atrophy with neglect, and they do determine what choices become thinkable. The parallel reading is right (80%) about the substrate: these capacities require certain material conditions, and when platform economics eliminates face-to-face labor, proximity-dependent sympathy has nowhere to operate. Both views are right; they answer different questions.
The productive synthesis requires distinguishing between full restoration and partial exercise. Glover's framework is 100% correct that deliberate practices can strengthen the resources even under adverse conditions—the builder who insists on direct user contact, the team that refuses metric-only evaluation, the individual who slow-reads output before publishing. The contrarian view is 70% correct that these exercises cannot fully compensate for substrate loss—you cannot restore pre-platform proximity through willpower—but it overstates its case (the remaining 30%) by suggesting the resources become entirely inoperable. They become constrained, not eliminated. The right framing is exercises within constraints.
What this means practically: Glover's prescriptions work (100%) for individuals and teams operating within platform structures—they can resist moral deskilling through deliberate friction. But system-level restoration (the contrarian's concern) requires changing the structures themselves, not just individual practice within them. The resources are real, trainable, and partially restorable through exercise. The substrate matters, limits what exercise can achieve, but does not render exercise futile. Both cultivation and structural change are necessary; neither alone is sufficient.