This page lists every Orange Pill Wiki entry hyperlinked from Jonathan Glover — On AI. 44 entries total. Each is a deeper dive on a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; in each entry, words colored in orange link to other Orange Pill Wiki entries, while orange-underlined words with the Wikipedia mark link to Wikipedia.
Byung-Chul Han's diagnosis — extended through Dissanayake's biological framework — of the cultural dominance of frictionless surfaces and the specific reason the smooth feels biologically wrong.
The developmental experience of having nothing externally provided to attend to, which forces the developing mind to generate its own objects of attention from internal resources — the foundational soil of adult creative capacity.
The specific effect of AI-accelerated production on the interval between conception and deployment — the shrinkage of the window in which moral friction could once surface, the human response could once fire, and the question "should this exist?" could once be asked.
The specific developmental and environmental conditions — boredom, difficulty, and trusting relationships — under which the capacity for moral questioning forms in children, and whose systematic erosion by AI-saturated environments threatens that formation.
The brain system that activates when focused task demand subsides — the substrate of mind-wandering, self-referential processing, and the associative integration from which spontaneous creativity arises.
The Orange Pill claim — that AI tools lower the floor of who can build — submitted to Sen's framework, which asks the harder question: does formal access convert into substantive capability expansion?
Glover's second erosion mechanism: the distribution of action across enough agents — or, newly, between humans and tools — that each participant can reasonably claim his contribution was insufficient to produce the harm, allowing the harm to occur with no one feeling responsible.
The first of Glover's three erosion mechanisms: the separation — physical, psychological, or conceptual — between the agent and the person harmed, which suppresses the human response and makes cruelty psychologically tractable.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
Phillips's Winnicottian argument that frustration is not an obstacle to creativity but its necessary ground — the not-knowing from which genuine surprise emerges, and which frictionless interfaces systematically eliminate.
Segal's term for the gap between what a person can conceive and what they can produce — which AI collapsed to approximately the length of a conversation, and which Gopnik's framework reveals to be an exploitation metric that leaves the exp…
Glover's term for the institutional climate of premises, norms, and background assumptions — the fishbowl water — that determines which moral resources are exercised and which are suppressed within any working environment.
Shannon Vallor's term — deployed through Glover's framework — for the progressive atrophy of the moral capacities that institutional and technological environments no longer exercise, with AI tools producing a specifically comprehensive form of that atrophy.
The specific psychological discomfort — the resistance of a value against a proposed action — that signals a conflict between what the agent is doing and what the agent believes, and whose preservation is the operational condition of moral …
Glover's foundational reframing of the moral self not as a fixed possession but as an ongoing construction — the cumulative product of choices that express and reinforce the kind of person one is becoming, now placed under unprecedented pressure.
Glover's term for the three psychological capacities — sympathy, respect for persons, and moral identity — that prevent cruelty when they operate and that atrophy when institutional conditions suppress them.
The interface paradigm — inaugurated at scale by large language models in 2022–2025 — in which the user addresses the machine in unmodified human language and the machine responds in kind; the paradigm that, read through Gibson's framework,…
The specific behavioral signature of AI-augmented work: compulsive engagement that the organism experiences as voluntary choice, with an output the culture cannot classify as problematic because it is productive.
Honneth's framework holding that human identity is a social achievement constituted through three forms of mutual acknowledgment — love, rights, and social esteem — each producing a distinct dimension of selfhood.
The Berkeley researchers' term for the colonization of previously protected temporal spaces by AI-accelerated work — the mechanism through which the recovery windows of pre-AI workflows disappear.
The device that increases the magnitude of whatever passes through it without evaluating the content — Wiener's framework for understanding AI as a tool that carries human signal, or human noise, with equal power and no judgment.
The canonical example of allogenic ecosystem engineering — a structure that modulates rather than blocks the flow of its environment, creating the habitat pool in which diverse community life becomes possible.
Glover's framework for the moral duty that falls on those who capture the gains of a systemic transition to address the concentrated costs borne by those whom the transition displaces — the obligation that does not depend on intention or direct causation.
The twelve-year-old's 'Mom, what am I for?' read not as a request for information but as an opening of the intermediate area — a question that asks to be held, not answered, because holding is what develops the capacity to inhabit unresolved questions.
The uncomfortable fact that AI's benefits and costs do not distribute evenly across the population of affected workers — a Smithian question about institutions, not a technical question about tools.
The population mourning what the AI transition eliminates — senior practitioners whose recognition demand is systematically truncated: their diagnosis acknowledged, their claim to institutional response denied.
Glover's taxonomy of the specific, replicable mechanisms through which the psychological restraints that normally prevent cruelty loosen, contract, and finally break — a map of moral decay whose applicability extends from genocide to the da…
Levinas's name for the vulnerable, irreducible presence of another human being that issues an ethical commandment before any act of knowledge — the phenomenon that AI's interface, by structural design, cannot present.
The Orange Pill's image for the set of professional and cultural assumptions so familiar they have become invisible — the water one breathes, the glass that shapes what one sees. A modern rendering of Smith's worry about the narrowing effects of the division of labor.
Glover's name for the involuntary, pre-deliberative recognition of another person's humanity that makes cruelty psychologically difficult — the catch in the throat, the flinch, the sudden awareness that the person at the other end of one's action is a human being.
Glover's name for the gradient mechanism of moral decay: not a cliff but a slope, each step small enough to seem continuous with the last, the cumulative trajectory visible only from a distance no single participant occupies.
The political and emotional reaction against transformative technology on behalf of the workers and ways of life it displaces — historically vilified, increasingly reconsidered, and directly relevant to the AI transition.
On AI's synthesis of Glover's framework into a set of five daily practices — self-interrogation, constructed proximity, preserved friction, expression-conviction alignment, and accepted obligation — through which the moral identity of th…
The question "what is a human being for?" — which Clarke predicted intelligent machines would force humanity to ask, and which arrived in 2022–2025 with more force and less philosophical preparation than he expected.
Edo Segal's name for the vast majority experiencing the full emotional complexity of the AI transition without a clean narrative to organize it — most accurate in perception, least audible in discourse.
AI's early enthusiasts — the builders posting productivity metrics, shipping solo products, experiencing genuine creative release. Partly right, structurally blind, and the largest obstacle to the voice the transition needs.
Segal's scene of the child who asks 'What am I for?' — received by Jonas's framework as the paradigmatic moral claim of the technological age, the voice of the generation that will bear consequences it cannot consent to.
Glover's diagnosis of the mechanism by which belief-adoption becomes identity-adoption — once a position has become a marker of tribal belonging, abandoning it carries the psychological cost of identity disruption, and evidence is processed to protect the identity rather than to test the belief.
Maslow's reading of The Orange Pill's central question: worthiness is not a moral endowment but the developmental achievement of a person whose signal is shaped by B-values.
Jonathan Glover's 1999 masterwork — not a history of atrocity but a diagnostic manual, mapping the specific psychological and institutional mechanisms through which ordinary people came to participate in the worst things human beings have done to one another.
Ye and Ranganathan's 2026 Harvard Business Review ethnography of AI in an organization — the empirical documentation of task seepage and work intensification that prospect theory predicts.
Korean-German philosopher (b. 1959) whose diagnoses of smoothness, transparency, and achievement society provide the critical idiom within which Groys's AI analysis operates — and against which Groys's emphasis on institutional frame offers a corrective.
Builder, entrepreneur, and author of The Orange Pill — whose human-AI collaboration with Claude, described in that book and extended in this volume, provides the empirical ground for the Whiteheadian reading.
The skilled textile workers whose 1811–1816 destruction of wide stocking frames became the founding Luddite event — and whose ontological error, Ellul's framework suggests, was believing they faced a technology when they faced a logic.