This page lists every Orange Pill Wiki entry hyperlinked from Martha Nussbaum — On AI. 31 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words with the Wikipedia mark link to Wikipedia.
Byung-Chul Han's diagnosis of the cultural trajectory toward frictionlessness — a smoothness that conceals the labor and struggle that gave previous work its depth.
The Berkeley researchers' prescription for the AI-augmented workplace — structured pauses, sequenced workflows, protected human-only time, behavioral training alongside technical training — the operational counterpart to Maslach's fix-the-…
The constructive program this volume derives from White's historical research: the institutions that will govern AI must be built deliberately, during the lag period, by people who understand both the technology's implications and the lon…
Nussbaum's analytical distinction between what a person actually does or is (functioning) and the real freedom to do or be it (capability) — the instrument that cuts through the central confusion of the AI discourse.
Nussbaum's Aristotelian definition of compassion — the painful emotion occasioned by awareness of another's undeserved misfortune — as a cognitive achievement with three specific judgmental conditions.
The Orange Pill claim — that AI tools lower the floor of who can build — submitted to Sen's framework, which asks the harder question: does formal access convert into substantive capability expansion?
Nussbaum's thesis that emotions are not irrational disturbances but cognitive evaluations — judgments about the significance of events for a person's flourishing, and therefore legitimate forms of moral perception.
Segal's term for the exit without alternative exercised by senior technology practitioners in 2025–2026 — the voluntary disconnection Castells's framework identifies as the losing strategy in a network society.
Sen's foundational distinction between what a person does or is (functioning) and what she is substantively free to do or to be (capability) — the analytical engine of capability theory.
Nussbaum's philosophical insistence — against the moralistic fiction that outcomes reflect effort and merit — that human lives are profoundly shaped by factors agents neither chose nor could control.
Aristotle's name for the intellectual virtue that governs action in particular circumstances — the form of knowledge that cannot be computed, because it requires experience, character, and having stakes in the world.
The Aristotelian capacity to perceive the right thing to do in particular, unrepeatable circumstances — the cognitive resource the AI transition most urgently demands and whose developmental conditions it most thoroughly threatens.
Edo Segal's phenomenological term for falling and flying at the same time — the subjective signature of the ontological event Heidegger's framework helps name.
Sen's framework that redefines human welfare as the substantive freedom to achieve functionings one has reason to value — the evaluative instrument this book applies to AI.
Edo Segal's phrase for the simultaneous experience of awe and loss during the AI transition — what Nussbaum's framework identifies as moral sophistication rather than confusion.
The figure in whom the thymotic crisis of the AI transition concentrates — the credentialed professional whose decades of expertise are being repriced by a technology she did not design and cannot control.
The research tradition in the AI discourse organized around depth preservation — measuring progress by the maintenance of craft, embodied knowledge, and the formative friction of struggle, and identifying AI as a threat to the conditions …
Nussbaum's precise philosophical claim — against the culture of instrumental optimism — that grief is a cognitive evaluation, an accurate perception that something of genuine value has been damaged.
Nussbaum's Aristotelian insistence that the value of human activity does not reduce to the value of its product — the capacities exercised and the character expressed constitute value independent of market recognition.
Nussbaum's argument that the attempt to protect valued goods by making them invulnerable succeeds only by eliminating the goods it was designed to protect — a philosophical error with direct implications for AI-era retreat.
The research tradition in the AI discourse organized around capability expansion and democratization — measuring progress by productivity gains, adoption speed, and the compression of the imagination-to-artifact ratio.
The scene at the center of the book — a child at the threshold of formal operations asking 'What am I for?' with a cognitive tool powerful enough to pose the question but not yet equipped to manage it.
Nussbaum's term for the cognitive and emotional capacity to hold contradictory truths about genuine goods in conflict — without the premature resolution that simplifies the moral situation.
Aeschylus's trilogy of blood-vengeance and civic transformation — the philosophical archetype, on Nussbaum's reading, for how genuine conflicts between genuine goods are resolved through institutional transformation rather than the victory…
Nussbaum's 2013 argument that just institutions require not merely right structures but the emotions — compassion, solidarity, outrage at injustice — that motivate citizens to build and sustain them.
Sophocles's paradigm of irreducible obligation — the title character's choice between burying her brother according to religious law and obeying Creon's civic decree — neither obligation reducible to the other.
Edo Segal's 2026 book on the Claude Code moment and the AI transition — the empirical ground and narrative framework on which the Festinger volume builds its diagnostic reading.
Nussbaum's 750-page 2001 treatise defending the cognitive theory of emotions — the book that established emotions as forms of intelligent judgment essential to ethical perception.
The paradigmatic case of tragic conflict in Nussbaum's framework — the king forced to choose between sacrificing his daughter and breaking his oath to the fleet, where every available action destroys a genuine good.
Segal's canonical account of Claude producing an elegant philosophical connection that was eloquent, structurally satisfying, and wrong — the moment that revealed how the feeling of being met can override the evaluative function.
The February 2026 week-long training session in which Edo Segal flew to Trivandrum, India, to work alongside twenty of his engineers as they adopted Claude Code — producing the twenty-fold productivity multiplier documented in The Orange Pill…