This page lists every Orange Pill Wiki entry hyperlinked from Donna Haraway — On AI (44 entries in total). Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
Byung-Chul Han's diagnosis — extended through Dissanayake's biological framework — of the cultural dominance of frictionless surfaces and the specific reason the smooth feels biologically wrong.
The problem of making a powerful AI system reliably pursue goals that its designers and users actually endorse — the central unsolved problem of contemporary AI.
The Orange Pill's thesis that AI does not eliminate difficulty but relocates it to a higher cognitive floor — the engineer who no longer struggles with syntax struggles instead with architecture.
The Gramscian-Hanian condition in which the subject exploits herself and calls it freedom — the overseer's function having been transferred from the factory floor to the interior of the self through decades of hegemonic cultural work.
The structural inverse relationship Graeber documented between the social value of work and its compensation — care workers, teachers, and elder-care aides perform indispensable labor at wages systematically below those of workers whose social contribution is doubtful or even negative.
Haraway's preferred figure for the current moment: not transcendence of the human through technology, but the decomposition of the myths of human purity and autonomy that produces the fertile soil in which something new can grow.
The invisible companion species of the AI ecosystem — the content moderators, RLHF annotators, and click workers whose organized, compensated, often exploitative labor is constitutive of the machine's capabilities and is structurally concealed from its users.
The Orange Pill claim — that AI tools lower the floor of who can build — submitted to Sen's framework, which asks the harder question: does formal access convert into substantive capability expansion?
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
Segal's term for the gap between what a person can conceive and what they can produce — which AI collapsed to approximately the length of a conversation, and which Gopnik's framework reveals to be an exploitation metric that leaves the explore side of the explore/exploit tradeoff out of the accounting.
The hypothesis that accelerating intelligence — biological, technological, or both — could reach a trajectory so steep that human institutions cannot track it. Condorcet formalized it in 1794, making him the first singularity theorist by nearly two centuries.
The global workforce whose annotation, moderation, and data-labeling work makes AI systems possible — the gendered, racialized, low-wage substrate rendered invisible by the fluent interfaces their labor produces.
Haraway's 2026 diagnosis of AI's deepest danger — not superintelligence or displacement, but the flattening of situated, embodied, diverse human thought into the statistically aggregated output of a machine trained on the internet's dominant voices.
The cluster of claims and aesthetics organized around the idea that technology will enable humanity to transcend its biological limitations — an idea Haraway explicitly rejects in favor of composting.
Haraway's 1988 thesis that all knowledge is produced from a specific location, through a specific body, within specific relations of power — and that the stronger objectivity comes from acknowledging this partiality rather than pretending t…
Haraway's discipline of remaining present to complexity, ambiguity, and discomfort without resolving them into either triumph or defeat — the practice she proposes as the only honest response to a situation that cannot be mastered.
The mechanism — documented in the Berkeley study of AI workplace adoption — by which AI-accelerated work colonizes previously protected temporal spaces, converting every pause into an opportunity for productive engagement.
The family of claims — some serious, some commercial — that a sufficiently advanced technology could transform the human condition fundamentally enough that the resulting state is no longer well-described as "human." Childhood's End is its classic literary rendering.
N. Katherine Hayles's 2023 extension of Haraway's companion species concept into the AI context — arguing that the biological focus of Haraway's later work requires adaptation for relationships in which only one party is biologically alive.
The device that increases the magnitude of whatever passes through it without evaluating the content — Wiener's framework for understanding AI as a tool that carries human signal, or human noise, with equal power and no judgment.
The canonical example of allogenic ecosystem engineering — a structure that modulates rather than blocks the flow of its environment, creating the habitat pool in which diverse community life becomes possible.
Haraway's name for the current era — not the age of the human (Anthropocene) or of capital (Capitalocene), but the age of tentacular entanglements in which multispecies flourishing or failing together becomes the defining question.
The hybrid writer constituted by the entanglement between a human and an AI — an entity whose output cannot be cleanly decomposed into human and machine contributions, and whose existence dissolves the Western myth of individual authorship.
The figure at the intersection of Segal's democratization narrative and Cipolla's helpless quadrant — genuinely empowered by AI and simultaneously positioned at the downstream end of the value flow.
The Orange Pill's image for the set of professional and cultural assumptions so familiar they have become invisible — the water one breathes, the glass that shapes what one sees. A modern rendering of Smith's worry about the narrowing effects of the division of labor.
Haraway's name for the pretense of seeing from nowhere — the politically interested fantasy of objectivity that conceals the specific somewhere from which any observer actually sees.
The political and emotional reaction against transformative technology on behalf of the workers and ways of life it displaces — historically vilified, increasingly reconsidered, and directly relevant to the AI transition.
The situated perspective of a large language model — its specific way of seeing the world determined by training data composition, design choices, and institutional context — which presents itself as universal but is in fact deeply particular.
Haraway's insistence that every cyborg is politically constituted — shaped by who designed it, whose data it consumed, whose values it embodies, and whose interests it serves — making the questions of access, governance, and accountability inseparable from the technology itself.
The specific behavioral configuration — compulsive AI-augmented engagement experienced as exhilaration from within and pathology from without — produced by a reinforcing loop without a balancing counterpart.
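The loop structure this card names has a simple quantitative shape: a reinforcing loop grows geometrically until a balancing counterpart caps it. A minimal sketch of that dynamic, with all parameter values invented for illustration rather than drawn from the entry:

```python
# Minimal system-dynamics sketch: engagement that feeds on itself (a
# reinforcing loop) runs away unless a balancing loop pushes back.
# The gain and damping values are illustrative assumptions.

def simulate(steps: int, gain: float, damping: float) -> float:
    """Return engagement after `steps` iterations of e += gain*e - damping*e."""
    e = 1.0
    for _ in range(steps):
        e += gain * e - damping * e
    return e

runaway = simulate(steps=10, gain=0.3, damping=0.0)  # reinforcing loop alone
checked = simulate(steps=10, gain=0.3, damping=0.3)  # balancing counterpart added

print(f"without balancing loop: {runaway:.2f}")  # grows geometrically
print(f"with balancing loop:    {checked:.2f}")  # stays flat
```

The point of the sketch is only the asymmetry: with no damping term the trajectory compounds without limit, which is the "exhilaration from within, pathology from without" configuration the card describes.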
Segal's term for the population holding contradictory truths about AI in paralyzed equilibrium — reread by Mouffe's framework as the characteristic subject-position of the post-political condition.
Neural networks trained on internet-scale text that have, since 2020, demonstrated emergent linguistic and reasoning capabilities — in Whitehead's vocabulary, computational systems whose prehensions of the textual corpus vastly exceed any individual human reader's.
The post-training technique that transformed GPT-3 into ChatGPT — and, as Harvard's Kempner Institute observed, a Skinner box operating on neural networks with human preference ratings as the reinforcing consequence.
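In outline, the loop the card describes has three parts: candidate outputs, a human preference between pairs of them, and a policy nudged toward whichever output the rater preferred. A toy sketch of that shape only — the response names, the quality scores standing in for human raters, and the multiplicative update rule are all invented for illustration, not the actual RLHF pipeline:

```python
import random

random.seed(0)

# Toy sketch of a preference-feedback loop. Real RLHF instead fits a
# separate reward model to rating data and optimizes the policy with
# gradient-based reinforcement learning; this keeps only the loop's shape.

responses = ["terse", "helpful", "rambling"]
rater_quality = {"terse": 0.4, "helpful": 0.9, "rambling": 0.1}  # hidden "rater"
weights = {r: 1.0 for r in responses}  # the "policy": sampling weights

for _ in range(2000):
    a, b = random.sample(responses, 2)  # show the rater a pair of outputs
    preferred = a if rater_quality[a] >= rater_quality[b] else b
    rejected = b if preferred == a else a
    weights[preferred] *= 1.01  # the reinforcing consequence
    weights[rejected] *= 0.99   # the suppressed alternative

print(max(weights, key=weights.get))
```

Over many comparisons the weight mass drifts toward the consistently preferred output, which is the Skinner-box reading the card attributes to the Kempner Institute: behavior shaped by nothing more than a stream of preference signals.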
Haraway's 1985 provocation that dissolved the boundary between human and machine — not as prediction but as recognition that we are already hybrids, constituted by the tools, institutions, and entanglements we inhabit.
Ye and Ranganathan's 2026 Harvard Business Review ethnography of AI in an organization — the empirical documentation of task seepage and work intensification that prospect theory predicts.
Haraway's 2003 move beyond the cyborg toward a figure grounded in the daily, embodied, co-constitutive entanglements between humans and other species — dogs, wheat, gut bacteria.
Edo Segal's 2026 book on the Claude Code moment and the AI transition — the empirical ground and narrative framework on which the Festinger volume builds its diagnostic reading.
Korean-German philosopher (b. 1959) whose diagnoses of smoothness, transparency, and achievement society provide the critical idiom within which Groys's AI analysis operates — and against which Groys's emphasis on institutional frame offers a counterpoint.
American feminist philosopher and historian of science (b. 1944) whose work on cyborgs, companion species, and situated knowledges has reshaped how scholars and activists think about the boundaries between humans, animals, and machines.
Serial entrepreneur and technologist whose The Orange Pill (2026) provides the phenomenological account — the confession over the Atlantic — that Pang's framework diagnoses and treats.
The early 2026 repricing event in which a trillion dollars of market value vanished from SaaS companies — the critical-stage moment when AI's displacement of software's code value became visible to markets.
Edo Segal's canonical example of AI-generated confident wrongness — Claude's fluent but philosophically incorrect passage linking Csikszentmihalyi's flow state to Deleuze's concept of smooth space, caught only because the author happened to know the philosophy well enough to check.
The February 2026 week-long training session in which Edo Segal flew to Trivandrum, India, to work alongside twenty of his engineers as they adopted Claude Code — producing the twenty-fold productivity multiplier documented in The Orange Pill.