This page lists every Orange Pill Wiki entry hyperlinked from Achille Mbembe — On AI. 31 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
The thesis that African populations are not merely subjects of the AI revolution but agents in it, with specific traditions of creativity, improvisation, and collective intelligence that constitute resources for engaging AI on terms the met…
The Orange Pill's thesis that AI does not eliminate difficulty but relocates it to a higher cognitive floor — the engineer who no longer struggles with syntax struggles instead with architecture.
The argument that the digital infrastructure of the AI age is not a break from colonial history but a continuation of it, reproducing in new forms the extractive and hierarchical structures of earlier empires.
The pattern by which AI tools lower the threshold of who can build — enabling production by individuals whose stock consists of an idea, a subscription, and the capacity for sustained attention.
The project of displacing the colonial hierarchy of knowledge that the training corpora of large language models have inherited and made operational.
A category of risk whose realization would either annihilate humanity or permanently and drastically curtail its potential. AI joined this category in mainstream academic usage in 2014.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
The operational frame in which a human and an AI system share a workflow as partners with complementary capabilities — the alternative to both "AI as tool" and "AI as replacement."
The gap between what a person can conceive and what they can produce — a distance that has been narrowing since the Neolithic and that the language model reduced to approximately the length of a conversation.
Mbembe's 2003 concept of the sovereign power to determine who lives and who dies — extended in the AI age to the power to determine which knowledge lives in the training corpus and which is erased.
Mbembe's framework for an ethics adequate to the planetary scale of AI's effects — one that takes seriously radical interconnection while refusing to dissolve radical inequality.
The organization of production around the extraction of maximum value from labor that is simultaneously essential and invisible — a logic that structures the content-moderation and data-labeling infrastructure of contemporary AI.
The peculiar pathology of AI-augmented work — compulsive engagement with a tool that genuinely produces valuable output, externally indistinguishable from flow and internally catastrophically different.
The discipline of formulating a question such that a capable answering system produces a useful answer. Asimov's Multivac stories prefigured it; prompt engineering operationalizes it.
Segal's metaphor — given thermodynamic grounding by Wiener's framework — for the 13.8-billion-year trajectory of anti-entropic pattern-creation through increasingly sophisticated channels, of which AI is the latest.
Mbembe's framing of the platform user agreement as the digital era's successor to the colonial commandement — the unilateral contract through which the platform exercises sovereign power over the user it purports to serve.
The device that increases the magnitude of whatever passes through it without evaluating the content — Wiener's framework for understanding AI as a tool that carries human signal, or human noise, with equal power and no judgment.
The Orange Pill's metaphor for the institutional work of redirecting the river of AI capability — not to stop the current but to shape what grows around it.
Mbembe's thesis that conditions once reserved for the colonized — precarity, disposability, surveillance, the extraction of value from a body treated as disposable — are extending to broader populations through the mechanisms of platform ca…
The uncomfortable fact that AI's benefits and costs do not distribute evenly across the population of affected workers — a Smithian question about institutions, not a technical question about tools.
The Orange Pill's image for the set of professional and cultural assumptions so familiar they have become invisible — the water one breathes, the glass that shapes what one sees. A modern rendering of Smith's worry about the narrowing effe…
The economic regime that emerges when the cost of execution approaches zero and the premium on deciding what to execute rises correspondingly — the Smithian reading of the Orange Pill moment.
Mbembe's figure for everything the smooth AI interface conceals — the content moderators, data labelers, cobalt miners, and extracted artists whose labor the daylight discourse refuses to illuminate.
Mbembe's name for the political formations that inherit colonial structures without the label — and the framework for understanding how power persists through and beyond formal independence.
The phenomenon in which evaluation benchmarks leak into the training corpus of a language model, allowing the model to memorize answers instead of reasoning to them — making benchmark scores a measure of leakage rather than capability.
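The leakage mechanism can be sketched as a minimal n-gram overlap check — an illustrative toy only, not a description of any production contamination audit, which typically uses more sophisticated matching (normalization, fuzzy or embedding-based comparison):

```python
def ngrams(text, n=8):
    """Return the set of word-level n-grams in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def contamination_rate(eval_items, train_corpus, n=8):
    """Fraction of eval items sharing at least one long n-gram with the corpus.

    A shared long n-gram suggests the item may have leaked into training —
    a high rate means benchmark scores measure memorization, not capability.
    """
    train_grams = ngrams(train_corpus, n)
    leaked = sum(1 for item in eval_items if ngrams(item, n) & train_grams)
    return leaked / len(eval_items) if eval_items else 0.0
```

Run against a corpus that contains one benchmark question verbatim and the rate rises accordingly; a clean corpus yields zero.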
Maslow's reading of The Orange Pill's central question: worthiness is not a moral endowment but the developmental achievement of a person whose signal is shaped by B-values.
The global workforce that reviews traumatic, violent, and prohibited content to train and maintain AI safety systems — a paradigmatic instance of Mbembe's nocturnal body.
Xingqi Maggie Ye and Aruna Ranganathan's 2026 Harvard Business Review ethnography of an AI-augmented workplace — the most rigorous empirical documentation to date of positive feedback dynamics in human-machine loops.
Cameroonian political philosopher (b. 1957) whose concepts of necropolitics, the postcolony, and the becoming-Black of the world provide the sharpest available framework for reading AI's colonial genealogy.
Korean-German philosopher (b. 1959) whose diagnoses of the smoothness society and the burnout society anticipated the pathologies of AI-augmented work with unsettling precision.