This page lists every Orange Pill Wiki entry hyperlinked from Ramesh Srinivasan — On AI. 20 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored in orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
The governing metaphor of The Orange Pill — AI as a signal-amplifier that carries whatever is fed into it further, with terrifying fidelity. Buber's framework extends the metaphor: the amplifier clarifies what was already there, which makes…
The regulatory, institutional, and normative arrangements governing AI development and deployment — reframed through Ostrom's framework as a polycentric governance challenge requiring coordination across multiple scales rather than the mark…
Collective benefit, Authority to control, Responsibility, Ethics (CARE) — a framework asserting Indigenous communities' right to govern their own data.
The novel form of value capture operating at the heart of the AI economy — user interactions become training data that improve models owned by the center, replicating colonial extraction patterns in cognitive rather than material form.
The Orange Pill claim — that AI tools lower the floor of who can build — submitted to Sen's framework, which asks the harder question: does formal access convert into substantive capability expansion?
The full material footprint of AI operations — energy, water, minerals, land, and carbon — that productivity metrics systematically exclude but that the embedded economy and ecological ceiling make inescapable.
The fair treatment of communities as knowers—requiring AI systems to recognize diverse knowledge forms without testimonial or hermeneutical violence.
The range of signals an amplifier can receive and reproduce faithfully—Srinivasan's technical extension of Segal's metaphor revealing AI's cultural tuning.
Segal's term for the gap between what a person can conceive and what they can produce — which AI collapsed to approximately the length of a conversation, and which Gopnik's framework reveals to be an exploitation metric that leaves the exp…
Relational, oral, practice-embedded knowledge systems that resist extraction into propositional formats—what the amplifier structurally cannot process.
The interface paradigm — inaugurated at scale by large language models in 2022–2025 — in which the user addresses the machine in unmodified human language and the machine responds in kind; the paradigm that, read through Gibson's framework,…
Co-design methodology positioning affected communities as architects rather than end-users—developed through Scandinavian labor movements, applied to AI by Srinivasan.
Segal's metaphor — given thermodynamic grounding by Wiener's framework — for the 13.8-billion-year trajectory of anti-entropic pattern-creation through increasingly sophisticated channels, of which AI is the latest.
The paradigmatic figure of the peripheral isolate in the AI transition — a capable builder at the geographic and institutional margins whose different constraints predict different innovations than the center will produce.
The Orange Pill's image for the set of professional and cultural assumptions so familiar they have become invisible — the water one breathes, the glass that shapes what one sees. A modern rendering of Smith's worry about the narrowing effe…
The threshold crossing after which the AI-augmented worker cannot return to the previous regime — The Orange Pill's central metaphor for the qualitative, irreversible shift in what a single person can build.
The governance regime change in which the accumulated textual, visual, and computational output of millions of individuals was appropriated for AI training under terms their original contributions did not contemplate — the paradigmatic case …