This page lists every Orange Pill Wiki entry hyperlinked from Aza Raskin — On AI, 20 entries in total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry. Within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
Center for Humane Technology's term for the systematic weakening of human capacities that engagement-optimized technology produces in the cognitive domains the technology assists.
Raskin's distinction between technology that accelerates what humans already do (super human) and technology that expands what it means to be human (extra human).
Raskin's name for the design philosophy that treats user attention, energy, and cognitive capacity as resources to be consumed — contrasted with flourishing-oriented design.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
The resistance AI tools eliminate from knowledge work — a category whose composition (wolf or parasite?) determines whether its elimination is liberation or erosion.
The implementable design specifications — reflection prompts, natural stopping points, calibrated challenge, cognitive health metrics — that would redirect AI tools from extraction to flourishing.
Designed moments of conscious choice within a tool's engagement flow — the specific design feature eliminated by infinite scroll and reintroduced by humane AI design.
Raskin's term for the design philosophy that optimizes for engagement through immediate feedback, variable reward, and the elimination of natural stopping points.
The economic system in which human attention is harvested, packaged, and sold to advertisers — the infrastructure that drives the algorithmic pathologies Gore calls artificial insanity.
The structural duty — analogous to medical and structural-engineering obligations — that knowledge of mechanism imposes on those who design tools affecting millions who cannot see the mechanism themselves.
The compulsive engagement pattern unique to AI collaboration — indistinguishable from flow from the inside, producing real work and real dependency simultaneously.
The reinforcement schedule that produces the most persistent behavior and the most intense dopaminergic activation: rewards delivered at unpredictable intervals. The slot machine's architecture, the social media feed's architecture, and …
The 2006 interface pattern Aza Raskin designed in an afternoon to eliminate the bottom of the webpage — now estimated to consume 200,000 human lifetimes per day.
Neural networks trained on internet-scale text that have, since 2020, demonstrated emergent linguistic and reasoning capabilities — in Whitehead's vocabulary, computational systems whose prehensions of the textual corpus vastly exceed any i…
American designer (b. 1984), inventor of infinite scroll, co-founder of the Center for Humane Technology and the Earth Species Project — the builder who indicts his own creation.
Serial entrepreneur and technologist whose The Orange Pill (2026) provides the phenomenological account — the confession over the Atlantic — that Pang's framework diagnoses and treats.
American technology ethicist, former Google design ethicist, and co-founder of the Center for Humane Technology — Raskin's closest intellectual partner in the humane technology movement.