This page lists every Orange Pill Wiki entry hyperlinked from Douglas Engelbart — On AI. 38 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words marked with the Wikipedia logo link to Wikipedia.
Byung-Chul Han's diagnosis — extended through Dissanayake's biological framework — of the cultural dominance of frictionless surfaces and the specific reason the smooth feels biologically wrong.
The problem of making a powerful AI system reliably pursue goals that its designers and users actually endorse — the central unsolved problem of contemporary AI.
The Renaissance art of critical reading — the active evaluation of texts against evidence, logic, and source reliability — and the direct ancestor of the evaluative discipline AI-generated content requires.
AGI: a hypothetical system with human-level cognitive ability across essentially every domain. The transition point that AI-safety thinking orients around, even though no one agrees on what it is.
The study of how AI-saturated environments shape the minds that live inside them — the framework for asking what becomes of judgment, curiosity, and the capacity for sustained attention when answers become abundant and friction is engineered away.
Engelbart's foundational distinction: automation removes the human from the loop; augmentation redesigns the loop so that the human's participation becomes more powerful. The most consequential design decision of the AI decade.
The five burdens augmentation imposes on the human it amplifies: exposed judgment, intellectual honesty, emotional resilience, self-directed development, and sustained attention. Augmentation does not make work easier; it makes work different.
Brown's seven-component operationalization of trust — Boundaries, Reliability, Accountability, Vault, Integrity, Non-judgment, Generosity — that converts an abstraction into observable practice.
The two archetypal organizational responses to AI-driven productivity gains — reducing staff to maintain output or maintaining staff to expand output — each producing fundamentally different professional outcomes.
Engelbart's assumption that the human and the tool would evolve together at approximately balanced rates — and the structural diagnosis of what happens when the tool accelerates beyond the human's capacity to adapt.
Engelbart's neglected insight: augmentation's highest-value application is not the amplification of individuals but the enhancement of teams — and the current AI deployment is reproducing the industry's historical failure to invest in the c…
Ericsson's empirically established mechanism for building expertise — effortful, targeted engagement at the boundary of capability, guided by specific feedback and sustained over thousands of hours.
Toffler's 1970 diagnosis of the psychophysiological stress produced when human beings encounter more change than they can process — not the content of any particular change, but the pace itself.
Engelbart's formalization of the augmented system: Humans using Language, Artifacts, Methodology, and Training. Every component shapes every other, and improving one in isolation is as likely to degrade the system as to enhance it.
Segal's term for the gap between what a person can conceive and what they can produce — which AI collapsed to approximately the length of a conversation, and which Gopnik's framework reveals to be an exploitation metric that leaves the exp…
The interface paradigm — inaugurated at scale by large language models in 2022–2025 — in which the user addresses the machine in unmodified human language and the machine responds in kind; the paradigm that, read through Gibson's framework,…
The discipline of formulating a question such that a capable answering system produces a useful answer. Asimov's Multivac stories prefigured it; prompt engineering operationalizes it.
Segal's metaphor — given thermodynamic grounding by Wiener's framework — for the 13.8-billion-year trajectory of anti-entropic pattern-creation through increasingly sophisticated channels, of which AI is the latest.
Michael Polanyi's term for the knowledge that lives in the hands and nervous system rather than in explicit propositions — acquired through practice, failure, and embodied pattern recognition, and dissolved by AI workflows that produce ou…
The mechanism — documented in the Berkeley study of AI workplace adoption — by which AI-accelerated work colonizes previously protected temporal spaces, converting every pause into an opportunity for productive engagement.
The canonical example of allogenic ecosystem engineering — a structure that modulates rather than blocks the flow of its environment, creating the habitat pool in which diverse community life becomes possible.
Engelbart's organizing strategy: use the tools you are building to improve the process of building them. Each cycle makes the next faster, producing a compounding spiral of capability rather than linear improvement.
The transition from training students in specific cognitive tasks (which AI commoditizes) to developing judgment, questioning, and integrative thinking — the educational restructuring the AI deployment phase demands.
The widening structural distance between the speed of technological capability and the speed of institutional response — the defining failure mode of democratic governance in an exponential era.
The economic regime that emerges when the cost of execution approaches zero and the premium on deciding what to execute rises correspondingly — the Smithian reading of the Orange Pill moment.
The threshold crossing after which the AI-augmented worker cannot return to the previous regime — The Orange Pill's central metaphor for the qualitative, irreversible shift in what a single person can build.
The question "what is a human being for?" — which Clarke predicted intelligent machines would force humanity to ask, and which arrived in 2022–2025 with more force and less philosophical preparation than he expected.
What Engelbart established and what he left incomplete: the foundation of augmentation theory is sound, but the culture, pedagogy, governance, measurement systems, and philosophy required to make augmentation operational at civilizational scale remain unbuilt.
The structural diagnosis of why the computing industry has consistently preferred automation over augmentation: six reinforcing forces — measurement, sales, implementation, organizational compatibility, designer comfort, and psychological e…
Neural networks trained on internet-scale text that have, since 2020, demonstrated emergent linguistic and reasoning capabilities — in Whitehead's vocabulary, computational systems whose prehensions of the textual corpus vastly exceed any i…
Engelbart's working implementation of the augmentation framework — the system the SRI team used to build itself. The first platform for genuine collective cognition, and the most sophisticated demonstration of bootstrapping ever attempted.
The AI-powered conversational concierge kiosk that Edo Segal's team at Napster built in thirty days for CES 2026 — the Orange Pill's central case of AI-accelerated specific-purpose design, read through Rams's framework as a case of asking "useful to whom?"
Ye and Ranganathan's 2026 Harvard Business Review ethnography of AI in an organization — the empirical documentation of task seepage and work intensification that prospect theory predicts.
Serial entrepreneur and technologist whose The Orange Pill (2026) provides the phenomenological account — the confession over the Atlantic — that Pang's framework diagnoses and treats.
American writer and educator (b. 1947) who taught Engelbart's 1962 paper for years at Stanford and became the principal interpreter of augmentation for a generation of readers who would not otherwise have encountered it.
The canonical moment in Segal's work with Claude when the model produced a passage of philosophical elegance that was rhetorically compelling and substantively wrong — the paradigm case of fluent fabrication, and the founding episode of th…
Engelbart's ninety-minute 1968 demonstration that showed collaborative editing, hypertext, video conferencing, and the mouse — as integrated components of an augmentation system. The audience took home the peripherals and left behind the vision.
The February 2026 week-long training session in which Edo Segal flew to Trivandrum, India, to work alongside twenty of his engineers as they adopted Claude Code — producing the twenty-fold productivity multiplier documented in The Orange Pill.