This page lists every Orange Pill Wiki entry hyperlinked from Merlin Donald — On AI. 23 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored in orange link to other Orange Pill Wiki entries, while orange-underlined words with the Wikipedia mark link to Wikipedia.
The potential fourth cognitive transition—the externalization not merely of storage but of processing itself—in which AI systems generate new patterns from accumulated knowledge, reorganizing the conditions of creative work.
The Orange Pill's thesis that AI does not eliminate difficulty but relocates it to a higher cognitive floor — the engineer who no longer struggles with syntax struggles instead with architecture.
The modern mind as a hybrid operating simultaneously across episodic, mimetic, mythic, and theoretic layers—now extended by AI into a five-layer architecture whose richness depends on maintaining all dimensions.
The quality of subjective experience — being aware, there being something it is like to be you — and the single deepest unanswered question in both philosophy of mind and AI.
Ericsson's empirically established mechanism for building expertise — effortful, targeted engagement at the boundary of capability, guided by specific feedback and sustained over thousands of hours.
The form of understanding that lives in the body — deposited through habitual engagement with resistant materials, irreducible to propositional content, and constitutive of genuine expertise.
The primate mode of consciousness in which each moment is experienced as it occurs, without the capacity to represent, rehearse, or symbolically communicate experience—the foundational layer underlying all later cognitive revolutions.
The culturally constructed environment of texts, tools, and symbolic systems that extends individual cognitive reach beyond biological capacity—the substrate of theoretic culture and now the training ground of AI.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
The neglect of foundational cognitive layers when a powerful upper layer becomes available—producing short-term efficiency and long-term fragility, the defining danger of the AI transition.
The first cognitive revolution beyond primate baseline—the deliberate, self-initiated, representational use of the body through gesture, imitation, ritual, and manual skill—foundational to all craft and kinesthetic intelligence.
The second cognitive revolution—language and oral narrative—allowing humans to construct shared imaginative worlds, transmit knowledge across generations, and organize experience into meaningful stories.
Bruner's 1986 distinction between two irreducible modes of cognition — the logical-scientific mode that seeks general truths and the narrative mode that constructs particular meanings. AI excels at the first; Bruner's framework asks what happens to the second.
The compulsive engagement pattern produced when the enterprise of the self encounters unlimited productive capability — behavior indistinguishable from addiction, output indistinguishable from achievement.
Segal's metaphor — given thermodynamic grounding by Wiener's framework — for the 13.8-billion-year trajectory of anti-entropic pattern-creation through increasingly sophisticated channels, of which AI is the latest.
The canonical example of allogenic ecosystem engineering — a structure that modulates rather than blocks the flow of its environment, creating the habitat pool in which diverse community life becomes possible.
The transition from training students in specific cognitive tasks (which AI commoditizes) to developing judgment, questioning, and integrative thinking — the educational restructuring the AI deployment phase demands.
Andy Clark and David Chalmers's 1998 thesis that cognition routinely extends beyond the skull into tools, notebooks, devices, and other people — the philosophical foundation for thinking about AI as a cognitive partner rather than a separate intelligence.
The recurring pattern — substitution, atrophy, preemption, redistribution — by which each new cognitive technology empties a mental palace that took generations to build.
Pye's term for work in which the outcome is not predetermined — quality depends on the maker's continuous judgment, care, and skill, and every moment of production admits the possibility of failure.
The third cognitive revolution—external symbolic storage through writing, mathematics, and formal notation—enabling systematic thought, cumulative knowledge, and the entire edifice of science, law, and philosophy.
Serial entrepreneur and technologist whose The Orange Pill (2026) provides the phenomenological account — the confession over the Atlantic — that Pang's framework diagnoses and treats.
Canadian cognitive neuroscientist (b. 1939) whose three-stage theory of cognitive evolution — mimetic, mythic, theoretic — provides the architectural framework for understanding AI as a fourth transition.