Don Norman — On AI — Wiki Companion

A reading-companion catalog of the 23 Orange Pill Wiki entries linked from this book — the people, ideas, works, and events that Don Norman — On AI uses as stepping stones for thinking through the AI revolution.

Each entry is a deeper dive on a person, concept, work, event, or technology. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.

Concept (21)
Affordances
Concept

The actionable properties of an object as perceived by a user — what the thing permits you to do — borrowed from Gibson, refined by Norman, and rendered newly problematic by an AI interface whose action space is unbounded and invisible.

Automation Dependence
Concept

The quiet risk of comprehensive automation: not that machines dominate us, but that we lose the capabilities they replace. Asimov's Solarians are the founding fiction; contemporary work on cognitive offloading is the empirical counterpart.

Cascading Error
Concept

The propagation of initial interpretation or specification errors through complex bodies of AI-generated work, compounding silently because the speed of production outpaces the speed of evaluation.

Conceptual Model
Concept

The user's mental representation of how a system works — accurate enough to predict, diagnose, and recover from the system's behavior — and the specific cognitive architecture that AI's probabilistic, context-dependent outputs systematical…

Flow State
Concept

Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.

Forcing Function (Norman)
Concept

Norman's design pattern for preventing error by making the wrong action impossible rather than merely unlikely — and the conceptual template for AI-era design interventions that pause production before errors cascade.

Interpretation Error
Concept

A new category of AI-era error — the person's prompt is clear and the system's execution is correct, but the system's interpretation of the prompt diverges from the person's meaning, producing an output that is technically right and practi…

Knowledge in the World vs. in the Head
Concept

Norman's foundational distinction — between knowledge embedded in the environment (available without memorization) and knowledge carried in memory (requiring learning and recall) — and the observation that AI systems place nearly all rele…

Natural Language Interface
Concept

The interface paradigm — inaugurated at scale by large language models in 2022–2025 — in which the user addresses the machine in unmodified human language and the machine responds in kind; the paradigm that, read through Gibson's framework,…

Progressive Affordance Disclosure
Concept

The Norman volume's proposed design pattern for conversational AI — revealing capabilities dynamically through dialogue rather than statically through fixed interface layouts, extending Norman's progressive disclosure into the conversation…

Representational Mismatch
Concept

Tversky's diagnostic term for the gap between the spatial structure of a thinker's understanding and the spatial structure a tool demands — the hidden tax on every pre-AI interface.

Resilience Design
Concept

Norman's framework for designing systems that maintain human capability even as technology changes the conditions under which capability is exercised — and the design orientation Chapter 6 of the Norman volume proposes as the antidote to …

Signifiers
Concept

The perceivable cues that tell a person what an object affords — separate from the affordance itself, and in the AI era almost entirely absent, misleading, or replaced by accidental signals the system never intended to send.

Silent Redesign of Human Capability
Concept

The gradual, invisible atrophy of cognitive skills that occurs when capabilities distributed across a human-AI coupling cease to be exercised by the human component — a design consequence Norman's framework predicts but current AI systems …

Specification Error
Concept

A second new AI-era error category — the user's specification is incomplete in ways she did not know were possible, and the system produces a technically correct output that omits a requirement she never articulated because she did not kno…

The Blank Prompt (Norman Reading)
Concept

The empty text field of the conversational AI interface — read through Norman's framework as the worst-designed primary interface element in the history of computing, communicating less about its capabilities than the average door handle.

The Coupled System
Concept

The framework for analyzing human-AI interaction as a single integrated system rather than a human using a tool — where the behavior of the whole cannot be understood by analyzing components in isolation, and design must address the coupli…

The Gulf of Evaluation
Concept

The distance between what a system has done and what the person can perceive, interpret, and judge about what it did — the gulf that has blown open in the AI era precisely because the Gulf of Execution collapsed.

The Gulf of Execution
Concept

Norman's name for the distance between what a person wants to do and what a system allows her to do — the chasm the AI interface has, for the first time in tool history, crossed from the machine's side.

The Translation Cost
Concept

The tax every previous computer interface levied on every user — the cognitive overhead of converting human intention into machine-acceptable form — a tax natural language interfaces have abolished.

Three Levels of Emotional Design
Concept

Norman's framework for the visceral, behavioral, and reflective levels of emotional processing that every designed artifact engages — and the lens through which Chapter 5 of the Norman volume diagnoses the emotional architecture of AI-as…

Work (1)
The Orange Pill
Work

Edo Segal's 2026 book on the Claude Code moment and the AI transition — the empirical ground and narrative framework on which the Norman volume builds its diagnostic reading.

Person (1)
Edo Segal
Person

Serial entrepreneur and technologist whose The Orange Pill (2026) provides the phenomenological account — the confession over the Atlantic — that Norman's framework diagnoses and treats.

Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.
23 entries