This page lists every Orange Pill Wiki entry hyperlinked from James J. Gibson — On AI. 21 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
Gibson's load-bearing concept: the possibilities for action an environment offers a particular organism — real, relational, value-laden, and present whether or not anyone perceives them.
The full field of offerings an environment provides — what it makes perceivable, easy, and available versus what it hides, makes difficult, or eliminates entirely.
The study of how AI-saturated environments shape the minds that live inside them — the framework for asking what becomes of judgment, curiosity, and the capacity for sustained attention when answers become abundant and friction is engineer…
The developmental experience of having nothing externally provided to attend to, which forces the developing mind to generate its own objects of attention from internal resources — the foundational soil of adult creative capacity.
The unpredictable behavioral patterns that arise from the interaction between an introduced technology's affordance structure and the specific organisms and ecology into which it is introduced — irreducible to either technology or user in …
The active, friction-rich movement through a structured environment by which perceptual attunement develops — distinct from the passive reception of delivered information.
The socially shared norms that govern not merely the expression of emotion but the experience of emotion itself — and the mechanism through which the AI discourse enforces enthusiasm while pathologizing grief.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
Gibson's technical alternative to information processing — the direct detection of structured information in the ambient array by an organism whose perceptual system has been educated through active engagement.
The interface paradigm — inaugurated at scale by large language models in 2022–2025 — in which the user addresses the machine in unmodified human language and the machine responds in kind; the paradigm that, read through Gibson's framework,…
The educated capacity of a perceptual system to detect invariants in its environment — the ecological account of expertise, developed through active exploration rather than stored rules.
A systematic inventory of what a designed environment offers for doing — which actions it makes perceivable and easy, which it hides, and which it forecloses entirely.
The canonical example of allogenic ecosystem engineering — a structure that modulates rather than blocks the flow of its environment, creating the habitat pool in which diverse community life becomes possible.
The perverse outcomes that arise when interventions designed to solve a problem create incentive structures producing more of the problem — the policy analog of Gibson's emergent affordance behavior.
Gibson's reframing of perception as an active, exploratory relationship between organism and environment — not a computational process inside the head but a direct pickup of structure in the ambient array.
Gibson's technical term for the structured field of ambient light available at any point of observation — not photons in flight but light already organized by surfaces, textures, and occlusions into patterns that specify the layout of the …
The specific behavioral configuration — compulsive AI-augmented engagement experienced as exhilaration from within and pathology from without — produced by a reinforcing loop without a balancing counterpart.
The phenomenon by which a mastered tool becomes invisible — incorporated into the perceptual apparatus as a medium through which the world is perceived, rather than an object in the world that is perceived.
Ye and Ranganathan's 2026 Harvard Business Review ethnography of AI in an organization — the empirical documentation of task seepage and work intensification that prospect theory predicts.
Norman's 1988 landmark — originally The Psychology of Everyday Things — that established the foundational vocabulary of human-computer interaction and whose principles, as the Norman volume argues, apply with renewed urgency to the AI era.