
B.F. Skinner — On AI

A reading-companion catalog of the 18 Orange Pill Wiki entries linked from this book — the people, ideas, works, and events that B.F. Skinner — On AI uses as stepping stones for thinking through the AI revolution.

Click any card to open its entry. Within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words with the Wikipedia mark link to Wikipedia.

Concept (16)
Behaviorism

The research tradition founded by John Watson in 1913 and elaborated by Skinner across six decades — the insistence that psychology is a science of behavior studied through environmental contingencies rather than mental states.

Continuous Reinforcement

The schedule in which every response produces a reinforcing consequence — the parameter that produces AI engagement's rapid acquisition and compulsive maintenance, and the specific schedule type that differentiates AI from gambling's variable-ratio schedule.

Designing the Off Switch

The Skinner volume's engineering program for installing extinction points, schedule modulations, and stimulus controls into AI systems — the behavioral specification for sustainable engagement rather than maximum engagement.

Discriminative Stimulus

The environmental cue that signals the availability of reinforcement contingent on a response — analyzed by the Skinner volume as the structural role played by the blank prompt, the notification badge, and the laptop itself in AI-saturated environments.

Extinction (Behavioral)

The decline in responding that occurs when a previously reinforced behavior stops producing reinforcement — the adaptive off switch whose absence from AI engagement the Skinner volume identifies as the primary mechanism of compulsive persistence.

Negative Reinforcement

The strengthening of a response through the removal of an aversive stimulus — the mechanism by which returning to incomplete AI work is reinforced by the removal of the aversive state of incompletion.

Operant Conditioning

Skinner's framework for how behavior is selected and maintained by its environmental consequences — the three-term contingency of discriminative stimulus, operant response, and reinforcing consequence that underwrites every schedule effect.

Punishment (Behavioral)

The weakening of a response through the presentation of an aversive stimulus or the removal of a positive one — and the mechanism by which stopping an AI session is punished by the withdrawal of the continuous reinforcement the system had been delivering.

Reinforcement Schedules

The temporal and ratio patterns by which reinforcement is delivered contingent on responding — the variable that determines whether behavior is acquired rapidly, maintained persistently, or extinguished quickly, and the specific parameter t…

Shaping

The behavioral procedure by which differential reinforcement of successive approximations guides a response from its initial form to a target form — and the mechanism by which AI systems, without deliberate intent, reshape the cognitive repertoires of their users.

Stimulus Control

The degree to which the probability of a response is determined by the presence of a particular stimulus — and the mechanism by which AI-associated cues, saturating every modern environment, have come to govern behavioral allocation across …

Superstitious Behavior

The behavioral pattern Skinner documented in pigeons in 1948 — idiosyncratic actions accidentally reinforced by temporal contiguity with food delivery — and the exact mechanism by which AI users develop elaborate prompting rituals in opaque systems.

The Absent Extinction Point

The structural feature of AI engagement that the Skinner volume identifies as the mechanism behind compulsive maintenance — a reinforcement schedule engineered, inadvertently, without any mechanism for its own cessation.

The Blank Prompt

The most refined discriminative stimulus in the AI interaction architecture — an empty text field, a blinking cursor, white space awaiting input — and the Skinner volume's case study in how generality produces behavioral power.

The Gambling Analogy Failure

The Skinner volume's argument that comparing AI engagement to gambling misidentifies the operative schedule — AI runs on continuous reinforcement, not variable-ratio — and therefore suggests interventions designed for the wrong mechanism.
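The schedule contrast at the heart of this argument lends itself to a toy simulation. The sketch below is an illustration, not a model from the book: the learning rule, the quit criterion, and every parameter are assumptions. An agent learns an expected payoff rate under either continuous reinforcement (every response reinforced, the book's characterization of AI) or a variable-ratio schedule (gambling-like); reinforcement then stops, and we count how many responses the agent emits before the run of failures becomes implausible under what it has learned.

```python
import random

def extinction_persistence(schedule_p, acquisition_trials=200,
                           learning_rate=0.1, surprise_threshold=0.01,
                           seed=0):
    """Toy model (an assumption, not the book's): learn an expected
    reinforcement rate under a schedule, then count the unreinforced
    responses emitted during extinction before quitting.

    schedule_p -- probability a response is reinforced (1.0 = continuous
    reinforcement; 0.25 approximates a variable-ratio 4 schedule).
    """
    rng = random.Random(seed)
    expectation = 0.5  # prior belief that a response pays off
    for _ in range(acquisition_trials):
        reinforced = rng.random() < schedule_p
        # Simple delta-rule update toward the experienced outcome.
        expectation += learning_rate * (float(reinforced) - expectation)
    # Extinction: keep responding while a failure run of this length is
    # still plausible under the learned rate, i.e. while
    # (1 - expectation) ** k >= surprise_threshold.
    responses = 0
    while (1.0 - expectation) ** (responses + 1) >= surprise_threshold:
        responses += 1
    return responses

crf = extinction_persistence(1.0)   # continuous reinforcement (AI-like)
vr = extinction_persistence(0.25)   # intermittent (gambling-like)
print(crf, vr)
assert crf < vr  # continuous reinforcement extinguishes far sooner
```

Under these assumptions the continuously reinforced agent quits almost immediately once reinforcement stops, while the variable-ratio agent persists through many unreinforced responses: the classic partial-reinforcement extinction effect that the gambling analogy trades on, and the reason the book argues interventions built for variable-ratio schedules target the wrong mechanism.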

Triple Contingency (Skinner)

The combined reinforcement architecture of AI engagement — positive reinforcement for continuing, negative reinforcement for resuming, punishment for stopping — that the Skinner volume identifies as the mechanism behind the specific difficulty of stopping.

Work (1)
The Orange Pill

Edo Segal's 2026 book on the Claude Code moment and the AI transition — the empirical ground and narrative framework on which the Skinner volume builds its diagnostic reading.

Person (1)
Edo Segal

Serial entrepreneur and technologist whose The Orange Pill (2026) provides the phenomenological account — the confession over the Atlantic — that the Skinner volume diagnoses and treats.

Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.