This page lists every Orange Pill Wiki entry hyperlinked from Daniel Dennett — On AI. 27 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open its entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
Byung-Chul Han's diagnosis — extended through Dissanayake's biological framework — of the cultural dominance of frictionless surfaces and the specific reason the smooth feels biologically wrong.
The Orange Pill's thesis that AI does not eliminate difficulty but relocates it to a higher cognitive floor — the engineer who no longer struggles with syntax struggles instead with architecture.
Daniel Dennett's phrase for the common-sense but probably incoherent picture in which consciousness is a central inner stage where experiences "arrive" to be watched by an internal self. Directly relevant to how we think about whether AI systems could have an inner observer at all.
The quality of subjective experience — the fact that there is something it is like to be an aware system — and the single deepest unanswered question in both philosophy of mind and AI.
Dennett's provocation that consciousness is not a unified mystery but a collection of specific cognitive mechanisms — memory tricks, attention tricks, self-modeling tricks — whose combination produces what feels like something magical and ineffable.
Dennett's slogan for the methodological commitment to explaining intelligence by earned mechanisms — cranes that build from the ground up — rather than by unexplained miracles imported from above.
The Orange Pill claim — that AI tools lower the floor of who can build — submitted to Sen's framework, which asks the harder question: does formal access convert into substantive capability expansion?
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
Dennett's term for the reasons for which a design exists without any reasoner having formulated them — the kind of rationality that evolved systems and trained networks alike embody without understanding.
Dennett's third-person method for studying consciousness that takes subjects' reports seriously as data without granting them unchecked authority about what is actually happening in their minds.
Segal's term for the gap between what a person can conceive and what they can produce — which AI collapsed to approximately the length of a conversation, and which Gopnik's framework reveals to be an exploitation metric that leaves the exploration side of the ledger untouched.
Daniel Dennett's theory that consciousness is not a single coherent stream but a parallel process of competing neural drafts — one of the most influential post-Cartesian theories of mind.
The intrinsic qualitative character of conscious experience — the redness of red, the specific felt quality of pain — and the feature of mind whose relation to physical process is the substance of the hard problem.
Dennett's 1991 argument that patterns are real if they compress — if treating them as real yields genuine predictive leverage — a criterion that legitimizes the intentional stance without requiring metaphysical substance.
Segal's metaphor — given thermodynamic grounding by Wiener's framework — for the 13.8-billion-year trajectory of anti-entropic pattern-creation through increasingly sophisticated channels, of which AI is the latest.
The device that increases the magnitude of whatever passes through it without evaluating the content — Wiener's framework for understanding AI as a tool that carries human signal, or human noise, with equal power and no judgment.
The canonical example of allogenic ecosystem engineering — a structure that modulates rather than blocks the flow of its environment, creating the habitat pool in which diverse community life becomes possible.
Segal's image of consciousness as a fragile flame in cosmic darkness — the philosophical foundation of consciousness-based identity, and the scaffolding whose developmental adequacy this book interrogates.
The uncomfortable fact that AI's benefits and costs do not distribute evenly across the population of affected workers — a Smithian question about institutions, not a technical question about tools.
The Orange Pill's image for the set of professional and cultural assumptions so familiar they have become invisible — the water one breathes, the glass that shapes what one sees. A modern rendering of Smith's worry about the narrowing effe…
Daniel Dennett's strategy of treating a system as if it had beliefs, desires, and rationality — a pragmatic alternative to metaphysical debates about what "really" has a mind.
The specific behavioral configuration — compulsive AI-augmented engagement experienced as exhilaration from within and pathology from without — produced by a reinforcing loop without a balancing counterpart.
Segal's term for the population holding contradictory truths about AI in paralyzed equilibrium — reread by Mouffe's framework as the characteristic subject-position of the post-political condition.
Dennett's account of the experienced self as a simplified interface the brain presents to itself — useful, predictively tractable, and no more the reality beneath than the desktop is the computer's hardware.
Dennett's term for the external cognitive instruments — language, notation, institutions, and now AI — that humans download into biological hardware never designed for the feats it performs.