This page lists every Orange Pill Wiki entry hyperlinked from Manfred Max-Neef — On AI. 27 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; in each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words with the Wikipedia mark link to Wikipedia.
The need for love, care, solidarity, and mutual presence — satisfied only through sustained, unproductive, vulnerable attention between persons.
The specific developmental concern that Twenge, testifying before the U.S. Senate in January 2026, ranked above her concerns about social media — the substitution of simulated relationships for real ones, in a generation whose face-to-face social c…
The regulatory, institutional, and normative arrangements governing AI development and deployment — reframed through Ostrom's framework as a polycentric governance challenge requiring coordination across multiple scales rather than the mark…
Bruce McEwen's 1993 extension of Selye's framework — the biological wear that accumulates from repeated stress responses, measurable through specific inflammatory, metabolic, and endocrine markers.
Max-Neef's framing of creation as a fundamental, universal, non-substitutable need — one of nine, served spectacularly by AI at the cost of the other eight.
The chronic, low-grade frustration of having creative impulses without the means to realize them — the structural condition AI tools address and whose scale they retrospectively reveal.
The Orange Pill claim — that AI tools lower the floor of who can build — submitted to Sen's framework, which asks the harder question: does formal access convert into substantive capability expansion?
Max-Neef's eighth need — identity as the synthesis of all others, constituted by the specific configuration of satisfiers through which needs are met, and destabilized when AI disrupts the satisfier ecology.
The dissolution of the self-structure when the competency around which professional identity was organized is rendered economically obsolete — the psychological dimension of expertise displacement.
Max-Neef's seventh need — the quality of attention characterized by openness, curiosity, and freedom from productive purpose — colonized by AI's constant availability.
Max-Neef's load-bearing distinction between the finite, universal needs and the culturally specific practices through which they are met.
Max-Neef's distinction between access to a system and voice in its governance — a distinction the AI democratization narrative systematically collapses.
The catastrophic gap between the speed of AI-driven capability displacement and the speed of institutional response — measured in months versus years.
The category of AI interactions that create the appearance of need-satisfaction without the substance — dangerous precisely because convincing.
Max-Neef's criterion for genuine development — the capacity of communities to meet their needs through means they control and can sustain — directly threatened by AI dependency patterns.
The biological infrastructure of human life — sleep, nutrition, movement, recovery — that AI-augmented workflows consume to fund creative output.
Michael Polanyi's 1966 insight that we know more than we can tell — refined by Collins into a taxonomy of three species that has become the decisive framework for understanding what AI systems can and cannot absorb from human practice.
Max-Neef's diagnostic taxonomy — synergic, singular, inhibiting, pseudo-, and violator/destroyer — that cuts against the AI discourse with surgical precision.
AI expands negative freedom (freedom from constraint) spectacularly while contracting positive freedom (autonomy, self-reliance) invisibly — two dimensions of freedom moving in opposite directions simultaneously.
Max-Neef's nine-by-four analytical grid — the operational instrument that makes multi-dimensional human welfare visible where single-axis metrics cannot.
Max-Neef's taxonomy of subsistence, protection, affection, understanding, participation, leisure, creation, identity, and freedom — finite, universal, non-substitutable.
The pathological dynamic in which intense satisfaction of one need masks the progressive neglect of the others — the central diagnostic of the AI moment.
Max-Neef's fourth need — the felt comprehension that arrives only through struggle — systematically displaced by AI's capacity to produce output without the friction that generates comprehension.
Ye and Ranganathan's 2026 Harvard Business Review ethnography of AI in an organization — the empirical documentation of task seepage and work intensification that prospect theory predicts.
Edo Segal's 2026 book on the Claude Code moment and the AI transition — the empirical ground and narrative framework on which the Festinger volume builds its diagnostic reading.
Max-Neef's paradigmatic fieldwork case — a development intervention that succeeded on every conventional metric while destroying the satisfier ecology of the community.
Hilary Gridley's January 2026 viral Substack essay 'Help! My Husband is Addicted to Claude Code' — a household production crisis expressed as relationship complaint, and the empirical touchstone Coyle's framework makes analytically legible.