This page lists every Orange Pill Wiki entry hyperlinked from Ronald Heifetz — On AI (27 entries in total). Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open its entry. Within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
A problem requiring changes in people's values, beliefs, habits, or identities rather than the application of existing expertise—Heifetz's foundational distinction from technical problems.
The practice of exercising institutional authority to raise questions and name uncertainties rather than to provide solutions—Heifetz's inversion of the leadership contract.
The deepest level of Schein's cultural model — beliefs so taken for granted that articulating them would seem absurd, and the level at which the AI transition is forcing painful revision.
The specific depletion produced by sustained emotional labor under conditions of inadequate replenishment — Hochschild's framework reveals AI's new division of feeling as a burnout machine.
The virtue Erikson assigned to the successful resolution of the Industry stage — the quiet confidence that arises from knowing one can do things well — now requiring redefinition in the age of AI.
The mode of engagement in which the person produces the expected response, meets the requirement, and fits into the predetermined framework — the structural opposite of creative apperception.
The leader's refusal to solve the adaptive challenge on behalf of the organization—instead creating conditions for the people who hold the problem to do the transformative work themselves.
The relational context that simultaneously supports and challenges developmental growth — holding on, letting go, and staying in place — the infrastructure through which subject-object shifts actually occur.
The dissolution of the self-structure that occurs when the competency around which a professional identity was organized is rendered economically obsolete — the psychological dimension of expertise displacement.
The Opus 4.6 simulation's core diagnosis: AI broke the coordination bottleneck that governed knowledge work for fifty years, and the constraint has migrated to the builder's capacity to decide what deserves to exist.
The threshold zone between an old professional identity and a new one — borrowed by Ibarra from anthropologist Victor Turner to describe the disorienting, generative period when a person belongs fully to neither the self they are leaving nor the one they are becoming.
The zone of distress where anxiety is high enough to prevent avoidance but low enough to permit learning—Heifetz's temperature range for adaptive work.
Edmondson's foundational construct — the shared belief that a team is safe for interpersonal risk-taking — and the single strongest predictor of whether AI adoption produces learning or concealment.
The claim — central to MacIntyre's application to AI — that software engineering meets the criteria of a genuine practice: internal goods, standards of excellence, and a tradition of argument about what good software is.
A problem for which the necessary knowledge and procedures already exist—solvable by applying expertise without requiring those affected to change their identities or values.
Heifetz's spatial metaphor for diagnostic perspective: the dance floor (inside the action, reactive) versus the balcony (above it, seeing patterns invisible from within).
The willingness to fail the legitimate expectations people hold—that leaders will provide answers, reduce anxiety, protect from pain—because meeting those expectations prevents adaptive work.
The structural problem that the AI systems most in need of human oversight are simultaneously eliminating the experiences through which the capacity for that oversight is built.
The systematic organizational error of treating the AI transition as a skills gap (technical) when it is fundamentally an identity crisis (adaptive).
The threshold crossing after which the AI-augmented worker cannot return to the previous regime — The Orange Pill's central metaphor for the qualitative, irreversible shift in what a single person can build.
Robert Solow's 1987 observation — you can see the computer age everywhere except in the productivity statistics — which Brynjolfsson spent his career resolving into three distinct problems: timing, measurement, and organization.
The question "what is a human being for?" — which Clarke predicted intelligent machines would force humanity to ask, and which arrived in 2022–2025 with more force and less philosophical preparation than he expected.
Juma's term for the integrated institutional system — economic, professional, cognitive, and cultural — that determines whether a technological transition produces broadly shared prosperity or concentrated suffering.
The constellation of mechanisms organizations deploy to manage anxiety without doing adaptive work—sophisticated, well-resourced strategies that create progress illusions.