This page lists every Orange Pill Wiki entry hyperlinked from C Fred Alford — On AI. 25 entries total. Each is a deeper dive on a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored in orange link to other Orange Pill Wiki entries, while orange-underlined words with the Wikipedia mark link to Wikipedia.
The Orange Pill's thesis that AI does not eliminate difficulty but relocates it to a higher cognitive floor — the engineer who no longer struggles with syntax struggles instead with architecture.
Alford's demonstration that the destruction of whistleblowers depends not on organizational malice but on the passivity of bystanders — the colleagues who saw, knew, and chose not to intervene, each rational choice compounding into a systemic failure.
The Holocaust-scholarship concept Alford extends to ordinary organizational life: situations in which every available option involves moral compromise, and the actor must choose between evils rather than between good and evil.
Alford's sharpest diagnostic: the institutional mechanism by which moral concerns are converted into efficiency problems, then dismissed as optimization failures rather than addressed as ethical imperatives.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
The harm that occurs when the institutions people depend on for protection become the source of harm — not through malice but through structural failure to support those the institution trained, formed, and now requires to bear the costs of its failures.
The specific wound to conscience — distinct from PTSD — inflicted when a person participates in, witnesses, or fails to prevent acts that violate her moral beliefs. The injury the AI transition is producing in builders, users, and bystanders.
Alford's most devastating empirical finding: the worst harm institutions do to whistleblowers is not economic but narrative — replacing the witness's story with the organization's story, fragmenting the self that depends on narrative coherence.
The Orange Pill's term for compulsive engagement with generative tools — re-specified by the Skinner volume not as metaphor but as the precise behavioral signature of a continuous reinforcement schedule without an extinction point.
Honneth's framework holding that human identity is a social achievement constituted through three forms of mutual acknowledgment — love, rights, and social esteem — each producing a distinct dimension of selfhood.
Segal's metaphor — given thermodynamic grounding by Wiener's framework — for the 13.8-billion-year trajectory of anti-entropic pattern-creation through increasingly sophisticated channels, of which AI is the latest.
Byung-Chul Han's term for the contemporary cultural preference for frictionless surfaces — the iPhone's glass, the algorithmic feed, the AI-generated text — that conceals the labor and struggle that traditionally produced depth.
The taken-for-granted background of beliefs about fairness, reciprocity, and institutional goodwill that makes organizational life possible — and whose shattering Alford identified as the deepest cost of whistleblowing.
The canonical example of allogenic ecosystem engineering — a structure that modulates rather than blocks the flow of its environment, creating the habitat pool in which diverse community life becomes possible.
Alford's sober accounting of what moral clarity actually costs inside institutions that prize speed over wisdom — position, influence, and the capacity to effect change from within, paid in advance by those whose clarity arrives early.
The cultural mechanism — operating through marginalization, reframing, and linguistic capture — by which every society disposes of the person whose accurate testimony it cannot accept. In the AI moment, the mechanism operates through a single word…
The population mourning what the AI transition eliminates — senior practitioners whose recognition demand is systematically truncated: their diagnosis acknowledged, their claim to institutional response denied.
The Orange Pill's image for the set of professional and cultural assumptions so familiar they have become invisible — the water one breathes, the glass that shapes what one sees. A modern rendering of Smith's worry about the narrowing effects of the division of labor.
The economic regime that emerges when the cost of execution approaches zero and the premium on deciding what to execute rises correspondingly — the Smithian reading of the Orange Pill moment.
The political and emotional reaction against transformative technology on behalf of the workers and ways of life it displaces — historically vilified, increasingly reconsidered, and directly relevant to the AI transition.
The person whose position inside an organization or culture lets her see a truth the institution has decided not to see — ordinary, not heroic, and in Alford's framework the figure the AI transition is producing at unprecedented scale.
The threshold crossing after which the AI-augmented worker cannot return to the previous regime — The Orange Pill's central metaphor for the qualitative, irreversible shift in what a single person can build.
Alford's application of Goffman's concept: the modern corporation operates as a total institution whose self-preserving logic overrides the moral judgment of any individual within it — the structural mechanism that destroys moral witnesses.
The vast majority experiencing the full emotional complexity of the AI transition without a clean narrative to organize it — most accurate in perception, least audible in discourse.
Alford's empirical finding — documented across hundreds of cases — that truth-tellers inside organizations suffer not dramatic martyrdom but slow, grinding destruction of career, reputation, and self.