This page lists every Orange Pill Wiki entry hyperlinked from Michael Polanyi — On AI. 24 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words marked with the Wikipedia icon link to Wikipedia.
Byung-Chul Han's diagnosis of the cultural trajectory toward frictionlessness — a smoothness that conceals the labor and struggle that gave previous work its depth.
The personal act by which a knower stakes herself on a claim—accepting responsibility, risking error, exercising judgment that transforms information into knowledge.
Wenger's foundational unit of social learning — a group bound together by shared domain, mutual engagement, and a collective repertoire developed over time through joint work.
The default failure mode of AI output — eloquent, structured, and incorrect — presented with the same confidence as valid claims and resistant to detection without trained evaluative capacity.
The cultivated capacity to distinguish quality from adequacy through tacit standards that resist specification—expertise as evaluative sensibility rather than rule-following.
The form of understanding that lives in the body — deposited through habitual engagement with resistant materials, irreducible to propositional content, and constitutive of genuine expertise.

Higher levels of reality — life, mind, culture — arise from lower levels yet possess organizational principles irreducible to them.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
The universal architecture of knowing: consciousness attends from subsidiary elements to focal meanings—and the subsidiaries must remain subsidiary for understanding to occur.
The process by which a tool becomes phenomenologically transparent—absorbed so completely into the user's perceptual apparatus that she attends through it rather than to it.
The pre-articulate sense of a hidden pattern—the felt pull toward a discovery before the discovery can be specified or justified.
Knowledge is irreducibly personal — requiring the knower's commitment, judgment, and tacit engagement — a claim set against the positivist fiction of detached objectivity.
The observation that humans know far more than they can articulate—and that this tacit knowledge has historically resisted automation, predicting the difficulty of computerizing adaptive judgment.
The tacit capacity to assess significance, plausibility, and quality in scientific work—grounded in embodied practice and irreducible to explicit criteria.
The dual structure of attention: subsidiary elements support understanding without being noticed; focal meanings emerge through their integration — and inverting the two collapses the skill.
The vast, inarticulate substrate of understanding that operates beneath conscious awareness and cannot be captured in any specification, no matter how detailed—Polanyi's foundational insight that "we can know more than we can tell."
The structural challenge that AI creates by eliminating the bodily engagement through which expertise was historically developed and transmitted between generations.
The structure of trust on which all knowing rests—the knower's responsible commitment to frameworks she cannot fully verify yet must accept before inquiry can begin.
Discovery begins with intimation—an inarticulate sense that something is there—followed by commitment to pursue it before evidence justifies the pursuit.
The conversion of humanity's accumulated written output — produced over centuries, sustained by public education and research — into private proprietary value, without compensation flowing back to the public that produced the resource.
Anthropic's command-line coding agent — the specific product through which the coordination constraint shattered in the winter of 2025, and which reached $2.5B run-rate revenue within months.
Neural networks trained on internet-scale text that have, since 2020, demonstrated emergent linguistic and reasoning capabilities — in Whitehead's vocabulary, computational systems whose prehensions of the textual corpus vastly exceed any i…