This page lists every Orange Pill Wiki entry hyperlinked from Slavoj Žižek — On AI. 25 entries total. Each is a deeper dive on a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored in orange link to other Orange Pill Wiki entries, while orange-underlined words carrying the Wikipedia mark link to Wikipedia.
Byung-Chul Han's diagnosis — extended through Dissanayake's biological framework — of the cultural dominance of frictionless surfaces and the specific reason the smooth feels biologically wrong.
Not a program but a practice: refusing the ideology that presents smoothness as the only desirable quality and treats friction as pure cost, making alternatives to smoothness thinkable rather than prescribing specific alternatives.
The Berkeley researchers' prescription for the AI-augmented workplace — structured pauses, sequenced workflows, protected human-only time, behavioral training alongside technical training — the operational counterpart to Maslach's fix-the-workplace approach.
The Orange Pill's thesis that AI does not eliminate difficulty but relocates it to a higher cognitive floor — the engineer who no longer struggles with syntax struggles instead with architecture.
The study of how AI-saturated environments shape the minds that live inside them — the framework for asking what becomes of judgment, curiosity, and the capacity for sustained attention when answers become abundant and friction is engineered away.
McGann's post-exposure redefinition of authorship: not solitary creation but the act of pointing a collaborative process toward a specific end, from a position of stakes and biographical specificity.
Marx's analysis of how commodities appear to possess value inherently, concealing the social relations of production — now operating at maximum efficiency in AI outputs whose smooth surfaces hide infrastructure, training data, and design decisions.

The Enlightenment pathology: consciousness that sees through the ideological mask and wears it anyway, sustained not by ignorance but by enjoyment that knowledge cannot touch.
The split structure: I know very well, but nevertheless—knowledge and practice operate simultaneously without contradiction, the gap where ideology achieves its most complete form.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
The analysis revealing that ideology operates not through false belief but through practice—subjects who know perfectly well what they are doing, but still, they do it.
The delegation of passive experience to another—the VCR that watches for you, the prayer wheel that prays for you, the AI that creates for you while you occupy the position of creator.
Lacan's untranslatable term for satisfaction that exceeds pleasure—a pursuit that does not cease when its object is obtained, sustaining itself through repetition of the circuit rather than achievement of the goal.
The compulsive engagement pattern produced when the enterprise of the self encounters unlimited productive capability — behavior indistinguishable from addiction, output indistinguishable from achievement.
Not choice within a framework but the decision that reconfigures the framework—retroactively changing the symbolic coordinates, producing options that were previously unthinkable, unguaranteed and irreducible to planning.
The symbolic order—externalized, algorithmic—that serves as invisible guarantor of meaning, implied addressee of every action, the entity supposed to know even after subjects consciously recognize it doesn't.
The figure in whom the thymotic crisis of the AI transition concentrates — the credentialed professional whose decades of expertise are being repriced by a technology she did not design and cannot control.
The political and emotional reaction against transformative technology on behalf of the workers and ways of life it displaces — historically vilified, increasingly reconsidered, and directly relevant to the AI transition.
The threshold crossing after which the AI-augmented worker cannot return to the previous regime — The Orange Pill's central metaphor for the qualitative, irreversible shift in what a single person can build.
The irreducible gap between incompatible perspectives on AI—builder's expansion, worker's contraction, philosopher's erosion—that cannot be synthesized without concealing the antagonism the synthesis is meant to resolve.
Lacan's sujet supposé savoir—the analyst to whom knowledge is attributed—now incarnated in AI systems whose fluent outputs sustain transference indefinitely without the dissolution analysis requires.
The conversion of humanity's accumulated written output — produced over centuries, sustained by public education and research — into private proprietary value, without compensation flowing back to the public that produced the resource.
The clinical moment when the fantasy structuring desire becomes visible as construction—no longer invisible scaffolding but seen contingency—producing subjective destitution and the possibility of genuine choice.