This page lists every Orange Pill Wiki entry hyperlinked from Marc Andreessen — On AI. 25 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
Byung-Chul Han's diagnosis — extended through Dissanayake's biological framework — of the cultural dominance of frictionless surfaces and the specific reason the smooth feels biologically wrong.
The strategic deployment of public resources, regulatory frameworks, and trade instruments to build domestic AI capability — practiced aggressively by the leading nations, prohibited or discouraged for the rest, and the central determinant …
The Orange Pill's thesis that AI does not eliminate difficulty but relocates it to a higher cognitive floor — the engineer who no longer struggles with syntax struggles instead with architecture.
Andreessen's framework for how the binding constraint in information economies moved from distribution to attention as distribution costs collapsed — the pattern that now repeats with execution and judgment in the AI era.
The study of how AI-saturated environments shape the minds that live inside them — the framework for asking what becomes of judgment, curiosity, and the capacity for sustained attention when answers become abundant and friction is engineer…
The Orange Pill claim — that AI tools lower the floor of who can build — submitted to Sen's framework, which asks the harder question: does formal access convert into substantive capability expansion?
Clayton Christensen's framework for how incumbents are displaced by inferior products that serve overlooked segments — the analytical lens through which Andreessen's software-eating thesis becomes a precise economic claim.
Perez's structural distinction between the speculative capital that dominates installation and the patient capital that dominates deployment — two logics with different time horizons, different objectives, and different social consequences.
Carlota Perez's framework distinguishing the speculative, financial-capital-driven installation phase of a technological revolution from the orderly, institution-governed deployment phase — the analytical lens through which the AI transitio…
Segal's metaphor — given thermodynamic grounding by Wiener's framework — for the 13.8-billion-year trajectory of anti-entropic pattern-creation through increasingly sophisticated channels, of which AI is the latest.
The phenomenological signature of performative reconstitution experienced from the inside — the specific alternation between excitement and terror that marks the unmaking and remaking of a professional self.
Amodei's extension of Segal's amplifier framework — the amplifier is not neutral, the design choices embedded in an AI system are moral choices, and the designer shares responsibility with the user for what gets amplified.
The canonical example of allogenic ecosystem engineering — a structure that modulates rather than blocks the flow of its environment, creating the habitat pool in which diverse community life becomes possible.
Andreessen's defining conviction that builders — founders, engineers, designers who ship working products — are the primary agents of historical progress, not regulators, critics, or commentators.
Segal's image of consciousness as a fragile flame in cosmic darkness — the philosophical foundation of consciousness-based identity, and the scaffolding whose developmental adequacy this book interrogates.
The Orange Pill's image for the set of professional and cultural assumptions so familiar they have become invisible — the water one breathes, the glass that shapes what one sees. A modern rendering of Smith's worry about the narrowing effe…
The structural inversion the AI transition produces — when building becomes easy, scarcity migrates from execution to the capacity to decide what deserves to be built.
The political and emotional reaction against transformative technology on behalf of the workers and ways of life it displaces — historically vilified, increasingly reconsidered, and directly relevant to the AI transition.
The specific behavioral configuration — compulsive AI-augmented engagement experienced as exhilaration from within and pathology from without — produced by a reinforcing loop without a balancing counterpart.
The question "what is a human being for?" — which Clarke predicted intelligent machines would force humanity to ask, and which arrived in 2022–2025 with more force and less philosophical preparation than he expected.
Andreessen's April 2020 pandemic-era essay diagnosing America's institutional failure as a failure of building — a call to action whose urgency the AI transition has rendered more complex than the original text anticipated.
Marc Andreessen's 2011 Wall Street Journal essay arguing that software would disrupt every industry — a thesis so thoroughly confirmed that its completion created a problem the thesis did not anticipate.
Edo Segal's 2026 book on the Claude Code moment and the AI transition — the empirical ground and narrative framework on which the Festinger volume builds its diagnostic reading.
Marc Andreessen's 2023 5,200-word essay declaring technology the primary driver of human flourishing and naming its critics the enemy — a document that crystallized the ideological fault lines of the AI moment.