This page lists every Orange Pill Wiki entry hyperlinked from Albert-László Barabási — On AI. 28 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
The Orange Pill's thesis that AI does not eliminate difficulty but relocates it to a higher cognitive floor — the engineer who no longer struggles with syntax struggles instead with architecture.
The felt quality of subjective experience — being aware, there being something it is like to be — and the single deepest unanswered question in both philosophy of mind and AI.
The Orange Pill claim — that AI tools lower the floor of who can build — submitted to Sen's framework, which asks the harder question: does formal access convert into substantive capability expansion?
Bianconi and Barabási's 2001 extension of preferential attachment in which each node has an intrinsic fitness — a capacity to attract connections that is independent of when it entered the network.
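The fitness mechanism can be sketched in a few lines. This is a minimal, stdlib-only simulation of the Bianconi–Barabási idea — attachment probability proportional to fitness × degree — not the paper's full analysis; the function name and parameters are illustrative choices of mine.

```python
import random

def fitness_growth(n, m=2, seed=0):
    """Sketch of fitness-driven growth: each new node wires m edges to
    existing nodes with probability ~ fitness_i * degree_i, so a
    late-arriving, high-fitness node can out-attract older nodes."""
    rng = random.Random(seed)
    fitness = [rng.random() for _ in range(n)]   # intrinsic attractiveness
    degree = [0] * n
    degree[0] = degree[1] = 1                    # seed edge 0-1
    for new in range(2, n):
        weights = [fitness[i] * degree[i] for i in range(new)]
        # choices() may repeat; the set() keeps edges simple (<= m of them)
        for t in set(rng.choices(range(new), weights=weights, k=m)):
            degree[new] += 1
            degree[t] += 1
    return fitness, degree

fit, deg = fitness_growth(2000)
hub = max(range(2000), key=deg.__getitem__)
print(hub, round(fit[hub], 2))  # the biggest hub tends to carry high fitness
```

Running it shows the model's signature: unlike pure preferential attachment, the best-connected node is not necessarily among the earliest arrivals.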
Mihály Csíkszentmihályi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
Nodes with disproportionately many connections, whose presence defines the topology of any scale-free network. In creative networks, hubs are the Dylans, the Google searches, the frontier AI platforms.
The interdisciplinary field — formalized by Barabási, Watts, Strogatz, Newman and others in the late 1990s — that studies the universal structural patterns shared by biological, social, technological, and economic networks.
The historical pattern by which the same innovation emerges from multiple independent explorers in narrow time windows — the empirical signature of topology, demonstrating that possibility spaces channel exploration toward accessible innovations.
Sudden, structural reorganizations of a system when a control parameter crosses a critical threshold — the mathematical shape of the Software Death Cross and of every other moment when the AI economy's behavior changed qualitatively rather than quantitatively.
The heavy-tailed distribution P(x) ~ x^(-α) that characterizes scale-free networks and produces the long tail of creative economies — a mathematical contrast to the bell-curve distribution of human height.
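The contrast between the two distributions can be made concrete by sampling. Below is a minimal sketch, assuming the standard inverse-transform sampler for a continuous power law with exponent α > 1 and lower cutoff x_min; the function name is mine.

```python
import random

def sample_power_law(alpha, xmin=1.0, rng=random):
    """Inverse-transform draw from P(x) ~ x^(-alpha) for x >= xmin."""
    u = rng.random()                              # u in [0, 1)
    return xmin * (1.0 - u) ** (-1.0 / (alpha - 1.0))

rng = random.Random(1)
heavy = sorted(sample_power_law(2.5, rng=rng) for _ in range(100_000))
bell = sorted(rng.gauss(1.0, 0.3) for _ in range(100_000))

# Heavy tail vs bell curve: the power-law maximum dwarfs its median,
# while the Gaussian maximum stays within a few sigma of its center.
heavy_ratio = heavy[-1] / heavy[len(heavy) // 2]
bell_ratio = bell[-1] / bell[len(bell) // 2]
print(round(heavy_ratio), round(bell_ratio, 1))
```

The printed ratios dramatize the long tail: for the power-law sample the largest value is hundreds of times the typical one; for the bell curve, the largest value is only a small multiple of the median — the mathematical reason a few hubs can dominate a creative economy while no human is a hundred times taller than another.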
The mechanism — sometimes called the rich-get-richer process — by which new nodes in a growing network connect preferentially to nodes that already have many connections. The engine behind every observed scale-free distribution.
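The rich-get-richer mechanism is simple enough to simulate directly. A minimal sketch, using the standard trick of drawing uniformly from a list that holds one entry per edge endpoint (which is automatically a degree-proportional draw); the function name and parameters are illustrative.

```python
import random

def preferential_attachment(n, m=2, seed=0):
    """Grow a network one node at a time; each arrival wires up to m
    edges to existing nodes picked in proportion to current degree."""
    rng = random.Random(seed)
    degree = [0] * n
    degree[0] = degree[1] = 1      # seed edge 0-1
    # One entry per edge endpoint: a uniform pick from this list is a
    # degree-proportional pick over nodes.
    endpoints = [0, 1]
    for new in range(2, n):
        for t in {rng.choice(endpoints) for _ in range(m)}:
            degree[new] += 1
            degree[t] += 1
            endpoints.extend([new, t])
    return degree

deg = sorted(preferential_attachment(5000), reverse=True)
print(deg[0], deg[len(deg) // 2])  # hub degree vs median degree
```

Even at 5,000 nodes, the early arrivals that happened to accumulate connections keep accumulating them: the top hub's degree is orders of magnitude above the median node's — the scale-free distribution emerging from the mechanism alone.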
The pathology — documented empirically in the Berkeley study and diagnosed philosophically by Camus — of a consciousness that cannot stop improving because the tool makes improvement effortless.
The discipline of formulating a question such that a capable answering system produces a useful answer. Asimov's Multivac stories prefigured it; prompt engineering operationalizes it.
Segal's metaphor — given thermodynamic grounding by Wiener's framework — for the 13.8-billion-year trajectory of anti-entropic pattern-creation through increasingly sophisticated channels, of which AI is the latest.
The asymmetric property of scale-free networks — exceptionally robust against random failures, catastrophically fragile to targeted attacks on hubs. The structural reason AI platform concentration matters.
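The asymmetry is easy to demonstrate numerically: grow a scale-free network, then compare the surviving giant component after removing 5% of nodes at random versus removing the top 5% of hubs. A self-contained sketch (function names mine, graph built with the same preferential-attachment rule described above):

```python
import random
from collections import deque

def grow(n, m=2, seed=0):
    """Preferential-attachment graph as an adjacency dict; hubs emerge."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    adj[0].add(1); adj[1].add(0)
    endpoints = [0, 1]               # one entry per edge endpoint
    for new in range(2, n):
        for t in {rng.choice(endpoints) for _ in range(m)}:
            adj[new].add(t); adj[t].add(new)
            endpoints.extend([new, t])
    return adj

def giant_component(adj, removed):
    """Size of the largest connected component once 'removed' is gone."""
    seen = set(removed)
    best = 0
    for start in adj:
        if start in seen:
            continue
        seen.add(start)
        queue, size = deque([start]), 0
        while queue:
            u = queue.popleft(); size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, size)
    return best

adj = grow(3000)
rng = random.Random(1)
random_loss = giant_component(adj, rng.sample(range(3000), 150))
hubs = sorted(adj, key=lambda u: len(adj[u]), reverse=True)[:150]
hub_loss = giant_component(adj, hubs)
print(random_loss, hub_loss)  # random failure barely dents it; hub attack cuts deeper
```

The same 150-node loss produces very different outcomes depending on *which* nodes fail — the structural reason a handful of dominant AI platforms is a systemic fragility, not just a market-share statistic.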
Networks whose degree distribution follows a power law — a few hubs with enormous connectivity, many nodes with almost none. The structural signature Barabási found everywhere from the web to cells to citations.
Networks in which any two nodes are connected through a short chain of intermediaries — the six degrees of separation structure that combines local clustering with long-range bridging ties.
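The shortcut effect can be checked with a toy model in the Watts–Strogatz spirit: a ring lattice (pure local clustering) versus the same ring with a few random long-range bridges. A minimal sketch; function names and parameters are mine.

```python
import random
from collections import deque

def ring_with_shortcuts(n, k=2, shortcuts=0, seed=0):
    """Ring lattice (each node linked to k neighbours on each side)
    plus a handful of random long-range 'bridge' edges."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for d in range(1, k + 1):
            j = (i + d) % n
            adj[i].add(j); adj[j].add(i)
    for _ in range(shortcuts):
        a, b = rng.sample(range(n), 2)
        adj[a].add(b); adj[b].add(a)
    return adj

def avg_path_from(adj, src=0):
    """Mean BFS distance from src to every other reachable node."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return sum(dist.values()) / (len(dist) - 1)

ring = avg_path_from(ring_with_shortcuts(1000))
small = avg_path_from(ring_with_shortcuts(1000, shortcuts=50))
print(round(ring, 1), round(small, 1))  # a few shortcuts collapse path lengths
```

On the pure ring, typical distances scale with the network's size; adding just 50 random bridges to 1,000 nodes collapses them toward the short chains of the six-degrees regime, while the local clustering stays intact.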
Mark Granovetter's 1973 thesis that novel information flows through weak, bridging connections rather than through strong, redundant ones — a foundational result for understanding why AI tools matter as ties of extraordinary reach.
The device that increases the magnitude of whatever passes through it without evaluating the content — Wiener's framework for understanding AI as a tool that carries human signal, or human noise, with equal power and no judgment.
The canonical example of allogenic ecosystem engineering — a structure that modulates rather than blocks the flow of its environment, creating the habitat pool in which diverse community life becomes possible.
The Orange Pill's image for the set of professional and cultural assumptions so familiar they have become invisible — the water one breathes, the glass that shapes what one sees. A modern rendering of Smith's worry about the narrowing effects of the division of labor.
The economic regime that emerges when the cost of execution approaches zero and the premium on deciding what to execute rises correspondingly — the Smithian reading of the Orange Pill moment.
The threshold crossing after which the AI-augmented worker cannot return to the previous regime — The Orange Pill's central metaphor for the qualitative, irreversible shift in what a single person can build.
The tax every previous computer interface levied on every user — the cognitive overhead of converting human intention into machine-acceptable form. The tax natural language interfaces have abolished.
Maslow's reading of The Orange Pill's central question: worthiness is not a moral endowment but the developmental achievement of a person whose signal is shaped by B-values.
The 2025–2026 phase transition in which AI-assisted software production costs crossed below the costs of maintaining legacy code, triggering a trillion-dollar repricing of the SaaS industry in months.
The February 2026 week-long training session in which Edo Segal flew to Trivandrum, India, to work alongside twenty of his engineers as they adopted Claude Code — producing the twenty-fold productivity multiplier documented in The Orange Pill.