This page lists every Orange Pill Wiki entry hyperlinked from Jean-Baptiste Say — On AI. 30 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
Rogers's five-part typology dividing any social system by timing of adoption: innovators, early adopters, early majority, late majority, and laggards — each a structural position, not a personality type.
The progressive shortening of the interval between a technology's introduction and its saturation — from seventy-five years for the telephone to two months for ChatGPT — and the corresponding collapse of the adaptive window.
The Gramscian-Hanian condition in which the subject exploits herself and calls it freedom — the overseer's function having been transferred from the factory floor to the interior of the self through decades of hegemonic cultural work.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
Segal's term for the gap between what a person can conceive and what they can produce — which AI collapsed to approximately the length of a conversation, and which Gopnik's framework reveals to be an exploitation metric that leaves the exp…
The systemic counterpart to Segal's individual beaver metaphor — the structural architectures of taxation, labor bargaining, portable benefits, and international coordination that operate at the level of the economy, not the level of the in…
The widening gap between the speed at which an institution can adapt and the speed at which its environment is changing — the mechanism through which individual future shock compounds into systemic disorientation.
The interface paradigm — inaugurated at scale by large language models in 2022–2025 — in which the user addresses the machine in unmodified human language and the machine responds in kind; the paradigm that, read through Gibson's framework,…
The adoption mechanism of category-three products: pre-cognitive recognition of a tool as the thing the user has been waiting for, bypassing the evaluative process that governs marketing-speed adoption.
The reciprocal mechanism linking production, income, and demand — not the slogan that supply creates its own demand, but the structural claim that the act of production generates the income that constitutes demand for other products.
The conditional optimism that distinguishes Say from his naive defenders: the circuit tends to re-establish itself, but the long run can be very long, and the people in the middle of it suffer.
The taxonomy Say's simplifiers erased: demand that precedes supply, demand created by novel supply, and latent demand awaiting adequate supply — the third being the category that explains the AI adoption curve.
Sixty-six years of unrealized production by programmers, designers, and builders who could conceive what they could not reach — the stored pressure that discharged through the natural language interface.
The reframing that transforms adoption data from a product report card into a diagnostic instrument — measuring not the quality of the rupture but the magnitude of the stress that accumulated along the fault line.
The canonical example of allogenic ecosystem engineering — a structure that modulates rather than blocks the flow of its environment, creating the habitat pool in which diverse community life becomes possible.
The mechanism through which AI creates demand not only through the income channel but through the revelation of previously invisible possibility — expanded capability generating demand for the skills required to direct the expanded capabili…
The composite figure at the center of the AI democratization debate — a builder with intelligence, tools, and ambition whose capability has expanded dramatically while the institutional infrastructure that would convert capability into capi…
The adoption pattern of a category-three product: flat for a long time, then nearly vertical — not gradually vertical, but vertical the way a wall is vertical, with a force proportional to the stored pressure behind it.
The uncomfortable fact that AI's benefits and costs do not distribute evenly across the population of affected workers — a Smithian question about institutions, not a technical question about tools.
Say's most distinctive contribution: the entrepreneur as perceiver of disjunctions between what exists and what could exist — a function that AI amplifies, purifies, and makes more economically central than ever.
The economic regime that emerges when the cost of execution approaches zero and the premium on deciding what to execute rises correspondingly — the Smithian reading of the Orange Pill moment.
The claim — central to this book's reading of the Orange Pill — that the collapse of techne's cost reveals a deeper barrier that was always the harder problem: deciding what deserves to be built.
The specific behavioral configuration — compulsive AI-augmented engagement experienced as exhilaration from within and pathology from without — produced by a reinforcing loop without a balancing counterpart.
Rogers's foundational pattern — cumulative adoption plotted against time produces a logistic curve whose inflection point marks the passage from novelty to normality.
The circuit that follows the initial discharge: AI's supply of cheap execution creating demand for expensive judgment, through the capability circuit rather than the income circuit alone.
The photographic negative of the economy that exists: products never built, problems never solved, creative potential never realized because the tools were not yet adequate — invisible to the ledger, real in its cost.
The economic analog of potential energy: productive capacity held in latent form by a constraint, invisible to measurement, releasing instantaneously when the constraint is removed.
The tax every previous computer interface levied on every user — the cognitive overhead of converting human intention into machine-acceptable form. The tax natural language interfaces have abolished.