This page lists every Orange Pill Wiki entry hyperlinked from Gary Becker — On AI. 22 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words with the Wikipedia mark link to Wikipedia.
The mechanism at the heart of Becker-Murphy rational addiction: current consumption of a good raises the marginal utility of future consumption of the same good, creating the self-reinforcing feedback loop that drives escalation.
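The mechanism has a compact formal statement. In a standard textbook rendering of the Becker–Murphy model (the notation here is a common simplification, not quoted from the entry), lifetime utility and the addiction stock are:

```latex
U = \sum_{t=0}^{\infty} \beta^{t}\, u(c_t, S_t),
\qquad
S_{t+1} = (1-\delta)\, S_t + c_t
```

Adjacent complementarity is the cross-partial condition \( \partial^{2} u / \partial c_t\, \partial S_t > 0 \): a larger stock \( S_t \), built by past consumption, raises the marginal utility of consuming today, which enlarges tomorrow's stock and closes the loop.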
The Orange Pill's thesis that AI does not eliminate difficulty but relocates it to a higher cognitive floor — the engineer who no longer struggles with syntax struggles instead with architecture.
An external structure — a time limit, a protected period, an institutional norm — that an agent adopts to bind her future self to choices her present self, operating under a high discount rate, would not make. The Becker remedy for producti…
The economic mechanism by which AI renders specific human capital — knowledge tied to particular firms, technologies, or contexts — worthless not because the knowledge is wrong but because it is no longer scarce.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
Becker's 1964 taxonomic distinction between skills portable across employers and contexts (general) and skills valuable only within a particular firm, industry, or technology (specific) — now the sharpest diagnostic instrument for who AI de…
Becker's 1981 reframing of the family as a production unit — a small factory that combines market goods, time, and human capital to produce the commodities people actually value — and the framework that reveals what AI does inside the home.
Becker's foundational reframing of education and training as capital investment — with rates of return, depreciation schedules, and opportunity costs — rather than consumption. The framework that made the AI transition economically legible.
The widening gap between the speed at which an institution can adapt and the speed at which its environment is changing — the mechanism through which individual future shock compounds into systemic disorientation.
The class of production inputs for which no combination of alternatives can compensate — the Leontief-structure constraint that explains why AI cannot produce what families need most, regardless of how much else it produces cheaply.
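The Leontief structure can be sketched with illustrative symbols (not taken from the entry itself): a household commodity \( Z \) produced from an AI-supplied input \( x \) and parental time \( t \) in fixed proportions,

```latex
Z = \min\!\left( \frac{x}{a},\; \frac{t}{b} \right)
```

Once \( t/b < x/a \), additional \( x \) is worthless at the margin, since \( \partial Z / \partial x = 0 \) there: no quantity of the cheap input compensates for the scarce one.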
Becker and Kevin Murphy's 1988 theory that addicts are rational — forward-looking agents whose current consumption reflects an internally consistent calculation that discounts future costs too heavily relative to present returns. The model …
The developmental foundation formed through sustained, responsive parental presence — the irreducibly human capital that no technology substitutes and whose production requires the one input AI cannot provide.
The canonical example of allogenic ecosystem engineering — a structure that modulates rather than blocks the flow of its environment, creating the habitat pool in which diverse community life becomes possible.
The twelve-year-old's 'Mom, what am I for?' read not as a request for information but as an opening of the intermediate area — a question that asks to be held, not answered, because holding is what develops the capacity to inhabit unresolv…
The figure at the intersection of Segal's democratization narrative and Cipolla's helpless quadrant — genuinely empowered by AI and simultaneously positioned at the downstream end of the value flow.
The rate at which future consequences are weighted against present satisfactions — the critical parameter in Becker-Murphy rational addiction, and the variable the AI transition is pushing upward for an entire generation of knowledge worker…
Becker's 1957 measure of the premium an employer is willing to pay to indulge a preference for one type of worker over another — a self-imposed tax on productivity whose approach to zero under AI is the most consequential structural shift o…
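In the standard presentation of Becker's 1957 model, an employer with discrimination coefficient \( d \) against a group acts as if that group's market wage \( w \) were inflated:

```latex
w_{\text{perceived}} = w\,(1 + d)
```

The employer hires from the disfavored group only when marginal product exceeds \( w(1+d) \), forgoing output worth up to \( d \cdot w \) per worker forgone — the self-imposed productivity tax the card describes. As \( d \to 0 \), the premium, and the wage gap it sustains, vanishes.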
The specific behavioral configuration — compulsive AI-augmented engagement experienced as exhilaration from within and pathology from without — produced by a reinforcing loop without a balancing counterpart.
The market return on capacities that require stakes — judgment under uncertainty, the creation of trust, the cultivation of care — which Becker's framework identifies as the durably scarce human capital in an AI-saturated economy.
Becker's 1965 formalization — in A Theory of the Allocation of Time — that every activity has a true cost equal to its market cost plus the opportunity cost of the time it consumes. The framework that reveals why AI-augmented workers canno…
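The full-cost identity the card paraphrases can be written out. In a simplified version of Becker's 1965 notation (illustrative, not quoted from the entry), the full price \( \pi_i \) of activity \( i \) is

```latex
\pi_i = p_i\, b_i + w\, t_i
```

where \( p_i b_i \) is spending on the market goods the activity consumes and \( w\, t_i \) values its time input \( t_i \) at the forgone wage \( w \). Raising \( w \), as AI augmentation does, raises the true cost of every time-intensive activity even when its market price falls.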