This page lists every Orange Pill Wiki entry hyperlinked from Gordon Moore — On AI. 14 entries total. Each is a deep dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored orange link to other Orange Pill Wiki entries, while orange-underlined words marked with the Wikipedia logo link to Wikipedia.
The progressive shortening of the interval between a technology's introduction and its saturation — from seventy-five years for the telephone to two months for ChatGPT — and the corresponding collapse of the adaptive window.
The empirical power-law relationships — Kaplan (2020), Chinchilla (2022), and subsequent refinements — between model size, training data volume, and computational budget that now function as the AI industry's version of Moore's Law: trend l…
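The scaling laws this entry refers to can be sketched numerically. A minimal illustration of the Chinchilla-style parametric loss, L(N, D) = E + A/N^α + B/D^β, where N is parameter count and D is training tokens; the constants below are the published fits from Hoffmann et al. (2022), used here for illustration only:

```python
# Illustrative Chinchilla-style scaling law (Hoffmann et al., 2022):
#   L(N, D) = E + A / N**alpha + B / D**beta
# N = model parameters, D = training tokens.
# Constants are the published fits; treat them as illustrative, not exact.
E, A, B = 1.69, 406.4, 410.7
ALPHA, BETA = 0.34, 0.28

def predicted_loss(n_params: float, n_tokens: float) -> float:
    """Predicted pretraining loss for a model of n_params trained on n_tokens."""
    return E + A / n_params**ALPHA + B / n_tokens**BETA

# Chinchilla itself: ~70B parameters, ~1.4T tokens.
loss = predicted_loss(70e9, 1.4e12)       # roughly 1.9 under these fits
compute = 6 * 70e9 * 1.4e12               # rule-of-thumb training FLOPs, C ≈ 6·N·D
```

The point of the law as a planning tool: for a fixed compute budget C, the fitted exponents imply parameters and tokens should be scaled roughly in proportion, which is the correction Chinchilla made to the earlier Kaplan (2020) fits.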
The Orange Pill claim — that AI tools lower the floor of who can build — submitted to Sen's framework, which asks the harder question: does formal access convert into substantive capability expansion?
Segal's term for the gap between what a person can conceive and what they can produce — which AI collapsed to approximately the length of a conversation, and which Gopnik's framework reveals to be an exploitation metric that leaves the exp…
The unseen foundation beneath every AI interaction — fabs, power plants, data centers, supply chains — whose concentration and opacity create a tenant-landlord relationship between users and providers that the democratization narrative syst…
Gordon Moore's 1965 observation — extrapolated from six data points — that the number of transistors on an integrated circuit would double approximately every two years, acquiring the force of a self-fulfilling prophecy that organized a thr…
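The doubling rule is simple enough to state as arithmetic. A back-of-envelope sketch, with illustrative starting values (the Intel 4004's ~2,300 transistors in 1971 is a commonly cited baseline):

```python
# Back-of-envelope Moore's Law projection: transistor count doubles
# roughly every two years. Starting values are illustrative.
def transistors(start_count: float, start_year: int, year: int,
                doubling_years: float = 2.0) -> float:
    """Projected transistor count at `year`, given a baseline and doubling period."""
    return start_count * 2 ** ((year - start_year) / doubling_years)

# From ~2,300 transistors in 1971, fifty years gives 25 doublings:
projected = transistors(2300, 1971, 2021)  # ~7.7e10, the right order of magnitude
```

Fifty years of biennial doubling multiplies the count by 2^25 ≈ 33 million, which is why six data points extrapolated across decades reshaped an entire industry's planning.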
The accumulated potential energy of unsatisfied human needs that builds between each compression of the imagination-to-artifact ratio — and the physical model that explains why adoption curves accelerate rather than merely improve across su…
The device that increases the magnitude of whatever passes through it without evaluating the content — Wiener's framework for understanding AI as a tool that carries human signal, or human noise, with equal power and no judgment.
The electricity consumption of AI training and inference — scaling with compute and approaching the capacity of existing power generation infrastructure — as the thermodynamic constraint that semiconductor industry parallels predict will fo…
The structural principle — drawn from microprocessor history — that a productivity multiplier of twenty is not an improvement but a phase transition: a qualitative change the organizational structures of the previous regime cannot accommoda…
The governance regime change in which the accumulated textual, visual, and computational output of millions of individuals was appropriated for AI training under terms their original contribution did not contemplate — the paradigmatic case …