This page lists every Orange Pill Wiki entry hyperlinked from Frederick Winslow Taylor — On AI. 33 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored in orange link to other Orange Pill Wiki entries, while orange-underlined words with the Wikipedia mark link to Wikipedia.
The Berkeley researchers' prescription for the AI-augmented workplace — structured pauses, sequenced workflows, protected human-only time, behavioral training alongside technical training — the operational counterpart to Maslach's fix-the-…
The contemporary descendant of Taylor's system — the use of software to monitor, measure, evaluate, and direct human work at computational scale, applying the worker-as-system ontology to knowledge labor the original framework could not rea…
The Gramscian-Hanian condition in which the subject exploits herself and calls it freedom — the overseer's function having been transferred from the factory floor to the interior of the self through decades of hegemonic cultural work.
The systematic breaking of complex work into elementary operations assigned to specialized workers — Taylor's core operational move, now revealed as a workaround for a limitation that AI has removed, and an obstacle to the integrated work t…
Smith's foundational principle that specialization produces the greatest improvement in the productive powers of labour — the pin factory's logic, now being inverted by AI tools that dissolve the boundaries between specialized operations.
Mihaly Csikszentmihalyi's name for the condition of optimal human engagement — and, in Wiener's framework, the subjective signature of a well-regulated negative feedback system.
The ontological inversion the AI age produces in the worker's role — from a part defined by function within a system to a whole that directs the system according to an irreplaceable vision of what the work is for.
Segal's term for the gap between what a person can conceive and what they can produce — which AI collapsed to approximately the length of a conversation, and which Gopnik's framework reveals to be an exploitation metric that leaves the exp…
Shannon Vallor's concept for the atrophy of moral capacities through technological mediation — what happens when the conditions for cultivating specific virtues are eroded by tools that produce the practice's outputs without requiring the v…
The specific behavioral signature of AI-augmented work: compulsive engagement that the organism experiences as voluntary choice, with an output the culture cannot classify as problematic because it is productive.
Edmondson's foundational construct — the shared belief that a team is safe for interpersonal risk-taking — and the single strongest predictor of whether AI adoption produces learning or concealment.
Taylor's systematic framework for organizing work through observation, measurement, task decomposition, and the separation of planning from execution — the operating system of twentieth-century production, and the unexamined inheritance tha…
Taylor's term for workers deliberately restricting output — the practice he condemned as theft, and that the framework reveals, in the AI age, to be a functional adaptation whose elimination produces the unlimited-demand pathology modern knowledge …
Michael Polanyi's term for the knowledge that lives in the hands and nervous system rather than in explicit propositions — acquired through practice, failure, and embodied pattern recognition, and dissolved by AI workflows that produce ou…
The mechanism — documented in the Berkeley study of AI workplace adoption — by which AI-accelerated work colonizes previously protected temporal spaces, converting every pause into an opportunity for productive engagement.
The device that increases the magnitude of whatever passes through it without evaluating the content — Wiener's framework for understanding AI as a tool that carries human signal, or human noise, with equal power and no judgment.
Segal's term for the cumulative signal degradation that occurs as a builder's intention passes through the multiple stages of the traditional software development process — given mathematical grounding by Claude Shannon's theory of cascaded…
The uncomfortable fact that AI's benefits and costs do not distribute evenly across the population of affected workers — a Smithian question about institutions, not a technical question about tools.
The Orange Pill's image for the set of professional and cultural assumptions so familiar they have become invisible — the water one breathes, the glass that shapes what one sees. A modern rendering of Smith's worry about the narrowing effe…
Taylor's operational instrument for transferring knowledge from worker to management and back — the paper artifact that codified the one best way, and whose contemporary descendants (sprint tickets, product specs, acceptance criteria) persi…
The economic regime that emerges when the cost of execution approaches zero and the premium on deciding what to execute rises correspondingly — the Smithian reading of the Orange Pill moment.
The structural transformation of management from Taylor's knowledge-holder-enforcing-compliance to the AI-age gardener-cultivating-judgment — a shift that requires different capabilities, different authority, and the dissolution of the hier…
The pathology specific to AI-augmented work — unlimited production past the point of diminishing returns, produced by the elimination of the natural and informal restrictions on output that protected worker capacity in previous eras.
Taylor's founding conviction that for every task there exists a single optimal method, discoverable through scientific observation — the organizing principle that built the twentieth century and that AI has inverted from method imposed to c…
Smith's founding illustration of the division of labour — ten workers performing eighteen distinct operations to make forty-eight thousand pins a day, each alone capable of making twenty.
Taylor's foundational ontological claim that the human worker is a collection of inputs and outputs subject to external optimization — the premise that justified extracting autonomy and judgment, and that AI's amplifier architecture has mad…
The systematic error of applying Taylor's observational framework to knowledge work whose value is invisible to his method — the specific pathology of measuring AI-era productivity through metrics designed for motions rather than judgments.
Ye and Ranganathan's 2026 Harvard Business Review ethnography of AI in an organization — the empirical documentation of task seepage and work intensification that prospect theory predicts.
Edo Segal's 2026 book on the Claude Code moment — the empirical and narrative ground on which this Whitehead volume builds its philosophical reading.
The 1899 Bethlehem Steel demonstration in which Taylor raised Henry Noll's output from 12½ to 47½ tons per day through systematic redesign of his motions — the founding empirical case of scientific management and the clearest illustration o…
The February 2026 training session in which Edo Segal's twenty engineers in Trivandrum crossed the orange pill threshold and emerged as AI-augmented builders producing twenty-fold productivity gains — the founding empirical moment of The Orange…