The You On AI Encyclopedia
CONCEPT

The Substitution Fallacy

The assumption that because an AI system can produce the same output as a human expert, the machine has replicated the expert's knowledge — a category error that confuses the product of expertise with the developmental process that produced it.
The substitution fallacy is Flyvbjerg's name for the dominant cognitive error in contemporary AI discourse: the assumption that functional indistinguishability between machine output and human output entails substantive equivalence between machine capability and human capability. The AI system that writes code, drafts legal briefs, or generates medical diagnoses is performing operations previously performed by human experts. The output may be functionally identical to what the expert produced. But the process by which it was produced is fundamentally different — pattern-matching across training data versus embodied knowledge built through years of practice — and the difference matters because the developmental process, not the output it produces, is the substrate on which phronesis is built.

The fallacy operates at two levels. At the individual level, it treats the production of expert-like output as evidence that the machine has replicated expert knowledge — ignoring the fact that expert knowledge includes capacities the machine does not possess: the ability to recognize when rules fail, the judgment to weigh incommensurable values, the stake in outcomes that grounds responsibility. At the institutional level, the fallacy treats the automation of expert output as equivalent to the automation of the function expertise serves — ignoring the fact that the function depends on the developmental process, not just the output.

The "ten minutes" case Segal describes — the engineer whose routine plumbing contained the rare moments of formative struggle — illustrates the operational consequence. When AI automates the plumbing, it removes the tedium and the formative moments together, because from the system's perspective the entire four hours is plumbing, defined functionally rather than developmentally. The substitution appears successful because the output is maintained. The phronetic erosion it produces is invisible to any measurement that evaluates substitution by output rather than by the developmental trajectory of the person producing it.

The fallacy has an institutional analog in the commoditization of code that produced the 2026 Software Death Cross. Organizations whose value propositions rested on the scarcity of technical capability discovered that the capability had been commoditized and that their valuations had to be repriced. But the substitution was real only at the techne layer. The phronetic layer — institutional knowledge, customer relationships, workflow patterns, regulatory expertise, cultural understanding — remained. The companies that mistook the commoditization of techne for the substitution of their entire value proposition learned the lesson painfully; those that recognized the distinction survived.

The corrective is methodological vigilance. Every claim that AI has replaced a category of human work must be examined for the substitution fallacy: has the output been replicated, or has the capacity been replaced? These are different questions, and conflating them produces the systematic overestimation of AI substitution that now pervades both corporate strategy and public discourse. The honest answer in most cases is that the output has been replicated while the capacity has been partially eroded and partially preserved in ways that vary by context and that only phronetic analysis can detect.

Origin

The concept is implicit throughout Flyvbjerg's phronesis work and is made explicit in his 2025 AI writings. Related arguments appear in Hubert Dreyfus's long engagement with AI and in Shannon Vallor's work on moral deskilling.

Key Ideas

Output versus process. Functional equivalence of output does not entail substantive equivalence of capability — the developmental process is a separate dimension.

Invisible erosion. When substitution replaces the developmental process, the phronetic erosion is invisible to measurement that evaluates only output.

Institutional consequence. Organizations that mistake output commoditization for capability substitution misprice their own value and their competitors'.

Generational compounding. Each generation of practitioners trained through substituted processes loses the capacity the process developed, producing cumulative erosion invisible within any single generation.

Corrective vigilance. Every substitution claim must be examined separately for output replacement versus capacity replacement, because the two diverge systematically.

Further Reading

  1. Dreyfus, Hubert. What Computers Still Can't Do. MIT Press, 1992.
  2. Vallor, Shannon. Technology and the Virtues. Oxford University Press, 2016.
  3. Flyvbjerg, Bent. "AI as Artificial Ignorance." Project Leadership and Society, 2025.
  4. Ericsson, K. Anders. Peak: Secrets from the New Science of Expertise. Houghton Mifflin Harcourt, 2016.