The Mereological Fallacy — Orange Pill Wiki
CONCEPT

The Mereological Fallacy

The technical name for the confusion Midgley spent sixty years correcting — attributing to a component what can only be attributed to the whole system.

The mereological fallacy is the philosophical term for the error of attributing to a part what can only be attributed to the whole it belongs to. It is the error of saying that a carburetor drives to work. Carburetors do something essential; cars drive to work. The distinction matters because confusing the component with the system means making decisions about the system based on the component, and the decisions are wrong in ways invisible from inside the confusion. Midgley's entire methodology — philosophical plumbing, the whole-animal argument, the distinction between cleverness and integration — can be understood as the systematic exposure of mereological fallacies in the popular understanding of mind, biology, and now AI.

In the AI Story


The term was given its contemporary philosophical prominence by Maxwell Bennett and Peter Hacker in their 2003 book Philosophical Foundations of Neuroscience, where they argued that much of neuroscience commits a version of the fallacy by attributing to brains what can only be attributed to whole persons. The brain does not think; the person thinks. The brain does not decide; the person decides. The brain does not see; the person sees (using her brain, her eyes, her body, her history, and the world she is embedded in). Saying the brain does these things is like saying the carburetor drives to work.

The fallacy is pervasive in AI discourse, often in ways so familiar they are invisible. 'The model understands the prompt.' 'The system reasons about the problem.' 'The AI knows the answer.' Each attribution takes a cognitive capacity that belongs to whole beings — understanding, reasoning, knowing — and assigns it to a computational component. The assignment is not merely imprecise. It is categorically mistaken, because the capacities named only make sense when attributed to the kind of entity that can exercise them, and computational components are not that kind of entity.

Midgley's version of the argument went further than Bennett and Hacker's. They were concerned to show that neuroscientific language about brains was category-confused. She was concerned to show that the broader intellectual culture was making the same error in every domain — attributing to genes what belonged to organisms (the selfish gene), attributing to modules what belonged to minds (modular theories of cognition), attributing to computations what belonged to conscious beings (computational theories of mind). The error was not localized to one science. It was a pervasive feature of the reductionist temptation.

The practical consequence is that a culture systematically committing the mereological fallacy will build institutions that reward the components and erode the wholes. Performance metrics measure components. Benchmarks measure components. Productivity tools measure components. A culture that has confused component for whole will build measurement systems that track what AI does well and fail to measure what whole persons do that AI cannot do — because those capacities belong to the integrated system, not to any component, and the measurement apparatus was built for components.

Origin

The term 'mereological fallacy' was introduced by Maxwell Bennett and Peter Hacker in Philosophical Foundations of Neuroscience (2003), drawing on earlier work by Anthony Kenny and on the broader Wittgensteinian tradition. Midgley's version of the argument predates the formal naming and was developed independently across her work from Beast and Man (1978) forward.

Key Ideas

Parts don't do what wholes do. Cars drive; carburetors don't. Persons think; brains don't (though brains are essential to thinking).

The error is categorical. Attributing whole-properties to parts is not imprecise language — it is category confusion.

The error is pervasive. From 'selfish genes' to 'thinking computers,' the same structural mistake recurs across domains.

Measurement follows attribution. When a culture confuses component for whole, it builds measurement systems for components and loses the capacity to see what whole beings do.

Further reading

  1. Bennett, Maxwell and Peter Hacker. Philosophical Foundations of Neuroscience (Blackwell, 2003).
  2. Midgley, Mary. The Ethical Primate (1994).
  3. Kenny, Anthony. The Legacy of Wittgenstein (1984).
  4. Searle, John. The Rediscovery of the Mind (1992).
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.