'Neural Network' as Metaphor — Orange Pill Wiki
CONCEPT

'Neural Network' as Metaphor

The single most important metaphor in AI — a piece of engineering shorthand whose inflation into literal description is the textbook case of Midgleyian pipe failure.

'Neural network' is a metaphor. The computational structures called neural networks bear a superficial resemblance to biological neural networks — weighted connections between simple processing units — in the same way that a child's drawing of a house bears a resemblance to a house. The resemblance is useful for certain explanatory purposes. It is catastrophic when taken literally. Midgley's framework identifies the inflation of 'neural network' from engineering shorthand to literal description as the paradigm case of the pattern she spent her career exposing: a useful analytical tool promoted to a total description, with the promotion doing metaphysical work that no technical finding authorizes. When people conclude that because the machine has 'neural networks,' the machine thinks the way brains think, the metaphor has escaped its scope and the pipes are flooding.

In the AI Story


The history of the term is instructive. Warren McCulloch and Walter Pitts introduced the mathematical model in their 1943 paper 'A Logical Calculus of the Ideas Immanent in Nervous Activity,' explicitly as a formal idealization useful for logic — not as a faithful model of biological neurons. The original paper was careful. Subsequent decades of computer science, under pressure to communicate to funders, investors, and the public, were not. By the 1980s, 'neural network' had become the standard term for a broad class of machine learning architectures whose relationship to biological neurons ranged from metaphorical to purely nominal.

Contemporary large language models use architectures — transformers, attention mechanisms, deep feedforward networks — whose structural relationship to biological neurons is now deeply attenuated. The 'neurons' in a transformer are weighted sums followed by nonlinearities. Biological neurons are living cells with complex internal dynamics, neurotransmitter cascades, metabolic requirements, developmental histories, and dense contextual embedding in bodies, environments, and evolutionary lineages. The overlap between the two is the overlap between a child's stick figure and a human being.

The metaphor persists not because it is accurate but because it is useful — useful pedagogically for explaining what computational neural networks do, useful rhetorically for claiming biological legitimacy for engineering projects, useful commercially for making computational systems sound more organic and mysterious than they actually are. The usefulness comes from the same source as the danger: the term carries implications beyond its technical scope, and the implications shape how people understand what these systems are.

Midgley's critique is not that the term should be abandoned. It is that the users of the term — researchers, communicators, journalists, policymakers — should be attentive to the metaphysical work the metaphor is doing when it escapes its technical context. The engineer who says 'neural network' in a technical paper is using shorthand. The journalist who says 'neural network' in an article about AI consciousness is importing, perhaps without noticing, the claim that computational systems are in the same category as biological brains. The import is the pipe failure. Fixing it does not require renaming anything. It requires the discipline of noticing where and how the metaphor is being stretched and asking whether the stretch is justified.

Origin

The concept of 'neural network' as a designed computational structure traces to McCulloch and Pitts (1943) and to the perceptron work of Frank Rosenblatt (1958). Its current cultural inflation dates to the deep learning revolution of the 2010s, when architectural advances produced systems capable of feats that made the biological comparison rhetorically irresistible. Midgley's critique applies not to the technical history but to the cultural promotion.

Key Ideas

Engineering shorthand, not biological description. 'Neural network' was introduced as a formal idealization, not as a faithful model of biological cognition.

The overlap is shallow. Computational 'neurons' and biological neurons share a name and a superficial structural analogy — nothing more.

The metaphor does metaphysical work. Calling the machine a 'neural network' imports the moral and ontological status of brains onto structures that do not share the features generating that status.

Diagnostic test. When someone argues from 'neural networks' to 'therefore the machine thinks like a brain,' the metaphor has escaped its scope — and the argument rests on the escape, not on evidence.

Further reading

  1. McCulloch, Warren and Walter Pitts. 'A Logical Calculus of the Ideas Immanent in Nervous Activity,' Bulletin of Mathematical Biophysics (1943).
  2. Midgley, Mary. The Myths We Live By (2003).
  3. Mitchell, Melanie. Artificial Intelligence: A Guide for Thinking Humans (2019).
  4. Dennett, Daniel. From Bacteria to Bach and Back (2017).
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.