The exchange became one of the landmark intellectual confrontations of the late twentieth century. Dawkins responded with unusual sharpness, accusing Midgley of misunderstanding his technical point. Midgley returned fire, noting that if the technical point was as careful as Dawkins claimed, the rhetorical packaging was misleading, and that Dawkins had taken no apparent steps to prevent the misreading he now claimed to disavow. The exchange continued across multiple publications for years.
The substantive issue is what makes the case the paradigmatic example of Midgley's method. A scientific finding — that selection operates at the genetic level, with genes acting as units of replication — is genuinely interesting. Calling the genes 'selfish' adds a metaphor. The metaphor is useful pedagogically: it helps readers grasp that genes behave as if they were pursuing their own propagation. But the 'as if' is essential. Genes do not have motives. They do not pursue anything. They replicate or they don't. The metaphor captures a structural feature of the mechanism, not a psychological feature of the genes.
When the metaphor escapes the 'as if' — when readers conclude that living things are really selfish, that altruism is really an illusion, that moral behaviour is really genetic strategy in disguise — the metaphor has done work that no scientific finding supports. The mistake is what Midgley identified as the central vice of modern popular science: the conversion of a useful analytical tool into a total worldview, and the corresponding erasure of everything the tool cannot describe.
The template of this critique applies directly to the AI discourse. 'Neural network' is a metaphor. The computational structures so named bear a superficial resemblance to biological neural networks — a resemblance useful for certain explanatory purposes but catastrophic when taken literally. When the metaphor is mistaken for a description, people conclude that because the machine has 'neural networks,' the machine thinks the way brains think. The pattern is identical to the selfish gene case: technical shorthand escapes its scope, becomes a cultural myth, and organizes public understanding around a confusion the original science never required.
Midgley, Mary. 'Gene-Juggling,' Philosophy 54 (1979). Dawkins, Richard. 'In Defence of Selfish Genes,' Philosophy 56 (1981). Midgley, Mary. 'Selfish Genes and Social Darwinism,' Philosophy 58 (1983). The exchange was later anthologized and has been the subject of extensive academic commentary.
Metaphors have metaphysical residues. A metaphor in a scientific text carries implications beyond its technical content, and responsible scientific writing attends to the residue.
The 'as if' is essential. 'Selfish gene' is useful as long as readers hear the 'as if.' Without the 'as if,' the metaphor does work no evidence supports.
Scientists are responsible for their metaphors. The claim that readers 'misunderstood' a metaphor is weakened when the author took no steps to prevent the misunderstanding.
The template generalizes. The same pattern — technical shorthand inflated into cultural myth — recurs with 'neural network,' 'machine learning,' 'AI understands,' and every other metaphor operating at the boundary of public comprehension.