The reductionist position takes a specific form in artificial intelligence research: intelligence is computation. If a system computes the right functions — processes information, detects patterns, generates contextually appropriate outputs — it is intelligent, regardless of substrate or history. This functionalism is the implicit metaphysics of most AI research, and it is the direct analogue of the physicalist claim that biology is just chemistry.
Mayr's response to the biological version of the claim was empirical and decisive. Two populations with identical genomes, exposed to different selection pressures in different environments, evolve in different directions. The genome underdetermines the organism. The physical substrate is necessary but not sufficient. What determines the specific outcome is the history — the particular sequence of environmental challenges, mutations, and ecological interactions this population alone has experienced.
The argument applies to AI systems with a force the functionalist position tends to obscure. Two transformers with identical architectures and identical initial parameters, trained on different datasets, become different systems. The architecture underdetermines behavior just as the genome underdetermines phenotype. What determines the specific capabilities and limitations of a given AI system is its training history — the corpus it was trained on, the reward model it was optimized against, the sequence of fine-tuning steps it underwent.
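The point can be made concrete with a deliberately minimal sketch (mine, not the author's): a one-parameter model, identical in "architecture" and initialization in both runs, trained by gradient descent on the same inputs labeled by two different target functions. The two training histories produce two different systems, though nothing in the architectural description distinguishes them.

```python
import numpy as np

def train(w, xs, ys, lr=0.1, steps=200):
    """Gradient descent on a one-parameter linear model y = w * x."""
    for _ in range(steps):
        grad = np.mean(2 * (w * xs - ys) * xs)  # d/dw of mean squared error
        w -= lr * grad
    return w

rng = np.random.default_rng(0)
xs = rng.uniform(-1, 1, 100)

w0 = 0.5  # identical initial parameters for both runs

# Two different "training histories": the same inputs, labeled by
# two different target functions.
w_a = train(w0, xs, 2.0 * xs)   # history A: learn y = 2x
w_b = train(w0, xs, -3.0 * xs)  # history B: learn y = -3x

print(w_a, w_b)  # converges near 2.0 and -3.0: same start, different systems
```

Everything that distinguishes `w_a` from `w_b` lies in the data each was exposed to, not in any structural description of the model — a toy rendering of the claim that history, not architecture, fixes the specific outcome.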
This is not a minor technical point. It is a fundamental fact systematically obscured by the discourse's tendency to treat AI as a monolithic category. The autonomy of intelligence — parallel to the autonomy of biology — is the claim that intelligence uses computation, depends on computation, but does not reduce to computation, because intelligent systems have histories, and those histories determine their specific capabilities in ways that architectural description alone cannot explain.
Mayr's anti-reductionism matured during the 1970s and 1980s, as molecular biology's ascendancy produced triumphalist claims that the organism would soon be reducible to its DNA. Mayr's response — consolidated in The Growth of Biological Thought (1982) and sharpened in What Makes Biology Unique? (2004) — was to document, with the patience of a working taxonomist, the specific ways in which biological explanation refused to collapse into physical explanation.
Irreducibility, not separation. Biology uses physics, depends on physics, and contradicts no physical law. It does not reduce to physics, because its entities have properties — variation, selection, adaptation, contingency — that physical entities do not share.
Substrate matters. A brain is not a computer that happens to be made of neurons. It is an organ that evolved in a specific lineage, embedded in a specific body, situated in a specific ecology. The computations it performs are shaped by this history.
Training history is ultimate cause. For AI systems, the training data, reward model, and fine-tuning sequence function as the evolutionary history that shapes capability — necessary information that architectural description omits.
No monolithic AI. Treating artificial intelligence as a single phenomenon with uniform properties commits the same error pre-Darwinian biologists made when they treated species as fixed essences rather than populations of unique individuals.
Historical entities require historical explanation. AI systems are genuinely new — neither purely physical like crystals nor purely biological like organisms — and require methods appropriate to their specific ontological status.
Functionalists in philosophy of mind — from Hilary Putnam onward — have argued that mental states are defined by their functional role rather than their physical substrate, and that therefore a sufficiently elaborate computational system could, in principle, instantiate mentality. Mayr's framework does not refute functionalism directly; it complicates it by insisting that even if function is what matters, the function itself is shaped by history in ways that pure computation does not capture.