Entropy is the physical quantity that measures disorder; the second law of thermodynamics says that in a closed system it can only increase: organized structures tend toward disorder, pattern toward dissolution, concentrated energy toward dispersal. The law, formulated in the nineteenth century by Clausius and Boltzmann, is among the most democratic in physics: it applies equally to stars, sandcastles, civilizations, and the contents of a teenager's bedroom. Wiener saw entropy not merely as a physical principle but as the fundamental antagonist of everything that matters. Life is anti-entropic. Intelligence is anti-entropic. Communication is anti-entropic. Every act of creating order, whether writing a sentence, building a bridge, or maintaining a body at thirty-seven degrees, is an act of local resistance against the universal tendency toward dissolution. The resistance is always temporary. The cost is always paid elsewhere in waste heat and dissipation. But while the resistance holds, something remarkable occurs: pattern emerges, structure persists, information accumulates.
The information-theoretic connection is essential to Wiener's framework. Claude Shannon's 1948 mathematics of communication measured information with the same functional form Boltzmann had used for thermodynamic entropy, and Wiener read the correspondence as an identity: information as the measure of order in a system, the negative of its entropy. The equation tied communication theory directly to thermodynamics. A high-information message is a highly ordered one; a noise-dominated channel is a high-entropy one; maintaining a signal against noise is, mathematically, the same operation as maintaining a cell against dissolution. The generalization that intelligence is the local accumulation of negentropy, the creation and preservation of order against the universal tide, was Wiener's, not Shannon's, but it followed from the mathematics both men had helped develop.
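The correspondence can be made concrete in a few lines of code. The sketch below is a minimal illustration, not anything from Wiener or Shannon; the messages and the alphabet are arbitrary choices for the demonstration. It computes per-symbol Shannon entropy: a redundant, ordered message scores low, uniform noise over the same alphabet scores near the ceiling, and in Wiener's sign convention the gap between a message and that ceiling is its order.

```python
import math
import random
import string
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Per-symbol Shannon entropy H = -sum(p * log2(p)), in bits.

    A skewed, predictable symbol distribution scores low (order);
    a near-uniform one scores high (noise).
    """
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

alphabet = string.ascii_lowercase + " "

# A redundant, ordered message versus uniform noise over the same alphabet.
ordered = "the pattern repeats and the pattern repeats and the pattern repeats"
noise = "".join(random.choice(alphabet) for _ in range(len(ordered)))

print(f"ordered: H = {shannon_entropy(ordered):.2f} bits/symbol")
print(f"noise:   H = {shannon_entropy(noise):.2f} bits/symbol")
print(f"ceiling: H = {math.log2(len(alphabet)):.2f} bits/symbol")
# In Wiener's convention, the distance between a message and the noise
# ceiling is its negentropy: the order it carries.
```

The point of the toy is the sign convention: Shannon's H measures uncertainty, and the order of a message is how far its distribution falls below the uniform maximum.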
This framework gives Segal's river of intelligence its thermodynamic ground. The trajectory from hydrogen atoms through chemical self-organization, biological evolution, symbolic thought, and artificial computation is the trajectory of increasingly sophisticated anti-entropic channels. Each channel is a more powerful mechanism for creating and maintaining pattern against entropic pressure. Each is temporary; each requires continuous energy input; each eventually dissolves. But while the channels hold, the river widens, and the order that accumulates is genuine.
Wiener's emphasis on entropy had a moral dimension alongside the physical one. A society, he argued, is an organization that maintains its internal complexity against the entropic pressure of the larger world. The maintenance requires continuous work: communication between members, institutions that process information, feedback loops that detect and correct deviations from viable conditions. A society that stops maintaining its informational and institutional architecture does not remain the same; it dissolves, not because anyone attacks it but because the second law is relentless and the maintenance requires effort. The warning in The Human Use of Human Beings is that automated systems, if deployed without adequate governors, can accelerate this dissolution by substituting machine output for the human judgment that maintains institutional coherence.
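The maintenance argument is the cybernetic loop itself, and a toy simulation makes the asymmetry vivid. The sketch below is illustrative only; the set point, gain, and noise scale are arbitrary values chosen for the demonstration, not parameters from Wiener. With a negative-feedback correction the state stays near its viable condition; without one, the same disturbances accumulate as an unchecked random walk.

```python
import random

def run(steps: int, feedback: bool, set_point: float = 37.0,
        gain: float = 0.5, seed: int = 0) -> float:
    """Hold a state near set_point against random disturbance.

    With feedback, each step measures the deviation and applies a
    proportional correction; without feedback, disturbances accumulate.
    The shared seed gives both runs the same disturbance sequence.
    """
    rng = random.Random(seed)
    state = set_point
    for _ in range(steps):
        state += rng.gauss(0, 0.5)               # entropic disturbance
        if feedback:
            state -= gain * (state - set_point)  # detect and correct
    return abs(state - set_point)

print(f"with feedback:    final deviation {run(1000, feedback=True):.2f}")
print(f"without feedback: final deviation {run(1000, feedback=False):.2f}")
```

The loop does no magic; it simply spends work every step to cancel deviations, which is exactly what the second law demands of any structure that wants to persist.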
The question the AI age raises, in thermodynamic terms, is whether large language models produce genuine anti-entropy — new order that did not exist before — or sophisticated recombination of existing order. The distinction matters. If the model creates, it is a genuine anti-entropic channel, adding to the sum of order in the universe. If the model recombines, the anti-entropic contribution is upstream: in the human purpose that directs the recombination, in the training data that originally embodied organized information, in the judgment that distinguishes valuable rearrangements from impressive but empty ones. Wiener did not settle this question; his framework provides the terms in which it can be asked with precision.
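The distinction admits a crude but exact toy form. In the sketch below, which is my illustration rather than anything in Wiener, a pure permutation of a message leaves its symbol-frequency entropy unchanged to the last bit: by that first-order measure, recombination creates no new order. Whatever genuine novelty is, it must live in structure that frequency counts cannot see, which is part of why the question resists easy settlement.

```python
import math
import random
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Per-symbol Shannon entropy of the character distribution, in bits."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

original = "order is pattern maintained against dissolution"
recombined = "".join(random.sample(original, len(original)))  # pure permutation

# A permutation preserves the symbol distribution, so this entropy is
# identical for both strings: by frequency alone, nothing new was created.
print(f"original:   H = {shannon_entropy(original):.4f} bits/symbol")
print(f"recombined: H = {shannon_entropy(recombined):.4f} bits/symbol")
```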
Entropy as a thermodynamic concept was formalized by Rudolf Clausius in 1865 and extended statistically by Ludwig Boltzmann in the 1870s. Claude Shannon's 1948 extension into information theory created the mathematical bridge between physical and informational entropy.
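The bridge is visible in the formulas themselves. Written side by side, in standard textbook forms rather than quotations from the original papers, the three definitions share one shape:

```latex
dS = \frac{\delta Q_{\mathrm{rev}}}{T}   \qquad \text{(Clausius, 1865)}

S = k_B \ln W                            \qquad \text{(Boltzmann, 1870s)}

H = -\sum_i p_i \log_2 p_i               \qquad \text{(Shannon, 1948)}
```

For a uniform distribution over W equally likely states, Shannon's H reduces to log2 W, which is Boltzmann's formula up to the constant factor fixed by k_B; that identity is the entire mathematical bridge.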
Wiener integrated the two frameworks in Cybernetics (1948) and developed the social and ethical implications in The Human Use of Human Beings (1950). His treatment of entropy as the universal antagonist of intelligence remains one of the most compact statements of the cybernetic worldview.
Universal tendency. The second law applies at every scale; no system is exempt.
Local anti-entropy is work. Maintaining order requires continuous energy expenditure; the alternative is dissolution, not stasis.
Information as negentropy. Shannon's mathematics ties organized information to thermodynamic order; maintaining a signal and maintaining a structure are the same operation in different vocabularies.
Intelligence as channel. Each cognitive innovation — language, writing, AI — is a new channel for the creation and maintenance of order.
Genuine novelty vs. recombination. Whether an anti-entropic channel adds to the order of the universe or redistributes existing order is a question with real thermodynamic content.
Whether modern AI systems produce genuine novelty or sophisticated recombination is an active question. Wiener's framework does not settle it definitively but provides the vocabulary. The human contribution to human-AI loops — purpose, judgment, the evaluation of what is worth creating — may be what ensures the collaboration produces genuine anti-entropy rather than impressive rearrangement.