The informational view of reality holds that information is more fundamental than matter or energy—that the physical world is, at bottom, an information-processing architecture. John Archibald Wheeler proposed this in 1989 with his compressed slogan 'it from bit,' and subsequent developments in black hole thermodynamics, the holographic principle, and quantum information theory have moved the thesis from speculation toward a serious, active research program. Black holes have entropy proportional to the area of their event horizons, not their volume—suggesting that information is encoded on surfaces rather than in volumes. The holographic principle, developed by Gerard 't Hooft and Leonard Susskind, generalizes this observation: the information content of any region of space is proportional to the area of its boundary, not its volume. If information is fundamental, then the emergence of systems that process information—atoms, molecules, cells, brains, computers—is not a curious sidebar in cosmic history but the main plot.
The argument begins with black hole thermodynamics. In the early 1970s, Jacob Bekenstein argued that black holes possess entropy, and Stephen Hawking confirmed through his discovery of Hawking radiation that this entropy is proportional to the event horizon's area. This was startling because entropy, in classical thermodynamics, measures the number of microstates consistent with a macrostate—and a black hole, characterized by just three numbers (mass, charge, angular momentum), appears to have no internal microstates at all. The resolution required recognizing that the entropy measures information about the matter that formed the black hole, information that is somehow encoded on the two-dimensional horizon rather than in the three-dimensional interior. This insight led to the holographic principle: the three-dimensional world may be a projection of information encoded on a two-dimensional boundary.
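The proportionality between entropy and horizon area can be stated precisely. The standard Bekenstein–Hawking formula (a textbook result, not derived in this essay) is:

```latex
S_{\mathrm{BH}} \;=\; \frac{k_B\,c^{3}\,A}{4\,\hbar\,G}
\;=\; \frac{k_B\,A}{4\,\ell_P^{2}},
\qquad
\ell_P \;=\; \sqrt{\frac{\hbar G}{c^{3}}}
```

Here \(A\) is the horizon area and \(\ell_P\) is the Planck length. The second form makes the surface-encoding point vivid: the entropy counts roughly one unit of information per Planck-sized cell of the horizon's area, not per cell of the interior's volume.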
Paul Davies has drawn out the implications with characteristic clarity. If information is fundamental, then the cascade from atoms to algorithms is not a sequence of lucky accidents but a trajectory that the architecture of reality makes probable. The hydrogen atom maintaining its quantum state is an information-processing structure. So is a strand of DNA. So is a neuron. So is a large language model. The difference is one of degree—of the sophistication, speed, and flexibility with which information is processed—not of kind. This continuum places biological intelligence and artificial intelligence on the same spectrum, expressions of the same underlying tendency of the universe to generate systems that process information in increasingly complex ways.
The practical consequence is that tools which process information powerfully are not mere conveniences. They are amplifiers of the universe's deepest tendency. When a developer describes a problem to Claude and receives working code, the interaction is a transformation of information from one organized state to another—mediated by a system trained on the largest corpus of organized human information ever assembled. The output is a new pocket of negative entropy, a local increase in order that serves a purpose. From the perspective of information physics, this is among the most significant developments in the history of information processing on this planet.
John Wheeler proposed 'it from bit' in 1989, near the end of a career spent investigating the deepest questions in physics. The phrase compressed decades of thinking about quantum measurement, the observer's role in determining reality, and the connection between information and physics. Wheeler's student Jacob Bekenstein had proposed in 1972 that black holes have entropy, and Stephen Hawking's 1974 discovery of black hole radiation confirmed the connection between gravity, thermodynamics, and information. The holographic principle emerged in the 1990s from Gerard 't Hooft's and Leonard Susskind's attempts to resolve the black hole information paradox, and by the early 2000s the view that information is fundamental had moved from philosophical speculation to active research program in theoretical physics.
It from bit. Wheeler's thesis that every physical quantity derives its ultimate significance from bits of information—yes-or-no questions answered by quantum measurement—and that matter and energy are derivatives of information rather than the reverse.
Holographic principle. The information content of any region of space is proportional to the area of its boundary, not its volume—suggesting that the three-dimensional world is a projection of information encoded on a two-dimensional surface.
Black hole entropy. The discovery that black holes possess entropy proportional to their horizon area demonstrated that information is a physical quantity governed by thermodynamic laws, not merely an abstract concept.
Continuum of processors. If information is fundamental, then all systems that maintain or transform information—atoms, cells, brains, computers—participate in the same cosmic process at different levels of sophistication.
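The magnitudes behind "entropy proportional to horizon area" can be made concrete. A minimal numeric sketch of the Bekenstein–Hawking formula for a solar-mass black hole, stated in bits (the constant values, the solar-mass figure, and the function name `bh_entropy_bits` are standard reference numbers and an illustrative helper, not taken from the text above):

```python
import math

# Physical constants in SI units (CODATA reference values)
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G    = 6.67430e-11       # gravitational constant, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light, m/s
k_B  = 1.380649e-23      # Boltzmann constant, J/K

def bh_entropy_bits(mass_kg: float) -> float:
    """Bekenstein-Hawking entropy of a Schwarzschild black hole, in bits."""
    r_s  = 2 * G * mass_kg / c**2            # Schwarzschild radius, m
    area = 4 * math.pi * r_s**2              # horizon area, m^2
    S = k_B * c**3 * area / (4 * hbar * G)   # entropy, J/K
    return S / (k_B * math.log(2))           # dimensionless entropy in bits

M_sun = 1.989e30  # solar mass, kg
print(f"{bh_entropy_bits(M_sun):.2e} bits")  # on the order of 10^77 bits
```

The striking feature is the scaling: because the entropy grows with the *area* of the horizon, doubling the mass quadruples the information content—exactly the surface-not-volume behavior the holographic principle generalizes.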