Across his late essays, Dyson argued that thinking on long timescales was not merely an intellectual virtue but an ethical obligation for those whose work affected the future. Scientists, engineers, policymakers, and increasingly the technologists whose tools shape the cognitive substrate of the species all bear a duty that other actors do not bear to the same degree. The obligation is asymmetric because the capacity is asymmetric. Most people, most of the time, cannot think coherently about millennia; the institutions that make such thinking possible are rare and should be protected. But those who can think this way owe their thinking to those who cannot, because decisions made on short horizons by those with long-horizon capacity impose costs on populations that have no representation in the decision-making process.
The framework draws heavily on Dyson's reflections on the nuclear era. The decisions made in the 1940s and 1950s about nuclear weapons affected not only the people alive at the time but every generation that followed. Those decisions were made by small groups of scientists and officials who had the capacity to think about the long-term implications but often did not. Dyson's own participation in Project Orion — a proposal to use atomic bombs to propel spacecraft — taught him, uncomfortably, what happened when capable people failed to exercise the long-view responsibility their positions demanded.
The framework bears on AI with particular force. The engineers and researchers currently building frontier AI systems are, by any reasonable measure, among the most consequential actors in human history. Their decisions affect not only the quarterly metrics their employers track but the cognitive substrate on which future civilizations will build. The builder's responsibility that Wiener articulated and that Amodei has extended is, in Dyson's vocabulary, the ethical consequence of occupying a position of long-view capability.
The framework complicates simple prescriptions. Dyson did not believe that long-view responsibility required slowing down or refusing to build. He believed it required building with attention — attention to what the structures being built could become, attention to whether they served persistence or undermined it, attention to the populations that would inherit the decisions being made now. The beaver's dam must be built, but it must be built with awareness that the ecosystem it creates will outlast the builder.
The governance gap that the Orange Pill cycle diagnoses is, in Dyson's framework, a failure of long-view responsibility at the institutional level. Governments, regulators, and civic institutions have not developed the capacity to think on the timescales the technology requires. The failure is not individual — many of the people occupying these roles are serious and thoughtful — but structural. Institutions optimized for electoral cycles and quarterly reports cannot easily accommodate decisions whose consequences unfold across generations.
The framework emerged across Dyson's essays for The New York Review of Books and in his popular books, receiving its most systematic statement in Imagined Worlds. The thinking was shaped by his decades of service on government advisory committees, where he repeatedly observed the mismatch between the timescales on which decisions were made and the timescales on which their consequences unfolded.
Asymmetric capacity, asymmetric duty. Those with the institutional and cognitive resources to think long-term owe their thinking to those who lack those resources.
Attention, not refusal. Long-view responsibility does not require slowing down; it requires building with awareness of what the structures being built will become.
Institutional design. The failures of long-view thinking are often structural rather than individual; institutional reform is the appropriate target.
Intergenerational justice. Decisions made now impose costs on populations that have no representation in the decision-making process; long-view responsibility is the partial remedy.