Builder responsibility is the ethical position Wiener developed in response to his wartime work on anti-aircraft fire control and defended for the remaining two decades of his life at substantial professional cost. The position has two claims. First, the builder of a powerful system — particularly one that learns, adapts, or operates autonomously — bears continuing moral responsibility for the system's downstream consequences. The construction does not end the obligation; it begins it. Second, the claim that research is morally neutral, that the engineer's responsibility ends when the specification is met, is a fiction whose function is to transfer responsibility to institutions (governments, corporations, markets) less accountable than the engineer. Wiener rejected this fiction explicitly in a 1947 open letter in the Atlantic Monthly declining to provide information to military agencies, and he maintained the position consistently through his final book God & Golem, Inc.
The conventional mid-twentieth-century view held that scientists produced knowledge and engineers produced technology, both neutral with respect to downstream use. Whether a discovery saved lives or took them, empowered citizens or oppressed them, was a question for politicians, generals, and markets to answer — not for the researcher whose contribution ended at the laboratory door. The view was institutionally convenient: it allowed the vast expansion of military-funded research during the Cold War without requiring individual researchers to confront what their work was for.
Wiener rejected this on two grounds. First, he argued that the neutrality claim was empirically false. The same mathematics that enabled anti-aircraft defense enabled offensive missile guidance; the same feedback principles that governed a thermostat could govern a system for automated surveillance of a population. The researcher who claims not to know what her work will be used for is, in most cases, choosing not to know. Second, he argued that the neutrality claim was structurally corrosive: if no one in the chain of construction bears moral responsibility for the system, responsibility becomes diffuse to the point of nonexistence, and the system's behavior is effectively no one's problem until after the catastrophe.
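The dual-use point can be made concrete in a few lines: a feedback control law is indifferent to what it steers. The sketch below is illustrative only — the gains and setpoints are arbitrary — and applies one proportional controller first to a thermostat-style plant, then to a turret-bearing plant:

```python
def p_control(setpoint, measured, gain=0.2):
    """Proportional feedback: the correction is proportional to the error."""
    return gain * (setpoint - measured)

# The same law regulates room temperature toward a setpoint...
temp = 15.0
for _ in range(60):
    temp += p_control(20.0, temp)

# ...and steers a gun turret toward a predicted bearing. Nothing in the
# mathematics distinguishes the two uses; only the plant differs.
bearing = 0.0
for _ in range(60):
    bearing += p_control(35.0, bearing)
```

Both loops converge to their setpoints by the identical mechanism, which is the substance of Wiener's claim that the researcher "choosing not to know" the application is making a choice.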
Wiener's 1947 Atlantic Monthly letter — "A Scientist Rebels" — was the most public statement of the position. He announced that he would not share his research with any government agency that would use it for military purposes, and he encouraged other scientists to adopt similar stances. The letter was widely read, widely resented, and widely ignored. In the decades that followed, Wiener's funding diminished, his institutional position weakened, and his influence within the expanding national-security science establishment collapsed. He continued to refuse. The stance cost him dearly — financially, professionally, and in terms of the scientific prestige that mid-century American science could confer.
The contemporary AI context has made the position urgent again. The builder of a large language model is not in the same position as the builder of a hammer. The model learns, adapts, and produces outputs that shape billions of interactions between humans and information. The traditional move — ship the product, let the market sort it out — does not scale to systems of this consequence. Wiener's framework suggests that the builder's responsibility is continuing, not terminal: it extends to the maintenance of the governors, the monitoring of downstream effects, the willingness to shut the system down when it becomes clear that it is doing harm. The position is not comfortable. It is, Wiener argued, the only honest one.
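The continuing-obligation claim can be sketched as a control structure: a system that runs only while the builder's monitor permits it. Everything below — the names, the harm metric, the threshold — is a hypothetical illustration of the shape of the obligation, not a description of any real deployment:

```python
import itertools

def operate_with_governor(step, harm_metric, threshold, max_steps=1000):
    """Run `step` repeatedly, halting as soon as measured harm crosses
    `threshold`. The builder's obligation, in Wiener's sense, is to keep
    this monitor running and to retain — and use — the power to halt."""
    history = []
    for _ in range(max_steps):
        history.append(step())
        if harm_metric(history) >= threshold:
            return history, "halted"
    return history, "completed"

# Toy usage: outputs grow 0, 1, 2, ...; "harm" is just the latest output.
counter = itertools.count()
history, status = operate_with_governor(
    step=lambda: next(counter),
    harm_metric=lambda h: h[-1],
    threshold=5,
)
```

The design choice the sketch emphasizes is that the monitor and the shutdown condition live with the builder, not with a downstream market or agency — responsibility is continuing, not terminal.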
Wiener's position developed during World War II, crystallized in the aftermath of Hiroshima and Nagasaki, and was publicly articulated in the 1947 Atlantic Monthly letter. It is developed at length in The Human Use of Human Beings (1950, revised 1954) and forms the ethical spine of God & Golem, Inc. (1964).
Wiener was not alone — figures like Joseph Rotblat (who left the Manhattan Project in 1944 on moral grounds) and, later, the Pugwash Conferences articulated similar positions — but he was the most prominent mainstream scientist to maintain the stance publicly throughout the Cold War.
Continuing obligation. The builder's responsibility does not end when the system ships.
Neutrality is a fiction. The claim of value-neutral research transfers responsibility to less accountable actors.
Public refusal is legitimate. Wiener demonstrated that individual scientists can decline participation in projects they consider immoral, at cost.
Understanding confers obligation. Those who understand the feedback dynamics of a system have a responsibility others cannot have.
The cost of the stance is the test. A position that costs nothing is not a moral stance; it is a preference.
Critics argue that individual refusal does not stop the technology; someone else will build it. Wiener's response, consistent across his career, was that this misunderstood what the refusal was for. The refusal was not an effective veto; it was a statement of moral position, and the aggregate of such statements shifts the culture within which institutional decisions are made. The fact that someone else might build the weapon did not, in his view, relieve him of the obligation not to build it himself.