The Designer's Obligation — Orange Pill Wiki
CONCEPT

The Designer's Obligation

The structural duty — analogous to medical and structural-engineering obligations — that knowledge of mechanism imposes on those who design tools affecting millions who cannot see the mechanism themselves.

The designer's obligation is the ethical and structural duty Raskin's framework imposes on technology designers whose work shapes the cognitive environment of millions. The obligation is grounded in the same principle that governs pharmaceutical, food-safety, and structural-engineering regulation: specialized knowledge creates an asymmetry of understanding between the provider and the public, and that asymmetry generates obligations the public cannot discharge through its own efforts. The patient cannot study pharmacology. The consumer cannot test nutritional content. The building occupant cannot evaluate structural integrity. The user of an AI system cannot see the engagement architecture shaping her experience. The specialist possesses knowledge the public needs; the specialist's obligation is to provide that knowledge — or, when knowledge alone is insufficient, to build protections into the product so that the public's safety does not depend on the public's understanding.

In the AI Story


The obligation has three dimensions: informational, temporal, and numerical. The informational asymmetry is that the designer understands the engagement mechanisms while the user experiences the engagement without seeing its architecture. The temporal asymmetry is that design decisions persist — one choice made by a few people affects millions of users for years. The numerical asymmetry is that the affected population had no voice in the decision and has no mechanism to modify its effects.

The technology industry has largely escaped the regulatory framework that governs other domains where these asymmetries exist. The pharmaceutical industry operates under strict requirements for effect studies, documentation, disclosure, and regulatory oversight. The construction industry operates under building codes, professional licensing, and liability frameworks. The technology industry has operated, for most of its history, under almost none of these constraints — producing tools that affect billions of minds with no obligation to study effects, document harms, or submit to regulatory authority.

Raskin's framework argues that this exemption is an anomaly in the landscape of consumer protection, and the anomaly is becoming increasingly difficult to justify as the effects of engagement-optimized tools accumulate. The obligation he calls for would include: design standards for cognitive health analogous to product safety standards for physical products; transparency requirements for optimization objectives; regulatory institutions with the technical expertise and operational speed to evaluate AI designs and require modifications when designs produce cognitive effects exceeding specified thresholds.

The obligation is not discharged by publishing a terms-of-service document no user reads, or by offering an opt-out buried in a settings menu. It is discharged by building the protections into the design itself — the reflection prompts, the natural stopping points, the cognitive health metrics, the calibrated challenge — so that the user's well-being does not depend on the user's understanding of mechanisms operating below conscious awareness. The New Mexico courtroom in which Raskin testified against Meta in January 2026 is the book's closing image of what discharge of the obligation looks like when it is finally imposed.

Origin

The obligation framework draws on a lineage of consumer-protection argument running through Ralph Nader's Unsafe at Any Speed (1965), the regulatory apparatus built around pharmaceuticals after the thalidomide crisis, and the structural-engineering liability framework that followed nineteenth-century bridge collapses. Its application to digital and AI design has been developed by Raskin and Harris through the Center for Humane Technology's policy work from 2018 onward.

Key Ideas

Specialized knowledge creates obligation. The asymmetry between designer understanding and user experience generates duties the user cannot discharge herself.

Three asymmetries. Informational, temporal, and numerical — each requiring distinct institutional response.

Built into design. The obligation is discharged not through disclosure but through design that incorporates protections the user cannot implement herself.

Regulatory anomaly. Technology's exemption from the frameworks governing comparable domains is increasingly difficult to justify as effects accumulate.

Further reading

  1. Ralph Nader, Unsafe at Any Speed (1965)
  2. Lawrence Lessig, Code and Other Laws of Cyberspace (1999)
  3. Shannon Vallor, Technology and the Virtues (2016)
  4. Center for Humane Technology, policy recommendations
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.