Aza Raskin — Orange Pill Wiki
PERSON

Aza Raskin

American designer (b. 1984), inventor of infinite scroll, co-founder of the Center for Humane Technology and the Earth Species Project — the builder who indicts his own creation.

Aza Raskin is an American designer and technologist whose career spans the unusual territory of having built one of the most consequential engagement mechanisms in internet history and then publicly disavowed it. The son of Jef Raskin, who initiated the Macintosh project at Apple, he designed infinite scroll at twenty-two, served as Creative Lead for Firefox at Mozilla, co-founded Massive Health, and in 2018 co-founded the Center for Humane Technology with Tristan Harris. He simultaneously co-founded the Earth Species Project, which uses AI to decode nonhuman animal communication — making him one of the few voices in the AI discourse who holds the tension between technological promise and technological harm from direct experience on both sides.

In the AI Story


Raskin's intellectual position is defined by a duality that most technology critics never have to inhabit. He is simultaneously an active builder of AI systems and one of the most vocal critics of AI's trajectory under the current incentive structure. The Earth Species Project uses the same transformer architectures that power Claude and GPT-4 — the same attention mechanisms, the same pattern-completion engines — to decode whale songs, dolphin clicks, and crow vocalizations. The technology he warns could hack the operating system of civilization is, in his hands, being used to listen to animals.

His critical framework, developed at the Center for Humane Technology, centers on what he and Harris call "the race to the bottom of the brain stem": the competition among platforms to engage ever-deeper neural circuits, capturing attention, then judgment, and eventually identity itself. The framework's application to AI is not an extension of Raskin's thinking about social media; in his formulation, it is its fulfillment. Social media, he has argued, was humanity's first contact with AI: its recommendation algorithms were early forms of the same computational intelligence that now powers large language models.

In January 2026, Raskin testified in a New Mexico courtroom against Meta, explaining under oath how the engagement mechanisms he had helped pioneer were designed to capture attention and why those designs proved harmful. The testimony crystallized his public position: the designer who understands the mechanism has an obligation to account for its consequences, regardless of the economic incentives that rewarded the original design.

Critics charge that Raskin's claims have sometimes been imprecise. A 2023 New York Times op-ed co-authored with Yuval Noah Harari contained factual errors about AI capabilities, and mathematician Noah Giansiracusa called some of its claims an "AI hype trap." Raskin's response has been to separate the specific errors from the structural argument they were meant to support: that the asymmetry between AI designers and users creates a power differential the current institutional landscape cannot manage. That argument, he maintains, the criticisms leave largely intact.

Origin

Born in 1984, Raskin grew up in the orbit of Jef Raskin's humane-interface tradition at Apple. His early career at Humanized produced infinite scroll, which became one of the most widely deployed design patterns in consumer software and, later, the centerpiece of his public critique of engagement architecture.

The 2018 founding of the Center for Humane Technology with Tristan Harris marked Raskin's transition from designer to public advocate. The organization developed the analytical vocabulary — downgrading, the race to the bottom of the brain stem, extraction-oriented design — that has shaped subsequent technology policy discourse.

Key Ideas

The builder who critiques. Raskin's authority derives from having designed the mechanisms he now opposes, giving his critique a specificity that external commentators cannot match.

The two Azas. Critic and builder are the same person, united by the conviction that technology's effects are determined not by its capabilities but by its design, and that design is shaped by incentive structures.

Social media as first contact. Raskin's reframing of the attention economy as humanity's first encounter with AI, making the lessons of 2010–2020 directly applicable to the 2020s.

Super human vs extra human. The distinction between technology that accelerates what humans already do (super human) and technology that expands what humans can perceive and understand (extra human).

Appears in the Orange Pill Cycle

Further reading

  1. Aza Raskin and Tristan Harris, The AI Dilemma (Center for Humane Technology, 2023)
  2. Yuval Noah Harari, Tristan Harris, and Aza Raskin, You Can Have the Blue Pill or the Red Pill (New York Times, March 2023)
  3. Your Undivided Attention podcast (2019–present)
  4. Jef Raskin, The Humane Interface (2000)
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.