The Authors Guild Letter — Orange Pill Wiki
EVENT

The Authors Guild Letter

The July 2023 open letter signed by more than ten thousand authors — Margaret Atwood, Jonathan Franzen, Jhumpa Lahiri, James Patterson among them — demanding that AI companies obtain consent, provide credit, and offer compensation before using published work to train language models.

The letter was organized by the Authors Guild, the largest professional organization for writers in the United States, and addressed to the chief executives of OpenAI, Alphabet, Meta, Stability AI, IBM, and Microsoft. Its demands were minimal: consent, credit, and compensation — the same three terms that have governed the use of creative work in the publishing economy for generations. The letter was the mildest form of collective action available, the form the framework knitters exhausted before turning to direct action. Its signatories were not breaking anything. They were asking, through the most polite available channel, for the recognition of a principle so basic that its contested status reveals the depth of the institutional failure: the principle that the use of a person's labor to enrich a corporation requires the person's consent.

In the AI Story

[Hedcut illustration: The Authors Guild Letter]

The letter followed the discovery that the Books3 dataset — a corpus of approximately 196,000 books compiled from the shadow library Bibliotik — had been used to train several major language models, including Meta's LLaMA. The dataset had been incorporated into the Pile, a widely used training corpus, without author permission. Authors who queried tools like The Atlantic's Books3 search discovered that their work had been used without their knowledge.

The response from AI companies was minimal. None accepted the letter's three demands as a framework for negotiation. OpenAI continued to argue that its training practices constituted fair use. Meta argued that Books3 was one small part of a larger training corpus. No company offered the consent, credit, or compensation the letter had requested.

The letter's effectiveness must be measured against Thompson's framework for what improvised collective action can and cannot achieve. It compelled attention — media coverage, congressional interest, industry panels. It did not compel structural change. No legislation followed. No industry standard emerged. No mechanism for the authors' ongoing representation in AI governance decisions was created.

Two months later, the Authors Guild escalated, filing a class-action lawsuit against OpenAI in September 2023 with named plaintiffs including John Grisham, George R.R. Martin, and Jodi Picoult. The lawsuit joins Andersen v. Stability AI as a legal test of whether existing copyright doctrine can accommodate the unprecedented scale of AI training-data acquisition.

Origin

The letter was published July 18, 2023, on the Authors Guild website and in multiple literary publications. Signatures continued to accumulate in the following weeks, eventually exceeding 10,000.

Key Ideas

Minimum demands. Consent, credit, compensation — the three most basic protections, not radical requirements.

Petition as mild action. The letter was the most polite form of collective response available, preceding any direct or legal action.

Institutional failure exposed. That such minimal demands were contested reveals how completely formal governance has failed to address AI training data acquisition.

Escalation pathway. The letter was followed by class-action litigation, illustrating the predictable progression from petition to direct action when demands are ignored.

Appears in the Orange Pill Cycle

Further reading

  1. Authors Guild, "Open Letter to Generative AI Leaders" (July 2023)
  2. Authors Guild v. OpenAI Inc., S.D.N.Y. Case No. 1:23-cv-08292 (filed September 19, 2023)
  3. Alex Reisner, "Revealed: The Authors Whose Pirated Books Are Powering Generative AI" (The Atlantic, 2023)
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.