Credit Conventions in AI — Orange Pill Wiki
CONCEPT

Credit Conventions in AI

The shared understandings — inherited from traditional art worlds, inadequate to the AI moment — that determine who receives recognition for AI-assisted work and whose contributions remain invisible.

Every art world has credit conventions, and none of them perfectly match the actual distribution of contributions. They cannot, because contributions are continuous — each participant contributes something, shading into others without clean boundaries — while credit is discrete: someone's name goes on the cover, and someone's does not. The discretization necessarily produces distortion. The question Becker asked was not whether art worlds are fair but whether the credit conventions accurately represent the actual distribution of contributions, and what happens when they do not. The AI world has produced a new version of the credit problem, complicated because the cooperative network includes a participant that is not a person: the model itself. When Segal writes that 'neither of us owns that insight' — referring to a connection that emerged from collaboration with Claude — he is describing a situation for which existing conventions of authorship have no adequate response.

In the AI Story

[Hedcut illustration: Credit Conventions in AI]

The existing conventions assume the unit of creative production is a human being. An author writes a book. A programmer writes code. When the process involves multiple humans, the conventions have mechanisms — co-authorship, ensemble credits, production credits — that distribute recognition, however imperfectly. But Claude is not a co-author in any sense the existing conventions recognize. It is not a person. It does not have interests, a career, a reputation that benefits from recognition.

This has led some participants to conclude the problem is simple: the human gets credit because the human is the only participant to whom credit is meaningful. The builder prompted, evaluated, selected, arranged, made editorial judgments. The builder is the author. Claude is a tool, like a word processor. The argument is clean and appealing. It is also, in Becker's terms, a convention masquerading as a fact. The claim that Claude is merely a tool equivalent to a word processor is a convention of classification, not an empirical observation.

The convention has consequences. It obscures the actual process of creation. It creates misleading standards for evaluating AI-assisted work. It discourages transparency about collaboration — if convention says the builder is author, then acknowledging Claude's contribution feels like an admission of diminished authorship, creating an incentive to conceal the collaboration, which prevents the AI world from developing accurate conventions for evaluating what it cannot see.

Segal's transparency about his collaboration with Claude — declared in the Foreword of The Orange Pill, examined in Chapter 7, reflected on throughout — is an attempt to establish a new convention of credit that is more adequate to the actual cooperative process. Not co-authorship exactly (Claude is not a person), but an acknowledged collaboration for which the AI world does not yet have a stable name.

Origin

Becker's analysis of credit conventions drew on studies of session musicians (like Glen Campbell in the 1960s Los Angeles recording scene, whose uncredited guitar appears on dozens of hit records), ghostwriters, screenwriters overshadowed by directors, and photographers whose assistants did substantial work.

The framework was extended to AI through analyses of authorship disputes emerging from AI-assisted writing, image generation, and code production — debates playing out in publishing contracts, journal policies, and platform terms of service.

Key Ideas

Credit conventions distort cooperative reality. The gap between discrete credit and continuous contribution is chronic; conventions can be more or less inclusive but cannot be perfectly accurate.

Claude is not merely a tool. A word processor does what the user tells it. Claude generates material the user did not specify, makes connections the user did not see. The tool classification is a convention, not an observation.

Sole credit discourages transparency. The convention that the builder is the author creates incentives to conceal the collaborative process, preventing accurate evaluation of AI-assisted work.

Existing conventions lack vocabulary for human-machine collaboration. Co-authorship assumes personhood; tool-use assumes passivity. Neither fits.

New conventions will stabilize whether deliberately chosen or not. The default — sole human credit — is stabilizing by inertia, and participants who do not intervene accept it by default.

Debates & Critiques

Some argue that legal frameworks should determine credit conventions (copyright law, contract terms). Becker's framework treats law as one influence among many; conventions of practice often precede and shape legal responses. Others argue that transparency about AI use is inherently stigmatizing and will always disadvantage those who disclose. The response is that stigma is itself convention-dependent and can shift as disclosure normalizes.


Further reading

  1. Howard Becker, Art Worlds, Chapter 1 (University of California Press, 1982)
  2. Jenny Davis, 'Anticipatory Disclosure: AI Attribution as Professional Norm' (2024)
  3. Authors Guild Open Letter on AI Training (July 2023)
  4. Andersen v. Stability AI, Class Action Complaint (2023)
  5. Henry Shevlin, 'Consciousness, Machines, and Moral Status' (2022)
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.