Support Personnel — Orange Pill Wiki
CONCEPT

Support Personnel

Becker's term for the essential but uncredited contributors whose labor sustains creative work — the piano tuners, session musicians, annotators, and content moderators whose contributions the conventions of credit render invisible.

Support personnel are present in every art world, and their invisibility is not accidental — it is conventional, produced and maintained by shared understandings about what counts as creative contribution and what counts as mere support. The line between the two is not inherent in the work; it is drawn by the conventions, and it could be drawn differently. The audience at a symphony sees the conductor and first violinist but not the luthier who carved the violin, the piano tuner who arrived at seven in the morning, the stage crew, the custodian. Their contributions are essential — remove any and something goes wrong — but the conventions of credit allocate recognition to performers. In the AI world, the support personnel are even more invisible: data annotators on other continents, content moderators exposed to traumatic material, open-source developers whose code trained the models without consent.

In the AI Story

The global data annotation workforce numbers in the hundreds of thousands to low millions, concentrated in Kenya, India, the Philippines, Venezuela — countries where English-language literacy is high and wages are low. The work pays a few dollars per hour: enough to attract workers locally, far below what the same work would command where AI companies are headquartered. Their contribution is essential — the model cannot be trained without labeled data — but the conventions of the AI world assign them no credit, no visibility, and no voice.

Content moderators occupy a similar position. They review and filter outputs, flag harmful content, red-team systems for safety. The emotional and psychological toll has been documented: exposure to violent, disturbing, and traumatic content as routine work. Their labor keeps AI outputs within the bounds users expect, and the conventions acknowledge their contribution no more than the concert world acknowledges the custodian's mop.

The open-source community represents a different kind of support personnel. Developers contributed code under licenses permitting sharing and modification. That code was subsequently ingested by AI training processes. The contributions are essential to model capabilities, but the developers did not consent to this specific use and receive no compensation. This is a convention gap — a situation where existing shared understandings do not provide clear guidance for new circumstances.

Becker would not have called this exploitation exactly — his vocabulary was deliberately less loaded. He would have called it a convention of credit and compensation that distributes cooperative returns in a way that reflects power relations among participants rather than the value of their contributions. The convention persists because those with power to change it have no incentive, and those who would benefit from change have no leverage.

Origin

Becker developed the concept through ethnographic observation of orchestras, film productions, theaters, and other cooperative creative enterprises. The concept was formalized in Art Worlds (1982), Chapter 4, which analyzed how different art worlds handle the support function — some inclusively (film credits), some exclusively (the novelist's byline).

Mary Gray and Siddharth Suri's Ghost Work (2019) documented the vast distributed workforce performing micro-tasks on which AI systems depend, providing the empirical foundation for applying Becker's framework to the AI world.

Key Ideas

Invisibility is a convention, not a fact. Support personnel are invisible because art world conventions render them so. Different conventions would render them differently visible.

The AI world's support layer is geographically distributed and contractually obscured. Global labor arbitrage routes annotation work to wherever workers will accept the lowest rate, maintaining invisibility through distance.

The convention serves power. It persists because changing it would require those with power to accept reduced returns, and those who would benefit lack leverage to force the change.

Convention gaps arise when practices outpace shared understandings. Open-source code ingested into AI training is a new use that existing licenses did not anticipate.

The scale is different from traditional art worlds. The global support network for AI dwarfs the cooperative networks of any previous creative domain, making the invisibility more consequential.

Debates & Critiques

Defenders of current arrangements argue that annotation workers freely accept the wages offered — that labor markets reveal preferences and contracts reflect agreement. Becker's framework does not dispute the market analysis but asks what it leaves out: the conventions that determine which participants appear as negotiating parties and which appear as raw material; the social structures that constrain the options available to workers in particular geographies; the invisibility that prevents the full cooperative network from being recognized as such.

Further reading

  1. Howard Becker, Art Worlds, Chapter 4: 'Mobilizing Resources' (University of California Press, 1982)
  2. Mary Gray and Siddharth Suri, Ghost Work (Houghton Mifflin Harcourt, 2019)
  3. Milagros Miceli and Julian Posada, 'The Data-Production Dispositif' (CSCW, 2022)
  4. Adrienne Williams, Milagros Miceli, and Timnit Gebru, 'The Exploited Labor Behind Artificial Intelligence' (Noema, 2022)
  5. Sarah Roberts, Behind the Screen: Content Moderation in the Shadows of Social Media (Yale University Press, 2019)
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.