Every technological revolution produces a class of workers whose labor is essential to the system's operation and invisible in the system's narrative. The AI ecosystem depends structurally on human work that its marketing materials never mention: data labelers in Nairobi, Manila, and Dhaka who categorize millions of images and text samples; RLHF trainers who rate AI responses for helpfulness; content moderators who review harmful outputs; cloud infrastructure technicians maintaining server farms; domestic workers (often spouses) who sustain household functioning while the builder builds. These workers make AI possible—without labelers, the models would not train; without moderators, the outputs would be unsafe; without domestic infrastructure, the absorbed builder could not sustain absorption. Yet their contributions are systematically erased, their names appearing nowhere in the celebratory accounts of AI's capabilities. The erasure is not accidental but structural: it is built into systems designed to maximize the visibility of certain labor (engineering, founding, using) and to minimize the visibility of other labor (preparing, monitoring, sustaining).
The data labeler in Nairobi embodies the starkest form of invisibility. She spends eight hours categorizing traffic signs so self-driving cars can learn to recognize them, earning in a day what an American software engineer earns in minutes. Her work is repetitive, cognitively demanding (sustained attention across hours), and essential—the model cannot learn without the labeled data. Yet her contribution is structurally concealed: the self-driving car's narrative celebrates the algorithm, the training process, and the users' liberation from driving. The labeler who made the algorithm possible is mentioned nowhere. Terkel would have sat in the labeling center, turned on the recorder, and asked: What do you do all day? The testimony would describe the screen, the quota (items per hour, per shift), the strategies for maintaining focus across numbing repetition. The labeler might say, as the spot welder told Terkel, "I am a machine"—training a machine to make human labor unnecessary by performing the most machine-like human labor imaginable.
Content moderators occupy a more psychologically hazardous position. They review AI outputs for safety—scrolling through violent, sexually explicit, deceptive, or disturbing material as a condition of employment. The work is cognitively demanding (rapid categorization), emotionally taxing (exposure to humanity's worst outputs), and structurally invisible: recognized only when it fails, never when it succeeds. The moderator is the immune system of the AI ecosystem, and like biological immune systems, she is noticed only during infection. Research documents PTSD-level psychological damage among content moderators, yet the platforms that employ them (often through third-party contractors like Sama) provide minimal mental health support and compensate at rates reflecting the work's invisibility rather than its essentiality or its cost.
Domestic infrastructure supporting the AI-absorbed builder is the third form of invisible labor, and it maps precisely onto the feminist scholarship Terkel never explicitly engaged but whose insights his method corroborates. Arlie Hochschild's second shift and Silvia Federici's wages-for-housework campaign both demonstrate that productive labor depends on reproductive labor that capitalism systematically undercounts. The builder working late with Claude depends on someone feeding children, managing logistics, performing the emotional labor of household maintenance. This someone is usually a woman, usually unpaid for the domestic work, and usually absent from the builder's account of how the productive output was achieved. Terkel's method would have corrected this absence by interviewing the spouse and presenting her testimony with equal weight to the builder's—not to accuse but to make visible the full ecology of labor the AI-augmented output depends on.
Terkel's attention to invisible workers was methodological from the start. Division Street interviewed washroom attendants and elevator operators alongside executives and professionals. Working included a prostitute, a strip miner, and a gravedigger—voices that middle-class readers had never encountered in the first person. The inclusion was not voyeuristic but epistemic: Terkel believed that the people whose labor was most invisible were often the people whose testimony was most revealing, because their position at the system's margins gave them a view of the system's functioning that insiders could not access. The washroom attendant saw the executives' faces; the executives did not see his. The asymmetry was evidence—evidence of how visibility itself is distributed along lines of power.
Essential labor is not always visible labor. The AI ecosystem depends on work the narrative systematically excludes—a structural feature, not an oversight, built into systems designed to celebrate creation and conceal maintenance.
The laborer's contribution and the laborer's compensation are inversely correlated. Data labelers and content moderators perform work that is essential to AI's value and are compensated at rates reflecting their geographic and institutional distance from value capture—$1–3 per hour in many cases.
Domestic invisibility sustains productive visibility. The builder's AI-augmented output depends on domestic infrastructure (childcare, meal preparation, emotional presence) that the builder's productivity narrative does not account for and that falls disproportionately on women.
Invisibility is violence. Not the spectacular violence of exploitation but the quiet, ambient violence of not being seen—of performing labor the system depends on while being excluded from the recognition that dignifies human contribution.