Every knowledge system depends on labor that its output conventions systematically conceal. In Schaffer's historical studies, this includes Robert Hooke building and operating Boyle's air pump while receiving no credit in published reports; David Kinnebrook performing astronomical observations that were attributed to the Astronomer Royal; and legions of human 'computers'—mostly women—performing mathematical calculations published under astronomers' names. The invisibility is produced by social conventions, not technical necessity: the gentleman's name appears on the published observation because gentlemanly status confers credibility, while the assistant's technically superior skill is rendered epistemically irrelevant. Applied to AI, the framework reveals strata of invisible labor: millions of creators whose work was absorbed into training data without consent or compensation; data labelers in Kenya and the Philippines earning under two dollars per hour to categorize content; content moderators suffering psychological harm from reviewing toxic material. The smooth interface presents AI as autonomous intelligence; the social machinery of its production remains hidden.
Schaffer's excavation of invisible labor challenges the mythology of individual discovery. The air pump experiments that established Boyle's reputation required Hooke's unacknowledged mechanical genius. Newton's optical experiments depended on instrument makers whose glass-grinding skill he relied on but never credited. The Greenwich Observatory's published stellar positions required assistants whose personal equation—the systematic delay between observation and recording—was treated as error rather than as contribution. In each case, the conventions of scientific publication attributed knowledge to the director while erasing the workers who made the knowledge possible.
The pattern extends beyond individual laboratories into industrial-scale knowledge production. Babbage's calculating engines, celebrated as mechanical minds, were designed to replace human computers—workers, predominantly women, who performed mathematical calculations by hand at wages reflecting their work's social devaluation. The engines could be celebrated as intelligent precisely because the human labor they mechanized had already been socially positioned as unintelligent—mere mechanical drudgery requiring no genuine thought. The attribution of intelligence to the machine depended on the prior devaluation of the humans it replaced.
The contemporary AI industry replicates this structure with disturbing fidelity. OpenAI's content moderation workforce in Kenya—documented by Time magazine in 2023—reviewed graphic violence and abuse for under two dollars per hour, producing the labeled datasets that taught models to avoid toxic outputs. These workers are to large language models what human computers were to nineteenth-century astronomy: essential contributors whose labor is technically recognized (in training pipelines, in corporate acknowledgments) but epistemically erased (in the interface, in the attribution of capability, in the public discourse that celebrates AI as autonomous intelligence).
The social mechanisms producing invisibility have evolved but retain structural continuity. The seventeenth-century mechanism was class: the gentleman's name appeared because social position conferred authority. The twenty-first-century mechanism is geographic and economic: the Silicon Valley builder's name appears, the Nairobi data labeler's does not, because the conventions of technology discourse attribute innovation to designers while treating implementation labor as interchangeable infrastructure. The platforms where demonstrations occur, the metrics that measure success, the narratives that distribute credit—all are controlled by the communities whose labor is visible, not by the communities whose labor sustains the system.
The concept emerged from Schaffer's archival work on seventeenth-century laboratories, particularly his study of Robert Hooke's dual role as operator and natural philosopher. Hooke was Boyle's intellectual equal and arguably his superior in experimental skill, yet his contributions were systematically erased by the social conventions that reserved philosophical credit for gentlemen. Schaffer recognized this as a structural pattern: every experimental system produces knowledge through a division of labor, and the conventions of attribution systematically privilege the labor of direction over the labor of execution. The Royal Society's motto Nullius in verba ('take nobody's word for it') was belied by the practice of taking the gentleman's word while dismissing the mechanic's testimony.
Attribution conventions shape who counts. The labor that receives credit is determined by social position and institutional conventions, not by technical contribution or epistemic importance.
Invisibility is produced, not natural. The erasure of technical labor from knowledge accounts is an active social achievement requiring specific conventions, not a passive feature of how knowledge works.
Devaluation precedes mechanization. Tasks performed by low-status workers are positioned as unintelligent before machines are celebrated for performing them—the attribution of intelligence follows the labor hierarchy.
Contemporary AI replicates historical patterns. Data labelers, content moderators, and creators are to AI what human computers and laboratory assistants were to nineteenth-century science—essential but invisible.
The smooth interface is a political achievement. Making labor invisible is not a technical feature of AI but a design choice that serves the interests of those who control the interface.