Shoshana Zuboff was one of the first tenured women at Harvard Business School, where her career-long investigation of technology's transformation of work established the foundational vocabulary for understanding AI's double movement. Her 1988 landmark, In the Age of the Smart Machine, introduced action-centered skill and intellective skill, documenting how computerization simultaneously destroyed embodied knowledge and created unrealized potential for deeper understanding. Her 2019 The Age of Surveillance Capitalism reframed the digital economy as a system that extracts human experience as raw material, converting behavioral surplus into prediction products sold in behavioral futures markets. By 2025, her position had hardened from regulation to abolition—calling for the dismantling of extraction mechanisms operating at civilizational scale.
Zuboff's methodology is ethnographic immersion. She spent years embedded in paper mills, telecommunications companies, and banks undergoing computerization in the 1980s—not consulting but observing, documenting how workers experienced the dissolution of knowledge forms they had spent decades building. The paper mill worker who felt pulp consistency between his fingers possessed action-centered skill—knowledge residing in the body's calibrated nerve endings. When computerization moved him to a control room reading digital displays, that knowledge had no substrate in which to persist. The transition was epistemological, not merely technological: the worker's relationship to truth itself was altered. Where verification had occurred through touch, it now occurred through symbolic interpretation. The cognitive demand shifted from embodied knowing to what Zuboff called intellective skill—the capacity to construct mental models from abstracted representations.
Her surveillance capitalism framework emerged from recognizing that twenty-first-century platforms had discovered a new raw material: human experience itself. Not labor—capitalism had exploited labor for centuries—but the totality of what people do, search, linger over, abandon while living digitally mediated lives. The architecture operates through a four-stage sequence: experience is claimed as free behavioral surplus, fed into the computational factory of machine intelligence, transformed into prediction products, and sold in behavioral futures markets to parties interested in modifying rather than understanding behavior. AI is not separate from this apparatus—it is the apparatus. As Zuboff told interviewers, the pipes filled with behavioral surplus all converge in the factory of machine intelligence, and what emerges are computational products predicting human behavior.
In December 2025, Zuboff declared AI "surveillance capitalism continuing to evolve and expand," calling for abolition rather than regulation of the fundamental extraction mechanisms. Her Harvard Kennedy School testimony demanded the end of "secret, massive-scale extraction of the human and its declaration as a corporate asset." The trajectory from documentation to critique to abolition tracks the trajectory of extraction itself—from search and social media residue to the cognitive labor of creative professionals using Claude Code. The data generated by human-AI interaction is richer and more intimate than that of any previous digital medium: it reveals not what users search for but how they think, exposing the deep structure of cognitive architecture that is then claimed as platform property.
Zuboff's intellectual formation combined rigorous empirical methods with phenomenological attention to lived experience. Her early career in organizational behavior at Harvard established her ethnographic methodology—the commitment to embedding herself in workplaces undergoing transformation and documenting workers' experience from the inside. The paper mill fieldwork of the early 1980s required years: gaining access, building trust, learning the workers' vocabulary, observing the transition from hands-on to screen-mediated operation as it unfolded in real time. The workers she observed—at sites given pseudonyms like Piney Wood in her field notes—were not abstractions but specific people wrestling with specific losses that productivity metrics rendered invisible.
The surveillance capitalism framework emerged from her recognition, watching the rise of Google and Facebook in the 2000s, that a new form of capitalism had crystallized—one whose raw material was not land, labor, or capital in the classical sense but human experience converted into behavioral data. The extraction was unilateral, the conversion invisible, the monetization opaque. The Age of Surveillance Capitalism took years to research and seven hundred pages to document, mapping supply chains of behavioral data with the same exhaustive empiricism she had brought to the paper mills. Her concepts—instrumentarian power, the Big Other, epistemic inequality—have shaped regulatory discourse worldwide, influencing the EU AI Act and democratic governance debates across continents.
Automating versus informating. Every technology that automates also informates—generating data about processes that creates potential for new understanding—but institutions determine which function is realized.
Action-centered and intellective skill. Embodied knowledge built through physical engagement versus symbolic knowledge built through interpretation of abstracted representations—AI eliminates the first while demanding evolution of the second.
Behavioral surplus extraction. Human experience claimed as free raw material and processed into prediction products—the mechanism through which surveillance capitalism operates and AI extends into cognitive labor.
The worker's dilemma. Resistance preserves identity but forfeits capability; adaptation expands capability but transforms identity—neither choice is costless in transitions that compress decades into months.
Institutional design determines outcomes. The informating dividend is real but not self-realizing—it flows to workers only through deliberate institutional structures that markets do not naturally build.
Cory Doctorow challenges Zuboff's behavioral modification thesis, arguing surveillance capitalism's claimed power to shape behavior is largely "snake oil" and the real problem is monopoly rather than manipulation. The debate hinges on whether extraction is primarily economic (addressable through antitrust) or epistemic (requiring abolition of extractive mechanisms). Zuboff's 2025 hardening toward abolition suggests she sees the epistemic dimension as irreducible to market structure. Her framework has also been contested by scholars who argue it overstates corporate power and understates user agency, though the AI moment's compression of extraction into cognitive labor appears to vindicate her structural diagnosis.