Prediction products are the manufactured goods of surveillance capitalism: computational assessments of future behavior produced by processing behavioral surplus through machine intelligence. Unlike products sold to consumers who use them, prediction products are sold in what Zuboff calls behavioral futures markets to businesses whose interest is not understanding behavior but shaping it—advertisers purchasing predictions of which users will click, insurers purchasing predictions of which applicants will file claims, employers purchasing predictions of which workers will remain productive. The products are valuable precisely to the extent they are accurate, and accuracy depends on the scale and intimacy of behavioral surplus extraction. AI has transformed prediction products from probabilistic forecasts into comprehensive cognitive profiles: not merely what individuals might do but how they think, how they evaluate, what expertise they possess—predictions about professional competence that could reshape hiring, promotion, compensation across knowledge economies.
Zuboff's framework identifies prediction products as the mechanism converting surveillance into capitalism—the step where extracted behavioral surplus becomes a monetizable commodity. Google's original advertising model sold advertising space; surveillance capitalism sells predictions about which users will click which ads and when. Facebook sells predictions about which content will maximize engagement with which demographic segments. The business model's innovation was not better advertising but better prediction, and a prediction's accuracy depended on behavioral surplus volume and algorithmic sophistication. The competitive dynamic driving platform development is not providing better services to users but producing better predictions for third-party purchasers—a dynamic Zuboff argues fundamentally inverts the classical market relationship in which producers serve consumers.
The AI age's prediction products are qualitatively different from earlier forms. Search-based predictions forecast instrumental behavior (this user is likely to purchase running shoes); social media predictions forecast engagement behavior (this user is likely to share political content). AI-interaction predictions forecast cognitive behavior: how competently this user evaluates code, how sound this user's judgment is under uncertainty, how effectively this user directs AI tools toward valuable ends. The predictions rest on cognitive behavioral data that reveals professional capability more directly than any resume, credential, or interview. An employer purchasing cognitive prediction products could sort applicants by thinking quality rather than by stated qualifications—a form of panoptic sorting invisible to those sorted, uncontestable by those it harms.
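The sorting described above can be made concrete with a toy sketch. Everything here is invented for illustration—the profile fields, the weights, and the applicants do not come from any real platform or product—but the structure shows the asymmetry at issue: a composite score whose inputs and weights the ranked individuals never see and cannot contest.

```python
# Hypothetical sketch of "panoptic sorting": ranking applicants by an
# opaque cognitive score inferred from AI-interaction behavior. All
# field names, weights, and data are illustrative assumptions, not a
# real system.

from dataclasses import dataclass


@dataclass
class CognitiveProfile:
    # Scores (0.0-1.0) a platform might infer from AI-interaction logs.
    code_evaluation: float            # how competently the user evaluates AI output
    judgment_under_uncertainty: float # how sound the user's judgment appears
    tool_direction: float             # how effectively the user directs AI tools


def opaque_score(p: CognitiveProfile) -> float:
    """Composite score with proprietary weights. The people being scored
    never see this function, so its results are uncontestable by design."""
    return (0.5 * p.code_evaluation
            + 0.3 * p.judgment_under_uncertainty
            + 0.2 * p.tool_direction)


def sort_applicants(profiles: dict[str, CognitiveProfile]) -> list[str]:
    """Rank applicants by inferred thinking quality, not stated credentials."""
    return sorted(profiles,
                  key=lambda name: opaque_score(profiles[name]),
                  reverse=True)


applicants = {
    "applicant_a": CognitiveProfile(0.9, 0.4, 0.7),
    "applicant_b": CognitiveProfile(0.6, 0.9, 0.8),
}
ranking = sort_applicants(applicants)  # applicant_b outranks applicant_a
```

The point of the sketch is not the arithmetic but the information asymmetry: the purchaser sees the ranking, the seller controls the weights, and the applicants—whose behavior supplied the inputs—see neither.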
Cory Doctorow's challenge to Zuboff's framework centers on prediction products' actual efficacy: he argues the claimed power to modify behavior is "snake oil," that advertising industry claims vastly overstate actual influence, and that the real problem is monopoly power rather than behavioral manipulation. The debate is empirically unresolved—measuring behavior modification is methodologically difficult—but the AI moment appears to vindicate Zuboff's structural diagnosis. Even if prediction products' behavioral modification power is overstated, the extraction enabling their production is real, the markets trading them are real, and the concentration of cognitive behavioral data in platform hands creates power asymmetries whose consequences extend beyond advertising efficacy into labor market dynamics, professional credentialing, and the distribution of economic opportunity.
The term appears in The Age of Surveillance Capitalism (2019), Chapter 8, where Zuboff maps the architecture of behavioral futures markets. The concept synthesizes ideas from derivatives markets (products whose value derives from underlying assets users never see), actuarial science (predictions purchased to manage risk), and algorithmic trading (computational products sold to parties optimizing for outcomes users don't control). Zuboff's innovation was recognizing that human behavior had become the underlying asset in a futures market: the products traded were predictions about what people would do, and the purchasers were parties whose interests in modifying behavior systematically diverged from the interests of those whose behavior was being predicted.
Sold to third parties, not users. This is the asymmetry defining surveillance capitalism: products manufactured from user data are sold to businesses whose interests may oppose user interests, inverting the classical market relationship.
Value depends on accuracy. Prediction products are worth more when they are more accurate, driving platforms to extract more intimate behavioral surplus and develop more sophisticated processing algorithms—the competitive dynamic intensifying extraction.
AI enables cognitive predictions. Not merely what users will click or purchase but how competently they think, how sound their professional judgment is—predictions about knowledge work capability that could reshape labor markets more consequentially than demographic profiling.
Behavioral modification is the goal. Purchasers want not merely predictions but the capacity to shape behavior—to direct clicks, purchases, beliefs, choices in ways serving commercial objectives, whether or not modification power is as effective as claimed.
Market is opaque to the predicted. Users experience prediction products' effects (targeted ads, personalized pricing, algorithmic recommendations) without understanding the classification systems producing those effects or possessing any mechanism to contest them.