Consumer sovereignty was the most flattering fiction in the history of economic thought: the market serves the autonomous preferences of freely choosing individuals. Galbraith spent decades demonstrating that this fiction was structurally inaccurate. The AI economy has produced a successor fiction — creator sovereignty. The AI-empowered builder freely chooses what to build, how to build it, and on what terms. The builder is sovereign. The platform serves. The tool amplifies autonomous creative vision. The fiction is flattering and, by Galbraithian standards, approximately as accurate as its predecessor. The builder's choices are constrained by structures the builder did not create and cannot unilaterally alter — structures that operate at the level of tool design, access pricing, terms of service, training data curation, and alignment decisions.
The platform determines what the model can and cannot do through alignment decisions, content policies, and capability limits that reflect institutional interests rather than the builder's creative vision. The pricing structure determines what level of capability the builder can afford, creating a hierarchy of access that segments the market into tiers corresponding to willingness to pay. Terms of service function as private law: written by the platform, enforced by the platform, amendable by the platform at its sole discretion. The training data determines what the model knows — what languages it speaks fluently, what perspectives are represented, what cultural contexts are legible. These curation decisions are made by the technostructure with minimal public disclosure and no democratic accountability.
The model's aesthetic tendencies — default prose style, patterns of reasoning, habitual structures of argument — shape the builder's output in ways the builder may not recognize. The Orange Pill describes passages where Claude produced text that "sounded like insight but broke under examination," where "the prose had outrun the thinking." These are not failures of the tool; they are features of a system whose design produces outputs optimized for plausibility rather than truth, for fluency rather than depth. The builder who accepts the output uncritically has not exercised sovereign judgment; the builder has accepted the system's aesthetic as a substitute for the builder's own.
The structural response is not individual resistance but institutional accountability. Transparency requirements that disclose training data biases and capability limitations — not in unread terms of service but at the moment of use. Interoperability standards that prevent platform lock-in. Algorithmic disclosure rules that make demand-creation mechanisms visible. Competitive structures that prevent consolidation of inference capability in so few hands that the "choice" of platform is a choice among a cartel's offerings. None of these institutional structures currently exists.
The fiction of sovereignty is comfortable. It is comfortable for the builder, who prefers to believe creative decisions are autonomously made. It is comfortable for the platform, which prefers to be seen as serving rather than structuring. It is comfortable for the conventional wisdom, which can celebrate capability democratization without examining the terms on which capability is accessed. Galbraith spent his career making comfortable fictions uncomfortable. The fiction of creator sovereignty is the latest in a long sequence.
The concept is introduced in Chapter 9 of the Galbraith simulation volume as a direct extension of Galbraith's consumer-sovereignty critique. Its theoretical lineage runs from Thorstein Veblen's analysis of conspicuous consumption through Galbraith's own development of the dependence effect and revised sequence, then forward through Shoshana Zuboff's analysis of behavioral futures markets and contemporary work on platform governance. The term itself is newly coined in this volume to name a pattern that has existed for several years but has lacked adequate vocabulary.
Structural constraint masquerades as free choice. The builder experiences choice within a space whose boundaries — priced, aligned, permitted — were drawn by the platform.
Private law through terms of service. The governing documents of AI platforms are written, enforced, and amended by single parties who nonetheless bind millions of users, operating as law without democratic accountability.
Aesthetic homogenization. The model's default tendencies shape output in ways builders may not recognize; the builder's voice becomes harder to distinguish from the model's defaults.
Individual resistance is insufficient. Segal's own practice of deleting polished-but-hollow passages is admirable; the millions of builders without his self-awareness remain the market in which the fiction operates unimpeded.