The bribe's effectiveness rests on three structural features. The first is totality: it encompasses not a single domain but all domains simultaneously. The AI tool does not merely make work more productive — it reshapes one's relationship to work, to rest, to creativity, to time itself. The builder who has experienced AI-augmented flow finds non-augmented work intolerably slow, the way walking feels to someone who has learned to fly.
The second feature is voluntary acceptance. The bribe is not imposed; it is offered, and the offer is so attractive that acceptance feels like choice rather than coercion. The slave knows he is enslaved. The user of an extraordinarily effective AI tool knows only that the tool works, that the work it enables is satisfying, and that continuing to use the tool is freely chosen. The freedom is experienced as real, and the structural consequence — the progressive surrender of the time and attention required to evaluate the exchange — proceeds behind the veil of voluntarism.
The third feature is the escalation of the baseline. Each benefit becomes the new minimum expectation, and withdrawal of the benefit is experienced not as return to a prior condition but as active deprivation. The engineer who has worked with Claude Code for six months finds manual coding existentially threatening, as though she has been asked to perform at a level she has outgrown. The tool has not merely augmented her productivity; it has redefined her sense of what her productivity should be, and the redefinition ratchets in one direction only.
The ratchet is what makes the bribe so difficult to refuse. Initial acceptance is genuinely free. Each subsequent period of use raises the baseline. Each raised baseline makes refusal more costly. Each higher cost makes the next acceptance less free — not because anyone forces continuation, but because the structure of the exchange has been designed, whether intentionally or emergently, to make continuation feel rational and cessation feel like failure. This is the mechanism Edo Segal describes when he confesses, in You On AI, that he could not stop building.
Mumford introduced the term most explicitly in his 1964 essay 'Authoritarian and Democratic Technics', where he asked why the age had 'surrendered so easily to the controllers, the manipulators, the conditioners of an authoritarian technics.' His answer was that the bargain took the form of a magnificent bribe — genuine goods distributed widely enough to secure voluntary compliance with systems whose operation no individual could meaningfully contest.
The insight emerged from Mumford's long study of how industrial civilization had achieved a hold over populations that the pharaohs, for all their divine authority, could not have attained. The conclusion was counterintuitive: the more democratic the distribution of benefits, the more comprehensive the system's capture of the beneficiaries.
Genuine goods. The bribe's effectiveness depends on the reality of the benefits offered; a system of pure propaganda would collapse, whereas a system of real abundance persists.
Totality of scope. The bribe operates across every domain of life simultaneously, colonizing not just the hours of use but the hours of non-use, which come to be experienced as deprivation.
Voluntarism. Acceptance feels like choice; the coercion is embedded in the structure of the exchange rather than imposed from outside.
Baseline escalation. Each benefit becomes the new minimum expectation, ratcheting the cost of refusal until refusal becomes practically impossible.
Invisibility of terms. The bribe succeeds precisely because its conditions are embedded in the texture of experience rather than announced as rules; evaluation requires a distance the bribe's totality eliminates.