The attention economy names the business model under which most digital platforms operate: providing free services in exchange for user attention, which is then sold to advertisers at prices determined by the precision with which the attention can be targeted. Gore has identified this model as the structural driver of the algorithmic pathologies degrading democratic deliberation. The incentive to maximize time-on-platform produces engagement optimization, which systematically favors content producing emotional arousal over content informing rational deliberation. The pathology is not a design bug. It is the intended consequence of the economic model, and it cannot be fixed by individual platforms without altering the competitive dynamics that make ruthless optimization the rational strategy.
Gore's analysis follows Tim Wu, Shoshana Zuboff, and the broader literature that has mapped the attention economy's architecture. Every design decision — infinite scroll, notification systems, recommendation algorithms, variable reward schedules — serves a single objective: maximize the quantity and predictability of human attention flowing through the platform. The algorithms that Gore calls digital AR-15s were not accidents. They were designed, and their designers understood the effects. The Orange Pill's account of Segal's own confession — that he built products he knew were addictive by design — locates this architecture in specific decisions by specific people operating under specific incentives.
AI supercharges the attention economy in three compounding ways. It industrializes persuasion: a single operator can generate thousands of unique, locally adapted persuasive messages in the time previously required to produce one. It perfects prediction: recommendation algorithms already anticipate user behavior with remarkable accuracy, and AI sharpens those predictions by processing more data and identifying subtler patterns. It dissolves the effort signal: AI-generated content is indistinguishable from expert content on every surface criterion citizens use to evaluate quality. The compound effect is the information ecosystem crisis described in the previous entry.
Gore's framework extends responsibility from individual builders to institutional structures. Individual builders can choose not to build manipulative systems, and that choice matters. But the systemic incentive to build them persists as long as the economic model rewarding attention extraction remains in place. The democratic response therefore requires systemic intervention: changes in the economic model funding digital platforms, in the regulatory framework governing algorithmic systems, and in the educational infrastructure preparing citizens to navigate algorithmically curated environments. These are democratic choices about what kind of information environment a self-governing society requires.
The policy tools exist: transparency requirements for algorithmic systems, interoperability mandates reducing platform lock-in, data portability rights shifting power from platforms to users, public-interest obligations for platforms functioning as essential communication infrastructure, and educational programs developing the critical media literacy that democratic citizenship now requires. Each has precedent in existing regulatory frameworks. The obstacle is not the absence of solutions but the presence of powerful interests that benefit from the absence of regulation.
The attention economy as a diagnostic concept has roots in Herbert Simon's 1971 observation that a wealth of information creates a poverty of attention. The framework was developed through the 2010s by researchers and critics including Tim Wu, Tristan Harris, and Shoshana Zuboff, and was integrated into Gore's political analysis during the updating of The Assault on Reason for its 2017 edition.
Business model as cause. The pathologies of algorithmic curation are produced by the attention-extraction business model, not by individual design choices within it.
Engagement optimization. Maximizing time-on-platform requires optimizing for emotional arousal, which systematically favors content that undermines democratic deliberation.
Competitive lock-in. Individual platforms cannot unilaterally reform without losing competitive position to platforms continuing to optimize ruthlessly; the intervention must be systemic.
AI supercharge. Generative AI industrializes persuasion, perfects prediction, and dissolves effort signals — compounding the attention economy's democratic costs.
Policy toolkit available. Transparency, interoperability, data portability, and public-interest obligations are established regulatory mechanisms that could be applied to platforms if political will existed.