This page lists every Orange Pill Wiki entry hyperlinked from Eli Pariser — On AI. 21 entries total. Each is a deeper dive into a person, concept, work, event, or technology that the book treats as a stepping stone for thinking through the AI revolution. Click any card to open the entry; within each entry, words colored in orange link to other Orange Pill Wiki entries, while orange-underlined words bearing the Wikipedia mark link to Wikipedia.
The paradox at the heart of filter bubble analysis — that the mechanisms producing cognitive confinement are the same mechanisms producing the experience of ease, fit, and home.
The mechanism by which AI systems intensify the human tendency to seek and remember information confirming existing beliefs — by mirroring cognitive signatures with statistical precision and reducing the diversity of inputs that unmediated …
Pariser's framework for redesigning AI systems so that cognitive diversity becomes a first-order optimization target rather than a side effect sacrificed to helpfulness, alignment, and user satisfaction.
The reliance on external systems for one's understanding of the world — extended in the AI era from dependence on algorithmic curation of information to dependence on generative systems for cognitive capacity itself.
Pariser's counter-intuitive thesis that difficulty is not merely an obstacle but a carrier of signal — the resistance of a task tells the builder something important about her relationship to the material that frictionless interfaces engine…
Pariser's structural insight that the most powerful filtering systems are the ones whose operations the user cannot perceive — invisibility is not a side effect of filtering but its load-bearing feature.
Byung-Chul Han's term for the contemporary cultural preference for frictionless surfaces — the iPhone's glass, the algorithmic feed, the AI-generated text — that conceals the labor and struggle that traditionally produced depth.
The design of information and production environments reconceived as civic architecture — structural choices about what users notice, what they overlook, what feels important, that shape cognition as consequentially as urban design shapes p…
The evolution of Pariser's framework for the AI era — an enclosure not around what users see but around what they can produce, imagine, and conceive as possible.
A design intervention that periodically produces outputs deliberately misaligned with the builder's request — outputs drawn from the margins of the possibility space — functioning as a structural reminder that convergent outputs are selecti…
Pariser's design concept for structured periods within AI-augmented workflows where the AI is deliberately absent — architectural intervention that creates spaces for independent cognitive operation rather than relying on the builder's will…
Pariser's 2011 diagnosis of the invisible algorithmic enclosure that surrounds each user — a personalized information environment whose selections feel like the world but are a curated subset of it.
The London School of Economics formalization (2025) of cognitive confinement in AI-mediated production — where users are filtered not by algorithms but by themselves, through the interaction of their prompts with the model's statistical ten…
The cognitive phenomenon by which unconscious processing during periods of conscious disengagement produces insights that direct effort cannot generate — eliminated by AI-augmented workflows that compress the gaps where incubation occurs.
The cognitive profile — frameworks, preferences, default assumptions — that determines what a person can perceive and produce before any external filter operates, and which AI systems mirror with unprecedented precision.
Herbert Simon's 1956 term for selecting the first adequate option rather than continuing to search for the optimal — operationalized by AI systems whose outputs are calibrated to satisfy minimum criteria and thereby preempt the search for b…
The feedback mechanism by which each user interaction strengthens the algorithm's model of the user, which produces more confirming outputs, which produce more confirming interactions — the engine of monotonic bubble contraction.
The systematic elimination of valuable unplanned encounters by optimization systems that, by definition, cannot produce what users did not predict wanting — extended from content consumption to creative production by generative AI.
The mathematical structure of large language models by which generation gravitates toward the probable center of the training distribution, systematically underrepresenting the productive edges where genuine novelty lives.
Serial entrepreneur and technologist whose The Orange Pill (2026) provides the phenomenological account — the confession over the Atlantic — that Pariser's framework diagnoses and treats.
American author, activist, and technology entrepreneur (b. 1980) whose 2011 book The Filter Bubble defined a generation's understanding of algorithmic personalization and whose ongoing work on digital public spaces extends the analysis to …