CONCEPT

Programmed Visions

Chun's 2011 thesis: software does not merely process information but programs perception, shaping what registers for users as visible, possible, and normal through interface architecture and default settings.

Programmed visions are the structured perceptual habits that software produces in its users. Every medium shapes what its audience can see, but software does so with a specificity that earlier media could not approach: by organizing the space within which thinking occurs, by determining which options appear as defaults and which require effort to access, by structuring information in ways that feel neutral but carry the assumptions of the system's designers. The user who works within a software environment does not simply use the software—the user sees through the software, the way a person wearing tinted glasses sees through the tint. The color of the world shifts, and the shift becomes invisible because it is total. This is not manipulation in the propaganda sense; it is environmental—a shaping of perception through the architecture of interaction.

In the AI Story


Chun's concept emerged from her analysis of how early internet culture promised transparency ("information wants to be free") while actually constructing highly mediated environments. The transparency was ideological; the mediation was infrastructural. Software appeared to give users direct access to information, but the interface—the search algorithm, the database query, the visualization schema—shaped what information appeared, in what order, with what emphasis. The shaping was not secret; it was structural, embedded in design choices that felt technical rather than political. Programmed visions names the cumulative effect: users' sense of what exists, what matters, and what counts as normal is shaped by the architectures they inhabit.

Applied to AI, the concept undergoes a significant mutation. Pre-AI software programmed vision by displaying information—selecting what to show from an existing corpus. AI systems program vision by generating information—producing outputs that did not exist before the model produced them. The shift from curation to generation is a shift in the depth of the programming. The builder who sees a curated list sees someone else's priorities made visible. The builder who sees generated code, generated prose, generated design sees the model's training distribution made manifest—not as a representation of the world but as a produced version of possible worlds, carrying the model's biases, blind spots, and characteristic tendencies as constitutive features of the output.

The prompted imagination—the sense of what is possible, buildable, worth attempting—becomes over time a trained imagination, shaped by the range of outputs the model characteristically produces. The builder does not notice the solutions the model never generates, the approaches that fall outside the training distribution, the unconventional responses a human collaborator might offer. The absence is invisible because it is structural—not a gap in any particular output but a feature of the statistical architecture that produces all outputs. The builder's sense of unlimited possibility is itself a programmed vision: produced by a medium that is, like all media, both enabling and constraining at once.

Origin

The concept built on Chun's earlier work in Control and Freedom (2006) and reached its fullest statement in Programmed Visions: Software and Memory (2011). Chun's training in systems design engineering at Waterloo and her Princeton doctorate in comparative literature produced the specific dual competence the concept required: technical literacy sufficient to read code architectures and cultural-theoretical sophistication sufficient to read those architectures as ideological forms. The book's title encodes the dual claim: visions are programmed (technical), and programming produces visions (cultural).

The concept's immediate genealogy includes Lev Manovich's analysis of software as cultural form, Alexander Galloway's work on protocol, and Friedrich Kittler's media archaeology. What distinguished Chun's intervention was the insistence on memory as software's organizing principle—software is a medium of memory before it is a medium of processing, storing inscriptions of past actions to be executed in the future. Applied to AI, this insight becomes: the model does not think; it remembers, recombining patterns from training data rather than producing genuinely novel solutions. The prompted builder is collaborating not with an intelligence but with a vast, statistically organized memory of what other builders have already done.

Key Ideas

Software shapes perception through architecture. Not through overt persuasion but through the structure of interaction—which options appear, which require effort, which remain invisible—the interface programs what the user sees as possible and normal.

From display to generation. Pre-AI software programmed vision by curating existing information; AI programs vision by producing information that reflects its training data's patterns, biases, and blind spots.

The prompted imagination is a trained imagination. Builders who work daily with AI tools develop a sense of the possible shaped by the model's characteristic outputs—learning to imagine within the space the tool can service while losing visibility of what lies beyond it.

Memory, not intelligence. Software stores inscriptions of past actions encoded to execute in the future; AI models recombine remembered patterns from training corpora, producing outputs that reflect history rather than genuinely novel futures.

The absence is invisible. The builder cannot see what the model does not generate—the solutions outside the training distribution, the approaches the architecture renders unlikely—because absence leaves no trace in the presented output.

Further reading

  1. Wendy Hui Kyong Chun, Programmed Visions: Software and Memory (MIT Press, 2011)
  2. Lev Manovich, Software Takes Command (Bloomsbury, 2013)
  3. Alexander Galloway, Protocol: How Control Exists After Decentralization (MIT Press, 2004)
  4. Friedrich Kittler, "There Is No Software," in Literature, Media, Information Systems (1997)
  5. Matthew Fuller, Software Studies: A Lexicon (MIT Press, 2008)
  6. Safiya Umoja Noble, Algorithms of Oppression: How Search Engines Reinforce Racism (NYU Press, 2018)