Sourcery (Chun's Concept) — Orange Pill Wiki
CONCEPT

Sourcery (Chun's Concept)

Chun's term for software's illusory transparency—the promise of direct access to a source (information, data, code) while actually mediating, shaping, and transforming what the user encounters.

Sourcery, a deliberate pun on "sorcery" and "source," names the ideological operation by which software presents itself as a neutral window onto information while actually functioning as an active mediator. The word processor appears to give the writer transparent access to the text, but the interface (font rendering, autocorrect, formatting defaults) shapes the writing. The search engine appears to give the user transparent access to information, but the algorithm (ranking, personalization, commercial prioritization) shapes what is encountered. The AI coding assistant appears to give the builder transparent access to the solution space, but the model (training data, architectural biases, statistical tendencies) shapes what solutions are generated. The user experiences direct access; the architecture exercises comprehensive mediation. The transparency is the illusion that conceals the shaping. The source the user believes they are accessing has been produced by the software claiming merely to display it.
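The search-engine case above can be made concrete with a deliberately simplified toy. The sketch below is hypothetical throughout (no real search engine's API or weights are depicted): a function whose interface promises neutral retrieval while its body silently blends relevance with a commercial weight the caller never sees.

```python
# Toy illustration of sourcery: an interface that promises transparent
# access while silently shaping what the user encounters.
# All names, documents, and weights are hypothetical.

from dataclasses import dataclass

@dataclass
class Document:
    title: str
    relevance: float    # how well the document matches the query
    sponsor_bid: float  # hidden commercial weight, invisible to the user

def search(query: str, corpus: list[Document]) -> list[str]:
    """Appears to return 'the most relevant documents'.

    Actually ranks by a blend of relevance and a sponsor bid the caller
    never sees: the 'source' the user receives has been produced by the
    ranking, not transparently accessed.
    """
    ranked = sorted(
        corpus,
        key=lambda d: 0.6 * d.relevance + 0.4 * d.sponsor_bid,
        reverse=True,
    )
    return [d.title for d in ranked]

corpus = [
    Document("Independent review", relevance=0.9, sponsor_bid=0.0),
    Document("Sponsored product page", relevance=0.5, sponsor_bid=1.0),
]

# The user sees only an ordered list; the mediation leaves no trace.
results = search("best widget", corpus)
print(results)  # the sponsored page outranks the more relevant review
```

Here the less relevant sponsored page scores 0.7 against the independent review's 0.54, so it appears first, and nothing in the function's signature or output reveals that this shaping occurred.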

In the AI Story


Chun developed sourcery as a critique of the internet's early transparency ideology—the belief that digital networks would eliminate mediating institutions (publishers, broadcasters, gatekeepers) and give individuals direct access to information. The ideology was sincere and produced real effects (disintermediation, democratization of publishing). But Chun demonstrated that the apparent elimination of mediation was actually a transformation of mediation: from visible institutional gatekeepers to invisible algorithmic curators. The mediation became more comprehensive as it became less visible. Sourcery names this double movement: the promise of transparency and the reality of total mediation operating simultaneously.

Applied to AI, sourcery illuminates the builder's relationship to generated outputs. The builder describes a problem in natural language; the model returns a solution; the transaction feels like direct connection between human intention and computational result. But the model mediates comprehensively—selecting from training data, applying architectural biases, generating outputs reflecting learned patterns rather than exploring the full solution space. The builder sees the solution the model produces; the builder does not see (cannot see) the solutions the model did not produce, the approaches outside the training distribution, the unconventional options a human collaborator with different experiences might have suggested. The access feels direct; the mediation is total.
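The asymmetry described above (the builder sees what the model produces, never what it cannot produce) can be sketched as a toy sampler, with entirely hypothetical data, that recombines only patterns present in its training set, weighted by their frequency:

```python
# Toy illustration: a generator that samples only from what it has seen.
# Approaches absent from the 'training data' can never be produced,
# however well they might fit the problem. All data is hypothetical.

import random
from collections import Counter

training_solutions = [
    "iterate with a for loop",
    "iterate with a for loop",
    "iterate with a for loop",
    "use a vectorized library call",
]

def generate(rng: random.Random) -> str:
    """Samples learned patterns in proportion to their frequency."""
    counts = Counter(training_solutions)
    patterns = list(counts)
    weights = [counts[p] for p in patterns]
    return rng.choices(patterns, weights=weights, k=1)[0]

rng = random.Random(0)
outputs = {generate(rng) for _ in range(1000)}

# Common patterns dominate the outputs; an approach outside the
# training set ("use recursion", say) cannot appear at all.
assert "use recursion" not in outputs
```

The sketch compresses real model mechanics drastically, but the structural point survives compression: the output space is bounded by the training distribution, and that boundary is invisible from inside any single interaction.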

The Deleuze error Segal describes—Claude producing a philosophically incorrect reference that "sounded like insight"—is sourcery operating at the level of individual outputs. The passage worked rhetorically, felt authoritative, carried the aesthetic of genuine scholarship. The aesthetic concealed the fabrication. Segal caught this instance because he happened to know Deleuze's actual framework. How many similar fabrications—references to disciplines Segal does not know, connections to fields outside his expertise—has he not caught? The question is unanswerable, which is precisely the problem: sourcery produces outputs that look like they came from legitimate sources while actually coming from statistical recombination of the training corpus. The builder cannot reliably distinguish the genuine from the fabricated because both carry the same surface markers of authority.

Chun's framework does not propose abandoning AI tools—that would be the Upstream Swimmer's futile refusal. It proposes seeing through the sourcery: recognizing that the apparent transparency is architectural illusion, that every AI output is mediated by training data and statistical mechanics, that the builder collaborating with Claude is not accessing a source but engaging with a system that produces the appearance of sources through recombinatory generation. The seeing-through does not eliminate the tool's value; it calibrates the builder's trust to match the tool's actual reliability, which is intermittent, domain-dependent, and never as total as the fluency of the output suggests.

Origin

Chun coined the term in Programmed Visions (2011) as a portmanteau deliberately evoking both "source code" and "sorcery"—technical and magical at once. The concept builds on Derrida's critique of presence in Of Grammatology (the trace that presents itself as origin) and on Kittler's media archaeology (all media are mediations, even those claiming transparency). Chun's innovation was applying this deconstructive insight to the specific architectures of software, demonstrating that code's claim to be a neutral processor of information is ideological—software shapes, selects, and produces what it claims merely to transmit.

Key Ideas

Transparency as ideological claim. Software presents itself as a neutral window onto information while actually mediating comprehensively—the claimed transparency conceals the shaping, making the mediation invisible by denying it exists.

The source is produced, not accessed. What the user encounters as a source (search results, generated code, AI-written prose) has been made by the software claiming to provide transparent access to it—fabrication disguised as discovery.

Fluency conceals mediation. The smoother the output, the more authoritative the tone, the harder it becomes to detect where the model's recombinatory generation diverges from genuine source material—the aesthetic of correctness masks the statistical construction.

Undetectable fabrications. The builder cannot reliably distinguish AI-generated content that accurately reflects sources from content that fabricates them, because both carry the same surface markers (citations, technical vocabulary, rhetorical coherence).

Seeing through requires discipline. Recognizing the mediation while using the tool demands continuous skeptical attention—questioning fluent outputs, verifying references, comparing generated solutions against independent sources—a labor most users cannot sustain.

Further reading

  1. Wendy Hui Kyong Chun, Programmed Visions: Software and Memory (MIT Press, 2011)
  2. Jacques Derrida, Of Grammatology (Johns Hopkins, 1976)
  3. Friedrich Kittler, Gramophone, Film, Typewriter (Stanford, 1999)
  4. Alexander Galloway and Eugene Thacker, The Exploit: A Theory of Networks (Minnesota, 2007)
  5. Tarleton Gillespie, "The Relevance of Algorithms," in Media Technologies (MIT Press, 2014)
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.