Exemplification — Orange Pill Wiki
CONCEPT

Exemplification

A symbol exemplifies properties it possesses by highlighting them—a tailor's swatch refers to fabric color not by denoting it but by being that color and directing attention to it.

Exemplification is Goodman's term for the referential mode in which a symbol refers to properties it literally possesses by selecting and highlighting them. A tailor's swatch exemplifies the color and texture of the fabric—it does not denote the fabric (it is a piece of the fabric, not a label for it) but refers to color and texture by possessing those properties and making them salient. Exemplification is reference running in the opposite direction from denotation: where denotation runs from a label to the things it applies to, exemplification runs from the symbol back to a label that applies to it. The swatch possesses the color red; it exemplifies redness. A painting exemplifies the density of its brushwork, the warmth of its palette, the rhythmic arrangement of its forms—not by depicting these properties (you cannot paint a picture of brushwork density) but by possessing them and directing attention to them. Exemplification is the primary mode through which aesthetic works achieve their cognitive contribution, because it is through exemplification that the formal properties of the work—the specific configuration of colors, lines, sounds, words—become referentially significant rather than merely decorative. A painting that only denoted (pointed to a landscape) without exemplifying would be aesthetically inert; the exemplificational dimension is where the work's specific symbolic resources do their cognitive work.

In the AI Story

Goodman introduced exemplification to solve a problem the copy theory of representation could not address: how do abstract or non-representational works refer? A Mondrian composition does not denote anything in the world—it is not a picture of objects or events. But it clearly refers to something; viewers perceive it as meaningful, as organized, as conveying understanding about visual structure, balance, tension. Goodman's answer: the Mondrian exemplifies properties it possesses—the orthogonality of its lines, the purity of its colors, the asymmetric balance of its composition. The exemplification makes these formal properties referentially significant, and the understanding the viewer gains is understanding of those properties themselves—how balance can be achieved without symmetry, how color can structure space without depicting objects, how abstraction can be rigorous without being arbitrary.

Exemplification is selective possession-and-display. A swatch possesses many properties—weight, flammability, cost per yard—but it exemplifies only color and texture, because those are the properties the tailor's use-context makes salient. The selection is not arbitrary; it is determined by the purposes of the worldmaking project. An AI-generated image possesses visual properties—color values, edge contrasts, compositional arrangements—but whether it exemplifies those properties depends on whether they are deployed within a project that gives them referential significance. A casually prompted image may display formal properties without exemplifying them, in the same way that a random arrangement of fabric scraps might happen to display certain colors without functioning as a swatch. The formal properties are there. The referential function is absent, because there is no worldmaking intention establishing which properties are salient and why.

The exemplificational mode is where AI-generated work most often fails Goodman's standards while passing surface tests. The work looks like it exemplifies—it displays formal properties with apparent deliberateness, it organizes them into structures that appear coherent. But the appearance of exemplification is not exemplification, because exemplification requires that the possessed properties be selected for highlighting by a worldmaker with purposes. The selection is what makes the properties referentially significant rather than merely present. AI generates; it does not select in the relevant sense. The probability distribution determines which properties appear, but the determination is statistical, not purposeful. The output displays properties that training data correlates with the prompt, but it does not exemplify them—does not possess them with the specific intention of directing attention to them for reasons grounded in a worldmaking project. The exemplificational function, which Goodman identified as central to aesthetic cognition, is therefore exactly where the gulf between human craft and machine rendering opens widest.

Origin

Exemplification was introduced in Languages of Art (1968), Chapter II, as Goodman's solution to the problem of how non-representational art refers. The concept built on his earlier work in The Structure of Appearance on the logic of part-whole relations and property ascription. The key insight—that possession-and-highlighting is a distinct referential mode, irreducible to denotation—transformed aesthetics by providing a formal account of how abstract art, musical performance, and literary style achieve cognitive content. The concept has been extended by Catherine Z. Elgin and others into epistemology generally, where it illuminates how examples function in scientific reasoning and pedagogy.

Key Ideas

Reference through possession. Exemplification refers to properties the symbol literally possesses—the swatch is red and thereby refers to redness by being that color and making it salient.

Selection is constitutive. Exemplification requires selective highlighting—the swatch possesses many properties but exemplifies only those made salient by the use-context; selection requires a worldmaker with purposes.

Non-representational art exemplifies. Abstract works refer not by denoting objects but by exemplifying formal properties—balance, rhythm, density—yielding understanding of those properties themselves.

AI displays without exemplifying. AI-generated works may possess and display formal properties without genuinely exemplifying them, because exemplification requires purposeful selection that statistical generation cannot provide.

Appears in the Orange Pill Cycle

Further reading

  1. Nelson Goodman, Languages of Art, Chapter II (Bobbs-Merrill, 1968; 2nd ed. Hackett, 1976)
  2. Catherine Z. Elgin, 'Understanding: Art and Science,' Midwest Studies in Philosophy 16 (1991)
  3. Stephanie Ross, 'How Words Hurt: Attitude, Metaphor, and Oppression,' in Kelly Oliver and Stephanie Ross, eds., Re-Reading the Canon (Penn State, 1992)
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.