Intentionality (Searle) — Orange Pill Wiki
CONCEPT

Intentionality (Searle)

Not intention in the everyday sense, but the philosophical property by which mental states are directed toward or about objects and states of affairs in the world — the aboutness that, Searle insisted, formal computation does not possess.

When a person believes it is raining, the belief is about the rain. When a person fears the dark, the fear is directed toward the darkness. When a person understands "the cat is on the mat," the understanding is about a specific spatial relationship between a specific animal and a specific object. This property — directedness, aboutness, reference — is not incidental to mental life. In Searle's framework, it is the defining feature. Consciousness without intentionality would be a light that illuminates nothing. Intentionality is what gives consciousness its content, what makes it consciousness of something, what connects the inner life of mind to the outer life of world. Searle's central claim about AI was that computational systems do not possess intentionality — they process symbols whose aboutness is assigned externally by human designers and interpreters, not intrinsically by the processing itself.

In the AI Story


Searle drew a distinction between intrinsic intentionality and as-if intentionality. Intrinsic intentionality is the real thing — the genuine directedness of a conscious mind toward its objects. A person's belief that it is raining is intrinsically intentional. As-if intentionality is the attributed version — the intentionality that observers project onto systems whose behavior resembles that of intentional agents. The thermostat "wants" to maintain temperature. The chess computer "thinks" about its move. The language model "understands" the question. In each case, the intentional vocabulary describes behavior from the observer's perspective; it does not describe anything happening inside the system.
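The gap between an as-if description and the mechanism it describes can be made vivid with a sketch. Below is a complete thermostat in a few lines (an illustrative toy, not anything from Searle or the source text): the entire "wanting" reduces to one comparison and one branch, and the intentional vocabulary belongs to the observer, not to the loop.

```python
# A complete thermostat. Nothing in this code is about warmth or
# comfort; the "desire" to maintain temperature is one comparison
# and one branch. The wanting exists only in how an observer
# describes the loop's behavior.
# (Illustrative sketch; names are invented for this example.)

def thermostat_step(current_temp: float, setpoint: float) -> str:
    """Return the heater command for one control cycle."""
    if current_temp < setpoint:
        return "heater_on"
    return "heater_off"

print(thermostat_step(18.0, 21.0))  # -> heater_on
print(thermostat_step(23.0, 21.0))  # -> heater_off
```

Nothing more is hidden inside a real thermostat; the richer-sounding intentional idiom ("it wants the room at 21 degrees") adds predictive convenience for the observer, not content for the device.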

The distinction seems pedantic until one considers what follows from collapsing it. If as-if intentionality is treated as equivalent to intrinsic intentionality, then intentionality has been defined behaviorally, and the distinction between a mind genuinely directed toward the world and a mechanism that merely responds to stimuli has been erased. The erasure has consequences: any system complex enough to produce intentional-looking behavior becomes intentional, and understanding, belief, desire, and every other mental state are reduced to patterns of behavior that any sufficiently sophisticated system can exhibit.

Applied to large language models, the distinction cuts through the most common confusions of the AI discourse. When Claude produces a passage analyzing a philosophical text, the passage appears to be about the text. Every feature suggests that the processing is directed toward the text in the way a philosopher's analysis is. Searle's framework says: none of this follows from the output. The processing that produced the output was not directed toward the philosophical text as an object of comprehension. It was directed, if "directed" is even the right word, toward the statistical prediction of the next token. The tokens happen to encode philosophical content. The system does not know this.
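What it means for processing to be "directed toward the statistical prediction of the next token" can be sketched schematically (a toy illustration with hypothetical probabilities; real models compute these scores with a neural network, not a lookup table): the mechanism selects among scored strings, and the philosophical content of those strings exists only for the human reader.

```python
# Schematic of greedy next-token selection. The mechanism sees
# scores attached to strings; that one continuation happens to
# encode philosophical content is invisible to the selection step.
# (Illustrative sketch; the probabilities are hypothetical.)

next_token_scores = {
    "Descartes": 0.41,  # hypothetical probabilities
    "the": 0.33,
    "banana": 0.02,
}

def predict_next(scores: dict) -> str:
    """Greedy decoding: return the highest-scoring token."""
    return max(scores, key=scores.get)

print(predict_next(next_token_scores))  # -> Descartes
```

The point of the sketch is negative: whatever aboutness the output has for the reader, the operation that produced it is a maximization over scores.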

In The Orange Pill, Edo Segal describes intelligence as "a force of nature flowing through increasingly complex channels." The river metaphor implies continuity of kind — that the intelligence flowing through AI systems is the same kind of thing that flows through human minds. Searle's framework on intentionality challenges this naturalization. The intelligence flowing through human minds is characterized by intentionality, by aboutness. The "intelligence" flowing through AI systems is characterized by syntactic processing. These are not two channels of the same river; they are two categorically different phenomena that produce similar-looking outputs. The similarity is what the river metaphor captures. The difference is what the river metaphor obscures.

Origin

The concept of intentionality as directedness toward objects was revived for modern philosophy by Franz Brentano, who in 1874 called it "the mark of the mental": the property that distinguishes mental phenomena from physical ones. Edmund Husserl later developed it as the cornerstone of phenomenology.

Searle's specific contribution was to develop intentionality in Intentionality: An Essay in the Philosophy of Mind (1983) as a tool for distinguishing genuine mental states from their computational simulations. He argued that intrinsic intentionality is a biological phenomenon produced by causal properties of the brain — properties that silicon systems running programs do not instantiate.

Key Ideas

Aboutness is the mark of the mental. Mental states have content — they are about things. Computational states process symbols whose content is assigned externally; they are not about anything from the system's perspective.

Intrinsic vs. as-if intentionality. Conscious minds possess intrinsic intentionality — real directedness toward the world. Mechanisms exhibit as-if intentionality — behavior that observers describe in intentional terms for convenience. Collapsing the distinction erases the reality of mind.

Thermostats don't want. The thermostat "wants" to maintain temperature only in the sense that describing it that way helps humans predict its behavior. The wanting is in the description, not in the thermostat. Large language models "understand" questions in exactly this sense.

The river metaphor carries a hidden commitment. Describing AI intelligence as a channel of the same river that flows through human minds presupposes continuity of kind. Searle's framework insists on discontinuity: different phenomena producing similar outputs.

The amplifier metaphor survives only if the signal and the amplifier differ in kind. If the machine also carried meaning of its own, the distinction would collapse and the human contribution would become merely supplementary. Searle's analysis of intentionality protects the metaphor by insisting that the difference is real even when it is invisible in the output.

Further reading

  1. John Searle, Intentionality: An Essay in the Philosophy of Mind (Cambridge University Press, 1983)
  2. Franz Brentano, Psychology from an Empirical Standpoint (1874; English translation 1973)
  3. Edmund Husserl, Logical Investigations (1900-1901)
  4. Daniel Dennett, The Intentional Stance (MIT Press, 1987)
  5. John Searle, The Rediscovery of the Mind (MIT Press, 1992)
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.