Situated meaning is Gee's foundational concept for distinguishing decontextualized knowledge (what textbooks provide, what AI can generate on demand) from the rich, textured, experientially grounded understanding that develops through practice in specific contexts. A computer science student who has read about constraints knows the definition. A developer who has spent three weeks watching a scheduling system fail because she misunderstood how constraints interact possesses situated meaning — knowledge of how constraints behave, how they cascade, what they feel like when they're about to break. Situated meaning is more durable than abstract knowledge, more transferable across contexts (counterintuitively), and more generative, because it supports the kind of judgment that lets a practitioner recognize novel problems as instances of familiar patterns.
Gee developed situated meaning through his study of literacy, observing that reading and writing are not abstract skills separable from the Discourses within which they are practiced. A person who can read scientific papers cannot necessarily read legal briefs. The situated meaning of "evidence" differs across these domains — not because the word means different things but because competent practice requires the textured understanding of how evidence functions in the specific context of use. The meaning is inseparable from the practice.
AI disrupts situated meaning in a specific, identifiable way. When Claude writes the constraint system, the developer who directed Claude has not acquired the situated meaning that implementation would have produced. She may acquire some situated meaning — about how to describe constraints clearly, how to evaluate constraint implementations, how to recognize when output diverges from intention. This is genuine situated learning, embedded in the emerging Discourse of AI-augmented practice. But it is the situated meaning of directing, not the situated meaning of doing, and the two are not interchangeable.
Gee's 2024 work on cybersapien literacy acknowledges this directly. The cybersapien practitioner develops genuine situated understanding of how to collaborate with AI — a new Discourse with its own identity kit, practices, and forms of situated meaning. But Gee also warned, in a less-cited dimension of the same framework, that uncritical AI use could produce what he called frozen language: language that looks right but lacks the situated meaning that would allow the user to adapt it to new contexts, recognize when it fails, or revise it when circumstances change.
The Deleuze error Segal describes in The Orange Pill is a precise instance of frozen language. Claude produced a passage connecting Csikszentmihalyi's flow concept to Deleuze's smooth space — elegant, well-structured, philosophically wrong. The passage deployed the correct names and argumentative structure but lacked the situated meaning that would have caught the category error. Segal caught it because his own situated understanding of the relevant Discourses was deep enough to register that something was off — a pre-articulate sense of wrongness that developed through years of engaging with the material and that no amount of frozen language can replicate.
Gee articulated the concept in Social Linguistics and Literacies (1990) and elaborated it across An Introduction to Discourse Analysis (1999) and subsequent works. The concept drew on Vygotskian cognitive psychology, pragmatist philosophy of language, and sociocultural theory of learning — a synthesis that placed Gee among the founders of the situated cognition movement in learning science.
Meaning is embedded in contexts of use: the dictionary definition is a starting point, not the thing itself.
Situated meaning is more transferable than abstract knowledge. Paradoxically, knowledge grounded in specific experience adapts more flexibly to new situations than knowledge that was always abstract.
Directing is not doing. The situated meaning of using AI differs from the situated meaning of performing the underlying practice.
Frozen language is the characteristic AI failure mode: output that deploys correct vocabulary and structure without the situated understanding that would make it genuinely responsive to context.
Domain depth detects frozen language: only practitioners with situated meaning in the relevant Discourse can reliably recognize when fluent output is hollow.
The question of whether AI-collaborative practice produces situated meaning as deep as pre-AI practice remains unresolved, and it may be answered differently in different domains. In domains where AI handles implementation entirely, the situated meaning of directing may not substitute for the situated meaning of doing. In domains where practitioners continue to encounter the underlying reality — medicine, engineering, research — AI may augment situated meaning rather than displace it. The outcome depends on the specific structure of the practice and on the deliberate design of the environment within which AI is deployed.