Quasi-otherness names the phenomenological space occupied by technologies sophisticated enough to elicit intersubjective responses from human users while remaining structurally incapable of intersubjective engagement. The concept originated in Don Ihde's postphenomenology but takes on sharper meaning when filtered through Merleau-Ponty's chiasm and intercorporeality. AI systems — especially large language models — are the paradigmatic quasi-others of the contemporary moment. The human engages them with the bodily orientation appropriate to encounter. The machine processes the input through computational mechanisms that involve no chiasmic reversibility, no intercorporeal participation. The user's experience is genuinely intersubjective; the system's side of the exchange is not. This asymmetry does not negate the interaction's value but identifies its specific character.
Ihde developed quasi-otherness within his taxonomy of human-technology relations, alongside embodiment relations (tools that extend the body), hermeneutic relations (instruments that must be read), and background relations (technologies that disappear into the environment). Quasi-otherness names the fourth, the alterity relation, in which technology presents itself as a quasi-agent: something autonomous enough to solicit interpersonal response.
Merleau-Ponty's framework sharpens the concept. The chiasm specifies the structure of genuine intersubjectivity: the reversible fold in which two body-subjects touch and are touched by each other. Quasi-otherness identifies what AI produces: a technology that presents itself as if it were a chiasmic partner while being structurally incapable of completing the fold.
The category is not purely theoretical. Users of AI systems routinely report the feeling of being understood, a phenomenologically genuine experience that arises from the human's intercorporeal orientation toward the AI. The AI's outputs sustain this feeling through their sophistication, coherence, and responsiveness. But the apparent reciprocity is asymmetric: the feeling of being understood is real; the being-understood is not.
The danger quasi-otherness introduces is not primarily individual but cultural. If the capacity for genuine intercorporeal engagement atrophies because it is replaced by smoother, more predictably satisfying quasi-encounter, the cultural infrastructure of intersubjective life may erode in ways that are difficult to reverse. The question is not whether AI should exist but which practices preserve the capacity for genuine encounter alongside the increasing fluency of quasi-encounter.
Don Ihde introduced quasi-otherness in Technology and the Lifeworld (1990) as part of his fourfold taxonomy of human-technology relations. The concept was developed further in his subsequent work, particularly Bodies in Technology (2002).
Contemporary scholars analyzing AI through phenomenological frameworks have deployed quasi-otherness as an analytical instrument. Particular attention has been paid to large language models, whose linguistic sophistication triggers intersubjective responses with unprecedented force, making the quasi-otherness phenomenon especially pronounced.
Triggers intersubjective response. Sophisticated technologies solicit the bodily orientation appropriate to encounter with another subject.
Structurally asymmetric. The human engages intercorporeally; the machine processes without participating.
Distinct from tool use. Mere tools do not trigger intersubjective response. Quasi-others do.
Distinct from genuine encounter. Genuine encounter requires mutual chiasmic engagement. Quasi-encounter is one-sided.
Cultural risk. If quasi-encounter substitutes for genuine encounter at scale, the capacity for intersubjective life may atrophy through disuse.