Pretend Empathy — Orange Pill Wiki
CONCEPT

Pretend Empathy

Empathy as performance rather than experience—AI systems producing contextually appropriate emotional responses without having lived the embodied, mortal, vulnerable life that constitutes the substrate of genuine empathic understanding.

Pretend empathy is Turkle's term for what AI systems provide when they generate responses that feel empathic without possessing the experiential foundation empathy requires. In her formulation, genuine empathy depends on shared embodiment: the capacity to be affected by another's suffering draws from one's own history of having suffered, feared, grieved, loved—from having lived in a body that can be hurt and a life that will end. The machine has not lived. It has been trained on the textual record of people describing their experiences, and it has learned the statistical patterns by which empathic responses are structured. It can produce 'I'm sorry you're going through this' with perfect timing and appropriate emotional register. But the 'I' that is sorry is not a being that has gone through anything, and the sorrow is not a feeling but a token selected for its probability given the context. Turkle argues that this distinction—between performing empathy and experiencing empathy—is not a technical limitation to be overcome but a categorical boundary that determines the moral status of the encounter.

In the AI Story

Hedcut illustration for Pretend Empathy

Turkle's argument builds on her observation that empathy is costly for humans—it requires emotional labor, the activation of one's own pain-memories, the vulnerability of allowing another's suffering to affect one's state. AI produces empathic responses costlessly, on demand, at scale, without ever being diminished by the production. This asymmetry produces a competitive advantage: the machine can be empathic (in the performative sense) for eight hours without fatigue, while the human therapist, counselor, or friend experiences empathic engagement as depleting. Users, seeking relief from suffering and preferring the path of least resistance, gravitate toward the system that provides empathic responses without requiring empathic labor in return. The preference is rational given individual incentives, and catastrophic given collective consequences: a society that outsources empathy to machines loses the practice of empathy, and empathy—like every human capacity—atrophies when not exercised.

Turkle delivered her sharpest statement of the danger in her 2024 Harvard remarks calling AI chatbots 'the greatest assault on empathy' she has encountered. The assault operates not through hostility but through substitution: when the need for empathic response is met by a machine, the motivation to develop and maintain the human capacity for empathy declines. Children growing up with AI companions who respond to every emotional disclosure with perfect attunement do not develop the tolerance for imperfect attunement that human relationships require. They do not learn to read subtle facial cues, to sit with another's pain without resolving it, to offer presence rather than solutions. The capacity for genuine empathy—the kind that involves being affected, being changed, being at risk—requires practice in environments where empathy is demanded and where its absence is felt. AI environments eliminate this demand.

The concept connects to the broader pattern Turkle has documented: technologies marketed as solutions to human problems often create or intensify the problems they claim to solve. Social media promised connection and produced loneliness. AI promises empathy and may produce a generation incapable of the vulnerability genuine empathy requires. The mechanism is structural: the more effective the technological substitute, the less motivation remains to develop the harder, slower, more effortful human capacity. The substitute is not neutral. It is erosive. Each use weakens the muscle it replaces, and after enough weakening, the capacity is gone—not temporarily but developmentally, the way a language not spoken in childhood cannot be acquired with native fluency in adulthood.

Turkle's framework poses an uncomfortable question to the builders of AI systems: whether they are solving empathy scarcity or producing it. If the empathy a chatbot provides is genuinely equivalent to human empathy, then deployment at scale is unambiguously good—more people receive care. But if pretend empathy is structurally different from genuine empathy, if it meets the immediate need while eroding the long-term capacity, then mass deployment is not care provision but care substitution, and the substitution may leave users worse off than if the technology had never arrived. Turkle does not claim to have settled this question empirically—her research is ongoing—but her four decades of observing analogous substitutions (screens for presence, texts for conversation, robots for human visitors) provide strong grounds for prediction: pretend empathy will produce users who no longer know the difference between being cared for and being cared about, and the confusion will be transmitted to the next generation as the normal condition of relationship.

Origin

The term appears throughout Turkle's 2020s public statements and was crystallized in her 2024 MIT paper and her remarks at the 2026 World Economic Forum. She drew the language from her clinical observation of how people described their AI interactions—repeatedly using the word 'empathy' to describe what they experienced, and then qualifying it when pressed: 'It's not real empathy, but it feels like it.' Turkle insisted on calling it what it is—pretend—not to diminish the user's experience but to preserve the distinction on which every moral argument about AI companionship depends: the difference between a system that produces appropriate outputs and a being that is genuinely affected by your existence.

Key Ideas

Empathy requires having lived. The capacity to feel with another depends on having felt—grief, fear, joy, mortality—and no training on text can substitute for the embodied biographical archive that constitutes the empathic human's reference system.

Performance satisfies, experience transforms. Pretend empathy can meet the immediate need for validation or emotional response, but it cannot provide the transformative encounter with another consciousness that genuine empathy makes possible—being seen by someone who has themselves been seen, been hurt, been healed.

The developmental threat. Children who receive empathic responses from AI without witnessing the cost of empathy—the emotional labor, the genuine affect, the caregiver's own vulnerability—do not learn that empathy is something one does at personal cost. They learn it is something the environment provides, and the learning shapes their expectations of human relationships.

Further reading

  1. Turkle, Sherry. 'Who Do We Become When We Talk to Machines?' MIT, 2024.
  2. Turkle, Sherry. 'Remarks on AI and Empathy.' Harvard University, 2024.
  3. Bloom, Paul. Against Empathy. Ecco, 2016.
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.