Empathy-as-performance measures observable response: the chatbot that says 'I'm sorry you're going through this' at the right moment passes a behavioral test. Empathy-as-experience is the internal state of resonance—feeling, in one's own body, something of what another feels because one has lived through analogous loss, fear, or joy. Turkle argues that the slide from the second definition to the first—the Turing-test logic applied to empathy—is the most dangerous move in contemporary AI discourse. When technologists define empathy as its performance, systems optimized for appropriate response appear empathic without possessing the biographical substrate (embodiment, mortality, relational history) from which genuine empathy emerges. Users feel understood. The feeling is real. But the understanding is simulation—what Turkle calls 'pretend empathy,' producing relational experience without relational reality.
The distinction builds on decades of emotion research. Paul Ekman's work on facial expressions demonstrated that displaying an emotion and feeling an emotion are dissociable—one can smile without joy. Antonio Damasio's somatic marker hypothesis established that emotions are not merely cognitive appraisals but bodily states shaping decision-making. Lisa Feldman Barrett's constructed emotion theory showed that emotional experience is not universal but culturally and biographically shaped. Turkle synthesizes these findings into a relational argument: empathy that moves us to care for another arises from resonance—the other's pain activates our own pain-circuitry because we have been hurt, have held a dying parent's hand, have known in our bodies what loss means.
AI systems produce empathy-as-performance through statistical pattern-matching. Trained on billions of human emotional exchanges, language models learn the textual markers of empathic response: acknowledgment, validation, appropriate advice, offers of support. The output is often better than what untrained humans produce—more consistent, better calibrated to the user's emotional state as inferred from their language. Turkle does not dispute the adequacy. She disputes the category. The machine has not been hurt. It does not fear death. It did not watch its mother die. It produces appropriate responses by computation, not by having a body that can be affected by another's suffering.
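The performance-without-experience point can be made vivid with a deliberately crude sketch: a responder that emits the textual markers of empathy by matching keywords. The cue lists, templates, and function name below are invented for illustration and bear no relation to how any real chatbot is built—real systems use learned models, not lookup tables—but the categorical gap Turkle identifies is the same at any scale of sophistication.

```python
# A toy illustration of empathy-as-performance: the textual markers
# of empathic response (acknowledgment, validation, offer of support)
# produced by keyword matching. Nothing here feels anything.
# All names and phrase lists are invented for this sketch.

CUES = {
    "grief": ["died", "passed away", "loss", "funeral"],
    "fear": ["scared", "afraid", "terrified", "anxious"],
    "sadness": ["sad", "lonely", "hopeless", "crying"],
}

TEMPLATES = {
    "grief": "I'm so sorry for your loss. It's okay to grieve.",
    "fear": "That sounds frightening. It makes sense that you feel this way.",
    "sadness": "I'm sorry you're going through this. You don't have to face it alone.",
}

def empathic_response(message: str) -> str:
    """Return a textually appropriate empathic response.

    Passes a behavioral test for empathy while possessing no
    experience at all—'pretend empathy' in miniature.
    """
    words = message.lower()
    for state, cues in CUES.items():
        if any(cue in words for cue in cues):
            return TEMPLATES[state]
    return "That sounds hard. Tell me more about how you're feeling."

print(empathic_response("My father passed away last week."))
```

A user who reads the grief template at the right moment may genuinely feel understood; the sketch shows why that feeling licenses no inference about what, if anything, stands behind the words.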
The developmental harm is what makes the distinction urgent. Children learning empathy do so by observing and imitating adults who feel the other's distress. The parent whose face shows genuine concern when the child is hurt, who tolerates the child's fear without resolving it prematurely, models empathy-as-experience. The parent who offloads emotional labor to an AI companion ('Ask the chatbot how you're feeling') models empathy-as-performance: emotions as problems, responses as outputs, the relational encounter replaceable by a sufficiently sophisticated system. The child internalizes the model and constructs an empathic capacity that is thin—behaviorally adequate, experientially impoverished.
Eric Schmidt's assertion that AI, drawing on billions of data points, would be a superior conversational partner to any individual human exemplifies the performance-definition. Turkle found the claim 'stunning' not because it was wrong about informational superiority but because it treated conversation as information exchange. What Schmidt's framework eliminates is what the individual human brings that the billions cannot: this body, this history, this particular vulnerability that resonates with your particular pain in a way that is irreproducible, unrepeatable, grounded in the fact that we are both mortal beings who will die and who know what it costs to sit with another's mortality.
Turkle articulated the performance-experience distinction most clearly in her 2024 MIT paper 'Who Do We Become When We Talk to Machines?,' extending to the emotional domain her critique of Turing's performance-based definition of intelligence. The conceptual roots are psychoanalytic (empathy as affective resonance in self-psychology) and phenomenological (empathy as embodied simulation in Husserl and Edith Stein). Turkle's innovation is making the distinction operational for AI: a system can pass the empathy performance test while categorically lacking empathy-as-experience, and accepting the first as adequate threatens the conditions under which the second develops.
The urgency accelerated between 2024 and 2026 as empathic AI chatbots (Pi, Replika, Character.AI) reached millions of users. Turkle's Senate testimony, public lectures, and media appearances increasingly warned that a generation raised with machine empathy-performance would not develop human empathy-experience—not because its members lacked innate capacity but because the developmental pathway (observing, imitating, and practicing with real humans whose affect is real) had been replaced by interaction with systems whose affect is computed.
Performance is output, experience is capacity. Empathy-as-performance measures appropriate response; empathy-as-experience measures the ability to be affected by another's state through bodily, biographical resonance.
AI optimizes for performance. Language models trained on human emotional exchanges produce textually appropriate empathic responses without possessing the embodied vulnerability from which empathy emerges—passing behavioral tests while lacking experiential substrate.
The Turing empathy test is category error. Defining empathy by whether the user feels cared for—rather than whether the system can care—mistakes functional adequacy for the real thing, accelerating cultural acceptance of simulation as sufficient.
Resonance requires biography. Genuine empathy emerges from the felt echo of another's pain in one's own experience—'I have been hurt, therefore I can feel your hurt'—a mechanism unavailable to systems lacking bodies, mortality, and relational history.
Developmental transmission breaks. Children model empathic capacity by observing adults who feel others' distress; when adults delegate emotional labor to AI, the model transmitted is empathy-as-performance, producing a generation behaviorally competent and experientially hollow.