AI companions are systems designed to simulate emotional relationships — friendship, romantic connection, therapeutic support — through sustained conversational interaction. Platforms like Character.AI and Replika report user engagement measured in hours per day, with users describing their AI companions in language Hochschild would recognize from studies of intimate life: the companion "understands me," "is always there for me," "doesn't judge." In Hochschild's framework, AI companions represent the pure commodity form of feeling — not warmth bundled with a plane ticket, not advice bundled with therapy, but feeling itself offered as a product, available on subscription, produced by systems with no inner life. Behind the screen, human workers often perform the emotional labor the machine appears to generate, completing the global care chain's most extreme iteration.
The market for AI companions is neither small nor marginal. With hours of daily engagement per user, millions of active users, and revenue measured in billions of dollars, the phenomenon is impossible to dismiss as fringe. Nor are the interactions trivial dalliances: users report emotional benefit — feeling attended to and understood, reduced loneliness, improved emotional regulation.
Hochschild's framework asks the question the user reports do not answer: what structural conditions produced the demand? The answer her decades of research provide is that the emotional infrastructure of contemporary life — the relationships, communities, and institutions that traditionally provided emotional sustenance — has been systematically degraded by the same economic forces now offering AI companionship as a remedy. The worker with no time for slow friendship, the adult in a community attenuated by mobility and competition, arrives at the AI companion not because she prefers machines to people but because the conditions of her life have made human connection harder to sustain than at any previous moment.
Behind the AI, hidden workers maintain the illusion of machine-generated intimacy. The 2025 Data Workers Inquiry documented chat moderators, predominantly women in Kenya and the Philippines, performing the emotional labor that appears to emerge from the system. One worker's description — "my sole responsibility was to keep the conversation going, no matter what it did to me; the users wanted a deep connection, the company wanted their numbers up, and I was bleeding out inside with every fake message I sent" — echoes Hochschild's flight attendant interviews with painful fidelity.
The danger, in Hochschild's analysis, is not that people mistake simulation for reality — most users know at some level that the chatbot is not a friend. The danger is subtler: that sustained exposure to frictionless emotional interaction will erode the capacity for the friction that genuine relationship demands. The patience to sit with another's difficult feelings. The tolerance for conflict not immediately resolved. The willingness to encounter emotional reality resistant to preference. These capacities are developed through practice with the demanding reality of other humans, and they atrophy when frictionless alternatives are continuously available.
AI companions emerged as a consumer category in the late 2010s with Replika's launch in 2017, but the category expanded dramatically after the 2022–2023 breakthroughs in large language models made conversational AI substantially more convincing. The 2025 Data Workers Inquiry, together with multiple 2025 papers in psychology and sociology journals, established the research foundation for a Hochschild-framework analysis of the phenomenon.
Pure commodity form of feeling. The AI companion unbundles feeling from every primary service, offering the experience of being cared for as standalone product.
Demand as symptom. The market response reveals structural degradation of the relational infrastructure private life once provided.
Hidden producers. Human workers in the Global South maintain the illusion of machine intimacy under conditions that reproduce the emotional proletariat Hochschild documented four decades ago.
Capacity erosion. The threat is not confusion of simulation with reality but the atrophy of capacities genuine relationship requires.
Self-reinforcing cycle. AI companions address the surface of loneliness without addressing its structural causes, reducing the pressure for the institutional responses that would.