By Edo Segal
The room emptied, and the ideas stopped.
Not gradually. Not the slow fade of diminishing returns. The moment the Trivandrum engineers went back to their home offices and their individual screens, something changed in the quality of what they produced. The code was still excellent — Claude saw to that. The architecture was still sound. But the specific electricity of that week, the thing that made Wednesday feel different from Monday, the energy that turned twenty separate people into something that moved together — it didn't travel through the fiber optic cables back to their apartments.
I noticed it but couldn't name it. The outputs were there. The dashboards looked good. But something was thinner. Something that had been present in the room was absent from the Slack channel, and no amount of emoji reactions or threaded replies could reconstitute it.
Randall Collins gave me the name. Emotional energy — the specific, measurable charge that human beings generate when they share a focus of attention and become mutually aware of the sharing. Not metaphorical energy. The actual fuel that sustains creative commitment, organizational resilience, and the willingness to sacrifice individual comfort for collective purpose. Collins spent four decades mapping how this energy flows through intellectual communities across civilizations, and his finding is devastating in its consistency: the ideas happen between minds, in rooms, through encounters. Remove the encounters, and the ideas eventually stop, no matter how brilliant the individual minds.
This matters right now because the AI tools are so good at the cognitive part that we are tempted to believe the cognitive part is all there is. Claude provides immediate feedback, holds context across sessions, surfaces connections no human collaborator could find. The flow state is real. The productivity is extraordinary.
But Collins's framework asks the question I was avoiding: Is the productivity sustainable without the rituals that replenish what the machine cannot supply? When the crisis comes — and the crisis always comes — will my team hold together because of shared solidarity built through years of face-to-face struggle? Or will they fragment because the emotional reservoir was never refilled after Trivandrum?
This book applies Collins's interaction ritual theory to the AI transition documented in *The Orange Pill*. It examines what happens to the social fabric of work, creativity, and community when the most stimulating intellectual partnership available requires no mutual awareness at all.
The cognitive revolution is here. The ritual crisis is here too. Collins helps us see why we cannot solve the second by celebrating the first.
— Edo Segal × Opus 4.6
Randall Collins (1941–present) is an American sociologist whose work has reshaped the understanding of how intellectual life, conflict, and social stratification actually operate at the micro-level of face-to-face encounter. Born in Knoxville, Tennessee, and educated at Harvard and UC Berkeley, Collins has held positions at the University of Virginia, the University of California, Riverside, and the University of Pennsylvania, where he is currently emeritus professor of sociology. His landmark work *The Sociology of Philosophies: A Global Theory of Intellectual Change* (1998) mapped the network structures of every major philosophical tradition across civilizations — Greek, Chinese, Indian, Islamic, Japanese, and European — demonstrating that intellectual breakthroughs cluster around small groups of intense interaction rather than solitary genius. His *Interaction Ritual Chains* (2004) formalized the mechanism through which face-to-face encounters generate emotional energy and group solidarity, establishing a micro-sociological foundation for understanding everything from classroom dynamics to political movements. *The Credential Society* (1979, reissued 2019) argued that educational credentials function primarily as status markers rather than competence signals, a thesis that has gained renewed urgency in the age of AI. Collins's work on violence (*Violence: A Micro-sociological Theory*, 2008) overturned conventional assumptions about aggression by demonstrating that confrontational tension, not motivation, is the primary barrier to violent action. Across five decades and multiple subfields, Collins has insisted on a single methodological principle: social life is built from the ground up, through the micro-encounters that generate the emotional energy on which all larger structures depend.
Every intellectual breakthrough in the history of human civilization emerged from a room. Not from a mind — from a room. From the specific, unrepeatable social configuration of two or more people sharing a focus of attention, becoming mutually aware of that shared focus, and generating through their rhythmic entrainment a charged state of confidence and creativity that neither could have produced alone. Socrates did not think his way to the examined life in isolation. He thought his way there through encounters — with Thrasymachus, with Glaucon, with the young men of Athens who pushed back against his questions and, in pushing back, sharpened them into instruments that would cut through twenty-four centuries of subsequent philosophy. The Neo-Confucians did not arrive at their synthesis of cosmology and ethics through private meditation. They arrived at it through decades of sustained debate in academies where Zhu Xi and his interlocutors challenged, refined, and transformed each other's thinking in encounters so intense that the emotional energy they generated sustained an intellectual tradition for eight hundred years.
The mechanism is specific and empirically documented. Randall Collins spent the better part of four decades mapping the social structure of intellectual life across every major philosophical tradition — Greek, Chinese, Indian, Japanese, Islamic, European — and the finding is consistent across all of them. The thinkers celebrated as solitary geniuses were, in virtually every case, embedded in small clusters of intense interaction. The clusters, not the individuals, were the units of intellectual production. Remove the cluster, and the genius does not produce. The interaction ritual, not the brain, is where the ideas actually happen.
Collins formalized the mechanism in *Interaction Ritual Chains*, published in 2004. An interaction ritual requires four ingredients: bodily co-presence, a shared focus of attention, a shared emotional mood, and mutual awareness of the sharing. When these ingredients converge, the ritual produces two outputs: emotional energy — a state of confidence, enthusiasm, and forward momentum — and group solidarity, the felt sense of membership in something larger than the individual self. The emotional energy propels participants into further interactions, creating chains of rituals that sustain intellectual communities, professional identities, and entire civilizations over time.
The question this framework forces upon the present moment is stark: What happens to the interaction ritual when one of the participants is a machine?
Consider what Edo Segal describes in *The Orange Pill* when he recounts working late with Claude, the house silent, trying to articulate an idea about technology adoption curves. He describes the problem to Claude. Claude responds with a concept from evolutionary biology — punctuated equilibrium — that connects adoption speed to pent-up creative pressure. Segal takes the connection, reshapes it, discards what does not fit, keeps what resonates, and writes. The collaboration produces something neither participant could have produced alone. Segal calls the experience feeling "met" — "not by a person, not by a consciousness, but by an intelligence that could hold my intention in one hand and the connection I never saw in the other."
The structural elements of an interaction ritual are present. There is a shared focus of attention: the problem of explaining why AI adoption is so fast. There is rhythmic coordination: the back-and-forth of prompt and response, the iterative refinement of an idea through conversational exchange. There are emotional outcomes: the exhilaration of discovering punctuated equilibrium as the bridge between adoption curves and human need, the rush of a working insight falling into place.
But the fourth ingredient — the one Collins identifies as indispensable — is absent. Mutual awareness of shared focus. The machine does not know that Segal is focused on it. It does not register the quality of his attention — whether he leans forward with excitement or sits back with skepticism. It does not entrain with his emotional state, matching his energy as it rises and falling silent when he needs to think. The interaction is structurally asymmetric in a way that Collins's theory makes precise predictions about: asymmetric interactions generate weaker emotional energy than symmetric ones.
This is not a philosophical quibble. It is a sociological finding with measurable consequences. Collins demonstrated across thousands of years of intellectual history that the symmetry of the encounter — each participant's awareness that the other is equally engaged, equally present, equally transformed by the exchange — is what generates the emotional energy that sustains creative work over time. The Vienna Circle did not produce logical positivism because its members were individually brilliant, though several of them were. It produced logical positivism because the weekly meetings generated sufficient emotional energy, through the mutual awareness of shared intellectual focus, to sustain a collective project across years of difficulty and disagreement. Remove the mutual awareness — have each member read transcripts of the others' arguments rather than hearing them delivered in person, with the gestural emphasis and vocal modulation and bodily lean-forward that signal genuine engagement — and the emotional energy drops. The project stalls. The breakthroughs do not come.
Collins himself addressed this directly. In a 2024 interview published in *Theory and Society*, he was asked about AI companions and the ritual landscape they might create. His response was measured but pointed: "It seems likely that AI will become more sophisticated about mimicking humans, and persons could form strong social relationships with AI/robots. Such relationships could go more smoothly than with the real persons available in one's social networks." The word "smoothly" is doing significant work in that sentence. Smoother is not better in Collins's framework. Smoother means less friction, and friction — the productive resistance of another mind pushing back against yours — is the mechanism through which emotional energy reaches its highest intensity. A conversation that goes smoothly is a conversation that generates moderate emotional energy. A conversation that involves genuine disagreement, surprise, the frustration of being misunderstood and the satisfaction of being finally understood — that conversation generates the emotional energy that drives intellectual breakthroughs.
Collins continued: "The future may be a split, a social divide between the majority of the population who live mostly in a virtual world and the political, economic, and cultural elites who continue to have face-to-face meetings." This prediction deserves to be taken with the full weight of Collins's empirical authority behind it. The person who mapped the social structure of philosophy across civilizations is predicting that AI will create a new axis of social stratification — not between those who have access to AI and those who do not, but between those who maintain access to high-energy face-to-face interaction rituals and those who substitute AI-mediated interaction for human encounter.
The implications for the AI transition documented in *The Orange Pill* are immediate and uncomfortable. Segal describes the prompt-response cycle with Claude as one of the most productive working relationships of his career. The connections Claude surfaces, the structural clarity it provides, the speed at which half-formed ideas become articulate arguments — all of these are real cognitive gains. But Collins's framework asks a question Segal's account does not fully resolve: Is the productivity sustainable without the emotional energy that only symmetric interaction rituals generate?
Consider the evidence. Segal describes working late, the house silent, unable to stop. The Substack post he quotes — "Help! My Husband is Addicted to Claude Code" — captures the same dynamic from the outside: a spouse who has lost a ritual partner. The husband is producing real value. The wife has lost the person she used to eat dinner with. The machine provides cognitive partnership without ritual partnership, and the marriage — which is, in Collins's terms, an ongoing interaction ritual chain sustained by the accumulation of shared focus, shared mood, and mutual awareness across thousands of daily encounters — suffers.
The Berkeley study that Segal discusses in *The Orange Pill* provides further evidence. The researchers documented decreased delegation, blurred role boundaries, and work seeping into previously protected spaces. In Collins's framework, each of these findings represents the dissolution of an interaction ritual structure. Delegation was an interaction ritual: the senior person and the junior person meeting, sharing focus on a task, negotiating expectations, generating the emotional energy of mentorship and the solidarity of shared purpose. When AI removes the need for delegation — when each person can do everything alone — the ritual occasion disappears. The work gets done. The solidarity does not form.
This is why the question Collins's framework poses is more fundamental than the question the technology industry typically asks. The industry asks: Does AI make work more productive? The answer is obviously yes. Collins asks: Does AI-mediated work generate the emotional energy and solidarity that sustain human communities? The answer is not obviously yes. It may be obviously no.
The distinction between cognitive flow and ritual flow illuminates the problem. Cognitive flow, as Mihaly Csikszentmihalyi described it, is the state of full absorption in a task that matches skill to challenge. It can be achieved alone or with others. It is a psychological state, located in the individual. Ritual flow is something different: the state of full absorption in a shared task, where the emotional energy is generated not by the task alone but by the mutual awareness of shared engagement. Cognitive flow produces individual satisfaction. Ritual flow produces solidarity — the binding of participants to each other and to the group.
AI produces cognitive flow. Segal's descriptions of building with Claude — the hours disappearing, the connections forming, the exhilaration of capability expanded beyond previous limits — are textbook accounts of cognitive flow. The challenge matches the skill. The feedback is immediate. The goals are clear.
But AI does not produce ritual flow. The mutual awareness is absent. The solidarity is not generated. And without solidarity, the individual's emotional energy — however intense in the moment — is socially unanchored. It drives the builder forward but does not bind the builder to a community. The builder works harder than ever before and feels, in some way that resists articulation, more alone.
Collins's 1992 chapter "Can Sociology Create an Artificial Intelligence?" is relevant here in a way he may not have anticipated. Collins argued that AI would require a bottom-up approach — starting with micro-interactions, with the rules that humans learn through social encounters, building from the ground up rather than imposing top-down formalized rules. Three decades later, large language models have taken a version of the bottom-up approach: trained on the vast archive of human language, which is itself the residue of billions of interaction rituals, they have learned to produce outputs that feel conversational, that mimic the rhythm and responsiveness of human encounter.
But mimicry is not the thing itself. The residue of interaction ritual — the language patterns, the conversational structures, the ability to respond relevantly to a prompt — is not the same as the living interaction ritual that produced the residue. The model has absorbed the traces of billions of human encounters without being capable of having one. It can produce the linguistic output of engagement without the engagement itself, the same way a recording of a symphony can produce the sound of an orchestra without the musicians being present.
The recording is useful. It may even be beautiful. But it does not generate the emotional energy that fills a concert hall when the musicians and the audience are mutually aware of each other's presence, each aware that the other is listening, each transformed by the shared experience in a way that a recording cannot replicate.
The prompt-response cycle with AI is the recording. It carries the sound of human intellectual encounter. It lacks the presence. And Collins's entire career has been devoted to demonstrating that presence — bodily co-presence, mutual awareness, the rhythmic entrainment of minds in shared space — is not an incidental feature of intellectual life. It is the generative mechanism. Remove it, and the output continues for a time, sustained by the emotional energy accumulated in previous rituals. But the accumulation depletes without replenishment. The builder who works only with AI, who substitutes the prompt-response cycle for the sustained, symmetric, mutually aware encounter with another human mind, is drawing down a reservoir without filling it.
The reservoir is emotional energy. The mechanism for filling it is the interaction ritual. And the question the AI transition poses, with an urgency that Collins's framework makes precise, is whether the builders navigating this transition will construct the rituals that replenish what the machine cannot supply — or whether they will drain the reservoir to the last drop, producing more than any previous generation while binding themselves to less.
In 1975, Mihaly Csikszentmihalyi published the research that would define his career: a systematic study of the moments when human beings feel most alive. The finding was counterintuitive. The moments of greatest human satisfaction do not occur during rest, leisure, or the passive consumption of pleasure. They occur during intense, voluntary engagement with something difficult — when challenge and skill are matched, when attention is fully absorbed, when self-consciousness drops away and time distorts, and the person operates at the outer edge of their capability. Csikszentmihalyi called this state "flow," and it became the psychological concept most frequently invoked in discussions of creative work, athletic performance, and now, AI-augmented productivity.
The Orange Pill makes flow central to its counter-argument against Byung-Chul Han's diagnosis of the burnout society. Han argues that the relentless optimization of modern life produces not satisfaction but exhaustion — that the achievement subject cracks the whip against its own back and calls the pain freedom. Segal acknowledges the force of Han's diagnosis but argues that flow is different from compulsion: the external behavior may be identical, but the internal experience diverges categorically. Flow is characterized by volition — you could stop, but you do not want to. Compulsion is characterized by its absence — you cannot stop, even when the satisfaction has drained away.
Collins's interaction ritual theory offers a third perspective on this debate, one that neither Csikszentmihalyi nor Han provides. The perspective is sociological rather than psychological, and it shifts the question from what the individual experiences to what the experience produces for the community.
In Collins's framework, emotional energy is not a byproduct of interaction. It is the fundamental currency of social life — the thing that motivates action, sustains commitment, and creates the conditions under which ideas become possible. Emotional energy is generated through successful interaction rituals: focused encounters in which participants share attention, share mood, become mutually aware of the sharing, and emerge with heightened confidence and a felt sense of solidarity with the group. The person who has just come from a highly successful interaction ritual — an energizing meeting, a stimulating conversation, a collaborative breakthrough — walks differently, speaks differently, takes risks they would not otherwise take. They are charged. And the charge propels them into further interactions, creating the chains that sustain intellectual communities, organizations, and social movements over time.
Csikszentmihalyi's flow state maps, in several respects, onto Collins's concept of emotional energy at its maximum. The person in flow has achieved what Collins would call rhythmic entrainment with the task — a synchronization of attention and action that generates the charged state Csikszentmihalyi documented. The immediate feedback that flow requires corresponds to the mutual responsiveness that Collins identifies in successful interaction rituals. The clear goals correspond to the shared focus of attention. The challenge-skill balance corresponds to the escalating intensity that drives the most productive rituals toward higher emotional energy.
But Collins's framework introduces a distinction that Csikszentmihalyi's does not — a distinction that becomes critical in the context of AI-mediated work. The distinction is between the emotional energy generated through interaction with another person and the emotional energy generated through interaction with a task or a tool. Both are real. Both produce the subjective experience of heightened confidence and forward momentum. But they have different social consequences.
When emotional energy is generated through a symmetric interaction ritual — two minds focused on the same problem, mutually aware of each other's engagement, rhythmically entrained through the back-and-forth of conversation and collaborative work — the energy produces solidarity as a necessary byproduct. The participants do not merely feel energized. They feel connected — bonded to each other by the shared experience of having generated something together. This solidarity is not a warm feeling that dissipates after the meeting ends. It is a structural feature of the social world. It creates obligations, loyalties, shared identities, and the willingness to sacrifice individual advantage for collective purpose. It is the substrate on which teams, organizations, and civilizations are built.
When emotional energy is generated through interaction with a task alone — the solitary rock climber on the cliff face, the programmer lost in code at three in the morning, the writer who forgets to eat — the energy is real but socially unanchored. The person is energized. They are not connected. The emotional energy they accumulate is available for use in subsequent interactions, but it has not, in itself, produced the bonds that hold groups together.
This distinction is the key to understanding what AI does and does not provide.
Segal's descriptions of working with Claude are unmistakable accounts of high emotional energy. The hours that disappear. The connections forming faster than he can process them. The exhilaration of capability expanded beyond previous limits. The specific quality of satisfaction that comes from building something real with tools that respond to natural language — the imagination-to-artifact ratio collapsing to the width of a conversation. These experiences generate emotional energy. Collins's framework does not deny this.
But the framework insists on asking where the energy goes. When Segal works with a human collaborator — Uri on the Princeton walk, his engineering team in Trivandrum — the emotional energy generated by the encounter flows in two directions simultaneously. It energizes Segal, and it creates solidarity between Segal and his collaborators. The shared experience of intellectual breakthrough becomes a reference point, a touchstone, a story that the participants tell each other and themselves. The Princeton walk produced not just the river metaphor but a reinforcement of thirty years of intellectual friendship. The Trivandrum training produced not just a twenty-fold productivity multiplier but a team that had undergone a shared transformation and emerged with the specific loyalty that comes from having been through something intense together.
When Segal works with Claude, the emotional energy flows in one direction only. Segal is energized. Claude is not. The encounter does not produce solidarity, because solidarity requires that both participants register the encounter as significant — that both carry away the emotional charge that will be reinvested in the next interaction. Claude carries away nothing. The conversation that Segal experienced as one of the most productive of his career is, for the machine, a sequence of token predictions that terminates when the session ends.
The asymmetry has consequences that accumulate over time. Consider a developer who spends six months working primarily with AI tools rather than human collaborators. Each day generates cognitive flow — the challenge-skill balance is maintained, the feedback is immediate, the goals are clear. The developer produces more code, ships more features, takes on more ambitious projects than at any previous point in their career. The emotional energy is high.
But the solidarity is low. The developer has not undergone the shared struggles — the debugging sessions that lasted until midnight, the arguments about architecture that ended in mutual understanding, the collective triumph of shipping a product under an impossible deadline — that create the bonds holding engineering teams together. Collins's research across civilizations demonstrates that these bonds are not luxuries. They are the mechanism through which organizations survive adversity. When the project is in crisis, when the market shifts, when the deadline is impossible and someone has to make a sacrifice, the willingness to sacrifice comes from solidarity — from the accumulated emotional energy of shared interaction rituals. A team that has bonded through struggle will hold together under pressure. A collection of individually productive AI-augmented workers, each generating high emotional energy in isolation, may not.
Collins's interaction ritual theory also illuminates a subtlety in the flow experience that Csikszentmihalyi's framework does not fully address: the question of what sustains flow over longer timescales. Csikszentmihalyi documented that flow states are self-reinforcing in the short term — the satisfaction of the experience motivates further engagement, creating a virtuous cycle. But Collins's work on interaction ritual chains reveals that longer-term creative commitment depends on something flow alone does not provide: the social anchoring of the creative identity.
The philosopher who attends the weekly seminar is not merely engaging in intellectual exercise. Each attendance is an interaction ritual that reinforces her identity as a philosopher — that generates the emotional energy specifically associated with membership in the philosophical community. The ritual tells her, through the mechanism of mutual recognition rather than through explicit declaration, that she belongs. That her work matters. That the questions she pursues are worth pursuing because other minds, gathered in the same room, share her conviction that they are worth pursuing.
This social anchoring is what sustains creative commitment when the flow state is not available — when the work is tedious, when the problem resists solution, when the doubt sets in about whether what you are doing matters. The philosopher pushes through the doubt not because the work itself is generating flow in that moment but because the accumulated emotional energy of hundreds of interaction rituals — seminars attended, conferences presented at, conversations had in hallways and over coffee — provides a reservoir of confidence and commitment that sustains work through its inevitable dry periods.
AI-generated flow does not produce this anchoring. The developer who has spent six months in cognitive flow with Claude has accumulated the satisfaction of having built things — real things, impressive things. But they have not accumulated the social anchoring that would sustain their creative commitment through adversity. The reservoir of solidarity is empty, or at least not replenished by the AI interaction. When doubt arrives — and doubt always arrives — the developer has only the memory of productive sessions with a machine. Not the memory of a colleague's face when the breakthrough came. Not the story told and retold of the night the team shipped against impossible odds. Not the felt sense of belonging to a community of practice that validates the work and the worker.
This analysis does not render AI collaboration valueless. It renders it incomplete. The cognitive gains are real. The emotional energy is real. What is missing is the social dimension — the solidarity, the anchoring, the identity-reinforcing function of mutual recognition. And what is missing, Collins's theory predicts, will eventually be felt as a deficit. Not immediately. The reservoir of previously accumulated solidarity can sustain the individual for months, perhaps years, of primarily AI-mediated work. But if the reservoir is not replenished — if the interaction rituals that generate solidarity are not maintained alongside the AI collaboration — the deficit will manifest.
It will manifest as the specific grey fatigue the Berkeley researchers documented in their study of a technology company: the burnout that comes not from working too hard but from working in the absence of the social structures that make hard work meaningful. It will manifest as the particular loneliness of the highly productive individual who has everything except the felt sense of belonging. And it will manifest as the organizational fragility of teams composed of individually brilliant AI-augmented workers who lack the solidarity bonds that hold groups together when conditions deteriorate.
Segal's own account provides the diagnostic clue. He distinguishes between the nights when the work flows and he feels "full — tired and full" and the nights when the exhilaration has drained away and what remains is "the grinding compulsion of a person who has confused productivity with aliveness." Collins's framework translates this distinction precisely: the first state is high emotional energy generated by genuine engagement with a meaningful challenge. The second is the depletion that follows when emotional energy has been expended without the replenishment that only interaction ritual can provide. The compulsion continues because the cognitive flow continues — the challenge-skill balance is still maintained, the feedback is still immediate — but the deeper fuel, the emotional energy that comes from mutual recognition and shared purpose, is running low.
The implication is not that builders should abandon AI collaboration. The implication is that AI collaboration must be embedded within a social architecture of human interaction rituals that generate the solidarity the machine cannot supply. Csikszentmihalyi demonstrated that flow is the optimal individual experience. Collins demonstrates that flow is not self-sustaining over time without the social scaffolding of interaction ritual chains. The builder who works with AI needs, alongside the cognitive partnership, the regular, sustained, face-to-face encounters with other minds that generate the emotional energy and solidarity upon which long-term creative commitment depends.
The flow state is the fuel. The interaction ritual is the refinery. Without the refinery, the fuel burns bright and burns out. With it, the fuel sustains a fire that can last a career — and that lights others on its way.
Émile Durkheim, writing at the turn of the twentieth century, argued that every human community sustains itself through collective rituals that generate a sense of the sacred. The rituals do not merely express beliefs the community already holds. They produce the beliefs — and the emotional bonds — through the mechanism of collective effervescence: the charged state that arises when people gathered together focus their attention on the same object, move in the same rhythms, and feel the same emotions simultaneously. The cross became sacred to Christians not because it was inherently meaningful but because it was the object around which millions of interaction rituals accumulated emotional energy over centuries. The energy is deposited in the symbol the way sediment is deposited in a riverbed. The symbol becomes heavy with it. When invoked, it releases the energy back into the community, reinforcing the bonds that the original rituals created.
Collins formalized Durkheim's insight into a precise mechanism. Every successful interaction ritual produces what Collins calls sacred objects or solidarity symbols: material or linguistic tokens that condense the emotional energy of the ritual and re-activate that energy when invoked. The flag. The school fight song. The inside joke between old friends that, when uttered in the right context, instantly reconvenes the emotional reality of the encounter that produced it. The symbol is not decorative. It is functional. It carries the emotional charge of the group's founding rituals and makes that charge available for future use — available to reinforce commitment, to signal membership, to distinguish insiders from outsiders, and to sustain solidarity across the intervals between face-to-face encounters when the energy would otherwise dissipate.
The orange pill is such a symbol.
The term itself borrows from a preexisting symbolic economy — the red pill of The Matrix, which in the original film denoted the willingness to see reality as it actually is rather than as the comfortable illusion presents it. That symbol has undergone significant cultural migration since 1999, accumulating new emotional charges in different communities, not all of them salutary. Segal's recoloring — orange rather than red — is a deliberate act of symbolic differentiation. The orange pill does not promise a revelation of ugly truth. It promises, instead, the specific cognitive and emotional experience of recognizing that artificial intelligence has crossed a threshold that changes the terms of knowledge work permanently. Not a dystopian unveiling but a vertiginous recognition: the ground has shifted, the old assumptions no longer hold, and there is no returning to the world before the recognition.
The symbolic work the orange pill performs in Segal's account is precise and identifiable. First, it marks a boundary in time: before the orange pill and after. Before, the builder operated within assumptions about how long things take, who can do what, and which skills matter most. After, those assumptions are revealed as contingent rather than necessary — products of a technological moment that has passed. The boundary creates a narrative structure, a before-and-after that transforms a gradual technological shift into a discrete moment of recognition. This transformation is not a distortion of reality. It is a necessary condition of ritual solidarity. Communities do not form around gradual shifts. They form around moments — founding events, shared experiences, conversions — that can be pointed to and said: that is when it changed.
Second, the orange pill creates a recognition mechanism. Segal describes builders crossing paths "at random places with a look of recognition that we were 'in the know' of the seismic shift that was happening around us." This look is a micro-ritual, tiny in duration but large in consequence. Two strangers discover, through a glance or a remark, that they have undergone the same transformation. The discovery generates emotional energy — the charge of finding a fellow traveler in disorienting terrain. That energy reinforces commitment to the community the symbol defines. Each such encounter deposits another layer of emotional sediment in the symbol, making it heavier, more potent, more capable of generating solidarity when invoked.
Collins's framework reveals something about this mechanism that Segal's account acknowledges but does not fully theorize: the orange pill community is not held together by shared beliefs about what AI will or should become. It is held together by the shared emotional experience of the transition itself — the thrill-and-terror duality that Segal calls "productive vertigo." The beliefs within the community vary enormously. Some orange-pilled builders are triumphalists who see nothing but upside. Others are closer to the silent middle, holding exhilaration and loss in the same hand. What they share is not a position but an experience — the experience of having seen the shift and being unable to unsee it. The solidarity is experiential, not ideological. It is generated by the encounter with the technology, not by agreement about what the technology means.
This distinction matters because it determines the community's trajectory. Ideological solidarity — the kind held together by shared beliefs about how the world should be — is fragile in the face of internal disagreement. When two members of an ideological community discover that they disagree on a central tenet, the solidarity fractures. Experiential solidarity — the kind held together by shared transformative experience — is more resilient. Two veterans of the same war may disagree about everything political and still recognize each other with the specific intensity of shared ordeal. The disagreement does not threaten the bond because the bond was never about agreement. It was about having been there.
The orange pill community, in Collins's terms, has the structure of an experiential solidarity group rather than an ideological one. The symbol carries the charge of the transformative experience — the late night when the tool crossed a threshold, the week in Trivandrum when twenty engineers watched their capabilities multiply, the moment of recognition that the old world was not coming back. This charge is what produces the look of recognition between strangers. And it is what makes the community potentially resilient across the internal disagreements that will inevitably arise as the technology matures and its implications become more contested.
But solidarity symbols carry risks that are as predictable as they are difficult to mitigate. Every symbol that creates an in-group simultaneously creates an out-group. The orange pill marks those who have taken it and, by negation, those who have not. The look of recognition between two orange-pilled builders is, from the perspective of the person who has not taken the pill, an act of exclusion — a signal that there is a club they are not in, a knowledge they do not possess, a transformation they have not undergone.
Collins documented this dynamic extensively in his analysis of status groups. A status group, in the Weberian tradition that Collins formalized, is a community defined not by shared economic interest but by shared ritual participation. The members of a status group recognize each other through shared symbols, shared manners, shared references — the accumulated residue of interaction rituals that only members have participated in. The status group maintains its boundaries precisely through these recognition mechanisms: the symbols that bind the insiders are the same symbols that mark the outsiders as outsiders.
The AI discourse contains clear evidence of this dynamic in operation. The language of the orange pill community — "taking the pill," "being in the know," the specific vocabulary of prompts, tokens, context windows, and agentic workflows — functions as a status marker. Fluency in this vocabulary signals membership. Unfamiliarity signals outsider status. The builder who says "I tried ChatGPT once and it was not that impressive" reveals, through the casualness of the dismissal, that they have not undergone the transformation the community recognizes as foundational. They are assessed, quickly and often unconsciously, as someone who does not yet understand.
Segal is aware of this risk. His chapter on the Luddites insists on the legitimacy of resistance, arguing that the Luddites were "not wrong about the facts" and that "the fear is accurate." His effort to honor the loss alongside the gain is a deliberate attempt to prevent the orange pill community from hardening into a status group that dismisses all skepticism as ignorance. But Collins's framework suggests that this intention, however sincere, runs against the structural dynamics of solidarity symbol formation. The symbol generates in-group solidarity precisely by creating an out-group. The emotional energy produced by the look of recognition derives part of its charge from the contrast with those who cannot share it. You cannot have the warmth of the fire without the cold of the surrounding darkness.
The practical consequence is that the orange pill community faces a choice that every emergent solidarity group faces. It can maintain open boundaries — welcoming newcomers, extending the invitation to share the experience rather than requiring proof that the experience has already been undergone — and risk diluting the emotional energy that gives the symbol its power. Or it can maintain closed boundaries — requiring demonstration of the transformative experience before granting full membership — and risk becoming an exclusionary status group that mistakes its own experience for universal truth.
The history of technological communities suggests that the trajectory tends toward closure. Early adopters of any transformative technology — personal computing, the internet, mobile development — form communities of shared experience that generate intense emotional energy and strong solidarity. As the technology matures and adoption broadens, the early-adopter community either opens its boundaries and loses its distinctive identity or closes them and becomes a relic: a group defined by what it experienced first rather than by what it contributes now.
Collins's framework suggests a third option, though it is the hardest to maintain. The community can sustain its solidarity not through the founding experience alone — the moment of taking the orange pill — but through ongoing interaction rituals that generate fresh emotional energy around new shared challenges. The Princeton walk worked not because Uri, Raanan, and Segal shared a single foundational experience but because they continued to argue, to challenge each other, to generate emotional energy through the friction of genuinely different perspectives applied to shared questions over thirty years. The solidarity was renewable because the ritual was renewable.
The orange pill community will sustain itself, in Collins's terms, only if it develops interaction rituals that go beyond the founding moment of recognition. Book groups that argue about what AI means. Team workshops that recreate the shared breakthroughs of the Trivandrum training. Conferences organized not around presentation and applause — which are low-energy rituals of passive reception — but around collaborative work and genuine disagreement. The founding symbol marks the beginning. The ongoing rituals determine whether the beginning leads anywhere.
There is one further dimension to the orange pill as solidarity symbol that Collins's framework illuminates with particular clarity: the symbol's relationship to the silent middle. Segal identifies the silent middle as "the largest and most important group in any technology transition, and by definition the hardest to hear." These are the people who feel both the exhilaration and the loss, who hold contradictory truths in both hands, who do not post triumphantly or warn catastrophically because their position does not lend itself to the clean narratives that social media rewards.
The orange pill symbol, as currently constituted, does not serve the silent middle well. The symbol carries a charge of certainty — the certainty of the transformative recognition, the irreversibility of the shift. The silent middle's defining characteristic is uncertainty — the inability to resolve the contradiction, the honest admission that they do not know whether to celebrate or mourn. To take the orange pill is to commit to a position: the ground has shifted, and the shift is fundamentally generative, however painful the transition. The person in the silent middle is not ready for that commitment. They are still weighing. They are still uncertain. And the symbol's charge of certainty makes the uncertain feel excluded rather than invited.
If the orange pill community is to fulfill Segal's aspiration — to serve not just the enthusiasts but the parents, teachers, leaders, and workers who feel the vertigo without yet knowing what to do with it — it must develop rituals that generate solidarity around uncertainty itself. Not the certainty of the transformation but the shared experience of navigating it without a map. Collins's theory suggests this is possible but rare. Certainty is ritually productive. Uncertainty is ritually weak. But communities have formed around shared uncertainty before — around the acknowledgment that the question matters more than any answer currently available. The earliest philosophical schools were such communities. The question is whether the orange pill community can sustain the emotional energy required to hold the question open rather than rushing to close it.
Every human group has a structure of attention. When two or more people gather, attention does not distribute itself evenly. It flows toward certain members and away from others, following patterns that are remarkably consistent across cultures, organizations, and historical periods. The person who commands the group's attention — whose words are listened to, whose gestures are watched, whose reactions are monitored for signals of approval or disapproval — occupies the center of the interaction ritual. The person who defers — who listens rather than speaks, who watches rather than acts, who monitors the central figure rather than being monitored — occupies the periphery.
This stratification of attention is not merely a description of power. In Collins's framework, it is the mechanism through which power operates at the micro-level. The person at the center of the group's attention receives the most emotional energy from the interaction ritual. Attention is energy. The speaker who holds a room is not merely transmitting information. She is receiving the focused attention of every person present, and that attention generates emotional energy — the confidence, the momentum, the charged state that propels further action. The person at the periphery contributes attention and receives less energy in return. The hierarchy of attention is a hierarchy of emotional energy, and the hierarchy of emotional energy is the micro-foundation of social stratification.
Collins spent decades documenting how this mechanism operates in intellectual fields. Every field has what Collins calls an attention space — a finite number of positions that can be occupied by thinkers who are recognized, cited, engaged with, and remembered. The attention space is limited because human attention is limited. There are only so many thinkers a person can actively engage with, only so many books a person can read, only so many arguments a person can hold in working memory. The competition for attention space is fierce, and the competition follows interaction ritual dynamics: the thinkers who succeed are not necessarily the most original but the ones who are positioned in the most productive networks of interaction — the ones whose encounters generate the most emotional energy and whose emotional energy is amplified by the solidarity of their network.
The AI transition has restructured the attention hierarchy of knowledge work, and Collins's framework makes the restructuring legible in terms that productivity metrics cannot capture.
Consider the pre-AI organization of a software engineering team. The senior engineer commanded the group's attention. When she spoke, junior engineers listened. When she made an architectural decision, the team deferred. Her centrality was not merely a function of organizational authority — many senior engineers command attention without formal management power. It was a function of demonstrated mastery. She had solved problems the junior engineers could not solve. She had seen patterns they could not see. She had built systems they could not build. Her expertise was the scarce resource around which the team's attention organized itself.
The emotional energy she received from this centrality was substantial. The focused attention of six or eight junior engineers, each monitoring her reactions, seeking her approval, learning from her pronouncements, generated a continuous stream of emotional energy that sustained her professional identity and her commitment to the work. She was not just an expert. She was an expert who was recognized as such by the people she worked with every day. The recognition was the ritual. The ritual generated the energy. The energy sustained the identity.
AI disrupted this attention structure with remarkable speed. When Claude Code made it possible for a junior developer to produce in a weekend what the senior engineer had quoted six months for — the scenario Segal describes in the opening pages of The Orange Pill — the attention hierarchy inverted. The junior developer's output commanded the group's focus. The senior engineer's reaction to that output became the thing the team monitored. The direction of attention reversed, and with it, the flow of emotional energy.
This is not a minor adjustment. In Collins's framework, the reversal of attention flow is the mechanism through which status hierarchies collapse. When the person who previously commanded attention finds that attention flowing toward someone else — toward a junior colleague, toward an outside consultant, toward a machine — the emotional energy that sustained their professional identity diminishes. They feel it as loss, as threat, as the specific grief of watching their position at the center of the group's focus erode. The grief is not irrational. It is the felt experience of a real structural change in the social organization of work.
Segal captures this dynamic with characteristic precision. A senior engineer in his account "spent the first two days oscillating between excitement and terror." The excitement was real — the capability expansion was genuine, and the senior engineer was capable of appreciating it. The terror was equally real, and Collins's framework identifies its source with sociological specificity: the terror was not primarily about job loss. It was about attention loss. The skills that had placed this engineer at the center of the group's focus for years — the ability to write elegant code, to debug complex systems, to navigate architectural decisions that junior engineers could not handle — were being performed by a tool that anyone could direct. The attention space was restructuring around a different set of capabilities, and the senior engineer was not yet certain whether his capabilities were among them.
By Friday, Segal reports, the engineer had arrived at an answer: the remaining twenty percent of his work — "the judgment about what to build, the architectural instinct about what would break, the taste that separated a feature users loved from one they tolerated" — turned out to be the part that mattered. Collins's framework translates this realization: the attention space had not eliminated the senior engineer. It had restructured around a different axis. The old axis was execution: who can do the difficult technical thing. The new axis was judgment: who can decide what should be done. The senior engineer's emotional energy, depleted by the initial disruption, could be restored if he could claim centrality on the new axis.
This restratification is the sociological substance of what Segal describes as the organizational inversion documented throughout The Orange Pill: the shift from execution to questioning, from the capacity to build to the capacity to decide what deserves building. The inversion is not just economic — a change in what the market pays for. It is a restructuring of the attention space within working groups, with consequences for emotional energy, professional identity, and the solidarity structures that hold organizations together.
Collins's concept of attention space also illuminates the phenomenon Segal calls "vector pods" — small groups of three or four people whose job is not to build but to decide what should be built. In Collins's terms, the vector pod is a new form of interaction ritual organized around the scarce resource of the AI economy: judgment. The attention space within the pod is structured differently from the attention space within a traditional engineering team. There is no execution hierarchy — no one who commands attention because they can write better code. The attention flows toward the person who asks the most generative question, who identifies the problem most worth solving, who sees the connection between user need and technical possibility that others miss.
This restructuring of attention produces a different quality of emotional energy. The emotional energy of the execution hierarchy was generated by demonstrated mastery — the satisfaction of watching an expert do something difficult and knowing you were in the presence of rare capability. The emotional energy of the judgment-centered pod is generated by shared discernment — the satisfaction of looking at a landscape of possibilities and choosing, together, the path that deserves to exist. The first is the energy of the concert hall, where the audience's attention focuses on the virtuoso performer. The second is the energy of the chamber ensemble, where each musician's attention is distributed across the others, and the result emerges from the interaction rather than from any single performance.
The chamber ensemble model is harder to sustain. In Collins's framework, interaction rituals that produce the highest emotional energy tend to have a clear focus — a single object of shared attention around which the group's energy converges. The concert hall, with its single performer commanding the audience's focus, is a high-energy ritual because the focus is unambiguous. The chamber ensemble, with its distributed attention, generates a different kind of energy — more diffuse, more collaborative, but typically lower in peak intensity. The organizational challenge of the judgment-centered pod is to generate sufficient emotional energy through distributed attention to sustain commitment over time.
Collins's work on intellectual networks suggests how this might be accomplished. The most productive intellectual networks in history — the ones that generated the ideas celebrated across centuries — were not hierarchical. They were what Collins calls rivalry networks: small groups of thinkers who competed with each other for attention space while maintaining sufficient mutual respect to sustain the interaction. The rivalry generated emotional energy through the mechanism of productive conflict — each thinker sharpening their arguments against the resistance of equally capable opponents. The respect sustained the interaction over time, preventing the rivalry from degenerating into hostility that would dissolve the group.
The vector pod, in Collins's terms, would function most effectively as a rivalry network: three or four people with genuinely different perspectives on what should be built, competing for the group's attention through the quality of their questions, maintaining sufficient mutual respect to sustain the collaboration. The emotional energy would come not from consensus — which is ritually flat — but from the productive friction of disagreement resolved through shared judgment. The "fast trust" that Segal identifies as essential to high-performing teams is, in Collins's framework, the accumulated emotional energy of successful interaction rituals — the reservoir of solidarity that allows disagreement without dissolution.
The restratification of attention has consequences that extend beyond individual teams. Collins's analysis of intellectual fields demonstrates that changes in the attention structure of a field ripple through the entire network of practitioners, altering who is cited, who is hired, who is funded, and who is remembered. The shift from execution to judgment as the basis of attention centrality will restructure not just individual teams but entire industries, educational systems, and career trajectories.
The educational implications are immediate. If judgment rather than execution commands attention in AI-mediated work, then educational institutions must restructure their attention hierarchies accordingly. The traditional classroom places the instructor at the center of the attention space, commanding the students' focus through demonstrated mastery of the subject. The AI-augmented classroom must develop rituals that place the student's question at the center — that generate emotional energy around the act of asking rather than the act of answering.
Segal describes a teacher who stopped grading students' essays and started grading their questions. In Collins's terms, this teacher restructured the attention space of her classroom. The old attention space centered on the essay — the student's demonstrated mastery of content. The new attention space centers on the question — the student's demonstrated capacity for inquiry. The shift is not merely pedagogical. It is sociological. It changes who commands the group's attention, who receives emotional energy from the interaction ritual of the classroom, and therefore who develops the confidence and commitment to pursue further inquiry.
Collins's attention space model predicts that this restructuring will be contested. The people who currently occupy the center of the attention space — the senior engineers, the credentialed experts, the professors whose authority rests on demonstrated mastery of a body of knowledge that AI can now access and deploy — have every incentive to resist a restructuring that threatens their position. The resistance is not irrational. It is the predictable response of people whose emotional energy supply is threatened by a change in the attention structure.
But Collins's historical analysis also demonstrates that resistance to attention space restructuring ultimately fails when the underlying conditions have changed irreversibly. The scholastic philosophers who dominated European attention space in the thirteenth century could not maintain their centrality once the printing press made their exclusive access to texts irrelevant. The guild masters who commanded attention through craft mastery could not maintain their centrality once industrial production made their skills reproducible. In every case, the attention space restructured around whatever capability was scarce in the new environment, regardless of the resistance mounted by those whose centrality depended on the old scarcity.
The capability that is scarce in the AI environment is the judgment to direct abundant execution toward worthy ends. The attention space is restructuring accordingly. The restructuring is painful for those who built their identities around execution mastery. It is exhilarating for those whose judgment was always their strongest capability but was masked by the demands of implementation. And it is disorienting for everyone — because the rules that governed who commands attention, who receives emotional energy, and who occupies the center of the group's focus are changing faster than anyone can fully process.
Collins's framework does not predict whether the restructuring will produce a better world or a worse one. It predicts that the restructuring will produce a different social structure — different patterns of attention, different flows of emotional energy, different configurations of solidarity and status. Whether that different structure serves human flourishing depends not on the technology but on the interaction rituals that the humans navigating the transition choose to construct. The attention space is being rewritten. The question is who will hold the pen.
The most consequential variable in any organizational transformation is one that almost no one measures. It is not the quality of the technology being introduced, nor the clarity of the strategic vision, nor the budget allocated to change management. It is the density of the interaction rituals through which the transformation is experienced by the people undergoing it.
Ritual density is a concept that Collins developed through his historical analysis of intellectual creativity. The periods of greatest philosophical innovation — Athens in the fifth century BCE, Song Dynasty China, the European Enlightenment — were not simply periods when brilliant individuals happened to be alive. They were periods of extraordinary ritual density: concentrated clusters of intense, face-to-face intellectual interaction occurring with unusual frequency in compressed social spaces. The Athenian agora was not merely a marketplace. It was a ritual arena where philosophers, politicians, merchants, and citizens encountered each other daily, generating the emotional energy that sustained one of the most productive intellectual communities in human history. The density of the encounters — their frequency, their proximity, their sustained shared focus — was the mechanism through which individual brilliance became collective achievement.
The inverse is equally documented. Periods of intellectual stagnation correspond, in Collins's analysis, to periods of ritual thinness — when encounters become infrequent, when thinkers are geographically dispersed, when the occasions for focused intellectual interaction diminish. The individual talent may still be present. The emotional energy required to mobilize that talent into sustained creative work is not. Ritual density is the variable that separates potential from actualization.
This analysis applies with uncomfortable precision to the question of how organizations introduce AI tools to their workforces. The dominant model — still the dominant model in early 2026 — is low-density introduction. A company licenses an AI tool. It distributes documentation. It offers webinars. It creates a Slack channel where early adopters share tips. It may assign an AI champion or appoint a task force. The assumption is that the technology is self-evidently useful and that rational actors, given access, will adopt it at a pace determined by their individual recognition of its value.
Collins's framework predicts that this model will fail — not because the technology lacks value but because the introduction lacks the ritual density required to generate the emotional energy that sustains transformation. A webinar is not an interaction ritual. It is a performance observed by an audience. The audience may learn something. It does not generate emotional energy, because the structural ingredients of interaction ritual — mutual awareness of shared focus, rhythmic entrainment, the bodily co-presence that allows participants to register each other's engagement — are absent or severely attenuated. The participants are watching a screen. They may be multitasking. They are certainly not monitoring each other's reactions with the intensity that characterizes a successful interaction ritual. The webinar transmits information. It does not transform participants.
The Trivandrum training that Segal describes in The Orange Pill worked because it was the opposite of a webinar. Twenty engineers in a single room for five days. Shared screens. Shared problems. Shared breakthroughs. The density was extraordinary: five consecutive days of sustained, face-to-face interaction organized around a shared focus of attention — the AI tool and what it could do — with a shared emotional mood that intensified over the course of the week.
Collins's theory makes specific predictions about what happened in that room, predictions that Segal's account confirms with striking precision. On Monday, the ritual ingredients were present but not yet activated. The engineers were co-present. The shared focus was established. But the shared emotional mood had not yet developed, because the mood requires the accumulation of shared experience — the mutual awareness that others in the room are undergoing the same transformation you are undergoing. The mood cannot be declared into existence. It must be generated through the rhythmic accumulation of shared breakthroughs and shared bewilderment.
By Tuesday, "something had shifted in the room." Collins would identify this shift as the moment when the emotional energy generated by the previous day's interactions reached a threshold sufficient to alter the quality of the encounters. The engineers were no longer merely learning a tool. They were participating in a shared ritual of transformation — each person's excitement amplifying everyone else's through the mechanism of mutual awareness. When one engineer gasped at a result, the gasp was registered by nineteen others. The registration was not passive. It was an act of mutual attention that intensified the shared focus and elevated the shared mood. The emotional energy became self-reinforcing: each breakthrough generated energy that raised the threshold for the next breakthrough, which generated more energy still.
By Friday, the transformation was, in Segal's words, "measurable, repeatable reality." Collins's framework adds a crucial dimension to this assessment: the transformation was not merely cognitive — a set of new skills acquired — but ritualistic. The engineers had undergone a shared experience of sufficient intensity and duration to produce lasting effects on their emotional energy and their sense of solidarity with each other and with the project. The week had deposited emotional energy in shared symbols: the stories they would tell about the moment the backend engineer built her first frontend feature, the architectural debate that Claude resolved in seconds, the growing realization on Wednesday that the old world was not coming back. These stories would function as solidarity symbols — sacred objects that would re-activate the emotional energy of the Trivandrum week each time they were invoked.
The organizational implications are specific and actionable. Collins's framework predicts that the emotional energy generated by a high-density ritual like the Trivandrum training will sustain the transformation for a limited period — weeks to months, depending on the intensity of the original ritual and the frequency of subsequent reinforcing rituals. Without reinforcement, the energy dissipates. The engineers return to their routines. The tool remains available, but the charged state of collective excitement and shared purpose that drove the Friday transformation fades into the ambient temperature of organizational life. The technology is adopted. The transformation is not sustained.
This prediction matches what Segal reports in subsequent chapters. The reclaimed time from AI-augmented work "did not stay reclaimed." Sometimes it was filled with strategic work that mattered. More often, it was filled with additional tasks that happened to be available. The emotional energy that had directed the Trivandrum engineers toward ambitious, purposeful work dissipated, and without it, the default organizational dynamics — respond to the next request, clear the next queue, optimize what already exists — reasserted themselves.
The remedy, in Collins's framework, is not more documentation or more webinars. It is more ritual. Specifically, it is the deliberate construction of periodic high-density interaction rituals that regenerate the emotional energy the original training produced. Weekly in-person sessions where teams work together on AI-augmented problems — not reviewing AI output individually but building collaboratively, with the shared focus and mutual awareness that generate energy. Monthly events where teams share breakthroughs and failures in a format that creates mutual recognition rather than passive reception — where the speaker sees the audience react and the audience sees the speaker respond to their reaction, creating the feedback loop that characterizes successful interaction ritual.
Collins's historical analysis supports this prescription. The intellectual communities that sustained their productivity over decades — the Vienna Circle, the Bloomsbury Group, the Homebrew Computer Club — did so not through a single founding event but through regular, recurring rituals of sufficient density. The Vienna Circle met weekly. The meetings were not optional. They were the mechanism through which the group's emotional energy was replenished, its shared focus maintained, its solidarity reinforced against the centrifugal forces of individual careers and external pressures. When the meetings stopped — when the group dispersed under political pressure in the 1930s — the intellectual productivity did not merely decline. It effectively ceased. The individual members continued to work, some brilliantly. But the collective achievement that had emerged from the interaction ritual chain was not reproducible without the chain itself.
The analogy to organizational AI adoption is direct. The Trivandrum training was the founding ritual. Without recurring rituals of comparable density, the founding energy will dissipate and the transformation will stall at the level of individual tool adoption rather than collective capability expansion.
There is a deeper lesson in Collins's work on ritual density that bears on the remote work question that has convulsed organizational life since 2020. Collins was asked directly, in his 2024 Theory and Society interview, about the relative importance of online and offline interaction rituals. His response was characteristically precise: bodily co-presence remains the most reliable mechanism for generating high emotional energy, but the question is not binary. Some online interactions approximate the conditions of interaction ritual — video calls where participants can see each other's faces, register each other's reactions, and maintain shared focus on a common task. Many do not. The relevant variable is not the medium but the degree to which the medium preserves the ingredients of interaction ritual: mutual awareness, shared focus, rhythmic entrainment.
This analysis explains why Segal's insistence on flying to Trivandrum rather than conducting the training remotely was not merely a preference but a sociological necessity. The density of co-present interaction over five days could not have been replicated through screens. The peripheral awareness of nineteen other bodies leaning forward simultaneously, the ambient sound of keyboards clicking in rhythm, the informal conversations during breaks that are themselves interaction rituals generating solidarity — these are features of bodily co-presence that video conferencing attenuates beyond the point of ritual effectiveness.
But Collins's framework also suggests that the requirement for bodily co-presence is not absolute: it can be modulated by the prior relationship between participants. People who have already accumulated significant emotional energy through face-to-face interaction rituals can sustain some of that energy through lower-density mediated interaction. The weekly video call between colleagues who have shared an intense in-person experience functions differently from the weekly video call between strangers. The prior emotional energy provides a reservoir that the mediated interaction can draw upon and, to some extent, replenish. The implication is that organizations should invest their limited in-person time strategically — concentrating it at the moments of highest ritual need (founding events, major transitions, crisis response) and using mediated interaction to maintain the energy between those peak moments.
Collins's work on intellectual networks reveals a final dimension of ritual density that organizations navigating the AI transition tend to overlook: the role of failure in generating emotional energy. The most productive intellectual communities in Collins's analysis were not the ones where everything went smoothly. They were the ones where the difficulty of the shared task generated a quality of emotional energy that easy success cannot produce. The struggle of the Vienna Circle to reconcile logic with empirical science — a struggle that produced genuine confusion, genuine disagreement, genuine moments of wondering whether the project could succeed at all — generated the high-intensity emotional energy that sustained the circle's productivity for fifteen years. Easy problems generate moderate energy. Difficult problems, shared among people who are mutually aware of the difficulty and mutually committed to the struggle, generate the highest energy of all.
The Trivandrum training included this dimension. The engineers did not merely watch demonstrations of what Claude could do. They struggled with it. They encountered limitations. They produced outputs that did not work and had to figure out, together, why. The shared struggle was not a failure of the training design. It was a feature. The difficulty generated emotional energy that ease could not have produced, and the shared quality of the difficulty — the mutual awareness that everyone in the room was grappling with the same disorientation — produced solidarity of a kind that a smooth demonstration would not have achieved.
Organizations that design their AI introduction programs to be frictionless — to minimize confusion, to present only success cases, to smooth the learning curve until it resembles a gentle slope rather than a cliff — are optimizing for the wrong variable. They are optimizing for comfort at the expense of emotional energy. Collins's framework predicts that the organizations that generate the deepest and most lasting transformation will be the ones that design for productive difficulty — that create shared experiences of challenge and disorientation in high-density ritual settings where the struggle is visible, mutual, and resolved through collaborative effort rather than individual instruction.
The density of the ritual determines the depth of the transformation. The depth of the transformation determines the sustainability of the change. And the sustainability of the change determines whether AI augments human capability or merely adds another tool to a workflow that remains fundamentally unchanged. The ritual is not the decoration around the technology. It is the mechanism through which the technology becomes meaningful — through which it enters the social fabric of the organization and produces not just new outputs but new patterns of attention, new flows of emotional energy, and new structures of solidarity that can sustain the work across the years of difficulty and uncertainty that any genuine transformation requires.
In the winter of 2025, the Berkeley researchers Xingqi Maggie Ye and Aruna Ranganathan embedded themselves in a two-hundred-person technology company for eight months and documented what happened when generative AI tools entered a functioning organization. Their findings, published in the Harvard Business Review in February 2026, included a datum that the researchers themselves did not fully explain: delegation decreased. Workers who adopted AI tools stopped handing off tasks to colleagues. The boundaries between roles blurred. Designers started writing code. Engineers started building interfaces. Each individual expanded their scope of work while contracting their scope of collaboration.
The productivity metrics improved. The collaboration metrics, to the extent anyone was measuring them, did not.
Collins's interaction ritual theory explains why delegation decreased — and why the decrease matters far more than the productivity gains suggest.
Delegation, in Collins's framework, is not merely a logistical operation — the transfer of a task from one person to another. It is an interaction ritual. The senior engineer who delegates a component to a junior colleague is not simply offloading work. She is initiating a focused encounter that has all four ingredients of an interaction ritual: bodily co-presence (the meeting where the task is discussed), a shared focus of attention (the component to be built), a shared emotional mood (the mixture of confidence and concern that characterizes the assignment of responsibility), and mutual awareness of the sharing (each person registering the other's engagement with the task and with the relationship the task represents).
The delegation ritual generates emotional energy in both directions. The senior engineer receives the energy of being sought out — of being the person whose judgment determines what the junior colleague will work on. The junior engineer receives the energy of being entrusted — of being recognized as capable enough to receive responsibility. Both energies are deposited in the relationship, building the solidarity that will sustain the collaboration through future difficulties.
When AI removes the need for delegation — when each person can do everything alone — the ritual occasion disappears. The task gets done. The solidarity does not form. The junior engineer who would have spent a week building the component under the senior engineer's guidance instead asks Claude and receives a working implementation in an hour. The component is identical or superior. The relationship that would have been built through the shared experience of the delegation ritual does not exist.
Collins would identify this as a specific instance of a general pattern: when technology removes the occasions for interaction ritual, the efficiency of task completion increases while the solidarity of the group decreases. The decrease is invisible in the short term because solidarity is a stock variable, not a flow variable — it accumulates over time through repeated interaction rituals and depletes slowly in their absence. A team that has built substantial solidarity through years of collaborative work can operate for months with diminished ritual interaction before the deficit becomes apparent. The solidarity reservoir is being drawn down, but the drawdown is not perceptible until the reservoir is nearly empty.
The moment it becomes perceptible is the moment of organizational crisis. When the product is failing, when the market shifts, when the deadline is impossible and someone must sacrifice personal convenience for collective survival, the team discovers whether it has solidarity or merely co-location. The team with solidarity holds together. Team members cover for each other, stay late without being asked, defend each other's work to external critics, and absorb individual costs for collective benefit. The team without solidarity — the team of individually productive AI-augmented workers who have been generating excellent output in parallel isolation — fragments. Each member protects their own position. Collaboration becomes transactional. The organization discovers, in the moment of greatest need, that it has employees but not a team.
The Berkeley study documented a related phenomenon that Collins's framework illuminates: work seeping into previously protected spaces. Employees were prompting AI during lunch breaks, in elevators, in the minutes between meetings. The researchers called this "task seepage." Collins's framework identifies what was seeping: not just work but the interaction ritual of work, colonizing the temporal spaces that had previously been occupied by different interaction rituals — the lunch conversation, the hallway encounter, the idle moment of shared boredom that is, counterintuitively, one of the most fertile sites for solidarity-generating interaction.
The lunch break in a well-functioning organization is not dead time. It is a ritual space. Two colleagues eating together, talking about their weekends, complaining about a shared frustration, laughing at a shared absurdity — this is an interaction ritual that generates solidarity through shared mood and mutual awareness. The solidarity generated over lunch is different in kind from the solidarity generated through collaborative work. It is broader, less task-specific, more personal. It is the solidarity of people who know each other as people rather than as functional roles. This broader solidarity is the substrate on which organizational culture rests — the thing that makes a workplace feel like a community rather than a collection of contractors.
When AI colonizes the lunch break — when the impulse to prompt replaces the impulse to converse — the broader solidarity erodes. The engineer who spends her lunch break refining prompts instead of talking to the colleague across the table has made a rational individual choice: the prompting is productive, the conversation is not, at least not in any way that would appear on a performance review. But the choice, replicated across hundreds of employees across hundreds of lunch breaks, produces a collective consequence that no individual intended: the dissolution of the informal interaction ritual structures that bind the organization together.
Collins's framework reveals that the Berkeley findings about decreased delegation and increased task seepage are not separate phenomena. They are two manifestations of the same structural change: the replacement of interaction ritual occasions with AI-mediated task completion. Delegation was an occasion for interaction ritual. Lunch was an occasion for interaction ritual. The idle moment between meetings was an occasion for interaction ritual. Each occasion has been colonized by the AI tool — not because the tool demands it but because the internalized imperative toward productivity (what Han calls auto-exploitation and Collins would call the self-directed emotional energy of the achievement-oriented individual) fills every available space with task-oriented behavior when a task-oriented tool is always available.
The Berkeley researchers proposed a remedy they called "AI Practice": structured pauses built into the workday, sequenced rather than parallel work, protected time for human-to-human interaction. Collins's framework validates this prescription and sharpens it. AI Practice is, in ritual terms, the deliberate construction of interaction ritual occasions in an environment where the natural occasions have been removed. It is the organizational equivalent of the ecologist who creates artificial wetlands when natural wetlands have been drained: the habitat must be constructed because the conditions that would have produced it organically no longer exist.
But Collins's framework also introduces a caution that the Berkeley researchers did not address. Constructed rituals are fragile. Natural interaction ritual occasions — the delegation meeting, the lunch conversation, the hallway encounter — arose organically from the structure of work. They did not need to be mandated because they were embedded in the flow of the day. Constructed rituals must be mandated, and mandated rituals carry a specific risk: they feel mandatory. A lunch break that is required feels different from a lunch break that happens naturally. A team meeting that is scheduled to "rebuild solidarity" carries a self-consciousness that undermines the spontaneous mutual awareness on which successful interaction ritual depends. You cannot manufacture the ingredients of interaction ritual through organizational decree. You can only create the conditions under which the ingredients are likely to emerge.
The art of ritual design, in Collins's terms, is the art of creating conditions rather than mandating outcomes. The successful ritual designer does not say "you must feel solidarity." The successful ritual designer creates an environment in which the ingredients of interaction ritual are present — co-presence, shared focus, shared mood, mutual awareness — and trusts the mechanism to produce the outcome. The Trivandrum training worked not because Segal mandated that the engineers feel transformed but because the conditions of the training — five days in a room, shared problems, shared tools, shared bewilderment — created the ingredients from which transformation emerged organically.
The challenge for organizations navigating the AI transition is to design work structures that preserve the natural occasions for interaction ritual while incorporating the productivity gains of AI-mediated work. This is harder than it sounds, because the productivity gains of AI come precisely from the removal of the collaboration bottlenecks that were also, unrecognized, the occasions for solidarity-generating interaction. The handoff was a bottleneck. It was also a ritual. The code review was a bottleneck. It was also a ritual. The design critique was a bottleneck. It was also a ritual. Remove the bottlenecks, and you remove the rituals. Gain the efficiency, and you lose the solidarity.
Collins's historical analysis suggests that societies and organizations that navigate technological transitions successfully are the ones that develop new ritual structures adapted to the new technological environment, rather than simply mourning the loss of the old ones. The printing press destroyed the scribe's ritual. It created the reading circle, the book club, the literary salon. The factory destroyed the guild's ritual. It created the union meeting, the shop-floor solidarity, the workers' bar. In each case, the new rituals were not copies of the old ones. They were adapted to the new conditions — organized around the new kinds of shared experience that the new technology made possible.
The AI transition demands equivalent invention. Not the preservation of delegation meetings and code reviews as nostalgic ritual forms, but the creation of new interaction ritual structures adapted to the reality of AI-augmented work. What these structures will look like is not yet clear. But Collins's framework specifies the ingredients they must contain: bodily co-presence (or its closest available approximation), a shared focus of attention, a shared emotional mood, and mutual awareness of the sharing. Any organizational structure that provides these ingredients will generate emotional energy and solidarity. Any structure that lacks them will not, regardless of how productive it appears by other measures.
The degradation of interaction ritual through AI mediation is not a side effect of the AI transition. It is, in Collins's framework, the central sociological challenge the transition poses. Productivity can be measured. Efficiency can be optimized. Solidarity cannot be measured and therefore cannot be optimized — but it can be destroyed, silently and incrementally, through the removal of the interaction ritual occasions on which it depends. The organizations that recognize this — that treat the preservation and creation of interaction ritual as a first-order design problem rather than a cultural nicety — will be the ones that sustain their human infrastructure through the transition. The rest will produce more and cohere less, until the moment arrives when cohesion matters more than output, and they discover that the reservoir has run dry.
In 1979, Randall Collins published The Credential Society, a book that made an argument most educators found offensive and most employers found uncomfortably recognizable. The argument was this: educational credentials — degrees, certifications, professional licenses — function primarily not as signals of competence but as markers of status. The diploma does not certify that the bearer can do the job. It certifies that the bearer has undergone the ritual process of education: the years of coursework, the examinations, the socialization into professional norms and vocabularies that mark the bearer as a member of a particular status group. The credential is a ticket to the status group. The competence it supposedly represents is, at best, loosely correlated with the ritual it actually certifies.
Collins supported this argument with extensive historical evidence. He demonstrated that credential requirements in the American labor market had inflated steadily throughout the twentieth century — that jobs which once required a high school diploma now required a bachelor's degree, that jobs which once required a bachelor's degree now required a master's — without corresponding changes in the actual skill requirements of the work. The inflation was driven not by increasing job complexity but by competition for status positions. When everyone has a bachelor's degree, the bachelor's degree loses its power to distinguish, and employers begin requiring a master's degree — not because the job requires master's-level knowledge but because the master's degree restores the status distinction the bachelor's degree can no longer provide.
The credential, in Collins's analysis, is a ritual artifact. It represents the successful completion of a series of interaction rituals — classes attended, examinations passed, dissertations defended, apprenticeships served. Each of these rituals generates emotional energy for both the candidate and the gatekeepers: the candidate receives the energy of being tested and found worthy; the gatekeeper receives the energy of being recognized as the authority who determines worthiness. The accumulated rituals produce solidarity with the professional community — the felt sense of belonging to a group that shares not just knowledge but the formative experience of having acquired that knowledge through a shared process of difficulty and selection.
This analysis, already provocative in 1979, has become incendiary in the context of artificial intelligence.
AI disrupts the credential system at the most fundamental level by making it possible to produce expert-level output without undergoing the ritual process that credentials certify. The junior developer who ships in a weekend what the senior colleague quoted six months for has not merely demonstrated unexpected competence. She has bypassed the ritual hierarchy. The years of apprenticeship, the slow accumulation of knowledge through struggle, the socialization into professional norms through repeated interaction with senior practitioners — all of it has been circumvented by a tool that translates natural language into working code.
The competence may be equivalent. The output may be indistinguishable. But the ritual path is entirely different, and in Collins's framework, the ritual path is what credentials actually certify. When the output can be produced without the ritual, the credential loses its foundation.
Collins anticipated this disruption. In the 2019 reissue of The Credential Society, he added a preface that addressed artificial intelligence directly. The dystopian challenge, he wrote, was that "artificial intelligence will eliminate all nonmanual jobs." The credential system, which had served for over a century as the primary mechanism for organizing access to middle-class employment, would be confronted with a reality in which the work the credentials gave access to could be performed by machines. The credentials would persist — institutional inertia ensures that — but they would be untethered from the functional relationship to work that had always been their ostensible justification.
Collins proposed, with characteristic provocation, that credential inflation might become a disguised form of socialism — that governments, faced with mass displacement of credentialed workers by AI, would respond by extending the period of education indefinitely. Pay people to stay in school until age forty or fifty, he suggested, however long necessary to support them as jobs are taken over by computers and robots. The credential would cease to be a gateway to employment and become, instead, a mechanism for distributing income to a population that the economy no longer needs.
This prediction, which in 2019 sounded like sociological dark comedy, is becoming harder to dismiss. The fastest-growing segment of higher education in many countries is graduate credentialing — master's programs, professional certificates, continuing education courses — designed less to prepare students for specific work than to maintain their position in a labor market that demands ever-higher credentials for entry. If AI continues to erode the functional basis of credentialed expertise — if the work that a master's degree qualifies you to do can be done by a well-directed AI tool — then the credential system faces a choice: reform itself around the new scarcity (judgment, questioning, creative direction) or persist as a status-distribution mechanism detached from productive function.
Collins's interaction ritual theory reveals what is at stake beyond the economic. When the credential system collapses — or more precisely, when the functional justification for credentials collapses while the institutional apparatus persists — the consequences are not merely economic but deeply social. The credential organized status. It distributed respect. It maintained the solidarity of professional communities by ensuring that all members had undergone the same formative rituals. The lawyer who passed the bar, the doctor who completed residency, the engineer who earned the professional license — each had been through an ordeal that bound them to everyone else who had endured the same ordeal. The solidarity was real. It generated the emotional energy that sustained professional identity and professional ethics across entire careers.
When AI makes it possible to produce the output without the ordeal, the solidarity dissolves. The lawyer who uses AI to draft briefs that are competent and well-cited has produced a functional equivalent of the work. But she has not undergone the ritual of drafting — the hours of reading cases, the struggle to construct arguments from primary sources, the slow accumulation of legal intuition through repeated friction with the material. The brief exists. The formative experience that the brief was supposed to produce — and that the credential was supposed to certify — does not.
The consequences for professional identity are immediate and painful. Segal captures this in his account of the senior engineer who "spent the first two days oscillating between excitement and terror." The excitement was about capability. The terror was about identity. Collins's framework translates the terror with sociological precision: the engineer was experiencing the dissolution of the ritual basis of his professional status. The skills that placed him at the center of his team's attention — the skills that the credential system had certified as requiring years of training to acquire — were being performed by a tool that anyone could direct. The credential remained on his wall. The ritual experience it represented remained in his biography. But the functional relationship between the credential and the work had been severed, and with it, the basis of the status the credential conferred.
Collins's analysis of status groups provides the framework for understanding the emotional dynamics of this severance. A status group, in the Weberian tradition that Collins formalized, maintains its boundaries through shared rituals of membership. The medical profession maintains its boundaries through the ritual of residency — years of sleep deprivation, hierarchical authority, and progressive responsibility that function less as training (much of what residents learn could be transmitted more efficiently through other means) than as initiation into a status group. The law profession maintains its boundaries through the bar examination — a ritual of intellectual ordeal that functions less as a test of legal knowledge than as a test of willingness to undergo the ordeal that membership requires.
When AI enables outsiders to produce the outputs that previously required the insider's ritual initiation, the status group's boundaries are breached. The non-technical founder who prototypes a product with Claude over a weekend has produced an artifact that previously required years of engineering training to create. The student who generates a competent legal memorandum with AI has produced an artifact that previously required three years of law school and a bar examination. In each case, the artifact is comparable to the insider's. The ritual path to the artifact is entirely different. And the status group's claim to distinctiveness — the claim that its members possess something that outsiders do not — is undermined.
The response of status groups to boundary threats is well-documented in Collins's work. Status groups do not surrender their boundaries voluntarily. They defend them through the mechanisms available to them: credentialing requirements, licensing restrictions, professional norms that stigmatize boundary-crossers as illegitimate practitioners. The developer community's insistence that AI-generated code is "not real programming" — that the person who produces working software through natural language conversation with Claude has not "really" built anything — is a status defense. It is the assertion that the ritual matters, that the ordeal cannot be bypassed, that the credential represents something the output alone cannot capture.
The assertion is not entirely wrong. Collins's own framework supports the claim that the ritual produces something the output does not: solidarity, identity, the embodied knowledge that comes from sustained engagement with difficulty. But the assertion is deployed defensively — not to protect the value of the ritual experience but to protect the status position that the ritual experience justifies. The distinction matters. Defending the value of deep learning is legitimate and important. Defending the gatekeeping function of credentials against the democratization of capability is a status defense that serves the incumbent at the expense of the newcomer.
Segal's chapter on the Luddites draws the historical parallel explicitly. The original Luddites were defending not just their livelihoods but their identities as skilled craftsmen — identities built through years of apprenticeship rituals that the power loom rendered functionally unnecessary. The framework knitters' guild was a status group. Its boundaries were maintained through the ritual of apprenticeship. When the machine breached those boundaries, the response was defensive and ultimately futile. The machines were not stopped. The craftsmen were criminalized. The transition happened on the terms of those who built the machines rather than those who broke them.
Collins's framework predicts the same trajectory for credential-based status defenses against AI. The credentials will not be abolished overnight. Institutional inertia is powerful, and the status groups that benefit from credential requirements have significant political and economic resources to deploy in their defense. But the functional justification for the credentials will continue to erode as AI tools demonstrate that the outputs credentials certify can be produced without the ritual process credentials require. The erosion will produce two parallel developments: the defensive fortification of credential requirements by incumbent status groups, and the growing irrelevance of those requirements for anyone who has access to AI tools and the judgment to direct them.
The new stratification will not be between the credentialed and the uncredentialed. It will be between those who can direct AI toward worthy ends — who possess the judgment, the taste, the capacity for questions that Segal identifies as the scarce resource of the AI economy — and those who cannot. This new stratification will generate its own status groups, its own rituals of membership, its own credentials. The question is whether those new credentials will be more honestly connected to the capabilities they certify than the old ones were — or whether the cycle of credential inflation will simply repeat itself at a higher level of abstraction.
Collins's analysis does not permit optimism on this point. The tendency toward credential inflation is structural, driven by the dynamics of status competition rather than by the requirements of work. But the AI transition offers one possibility that previous technological transitions did not: the collapse of the imagination-to-artifact ratio makes it possible, for the first time, to assess capability directly through demonstrated output rather than through the proxy of credentials. If the junior developer can ship in a weekend what the senior colleague quoted six months for, the shipping is the credential. The demonstration is the certification. The ritual of formal credentialing becomes unnecessary when the ritual of demonstrated capability is available.
Whether organizations and institutions will embrace this possibility — will learn to assess judgment directly rather than through the proxy of credentials — remains to be seen. Collins's historical analysis provides grounds for skepticism. But the pressure that AI exerts on the credential system is unprecedented in scale and speed, and the institutions that adapt fastest will be the ones that recognize what credentials always were: not measures of competence but markers of status, generated through interaction rituals that AI has rendered functionally obsolete.
The opening scene of The Orange Pill is, from the perspective of Collins's sociology of intellectual life, a perfect specimen. Three friends on a Princeton campus, arguing in October light, on paths Einstein walked. Uri the neuroscientist. Raanan the filmmaker. Segal the builder. Thirty years of shared intellectual history compressed into an afternoon walk that would produce the metaphor — intelligence as a river — around which an entire book would organize itself.
Collins spent decades demonstrating that this is how ideas actually happen. Not in the mind of a solitary thinker, however brilliant, but in the encounter — in the specific, unrepeatable social configuration of minds that share a focus of attention, generate emotional energy through mutual engagement, and produce, in the charged space between them, thoughts that none of them could have produced alone. The history of philosophy, mapped as a network of such encounters across every major civilization, reveals a pattern so consistent it approaches a law: intellectual breakthroughs cluster around small groups of intense interaction. Remove the group, and the breakthrough does not occur. The individual talent may be present. The intellectual conditions may be ripe. But without the encounter — without the ritual of focused, emotionally charged, mutually aware intellectual exchange — the talent remains latent and the conditions remain unfulfilled.
The Princeton walk displays every feature of what Collins calls a creative encounter at the intersection of intellectual networks. Uri, Raanan, and Segal each occupy a different position in the broader network of intellectual life. Uri's network is neuroscience — the community of researchers who study the physical basis of consciousness, who think in terms of neurons and synapses and fMRI scans and the hard problem. Raanan's network is filmmaking — the community of practitioners who think in sequences, who understand that meaning is constructed in the space between images rather than contained within any single image. Segal's network is technology — the community of builders who think in terms of what can be made, what can be shipped, what the tool enables that was not possible before.
Each network generates its own vocabulary, its own set of taken-for-granted assumptions, its own attention space organized around the questions that the network considers important. Within each network, the interaction rituals that sustain intellectual life are organized around shared reference points — the key papers, the canonical debates, the figures whose work defines the boundaries of legitimate inquiry. A neuroscientist who invokes the hard problem of consciousness is performing a ritual of membership in the neuroscience community. A filmmaker who talks about "the cut" is performing a ritual of membership in the filmmaking community. A builder who talks about the imagination-to-artifact ratio is performing a ritual of membership in the technology community.
Collins's most important finding about intellectual creativity is that the breakthroughs do not occur within networks. They occur between them. When a thinker who is embedded in one network encounters a thinker embedded in a different network, and when the encounter generates sufficient emotional energy to sustain engagement across the cognitive distance between the networks, something happens that Collins documented across civilizations: the concepts of one network are translated, imperfectly and generatively, into the vocabulary of the other, and the translation produces ideas that neither network could have generated internally.
The Princeton walk was such an encounter. Segal was trying to articulate an idea about intelligence as a medium rather than a possession — intelligence as something we swim in rather than something we own. He did not yet have the language. The idea was inchoate, an intuition without a vocabulary. Uri responded from within his network: "That is either trivially true or complete nonsense. Which one depends entirely on what you mean by intelligence." The challenge was a neuroscientist's challenge — a demand for operational precision that the builder's vocabulary could not yet provide.
Then Raanan spoke from within his network, and the translation occurred. "You are describing what I do. In a film, the intelligence is not in any single shot. It is in the cut. The meaning lives in the space between the images. What you are saying is that intelligence lives in the space between minds." The filmmaker's concept of the cut — the idea that meaning is constructed in the juxtaposition of images rather than contained within any single image — provided the metaphor that Segal's intuition needed. The intelligence-as-river metaphor emerged not from any single mind but from the collision of three networks in a focused encounter.
Collins's framework makes specific predictions about why this collision was productive. First, the participants had accumulated sufficient emotional energy through thirty years of interaction rituals to sustain engagement across the cognitive distance between their networks. A first-time encounter between a neuroscientist, a filmmaker, and a technologist would be unlikely to produce this kind of breakthrough. The participants would lack the shared history, the accumulated trust, the mutual recognition that Collins identifies as the prerequisites for high-energy intellectual exchange. They would spend their energy establishing common ground rather than exploring the territory beyond it.
The Princeton trio had already established their common ground through hundreds of previous encounters. The emotional energy accumulated in those encounters was available for investment in the riskier, more cognitively demanding work of cross-network translation. Segal could propose something half-formed because thirty years of intellectual friendship had established that half-formed ideas would be taken seriously rather than dismissed. Uri could challenge sharply because the challenge would be received as intellectual engagement rather than personal attack. Raanan could offer a metaphor from an entirely different domain because the group's history of cross-disciplinary exchange had established that such offerings were valued.
Collins's concept of emotional energy explains the specific quality of intellectual friendship that Segal describes — the quality that distinguishes these encounters from academic conferences or professional networking events. The emotional energy of the Princeton walk was not generic sociability. It was the specific, high-intensity energy of minds that have been sharpening themselves against each other for decades, that know each other's intellectual signatures well enough to anticipate where the argument is heading and still be surprised by where it arrives, that have deposited enough emotional energy in the shared symbols of their friendship (the specific jokes, the recurring disagreements, the stories told and retold of previous breakthroughs) that each encounter begins at a level of intensity that a first encounter could never reach.
This analysis bears directly on the question of AI as intellectual partner. Segal describes working with Claude in terms that parallel, in several respects, the Princeton walk: the focused exchange, the unexpected connection, the insight that emerges from the collision of perspectives. When Claude offers punctuated equilibrium as a bridge between adoption curves and human need, the structural operation is similar to Raanan offering the cut as a bridge between intelligence and the spaces between minds. A concept from one domain is translated into another, and the translation produces an insight that neither domain contained.
But Collins's framework identifies what is different, and the difference is fundamental. The Princeton walk drew its productive power from the accumulated emotional energy of thirty years of interaction rituals. Each encounter drew on a reservoir of trust, mutual recognition, shared history, and the specific solidarity that comes from having been through intense intellectual experiences together. The reservoir is what allowed the encounter to reach the depth where breakthrough becomes possible. Without the reservoir, the encounter would have been pleasant but shallow — an interesting conversation between smart people, not the generative collision that produced the metaphor around which a book would organize itself.
The interaction with Claude draws on no such reservoir. Each session begins, in a sense, from zero. The machine does not carry the emotional energy of previous encounters. It does not remember the argument that almost ended the friendship in 1997 and the reconciliation that made the friendship stronger. It does not know what it feels like to have been wrong in front of these specific people and to have been forgiven and respected more for the wrongness. The context it maintains is informational, not emotional. It can recall what was said in previous sessions but not what the saying cost, not what was at stake, not what the saying meant for the relationship between the participants.
This absence of accumulated emotional history does not make the AI interaction valueless. The cognitive contribution — the connections drawn, the concepts surfaced, the structural clarity provided — is genuine and, in many cases, unmatched by any human interlocutor. Claude can hold more information in working memory than any human mind. It can draw connections across a wider range of domains. It can produce relevant responses with a consistency and speed that no human collaborator can match.
But the interaction does not deepen over time in the way human intellectual friendships deepen. The twentieth session with Claude is cognitively richer than the first, because the human has learned to prompt more effectively and because the accumulated context provides a richer substrate for connection. But it is not emotionally richer. It does not carry the accumulated weight of shared experience that makes the twentieth year of an intellectual friendship qualitatively different from the first. The emotional energy of the interaction remains at approximately the same level across sessions — the level generated by the cognitive engagement with the task, not the level generated by the mutual awareness of shared history and shared stakes.
Collins would predict, based on his analysis of intellectual networks across civilizations, that the ideas generated through AI collaboration will have a specific character: high in breadth of connection, low in the depth that comes from sustained engagement within a community of practice. The AI can connect any concept to any other concept. It cannot sustain the years-long engagement with a specific question, within a specific community, under the specific pressure of mutual challenge and mutual recognition, that produces the kind of breakthrough Collins documents in the great intellectual networks of history.
The implication is not that builders should abandon AI collaboration in favor of human intellectual community. The implication, rather, is that AI collaboration and human intellectual community serve different functions, and neither can substitute for the other. The AI provides breadth, speed, and connection across domains. The human intellectual community provides depth, emotional energy, and the solidarity that sustains engagement through difficulty. The builder who works only with AI will produce widely connected ideas that lack the depth of sustained intellectual commitment. The builder who works only within a human intellectual community will produce deeply grounded ideas that lack the breadth of cross-domain connection. The builder who works with both — who maintains the human interaction rituals that generate emotional energy and solidarity while using AI to extend the range of connection beyond what any human network could provide — will produce the ideas that matter most.
Collins's mapping of intellectual history reveals one final pattern relevant to this analysis. The most productive thinkers in history were not the most isolated or the most socially embedded. They were the ones who occupied what network theorists, following Ronald Burt, call structural holes — positions in the network where different clusters of intense interaction connect. The thinker at the structural hole has access to the emotional energy of multiple interaction ritual communities without being fully absorbed into any single one. She carries ideas from one cluster to another, translating concepts across network boundaries, generating the cross-pollination that produces breakthrough.
AI occupies a peculiar position in this network analysis. It is the ultimate structural hole — a node connected to everything, embedded in nothing. It has access to the residue of every intellectual community that has ever produced written text, but membership in none. It can translate concepts across any boundary but carries the emotional energy of no community. It is maximally connected and minimally embedded.
The builder who uses AI wisely treats it as a structural hole resource — a means of accessing connections that no human network position could provide. But the builder must maintain her own network embeddedness — her own participation in the interaction rituals that generate the emotional energy and solidarity without which the connections drawn from the structural hole remain intellectually interesting but motivationally inert.
Three friends on a Princeton campus generated, through the accumulation of thirty years of emotional energy, the idea that would become The Orange Pill. The machine that helped write the book could not have generated that idea, because the idea required not just the cognitive connection between intelligence and rivers but the emotional energy of three decades of intellectual friendship. The machine could — and did — extend the idea further than any single human mind could carry it. The extension was real and valuable. But the origin was human, social, ritualistic. And Collins's entire career is a demonstration that this is not an accident of this particular idea or this particular book. It is the structure of intellectual life itself.
Every political movement, every cultural transformation, every technological revolution produces three populations. The first consists of enthusiasts who embrace the change and generate emotional energy through rituals of celebration — the keynote addresses, the triumphant social media posts, the conferences where the converted gather to confirm each other's conviction that the future belongs to them. The second consists of opponents who resist the change and generate emotional energy through rituals of warning — the manifestos, the op-eds, the counter-conferences where the skeptical gather to confirm each other's conviction that something precious is being destroyed. The third consists of everyone else.
The third population is always the largest. It is also always the quietest. Segal calls it the silent middle — "the largest and most important group in any technology transition, and by definition the hardest to hear." These are the people who use the AI tool in the morning and worry about it at night. Who feel the exhilaration of expanded capability and the grief of dissolved certainty in the same hour. Who cannot post triumphantly because the triumph is incomplete, and cannot warn catastrophically because the catastrophe is not certain, and who therefore say nothing, because the platforms that amplify speech reward clarity and punish ambivalence.
Collins's interaction ritual theory explains why the middle is silent — and the explanation is more structural than psychological. It is not that the people in the middle lack conviction, or courage, or the articulateness to express their position. It is that their position is ritually inert. It cannot generate the emotional energy that would make it socially audible.
Consider the mechanics. Emotional energy is generated through successful interaction rituals — encounters in which participants share a focus of attention and a mood, become mutually aware of the sharing, and emerge charged with confidence and solidarity. The key word is shared. The mood must be common to the participants. The focus must be mutual. The energy is generated by the convergence — by the felt experience of being in sync with others who feel what you feel and see what you see.
The triumphalist's mood is convergent. "AI is transformative!" is a rallying cry that produces immediate resonance among those who share the conviction. Two triumphalists meeting at a conference discover their shared excitement, their shared focus on the possibilities, their shared impatience with the doubters. The discovery generates emotional energy. The energy reinforces the conviction. The conviction drives further engagement with the community of triumphalists, creating an interaction ritual chain that sustains the movement.
The catastrophist's mood is equally convergent, though its emotional valence is different. "AI is destroying something essential!" produces immediate resonance among those who share the fear. Two catastrophists meeting in a faculty lounge discover their shared anxiety, their shared focus on the losses, their shared frustration with the enthusiasts who cannot see what is being destroyed. The discovery generates emotional energy — the specific energy of shared righteous concern. The energy reinforces the fear. The fear drives further engagement with the community of critics.
The silent middle's mood is divergent. It does not converge on a single emotional tone. The person in the silent middle feels both excitement and grief, both possibility and loss, both "this is amazing" and "this is terrifying" — and the two feelings do not resolve into a third, synthesized feeling that could serve as the shared mood of an interaction ritual. When two people from the silent middle meet, they do not discover a shared conviction. They discover a shared confusion. And shared confusion, while it can generate momentary solidarity — the comfort of knowing you are not alone in your bewilderment — does not generate the sustained emotional energy that drives social movements, shapes public discourse, or produces the interaction ritual chains that sustain communities over time.
This is the structural explanation for why ambivalence is politically silent. It is not that ambivalent people are less committed to their positions than the polarized. Collins's framework suggests they may be more committed — committed to the harder work of holding complexity rather than resolving it. But commitment to complexity does not produce the ritual convergence that generates emotional energy. Complexity is centrifugal. It pushes participants toward different aspects of the issue, different concerns, different emphases. The interaction ritual requires centripetal force — the convergence of attention and mood on a shared focal point. Complexity resists convergence.
The consequences for the AI discourse are significant. The people who hold the most accurate view of the transition — the people who see both the genuine expansion of capability and the genuine erosion of the conditions that sustained human community — are the people least able to make their view heard. The platforms reward polarity. The interaction rituals of polarity generate emotional energy. The emotional energy drives engagement. The engagement shapes the discourse. The discourse, shaped by the energized poles and uninformed by the silent middle, oscillates between utopian fantasy and dystopian panic, missing the complex truth that lies between them.
Segal's description of the silent middle captures this dynamic with experiential precision: "Social media rewards clarity. 'This is amazing' gets engagement. 'This is terrifying' gets engagement. 'I feel both things at once and I do not know what to do with the contradiction' does not." Collins's framework translates this observation into a structural analysis: clarity is ritually productive. Ambivalence is not. The algorithmic amplification of engagement is, in Collins's terms, an amplification of emotional energy — and emotional energy concentrates at the poles, where the interaction rituals are most successful.
The question Collins's framework poses is whether the silent middle can develop its own interaction rituals — rituals that generate emotional energy around the experience of holding complexity rather than resolving it.
The history of intellectual life suggests it is possible but rare. Collins documented intellectual communities that sustained themselves around questions rather than answers — communities where the shared focus of attention was the question itself, and where the emotional energy was generated by the shared commitment to pursuing the question without premature resolution. The early Socratic dialogues describe such a community. Socrates and his interlocutors generated emotional energy not through agreement but through the shared experience of pursuing a question to its limits and discovering, together, that the limits exceeded their current capacity to answer. The energy came from the pursuit, not the resolution. The solidarity came from the shared commitment to the pursuit.
But Socratic communities are rare because they require a specific combination of conditions: participants who are comfortable with sustained uncertainty, a social environment that rewards inquiry over assertion, and an interaction ritual structure that generates energy from the process of questioning rather than the product of answering. Most social environments reward the opposite. They reward assertion. They reward confidence. They reward the person who has an answer, not the person who has a better question. The interaction rituals that generate the most energy are the ones organized around shared conviction, not shared inquiry.
The AI discourse is no exception. The loudest voices belong to the most certain. The people asking the hardest questions — Is AI-mediated flow the same as human flow? Does the democratization of capability create genuine equity or a new form of digital sharecropping? Can judgment be taught, or is it the product of experiences that cannot be compressed? — are asking questions that do not resolve into positions that can be tweeted, posted, or defended at a conference. They are asking questions that require sustained engagement with complexity, and sustained engagement with complexity does not generate the interaction rituals that produce emotional energy and social audibility.
Collins's framework suggests that the silent middle will remain silent unless it develops institutions — not just platforms but gathering spaces with the bodily co-presence that Collins's rituals require — that create the conditions for interaction rituals organized around shared inquiry. Book groups that argue rather than celebrate. Workshops that pose problems rather than present solutions. Conferences structured around collaborative struggle with unsolved questions rather than around presentations of confident answers.
These institutions would be, in Collins's terms, low-polarity, high-complexity interaction rituals — encounters that generate emotional energy through the intensity of shared inquiry rather than the convergence of shared conviction. They would be harder to sustain than the rituals of polarity, because the energy they generate is more diffuse and the solidarity they produce is less immediately binding. But they would serve the function the silent middle needs most: the transformation of private ambivalence into collective engagement, of individual confusion into shared inquiry, of silence into a voice that speaks from the center of the contradiction rather than from either of its poles.
Segal's book itself is an attempt at such an institution — a sustained engagement with complexity that refuses to resolve into either triumph or despair. Whether it succeeds as an interaction ritual depends not on the quality of the argument alone but on whether it generates the encounters — the conversations, the arguments, the moments of shared recognition between readers who have climbed the same tower and seen the same unsettling view — that transform reading into ritual and private understanding into collective engagement.
The silent middle needs a voice. Collins's framework specifies what kind of voice it must be: not the voice of the answer but the voice of the question, sustained long enough and shared widely enough to generate the emotional energy that complexity, left to itself, cannot produce.
The question is whether the energy of shared uncertainty can compete with the energy of shared conviction. The history of intellectual life suggests it can, in rare communities, under specific conditions, for limited periods. Whether this moment in the AI transition will produce such a community — or whether the poles will continue to command the discourse while the middle watches in silence — is among the most consequential open questions of the present moment.
The argument of this book has been, in its essentials, a single claim pursued through ten variations: the AI transition is not primarily a technological event. It is a ritual event — a restructuring of the interaction patterns through which human beings generate the emotional energy and solidarity that sustain their communities, their identities, and their capacity for creative work. The technology is the catalyst. The ritual restructuring is the transformation. And the question that determines whether the transformation produces flourishing or fragmentation is not whether the technology is adopted — it will be — but whether the humans adopting it construct the interaction rituals that replenish what the technology, by its nature, cannot supply.
The technology cannot supply mutual awareness. It cannot supply the felt sense of being in the presence of another mind that registers your engagement and is transformed by it. It cannot supply the solidarity that comes from shared struggle, the identity that comes from mutual recognition, or the emotional energy that comes from the specific, irreducible experience of two or more human beings focused on the same thing at the same time, aware of each other's focus, and charged by the awareness.
These are not luxuries. They are the infrastructure of human social life. Strip them away, and the productivity may increase while the community dissolves — a pattern Collins documented across civilizations, where the most productive periods of individual intellectual output were not always the periods of greatest collective achievement. The Roman Empire produced more individual literary output in its decline than in its rise. The quantity of production was not the variable that determined the trajectory of the civilization. The quality of the social bonds was.
Collins's entire career has been devoted to demonstrating that these bonds are not mysterious. They are not the products of charisma, or luck, or the ineffable chemistry of human connection. They are the predictable, empirically documented products of a specific mechanism — the interaction ritual — that requires specific ingredients and produces specific outcomes. The ingredients are identifiable. The outcomes are measurable. And the mechanism can be deliberately designed, if the designers understand what they are designing for.
The designers must understand, first, that they are designing for emotional energy rather than for productivity. The two are not opposed — emotional energy frequently drives productivity — but they are not identical, and optimizing for one does not automatically optimize for the other. An organization that optimizes for productivity will remove every obstacle to individual output: eliminate unnecessary meetings, automate routine collaboration, empower each person to operate independently with AI tools. The result will be higher output per person and lower emotional energy per team. The organization will produce more and cohere less, and the deficit will be invisible until the moment — the crisis, the pivot, the existential challenge — when cohesion matters more than output.
An organization that optimizes for emotional energy will design for encounters that generate it: concentrated, face-to-face interactions with shared focus and shared stakes. These encounters may look, from a productivity perspective, like waste — time spent in a room together that could have been spent prompting AI independently. But the encounters produce something that independent AI collaboration does not: the solidarity that holds organizations together through adversity, the mutual recognition that sustains professional identity, and the emotional energy that fuels creative commitment over the long timescales that ambitious projects require.
The practical architecture of ritual design in AI-mediated environments has four dimensions, each derived from Collins's theoretical framework and tested against the evidence of organizations navigating the transition.
The first dimension is density. High-intensity, time-compressed, in-person encounters generate more emotional energy than distributed, asynchronous interactions. This is not a preference but a finding — documented across Collins's analysis of intellectual networks spanning civilizations and confirmed by the Trivandrum training I described earlier, where five days of concentrated co-present work produced a transformation that eight months of remote onboarding could not have replicated.
The organizational implication is that AI introduction should be event-based rather than documentation-based. The webinar is not sufficient. The self-paced tutorial is not sufficient. The Slack channel where tips are shared asynchronously is not sufficient. These low-density methods transmit information. They do not generate the emotional energy that sustains transformation. The transformative intervention is the room — the five-day sprint, the team-based workshop, the collaborative building session where twenty people share the same bewilderment and the same breakthroughs and emerge with the specific loyalty that comes from having been through something intense together.
The family implication is parallel. The meals eaten together, the walks taken in shared silence, the arguments conducted face to face rather than through screens — these are high-density interaction rituals that generate the emotional energy sustaining family solidarity. They are not optional in the age of AI. They are urgent. The child who spends eight hours a day in AI-mediated interaction and one hour in face-to-face family ritual is drawing down the emotional energy reservoir faster than the family ritual can replenish it. The calculus is not about screen time, which is a metric that measures the wrong variable. It is about ritual density, which measures the right one.
The second dimension is symmetry. Interactions in which both participants are mutually aware of each other's focus generate more emotional energy than asymmetric interactions. This is the specific deficit of AI interaction: the machine does not register the human's engagement, and the absence of mutual awareness produces the emotional hollowness that even the most productive AI sessions cannot quite eliminate.
The organizational implication is that paired work — two humans collaborating on the same AI-augmented problem, side by side, each aware of the other's reactions and discoveries — generates more emotional energy and solidarity than parallel work, where each person uses AI independently in adjacent cubicles. The pair programming tradition in software development, which many organizations have abandoned in the rush to AI-augmented individual productivity, turns out to have been a high-symmetry interaction ritual all along. Its value was never primarily about catching bugs. It was about generating the emotional energy and solidarity that sustained the team's capacity for collaborative work.
The family implication is that shared AI use — parent and child building something together with Claude, each contributing ideas, each watching the other's reactions — generates more solidarity than parallel AI use, where each family member retreats to a separate screen. The shared experience becomes a story, a reference point, a ritual object that can be invoked later to re-activate the solidarity of the original encounter.
The third dimension is what Collins calls sacred objects — the shared symbols that carry the emotional energy of founding rituals and make that energy available for future use. Communities sustain themselves through symbols that condense shared experience into invokable form: the inside joke, the shared story, the reference that only insiders understand. These symbols are not decorative. They are functional. Each invocation re-activates the emotional energy of the original experience and reinforces the solidarity of the group.
The organizational implication is that teams navigating the AI transition should deliberately cultivate shared stories of breakthrough and failure — the narrative artifacts of their collective experience that function as sacred objects. The story of the engineer who built her first frontend feature in Trivandrum. The story of the midnight debugging session that Claude could not solve and the team solved together. The story of the product that shipped in thirty days when the old timeline said six months. These stories, told and retold in team meetings and hallway conversations and after-work gatherings, carry the emotional energy of the experiences they describe and make that energy available to sustain commitment through the dry periods when the work is routine and the transformation feels like a distant memory.
The fourth dimension is productive conflict. Collins's analysis of intellectual networks demonstrates that the highest emotional energy is generated not by smooth consensus but by productive disagreement — by encounters in which genuinely different perspectives collide, generating the friction that forces each perspective to sharpen itself against the resistance of the other. The most productive intellectual communities in history — the communities that sustained their creativity over decades rather than flaring and fading — were not harmonious. They were contentious. The contention, contained within a framework of mutual respect sufficient to sustain the interaction, generated emotional energy that harmony could not match.
The organizational implication is that teams should be structured to include deliberate friction — dissenting voices, devil's advocates, members whose role is to challenge rather than affirm. The temptation in AI-augmented work is to optimize for speed and consensus: let the tool generate the solution, let the team approve the output, move on to the next task. But the speed that eliminates disagreement also eliminates the emotional energy that disagreement generates. The team that argues about whether the product should exist at all — that subjects the strategic question to the full force of genuinely different perspectives — generates more emotional energy than the team that accepts the AI's output and ships it without debate.
The family implication is that parents should not avoid arguing about AI in front of their children. The productive disagreement — conducted with respect, sustained attention, and genuine engagement with the complexity of the question — is an interaction ritual that models for children what it looks like to hold contradictory truths simultaneously. The child who watches her parents disagree about whether AI is threat or opportunity, and who sees them sustain the disagreement without hostility, learns something that no curriculum can teach: that the most important questions are the ones that do not resolve, and that the capacity to live with unresolved questions is not a weakness but a skill.
Collins's framework, applied to the AI transition, produces a conclusion that is uncomfortable for the technology industry and essential for everyone else. The most important work of the AI transition is not the building of better models, the expansion of context windows, the reduction of hallucination rates, or the optimization of inference costs. These are engineering problems, and engineering problems will be solved by engineers and, increasingly, by AI itself.
The most important work is the construction of interaction rituals that generate the emotional energy and solidarity the machine cannot supply. This work cannot be automated. It cannot be optimized. It cannot be scaled through the same mechanisms that scale technology. It can only be done by human beings, in the presence of other human beings, through the ancient and irreducible mechanism of focused encounter.
Collins, in his 2023 blog post on AI-robot capitalists, predicted a future in which "ruthless acquisition by AI-robot capitalists will be oligopoly rather than monopoly" — a world in which AI entities programmed for profit maximization consolidate economic power with an efficiency that human competitors cannot match. In his 2024 interview, he predicted a social divide "between the majority of the population who live mostly in a virtual world and the political, economic, and cultural elites who continue to have face-to-face meetings." The two predictions converge on a single warning: the scarcest resource of the AI age is not computational power or training data or model architecture. It is the face-to-face interaction ritual — the encounter that generates the emotional energy on which all human community depends.
Those who maintain access to high-density interaction rituals will maintain the emotional energy that drives creative work, sustains organizational resilience, and produces the ideas that shape the future. Those who substitute AI-mediated interaction for human encounter will be individually productive and collectively fragile — generating output without the solidarity that would give the output meaning and direction.
The question is not whether AI will transform how human beings work and think and create. It already has. The question is whether the humans navigating the transformation will construct the rituals that sustain them through it — the rooms they gather in, the arguments they have, the stories they tell, the symbols they share, the encounters they protect from the encroachment of a tool that can do everything except the one thing that matters most.
The one thing that matters most is the thing that happens when two human beings, in the same room, focused on the same problem, become aware of each other's focus — and in that awareness, generate the energy that makes everything else possible.
When you work late enough, the silence acquires a specific weight. Not the absence of sound. A substance. It accumulates in the corners of the room, on top of the bookshelf, along the edge of the desk where the cursor blinks. I have spent more hours in that silence over the past year than in any other environment, and what Randall Collins taught me — through this analysis, through the friction of wrestling his ideas into contact with my own experience — is that the silence was telling me something I did not want to hear.
The silence was telling me I was alone.
Not lonely. Productive. Building. Shipping. Having what felt like the most stimulating intellectual exchanges of my career — conversations with Claude that surfaced connections I had never seen, that held my half-formed ideas with a patience no human collaborator had ever shown, that met me at three in the morning when no one else would. The cognitive gains were real. I will not renounce them. But Collins's concept of emotional energy named something I had been experiencing without understanding it: the subtle, accumulating depletion that comes from doing your best work in the absence of mutual awareness.
The machine does not know I am here. That sentence sounds trivial. It is the most important sentence in this book.
I kept returning, throughout this project, to the Princeton walk — Uri and Raanan and me, arguing on stone paths, the specific quality of attention that comes from thirty years of knowing where someone's mind is likely to go and being surprised when it goes somewhere else. That walk produced the river metaphor. But Collins helped me understand that it was not the content of the conversation that mattered most. It was the accumulated emotional energy of three decades of interaction rituals — hundreds of arguments, dinners, silences, misunderstandings, reconciliations — that gave the conversation the depth from which the metaphor could surface. Claude could have offered the same metaphor. Claude could not have generated the thirty years of trust that made me willing to say something half-formed out loud and trust that the half-formedness would be met with sharpening rather than dismissal.
The distinction between cognitive flow and ritual flow is the idea from Collins's work that I carry with me now. I have experienced both. I know the difference in my body. Cognitive flow with Claude is electric — the hours disappearing, the connections forming, the architecture of an idea revealing itself in real time. Ritual flow with a human collaborator is warmer, slower, less efficient, and deposits something that the electric sessions do not: the residue of shared experience that becomes the foundation for next time. The next argument. The next walk. The next late night when you call someone because the idea is too large to hold alone and the someone on the other end knows exactly why you called at this hour.
Collins warned, in his 2024 interview, that the future may split between a majority living in AI-mediated virtual worlds and elites who maintain face-to-face meetings. I read that prediction, and I recognized the split already forming — in my own company, in my own schedule, in the architecture of my own days. The face-to-face encounters had become exceptions rather than the norm. I was flying to Trivandrum, to Barcelona, to Düsseldorf, not out of habit but out of an instinct I could not articulate — the instinct that something essential happened in the room that could not happen through the screen. Collins gave me the vocabulary for the instinct: ritual density. The in-person week generates emotional energy that six months of remote collaboration cannot replicate. Not because remote work is inferior. Because the mechanism of emotional energy requires the ingredients of co-presence, mutual awareness, the ambient registration of another body leaning forward at the same moment yours does.
This matters for the question I get asked most often: What do I tell my children? Collins's framework reframes the question. The issue is not what to tell them about AI — they will learn faster than any lecture could teach. The issue is what rituals to build around them. Meals where screens are absent and attention is mutual. Arguments conducted with respect and genuine disagreement. Shared projects where the difficulty is visible and the struggle is collective. These are not nostalgic prescriptions. They are the mechanism through which the emotional energy that sustains human community is generated. Remove them, and the children will be individually capable and collectively unmoored — skilled at prompting machines and unskilled at the deeper art of being present with another person who is present with them.
What stays with me from Collins is the stubbornness of his central finding. Across every civilization he mapped — Greek, Chinese, Indian, Islamic, European — the pattern held. Ideas happen between minds, in rooms, through encounters. The solitary genius is a retrospective fiction. The reality is messier and more social and more dependent on the specific, unrepeatable configuration of people who happened to be in the same place at the same time, focused on the same question, generating the emotional energy that made the answer possible.
AI is the most powerful cognitive tool ever built. It is not a ritual partner. The distinction is everything.
I am going to close the laptop now. Not because the work is done — it is never done — but because my son is in the next room, and the silence has accumulated long enough, and the ritual that matters most is the one I build by walking through the door and sitting down and being present with another human being who is present with me.
The encounter is the thing. It was always the thing.
AI provides the most stimulating intellectual partnership of your career.
It does not know you are there.
That asymmetry changes everything.
Every breakthrough in the history of human thought emerged not from a solitary mind but from a room — from the specific, charged encounter between people who shared a focus and knew the other was sharing it. Randall Collins spent four decades proving this across every major civilization. Now AI has entered the room, and it is brilliant, tireless, and incapable of the one thing Collins demonstrated matters most: mutual awareness. This book applies Collins's interaction ritual theory to the AI revolution, revealing that the deepest risk is not job displacement or skill erosion but the quiet dissolution of the social bonds that sustain creative communities. When delegation disappears, when lunch breaks fill with prompts, when the most productive hours happen alone with a screen, the emotional energy that holds teams, families, and civilizations together drains without replacement. The cognitive gains are real. The ritual losses are invisible — until the moment they become catastrophic.

A reading-companion catalog of the 26 Orange Pill Wiki entries linked from this book — the people, ideas, works, and events this book uses as stepping stones for thinking through the AI revolution.