By Edo Segal
The voice I kept not hearing was my own engineer's.
Not her code. Not her output metrics. Not the twenty-fold productivity number I cited in boardrooms and wrote about in *The Orange Pill*. Her voice. What she actually felt on that Wednesday in Trivandrum when Claude handled the work her hands used to do and she sat there, palms open on the desk, with nothing to reach for.
I described that week in my book as a transformation. I measured it in multiples. I celebrated it as liberation. And all of that was true — true at the altitude where builders live, where the pattern matters more than the pixel, where you're scanning the horizon for what comes next.
But there was a woman in that room whose experience I compressed into a sentence. She "built a complete user-facing feature in two days." That's what I wrote. That's the data point. What I didn't record was how it felt. Whether the speed thrilled her or unsettled her. Whether she called someone that night to talk about it. Whether she slept well or lay awake recalculating what her career meant now.
I didn't ask. I was too busy building.
Studs Terkel would have asked. He spent sixty years doing almost nothing else — sitting across from people who worked, turning on a tape recorder, and disappearing behind a question so simple it embarrassed academics: *What do you do all day? How does it feel?*
The answers he collected across books like *Working* and *Hard Times* revealed something no productivity metric can touch. Work is not an economic activity. It is where identity gets made. The steelworker who wanted to point at a building downtown and say *I helped make that* wasn't describing a job. He was describing a self. And when the work changes — when the thing your hands used to do is handled by a machine — the self has to be rebuilt from materials that haven't arrived yet.
This is the lens Terkel offers our moment. Not a framework for understanding AI's capabilities. A framework for understanding what AI does to the people who live alongside it. The engineer, the teacher, the designer, the spouse watching from the next room — each carries a testimony that no chart can hold.
I built *The Orange Pill* from the top of a tower. Terkel would have gone downstairs. This book is an attempt to follow him there — to hold the frameworks I believe in while listening for the voices they cannot contain.
The voices are the part that matters most. I am learning that late.
— Edo Segal × Opus 4.6
Studs Terkel (1912–2008) was an American oral historian, author, broadcaster, and Pulitzer Prize–winning chronicler of working life. Born Louis Terkel in New York City and raised in Chicago, he spent six decades interviewing ordinary Americans about their experiences of work, economic hardship, race, war, and aging. His landmark 1974 book *Working: People Talk About What They Do All Day and How They Feel About What They Do* collected testimonies from over a hundred workers — steelworkers, waitresses, gravediggers, firemen, executives, teachers — and revealed work as the primary site where Americans construct identity, seek dignity, and negotiate meaning. His other major oral histories include *Hard Times: An Oral History of the Great Depression* (1970), *"The Good War": An Oral History of World War Two* (1984, Pulitzer Prize for General Nonfiction), *Race: How Blacks and Whites Think and Feel About the American Obsession* (1992), and *Will the Circle Be Unbroken?: Reflections on Death, Rebirth, and Hunger for a Faith* (2001). Terkel's method — radically simple, rooted in deep listening and the refusal to impose editorial frameworks on testimony — became a model for humanistic inquiry across disciplines. His concept of the "mark" — the evidence, visible or invisible, that a person's labor has changed something in the world — remains one of the most enduring articulations of dignity in work. He hosted the radio program *The Studs Terkel Program* on WFMT Chicago for forty-five years and was awarded the National Humanities Medal in 1997. His work stands as a permanent argument that the most important truths about a society are held by the people least likely to be asked.
In 1974, Studs Terkel sat down with a steelworker named Mike LeFevre in a bar on the South Side of Chicago and asked him what he did all day. LeFevre talked for hours. He talked about the weight of the steel, the heat of the mill, the way his body ached at the end of a shift. He talked about wanting his son to go to college so the boy would never have to work with his hands the way his father did. But then he said something that stopped the conversation cold. He said he wished, sometimes, that he could point to a building downtown and tell his kid: I helped make that. The thing that mattered to Mike LeFevre was not the paycheck, not the benefits, not even the physical suffering. The thing that mattered was the possibility that his work had left a mark on the world that someone else could see.
Terkel understood, in a way that no labor economist of his era fully grasped, that work is not primarily an economic activity. Work is the practice through which human beings construct an identity, establish a relationship with the world, and discover whether they matter. The steelworker was not describing a job. He was describing a self. And the self he described was inseparable from the steel he poured — from the specific, embodied, daily act of turning raw material into something that would stand in the city long after he was dead.
"Most of us have jobs that are too small for our spirit," runs the line Terkel quoted from the editor Nora Watson in his introduction to *Working*. "Jobs are not big enough for people." The sentence has been quoted so often it has acquired the patina of a greeting card, but its meaning is precise and devastating. The problem Terkel identified was not that work was hard. Hard work, his interviews revealed again and again, was often the most satisfying kind. The problem was that the relationship between the worker and the product of the work had been severed — by the assembly line, by the corporate hierarchy, by the division of labor into units so small that no single worker could see the whole of what they were making. The steelworker poured steel. He did not build the building. The secretary typed memos. She did not shape the decisions the memos conveyed. The switchboard operator connected calls. She did not participate in the conversations that the calls made possible.
This severance — between the worker and the meaning of the work — was what Terkel spent his life documenting. Not as a sociologist collecting data, but as a listener recording voices. The voices carried what data could not: the texture of the loss, the specific quality of what it feels like to spend eight hours a day doing something that does not feel like yours.
The artificial intelligence revolution of 2025 and 2026 has produced a new version of this severance, and it operates in precisely the opposite direction from the one Terkel documented. In Terkel's era, the problem was fragmentation — work broken into pieces too small to carry meaning. In the AI era, the problem is compression — work that once required sustained engagement, the kind of engagement through which identity and competence are built, compressed into a conversation with a machine that handles the difficult parts on the worker's behalf.
Terkel's framework illuminates this compression with uncomfortable precision. When he interviewed workers, he was always listening for what he called the "mark" — the evidence, visible or invisible, that a person's labor had changed something in the world. The gravedigger took pride in the straightness of his lines. The stone mason could identify his walls decades later by the particular way he had set the corners. The bookbinder knew her books by their spines. Each of these workers possessed a specific, hard-won competence that the work had deposited in them over years, and the competence was inseparable from the identity. To be a good gravedigger was not merely to have a skill. It was to be a person whose relationship to the earth, to the dead, to the families who would stand over the graves, had been formed through the repetitive, embodied, irreducible act of digging.
What happens to the mark when the machine does the digging?
Segal describes this question in *The Orange Pill* through the experience of his engineers in Trivandrum — the senior developer who spent two days "oscillating between excitement and terror" as Claude Code handled the implementation work that had consumed eighty percent of his career. The excitement was real: the tool worked, the output was good, the speed was extraordinary. But the terror was real too, and the terror was not about unemployment. The terror was about identity. If the implementation work that had defined him could be handled by a tool, what was the remaining twenty percent actually worth? Segal's answer — "everything" — is philosophically sound and practically urgent. The judgment, the architectural instinct, the taste that separates a feature users love from one they tolerate, these capacities turned out to be the part that mattered.
But Terkel's framework asks a different question, one that Segal's answer does not fully address. Terkel would not ask whether the remaining twenty percent is economically valuable. Terkel would ask whether the remaining twenty percent provides the same relationship to the work that the whole hundred percent once did. Whether the engineer who now directs rather than builds feels the same ownership, the same mark, the same identity-forming engagement with the product of his labor. Whether the move from building to directing is experienced as a promotion or as an exile — as ascending to the level where the real decisions happen, or as being removed from the level where the real work lived.
The distinction matters because Terkel's interviews consistently revealed that dignity in work is not a function of the work's economic value. It is a function of the worker's felt relationship to the work. The piano tuner whose testimony appears in *Working* earned modest wages, but his description of listening to a piano, hearing the precise point where a string's pitch met its ideal, and adjusting with a specificity that only his trained ear could register, is one of the most luminous passages in the book. The dignity was in the listening. In the relationship between his ear and the string. In the knowledge that he could hear something no one else in the room could hear, and that this hearing mattered.
When an AI tool can tune a piano — or write code, or draft a legal brief, or compose a marketing strategy — the question is not whether the human worker can be "promoted" to a higher level of abstraction. The question is whether the higher level of abstraction provides the same felt relationship to the work. Whether directing an AI to write code feels, to the engineer, the way writing code felt. Whether reviewing an AI-drafted brief feels, to the lawyer, the way researching and writing the brief felt. Whether the mark survives the ascent.
Terkel would have been skeptical. Not because he was opposed to technology — he was not, and he said so explicitly. "Automation?" one of his interview subjects mused in *Working*. "Depends how it's applied. It frightens me if it puts me out on the street. It doesn't frighten me if it shortens my work week." Terkel included this passage because it captured the conditional, practical, unsentimental way that working people actually think about machines. The question was never whether the machine was good or bad in the abstract. The question was what the machine did to the specific relationship between this worker and this work.
The AI discourse of 2025 and 2026 has largely failed to ask Terkel's question. The discourse operates at the level of productivity metrics, economic forecasts, and philosophical abstractions about the nature of intelligence. These are legitimate and important levels of analysis. Segal's treatment of the imagination-to-artifact ratio, the ascending friction thesis, the argument that AI reveals what was always most valuable about human work — these are genuinely illuminating frameworks. But they are frameworks, and frameworks operate at a distance from the specific, embodied, irreducibly personal experience of the individual worker standing in front of the machine.
"It is about a search, too, for daily meaning as well as daily bread," Terkel wrote, "for recognition as well as cash, for astonishment rather than torpor; in short, for a sort of life rather than a Monday through Friday sort of dying." The sentence describes every AI-augmented worker whose experience Segal documents — the engineer who found the remaining twenty percent was "everything," the designer who went from designing to building end-to-end, the marketing manager who built her own tool. Each of these people was searching for meaning, recognition, astonishment. Each found something. Whether what they found is the same thing, or a substitute for the thing, or something entirely new that requires a new vocabulary to describe — that is the question Terkel's framework insists we ask.
The most haunting concept in Terkel's work, as applied to the present moment, is what he called "the planned obsolescence of people." The phrase appears in the introduction to *Working*, written about the automation of the 1970s, and it has become the single most quoted passage from Terkel's work in the AI discourse of the 2020s. Writers at UnHerd, n+1, and the Red Hook Star-Revue have all reached for it. The phrase captures something that economic analysis alone cannot: the experience of being made unnecessary. Not laid off — that is a specific, time-bound event with specific consequences. Made unnecessary — which is an existential condition, a reclassification of the self from needed to optional, from essential to redundant.
"It is perhaps this fear of being no longer needed in a world of needless things," Terkel wrote, "that most clearly spells out the unnaturalness, the surreality of much of what is called work today." Written in 1974, the sentence described factory workers watching machines take over the assembly line. In 2026, it describes knowledge workers watching AI take over the tasks that constituted their professional identity. The geography of the obsolescence has shifted from the factory floor to the office, from the hand to the mind, from the blue collar to the white. The fear has not changed. The fear is being made needless. The fear is discovering that the mark you spent your life learning to make can be made by something that has never known what it is to have a life.
Segal argues, persuasively, that the human contribution has not been made needless — that it has been relocated upward, to the level of judgment, vision, and the capacity to ask the right questions. Terkel's framework does not contradict this argument. It complicates it, by insisting that the relocation be examined not from the vantage point of the economist or the philosopher but from the vantage point of the person being relocated. The view from above — where judgment lives, where questions are asked, where the remaining twenty percent turns out to be everything — may indeed be expansive and beautiful. But the view from the middle of the relocation, where a person is losing the specific, embodied, identity-forming relationship to the work that made them who they are, is a different view. And it deserves to be recorded with the same seriousness, the same dignity, the same refusal to resolve, that Terkel brought to every conversation he ever had.
The oral historian does not adjudicate between these views. That would require imposing a framework on the testimony, which is precisely what the method refuses to do. What the method does instead is place the views side by side — the exhilaration and the grief, the liberation and the loss, the twenty percent that is everything and the eighty percent that was home — and let the reader hold both. The tension is the truth. The resolution, if it comes, must come from the reader, who carries the testimony forward into a life that no interviewer can predict.
---
Studs Terkel's genius was not analytical. It was procedural. The genius was in the method — in the decision, repeated hundreds of times across a career spanning six decades, to sit down with a person, turn on the tape recorder, ask a simple question, and then do the hardest thing any interviewer can do: disappear.
"What I bring to the interview is respect," Terkel said. "The person recognizes that you respect them because you're listening. Because you're listening, they feel good about talking to you." The statement sounds modest, almost banal, until one considers what it implies about every other form of discourse. If what Terkel brought was listening, then what most discourse brings is something else — argument, analysis, prescription, the imposition of a framework onto the raw material of experience. The framework may be brilliant. The argument may be correct. But it is not listening. And the people whose experience the framework describes can feel the difference.
The AI transition of 2025 and 2026 has produced an enormous volume of discourse and almost no listening. The discourse is sophisticated, ranging from Segal's philosophical treatment of intelligence as a force of nature to Byung-Chul Han's diagnosis of the smoothness society to the Berkeley researchers' empirical documentation of work intensification. The quality of the thinking is high. The quality of the attention to individual voices is low — not because the thinkers are indifferent to individual experience, but because their methods require them to compress individual experience into categories, data points, and illustrative anecdotes that serve the argument.
Terkel's method refuses this compression. When a switchboard operator tells Terkel about the small satisfaction of connecting a long-distance call and hearing the voice on the other end say thank you, Terkel does not use this as evidence for a theory about the dignity of service work. He presents it as a moment in a life. The moment has its own weight, its own specificity, its own irreducible meaning that cannot be extracted and repurposed without destroying the thing that makes it valuable.
This refusal places the Terkel method in direct and productive tension with the analytical architecture of *The Orange Pill*. Segal identifies what he calls "the silent middle" — the largest and most important group in any technology transition, consisting of people who feel both the exhilaration and the loss but remain silent because they lack a clean narrative to offer. "Social media rewards clarity," Segal writes. "'This is amazing' gets engagement. 'This is terrifying' gets engagement. 'I feel both things at once and I do not know what to do with the contradiction' does not." The observation is exact. It is also a description of the problem that Terkel's method was designed to solve.
The silent middle does not need a framework. It needs a listener. It needs someone who will sit down, turn on the recorder, ask the question, and then hold still long enough for the answer to arrive in its full, messy, contradictory, human specificity. The answer will not be clean. It will not resolve into a thesis. It will be a person describing a Tuesday — the proposal drafted with Claude in the morning, the flush of capability, the unsettling realization that the boundary between her thinking and the machine's thinking has become indistinguishable, then her daughter at dinner asking whether college still matters. Tuesday does not have an argument. Tuesday has a texture. And the texture is the data that no study can capture and no philosophy can contain.
Terkel understood that the most important truths about a society are held by the people who have the least access to the mechanisms of public speech. In *Working*, the testimonies that carry the greatest weight are not those of the professionals — the advertising executive, the editor, the professor — whose jobs gave them fluency in the language of public discourse. The testimonies that reveal the most are those of the people who had never been asked before: the parking lot attendant, the washroom attendant, the elevator operator. These workers had spent years accumulating observations about their work that they had never had occasion to articulate, and the act of articulation — prompted by Terkel's simple question and sustained by his willingness to listen — produced insights that no expert could have generated from the outside.
The AI transition is producing new populations of the unasked. The mid-career marketing manager who watches her junior colleagues produce work with AI that would have taken her team weeks. She does not appear on panels. She does not write Substack posts. She sits in meetings and feels the ground shifting and cannot articulate the fear because the fear does not have a name. Is she afraid of losing her job? Not exactly. She is afraid of something subtler — of becoming irrelevant within her own domain, of watching the expertise she spent fifteen years building become a commodity available to anyone with a subscription. The fear is real and it is specific and it is shared by millions of people who will never tweet about it.
The junior employee who was hired six months ago and has already discovered that the tool on her laptop can do the thing she was hired to do. She does not feel liberated. She feels superfluous. The democratization of capability that Segal celebrates — and it is genuinely worth celebrating — looks different from the position of the person whose hiring was predicated on a scarcity that no longer exists. She was valued because she could do something difficult. Now everyone can do something difficult. She is twenty-four years old and she is asking the question that Segal attributes to a twelve-year-old: What am I for?
The team lead who has spent five years building a group of engineers she trusts, whose capabilities she knows intimately, whose growth she has mentored with the patient, sustained attention that only comes from working together through difficulty. She knows she should adopt AI tools. She has read the memos. She has attended the trainings. She understands, intellectually, that the tools will make her team more productive. But she also knows, with the specific knowledge that comes from managing human beings, that the adoption will change the team in ways she cannot predict. The junior engineers will accelerate. The senior engineers will be threatened. The specific culture of mutual reliance that she has built — the culture in which asking for help is not weakness but collaboration, in which teaching is not a distraction from the real work but the most important work there is — will be stressed by a tool that makes asking for help from a colleague less efficient than asking a machine.
She does not want to be a Luddite. She is not afraid of technology. She is afraid of losing something she built with human hands — a team — and she cannot explain this fear in the language the discourse provides, because the discourse does not have a vocabulary for the value of a team that cannot be measured in productivity metrics.
Terkel would recognize each of these people. He spent his career in their company. He knew that the most articulate people in a society are rarely the ones with the most to say, and that the most important testimony about a moment of transformation comes not from its architects or its analysts but from the people living inside it without the luxury of distance.
The Working method — sit down, ask, listen, present without commentary — is not naive. It is a sophisticated response to a specific epistemological problem: that the experience of work, as lived by the worker, is inaccessible to any method that begins with a hypothesis about what that experience should mean. The hypothesis arrives first and the testimony is fitted to it, and in the fitting, something essential is lost — the contradictions, the non sequiturs, the moments when a person says something that surprises even themselves, the silences that carry more meaning than the words that surround them.
When Terkel published *Hard Times*, his oral history of the Great Depression, he placed a banker's testimony alongside a hobo's without editorial reconciliation. The banker described the Depression as a period of necessary correction. The hobo described it as the annihilation of his life. Terkel did not adjudicate. He did not explain that both were right, or that one was more right than the other, or that the truth lay somewhere in between. He placed the voices side by side and let the juxtaposition produce a form of understanding that no single voice could generate — the understanding that a shared experience is experienced in radically different ways, and that the radical difference is not a problem to be solved but a reality to be held.
The AI transition demands the same method. The builder who cannot stop working — whose testimony Segal presents with disarming honesty from his own experience — inhabits the same technological moment as the mid-career professional watching her expertise commoditize. Both are living through the same event. Their experiences are irreconcilable. The builder feels the exhilaration of operating at a new frontier of capability. The professional feels the vertigo of watching the floor drop out from under a career built on skills that the market no longer needs to buy from humans.
No framework can hold both of these experiences simultaneously without compressing one of them. The analytical mind wants to synthesize — to find the higher truth that encompasses both, the way Segal finds it in the ascending friction thesis or the beaver metaphor. The synthesis is valuable. But it is a synthesis, and a synthesis, by definition, smooths the contradictions into coherence. Terkel's method preserves the contradictions. It insists that the builder's exhilaration and the professional's grief coexist without resolution, because that unresolved coexistence is the truth of the moment — the way the unresolved coexistence of the banker's correction and the hobo's annihilation was the truth of the Depression.
This is not a rejection of Segal's frameworks. It is their necessary complement. Segal builds a tower. Terkel furnishes the rooms. Segal provides the architecture of understanding — the five floors, the ascending staircase, the view from the roof. Terkel provides the human material that the architecture describes but cannot contain — the voices that sound different from the inside than they do when fitted into a theory about the nature of intelligence or the future of work. The tower needs both. The architecture without the voices is elegant but empty. The voices without the architecture are overwhelming and directionless. The combination — framework and testimony, theory and experience, the view from the roof and the view from the room — is what the moment demands.
Terkel said late in his life that he was not up on the internet, but he heard that it was "a democratic possibility. People can connect with each other." The hope he expressed was characteristically Terkellian — not about the technology but about the people the technology might serve. Applied to AI, the hope is the same: not that the machine is good or bad, but that it might create the conditions in which people connect with each other, recognize each other, hear each other. The risk is equally Terkellian, captured in his observation that "more and more we are into communications; and less and less into communication." AI exponentially increases communications — more text, more code, more content, more productivity, more output. Whether it increases communication — the genuine encounter between one human being and another, the act of listening that Terkel practiced as a vocation — is the question his method would force upon the moment. Not as a rhetorical gesture, but as a practice. Sit down. Ask. Listen. Let the voice arrive.
---
There is a passage in *Working* where a spot welder named Phil Stallings describes his relationship to the assembly line with a specificity that no industrial engineer could replicate. "I stand in one spot, about a two- or three-feet area, all night," Stallings told Terkel. "The only thing I can say is I am a machine." He was not speaking philosophically. He was describing, with the precision of a man who has stood in the same spot for years, the exact physical and psychological experience of repetitive industrial labor — the way the body adapts, the way the mind drifts, the way the self contracts to fit the space the job allows.
The testimony is painful because it is specific. Stallings does not generalize about alienation. He describes his two- or three-feet area. The specificity is the dignity. It says: this experience is mine, it happened in this body, and I am the only authority on what it felt like.
Terkel's instinct was always to find the worker whose relationship to the work was most embodied — most dependent on the body's specific, hard-won knowledge — because embodied expertise carries the highest stakes when the work changes. The piano tuner whose ear could hear a deviation of a fraction of a hertz. The stone mason whose hands knew, without measurement, whether a wall was plumb. The bookbinder whose fingers could feel the difference between a binding that would hold and one that would fail. These were people whose competence was inseparable from their physical presence in the work. Their knowledge did not live in their heads. It lived in their hands, their ears, their fingertips. It had been deposited there, layer by layer, through thousands of hours of repetition, and it could not be transferred, described, or reproduced by anyone who had not undergone the same deposition.
The senior software engineer of 2025 is, in a sense Terkel would have recognized, a craftsperson whose knowledge lives in the hands. Not literally — typing is not masonry — but the metaphor is structurally precise. A developer who has spent twenty years writing code has built, through sustained engagement with systems that resist, a form of embodied knowledge that manifests as intuition. The ability to look at a codebase and feel that something is wrong before identifying what. The pattern recognition that fires automatically when a particular class of bug appears. The architectural instinct that knows, without conscious reasoning, which design choices will scale and which will fracture under load. This knowledge was not learned from documentation. It was deposited by thousands of hours of friction — of writing code that did not work, reading error messages, hypothesizing, testing, failing, reading, asking, trying again.
Segal describes this deposition beautifully in *The Orange Pill*, using the metaphor of geological layers: "Every hour you spend debugging deposits a thin layer of understanding. The layers accumulate over months and years into something solid, something you can stand on." The metaphor captures the process. Terkel's method would capture the feeling — what it is like to stand on those layers, to feel them under you, to know that you earned them and that they are yours.
When Claude Code arrived and began handling the implementation work that constituted eighty percent of many engineers' daily practice, the geological metaphor acquired a tectonic dimension. The layers were still there. The knowledge had not been erased. But the process that had deposited them — the daily friction of writing, debugging, struggling — had been eliminated. No new layers were being laid down. The foundation was intact, but the sedimentation had stopped.
Terkel's framework suggests that this cessation is experienced not as a technical change but as an identity crisis, because the sedimentation was not merely a byproduct of the work. It was the mechanism through which the worker became who they are. The engineer did not learn to code and then, separately, become an engineer. The learning and the becoming were the same process. The struggle with the code was the formation of the self. When the struggle is removed, the formation stops — and the self that was being formed is left standing on old layers, with no new ground being added beneath its feet.
Segal's senior engineer in Trivandrum discovered that "the remaining twenty percent — the judgment about what to build, the architectural instinct about what would break, the taste that separated a feature users loved from one they tolerated — turned out to be the part that mattered." The discovery is genuine, and it resolves the economic question: the engineer is still valuable, perhaps more valuable than before. But it does not resolve the experiential question — the question Terkel would ask, which is not "Are you still valuable?" but "Do you still feel like you?"
The distinction is critical, because Terkel's interviews consistently revealed that workers' self-assessments do not track with economic valuations. The piano tuner was not well paid. His dignity was not a function of his salary. It was a function of his felt relationship to the work — the intimate, specific, irreplaceable knowledge that his ear possessed and that the work demanded. A financial analyst who told him he was economically valuable would not have addressed his actual concern, which was whether his work still required the thing about himself that he valued most.
Consider the engineer who can now "feel" a codebase the way a doctor feels a pulse — Segal's analogy from Chapter 2 of *The Orange Pill*. The engineer's value in the new landscape lies in precisely this intuition: the ability to direct AI tools wisely, to evaluate their output with the judgment that only years of hands-on experience can produce, to know which solution will break under pressure before the pressure arrives. This is genuine and irreplaceable expertise.
But the expertise was built through hands-on work. Through the friction. Through the daily act of writing code that did not work and figuring out why. Through the specific, embodied relationship between the engineer and the machine. If that relationship ceases — if the engineer spends the next ten years directing AI rather than writing code — will the intuition survive? Or will it, like a muscle that is no longer used, begin to atrophy, leaving the engineer standing on old layers of knowledge that are not being renewed?
This is not a question about the current generation of senior engineers. They have their layers. They will stand on them for years, perhaps for the remainder of their careers. It is a question about the next generation — the junior engineers who will never build those layers because the friction that deposited them has been optimized away. Segal acknowledges this concern through his engagement with Byung-Chul Han's philosophy of smoothness. Han argues that removing friction destroys depth — that the understanding that comes from struggle cannot be replaced by the understanding that comes from ease. Segal's counter-argument — ascending friction, the thesis that difficulty does not vanish but relocates to a higher cognitive level — is elegant and largely persuasive at the economic level.
Terkel's framework asks whether it is persuasive at the experiential level. Does the engineer who works at the higher level — directing, evaluating, judging — feel the same depth of engagement that the engineer who worked at the lower level — writing, debugging, struggling — felt? Does the ascending friction provide the same sedimentation, the same identity formation, the same deposit of embodied knowledge? Or does it provide a different kind of engagement — broader, perhaps, and more strategic, but thinner, more abstract, more removed from the material reality of the thing being built?
These questions do not have clean answers. That is precisely why they require Terkel's method rather than a theoretical framework. The answers live in the testimony of specific people describing specific experiences. One engineer will say the higher level is richer than anything he experienced before — that the removal of mechanical friction revealed a landscape of creative possibility he had always suspected was there but could never reach. Another engineer will say the higher level feels like exile — that the view from above is panoramic but cold, that she misses the warmth of the code, the intimacy of the struggle, the specific satisfaction of a function that works because she made it work. Both testimonies would be true. Neither would refute the other. The juxtaposition would reveal what no single testimony can: that the same technological transformation produces profoundly different experiences in different people, and that the difference is not a function of the technology but of the person — of their specific biography, their specific relationship to the work, their specific location in the vast landscape of human need.
The hands that knew are being stilled. Not violently — there is no machine-breaking in this revolution, no midnight raids on data centers. The stilling is gentle, even generous. The tool does the work faster, better, more reliably. The hands are freed. Freed for what? For judgment. For vision. For the question of what should exist.
But the hands had a knowledge of their own. And the knowledge cannot be deposited in any other way.
---
Terkel's *Working* contains, alongside its many testimonies of alienation and loss, a counter-thread that is often overlooked by readers who come to the book expecting only a catalog of suffering. The counter-thread is the voice of the person who loves what they do. The fireman who describes the specific quality of entering a burning building — the fear and the clarity and the sense of being needed at the exact moment of the need. The jazz musician who describes the instant when the band locks into a groove and his individual identity dissolves into something collective and alive. The stone mason who builds a wall and comes back years later to run his hand along it and feel, in the straightness of the stones, the evidence of his care.
These voices do not contradict the voices of alienation. They coexist with them, placed side by side in Terkel's text without editorial reconciliation. The coexistence is the point. Work is not one thing. It is a spectrum of experience that ranges from soul-crushing to soul-defining, and the same technological revolution that devastates one worker liberates another. Terkel understood this because he listened to enough people to resist the temptation of a single narrative. The Depression was both the banker's correction and the hobo's catastrophe. Work was both the steelworker's cage and the fireman's calling. The oral historian's discipline is to hold both and choose neither.
The AI transition is producing its own counter-thread. Alongside the senior engineer whose identity has been unsettled, there exists a population of workers for whom AI has been something closer to deliverance — people who possessed ideas, vision, and drive, but lacked the technical infrastructure to realize them. The marketing manager who could see the tracking tool her team needed but could not build it. The teacher who knew what her students required but could not create the curriculum platform that would deliver it. The designer who could envision the interface but had to hand off the vision and watch it return diminished by the translation.
For these workers, the AI revolution is experienced not as loss but as the sudden, almost shocking removal of a barrier they had internalized as permanent. The barrier was technical fluency. The ability to write code, manage databases, deploy software — the skill set that constituted the price of admission to the building process. This skill set was real and it was costly to acquire, requiring years of training and practice. Its acquisition was a legitimate achievement. But its function, in the lives of the people who lacked it, was that of a wall — a wall between what they could imagine and what they could make.
Segal names this wall the "imagination-to-artifact ratio" and traces its compression across the history of human tool use — from the medieval cathedral that required an army to build, to the modern software that required only a team, to the AI-assisted product that requires only a conversation. The compression is real and it is historically significant. But the experience of the compression, as lived by the person on the other side of the wall, is something the ratio does not capture.
Terkel would capture it. He would sit down with the marketing manager — a real person in a real office with a real history of frustrated capability — and ask her to describe the moment. Not the theory of the moment. Not the economic significance of the moment. The moment itself. The morning she sat down with Claude Code, described the tracking tool she had been imagining for three years, and watched it take shape on her screen.
What would she say? The testimony does not exist yet — Terkel is not here to record it, and no one has taken his place — but Terkel's method provides a frame for imagining what it might contain. She would describe the frustration first. Not briefly, not as background, but in the specific, accumulated detail that only years of frustration can produce. The meetings where she proposed the tool and was told it would require a developer and there were no developers available. The workarounds she invented — the spreadsheets, the manual processes, the jerry-rigged solutions that worked badly but were better than nothing. The specific quality of knowing exactly what was needed and being unable to provide it. Not because she lacked intelligence or initiative or understanding, but because she lacked the technical vocabulary that the building process demanded.
Then she would describe the moment the wall came down. And here is where Terkel's method becomes essential, because the moment is not simple. It is not the clean liberation that the democratization narrative suggests. It is complicated — shot through with the same ambivalence that Terkel found in every major transformation he documented. The exhilaration is real. The tool works. The thing she imagined exists. She built it. She. Herself. Without asking permission, without waiting in a queue, without the specific humiliation of having to explain her vision to a technical person who could build it but did not share it.
But the exhilaration coexists with something else. A disorientation. A sense that the rules she had organized her professional life around have changed without warning. If she could have done this all along — if the only thing separating her from the capability was a tool that costs a hundred dollars a month — then what were the last ten years? Were they wasted? Were the workarounds, the frustrations, the compromises, the years of adapting to a system that required technical gatekeepers — were they necessary, or were they a tax she paid for no reason?
Terkel's interviews with workers who experienced transitions — factory workers whose plants closed, tradespeople whose crafts were mechanized — consistently revealed this temporal vertigo. The present liberation destabilized the past. If the new way was possible, then the old way was not inevitable, and if it was not inevitable, then the suffering it produced was not necessary, and if it was not necessary, then someone — the system, the company, the technology that should have arrived sooner — owed an accounting.
The marketing manager's testimony would contain this vertigo. She would say she loves the tool. She would say it changed her life. She would also say, in a quieter voice that Terkel would know to wait for, that she feels cheated — that the wall she internalized as permanent was always just a technical limitation, and its removal reveals not just possibility but lost time.
This dual testimony — exhilaration and grief, liberation and mourning for the years the liberation did not arrive — is precisely what the discourse cannot hold. The triumphalists want the exhilaration. The elegists want the grief. Terkel wants both, because both are true, and the truth of a human experience is never singular.
The democratization that Segal describes in *The Orange Pill* is one of the most morally significant features of the AI transition. The developer in Lagos. The teacher in a rural district. The designer who no longer waits. Each represents a genuine expansion of who gets to build, who gets to participate in the creation of tools and products and solutions that previously required resources they did not have. Segal is right that the floor has risen. The marketing manager who builds her own tool is standing on that risen floor, and the view from there is genuinely new.
But Terkel's framework insists on asking who else is in the room. When the marketing manager builds the tool herself, the developer who would have been asked to build it is not asked. The translation that Segal describes as a tax — the cost of converting a vision into a specification into a conversation with a developer into an implementation — was also a relationship. It was a collaboration between two people with different forms of expertise, and collaborations, even frustrating ones, produce human connections that solo work does not.
The developer who translated the marketing manager's vision into code was not merely a technical resource. He was a colleague, a collaborator, a person whose specific expertise shaped the final product in ways the marketing manager might not have predicted or intended. The best implementations are not literal translations of specifications. They are interpretive acts — the developer seeing something in the specification that the author did not see, adding a capability that was not requested but is better than what was, pushing back on a requirement that would not work in practice. This interpretive collaboration is a form of human communication, and its value is not captured by the productivity metric that celebrates the marketing manager's ability to build the tool alone.
Terkel observed this dynamic in every industry he documented. The best work was collaborative — not in the corporate-retreat sense of the word, but in the specific, friction-filled, sometimes antagonistic sense of two people with different kinds of knowledge working on the same problem and producing something neither could have produced alone. The friction was often unpleasant. The developer did not want to build the marketing manager's tool. The marketing manager did not want to write a specification. The translation was costly. But the cost was also a form of human engagement, and the engagement produced, alongside the product, a relationship that had value of its own.
When the tool replaces the relationship, the efficiency gain is real. The marketing manager's testimony would celebrate it. But Terkel's method would also seek out the developer's testimony — the person whose role in the collaboration has been eliminated not by malice but by capability. What does the developer feel? Not the senior developer whose judgment is still needed, but the mid-level developer whose primary contribution was translating other people's visions into code. That developer's experience of the same moment — the moment the marketing manager built the tool herself — would be different. It would contain the specific quality of discovering that the thing you did, the thing that connected you to colleagues and gave you a place in the team, can now be done without you.
Both testimonies belong in the record. The liberation and the displacement. The raised floor and the person no longer standing on it. The juxtaposition, unresolved, is what Terkel called the truth — the complicated, contradictory, irreducibly human truth of a moment in which the ground moves and different people find themselves at different elevations, some higher than before and some lower, all of them on new and unfamiliar terrain.
Terkel interviewed a schoolteacher named Rose Hoffman for *Working*. Hoffman taught in a Chicago public school, and her testimony is among the most quietly devastating in the book — not because she hated her work, but because she loved it, and the system she worked inside made the love almost impossible to sustain. She described the bureaucracy, the mandated curricula that did not fit her students, the standardized materials designed by people who had never stood in front of her particular classroom and did not know the particular children who sat in it. She described the gap between what she knew her students needed and what she was permitted to give them. The gap was not a matter of competence. Rose Hoffman was competent. The gap was structural — built into a system that centralized the design of learning and distributed it to classrooms where it landed with the approximate relevance of a form letter.
Terkel did not editorialize about the structural failures of American education. He let Hoffman talk. And in her talking, the structure became visible — not as a policy argument but as a daily experience, the accumulation of small frustrations that are individually manageable and collectively crushing. The worksheet that does not match the reading level of half the class. The assessment that measures recall when the teacher knows the real problem is comprehension. The curriculum guide written for a demographic that does not exist in this school, in this neighborhood, with these children who have these specific needs that the guide has never heard of.
The teacher's frustration, in Terkel's rendering, is not institutional critique. It is the frustration of a craftsperson whose tools do not fit the work. A carpenter handed a saw that cuts the wrong width. A surgeon given an instrument designed for a different procedure. The knowledge of what is needed exists — it lives in the teacher, deposited by years of watching these particular students struggle with these particular problems — but the means of acting on that knowledge are controlled by a system that does not trust her judgment enough to let her act on it.
This frustration is the context in which the AI moment arrives for educators, and without this context, the liberation narrative rings hollow. When a teacher builds her own curriculum tool using AI — an event Segal describes in *The Orange Pill* as one instance of the democratization of capability — the significance of the act is not primarily technological. It is not about the tool. It is about the relationship between the teacher and her practice, and the way that relationship has been constrained by decades of institutional design that placed the locus of curriculum development somewhere other than the classroom where the teaching happens.
Terkel's framework reveals the AI-enabled teacher not as a user adopting a new technology but as a practitioner reclaiming a form of professional agency that was removed long before the technology arrived. The centralization of curriculum design — the standardized materials, the mandated assessments, the top-down pedagogical frameworks that define what a teacher may teach and how — was itself a technology, in the broadest sense. It was a system for organizing knowledge production and distributing it at scale. The system had real virtues: consistency, accountability, the assurance that a student in one district would encounter the same core material as a student in another. But it also had a cost that Terkel's interviews made legible: the erosion of the teacher's felt authorship of her own practice.
When Hoffman described the gap between what she knew her students needed and what the system permitted her to give them, she was describing a version of the imagination-to-artifact ratio that Segal applies to software development. The teacher had the vision. She lacked the means. Not because she lacked intelligence or training, but because the means of producing customized educational materials — the design, the layout, the interactive elements, the adaptive features — required technical capabilities that her training had not included and her budget could not purchase. The wall between her imagination and its realization was not lower than the software developer's wall. It was different in kind — built from institutional constraints rather than technical ones — but equally effective at preventing the vision from becoming real.
AI dissolves this particular wall with a speed that the institutional structures of education have not begun to process. A teacher who can describe, in natural language, what her students need — the reading level, the conceptual scaffolding, the type of practice problems, the kind of feedback that this particular group of twelve-year-olds responds to — can now produce materials tailored to those specifications in minutes. Not materials purchased from a vendor. Not materials mandated by a district. Materials that she made, informed by her knowledge of her students, shaped by her professional judgment, bearing the mark — Terkel's word — of her specific expertise.
The pride in this is real, and Terkel would have recognized it immediately, because it is the same pride he heard in every worker who described the satisfaction of making something that bore the imprint of their care. The stone mason's wall. The gravedigger's straight lines. The teacher's curriculum, designed not for a demographic abstraction but for the twenty-three specific human beings who will sit in her classroom tomorrow morning.
But Terkel would also have heard, beneath the pride, the complications that the liberation narrative tends to smooth over. The teacher who builds her own curriculum tool is exercising a form of professional agency that the system was not designed to accommodate. The standardized curriculum exists for reasons — imperfect reasons, often frustrating reasons, but reasons nonetheless. It ensures a baseline. It provides accountability. It creates the conditions for assessment across classrooms and districts. When a teacher steps outside this system, even to build something better, she enters a space that is institutionally unrecognized and potentially unsupported.
Who evaluates the teacher-built curriculum? By what standard? If the materials are better — if the students learn more, engage more deeply, perform better on the measures that matter — the system has no mechanism for recognizing the improvement, because the system's mechanisms are calibrated to the standardized materials the teacher has replaced. The teacher has solved a problem the system created, using a tool the system did not provide, and the system does not know what to do with the solution.
Terkel documented this dynamic across industries. The worker who finds a better way to do the job — faster, safer, more efficient — and discovers that the institutional structure cannot absorb the innovation because the structure was designed around the old way and the old way's assumptions are built into every form, every metric, every procedure manual, every chain of command. The worker's ingenuity is real. The institution's inability to recognize it is also real. And the gap between the two produces a specific kind of frustration that is different from the frustration of not having the tool. It is the frustration of having the tool and discovering that the world around you has not caught up.
This gap — between individual capability and institutional readiness — is the central challenge of AI in education, and it is a challenge that the technology itself cannot solve. Segal acknowledges this in *The Orange Pill* when he writes that educational institutions "are not prepared for this change and are staffed with calcified pedagogy." The diagnosis is accurate. The prescription — reform, adaptation, urgency — is necessary. But Terkel's method would add a dimension that the prescription cannot contain: the human experience of being the person inside the institution, holding the new capability in one hand and the old structure in the other, and finding that they do not fit together.
The teacher who builds her own curriculum tool does not thereby escape the system. She inhabits it differently. She now possesses a capability that her institution did not give her and may not support, and the possession creates a new form of professional isolation — the isolation of the person who can see a better way and cannot make the system see it. Terkel's interviews are full of these isolated innovators. The factory worker who redesigned his station for efficiency and was told to put it back. The secretary who reorganized the filing system and was reprimanded for exceeding her authority. The innovation was real. The institutional resistance was also real. And the worker was left holding both.
The students are the dimension that makes the teacher's case different from the factory worker's, and it is the dimension that carries the greatest moral weight. The teacher does not build the curriculum tool for herself. She builds it for twenty-three children whose learning depends on the quality of the materials they encounter. If the tool works — if the students engage more deeply, understand more fully, carry the knowledge further — then the institutional failure to recognize the innovation is not merely bureaucratic inertia. It is a failure that costs children something that cannot be recovered later. The twelve-year-old who needed the adaptive practice problems this semester will not be twelve again next semester. The window is specific and it closes.
Terkel understood the stakes of time in work — the irreversibility of a shift spent doing something meaningless, the accumulation of days that cannot be unlived. Applied to education, the stakes are compounded, because the time belongs not to the worker alone but to the children the worker serves. Every semester the institution spends debating whether to accommodate the new capability is a semester of children educated with materials designed for someone else.
The teacher who built her own curriculum would tell Terkel about the pride and the isolation and the frustration and the specific, irreducible satisfaction of watching a student understand something that the old materials had failed to convey. She would describe the morning she saw the student's face change — the moment when confusion gave way to comprehension, when the thing she had built for this specific child reached this specific child and worked. And that moment, in Terkel's rendering, would carry more weight than any policy argument about educational technology, because it would be a moment — singular, unrepeatable, testified to by the person who witnessed it — in which the capability met the need and the child was served.
The testimony would not resolve the institutional question. It would not tell the system how to accommodate the innovation. It would not prescribe a policy framework for teacher-built AI curricula. It would do something more fundamental: it would record, in the voice of the person who lived it, what it feels like to build something your students need using a tool your institution does not understand, and to hold the pride and the isolation together without knowing which one will win.
That unresolved holding is what Terkel's method preserves and what every other method smooths away. The teacher deserves to have her experience recorded in its full complexity — not as a success story for the democratization narrative, and not as a cautionary tale about institutional unreadiness, but as what it is: a human being doing her best work under conditions that are changing faster than anyone around her can process, with tools that amplify her judgment and an institution that has not yet learned to trust it.
---
In the economy of creative work, waiting has always been invisible labor. It does not appear on timesheets. It does not register in productivity metrics. It occupies no line item in a project budget. And yet, for the designer whose practice depends on other people's execution, waiting has historically constituted a significant portion of professional life — not the active waiting of anticipation, but the specific, grinding, identity-eroding waiting of a person whose vision must pass through someone else's hands before it can become real.
Terkel never interviewed a digital designer — the profession did not exist in its current form during his active years — but he interviewed many workers whose experience of waiting maps precisely onto the designer's condition. The foreman who designed the workflow and then waited for the line workers to execute it. The architect whose blueprints were interpreted by contractors whose craftsmanship might or might not honor the intention. The advertising copywriter who wrote the words and then watched the art department, the account executive, and the client transform them into something he no longer recognized. In each case, the worker's creative investment was mediated by a process of translation, and the translation introduced noise — compromises, misunderstandings, the accumulated distortions that attend any passage of a vision from one mind to another.
The noise was not always destructive. Terkel's interviews contain moments where the translation produced something better than the original vision — the contractor whose interpretation of the blueprint improved upon the architect's intention, the art director whose visual instinct elevated the copywriter's words. Collaboration, even frustrating collaboration, sometimes generates value that no single contributor could have produced alone. This is the insight that Segal captures in his discussion of Bob Dylan — that creative work is relational, that the genius of "Like a Rolling Stone" was not Dylan's alone but emerged from the collision of his vision with Al Kooper's accidental organ, the band's specific energy in the studio, the accumulated weight of every influence Dylan had absorbed.
But the noise was often destructive, and the destruction was experienced by the worker as a specific form of grief — the grief of watching your intention degrade as it passes through systems and people that do not share it. The copywriter whose headline was changed by the account executive because the client did not understand it. The architect whose roofline was altered by the contractor because the budget did not support it. The designer whose interface was implemented by a developer who followed the specification literally and missed the spirit.
Segal describes this degradation in *The Orange Pill* as a feature of the "imagination-to-artifact ratio" — the distance between human intention and its realization. Each layer of translation between the vision and the artifact introduces friction, and the friction is not merely temporal. It is qualitative. The thing that arrives is not the thing that was envisioned. It is a translation of the thing that was envisioned, and translations are never perfect, and the imperfection accumulates with each additional layer.
When AI eliminated the translation layers for the designer — when the vision could be described in natural language and realized in code without passing through another person's interpretation — the designer's experience changed in a way that productivity metrics capture only partially. The speed increased, yes. The fidelity increased. The iteration cycle collapsed from weeks to hours. These are measurable improvements and they are real.
But the experiential change runs deeper than the metrics suggest, and it is the kind of change that only Terkel's method — sitting down, asking, listening — could fully capture. The designer who no longer waits is not merely faster. He occupies a fundamentally different position in the creative process. He has moved from the role of specifier — the person who describes what should exist and then hands the description to someone else — to the role of maker. He builds the thing himself. The distance between his intention and its realization has collapsed to the width of a conversation, and the collapse changes not just his productivity but his relationship to the work.
Terkel would have been drawn to the quality of this change, because it reverses a dynamic he documented throughout Working: the progressive removal of the worker from the product of the work. The assembly line separated the worker from the finished car. The corporate hierarchy separated the employee from the company's output. The division of labor separated the craftsperson from the completed craft. In each case, the separation was experienced as a loss of ownership — the feeling that the work no longer belonged to the worker, that the mark had been erased by the process of production.
The designer who builds with AI is experiencing the opposite movement. The separation is being reversed. The division of labor that previously placed execution in someone else's hands has been collapsed, and the designer now stands in direct relationship to the finished product in a way that the previous process did not allow. The mark returns. The vision and the artifact are produced by the same person, and the correspondence between what was imagined and what exists is closer than it has ever been.
This return of the mark is exhilarating. Segal describes it in The Orange Pill through the experience of his own team — the designer who "had never touched backend code" but "within two weeks of working with Claude, was building complete features, not just designing them, but implementing them, end to end." The exhilaration is audible in the description. The wall came down. The designer could make.
But Terkel's method would not stop at the exhilaration, because Terkel's method never stops at the first emotion. The first emotion is the door. Behind it is the room — the full, complicated, often contradictory interior of a human being's experience. And the room, in this case, contains at least two things the exhilaration does not announce.
The first is the relationship that has been eliminated. When the designer handed off specifications to a developer, the handoff was a transaction. It was also an encounter — a moment when two people with different forms of expertise met over a shared problem. The encounter was often frustrating. The developer did not understand the design intent. The designer did not understand the technical constraints. The specification was a crude bridge between two different ways of knowing, and crossing it required patience, translation, and the willingness to have one's vision challenged by someone who saw it from a different angle.
That challenge was valuable. Not always. Not reliably. But sometimes the developer's pushback — "this won't work because" — produced a better solution than the designer's original vision. The friction of collaboration generated heat, and some of the heat was generative. When the collaboration is replaced by a conversation with a machine, the generative friction is lost. The machine does not push back. It does not say "this won't work because." It implements. The implementation may be technically sound, but it has not been tested against another human being's expertise, and the absence of that test is a loss that the speed gain does not compensate for.
The second thing the room contains is a quieter and more difficult truth. The designer who no longer waits is also the designer who no longer needs the developer. And the developer who is no longer needed is a person — a specific person with a specific career and a specific identity built on the expertise of translating other people's visions into working systems. That person's experience of the same moment — the moment the designer became self-sufficient — is the inverse of the designer's exhilaration. It is the experience of discovering that the thing you do, the specific skill that connected you to colleagues and gave you a place in the creative process, is no longer required.
Terkel would seek out that person. Not to contradict the designer's testimony, but to place it alongside the developer's testimony and let the juxtaposition carry the weight that editorial commentary cannot. The designer's liberation is real. The developer's displacement is real. Both are produced by the same technological moment. Neither negates the other. The reader who holds both is closer to the truth of the moment than the reader who holds only one.
Segal writes in The Orange Pill that "the human cost of efficiency is always borne by someone." The statement is correct and important. Terkel's contribution is to give that someone a voice — to transform the abstract acknowledgment of cost into a specific human being sitting in a specific chair describing, in words that are specifically hers, what it feels like to be the person on whom the cost lands. The cost is not a data point. It is a morning. A specific morning when the developer opens her email and discovers that the designer has shipped the feature without her, and she sits at her desk for a long moment, coffee cooling, and wonders what she is supposed to do now.
That morning deserves to be recorded. Not because it disproves the case for democratization — it does not — but because the case for democratization, like every case for every technological advance, is incomplete until it includes the testimony of the people who bear its cost. Terkel taught that lesson across six decades and dozens of books. The lesson has not changed. The technology has.
---
There is a category of testimony in Working that receives less critical attention than the worker testimonies but may be, in certain respects, more revealing. It is the testimony of the person who lives with the worker — the spouse, the partner, the family member whose life is shaped by work they do not perform. The factory worker's wife who described what the night shift did to their marriage. The long-haul trucker's partner who described the loneliness of the road from the position of the one who stayed home. The executive's husband who described dinner conversations that were not conversations but debriefings, the executive still performing even at the kitchen table, unable to stop managing long enough to be present.
Terkel included these voices not as supplements to the workers' testimony but as primary evidence. The spouse's experience of the work is not secondary. It is different — a different vantage point on the same phenomenon, and the difference reveals aspects of the work that the worker cannot see from inside it. The factory worker knows the weight of the steel. The wife knows the weight of his absence. Both weights are real. Both are produced by the same system. The worker can describe one. The spouse can describe the other. The full picture requires both.
The AI transition has produced a new form of spousal testimony that echoes, with unsettling precision, the testimonies Terkel collected from the families of compulsive workers in every industry he documented. The echo is precise because the underlying dynamic is structural, not technological: a person whose engagement with their work has become so absorbing that the boundaries between work and everything else have dissolved, and the dissolution is experienced by the people around them as a specific kind of absence — not physical absence, but attentional absence, which is in some ways harder to bear because the person is right there, in the same room, at the same table, and yet unreachable.
Segal references the Substack post in The Orange Pill — Hilary Gridley's "Help! My Husband Is Addicted to Claude Code" — and treats it as a cultural artifact, a diagnostic indicator of the productive addiction that AI tools can generate. The treatment is appropriate within the book's analytical framework. But Terkel would have treated it differently. Terkel would not have read the post as evidence for a thesis. He would have read it as the opening of a conversation — the first sentence of a testimony that wanted to be heard in full.
What would the spouse say, given time and a listener who was not in a hurry? Terkel's method suggests the answer would be layered, contradictory, and resistant to summary — because the spouse's experience, like every experience Terkel recorded, is not a data point but a life.
The spouse would describe the evening the absorption began. Not a dramatic evening. An ordinary one. The builder came home, opened the laptop, and started a conversation with Claude about a problem that had been nagging at him all day. The conversation produced a solution. The solution produced another problem. The problem was interesting. The builder leaned forward. The spouse recognized the lean — the particular angle of the body that signals the mind has gone somewhere the body cannot follow.
She has seen this posture before. Every spouse of every builder has seen it. The architect bent over blueprints. The writer staring through the screen. The entrepreneur on the phone at midnight, pacing, the voice carrying a frequency that means the conversation is not about tomorrow's meeting but about the thing itself, the vision, the problem that will not let go. The posture is not new. What is new is its duration and its accessibility. The blueprints could be rolled up. The phone call could end. The screen could be closed. But the AI conversation does not end. It is available at every hour, on every device, and it responds instantly, and its responses are good enough to sustain the engagement indefinitely.
The spouse would describe the quality of the absence. Not anger — or not only anger. The builder is not watching television. He is not scrolling social media. He is not doing something trivial that the spouse could reasonably ask him to stop. He is building. He is creating something real, something that works, something that serves users and generates revenue and represents a genuine expression of his capability. The work is good work. The spouse knows this. The spouse may even admire it. And the admiration makes the complaint harder to voice, because the complaint — you are not here, you are not present, you are in the room but you are not in the room — sounds petty when directed at a person who is doing something worthwhile.
This dynamic — the suppressed complaint, the guilt of feeling neglected by a person who is doing important work — is one that Terkel documented with particular sensitivity in his interviews with the families of workers whose jobs demanded total engagement. The firefighter's wife who could not complain about the danger because the danger was in service of something noble. The surgeon's partner who could not resent the hours because the hours saved lives. The complaint was not illegitimate. It was inexpressible — blocked by the social understanding that important work justifies personal sacrifice, and that the sacrifice of the worker's family is simply part of the cost.
The AI-absorbed builder's spouse inherits this dynamic with a new complication. The firefighter's wife could point to an external demand — the alarm, the department, the schedule — and locate the source of the absence outside the marriage. The demand was institutional. It could theoretically be negotiated with, limited, structured by the same kind of dams that Segal describes. The AI-absorbed builder's compulsion has no external source. No department sets the hours. No alarm rings. The builder works because the tool is there and the work is engaging and the internal imperative — Byung-Chul Han's achievement subject, cracking the whip against his own back — converts every available moment into productive time.
The whip and the hand that holds it belong to the same person. And the spouse watches the self-flagellation without recourse, because there is no institution to petition, no schedule to negotiate, no alarm to silence. The boundary must come from within the builder, and the builder, absorbed in the work that makes him feel most alive, does not experience the absence of the boundary as a problem. He experiences it as freedom.
Segal recognizes this dynamic with admirable honesty in The Orange Pill. "I caught myself," he writes about a transatlantic flight spent writing. "I was not writing because the book demanded it. I was writing because I could not stop." The confession is important because it comes from the builder himself — from the person inside the absorption, describing the moment of recognition that the exhilaration had curdled into compulsion. But the confession is the builder's testimony. The spouse's testimony is different, and it deserves equal weight.
The spouse's testimony would describe the erosion of shared time — not in the dramatic terms of crisis but in the accumulated terms of Tuesday evenings and Saturday mornings and the twenty minutes after dinner that used to be a conversation and are now a silence broken by the sound of typing. The erosion is gradual. Each individual evening is not a crisis. Each individual Saturday morning is not a betrayal. But the accumulation is felt, and the feeling is a version of what Terkel heard from every family that lived alongside compulsive work: the slow, ambient grief of being present to someone who is not present to you.
Terkel would also have heard — because he always heard the complications — the spouse's own relationship to the tool. The spouse may use AI herself. She may find it useful, even exciting. She may understand, from her own experience, the pull that the builder describes. And this understanding makes the complaint even harder to voice, because she knows the pull is real and the work is real and the satisfaction is real, and she cannot reasonably ask the builder to stop being satisfied by his work. She can only ask him to be present, and the asking feels like asking him to be less of himself.
This is the specific cruelty of productive addiction — a category that Terkel intuited but never named, because the tools of his era did not produce it with the intensity that AI tools produce it today. Addiction to alcohol or gambling or drugs carries a social script: the substance is harmful, the behavior is destructive, the intervention is justified. Addiction to productive work carries no such script. The substance is valuable. The behavior produces real output. The intervention — "stop building, come to dinner" — sounds trivial against the grandeur of what is being built.
Terkel would not have resolved this tension. Resolution was never his method. He would have recorded the spouse's testimony alongside the builder's testimony and let the reader hold both — the exhilaration of the build and the loneliness of the dinner table, the freedom of the tool and the captivity of the marriage, the twenty percent that is everything and the evening that is nothing. The juxtaposition would carry the truth that neither testimony alone can hold: that the same tool, in the same household, is simultaneously the best thing that has happened to the builder and the hardest thing that has happened to the marriage.
The Substack post was written with humor. Terkel would have listened for what was beneath the humor — the specific, unjoking quality of the silence at a table where one person is present and the other is building, and the building will not stop, and the silence will not break, and the tool that makes it possible does not know or care that there is a person on the other side of the room who has been waiting, quietly, for the conversation to begin.
---
Dignity was Terkel's through-line. Not the word — he rarely used it explicitly — but the thing itself, running beneath every interview like an underground current that surfaced at unpredictable moments. The gravedigger who took pride in the straightness of his lines. The waitress who memorized every regular's order not because it was required but because it was hers — her specific, hard-won competence, her mark on the daily transaction. The steelworker who wanted to point at a building and say "I helped make that." In each case, dignity was not conferred by the job title, the salary, or the social prestige of the work. Dignity was located in the worker's felt relationship to the competence the work demanded — in the knowledge that there was something difficult here, and I learned to do it, and I do it well, and the doing is mine.
This location of dignity — in the exercise of competence, in the embodied knowledge that only practice can deposit — is what makes Terkel's framework so uncomfortable when applied to the AI transition. Because the transition is, at its core, a redistribution of competence. Tasks that once required years of training can now be performed by anyone with access to the tool. Skills that once constituted the foundation of professional identity can now be replicated by a machine in seconds. The competence has not been destroyed. It has been made abundant. And abundance, as every economist knows, destroys value — not the intrinsic value of the thing, but the market value, the scarcity premium that made the competence a source of economic reward and, through economic reward, a source of social recognition, and through social recognition, a source of dignity.
The chain is specific: competence → scarcity → reward → recognition → dignity. When AI breaks the second link — when the competence is no longer scarce — the chain does not necessarily hold. The competence remains real. The person who possesses it has not become less skilled. But the market no longer needs to purchase the skill from a human, and the recognition that once flowed from the purchase diminishes, and with it diminishes the specific form of dignity that depended on being the person who could do the thing that needed doing.
Segal addresses this chain in The Orange Pill through the ascending friction thesis — the argument that AI removes difficulty at one level and relocates it upward, so that the worker who previously found dignity in implementation can now find dignity in judgment, direction, and vision. The thesis is structurally sound. The difficulty does ascend. The worker who directs AI is engaged in work that is genuinely harder, at a higher cognitive level, than the implementation work the tool has absorbed.
But Terkel's framework asks whether the ascent preserves the experiential quality of dignity or substitutes a different quality. Whether the engineer who found dignity in elegant code finds the same dignity in elegant judgment. Whether the lawyer who found dignity in the craft of a well-researched brief finds the same dignity in the evaluation of an AI-drafted brief. Whether the teacher who found dignity in the patient, friction-filled act of building a lesson plan from scratch finds the same dignity in the faster, smoother act of directing an AI to build one.
The question matters because dignity, in Terkel's rendering, is not a transferable commodity. It is not a feeling that attaches to any sufficiently challenging work. It is specific — rooted in a particular practice, a particular form of embodied knowledge, a particular relationship between the worker and the material. The gravedigger's dignity was not generic. It was the dignity of straight lines in earth, of knowing the soil, of the specific physical skill that only years of digging could produce. A different job, equally challenging, would not necessarily provide the same dignity, because the dignity was not in the challenge alone. It was in the relationship between this person and this work — a relationship built through time, through friction, through the thousands of small acts of care that deposited competence in the body and the identity.
If dignity is specific rather than generic, then the ascending friction thesis provides a necessary but insufficient account of what happens to the worker in transition. The new level may be harder. It may be more valuable. It may demand more of the worker's intelligence and judgment. But it does not automatically inherit the dignity of the old level, because dignity is not a function of difficulty. It is a function of the felt relationship between the worker and the practice.
Terkel's method would test this by sitting down with workers who have made the ascent and asking them — not whether the new work is valuable, not whether the new work is challenging, but whether the new work provides the same felt quality of engagement that the old work did. The question is subjective. That is its strength. Dignity is subjective. It lives in the worker's experience, not in the economist's model, and the only way to access it is to ask.
Some workers would describe the ascent as a genuine gain — a liberation from mechanical drudgery that freed them to engage with the work at a level they had always aspired to. These are the voices the triumphalist narrative amplifies, and they are real. The engineer who discovers that directing AI is more creatively satisfying than writing boilerplate code. The designer who discovers that building end-to-end is more fulfilling than specifying and waiting. The teacher who discovers that designing the learning experience is more meaningful than producing the materials. Each of these workers has found a new location for dignity — a location that the old division of labor made inaccessible.
Other workers would describe the ascent differently. They would describe a disorientation — the sensation of having been removed from the specific practice that gave their work its texture and placed in a more abstract role that feels less like theirs. The senior engineer who tells Segal that the remaining twenty percent is "everything" may be right about the value. He may be wrong about the feeling. The twenty percent may be everything in terms of economic contribution. It may not be everything in terms of the felt experience of working — of sitting down each morning and engaging with a practice that demands the specific form of attention and care that only this person, with this history, in this body, can provide.
Terkel's interviews with workers in transition consistently revealed this gap between the objective value of the new work and the subjective experience of performing it. The factory worker retrained as a data entry clerk had a "better" job by every economic measure — safer, cleaner, better paid. But the testimony the worker gave Terkel was not a testimony of improvement. It was a testimony of loss — the loss of the physical engagement with materials, the loss of the specific knowledge that lived in the hands, the loss of the identity that the old work had formed. The new work was objectively better. The old work was subjectively his.
Dignity, in the age of AI, must be relocated. This much is clear from every framework applied to the transition, from Segal's ascending friction to Han's critique of smoothness to the Berkeley researchers' documentation of work intensification. The old locations — in implementation, in execution, in the mechanical labor of translating intention into artifact — are being absorbed by machines. New locations must be found.
Terkel's contribution is the insistence that the finding is a human process, not an economic one. The economist can identify the new locations. The philosopher can argue that the new locations are higher and worthier. The organizational theorist can design structures that help workers inhabit them. But only the worker can say whether the new location feels like home — whether the dignity that lived in the old practice has followed the ascent or whether it remains behind, in the hands that knew, in the code that was written, in the earth that was dug by a person who took pride in straight lines.
The oral historian cannot predict where dignity will settle. That is not the method's purpose. The method's purpose is to record the search — to sit with each person as they describe, in their own words, the specific quality of what they have lost and what they have found and what they are still looking for. The search is the story. It is the most important story the AI transition is producing — more important than the productivity data, more important than the stock prices, more important than the philosophical frameworks — because it is the story of human beings trying to find themselves in a landscape that has changed so fast that the landmarks they used to navigate by are no longer where they were.
Terkel asked every person he interviewed the same implicit question: Where do you find your dignity? The answer was always specific, always personal, always resistant to generalization. One gravedigger's straight lines. One waitress's memorized orders. One steelworker's building downtown.
The question has not changed. The landscape has. And somewhere in the landscape, a worker is sitting at a desk, the AI humming on her screen, the old competence quiet in her hands, looking for the place where the mark still holds.
Terkel had a phrase he returned to often in interviews: "the uncelebrated." Not the poor, necessarily, or the oppressed, though many of them were both. The uncelebrated — the people whose labor was so thoroughly woven into the fabric of daily life that it had become invisible, the way plumbing is invisible until it breaks. The washroom attendant in Working who spent his shifts handing towels to men who did not look at his face. The elevator operator who carried people between floors and was treated as an extension of the machinery. The domestic worker who maintained a household so efficiently that the household's inhabitants experienced cleanliness and order as natural conditions rather than as the products of someone else's effort.
Terkel did not interview these workers to make a political argument about inequality, though inequality was always present in the testimony. He interviewed them because their experience of work was as real, as specific, as worthy of attention as any executive's or artist's or professional's, and because the invisibility of their labor was itself a form of violence — not the spectacular violence of exploitation, but the quiet, ambient violence of not being seen. The washroom attendant's experience of handing towels to men who did not see him was not merely an inconvenience. It was a daily encounter with the proposition that his labor, and by extension his presence, did not merit the recognition that dignifies human exchange. He was useful. He was not seen. And the gap between being useful and being seen was where the violence lived.
Every technological revolution produces its own class of invisible workers — people whose labor sustains the system's operation while their existence is excluded from the system's self-narrative. The industrial revolution's self-narrative was about engines and entrepreneurs. The workers who fed coal into the engines — the children in the mines, the stokers in the mills — appeared in the narrative only when their suffering became scandalous enough to demand attention. The digital revolution's self-narrative was about founders and platforms. The moderators who reviewed the content — scrolling through hours of violence, abuse, and exploitation to keep the platforms presentable — appeared in the narrative only when investigative journalists forced their existence into public view.
The AI revolution has produced its own invisible class, and the class is larger, more geographically dispersed, and more structurally essential than any previous revolution's hidden workforce. The data labelers. The annotation workers. The reinforcement learning trainers. The content moderators who review AI outputs. The cloud infrastructure technicians who maintain the server farms. The quality assurance workers who test the systems. The customer service representatives who handle the complaints when the AI fails.
These workers make AI possible. Without the data labelers — the people in Nairobi, Manila, Dhaka, and Accra who spend their days categorizing images, transcribing audio, rating AI responses for helpfulness and accuracy — the models that power Claude and its competitors would not function. The training data does not label itself. The reinforcement learning that shapes the models' behavior does not occur automatically. At every stage of the AI pipeline, human labor is required — not the celebrated labor of the engineers and researchers who design the architectures, but the repetitive, grinding, poorly compensated labor of the workers who prepare the raw material that the architectures consume.
The irony is structural and precise. The technology that promises to liberate human workers from tedious, repetitive tasks is itself built on a foundation of tedious, repetitive human tasks, performed by workers who are paid a fraction of what the technology's users pay for access. The data labeler in Nairobi who spends eight hours categorizing images of traffic signs — so that a self-driving car can learn to recognize them — earns in a day what an American software engineer earns in minutes. The reinforcement learning trainer in Manila who rates thousands of AI responses — so that the model can learn to be helpful, harmless, and honest — is performing work that is essential to the product's value and that the product's marketing materials never mention.
Terkel would have found these workers. He would have gone to Nairobi. He would have sat in the labeling center and turned on the recorder and asked the questions he always asked: What do you do all day? How do you feel about what you do? What does it mean to you? And the testimony he would have received would have complicated every celebratory narrative about AI's democratizing potential, not by disproving the narrative but by revealing the labor it stands on.
The data labeler's testimony would describe the work in sensory detail. The screen. The images. The categories. The quota — the number of items to be labeled per hour, per shift, per day. The specific quality of attention the work demands: not deep attention, not creative attention, but a sustained, mechanical attentiveness that must be maintained across hours of repetitive visual processing. The eyes tire. The mind drifts. The quota does not care. The labeler develops strategies for maintaining focus — breaks, music, the specific rhythm of click-categorize-click that becomes, over time, as automatic as the spot welder's motion on the assembly line.
Phil Stallings told Terkel, "I am a machine." The data labeler might say something similar — might describe the experience of performing, for hours at a stretch, the exact kind of repetitive cognitive task that AI is supposed to eliminate from human work. The irony would not be lost on the labeler. The labeler is training a machine to make human labor unnecessary by performing the most machine-like human labor imaginable. The machine learns. The labeler stays.
Segal writes in The Orange Pill about the "developer in Lagos" as an emblem of democratization — the person for whom AI lowers the barrier between imagination and artifact. The developer in Lagos is real, and her liberation is real. But the data labeler in Nairobi is also real, and her experience of the same technological ecosystem is not liberation. It is incorporation — the incorporation of her labor into a system that values the labor's product (trained models) without valuing the laborer (the person who produced the training). The developer builds. The labeler feeds. Both are necessary. Only one is celebrated.
Terkel's method would place these testimonies side by side. The developer in Lagos describing the morning her tool built the thing she imagined. The labeler in Nairobi describing the afternoon she categorized her three-thousandth image. No editorial reconciliation. No argument about which experience is more representative or more important. Just the voices, held in the same space, producing through their juxtaposition a picture of the AI ecosystem that no single vantage point can generate.
The second category of invisible worker is closer to home and harder to see because the invisibility is domestic rather than industrial. Segal describes his wife in the acknowledgments of The Orange Pill with evident love and gratitude. But the acknowledgment, precisely because it is located in the acknowledgments rather than in the analysis, reveals the structural blindness that Terkel's method corrects.
When Segal describes his intensive work sessions — the thirty days of building Napster Station, the hundred-and-eighty-seven-page draft written on a transatlantic flight, the late nights with Claude that blur the boundary between flow and compulsion — the person who maintains the conditions that make this work possible is present only as background. The household continues to function. The children are fed, supervised, cared for. The logistics of daily life — the groceries, the school pickups, the medical appointments, the emotional labor of being present for children whose father is building — are handled by someone whose labor is as essential to the output as any line of code and as invisible in the product's narrative as the data labeler's clicks.
This is not a criticism of Segal. It is a description of a structural condition that Terkel documented across every industry he studied. The worker's absorption in the work is sustained by a domestic infrastructure that the worker rarely describes, because the infrastructure is so thoroughly naturalized — so completely woven into the background assumptions of professional life — that it does not register as someone else's labor. The executive works late. The partner handles bedtime. The late nights are celebrated as dedication. The bedtimes are not mentioned.
Terkel would have interviewed the partner. Not to accuse. Not to balance the ledger. But because the partner's experience is evidence — evidence of what the work actually costs, measured not in dollars or hours but in the specific, daily, unrecognized labor of maintaining a life while someone else maintains a vision.
The content moderator is the third invisible worker, and in some respects the most disturbing. The AI systems that Segal describes — Claude, the tools that power the AI revolution — produce outputs that must be monitored for safety. The monitoring is performed by human beings who review the outputs and flag the ones that are harmful, offensive, dangerous, or simply wrong. The work is cognitively demanding and psychologically hazardous. The moderators encounter, as a condition of their employment, the full range of content that the AI can produce — including content that is violent, sexually explicit, deceptive, or otherwise distressing.
The moderators are the immune system of the AI ecosystem. Without them, the products that the developers build with Claude and its competitors would be unsafe to deploy. The moderator's labor makes the ecosystem habitable, the way the janitor's labor makes the office habitable, the way the water treatment worker's labor makes the city habitable. Essential, invisible, and compensated at a level that reflects the invisibility rather than the essentiality.
Terkel's through-line — dignity — runs through these invisible workers' testimony with a particular urgency, because the conditions of their work actively militate against the felt experience of dignity. The data labeler's work is repetitive and its connection to the final product is invisible. The moderator's work is psychologically damaging and its value is recognized only when it fails. The domestic partner's work is essential and its status as work is contested. In each case, the conditions that Terkel identified as necessary for dignity in work — the mark, the felt competence, the recognition — are structurally absent.
The absence is not accidental. It is a feature of systems designed to optimize for the visibility of certain kinds of labor and the invisibility of others. The AI product launch celebrates the engineers and the founders. The earnings call celebrates the revenue and the growth. The marketing materials celebrate the users and their achievements. Nowhere in this narrative does the data labeler appear. Nowhere does the content moderator appear. Nowhere does the domestic infrastructure that sustains the builder's absorption appear.
Terkel spent his career making the invisible visible — not through argument but through presence. The act of sitting down with the washroom attendant and listening, of recording his words and presenting them without editorial commentary, was itself a form of recognition. It said: your experience is worthy of the same attention that the executive's receives. Your labor is as real. Your testimony matters.
The AI transition needs Terkel's presence. It needs someone who will go to the labeling centers and the moderation floors and the kitchens where dinner is being made while the builder builds, and who will ask the same questions Terkel always asked, and who will present the answers without smoothing them into a narrative that serves someone else's argument.
The answers will not be comfortable. They will not fit neatly into the triumphalist narrative or the dystopian one. They will be the messy, specific, contradictory answers of human beings doing work that the system depends on and does not see — people whose labor makes the revolution possible and whose experience of the revolution is not liberation or disruption but the specific, daily reality of being useful and unseen.
---
In the last interview section of *Working*, a Brooklyn fireman named Tom Patrick tells Terkel something that has stayed with readers for fifty years. He describes the satisfaction of entering a burning building, the adrenaline, the clarity, the sense of being needed at the precise moment of need. Then he says something unexpected. He says that the best part of the job is not the fire. It is the morning after — coming back to the firehouse, sitting with the men who were there, and talking about what happened. The fire is the event. The talking is the meaning. The event without the talking is just adrenaline. The talking without the event is just stories. Together, they constitute the experience of work as Terkel understood it — not a task but a narrative, a story the worker tells himself and others about who he is and what he does and why it matters.
Terkel understood that work is, at its deepest level, a narrative practice. The steelworker does not merely pour steel. He constructs, through the daily repetition of the pouring and the nightly retelling of the pouring, a story about himself — a story in which he is a person who does hard things, who endures, who contributes something lasting to the physical world. The story is not incidental to the work. It is the work's meaning-making apparatus. Without the story, the work is labor — effort expended, compensation received, the transaction complete. With the story, the work is identity — a continuous narrative thread that connects today's shift to yesterday's and tomorrow's and weaves them into a life.
When Terkel asked "What do you do all day?" he was not asking for a job description. He was asking for a narrative. Tell me the story of your work. Not the employer's story, not the economist's story, not the policymaker's story. Your story. The one you tell yourself when you are trying to fall asleep. The one you tell at the bar after the shift. The one you would tell your child if your child asked what you did and you wanted to give an honest answer.
The AI transition is disrupting these narratives with a speed and thoroughness that the previous chapters have documented from multiple angles — the engineer's narrative of mastery disrupted by a tool that masters faster, the designer's narrative of patient collaboration disrupted by a tool that eliminates the need for collaboration, the teacher's narrative of institutional belonging disrupted by a tool that empowers her to work outside the institution, the spouse's narrative of partnership disrupted by a tool that absorbs the partner's attention. Each disruption is specific, and each demands a new narrative to replace the one that has been broken.
What Terkel would have heard, sitting with the workers of the AI transition, is not a single new narrative but a cacophony of competing drafts — each worker trying out different stories about what the change means, testing them against the evidence of their own experience, discarding the ones that do not fit and holding onto the ones that approximate, however roughly, the truth of what they feel.
The engineer might try the mastery narrative first — "I have ascended to a higher level, I now do the important work, the judgment work, the real work that was always beneath the implementation" — and find that it fits on good days and collapses on bad ones, when the judgment feels abstract and the implementation felt like home. She might try the loss narrative — "something beautiful has been taken from me" — and find that it is true but incomplete, because the thing that was taken was also, honestly, sometimes tedious, and the freedom from the tedium is real even if the loss is real too. She might try the adaptation narrative — "I am evolving, we are all evolving, this is what progress looks like" — and find that it is true at the species level and hollow at the personal level, because she is not a species. She is a person. And the personal experience of evolution is not the clean arc that the narrative suggests. It is a mess. A Tuesday. A morning when the tool works beautifully and an afternoon when she cannot remember what she is supposed to be doing now that the tool does what she used to do.
The cacophony of drafts is not a failure of narrative. It is the narrative — the actual story of the AI transition as lived by the people inside it. The story is not "AI is amazing" or "AI is terrifying" or even Segal's more nuanced "AI is an amplifier and the question is whether you are worth amplifying." All of these are frameworks — interpretive lenses that organize the experience into coherence. The experience itself, before the framework is applied, is incoherent. It is contradictory. It is a person sitting at her desk on a Wednesday morning with a tool open on her screen, producing work that is technically excellent and that she did not fully produce, feeling simultaneously proud and hollow, energized and depleted, liberated and lost.
Terkel's method preserves the incoherence. That is its most radical contribution. Every other method — philosophy, economics, data science, journalism — seeks coherence: each proposes a thesis and tests it, gathers evidence and draws conclusions, identifies patterns and formulates principles. Terkel's method gathers voices and presents them. The coherence, if it emerges, is the reader's construction — built not from the interviewer's thesis but from the accumulated weight of testimony that is too specific, too contradictory, too human to be reduced to a principle.
Segal writes in *The Orange Pill* that "the condition of holding contradictory truths in both hands and not being able to put either one down" is the defining experience of the silent middle. The sentence is one of the most precise in the book. Terkel's method is the formal expression of that condition — the literary structure that holds contradictory truths without resolving them, because resolution would require choosing one truth over the other, and the choice would be a lie.
The fire and the talking. Tom Patrick, the Brooklyn fireman, knew that the work was not the event alone. The work was the event and the story about the event, held together in a practice that gave both their meaning. The fire without the talking was just adrenaline. The talking without the fire was just stories. The combination was work — work in Terkel's fullest sense, the activity through which a human being constructs a narrative about who they are and what they contribute and why it matters.
Terkel also knew, because he listened long enough and to enough people to know it, that the narratives do not always hold. Sometimes the story breaks. Sometimes the work changes and the narrative that gave it meaning no longer fits, and the worker is left holding the pieces, trying to assemble a new story from materials that have not yet arrived. The factory worker retrained as a data entry clerk had the materials for the old story — the weight, the heat, the physicality, the brotherhood of the mill — and no materials for the new one. The data entry was work. It was not yet a story. And the gap between working and having a story about working was where the suffering lived.
The AI transition is producing this gap at scale. Millions of workers are performing new kinds of work — directing AI, evaluating AI output, building with AI, thinking alongside AI — and the narratives that would give this work its meaning have not yet solidified. The old narratives — "I am a coder," "I am a designer," "I am a writer" — are losing their hold, because the activities that constituted the identity are being performed, partly or wholly, by machines. The new narratives — "I am a director of AI," "I am a curator of machine output," "I am a human-AI collaborator" — do not yet carry the weight of identity. They sound provisional. They feel like job descriptions rather than stories.
Terkel would have been patient with this provisionality. His method was built for it. The oral history does not demand that the speaker have a finished narrative. It asks only that the speaker speak — that they describe, in whatever words are available, the experience as it is being lived, before the frameworks arrive to organize it and the narratives arrive to contain it. The raw testimony, recorded in the gap between the old story and the new one, captures something that no retrospective account can recover: the feeling of being in the middle of a change whose meaning is not yet clear.
"Einstein said everything had changed since the atom was split, except the way we think," Terkel once observed. "We have to think anew." Applied to the AI transition, the observation is precisely right. Everything has changed — the tools, the workflows, the economics, the relationship between human capability and machine capability. Everything except the way we think about work. The narratives are lagging. The stories workers tell themselves about what they do and why it matters are still built from the materials of the old dispensation — the dispensation in which skill was scarce, execution was valuable, and the mark was left by human hands.
The new dispensation requires new narratives, and the narratives can only come from the workers themselves — from the specific, embodied, irreducibly personal experience of standing in front of the machine and deciding what to do next. Terkel's method does not produce the narratives. It creates the conditions in which the narratives can emerge — the space of listening, of respect, of attention that allows a person to discover, in the act of speaking, what they think about what has happened to them.
The book that needs to be written — the *Working* for the age of intelligent machines — cannot be written from a tower. It cannot be written from a philosophy department or a technology company or a policy institute. It can only be written from a chair, placed across from another chair, with a recorder between them and a question that has not changed in fifty years:
What do you do all day? How do you feel about what you do?
The answers are waiting. They live in the voices of the people who are living through this moment — the engineers and the teachers and the designers and the spouses and the data labelers and the content moderators and the managers and the students and the parents at the kitchen table. Each of them is constructing, in real time, a narrative about what the change means. Each narrative is partial, contradictory, unfinished. Together, they constitute the oral history of the AI transition — the record that will tell future generations not what the technology did, but what it felt like to live inside the doing.
Terkel is not here to write it. Someone must.
---
The tape recorder stopped running in 2008, when Studs Terkel died at ninety-six. He left behind thousands of hours of testimony and a method so simple it embarrassed the academy: sit down, ask a person about their work, and listen until they tell you something neither of you expected.
I did not sit down with anyone for this book. That confession matters. I built with Claude. I wrote about building with Claude. I theorized about the future of work from inside the most intense work experience of my life. Every word in *The Orange Pill* was produced from the position of the builder — from the top of the tower, looking out at the landscape, trying to see the pattern.
Terkel would have gone downstairs.
He would have found the engineer in Trivandrum who spent two days oscillating between excitement and terror, and he would have asked her to describe, not the oscillation — that is my word, my framework — but the Tuesday. What she ate for lunch. Whether she called her mother that evening. What her hands felt like at the end of the day, after eight hours of directing a tool to do what her hands used to do.
He would have found the spouse of the builder addicted to Claude Code and asked her not about productive addiction — that is a concept, a category, a framework for understanding — but about the specific quality of the silence at the dinner table on a Thursday. Whether the children noticed. What she said to herself in the mirror that morning.
He would have gone to Nairobi and sat with the data labeler, and he would not have needed a theory of invisible labor to know that her testimony mattered. He would have known because she was there, in the chair, and she had a voice, and the voice had not been heard.
I cannot do what Terkel did. I lack his patience, his self-effacement, his willingness to disappear behind the voices he recorded. I am a builder. I lean forward. I reach for the framework. I want to understand the pattern before the data is in, because the pattern is how I navigate, and without it I am lost.
But working through Terkel's lens these past weeks has left a residue I did not expect. A discomfort with my own certainties. The ascending friction thesis — which I believe is true — sounds different when I imagine Terkel sitting with the engineer who made the ascent and asking her not whether the new level is economically valuable but whether it feels like hers. The democratization argument — which I believe is real and morally significant — sounds different when I imagine Terkel sitting with the developer whose role the democratization eliminated.
The frameworks hold. I am not abandoning them. But frameworks describe. They do not testify. And testimony is the thing the moment most needs — the specific, human, unsmoothed account of what it feels like to be a person whose work is being remade by a force that does not know or care that the work was also a self.
The book that Terkel would write about AI does not exist yet. Someone needs to write it. Not a builder. Not a philosopher. Not a policy analyst. A listener. Someone willing to put the framework down and pick up the recorder. Someone who understands that the most important sentence in any interview is the one the subject did not plan to say — the one that surprises them as it leaves their mouth, the one that tells them something they did not know they knew.
The voices are out there, right now, in every office and classroom and kitchen and labeling center on earth. They are describing, to anyone who will listen, what it feels like to work with machines. The descriptions are messy. They are contradictory. They do not resolve into theses.
They are the truth of the moment. And they are waiting to be heard.
Every framework for understanding AI measures what the technology produces. Studs Terkel spent sixty years measuring something else: what work produces inside the worker. The steelworker who wanted to point at a building and say "I helped make that." The gravedigger who took pride in straight lines. The piano tuner whose ear could hear what no instrument could measure. Each carried a "mark" — the evidence that their labor had changed the world and, in changing it, had made them who they were.
When AI absorbs the tasks through which that mark was made, what happens to the self that was being formed? This book applies Terkel's oral-historical lens to the most significant labor disruption since industrialization — not to reject the transformation, but to insist that its human cost be recorded in human voices rather than productivity dashboards.
The testimonies of the AI transition — the engineer whose hands went quiet, the spouse watching from the next room, the data labeler in Nairobi whose clicks train the models — are waiting to be heard. Terkel showed us how to listen. The question is whether we will.
