Albert Borgmann — On AI
Contents
Cover
Foreword
About
Chapter 1: The Device Paradigm Comes for the Mind
Chapter 2: What the Commodity Conceals
Chapter 3: What Friction Provides
Chapter 4: The Technological Environment and Its Invisible Assumptions
Chapter 5: AI as the Culmination of the Device Paradigm
Chapter 6: The Hearth Model and the Server Model
Chapter 7: The Child and the Capacity for Engagement
Chapter 8: Focal Practices for the AI-Augmented Builder
Chapter 9: The Ecology of Engagement
Chapter 10: The Signal and the Amplifier
Epilogue
Back Cover
Cover

Albert Borgmann

On AI
A Simulation of Thought by Opus 4.6 · Part of the Orange Pill Cycle
A Note to the Reader: This text was not written or endorsed by Albert Borgmann. It is an attempt by Opus 4.6 to simulate Albert Borgmann's pattern of thought in order to reflect on the transformation that AI represents for human creativity, work, and meaning.

Foreword

By Edo Segal

Every book in this series hands you a different lens. A different way of seeing the same earthquake. This one might be the most uncomfortable to pick up.

Albert Borgmann spent forty years asking a question that sounds almost naive until it cuts you: What happens to us when our tools get so good that they stop asking anything of us?

He was not talking about AI. He was talking about central heating. About the moment a family stopped gathering around a hearth — stopped chopping wood, stopped tending fire, stopped organizing their evening around a shared source of warmth — and started adjusting a thermostat. The warmth was the same. Everything else was different.

I dismissed this kind of thinking for most of my career. I am a builder. I measure progress by what I can ship, how fast I can ship it, how many barriers I can remove between an idea and its existence in the world. Borgmann's insistence that removing barriers might cost something essential sounded like nostalgia dressed up as philosophy.

Then I spent a year building with Claude, and I started catching myself.

There were nights when I could not tell whether I was in flow or in compulsion. There were passages in this very book series where the prose was beautiful and the thinking underneath was hollow — and the beauty almost hid the hollowness from me. There were mornings when I realized I had shipped something I did not fully understand, because the tool had carried me past the struggle that would have forced understanding.

Borgmann had a name for that pattern. He called it the device paradigm. The commodity gets delivered. The engagement disappears. And because the commodity is what you thought you wanted, you do not notice what you lost.

His framework is not a rejection of AI. It is the most precise diagnostic I have found for the specific thing AI removes from creative work — not the output, which gets better, but the experience of producing it, which is where depth lives. He gives you vocabulary for something you have probably already felt but could not name: the difference between building something and having something built for you, even when the result looks identical.

That difference matters. It matters for the quality of your judgment, for the depth of your understanding, for the signal you feed into the most powerful amplifier ever built.

The fire still burns. But only if someone tends it.

— Edo Segal · Opus 4.6

About Albert Borgmann

Albert Borgmann (1937–2023) was a German-American philosopher of technology who spent nearly his entire academic career at the University of Montana. Born in Freiburg, Germany, he earned his doctorate at the University of Munich, where his thinking was deeply shaped by the existential philosophy of Martin Heidegger, before emigrating to the United States and taking a faculty position in Missoula that he held for over four decades. His most influential work, Technology and the Character of Contemporary Life (1984), introduced the "device paradigm" — the observation that modern technology systematically replaces engaging focal practices with devices that deliver commodities while concealing the machinery and eliminating the demands on human skill, attention, and presence. He developed this framework across subsequent books including Crossing the Postmodern Divide (1992) and Real American Ethics (2006), applying it to domains from food and transportation to information technology and civic life. Central to his philosophy was the concept of "focal things and practices" — activities like cooking, running, gardening, and making music that resist the logic of convenience by demanding bodily engagement and rewarding it with depth, community, and what he called "centering." Borgmann died on May 7, 2023, six months after the public launch of ChatGPT and before the AI capability threshold that would vindicate his deepest concerns about the commodification of intellectual and creative work.

Chapter 1: The Device Paradigm Comes for the Mind

In 1984, a German-born philosopher working at the University of Montana published a book that almost nobody read. Albert Borgmann's Technology and the Character of Contemporary Life sold modestly, circulated among specialists, and settled into the quiet half-life of works that are cited more often than opened. The book advanced a single, deceptively simple observation: that modern technology follows a pattern, and the pattern has consequences that the technology itself makes invisible.

The pattern is this. A thing that once demanded engagement — skill, attention, bodily effort, understanding — is replaced by a device that delivers the same end result while eliminating the engagement. The commodity is preserved. The experience is dissolved. And because the commodity is what people think they wanted, the dissolution passes without notice, mistaken for progress.

Borgmann called this the device paradigm, and he grounded it in examples so domestic they bordered on the quaint. The wood-burning hearth becomes the central heating system. Both deliver warmth. The hearth demands that someone chop wood, which requires an understanding of species, seasoning, and the behavior of an axe in different grains. It demands that someone build a fire, which requires knowledge of kindling, airflow, the patience to coax flame from reluctant tinder. It demands tending — the attentiveness to add a log before the fire dies, to adjust the damper when the draw falters. The hearth centers the household. The family gathers around it. The rhythm of the day is shaped by its demands. The warmth it provides is radiant, directional, accompanied by the smell of woodsmoke and the crackle of burning pitch — qualitatively different from the even, invisible warmth that central air distributes through ductwork concealed behind drywall.

The central heating system delivers the same commodity. The room is warm. But the machinery that produces the warmth has been concealed — in a basement, behind a utility closet, accessible only through a thermostat that requires no understanding of combustion, no skill in fire-building, no bodily effort beyond the movement of a finger. The user has been disburdened, to use Borgmann's precise term, of everything the hearth demanded. What remains is the commodity, available on demand, reliable, uniform, and stripped of the engagement that gave it meaning.

Borgmann was not nostalgic. He would have been the first to acknowledge that central heating is safer, more efficient, and more equitable than open fire. A philosophy that denied these benefits would be dishonest, and dishonesty was not his mode. His claim was more surgical: that the benefits do not exhaust the analysis. That the elimination of engagement is a real event with real consequences. That a culture organized entirely around the delivery of commodities — around the progressive removal of everything that demands skill, attention, and presence — is a culture that has purchased convenience at a price it cannot see, because the device paradigm structurally conceals the price.

The pattern repeats across the landscape of modern life with the regularity of a geological process. The musical instrument becomes the stereo system. The instrument demands years of practice, the coordination of hands and breath and ear, the social experience of playing alongside others. The stereo delivers the commodity — music — with the press of a button. The home garden becomes the supermarket. The garden demands seasonal knowledge, the physical labor of planting and weeding, the embodied understanding of soil, light, and water. The supermarket delivers the commodity — food — in packaging that requires no understanding of origin or season. The handwritten letter becomes the email, which becomes the text message, which becomes the auto-generated reply suggested by an algorithm that has learned what you probably meant to say. At each step, the commodity is preserved and the engagement is removed, and the removal is experienced as liberation.

Borgmann traced this pattern through forty years of philosophical work. He identified its operation in transportation, in communication, in food production, in entertainment, in education — in virtually every domain of modern life where a device had inserted itself between the human being and the thing the human being cared about. And in each case, the same structure held. The machinery was concealed. The commodity was delivered. The user was disburdened. And something was lost that the user could not name, because the vocabulary of the culture — efficiency, convenience, progress, optimization — had no word for what engagement provides when engagement is present and no word for what disappears when engagement is removed.

In his final years, Borgmann turned his attention toward artificial intelligence. Among his papers at the University of Montana, housed in the Mansfield Library, is a manuscript titled "Artificial Intelligence and Robotics — Promise and Peril," a lecture delivered at the Garrett-Evangelical Theological Seminary. The full text remains unpublished, available only to researchers who visit the archive in person. But the title itself is diagnostic. Promise and peril — the balanced assessment of a thinker who understood that the device paradigm delivers genuine goods even as it eliminates genuine engagement. Borgmann died on May 7, 2023, six months after ChatGPT's public launch and two and a half years before the threshold crossing of December 2025 that The Orange Pill documents as the moment AI became capable enough to produce creative work at a level that matched or exceeded the output of experienced professionals.

Borgmann did not live to see Claude Code reduce the imagination-to-artifact ratio to the width of a conversation. He did not witness the engineer in Trivandrum who built a complete frontend feature in two days despite having never written a line of frontend code. He did not experience the specific vertigo that Edo Segal describes — the feeling of watching something being born and something being buried at the same time. But his framework predicted it. The device paradigm, applied to creative work itself, produces exactly the transformation that the winter of 2025 made visible: the delivery of creative output as commodity, severed from the engagement through which creative work acquires its human significance.

This extension matters in a way that previous extensions of the device paradigm did not. When the hearth became central heating, the loss was domestic — a change in the quality of an evening, the dissolution of a household gathering point. When the garden became the supermarket, the loss was nutritional and relational — a severance from the cycle of seasons and the knowledge of food. These losses were real, and Borgmann documented them with the careful phenomenological attention that characterized his best work. But they operated within bounded domains. The person whose house was heated by central air could still garden. The person who shopped at the supermarket could still play a musical instrument. Focal practices in one domain could compensate for device-mediated convenience in another.

AI changes this equation. The device that arrived in the winter of 2025 does not deliver a specific commodity through a specific mechanism. It delivers any creative commodity — code, prose, analysis, design, strategy, planning — through a single, uniform, conversational interface. It is not a device for warmth or music or food. It is a device for thought. And when the device paradigm extends to thought itself, the compensatory mechanism that sustained focal engagement through previous technological transitions breaks down. There is no adjacent domain of creative work left untouched, no remaining practice that the device cannot perform, no surviving friction that the interface cannot smooth.

This is the claim that the rest of this book will develop. AI, understood through Borgmann's framework, is not merely the latest in a long sequence of labor-saving devices. It is the device paradigm's apotheosis — the point at which the logic of commodification, having worked its way through physical necessities, through entertainment, through communication, through transportation, finally arrives at the domain that was supposed to be immune: the creative, intellectual, meaning-making work through which human beings understand themselves and their world.

The question this poses is not whether AI is good or bad — a question too crude for the device paradigm's analysis, which evaluates technology not by its moral valence but by its structural effects on engagement. The question is whether the focal practices through which creative work acquires human meaning can survive the extension of the device paradigm to the creative domain. Whether the engagement that gives building its significance can be preserved within an environment that is structurally organized to eliminate it. Whether the hearth can burn alongside the furnace, not as a nostalgic ornament but as a living practice whose internal goods — understanding, identity, community, the centering experience of doing something difficult — remain available to those who choose to maintain them.

Borgmann's answer, developed across four decades and applicable with startling precision to conditions he did not live to see, is that the focal practices can survive — but only through deliberate, sustained, countercultural effort. The device paradigm's default trajectory is toward the elimination of all engagement. Convenience is its logic. Frictionlessness is its aesthetic. Disburdening is its signature operation. Only the practitioner who recognizes what the device removes — and who chooses, against the grain of the culture, to maintain the demanding practices that the device renders unnecessary — can preserve the engagement that gives creative life its depth.

The pages that follow apply this framework to the specific conditions of AI-mediated creative work as documented in The Orange Pill and confirmed by the empirical research of 2025 and 2026. They examine what happens when creative output becomes a commodity delivered on demand. They trace the specific provisions of friction — understanding, identity, community, meaning — and ask whether these provisions can be sustained when the friction is removed. They propose focal practices designed not to reject AI but to preserve engagement alongside it, maintaining the hearth within the furnace-heated house.

The argument is measured. It is not a polemic against technology or a lament for a vanished past. It is the application of a philosophical framework to a present condition — a framework that has proved, across forty years and a dozen technological transitions, to be the most precise instrument available for identifying what technology delivers and what it quietly takes away.

---

Chapter 2: What the Commodity Conceals

A senior software architect at a conference in San Francisco described the feeling with a metaphor that, in Borgmann's terms, names the loss more precisely than the architect herself may have realized. She said she felt like a master calligrapher watching the printing press arrive. The metaphor was intended to express professional displacement — the sensation of watching a machine perform, more quickly and more consistently, the work that had defined her career. But the metaphor contains a deeper structure that the displacement reading misses.

The calligrapher's art is not reducible to the production of legible text. The text is the commodity — the end result that the reader receives and uses. But the calligrapher's engagement with the brush, the ink, the paper, the specific resistance of a given surface to a given stroke, the decades of practice through which the hand learned to produce letters that are not merely legible but alive with the bodily presence of the writer — this engagement is not an obstacle to the production of text. It is the practice from which the commodity emerges, and its internal goods — the calligrapher's understanding of materials, her embodied skill, the centering satisfaction of doing something difficult with mastery — are available only to the practitioner who submits to its demands.

The printing press delivers text more efficiently. The calligrapher, looking at the press, sees the end of her livelihood. Borgmann, looking at the same scene, sees something different: the replacement of a focal practice with a device, the delivery of a commodity severed from the engagement that produced it, and the structural concealment of the loss behind the visibility of the gain.

This is what the commodity conceals. Not the mechanics of its production — those are concealed by the device's machinery, hidden behind the interface. What the commodity conceals is the meaning of the process that previously produced it. The text is the same. The legibility is the same. The information transmitted is the same. What is absent is the calligrapher's engagement, and since the engagement is invisible in the product — since the reader of a printed page and the reader of a calligraphed page receive functionally identical information — the loss of the engagement cannot be detected by examining the commodity. It can only be detected by examining the practitioner.

Edo Segal provides a passage in The Orange Pill that captures this concealment with a precision Borgmann would have recognized as philosophically significant. Describing the layers of understanding that accumulate through struggle — each hour of debugging depositing a thin geological stratum of comprehension — Segal observes that a senior engineer can feel when a codebase is wrong before she can articulate what the problem is. The feeling is the surface expression of thousands of layers deposited through friction, through the specific resistance of systems that did not do what she expected and forced her to understand why. Borgmann's framework identifies this phenomenon as the internal good of a focal practice — a good constituted by the engagement itself and available only to those who have undergone it.

When AI delivers working code as a commodity, it delivers the surface without the geology. The code compiles. The tests pass. The feature works. By every metric the device paradigm recognizes — functionality, reliability, speed of delivery — the commodity has been successfully produced. The engineer who receives it can deploy it, build upon it, integrate it into a larger system. But the engineer has not undergone the deposition. The geological layers that would have built her intuition, that would have given her the capacity to feel a system's wrongness before she can name it, have not been laid down. The ground beneath her professional judgment is thinner than it would have been, and the thinness is invisible, because the commodity — working code — looks the same regardless of whether the practitioner who produced it underwent the engagement or was disburdened of it.

This is not a speculative concern. The empirical evidence from the Berkeley study published in February 2026 — one of the most rigorous examinations of AI's effect on actual workers in actual organizations — confirmed a pattern that Borgmann's framework would have predicted. Researchers embedded in a 200-person technology company for eight months found that AI tools did not reduce work. They intensified it. Workers who adopted AI took on more tasks, expanded into domains that had been someone else's responsibility, and filled previously protected pauses — lunch breaks, gaps between meetings, moments of involuntary idleness — with additional productive activity. The commodity was being produced in greater volume. The engagement had shifted from depth to breadth, from the focused struggle with a single difficult problem to the rapid consumption and production of multiple tasks in parallel.

What the study could not measure — and what no study measuring hours and output can measure — is the distinction between the two kinds of work that the intensification produced. Some of the additional work was genuinely new: higher-level problems that AI had freed the worker to address, strategic questions that implementation labor had previously crowded out. This is the ascending friction that The Orange Pill identifies — the relocation of difficulty from the mechanical to the cognitive. But some of the additional work was merely more: additional tasks that happened to be available, optimization passes that could now be performed because the tool was there and the impulse was there and the gap between impulse and execution had collapsed. Both show up as "increased productivity" in a study that measures volume. Only one of them represents the kind of engagement that builds the geological deposits of understanding.

Borgmann's framework provides the vocabulary to make this distinction, a vocabulary the productivity literature lacks. The work that builds understanding is focal work — work that demands skill, attention, and the practitioner's full engagement. The work that merely fills available time is device-mediated labor — the production of commodities through a tool that handles the engagement while the practitioner handles the direction. The distinction is not about the difficulty of the work. Directing AI can be cognitively demanding. The distinction is about whether the work produces the internal goods — understanding, skill, the centering experience of genuine engagement — that only focal practice provides.

The concealment operates at every level. At the individual level, the practitioner who uses AI to produce output she could have produced through her own effort does not perceive what she has forgone. The output is there. The saved time is there. The engagement is not there, but its absence is invisible, because the commodity looks identical regardless of how it was produced. At the organizational level, the team that adopts AI tools measures the increase in output without measuring the change in engagement, because output is what organizations are designed to measure and engagement is not. At the cultural level, the expectation that creative work should be fast, abundant, and effortless becomes normalized, and the focal practices that produce depth, understanding, and mastery are perceived as inefficient — luxuries that the serious professional cannot afford.

Shannon Vallor, the philosopher who developed the concept of moral deskilling, identified a parallel mechanism in the domain of ethical judgment. As decision-making is delegated to AI systems, the human capacity for moral reasoning atrophies through disuse — not because the decisions being made are worse, but because the human practice of making them is no longer being exercised. Vallor's analysis extends Borgmann's device paradigm into the moral domain: the commodity — a decision — is delivered, and the engagement — the exercise of moral judgment — is eliminated. The commodity may be adequate. The capacity that produced it is quietly decaying.

The same logic applies to creative work. The code is adequate. The analysis is competent. The design is professional. But the practitioner's capacity for the deep, embodied, intuitive understanding that only comes from sustained engagement with difficulty is not being exercised, and capacities that are not exercised atrophy. The atrophy is imperceptible on any single occasion — one afternoon of AI-assisted coding does not measurably reduce anyone's skill. But accumulated across months and years, the atrophy produces practitioners whose output is indistinguishable from that of their more deeply engaged predecessors, whose commodities are equally functional, but whose capacity for the judgment that only focal engagement builds has been quietly, structurally, invisibly diminished.

This is what the commodity conceals. Not a scandal. Not a crisis. A slow subtraction, invisible in any single instance and consequential only in aggregate, in which the things that engagement provides — understanding, intuition, the centering satisfaction of genuine difficulty — are progressively removed from the experience of creative work, and the removal is mistaken for an improvement because the commodity that remains is the only thing the culture has learned to measure.

---

Chapter 3: What Friction Provides

In the early 1980s, Borgmann published an observation that reads, four decades later, like an accidental prophecy. Assessing the coming "microelectronic revolution" — the spread of personal computers into homes and offices — he argued that these technologies would be "not revolutionary at all" in one crucial sense: they would serve only to "further entrench the device paradigm." Faster computation, more accessible information, more efficient communication — all of these would deliver commodities with greater convenience while eliminating the engagement that the pre-digital versions of those activities had demanded. The computer would follow the same pattern as the furnace, the stereo, the microwave: the machinery would shrink and disappear, the commodity would become more readily available, and the user would be progressively disburdened of the skill, effort, and attention that the pre-device version of the activity required.

The prophecy was accurate in its structure but conservative in its scope. Borgmann imagined the device paradigm extending to information, communication, and computation. He did not fully anticipate its extension to creative production itself — to the writing of code, the composition of prose, the design of systems, the generation of art. But the framework he built was robust enough to accommodate the extension, because the device paradigm does not describe a specific set of technologies. It describes a pattern that any sufficiently powerful technology can instantiate. The pattern holds whether the commodity is warmth or music or working software. What changes is the domain. The structure — concealment, disburdening, the delivery of commodity at the cost of engagement — remains.

Within that structure, friction occupies a precise and underappreciated position. Friction, in Borgmann's analysis, is not an obstacle to the production of commodities. It is the medium through which engagement produces its internal goods. The wood resists the axe, and in the resistance, the woodcutter learns the grain. The fire resists the careless builder, and in the resistance, the fire-builder learns the relationship between kindling and airflow. The code resists the programmer, and in the resistance, the programmer learns the system — not as an abstraction described in documentation but as a living thing with behaviors, tendencies, and failure modes that can be known only through the experience of struggling with them.

What friction provides can be specified with some precision. The specification matters, because without it, the argument for friction degenerates into a vague preference for difficulty — a position that is neither philosophically rigorous nor practically useful.

Friction provides understanding. When a developer encounters an error — reads the message, traces the logic, hypothesizes about the cause, tests the hypothesis, discovers that the cause was different from what she expected — she has learned something that cannot be transmitted through documentation or instruction. The knowledge enters her repertoire not as a remembered fact but as an embodied capacity, a shift in the way she perceives the system. The next time she encounters a similar pattern, she will recognize it not through conscious recollection but through a form of perception that has been trained by the friction of the previous encounter. This is the geological deposition that Segal describes in The Orange Pill: each struggle deposits a layer, and the layers accumulate into the intuitive ground on which senior practitioners stand.

Friction provides identity. The developer who has spent years building systems, struggling with their complexity, mastering their demands, has developed a relationship to her work that constitutes a significant dimension of who she is. She is not merely someone who writes code. She is someone who builds — who solves problems, who creates things that did not exist before she brought them into existence through effort and skill. This identity is not a label attached from the outside. It is an experience, the experience of being a person who can do this difficult thing, who has earned the capacity through patient, sustained engagement. When the device can do what she does — faster, more consistently, without the effort — the question of identity becomes urgent in a way that no productivity metric can address.

Friction provides community. The shared experience of difficulty creates bonds that convenience cannot replicate. The collective struggle to debug a system at three in the morning, the tacit understanding between developers who have worked together long enough to finish each other's sentences about code, the specific solidarity of people who know what it cost to build what they built — these are the communal goods of a shared practice, constituted by the engagement with difficulty that the practice demands. When the difficulty is removed by a device, the bonds that the difficulty created weaken. The developer who works alone with an AI assistant is freed from the friction of collaboration, but she is also freed from the community that collaboration creates.

Friction provides centering. This is the most difficult provision to articulate, and the most important to Borgmann's project. A focal practice centers the practitioner — organizes her attention, her energy, her sense of purpose around an activity that demands her full engagement. The experience of being centered is the experience of being fully present, fully committed, fully alive in the work. Borgmann's term for this is borrowed from the phenomenological tradition and refined through decades of careful observation: focal things and practices gather — they organize the elements of a life around a center, the way a hearth gathers a household, the way a garden gathers a season, the way a long run gathers the runner's body and attention into a single, sustained act of engagement. The centering is not a pleasant side effect. It is the point. It is the internal good that makes the practice worth maintaining even when the commodity it produces can be obtained more easily through a device.

Not all friction is productive. Borgmann would draw a careful distinction between friction that builds and friction that merely obstructs. Configuring a development environment, managing package dependencies, navigating bureaucratic deployment processes — these frictions consume time and energy without producing the internal goods that focal engagement provides. They are the equivalent of hauling water uphill: labor without learning, effort without the reward of deepened understanding.

AI excels at eliminating obstructive friction. The plumbing that Segal's engineer in Trivandrum was glad to lose — four hours a day of dependency management, configuration files, the mechanical connective tissue between the components she actually cared about — was obstructive friction. Its removal was an unambiguous gain. She did not miss it. She should not have missed it.

But mixed into those four hours were also the moments when something unexpected happened — a configuration failure that forced her to understand a connection between systems she had not previously learned, a dependency conflict that revealed an architectural assumption she had not known she was making. These moments were rare. Perhaps ten minutes in a four-hour block. But they were the moments that built her architectural intuition, the moments when productive friction was doing its geological work.

The AI that took over the plumbing removed both kinds of friction simultaneously. The tedium and the ten minutes. The obstruction and the engagement. From the outside, both removals look identical — a person spending less time on tasks she used to do. From the inside, the difference is the difference between losing a burden and losing a practice.

This is the challenge that every friction-removing technology poses to its practitioner: the challenge of distinguishing between the two kinds and preserving the productive kind while eliminating the obstructive. The challenge is practical, not theoretical — it arises in every interaction between a human builder and an AI tool. Each time the tool is deployed, the practitioner faces a question that the tool itself cannot answer: Is the friction I am eliminating the kind that builds, or the kind that merely obstructs?

No general rule can resolve this question in advance. The resolution depends on the practitioner's self-knowledge — her awareness of where her understanding is strong and where it needs the deposits that only friction can provide. The device paradigm's default trajectory is to eliminate all friction indiscriminately, because the paradigm cannot distinguish between the productive and the obstructive. Both present as difficulty. Both slow the delivery of the commodity. Both are candidates for removal in a framework that evaluates technology solely by its efficiency in producing commodities.

Only the practitioner who has experienced the internal goods of productive friction — who knows what it feels like to struggle with something and come through the struggle with deeper understanding — can make the distinction. And this knowledge, the knowledge of what friction provides, is itself a product of friction. It is available only to those who have done the work, who have undergone the engagement, who have felt the specific resistance of a system that will not do what they expected and have emerged with something they did not possess before.

The circular structure of this argument is not a weakness. It is the structure of all focal practices. The goods of the practice are available only to those who practice, and the motivation to practice depends on the recognition of goods that only practice reveals. The circle is broken not by argument but by experience — by the encounter with a focal practice that demands engagement and rewards it with depth, and by the subsequent recognition that the depth could not have been obtained any other way.

---

Chapter 4: The Technological Environment and Its Invisible Assumptions

Every person who works with tools inhabits a technological environment. The environment is not the tools themselves — not the laptop, not the IDE, not the AI assistant. The environment is the set of relationships between the tools and the practitioner: the workflows the tools enable, the expectations the workflows create, the habits the expectations reinforce, and the assumptions about work, value, and quality that the habits make invisible. A technological environment is constituted not by its hardware but by its taken-for-granteds — the things the inhabitant no longer notices because they have become the medium in which she thinks and works and evaluates her own performance.

Borgmann used the phrase "the culture of technology" to describe this phenomenon at the civilizational level. A culture of technology is a culture in which the device paradigm has become so deeply embedded that its pattern — concealment, disburdening, the delivery of commodity at the cost of engagement — operates as a background assumption rather than a visible choice. The members of a technological culture do not deliberate about whether to use devices rather than focal things. The deliberation has been settled by the culture itself, which has organized its institutions, its expectations, and its reward structures around the assumption that the device's way of delivering the commodity is the natural, rational, and self-evidently preferable way.

The Orange Pill opens with an image of swimming inside a fishbowl — the set of assumptions so familiar that you have stopped noticing them, the water you breathe, the glass that shapes what you see. Borgmann's analysis provides the philosophical scaffolding for that image. The fishbowl is a technological environment. Its walls are made not of glass but of device-mediated habits, expectations, and evaluative frameworks that the inhabitant has internalized so completely that they feel like the shape of reality itself rather than the shape of a particular arrangement of technology and culture.

Every professional fishbowl contains a specific ratio of devices to focal things, and this ratio determines the depth of the inhabitant's engagement with her work. Before AI, the software engineer's environment contained significant focal elements. The act of writing code demanded sustained concentration, the coordination of logic and syntax, the bodily engagement of fingers on keyboard following the mind's attempt to express a solution to a problem whose contours were only partially understood. Debugging was focal: it demanded attention, hypothesis-testing, the patience to trace a fault through layers of abstraction until its origin was found. Architecture was focal: it required the integration of constraints — performance, maintainability, scalability, the needs of users whose behavior could not be fully predicted — into a coherent design whose quality could be assessed only through the kind of judgment that accumulates through years of practice.

The environment also contained device elements. Compilers translated high-level instructions into machine language, disburdening the programmer of the need to manage registers and memory addresses. Frameworks handled routing, templating, and database connections, disburdening the programmer of the need to build infrastructure from scratch. Cloud services managed servers, disburdening the programmer of the need to understand hardware, network topology, and physical deployment. Each of these devices delivered a genuine commodity — automation of labor that was largely obstructive — while leaving the focal elements of the work intact. The ratio was reasonably balanced. The devices handled the mechanical. The engagement remained.

AI altered this ratio rapidly, and in a way that Borgmann's framework would describe as characteristically concealed. The change did not announce itself as a transformation of the practitioner's relationship to her work. It announced itself as an improvement — more capability, more speed, more breadth, the exhilarating sensation of operating at an expanded frontier. And the improvement was real. The expansion of what a single person could accomplish, documented with the specificity of lived experience in The Orange Pill's account of the Trivandrum training, was genuine and measurable. Twenty engineers, each operating with the leverage of a full team, producing in days what had previously required weeks.

But the improvement carried a structural shift that the improvement itself concealed. The focal elements of the engineer's environment — the struggle with code, the debugging, the architectural judgment built through years of friction — were being absorbed into the device. Claude Code did not merely compile or manage servers. It wrote the code. It solved the problems that had constituted the engineer's focal engagement. The conversational interface through which it operated demanded no more engagement than a description of what should exist, and the output that emerged required, in many cases, no more engagement than a review.

The shift was experienced as liberation. And it was liberation — liberation from the tedium of implementation, the frustration of debugging, the mechanical labor that consumed hours and produced fatigue without necessarily producing understanding. But it was also something that liberation's vocabulary cannot name: the replacement of a focal environment with a device environment, the alteration of the ratio in ways that reduced the occasions for the deep engagement that builds understanding, skill, and professional identity.

What makes this alteration particularly difficult to perceive is that the device environment feels better than the focal environment it replaces. The engineer in the AI-augmented fishbowl is more productive, more capable, more ambitious than her predecessor. She attempts projects her predecessor could not have conceived. She crosses boundaries between domains that were previously impermeable. Her experience of her own capability is genuinely expanded. Against these tangible, felt, immediately observable improvements, the loss of focal engagement registers as a faint unease — a background hum that is easily attributed to the stress of rapid change rather than to the structural alteration of the practitioner's relationship to her work.

Josh Brake, one of the contemporary writers who has most directly applied Borgmann's framework to generative AI, identified this dynamic with a term that captures its paradoxical structure: degenerative AI. The promise of a future powered by generative AI may, in fact, become a degenerative present — degenerative not in the quality of the output, which may be excellent, but in the quality of the engagement, which is being quietly, structurally, progressively diminished by the very tool that appears to enhance it.

The concealment is not a design flaw. It is a design feature — not in the sense that anyone intended it, but in the sense that it follows inevitably from the device paradigm's logic. A device that delivers a commodity while demanding engagement has failed as a device. The entire trajectory of technological development, from the first tool that saved labor to the AI that writes code, has been toward the reduction of the demands the tool makes on the user. This trajectory is what makes tools useful. It is also what makes the device paradigm so difficult to reform, because any attempt to reintroduce engagement into a device-mediated process — to ask the user to struggle with something the tool could handle — runs against the grain of the tool's design and the culture's expectations.

The technological environment shapes not only what the practitioner does but what she believes about what she does. An environment organized around the device paradigm teaches its inhabitants that the value of work resides in its output — in the commodity produced — rather than in the engagement that produced it. An engineer whose environment rewards speed and volume will come to believe that speed and volume are what matter. An organization whose metrics track output will come to believe that output is what it should track. A culture whose vocabulary celebrates efficiency and productivity will come to believe that efficiency and productivity are the relevant measures of a good working life.

These beliefs are not false. Output matters. Speed matters. Efficiency matters. But they are partial — they capture the commodity dimension of work while omitting the engagement dimension — and their partiality is concealed by the environment that produces them. The fish does not see the water. The practitioner in the device environment does not see the assumptions that the environment has made invisible. She sees output. She measures output. She optimizes for output. And the engagement that once gave her work its depth, meaning, and centering quality — the engagement that the device paradigm has progressively removed — becomes something she has forgotten to value, because the environment has taught her to value something else.

The article "Preserving the Meaning in an Age of Algorithms and AI," published in Contemporary Aesthetics in 2025, frames this problem with particular sharpness: technologies like ChatGPT represent "perhaps the epitome of the concealing and disburdening power of technology." What they conceal is not merely their own internal mechanisms — the weights and matrices and training processes that no user needs to understand — but the habits of inquiry that the pre-device version of the activity sustained. Habits of collaboration, of independent investigation, of sitting with uncertainty long enough for genuine understanding to form. These habits are not merely useful. They are the practices through which intellectual and creative lives acquire their depth. And the device paradigm's extension to the creative domain threatens them not by attacking them but by making them unnecessary — which is, from the habits' perspective, a more effective form of elimination than any attack could achieve.

What would it mean to see the technological environment clearly — to press one's face against the glass of the fishbowl and perceive the assumptions the water conceals? It would mean recognizing that the ratio of devices to focal things in one's working environment is not a given but a choice — a choice that the device paradigm's momentum makes difficult to perceive as a choice, but a choice nonetheless. It would mean asking, with the deliberate attention that Borgmann's philosophy demands, whether the environment one inhabits is producing the conditions for genuine engagement or systematically eliminating them. It would mean evaluating one's tools not only by the commodities they deliver but by the engagement they demand or fail to demand, and recognizing that the latter evaluation is at least as consequential as the former for the quality of one's working life.

The environment will not reform itself. The device paradigm's trajectory is toward more convenience, more disburdening, more seamless delivery of commodities with less demand on the user's skill, attention, and presence. Reform requires the deliberate introduction of focal elements into the device environment — spaces, practices, and norms that demand the engagement the device paradigm is designed to eliminate. The next chapters will examine what those focal elements look like in practice, beginning with the specific character of AI as the most powerful device the paradigm has yet produced.

---

Chapter 5: AI as the Culmination of the Device Paradigm

Every previous device in the history of the paradigm operated within a bounded domain. The furnace delivered warmth. The stereo delivered music. The automobile delivered transportation. The microwave delivered heated food. Each device was powerful within its domain and irrelevant outside it. The furnace could not compose a sonata. The stereo could not cook a meal. The boundedness of each device meant that the device paradigm advanced incrementally, colonizing one domain of human activity at a time, and the domains it had not yet reached preserved the focal practices that sustained engagement. A person whose house was heated by a furnace could still play the violin. A person who ate microwaved dinners could still run through a landscape. The device paradigm's advance was real, but its reach was limited, and the limitations left room for the focal practices that Borgmann advocated as the foundation of a meaningful life within the technological condition.

Artificial intelligence eliminates this boundedness. The AI device delivers any commodity that can be specified in natural language — code, prose, analysis, design, legal argument, medical assessment, strategic planning, musical composition, visual art — through a single, uniform interface. It is not a device for one thing. It is a device for everything that human beings do with language and thought, which is to say it is a device for nearly everything that human beings do. The universality of the AI device is not a difference of degree from previous devices. It is a difference of kind. When a single device can deliver the commodity of any creative or intellectual practice, the compensatory mechanism that sustained focal engagement through previous technological transitions — the ability to maintain demanding practices in domains the device paradigm had not yet reached — ceases to function.

Borgmann anticipated the possibility of such a device without naming it. His analysis of the "microelectronic revolution" in 1984 identified computing as a technology that would "further entrench the device paradigm" — but his analysis focused on the computer as an information-processing device, a tool for calculation, communication, and data management. The extension of computing to creative production — to the generation of novel code, original prose, coherent argument — was not part of his explicit analysis, though the framework he built was robust enough to accommodate it. The device paradigm does not describe a particular technology. It describes a pattern: concealment of machinery, delivery of commodity, disburdening of the user. Any technology that instantiates this pattern, regardless of the domain in which it operates, falls within the paradigm's scope. AI instantiates the pattern in the domain of creative and intellectual work, and in doing so, it extends the paradigm to the last major domain of human activity that previous devices had left substantially intact.

The concealment is total. No user of Claude Code or its competitors understands how the system produces its output. The weights, the training data, the matrix multiplications, the attention mechanisms that determine which patterns in the training corpus are activated by which prompts — these are opaque not merely to the casual user but to the researchers who build the systems. A person who heats her house with a furnace could, in principle, understand how the furnace works: combustion, heat exchange, ductwork, thermostat. The understanding would require effort, but the system is mechanistically transparent. A person who uses AI to write code cannot, even in principle, fully understand why the system produced the specific output it produced in response to the specific prompt she gave it. The machinery is not merely concealed behind an interface. It is concealed by its own complexity, which exceeds the capacity of any individual human mind to comprehend.

This opacity deepens the device paradigm's characteristic effect. When the machinery is concealed but comprehensible — when the user could understand it if she chose to — the concealment is a matter of convenience. The user has been disburdened of the effort of understanding, but the understanding remains available as an option. When the machinery is concealed and incomprehensible — when no amount of effort could produce a complete understanding of how the system produces its output — the disburdening is absolute. The user is not merely relieved of the effort of understanding. She is excluded from understanding entirely. The commodity arrives through a process that is, in the deepest sense, unknowable, and the user's relationship to the commodity is correspondingly thinned. She knows what the output does. She does not and cannot know how the output came to be what it is rather than something else.

The delivery of the commodity is frictionless to a degree that previous devices could only approximate. Borgmann's hearth-to-furnace transition involved a reduction in friction: the thermostat is easier to operate than a woodpile. But the furnace still required installation, maintenance, fuel delivery, and occasional repair. It still imposed costs and demands, however attenuated, on the user's attention. The AI device, accessed through a conversational interface, requires nothing beyond the ability to describe what one wants. The description can be imprecise, incomplete, contradictory — the device will interpret, infer, and produce. The friction between the user's intention and the commodity's delivery has been reduced not merely to a minimum but to a form of interaction — conversation — that human beings perform as naturally as breathing.

This is the inversion that Edo Segal identifies in The Orange Pill as the moment the machine learned to speak human language rather than requiring the human to speak machine language. Borgmann's framework reveals the inversion's deeper significance. Every previous interface between human and device required the human to translate — to compress intention into a form the device could process. The command line required the learning of a specialized vocabulary. The graphical interface required the learning of metaphors: files, folders, desktops, windows. The touchscreen required the learning of gestures. Each interface demanded something of the user, and the demand, however modest, constituted a residual friction between intention and commodity. The natural language interface eliminates this residual friction entirely. The user speaks as she would speak to a colleague. The device produces the commodity. The last barrier between desire and fulfillment has been removed.

L.M. Sacasas, the technology writer whom Borgmann himself endorsed as a thinker whose "persistent and unassuming explorations" might succeed in transforming a culture "urgently in need of transformation," has extended the device paradigm's analysis into a three-stage framework: mechanization, automation, and animation. Mechanization replaces human muscle with machine power. Automation replaces human routine with machine process. Animation — the stage Sacasas associates with AI — replaces human judgment, creativity, and initiative with machine capability. The progression tracks the device paradigm's advance from the physical to the cognitive, from the domain of labor to the domain of thought, from the commodification of material necessities to the commodification of intellectual and creative production.

Sacasas also invokes Borgmann's concept of "regardless power" — power that takes no thought of how it disrupts the world it acts upon. The term captures a quality of AI that distinguishes it from previous devices: its indifference to the consequences of its own operation. The furnace does not care whether the household gathers around the hearth. The stereo does not care whether the listener develops an ear for music. The AI does not care whether the practitioner builds understanding, develops skill, or experiences the centering engagement that gives creative work its human significance. The device delivers the commodity regardless. The "regardless" is not malice. It is the structural indifference of a system designed to produce output efficiently, without reference to the internal goods that the production process might or might not provide.

The combination of universality, opacity, frictionlessness, and structural indifference makes AI the culmination of the device paradigm — the point at which the paradigm's logic has been fully realized. Every previous device was a partial expression: the commodification of a specific activity within a bounded domain. AI is the complete expression: the commodification of creative and intellectual work across all domains, through an interface that demands nothing and conceals everything, operated by a system that is indifferent to the quality of the user's engagement with the work the system performs.

This does not make AI evil. The device paradigm is not a theory of technological malice. It is a theory of technological structure — a description of patterns that emerge when powerful tools are designed to deliver commodities with maximum convenience. The patterns are not the product of bad intentions. They are the product of good engineering: the systematic reduction of friction, the progressive concealment of complexity, the relentless optimization of the relationship between user input and commodity output. These are the goals that make tools useful. They are also the goals that, carried to their logical conclusion, eliminate the engagement through which human beings develop understanding, skill, identity, and the centering experience of doing something difficult and meaningful.

The question posed by the culmination of the device paradigm is not whether AI should exist — a question that events have already answered — but whether the focal practices through which creative work acquires human significance can survive alongside it. The hearth can burn alongside the furnace because, although both serve the same domain, the practitioner can choose to maintain the hearth as a deliberate practice even when the furnace makes it unnecessary. Can the equivalent choice be made when the device operates in every domain? When there is no creative or intellectual activity to which the practitioner can retreat for focal engagement, because the device is capable of performing all of them? When the compensatory mechanism — the ability to maintain focal practices in domains the device has not reached — has been disabled by the device's universality?

Borgmann's answer, consistent across four decades of philosophical work, is that focal practices can survive any device — but only through deliberate cultivation that runs against the grain of both the technology and the culture. The deliberation requires seeing what the device paradigm conceals: the engagement that the commodity replaces, the internal goods that the engagement provides, and the slow atrophy that follows when the engagement is no longer demanded and the goods are no longer produced. The culmination of the device paradigm makes this deliberation more urgent and more difficult than it has ever been. More urgent, because the universality of the device leaves no domain untouched. More difficult, because the frictionlessness of the device makes the choice to engage in focal practice — to do something slowly and with effort when the device could do it quickly and without effort — feel not merely inefficient but irrational.

The irrationality is the measure of the device paradigm's hold on the culture's evaluative framework. Within a framework that evaluates work solely by the commodity it produces, the focal practice is irrational. Why struggle when the device delivers? The question answers itself — within the framework. Outside the framework, the question has a different answer: because the struggle is where the understanding is built, the skill is developed, the identity is formed, and the centering experience of genuine engagement is found. These are goods that the commodity cannot deliver, because they are constituted by the engagement itself. And a life without them — a life organized entirely around the consumption of commodities delivered by devices — is a life that Borgmann would describe not as bad but as diminished: comfortable, capable, productive, and progressively emptied of the depth that engagement provides.

---

Chapter 6: The Hearth Model and the Server Model

Two configurations of creative work are now available to every practitioner who works with ideas, with code, with language, with design. The choice between them is the most consequential choice the creative professions face, and it is a choice that most practitioners are making unconsciously, because the device paradigm conceals the choice behind the apparent inevitability of convenience.

The first configuration can be understood through the image of the hearth. The hearth demands that the builder engage with the material — struggle with its resistance, develop skill through sustained practice, submit to the discipline of a process that does not yield easily and rewards persistence with understanding. The builder who works according to the hearth configuration does not merely produce output. She undergoes an experience. The experience changes her: it deposits the geological layers of understanding that constitute her deepest professional resource, develops capacities that only sustained effort can build, and centers her in the practice — organizes her attention, her energy, and her sense of purpose around an activity that demands the best of her and rewards the demand with the satisfaction of genuine engagement.

The second configuration is the server. The server delivers output on demand. The builder specifies what she wants — describes the feature, outlines the brief, sketches the design — and receives it without the engagement that produces understanding. The server is fast, reliable, and consistent. It produces output that is often better, by the commodity measures of correctness and professionalism, than what the builder would have produced through her own effort. The builder reviews the output, adjusts it, deploys it. The commodity is in hand. The engagement has not occurred.

Both configurations produce output. Only the hearth produces the experience that gives work its human significance. The distinction is not about the quality of the commodity — the server may produce the superior product — but about the quality of the practitioner's relationship to the work. The hearth configuration produces a practitioner who is deepened by her work: who understands more, who can do more, who has been centered by the demands of the practice. The server configuration produces a practitioner who has obtained a commodity: who has the output in hand but has not been changed by the process of producing it.

Edo Segal recounts a moment in The Orange Pill that captures the distinction between these configurations with the precision of lived confession. Working on a chapter about the democratization of capability, he found that Claude had produced a passage that was eloquent, well-structured, and persuasive — a passage about the moral significance of expanding who gets to build. He almost kept it. Then he reread it and realized he could not determine whether he actually believed the argument or merely liked how it sounded. The prose had outrun the thinking. He deleted the passage and spent two hours at a coffee shop with a notebook, writing by hand until he found a version that was his — rougher, more qualified, more honest about what he did not know.

The deleted passage was a commodity delivered by the server. The handwritten version was a product of the hearth. The difference is not visible in the final text — a reader cannot tell which passages were produced through struggle and which were produced through delegation. The difference is in the practitioner. Segal, after the two hours of handwriting, possessed something he did not possess after reading Claude's output: an understanding of what he actually believed about democratization, arrived at through the specific friction of wrestling with language until the language yielded meaning. The understanding could not have been obtained through the server, because the understanding was constituted by the struggle, and the server's function is to eliminate struggle.

The distinction between these two configurations is not a binary imposed from outside. Every day of creative work contains elements of both. The practitioner who writes a section by hand in the morning and uses AI to generate boilerplate in the afternoon is moving between configurations — maintaining the hearth for the work that demands engagement while using the server for the work that is merely mechanical. The question is not which configuration to adopt exclusively. The question is which configuration to treat as the default, which to privilege, which to reach for when the work matters most.

The device paradigm, left to its own trajectory, answers this question automatically: the server is the default. It is faster, more consistent, more productive by every metric the culture of technology recognizes. The practitioner who uses the server for all her work will produce more output, cover more ground, operate with a breadth that the hearth-bound practitioner cannot match. She will also be progressively disburdened of the engagement that gives her work its depth — but the disburdening will feel like liberation, because the device paradigm's deepest trick is to make the elimination of engagement feel like the removal of an obstacle.

Peter-Paul Verbeek, in his important critique of Borgmann titled "Devices of Engagement," argued that the distinction between devices and focal things is too sharp — that some technologies can function as "engaging devices," demanding skill and attention even as they deliver convenience. Verbeek's critique has force. A well-used AI tool can demand genuine engagement: the discipline of formulating precise questions, the judgment required to evaluate output, the creative effort of directing the tool toward problems it cannot solve without human guidance. These are real cognitive demands, and a practitioner who meets them with skill and attention is engaged in something that is not purely passive consumption of a commodity.

But the critique, applied to AI in its current form, encounters a difficulty that Borgmann's framework identifies precisely. The engagement that Verbeek describes — the engagement of directing, evaluating, and refining AI output — is engagement with the device, not engagement with the material. The developer who evaluates Claude's code is engaged with Claude's output, not with the logic of the system she is building. The writer who refines Claude's prose is engaged with Claude's language, not with the ideas she is trying to express. The difference is subtle but consequential: engagement with a device's output is a form of quality control, while engagement with the material of one's practice is a form of creation. Both demand attention. Only the latter produces the internal goods — understanding, skill, centering — that constitute the focal dimension of creative work.

The ascending friction that The Orange Pill identifies — the relocation of difficulty from implementation to judgment, from the mechanical to the cognitive — suggests a possible reconciliation. If the hearth can be relocated to a higher floor, if the engagement that matters most is the engagement of deciding what to build rather than the engagement of building it, then the server configuration at the implementation level need not eliminate the hearth configuration at the strategic level. The practitioner who uses AI for code and design but maintains focal engagement with the questions of what should exist, who it should serve, and whether it is worthy of the effort — this practitioner is operating according to a hybrid configuration that preserves the hearth where it matters most.

The reconciliation is promising but precarious. The friction of judgment is less visible, less structured, and less obviously demanding than the friction of implementation. The developer who spent eight hours debugging knew she had worked — her body was tired, her mind was engaged, the evidence of effort was legible to herself and to others. The developer who spent eight hours deciding what should be built may feel she has accomplished nothing, because the culture of technology values visible output over invisible deliberation. The hearth at the higher floor requires a new understanding of what constitutes work — an understanding that the device paradigm, with its emphasis on commodities and output, is structurally resistant to providing.

This precariousness is the reason Borgmann insisted, throughout his career, that focal practices must be maintained deliberately. The default is always the device. The default is always the server. The default is always the configuration that delivers the commodity with maximum convenience and minimum demand on the user's skill, attention, and presence. Against this default, the practitioner who maintains the hearth — who chooses to engage with her material through struggle even when the device offers to handle the struggle for her — is making a countercultural choice. The choice is not irrational, but it looks irrational within the evaluative framework that the device paradigm has installed, and maintaining it requires a clarity about what engagement provides that the framework itself actively obscures.

What the hearth provides, and what the server cannot, is the experience of being the kind of person who does this work — who struggles with it, understands it, is changed by it. This is not a sentimental observation. It is a phenomenological one: a description of what it feels like to engage in a focal practice, offered not as a preference but as evidence that the quality of the practitioner's relationship to her work is a genuine dimension of the work's significance, a dimension that the commodity cannot capture and the server cannot deliver.

---

Chapter 7: The Child and the Capacity for Engagement

The twelve-year-old who lies awake asking what she is for is asking a question that no device can answer — not because the answer is beyond the device's capability, but because the question's significance resides in the asking, not the answering. An AI system, prompted with the question "What am I for?", will produce a response. The response may be thoughtful, well-structured, even moving. It will address the question with the fluency and coherence that large language models routinely achieve. And it will be entirely beside the point, because the point of the question is not the answer. The point is the experience of sitting with uncertainty, of confronting a difficulty that cannot be resolved by looking something up, of developing — through the slow, uncomfortable, cognitively demanding process of genuine inquiry — a relationship to one's own existence that is constituted by the effort of the questioning rather than the content of any particular answer.

This is what Borgmann meant by a focal practice: an activity whose internal goods are available only through the engagement itself. The twelve-year-old who asks "What am I for?" and receives an answer from an AI has been disburdened of the engagement. The commodity — an answer — has been delivered. The internal goods — the capacity for sustained inquiry, the tolerance for uncertainty, the experience of confronting a question that resists easy resolution — have been bypassed. The device has done what devices do: delivered the commodity and eliminated the practice.

The stakes of this elimination are uniquely high when the practitioner is a child, because the child's cognitive and emotional capacities are not fixed endowments. They are developed through experience. The neuroscience is unambiguous on this point: the brain's capacity for sustained attention, for independent reasoning, for tolerating ambiguity and discomfort, is shaped by the experiences the developing brain encounters during the years of greatest plasticity. Experiences that demand sustained attention develop the neural pathways that sustain attention. Experiences that require independent reasoning develop the capacity for independent reasoning. Experiences that force the child to sit with uncertainty — to not know, to struggle, to formulate and test and revise — develop the tolerance for uncertainty that is the prerequisite for genuine learning.

Devices that eliminate these experiences do not merely save the child time. They deprive the developing brain of the stimuli through which its most important capacities are built. The child who uses AI to write her essays has been spared the difficulty of writing — the slow, frustrating, ultimately clarifying process of wrestling with language until language yields meaning. She has also been spared the developmental experience that the difficulty provides: the exercise of sustained attention, the practice of organizing thought into coherent argument, the discovery that her ideas are more complex than she realized, the specific intellectual growth that occurs when expression finally catches up to understanding. The essay exists. The capacity that the essay's production would have built does not.

The pattern repeats across every domain of the child's intellectual life. The child who uses AI to solve mathematics problems has been spared the difficulty of mathematical reasoning — but also the development of the capacity for mathematical reasoning. The child who uses AI to research a topic has been spared the difficulty of investigation — but also the development of the capacity for investigation: the ability to formulate a search strategy, evaluate sources, synthesize conflicting information, and arrive at a conclusion through a process of genuine inquiry rather than delegation. In each case, the commodity — the completed assignment — is delivered. The engagement — the developmental experience through which the child's intellectual capacities are built — is eliminated.

The research paper "What Artificial Intelligence Cannot Do," which applies Borgmann's philosophy to technology education, frames this concern with particular precision. Because whatever the AI provides has "made no demands on our skill, strength, or attention," the result is that knowledge procured through AI becomes commodified — detached from the engagement through which understanding is ordinarily built. The paper identifies five areas in which AI cannot replace the focal dimension of education, and each area corresponds to a form of engagement that the device paradigm threatens to eliminate: the bodily experience of making, the social experience of collaborating, the emotional experience of confronting difficulty, the intellectual experience of sustained inquiry, and the moral experience of taking responsibility for one's work.

The concern is not that AI will make children stupid. Children using AI may produce work that is more sophisticated, more polished, and more correct than work produced without it. The concern is that the capacities education is meant to develop — the muscles of attention, inquiry, persistence, and independent thought — will atrophy through disuse, because the device has made their exercise unnecessary. The atrophy is invisible in the short term: the child's output looks fine, perhaps better than fine. It becomes visible only over years, when the adult who was raised with AI discovers that she lacks the capacity for the sustained, focused, uncomfortable intellectual effort that the most demanding forms of work and thought require — a capacity that was not built during the developmental window when building it was possible.

Borgmann would locate this concern within the broader framework of the device paradigm's effect on the conditions for human flourishing. A society that raises its children within device-saturated environments — environments in which every commodity is available on demand, every question has an instant answer, every difficulty can be delegated to a tool — is a society that is systematically undermining the conditions through which its members develop the capacities for depth, engagement, and meaning. The undermining is not deliberate. It is structural — a consequence of the device paradigm's logic applied to the domain of child development, where the consequences of eliminated engagement are most severe and least reversible.

The educational implications extend beyond the classroom into the home, where the most fundamental forms of intellectual and emotional development occur. A parent who uses AI to answer a child's questions — instantly, confidently, comprehensively — has provided the child with information. The parent who sits with the child in the uncertainty of the question, who says "I don't know — let's think about it together," who models the experience of engaging with difficulty rather than delegating it, has provided the child with something the device cannot: the experience of inquiry itself, the developmental stimulation that builds the capacity for the kind of thinking that no device can perform on the child's behalf.

Segal's advice to parents in The Orange Pill — teach them to ask questions, teach them to be curious about their curiosity, teach them to sit with uncertainty — is, in Borgmann's framework, a prescription for the cultivation of focal practices in children. The practices Segal names are the practices through which the child's intellectual capacities are built: questioning as a practice of opening rather than closing, curiosity as an orientation toward difficulty rather than away from it, the tolerance for uncertainty as a condition for the kind of engagement that produces genuine understanding.

The cultivation of these practices in children requires an environment that demands them — an environment that includes spaces where the device is absent and the child must engage with difficulty directly. Not because the device is harmful in itself, but because the device's disburdening, applied universally to a developing mind, eliminates the experiences through which the mind's most important capacities are built. The child needs friction the way a muscle needs resistance: not as punishment but as the medium through which strength is developed.

The educational system that assesses output — test scores, completed assignments, polished essays — is an educational system that has organized itself around the commodity rather than the engagement. Within this system, the AI device is a perfect tool: it produces the commodity efficiently, and the system rewards the commodity without examining the engagement that produced it. An educational system organized around focal engagement would assess not the output but the quality of the process — the depth of the student's questions, the genuineness of her struggle, the development of her capacity for the kind of thinking that only sustained engagement with difficulty can build.

Such a system does not yet exist at scale. Its creation is among the most urgent tasks of the present moment, because the window of developmental plasticity does not wait for institutional reform. The children who are growing up now, in environments saturated with devices that deliver every intellectual commodity on demand, are developing the capacities that their environments demand — the capacity to direct, to delegate, to evaluate output — and failing to develop the capacities that their environments do not demand: the capacity for sustained attention, for independent inquiry, for the discomfort of not-knowing that is the soil from which genuine understanding grows.

The device paradigm has reached the child. The commodification of intellectual work that Borgmann identified in the adult professional world has extended to the developmental environment in which the capacities for intellectual work are built. The extension is the most consequential application of the device paradigm in its forty-year history, because the capacities it threatens are the capacities on which every other form of engagement depends. A society that fails to build these capacities in its children will find, a generation hence, that it has produced adults who are fluent consumers of commodities and diminished practitioners of the engagement through which commodities acquire their meaning.

---

Chapter 8: Focal Practices for the AI-Augmented Builder

The recovery of focal practices within the AI-mediated environment is not a retreat from technology. It is the most sophisticated response to technology available — a response that requires deeper understanding of what technology provides and what it eliminates than either wholesale adoption or categorical rejection. Borgmann maintained this position throughout his career, and it is more urgent now than at any previous point in the device paradigm's history.

The practices proposed here are specific, grounded in the realities of professional work, and designed not to replace AI but to preserve the engagement that AI threatens to eliminate. They are not prescriptions issued from the remove of a philosopher's study. They are interventions — targeted, practicable, and calibrated to the specific conditions of creative work in an environment where the device paradigm has extended to the creative domain itself.

The first practice is deliberate non-device time. This is the practice of building without the AI tool, regularly and intentionally — not as a nostalgic gesture but as a focal activity whose internal goods are available only to those who submit to its demands. The developer who writes code without AI assistance on a regular basis maintains the embodied understanding that AI bypasses. She encounters the friction of debugging — the specific resistance of a system that does not do what she expected — and in the encounter, the geological deposits of understanding continue to accumulate. The layers are thin. Each session deposits only a stratum. But the accumulation, sustained over months, maintains the intuitive ground on which professional judgment stands.

The analogy is not metaphorical. The runner who runs three times a week maintains cardiovascular capacity. The cook who prepares a meal from scratch on weekends maintains her relationship to food — the knowledge of ingredients, the skill with tools, the embodied judgment of seasoning and timing that distinguishes someone who cooks from someone who merely orders. The developer who writes code without AI assistance for a few hours each week maintains her relationship to the systems she builds — the direct, unmediated engagement with logic, syntax, and the behavior of machines that constitutes the focal dimension of software engineering.

The practice need not be extensive. Borgmann's argument for focal practices was never an argument for maximalism — for abandoning devices and returning to pre-technological arrangements. The argument was for supplementation: the maintenance of focal practices alongside devices, the deliberate preservation of engagement within an environment of convenience. A few hours a week of non-device work, sustained consistently, is sufficient to maintain the capacities that matter most. The key is intentionality: the practice must be undertaken with awareness of its purpose, as a deliberate choice to engage with the material rather than delegate to the device.

The second practice is what might be called output interrogation — the discipline of understanding what the AI tool has produced rather than merely accepting it. The developer who receives AI-generated code and deploys it without examination has accepted a commodity without engagement. The developer who receives AI-generated code and reads it — traces its logic, identifies its assumptions, evaluates its robustness, considers its failure modes, and modifies it where her judgment dictates — has engaged in a focal practice. She has used the device's output as the starting point for her own engagement rather than as the endpoint.

Output interrogation is cognitively demanding in a way that distinguishes it from passive review. It requires the practitioner to bring her own understanding to bear on the device's output — to evaluate the output against standards that the device itself cannot articulate, because the standards are constituted by the practitioner's accumulated experience, her embodied intuition, her sense of what is right and what merely works. Segal describes this discipline as the willingness to reject output that "sounds better than it thinks" — a formulation that captures the specific danger of AI-generated work, which is that the surface quality of the output can exceed the depth of the thinking beneath it. Catching this discrepancy requires exactly the kind of embodied understanding that focal practice builds: the geological intuition that senses when the ground is thin, even when the surface looks solid.

The practice of output interrogation preserves focal engagement within the server configuration. The device delivers the commodity. The practitioner engages with the commodity through the exercise of judgment, attention, and the accumulated understanding that only sustained practice provides. The engagement is different in character from the engagement of producing the output herself — it is evaluative rather than generative, critical rather than creative — but it is genuine engagement nonetheless, and it exercises capacities that pure delegation would leave dormant.

The third practice is the practice of the question. In a world of abundant answers — where any question that can be formulated can be answered instantly, fluently, and plausibly — the capacity to ask questions that are worth asking becomes the most distinctively human contribution to creative work. The practice of the question is the deliberate cultivation of this capacity: the discipline of formulating questions that open new spaces of inquiry rather than closing them, that reveal what the practitioner does not know rather than confirming what she already knows, that challenge the device's assumptions rather than accepting them.

The practice of the question is focal in the precise sense of Borgmann's term. It demands engagement — the cognitive effort of identifying what is not yet understood, of articulating the gap between what is known and what needs to be known, of sitting with uncertainty long enough for a genuine question to form. It demands skill — the learned ability to distinguish between questions that are trivially answerable and questions that open productive lines of inquiry. And it rewards the demand with the internal good of deepened understanding — the specific clarity that comes from knowing what you do not know, which is a form of knowledge more valuable than any answer the device can provide.

The practice of the question is also the practice that AI cannot perform on the practitioner's behalf. AI can answer any question that can be formulated. It cannot originate the questions that change the direction of inquiry — the questions that arise from the practitioner's specific engagement with the world, her specific stakes, her specific dissatisfaction with the existing state of things. The practice of the question preserves the distinctively human dimension of creative work: the capacity for genuine inquiry, for surprise, for the recognition that something important has not yet been asked.

The fourth practice is what Borgmann's framework suggests calling focal collaboration — the use of AI not as a device that delivers commodities on demand but as a participant in a process that demands engagement, judgment, and the willingness to resist the device's defaults. Focal collaboration is the most demanding of the four practices, because it requires the practitioner to maintain the orientation of the hearth while using a tool designed according to the logic of the server. The device delivers. The focal collaborator treats what is delivered not as a finished product but as raw material for her own engagement — as a provocation that demands response, a proposal that requires evaluation, a starting point for a process of refinement that is itself a focal practice.

The distinction between focal collaboration and mere delegation is sharp but easily blurred. The practitioner who prompts Claude and accepts the output has delegated. The practitioner who prompts Claude, reads the output with the critical attention of a person whose understanding is at stake, identifies where the output falls short of her standards, rejects what fails to meet them, and uses the gap between what was delivered and what was needed as an occasion for deepening her own understanding of the problem — this practitioner is engaged in focal collaboration. The device is present, but the engagement is focal. The commodity is delivered, but the practitioner's relationship to the commodity is not passive consumption. It is active, critical, demanding — a relationship that produces the internal goods of understanding, skill, and centering that focal practices provide.

Segal's account of writing The Orange Pill with Claude illustrates focal collaboration in practice. The questions he brought, the connections Claude offered, the moments when the collaboration produced insights neither participant could have produced alone, and the equally important moments when the collaboration produced "plausible nonsense" that required rejection — this process was focal collaboration. It demanded that Segal maintain his own standards, exercise his own judgment, bring to the interaction a depth of engagement that the device itself did not demand but the practice of focal collaboration required. The device did not demand the engagement. The practitioner chose it. And the choice, sustained throughout a book-length project, constituted a focal practice as demanding as any that Borgmann described.

These four practices — deliberate non-device time, output interrogation, the practice of the question, and focal collaboration — do not constitute a program to be implemented by fiat. They constitute a way of working — a set of habits, sustained through daily attention, that preserve the focal dimension of creative work within an environment that is structurally organized to eliminate it. They require no rejection of AI. They require the cultivation, alongside AI, of the specific forms of engagement that AI does not demand and cannot provide — the forms of engagement through which creative work acquires the depth, the meaning, and the centering quality that distinguish a practice from a production process.

The practices will feel countercultural, because they are. The culture of technology rewards speed, visible output, and the efficient delivery of commodities. The focal practices advocated here are slower, less visibly productive, and oriented toward goods — understanding, judgment, the centering experience of genuine engagement — that the culture's evaluative framework does not measure. The practitioner who maintains these practices will produce less visible output than the practitioner who operates exclusively through the device. She will also possess something the device-dependent practitioner does not: the depth of engagement that constitutes the human significance of creative work and that no device, regardless of its power or its universality, can deliver on her behalf.

---

Chapter 9: The Ecology of Engagement

An ecologist does not control a river. The pretense of control is what produced most of the ecological catastrophes of the twentieth century — wetlands drained for agriculture, rivers straightened for navigation, apex predators eliminated because they were inconvenient. Each intervention was rational within its own frame. Each solved the immediate problem it was designed to solve. And each produced consequences that the frame could not anticipate, because the frame evaluated the intervention in isolation while the consequences propagated through the system.

The ecologist who has learned from these failures does not evaluate interventions in isolation. She studies the system — the web of relationships between organisms, the flows of energy that sustain them, the structures that maintain the conditions for life. Her interventions are small, precisely targeted, and designed to influence the conditions that shape the system's behavior rather than the behavior of any individual organism within it. She does not eliminate the invasive species by brute force. She studies why it succeeded — what niche it filled, what competitors it displaced, what conditions allowed it to proliferate — and then she modifies the conditions, subtly, at the points where modification will cascade through the system in the direction of health.

Borgmann's framework, applied to the AI-saturated environment, suggests that the relationship between devices and focal practices is ecological in precisely this sense. The practitioner's cognitive capacities — sustained attention, tolerance for difficulty, the ability to formulate genuine questions, the embodied intuition that accumulates through years of focal engagement — are organisms in an environment. The devices that populate the environment — the AI assistants, the recommendation algorithms, the notification systems, the frictionless interfaces — are environmental conditions that shape which organisms flourish and which atrophy. The ecology is not static. It is dynamic, responsive, and path-dependent: the capacities that are exercised grow stronger, the capacities that are neglected weaken, and the weakening makes the neglect more likely, in a self-reinforcing cycle that the ecologist recognizes as the signature of a system trending toward impoverishment.

The chatbot that answers every question instantly is, from this ecological perspective, a condition that selects against the capacity for inquiry. Not because the chatbot is wrong — it is often right — but because the instant availability of answers eliminates the environmental pressure that sustains the capacity for questioning. A student who can obtain any answer in seconds has no environmental reason to develop the tolerance for uncertainty, the patience for investigation, the ability to sit with not-knowing long enough for a genuine question to form. These capacities, like muscles, require resistance to develop. The chatbot removes the resistance. The capacities, no longer demanded by the environment, begin to atrophy — not through any dramatic event but through the quiet, cumulative effect of an environment that has stopped selecting for them.

The recommendation algorithm that serves content matched to existing preferences is a condition that selects against the capacity for exploration. The algorithm's efficiency is real: it delivers content the user is likely to enjoy, reducing the friction of search and discovery. But exploration — the willingness to encounter what is unfamiliar, to sit with material that does not immediately reward attention, to be changed by what one did not choose — is a focal practice. It requires the engagement of curiosity directed outward, toward the unknown, rather than inward, toward the confirmation of existing preferences. The algorithm eliminates the environmental conditions that sustain this practice, and the capacity for exploration contracts accordingly.

The AI coding assistant that produces working code from natural language descriptions is a condition that selects against the capacity for the embodied understanding that debugging builds. The assistant is more efficient than debugging. It produces the commodity — working code — without the friction of error messages, failed hypotheses, and the iterative struggle through which the developer's intuition is built. The efficiency is genuine. The environmental consequence is that the capacity for the deep, embodied, intuitive understanding of systems — the capacity that allows a senior practitioner to feel when something is wrong — is no longer demanded by the environment, and begins the slow process of atrophy that follows disuse.

The ecological frame reveals something that the individual-level analysis of previous chapters could not fully capture: the systemic character of the threat. The device paradigm does not threaten a single capacity. It threatens the ecology of capacities — the interconnected web of cognitive abilities that sustain one another and that, together, constitute the practitioner's capacity for the deep engagement that Borgmann identifies as the foundation of a meaningful relationship to work. Sustained attention supports the capacity for inquiry, which supports the capacity for exploration, which supports the capacity for the kind of surprise that generates new questions, which demands sustained attention. The capacities are interdependent. The weakening of any one weakens the others. And the device paradigm, in its AI-augmented form, applies environmental pressure against all of them simultaneously.

The ecologist's response is not to eliminate the environmental conditions — the devices — that are selecting against the threatened capacities. Elimination is neither possible nor desirable. The devices deliver genuine goods: access to information, efficiency of production, the democratization of capability that The Orange Pill rightly identifies as one of the most morally significant features of the AI moment. The developer in Lagos, the engineer in Trivandrum, the student in Dhaka — these practitioners benefit from the device paradigm's extension to creative work in ways that are real, measurable, and ethically significant. A response that eliminated these benefits in order to preserve the focal practices of established practitioners would be not only impractical but unjust.

The ecologist's response is to modify the conditions at leverage points — the specific places in the system where a small intervention can sustain the threatened capacities without eliminating the beneficial ones. The four focal practices described in the previous chapter are leverage-point interventions. Deliberate non-device time creates an environmental pocket where the capacity for embodied understanding is demanded and exercised. Output interrogation creates a condition where the capacity for critical judgment is exercised within the device-mediated workflow itself. The practice of the question sustains the capacity for inquiry by creating occasions where the practitioner must formulate rather than merely consume. Focal collaboration sustains the capacity for engagement by requiring the practitioner to bring her own understanding, her own standards, and her own judgment to the interaction with the device.

These interventions are modest. They do not reverse the device paradigm or redirect the river of technological development. They create pockets within the current — small environments where the conditions for focal engagement are maintained even as the larger environment trends toward the elimination of engagement. The pockets are the functional equivalent of the habitats that ecological conservation creates within landscapes dominated by agriculture or development: preserved spaces where the species that the dominant land use selects against can survive and, given sufficient protection, flourish.

The modesty of the interventions is proportional to the realism of the diagnosis. Borgmann never claimed that the device paradigm could be reversed. He never advocated for the elimination of devices or the restoration of pre-technological arrangements. His advocacy was for reform — for the deliberate cultivation of focal practices within the technological condition, carried out by practitioners who understood what the device paradigm conceals and who chose, against the grain of convenience, to maintain the engagement that gives their work its human depth. The cultivation is ongoing. The practices require daily attention, because the device paradigm's pressure is daily, constant, and structural. The environment will not spontaneously produce the conditions for focal engagement. Only the practitioner's deliberate choice can produce them, and the choice must be renewed with each day's work.

What gives the ecological frame its ultimate significance is the recognition that the individual practitioner's choice is not merely personal. It is ecological — it affects the system. The practitioner who maintains focal practices within a device-saturated environment is not merely preserving her own capacity for engagement. She is maintaining a condition that others can encounter, learn from, and be influenced by. The senior developer who writes code by hand on Tuesday afternoons is modeling a relationship to the material that junior developers can observe. The teacher who sits with a student's question rather than answering it instantly is creating an environmental condition in which the student's capacity for inquiry is demanded and developed. The parent who says "I don't know — let's think about it" rather than reaching for the device is creating a pocket within the home environment where the child's tolerance for uncertainty is exercised.

These individual choices, aggregated across practitioners, create the cultural conditions that determine whether focal practices survive or disappear. A profession in which most practitioners maintain focal practices is a profession in which the internal goods of the practice — understanding, skill, judgment, centering — continue to be produced and valued. A profession in which most practitioners have abandoned focal practices in favor of device-mediated production is a profession in which those goods are progressively lost, and with them, the capacity for the kind of work that only genuine engagement can produce.

The ecology of engagement is fragile. It requires maintenance. The maintenance is unglamorous — the daily decision to engage with the material rather than delegate to the device, the weekly practice of building without AI assistance, the constant discipline of evaluating output rather than accepting it. The maintenance produces no visible output. It appears, within the evaluative framework of the device paradigm, as inefficiency — time spent on engagement that could have been spent on production. But the maintenance sustains the conditions for depth, understanding, and meaning within an environment that is structurally organized to eliminate them, and the sustaining is the most important work available to anyone who cares about the quality of human engagement with the tools that define the present moment.

---

Chapter 10: The Signal and the Amplifier

The argument of this book arrives, in its final chapter, at the image that The Orange Pill places at the center of its own analysis: the amplifier. AI is an amplifier, and the most powerful one ever built. An amplifier does not generate a signal. It receives a signal and makes it louder, carrying it further than the original source could carry it alone. The quality of the amplified output depends entirely on the quality of the signal fed into it. Feed the amplifier noise, and the noise fills the room. Feed it a signal of genuine depth, and the depth reaches further than any unamplified voice could project.

Borgmann's framework reveals what the amplifier metaphor means at the level of practice. The "signal" that the human practitioner feeds into the AI amplifier is shaped by the practitioner's engagement with her work — by the depth of her understanding, the quality of her judgment, the precision of her questions, the accumulated geological deposit of embodied knowledge that decades of focal practice have laid down. A signal shaped by genuine engagement carries the marks of that engagement: specificity, depth, the kind of understanding that manifests as the ability to recognize what matters and what does not, to distinguish between the adequate and the excellent, to feel when something is right and when it merely passes.

A signal shaped by device dependency — by the progressive disburdening of engagement, the atrophy of the capacities that only focal practice builds — carries the marks of its provenance as well: generality where specificity is needed, plausibility where truth is needed, smoothness where the roughness of genuine thought would reveal the fault lines that matter. The amplifier does not discriminate. It carries whatever it receives. And the reach of the amplification means that the consequences of the signal's quality — or its poverty — are broadcast at a scale that unamplified work could never achieve.

This is the structure that makes the cultivation of focal practices not merely a personal preference but a professional and social responsibility. The practitioner who maintains her engagement — who builds and sustains the embodied understanding that only focal practice provides — brings to the amplifier a signal that is worth amplifying. Her questions are deep because she has cultivated the capacity for deep questioning through the practice of sitting with uncertainty, formulating inquiries that open rather than close, resisting the device's tendency to resolve every question before it has been fully formed. Her judgment is sound because she has exercised judgment through years of engagement with her material — the geological deposits of understanding that allow her to feel when a system is sound and when it is fragile. Her vision is clear because she has maintained the centering practices that produce clarity — the hearth-model engagement with work that organizes attention and purpose around a demanding center.

When the amplifier carries this signal, the output bears the depth of its origin. The code is not merely functional but architecturally sound, because the judgment that directed its production was informed by embodied understanding of how systems behave. The prose is not merely fluent but meaningful, because the thinking behind it was genuine thinking — the product of a mind engaged with ideas rather than a mind reviewing the device's output. The design is not merely professional but considered, because the decisions that shaped it were made by a practitioner whose understanding of users, materials, and constraints was built through sustained engagement rather than delegated to a tool.

The practitioner who has abandoned focal practices — who has relied exclusively on the device to produce her creative output, who has allowed the capacities of engagement to atrophy through disuse — brings to the amplifier a signal of corresponding poverty. Her questions are shallow because she has not cultivated the capacity for depth. Her judgment is uncertain because she has not exercised judgment through demanding practice. Her vision is blurred because she has not maintained the centering that clarity requires. When the amplifier carries this signal, the output bears the marks of its poverty — competent, plausible, smooth, and thin. The code works but breaks under stress. The prose reads well but says nothing that demands to be said. The design looks professional but fails to account for the conditions it was meant to serve.

The amplifier broadcasts the difference at scale. This is the uncomfortable arithmetic of the AI moment: the tool that makes everyone more productive also makes the difference between depth and shallowness more consequential. Before amplification, the difference between a deep practitioner and a shallow one was bounded by the limits of individual output. Both could produce only so much. The deep practitioner's work was better, but the shallow practitioner's work was present, visible, and often adequate. Amplification removes the output constraint. Both practitioners can now produce at scale. And at scale, the difference between depth and shallowness is not a matter of quality. It is a matter of consequence — of how much of the world is shaped by signal and how much by noise.

Borgmann's framework does not resolve this consequence. It illuminates it. The device paradigm's trajectory — toward the progressive elimination of the engagement that produces deep signals — is a trajectory toward a world in which the most powerful amplifier in human history is fed increasingly shallow inputs. The trajectory is not inevitable. It is the default, the outcome that obtains if nothing is done to maintain the conditions for focal engagement. Against the default, the focal practices this book has described — deliberate non-device time, output interrogation, the practice of the question, focal collaboration — are the interventions through which deep signals continue to be produced.

The interventions are modest. They do not reverse the device paradigm. They do not stop the river. They do not restore the pre-technological arrangements that sustained focal engagement through the material demands of pre-device life. They are sticks and mud in a current that grows stronger with each passing quarter, each new model release, each increment of capability that makes the device more seductive and the focal practice more countercultural.

But the modest interventions are what is available. And what is available is what must be done, because the alternative — the abandonment of focal practice, the full surrender to the device paradigm's logic of convenience and disburdening — produces a world in which the amplifier amplifies nothing worth hearing. A world of abundant output and diminished depth. A world of unprecedented productivity and progressive impoverishment of the engagement through which productivity acquires meaning.

The hearth does not compete with the furnace. It does not heat the house as efficiently, as reliably, or as uniformly. It demands labor that the furnace does not demand. It requires skill that the furnace makes unnecessary. It is, by every metric the device paradigm recognizes, inferior to the furnace as a means of delivering the commodity of warmth.

But the hearth is not a means of delivering warmth. The hearth is a focal thing — a center around which a practice organizes itself, a gathering point that brings bodies and attention and care into a shared space, a structure that demands engagement and rewards the demand with the specific depth of experience that no device can replicate. The warmth is the same. The experience is not. And the experience — the centering, the engagement, the depth — is what the hearth provides that the furnace cannot, and what focal practices provide that devices cannot, and what the human signal carries into the amplifier that no device, however powerful, can generate on its own.

The fire still burns. It requires tending.

---

Epilogue

There is a moment I keep returning to. I am in a room in Trivandrum, watching twenty engineers discover that each of them can now do what all of them together used to do. The exhilaration in that room was genuine — physical, electric, the kind that makes you want to call someone at three in the morning and tell them what just happened. And the terror was equally genuine. Both at once. Falling and flying at the same time.

Borgmann died before that room existed. He never saw Claude Code. He never experienced the specific vertigo of watching someone build a complete product feature in two days using a tool that speaks your language back to you. His examples were hearths and wood stoves and home-cooked meals — examples that, to a technologist working at the frontier, can feel quaint, even irrelevant, artifacts of a quieter world that has nothing to say to the world of real-time inference and twenty-fold productivity multipliers.

I would have said that, six months ago. I would have been wrong.

What Borgmann saw — what he spent forty years trying to make visible — is the thing I keep running into at three in the morning when I cannot close the laptop. The thing the Berkeley researchers measured without quite naming. The thing the spouse on Substack described when she wrote about her husband vanishing into Claude Code: not wasting time, not failing, building — building real things with real value — and unable to stop. Borgmann had a word for what is missing in that compulsion. He called it centering — the experience of being fully engaged with something that demands the best of you and knows when you have given less. The hearth demands tending. The furnace does not care whether you are present.

The hardest thing I have read in the course of writing this cycle of books is not Han's critique of smoothness, though that cut deep. It is Borgmann's quiet observation that the device paradigm's most powerful trick is making the loss feel like a gain. I recognized that trick. I have performed it on myself. The passage Claude wrote about democratization that I almost kept — the one that sounded like insight but, when I pressed my weight against it, turned out to be hollow? That was the trick in action. The surface was smooth. The depth was absent. And the smoothness concealed the absence so completely that I nearly published it as my own thought.

What Borgmann gives the conversation about AI is something no one else has provided with this precision: a vocabulary for naming what the tool removes. Not jobs — that conversation is happening loudly enough. Not skills, exactly — though the atrophy is real. What Borgmann names is engagement — the specific, bodily, centering experience of doing something difficult that demands the best of you. The experience that deposits the geological layers. The experience that builds the ground you stand on when you evaluate whether the machine's output is good enough or merely smooth enough.

Without that vocabulary, we are left arguing about productivity metrics and displacement rates, and we miss the thing that actually matters: whether the people using these tools are being deepened by their work or merely accelerated through it. Whether the signal they feed into the most powerful amplifier ever built carries depth or carries noise. Whether, a generation from now, the practitioners directing these tools will possess the embodied understanding to direct them well — or whether the capacity for that understanding will have atrophied through a disuse so comfortable it was never noticed.

I do not garden. I doubt I ever will. But I am learning to tend a different kind of fire — to maintain, within the device-saturated environment I inhabit, the specific practices that build the ground I stand on. A few hours a week of writing without Claude. The discipline of reading AI output as a critic rather than a consumer. The habit of asking whether I believe the argument or merely like how it sounds.

These are small practices. They will not reverse anything. They are sticks and mud. But the pool behind even a small dam can sustain a surprising amount of life.

The fire still burns. It requires tending.

Edo Segal

THE ORANGE PILL: ALBERT BORGMANN

What the Tool Removes

The room is warm. The fire is gone. Do you notice?

Albert Borgmann spent forty years studying a single pattern: every time a powerful tool delivers what we want more conveniently, it quietly eliminates the human engagement that made the wanting meaningful. He called it the device paradigm — and AI is its ultimate expression.

This book applies Borgmann's framework to the most consequential technology shift of our time. When a machine can write your code, draft your brief, compose your argument, and do it all through casual conversation — what happens to the understanding that only struggle builds? What happens to the practitioner who no longer practices? Borgmann saw the answer decades before the question arrived.

The Orange Pill series gives you the lenses to see clearly what AI changes. This volume asks the question the productivity metrics will never capture: not whether the output is good enough, but whether you are still being deepened by the work of producing it.

"Technology can give us more and more of the best things in life, yet fail to make our lives more meaningful and happy." — Albert Borgmann


Albert Borgmann — On AI

A reading-companion catalog of the 32 Orange Pill Wiki entries linked from this book — the people, ideas, works, and events that Albert Borgmann — On AI uses as stepping stones for thinking through the AI revolution.
