By Edo Segal
The lemon tree on my terrace almost died last month because for three weeks I forgot it existed.
I was deep inside a build. Claude and I were shipping features at a pace that would have been physically impossible a year ago. The productivity was real. The output was measurable. And the small living thing six feet from my desk, the one that operates on the schedule of seasons rather than sprints, was dying of thirst while I optimized everything around it.
I tell you this because Wendell Berry would not be surprised by it. He has been writing about exactly this failure for sixty years — not the failure of technology, but the failure of attention that technology produces in the people who use it. The failure to notice what is alive and near while reaching for what is fast and far.
Berry is a farmer and a poet from Henry County, Kentucky, who has spent his career making an argument so simple it is almost invisible: that every domain of productive human work is a living system, and living systems respond to the quality of the care they receive. Soil that is tended grows richer. Soil that is mined degrades. The timeline of degradation is long enough that you can profit for years before the consequences arrive. But the consequences always arrive.
I did not come to Berry because I was looking for a critique of technology. I came to him because something in my own experience of building with AI was nagging at me in a way the triumphalist discourse could not explain. The productivity gains were spectacular. The output was better than anything I could produce alone. And yet — something was thinning. The relationships on my team. My own understanding of the systems I was directing. The specific, patient, embodied knowledge that comes from doing things the hard way, which I had spent a career accumulating and was now bypassing every day.
Berry gave me the vocabulary for what was thinning. He calls it the difference between use and exploitation. Between tending and mining. Between thinking big and thinking little. His framework does not reject tools. It asks whether the tool preserves the capacity of the person using it to care for the domain they tend. That question cuts through every debate about AI I have encountered, because it locates the risk not in the machine but in what happens to the human when the machine handles the hard parts.
This book walks through Berry's patterns of thought and holds them against the light of our current moment. Not to reject what we are building. To understand what it costs, and whether we are tending the cost or ignoring it until the soil gives out.
— Edo Segal × Opus 4.6
Wendell Berry (b. 1934) is an American novelist, poet, essayist, and agrarian philosopher whose work spans more than sixty years and sixty books. Born and raised in Henry County, Kentucky, Berry studied at the University of Kentucky and Stanford before returning permanently to the small farming community where his family had worked the land for generations. His major works include the essay collections *The Unsettling of America: Culture and Agriculture* (1977), *What Are People For?* (1990), and *The Way of Ignorance* (2005); the novels of the Port William membership including *Jayber Crow* (2000) and *Hannah Coulter* (2004); and the widely anthologized poem "The Peace of Wild Things" (1968). Berry's central arguments — that health is a property of communities rather than individuals, that the industrial economy systematically destroys the local knowledge on which sustainable practice depends, and that the quality of human attention determines the quality of everything it touches — have made him one of the most influential American thinkers on the relationship between technology, land, and culture. He received the National Humanities Medal in 2011, delivered the Jefferson Lecture (the highest honor the U.S. government bestows for intellectual achievement in the humanities) in 2012, and has been a persistent, sometimes solitary voice insisting that what cannot be measured often matters most.
In 1981, Wendell Berry published an essay called "The Gift of Good Land" in which he made an argument so simple it is almost invisible: that land given to human care is not a commodity but a trust, and that the quality of the care determines whether the trust is honored or betrayed. The argument was about farming. It was about a specific hillside in Henry County, Kentucky, where Berry had watched the same soil respond to attentive husbandry with abundance and to careless extraction with erosion. But the argument was never only about farming. It was about the relationship between any human being and any domain given to their stewardship — and about what happens when a culture loses the capacity to distinguish between use and exploitation.
Berry's framework begins with a recognition that the industrial economy systematically refuses to make: that every domain of productive human work is a living system. A farm is a living system. So is a forest, a watershed, a neighborhood, a marriage. So, in ways that matter more than most technologists have paused to consider, is a codebase maintained over years by a developer who understands its architecture down to the roots. So is a classroom shaped by a teacher who knows each student by name and temperament. So is a community of practice built through decades of shared struggle, accumulated failure, and hard-won mutual respect.
Living systems share a property that distinguishes them from machines: they respond to the quality of the attention they receive. Soil that is tended carefully — rotated, rested, amended with compost, protected from erosion — becomes more fertile over time. Soil that is mined for maximum yield without regard for its long-term health degrades, compacts, loses its organic matter, and eventually becomes incapable of sustaining the very crops that were extracted from it. The timeline of this degradation is long enough that a careless farmer can profit for years, sometimes decades, before the consequences arrive. But the consequences always arrive. The debt is always collected. The land remembers what was done to it.
Berry's insight, the one that structures everything else he has written, is that this principle applies to every domain humans tend. A marriage tended with daily attention and honest reckoning grows richer over decades. A marriage exploited for comfort without investment in understanding degrades by the same slow mechanism that erodes untended soil. A neighborhood where people know each other, depend on each other, maintain the small daily practices of mutual obligation — borrowing tools, watching children, showing up when someone is sick — is a neighborhood that can absorb shocks. A neighborhood where people live side by side without connection is a neighborhood that dissolves at the first pressure, because there is nothing holding it together except proximity, and proximity is not a bond.
The application to software development is neither metaphorical nor strained. It is direct.
A codebase that has been maintained over years by developers who understand its history, who know why a particular architectural decision was made in 2019 and why it still matters in 2026, who can feel the stress points the way a farmer feels compacted soil — that codebase is good land. It responds to careful attention with reliability, extensibility, and the capacity to absorb new requirements without collapsing. The knowledge that makes this possible is not abstract. It is specific, local, embodied. It lives in the particular developer's understanding of this particular system's particular history. It cannot be transferred by documentation alone, because documentation captures what was decided but not what was understood in the deciding.
Segal describes this knowledge in The Orange Pill when he writes about the senior software architect who "could feel a codebase the way a doctor feels a pulse, not through analysis but through a kind of embodied intuition that had been deposited, layer by layer, through thousands of hours of patient work." Berry would recognize this description instantly. It is the description of a person who has tended good land long enough to know it. The knowledge is not in the code. It is in the relationship between the developer and the code — in the accumulated understanding that comes from having been present, day after day, through the system's growth and its failures and its repairs.
Berry's framework poses a question that the AI discourse has largely failed to ask. Not whether AI is more efficient — it demonstrably is, by margins that increase with each passing quarter. Not whether AI will replace developers — that question, as Segal argues, misframes the relationship. The question Berry would ask is different, and it cuts deeper: Does this technology preserve the capacity of the people who use it to care for the domains they tend?
The distinction between use and exploitation is the hinge on which Berry's entire philosophy turns. Use respects the nature of the thing used. It works within the limits of what the land, the material, the system can sustain. It invests in the long-term health of the domain even when that investment reduces short-term yield. Exploitation ignores the nature of the thing exploited. It extracts maximum value in minimum time without regard for what the extraction costs in the domain's capacity to sustain future extraction.
AI, as currently deployed in most software organizations, operates closer to the exploitation end of this spectrum than its advocates acknowledge. The tool extracts productivity from the domain — working code, shipped features, closed tickets — without investing in the understanding that makes productivity sustainable. The developer who uses AI to generate a function she does not understand has extracted a result from the domain. She has not invested in her capacity to tend that domain over time. The function works today. Her understanding of why it works, and what will happen when conditions change, has not deepened. The debt is accruing, invisibly, in the same way that soil erosion accrues invisibly in a field that is producing record yields.
Berry wrote in The Unsettling of America that "the industrial economy is based on the assumption that it is possible to take without giving back, that the creation is a sort of mining operation." The sentence was published in 1977. It describes the logic of AI-assisted development in 2026 with a precision that should give pause to anyone who believes the current moment is unprecedented. The logic is not new. The tool is new. The logic — extract faster, invest less, treat the domain as a resource to be mined rather than a living system to be tended — is as old as the first factory.
Berry's farmer does not reject tools. This point is essential, because the popular caricature of Berry as a technology-rejecting romantic misses the sophistication of his actual position. Berry uses a team of horses on his farm. He uses a plow. He uses a pencil — a technology that required centuries of refinement in graphite mining, wood selection, and lacquer chemistry. His wife Tanya types his manuscripts on a Royal standard typewriter, a machine of considerable mechanical complexity. Berry's objection is not to tools per se. It is to the cultural logic that evaluates tools solely by their productivity and ignores what productivity costs in human capacity.
The nine criteria for technological innovation that Berry proposed in "Why I Am Not Going to Buy a Computer" — that a new tool should be cheaper than the one it replaces; at least as small in scale; do work that is clearly and demonstrably better; use less energy; if possible, use some form of solar energy, such as that of the body; be repairable by a person of ordinary intelligence; be purchasable and repairable as near to home as possible; come from a small, privately owned shop that will take it back for maintenance and repair; and not replace or disrupt anything good that already exists, including family and community relationships — are not Luddite restrictions. They are an agrarian standard applied to technology. They ask: does this tool serve the community, or does the community serve the tool?
Peter Greene, an education writer, tested these nine criteria against AI in 2025 and found that generative AI fails on nearly every count. It is not cheaper when full infrastructure and energy costs are included. It is not smaller in scale — it requires data centers that consume electrical power equivalent to small cities. It is not repairable — even the people who build large language models cannot fully explain their behavior. It disrupts existing good practices, including the slow, productive, patience-building practice of learning by doing. It replaces human labor while claiming, in the peculiar self-congratulatory language of Silicon Valley, to be doing what humans cannot.
The most searching of Greene's applications is the ninth criterion: that the tool "should not replace or disrupt anything good that already exists." The word "good" does the heavy lifting. What counts as good? In Berry's framework, good work is work that serves the community, develops the worker's skill, respects the material, and leaves the domain healthier than it found it. By this standard, the practice of learning to code by struggling with error messages — the practice Segal describes as depositing "thin layers of understanding" over months and years — is good. It is formative. It builds the embodied knowledge that makes a developer capable of caring for a codebase over time. AI disrupts this practice directly, replacing it with a faster process that produces identical output without the formative struggle.
The output is the same. The person who produced it is different. And in Berry's framework, the person matters more than the product.
This is where Berry's argument diverges most sharply from the argument Segal makes in The Orange Pill, and where the divergence is most instructive. Segal frames the relationship between AI and the developer as one of liberation: the tool frees the developer from tedious implementation work and releases cognitive resources for higher-level thinking. Berry would not deny that the liberation is real. He would ask what it costs.
The cost, in Berry's terms, is the gradual erosion of the developer's capacity to tend the domain. Not the capacity to direct the domain — to make strategic decisions about what should be built — but the capacity to know the domain intimately enough that the strategic decisions are grounded in something more substantial than surface-level understanding. Berry's farmer does not make decisions about crop rotation from a satellite photograph. The farmer makes decisions from the ground, with dirt under fingernails, having walked the field in every season, having watched what grows and what fails and why. The knowledge that informs the decision is inseparable from the physical practice of tending the land.
When AI handles the implementation — the "plumbing," as Segal's engineer calls it — the developer gains strategic bandwidth and loses the specific, embodied, local knowledge that makes strategic decisions wise rather than merely plausible. Segal himself notices this: his engineer "was making architectural decisions with less confidence than she used to and could not explain why." Berry could explain why. The confidence had been deposited, layer by layer, through the friction of implementation. The friction was removed. The deposits stopped accumulating. The soil of understanding began to thin.
Berry's response to this situation would not be to ban the tool. Berry is not an absolutist, despite his reputation. His response would be to insist on what he calls "proper scale" — the use of the tool in a way that preserves the user's capacity to understand and care for the domain. This might mean alternating between AI-assisted and unassisted work. It might mean requiring developers to periodically build without AI, the way Berry requires himself to write without a computer, not because the pencil is a better technology but because the pencil preserves a relationship between the writer and the language that the computer disrupts. It might mean accepting that the optimal pace of development is not the fastest pace, but the pace at which the developer can understand what is being built — a pace that the market, with its quarterly imperatives, will resist with all the considerable force it brings to bear on anyone who suggests that slower might be better.
The gift of good land is not the land itself. It is the relationship between the land and the person who tends it. AI threatens this relationship not by being bad — it is not bad, any more than a power loom is bad — but by making it unnecessary. When the relationship between the developer and the code is no longer required for production, the culture will stop investing in it. And when the culture stops investing in it, the specific knowledge that only that relationship can produce — the embodied, local, historical understanding of a living system — will erode. The code will still work. The developer will still ship. But something will have been lost that no productivity metric can measure, and that no quarterly report will record, and that no one will miss until the day the system fails in a way that only someone who understood it deeply could have anticipated.
Berry would say: tend the land. Know what grows there. Understand why. And do not let any tool, however powerful, convince the culture that understanding is optional. It is not optional. It is the thing that separates stewardship from mining. It is the gift, and the obligation, that comes with good land.
---
Wendell Berry has spent his career distinguishing between two economies that most people have never thought to separate, because the dominant culture treats them as one. The first is the industrial economy — the economy of productivity, output, scale, and measurable return. This economy counts what can be counted: units produced, revenue generated, hours saved, costs reduced. Its metrics are legible, comparable, and universally accepted as evidence of value. The second is what Berry calls the economy of pleasure — the economy of the craftsperson who takes satisfaction in work well done, who experiences a specific joy in skill exercised, material shaped, a problem solved through patient engagement with its particular difficulties. This economy counts what cannot be counted: the satisfaction of understanding, the delight of mastery, the quiet pride of having done something difficult and done it well.
These two economies are not opposed. They overlap constantly. A farmer who tends good land can be both productive and pleased with the work. A carpenter who builds a sound house produces economic value and experiences the pleasure of craftsmanship. The overlap is so common that most people do not notice when the two economies diverge — when productivity increases while pleasure decreases, when output rises while the quality of the experience of producing it declines. The divergence is invisible precisely because the industrial economy's metrics are the only ones the culture has learned to read.
Berry would argue that the divergence between these two economies is the central, unexamined cost of artificial intelligence in creative and technical work. The industrial metrics are spectacular. Twenty-fold productivity gains. Features shipped in days rather than months. A single developer producing what formerly required a team. Segal documents these gains with the specific excitement of a builder who has witnessed them firsthand, and the gains are real. They are measurable. They are replicable. They are, by any standard the industrial economy recognizes, an unambiguous triumph.
But the economy of pleasure tells a different story, and Berry's framework makes it legible.
Consider the specific pleasure of debugging code. Not the pleasure of the result — the working function, the green test suite, the shipped feature — but the pleasure of the process. The developer encounters an error. The error is opaque, frustrating, sometimes maddening. She reads the error message. She examines the code. She forms a hypothesis about what went wrong. She tests the hypothesis. It fails. She reads documentation — often badly written documentation that assumes knowledge she does not yet possess. She asks a colleague. The colleague offers a partial insight. She tries again. She fails again. She walks away from the screen, gets coffee, stares out the window. Something shifts. She returns. She sees the problem. She fixes it. The function works.
The satisfaction of that moment — the moment when hours of struggle resolve into understanding — is not reducible to the output. The output is a working function. The satisfaction is something else entirely. It is the specific pleasure of having earned understanding through effort, of having wrestled with a difficulty and prevailed, of having deposited another layer of knowledge into the soil of expertise. Berry, writing about the pleasure of farming, calls this "the mind's animals" — the way that sustained engagement with resistant material produces a kind of understanding that is simultaneously intellectual and physical, a knowing that lives in the hands as much as in the head.
AI eliminates this pleasure by eliminating the process that produces it. The developer describes the function to Claude. Claude writes the function. The function works. The output is identical. The output may, in fact, be superior — cleaner code, better error handling, more elegant architecture. But the experience of producing it has been transformed from a struggle that deposits understanding into a conversation that deposits convenience. The developer has not earned the understanding. She has received the result. The distinction is invisible to any metric the industrial economy has invented, and it is the distinction on which Berry's entire philosophy rests.
The economy of pleasure is not nostalgia. Berry insists on this with a directness that his critics tend to miss. Nostalgia is the sentimental attachment to the past because it is past. Berry's argument is not that the old way was better because it was old. It is that the old way contained something valuable that the new way has eliminated, and that the elimination was not necessary. The valuable thing — the formative struggle, the pleasure of earned understanding, the embodied knowledge that accumulates through patient practice — could have been preserved. The culture chose not to preserve it, because the culture measures value by output and the output improved.
Segal's own experience confirms Berry's analysis with a specificity that is almost painful. In The Orange Pill, Segal describes the exhilaration of building with Claude — the rush of capability, the joy of seeing ideas materialize in real time, the intoxication of operating at a pace that was previously impossible. And then he describes the moment the exhilaration curdled: "I was not writing because the book demanded it. I was writing because I could not stop. The muscle that lets me imagine outrageous things, the muscle I celebrate, the muscle I train my teams to develop, had locked."
Berry would recognize this curdling instantly. It is the moment when the economy of pleasure collapses into the economy of mere production — when the satisfaction of skill exercised is replaced by the compulsion to produce, and the person doing the work can no longer tell the difference between the two. The loss is not of output. Segal was producing at an extraordinary rate. The loss is of the specific quality of experience that makes production meaningful.
Berry's essay "Economy and Pleasure," from which this chapter takes its name, argues that the industrial economy's fundamental error is the separation of these two economies — the insistence that production is what matters and that the experience of producing is irrelevant. The farmer who takes pleasure in plowing is, in the industrial calculation, wasting an emotion. The pleasure does not increase the yield. The yield is what matters. If a machine can plow faster, the farmer should use the machine, and the pleasure be damned.
But Berry argues that the pleasure is not irrelevant to the quality of the work. The farmer who takes pleasure in plowing pays attention to the plowing — to the depth of the furrow, the turning of the soil, the response of the land to the plow's pressure. This attention is itself a form of care, and the care is itself a form of investment in the long-term health of the land. The farmer who plows without pleasure plows without attention, and the land suffers. Not immediately. Not visibly. But the debt accrues.
The parallel to AI-assisted work is direct. The developer who takes pleasure in debugging pays attention to the debugging — to the specific way the error manifests, the specific relationship between the error and the system's architecture, the specific lesson the error teaches about how this particular system behaves under these particular conditions. The attention is a form of care, and the care is a form of investment in the developer's long-term capacity to tend the system.
The developer who delegates debugging to Claude does not pay this attention, because there is nothing to pay attention to. The error is described. The fix arrives. The system works. The developer moves on. She has not been careless. She has been efficient. But the efficiency has been purchased at the cost of the specific attention that Berry identifies as the foundation of genuine care.
Byung-Chul Han, whom Segal engages extensively in The Orange Pill, arrives at a similar diagnosis from a different direction. Han describes the "achievement subject" who exploits herself in the name of productivity, who converts possibility into compulsion, who cannot distinguish between the pleasure of work and the addiction to output. Berry's diagnosis is gentler in tone but more radical in its implications. Han sees the problem as psychological — a pathology of the modern self. Berry sees the problem as cultural — a pathology of the modern economy. The individual is not sick. The economy is sick, and the individual is living inside it, adapting to its demands, mistaking its pathologies for their own ambitions.
This distinction matters because it changes the location of the remedy. If the problem is individual pathology, the remedy is individual discipline — the "AI Practice" frameworks Segal describes, the structured pauses, the deliberate cultivation of friction. Berry would not oppose these remedies. He would say they are insufficient. Individual discipline exercised inside a sick economy is like individual health pursued inside a poisoned watershed. The discipline may help. It will not cure the disease, because the disease is not in the individual. It is in the economy that shapes the individual's choices.
Berry's remedy is more demanding. It is a change not in personal practice but in cultural values — a recovery of the conviction that the experience of work matters as much as the product of work. That a culture which produces excellent outputs through a process that destroys the pleasure and meaning of production is not a successful culture but a failing one, regardless of how the quarterly numbers look.
The Alex Finn case that Segal presents in The Orange Pill — the solo builder who shipped a revenue-generating product with AI assistance, working 2,639 hours in a year with zero days off — is, in Berry's framework, not a triumph but a cautionary tale. The product exists. The revenue flows. The builder's pleasure, if it was ever present, has been consumed by a pace that admits no rest, no reflection, no Sabbath. Berry, who has written dozens of Sabbath poems over forty years, would observe that a creature who cannot rest has not been liberated. A creature who cannot rest has been captured — by a tool, by an economy, by an internalized imperative so total that even naming it, as Segal honestly does, does not loosen its grip.
The economy of pleasure is fragile. It requires conditions that the industrial economy treats as inefficiencies: time, slowness, struggle, the permission to fail without consequence, the space to discover what one thinks by the slow process of trying to say it. AI removes these conditions because the market rewards their removal. The market rewards speed. The market rewards output. The market rewards the developer who ships in days what used to take months. The market does not reward the developer who took months because the months contained a quality of experience — a depth of understanding, a richness of engagement, a pleasure in the work — that the days could not contain.
Berry would say: count the pleasure. Not because pleasure is more important than productivity, but because a culture that cannot count the pleasure will eventually lose it entirely, and a culture that has lost the pleasure of its work has lost something it cannot buy back, regardless of how much it produces.
The economy of pleasure is Berry's most radical concept, because it proposes a metric that no spreadsheet can capture and no quarterly report can display. It is the metric of the craftsperson's satisfaction — the quiet, daily, cumulative pleasure of knowing what one is doing, and doing it well, and understanding why it matters. AI does not destroy this pleasure in every case. But it makes the pleasure optional, and the market will always choose to optimize what is optional out of existence, because optional things do not contribute to the bottom line.
Berry's warning is that the bottom line is not the bottom. Below it is the soil of human meaning, and the soil is thinning.
---
In 2005, Wendell Berry published a collection of essays called The Way of Ignorance, and the title essay made an argument that the dominant culture found almost unintelligible: that ignorance is not a problem to be solved but a condition to be respected. Not celebrated. Not weaponized. Respected — in the specific sense that a farmer respects weather, which is to say, acknowledged as a force that exceeds human comprehension and that punishes the pretense of mastery with consequences proportional to the pretense.
Berry's argument is epistemological, but it is rooted in the soil. A farmer who has worked the same hillside for forty years knows, with a specificity that no satellite image can replicate, what that hillside will do in a wet spring after a dry winter. The knowledge is empirical, local, and earned through decades of observation. But the farmer also knows — and this is the knowledge Berry considers most valuable — what the hillside might do that has never happened before. The farmer knows the limits of the farmer's knowledge. The farmer knows what cannot be predicted, what has not been seen, what the land might produce in conditions that have never occurred.
This is the way of ignorance: the disciplined recognition that human understanding is always partial, that the systems we intervene in are more complex than our models of them, and that the gap between our knowledge and the system's complexity is where catastrophe lives. The farmer who respects this gap plants conservatively, diversifies crops, maintains margins of safety, avoids debt that requires maximum yield for repayment. The farmer who does not respect this gap plants aggressively, grows monocultures, borrows against the future, and is destroyed — not by bad luck but by the inevitable arrival of conditions the model did not include.
Berry's way of ignorance applies to the AI discourse with a force that neither the optimists nor the pessimists seem willing to absorb.
The optimists — Segal among them, though Segal is a more honest and self-aware optimist than most — claim to know what AI will produce. The trajectory bends toward expansion. The democratization of capability will lift the floor. The dams will be built. The river will be directed toward life. The historical pattern holds: threshold, exhilaration, resistance, adaptation, expansion. The confidence is grounded in evidence. The adoption curves are real. The productivity gains are measurable. The developer in Lagos is not a hypothetical.
Berry would not dispute the evidence. He would dispute the certainty. The historical pattern holds until it does not. The adoption curves measure the past. The productivity gains measure the present. The future — the actual place where the consequences of current decisions will be lived — is the domain that Berry insists no model can adequately represent, because the future is more complex than any model, including the most sophisticated intelligence humanity has ever built.
The catastrophists — the elegists, the Luddites, the Upstream Swimmers — also claim to know what AI will produce. The skills will atrophy. The meaning will drain. The communities of practice will dissolve. The ground that supports human dignity will erode until nothing remains but the machine and the machine's owners. This prediction is also grounded in evidence — the Berkeley study documenting work intensification, the erosion of boundaries between work and rest, the compulsive behavior that Segal himself confesses. The evidence is real.
Berry would not dispute this evidence either. He would again dispute the certainty. The catastrophists are as confident in their predictions as the optimists are in theirs, and both forms of confidence share a common defect: they assume that the system being predicted is simple enough for prediction. It is not. The interaction between AI tools and human culture is a living system, as complex as any watershed, as unpredictable as any season, as resistant to comprehensive modeling as the hillside Berry has farmed for sixty years.
The way of ignorance does not counsel inaction. This is the misreading that both camps will impose on Berry's framework, because both camps have a stake in the pretense that the future is knowable and that action should be guided by prediction. The optimists will say: "If we cannot predict, we cannot build dams." The pessimists will say: "If we cannot predict, we must stop the river." Both responses mistake ignorance for paralysis.
Berry's farmer does not stop farming because the weather is unpredictable. The farmer farms differently. The farmer diversifies. The farmer builds margins of safety. The farmer plants what has been proven to grow in this particular soil, not what the agricultural extension service says should grow in soil of this general type. The farmer proceeds, but proceeds with the caution of a person who knows that the system is smarter than the person working within it.
Applied to AI, the way of ignorance suggests a posture that neither Silicon Valley nor its critics have adopted: proceed, but proceed at the pace at which consequences can be observed and absorbed. Not the pace of quarterly earnings. Not the pace of venture capital deployment cycles. The pace of learning. The pace at which a community can watch what a new tool does to its practices, its relationships, its capacity for care, and adjust before the damage becomes irreversible.
Berry invoked this principle directly in his 2012 Jefferson Lecture, the highest honor for achievement in the humanities that the United States government bestows. Standing before a Washington audience composed largely of people who had spent their careers in institutions devoted to knowledge production, Berry said: "For a long time we knew that we were not, and could never be, 'as gods.' We knew, or retained the capacity to learn, that our intelligence could get us into trouble that it could not get us out of." The sentence carries the weight of everything Berry has learned from sixty years of farming: that intelligence without humility is the most dangerous force in the natural world.
Berry continued: "We were intelligent enough to know that our intelligence, like our world, is limited. We seem to have progressed to the belief that humans are intelligent enough, or soon will be, to transcend all limits and to forestall or correct all bad results of the misuse of intelligence." The sentence was addressed to the general trajectory of industrial civilization. Read in 2026, it applies to the AI discourse with a specificity Berry could not have intended and would not have been surprised by.
The belief that AI will allow humanity to transcend its limits — cognitive limits, productive limits, creative limits — is the belief Berry has spent his life opposing. Not because limits are sacred, but because the consequences of exceeding them are borne by systems and communities that did not choose the exceeding. The farmer who overplants does not bear the full cost. The soil bears it. The watershed bears it. The downstream community bears it. The cost is externalized, distributed across systems that have no voice in the decision, and by the time the cost is visible, the damage is often irreversible.
Segal's Orange Pill contains moments of genuine epistemological humility — the admission that "I could not tell whether I was watching something being born or something being buried," the recognition that flow and compulsion are indistinguishable from the outside, the honest confession that the collaboration with Claude produces moments "neither of us predicted" and that cannot be cleanly attributed to either participant. Berry would honor these moments. He would say they are the most valuable sentences in the book, precisely because they resist the certainty that the rest of the argument works to establish.
Berry would also notice that these moments of humility are structurally subordinate to the book's dominant movement, which is toward confidence. The tower metaphor — five floors, a staircase, a sunrise at the top — is a metaphor of ascent, of increasing clarity, of arriving at a view that resolves the confusion of the ground floor. Berry would observe that the most important things he has learned in sixty years of farming were not learned by climbing higher. They were learned by staying close to the ground, by paying attention to what was happening in the specific soil of a specific field on a specific morning. The view from the tower is clarifying. It is also abstracting. And abstraction, in Berry's framework, is the mechanism by which the industrial economy separates the decision-maker from the consequences of the decision.
The way of ignorance is not a counsel of despair. It is a counsel of scale. Think little, Berry would say. Attend to the specific. Observe what this tool does to this community, this classroom, this team, this family. Do not generalize before you have understood the particular. Do not predict the trajectory before you have watched the first season's results come in. Do not assume the pattern holds until you have tested it against the specific conditions of the specific place.
The Berryian critique of AI prediction applies equally to his own partisans. The Front Porch Republic essayist who wrote in December 2025 that "'AI is coming, so we must adapt' is not an argument — plagues come too, pornography comes too, tyrants come too" was operating with the same false certainty Berry would resist, merely in the opposite direction. The comparison of AI to plagues and tyrants assumes the outcome. It forecloses the possibility that the tool might serve life in some contexts while degrading it in others. It substitutes moral conviction for empirical observation, which is precisely the error Berry's way of ignorance is designed to prevent.
Berry is not an optimist. He is not a pessimist. He is a farmer who knows that the weather cannot be predicted, that the soil is more complex than any model, and that the only honest response to this complexity is to proceed carefully, observe closely, and remain willing to change course when the evidence requires it. This posture is less dramatic than either the triumphalist or the catastrophist position. It does not make for compelling headlines or viral social media posts. It makes for sustainable practice, which is the only thing Berry has ever been interested in.
The AI discourse needs this posture more than it needs another prediction. It needs the humility to say: the system is more complex than our understanding of it, the consequences are not yet visible, and the appropriate pace is the pace at which we can learn from what we are doing before we are overwhelmed by what we have done.
Berry, standing on his hillside, watching the spring rain fall on soil he has tended for six decades, would say: the land will tell you what it needs, if you are quiet enough to listen. The question is whether a culture addicted to the sound of its own productivity can hear anything at all.
---
In 1983, Wendell Berry published an essay called "Standing by Words" that made an argument the literary world found mildly eccentric at the time and that has become, in the age of large language models, one of the most urgent claims in American letters: that language is not merely a medium of communication but a form of commitment, and that the production of language without commitment is a form of cultural pollution as real and as damaging as the chemical kind.
Berry's argument begins with a simple observation. When a person makes a statement — "I will be there at noon," "This bridge will hold your weight," "I love you" — the statement has force only to the extent that the person who makes it is willing to stand behind it. The willingness to stand behind a statement is not a feature added to the language after the fact. It is what makes the language mean anything at all. A promise no one intends to keep is not a promise. A diagnosis no doctor stands behind is not a diagnosis. A declaration of love that carries no commitment is not a declaration of love. It is noise shaped like language.
Berry's concern in 1983 was with the degradation of public language — the political speech that says nothing while appearing to say everything, the advertising copy that makes claims no one is responsible for, the academic prose so abstract that no concrete commitment can be extracted from it. The degradation, Berry argued, was not merely aesthetic. It was ecological. Language, like soil, is a shared resource. When it is polluted — when words are produced without meaning, without commitment, without the willingness to accept consequences for what they assert — the capacity of the entire community to communicate truthfully is diminished. The pollution does not stay in one field. It migrates. It enters the watershed of public discourse and degrades everything downstream.
Large language models produce language without standing by it. This is not a metaphor. It is a precise description of the technology's fundamental operation. Claude does not mean what it says. Claude does not intend its statements. Claude does not accept responsibility for the consequences of its output. Claude produces sequences of words that are statistically coherent, contextually appropriate, and frequently brilliant — and that no one has committed to. The words arrive without an author who will answer for them.
The responsibility for standing by AI-generated language falls entirely on the human who deploys it. This is the arrangement the technology companies describe when they position AI as a "tool" — the human is the author, the machine is the instrument, the responsibility is the human's. Berry would not dispute the formal logic of this arrangement. He would dispute its practical adequacy. Because the practical reality, documented by Segal's own experience and confirmed by every study of AI-assisted work, is that the ease and fluency of AI-generated language creates a powerful temptation to stand by words one has not earned — to accept the output as one's own without having done the thinking that would make the standing-by genuine.
Segal describes this temptation with admirable honesty in The Orange Pill. Working on a chapter about democratization, Claude produced a passage about the moral significance of expanding who gets to build. The passage was "eloquent, well-structured, hitting all the right notes." Segal almost kept it. Then he reread it and "realized I could not tell whether I actually believed the argument or whether I just liked how it sounded. The prose had outrun the thinking." He deleted the passage and spent two hours writing by hand until he found the version of the argument that was his.
Berry would recognize this moment as the critical juncture — the point at which the author either stands by the words or allows the words to stand by themselves, orphaned, uncommitted, polluting the discourse with plausibility in the absence of conviction. Segal chose correctly. He did the work of determining whether he could stand behind the language. But the episode reveals the gravity of the problem: even a sophisticated, self-aware writer who is explicitly conscious of the danger came within a sentence of publishing language he had not earned.
The Deleuze failure Segal describes is the complementary case. Claude drew a connection between Csikszentmihalyi's flow state and a concept attributed to Gilles Deleuze. The connection was elegant. It felt like insight. Segal read it, liked it, moved on. The next morning, something nagged. He checked. The philosophical reference was wrong. Claude had produced what Berry would call counterfeit language — words that had the shape and feel of genuine intellectual currency but that were backed by nothing. The smoothness of the prose concealed the emptiness of the claim.
Berry wrote, four decades before this episode, that "the first obligation of a writer is to stand by his words." The obligation is not merely moral, though it is moral. It is epistemological. Standing by one's words requires knowing what one's words mean — understanding the claims they make, the evidence they rest on, the implications they carry. A writer who stands by a sentence has tested it against what the writer knows to be true. A writer who publishes a sentence without having tested it is polluting the discourse, not because the sentence is necessarily false, but because the sentence has not been subjected to the only quality control that language possesses: the commitment of a human mind that understands what it is saying and is willing to accept the consequences of having said it.
AI-generated language bypasses this quality control entirely. The model does not test its statements against what it knows to be true, because the model does not know anything to be true in the sense Berry means. The model processes patterns. It produces outputs that are consistent with its training data and responsive to its context. It can produce these outputs at a scale and speed that no human writer can match. But not one of the outputs carries the weight of a person who has decided, after reflection, that these specific words in this specific order represent what that person genuinely believes and is willing to defend.
The result is a flood. Berry would use the word advisedly, as a farmer who has seen what flooding does to good land. The flood is not of bad language — much of what AI produces is technically excellent, stylistically competent, and substantively plausible. The flood is of uncommitted language. Language that sounds like someone means it but that no one, in fact, does. Language that has the grammatical structure of commitment — declarative sentences, confident assertions, specific claims — without the substance of commitment. Language that fills the world with the appearance of meaning while diluting the actual meaning-carrying capacity of the discourse.
Consider the practical consequences. A lawyer uses AI to draft a brief. The brief cites cases, makes arguments, organizes analysis. The lawyer reviews the output, makes minor edits, files the brief with the court. The lawyer has, in the formal sense, stood by the words: the lawyer's name is on the filing, the lawyer is professionally responsible for the content. But has the lawyer stood by the words in Berry's sense? Has the lawyer read the cases cited? Has the lawyer understood the arguments well enough to defend them under questioning? Has the lawyer earned the language — subjected it to the specific, grueling, often tedious process of legal research that produces not just correct citations but genuine understanding of what the law says and why?
Segal notes this pattern: "The lawyer who produced them has not read those cases." Berry would say: the lawyer has filed a brief that looks like legal analysis but is actually a simulation of legal analysis. The simulation may be accurate. It may even be better, by certain metrics, than what the lawyer would have produced through traditional research. But it is a simulation, and the difference between genuine analysis and a simulation of analysis is the difference between a farmer who knows the soil and a farmer who has read a satellite report about the soil. Both may make the same planting decision. Only one understands why the decision is correct, and only one will know when conditions change enough to require a different decision.
The student case is starker still. A student uses AI to write an essay. The essay demonstrates understanding of the material. The student submits it. The student has not thought the thoughts the essay represents. The student has not wrestled with the ideas until they yielded their meaning. The student has not experienced the specific, productive frustration of trying to say something and failing, and trying again, and failing differently, and finally arriving at a sentence that captures what the student actually thinks — a sentence the student can stand by because the student earned it through the struggle of thinking.
The essay exists. The understanding does not. And the essay, presented to the teacher as evidence of understanding, pollutes the educational discourse in exactly the way Berry describes: it has the shape of genuine intellectual work without the substance, and the pollution is invisible because the output is indistinguishable from the real thing.
Berry connects the pollution of language to the pollution of community. A community that can trust its members' words is a community that can function — that can make promises, enter agreements, coordinate action, resolve disputes. A community in which words are unreliable is a community in which trust erodes, and trust, once eroded, is rebuilt only through the slow, expensive, person-by-person process of demonstrating, over time, that one's words carry weight. AI-generated language, deployed without the discipline of genuine commitment, accelerates the erosion of this trust by flooding the commons with language that looks trustworthy but that no one, in the deepest sense, stands behind.
Alex Sosler, writing in Front Porch Republic in October 2025, applied Berry's distinction between "Boomers" and "Stickers" (terms Berry borrowed from his teacher Wallace Stegner) to the AI adoption pattern. Boomers — those who exploit a place and move on — adopt AI because it produces results, regardless of what the production costs in human capacity, community health, or the integrity of shared language. Stickers — those who love a place enough to stay and tend it — ask whether the tool serves the community or whether the community serves the tool. Sosler concluded: "Berry, as usual, is right: It all turns on affection." Affection, in Berry's usage, is not sentiment. It is the specific care that comes from knowing a thing well enough to understand what it needs, and valuing it enough to provide what it needs rather than extracting what you want.
Standing by words is a form of affection. It is the specific care that a writer invests in language — the willingness to revise, to check, to test, to sit with a sentence until it says what the writer means and not merely what the machine produces. This care is expensive. It is slow. It is invisible to every productivity metric. And it is the only thing that keeps language trustworthy enough to serve as the foundation of a functioning community.
Berry would ask, of every person who uses AI to generate language: Do you stand by these words? Not formally — anyone can sign a document. Do you stand by them in the way that matters — with understanding, with commitment, with the willingness to accept consequences for what they assert? If you cannot answer yes, then you have not used a tool. You have participated in the pollution of the commons. And the commons, like the soil, will bear the cost long after you have moved on.
In 1990, Wendell Berry published a collection of essays under a title that was not a rhetorical flourish but a genuine question, the kind of question a culture asks when it has begun to suspect that its answer is wrong: What Are People For?
The question had been building for two centuries. The industrial economy answered it with a clarity that left no room for ambiguity: people are for production. They are for the factory, the office, the assembly line, the quarterly earnings call. Their value is measured by their output. Their dignity is contingent on their usefulness. When their usefulness declines — when the machine can do what they do, faster and cheaper — the culture does not grieve. The culture retrains, redeploys, or discards. The language of human resources captures the logic perfectly. Humans are resources. Resources are extracted, processed, and consumed. When a resource is depleted, you find another.
Berry, being what the Front Porch Republic essayist accurately called "a sane man," said no. Not as a political gesture. Not as an act of rebellion. As a statement of fact, arrived at through sixty years of farming the same hillside in Henry County, Kentucky, and watching what happens to land, communities, and human beings when the industrial answer to "What are people for?" is accepted without examination.
Berry's answer is deceptively simple: people are for care. Not care as an abstraction — the kind of care that appears in mission statements and corporate values documents and is forgotten by the time the quarterly review begins. Care as a practice. The specific, embodied, daily practice of tending something — a garden, a child, a community, a craft, a marriage, a piece of land — with the patience and attention that only a person who is present, who has chosen to be present, who has stayed long enough to understand what the thing needs, can provide.
Care, in Berry's usage, is not a feeling. It is a competence. It is the competence of the farmer who knows that this field needs to rest this year, not because a soil test said so but because the farmer has watched the field for thirty springs and can read its weariness the way a parent reads a child's face. It is the competence of the carpenter who knows that this joint needs to be hand-fitted, not because the specifications require it but because the carpenter has felt the wood and knows that a machine-cut joint will not hold in this particular grain. It is the competence that accumulates only through sustained, attentive, local engagement with a specific domain — the competence that Segal's engineer lost, imperceptibly, when Claude took over the implementation work that had been depositing layers of understanding for years.
Segal's twelve-year-old asks the question in Chapter 6 of The Orange Pill: "Mom, what am I for?" She asks it because she has watched a machine do her homework better than she can, compose a song better than she can, write a story better than she can. She is lying in bed wondering what is left for her. Segal's answer is that she is "for the questions. For the wondering." The answer honors the cognitive dimension of human purpose — the capacity for curiosity, for inquiry, for the kind of restless questioning that no machine originates.
Berry would not disagree. But Berry would find the answer incomplete in a way that matters.
Questions are cognitive acts. They happen in the mind. They are valuable — Berry would never deny that wondering is part of what makes human beings human. But questions, by themselves, do not tend anything. Questions do not plant. Questions do not weed. Questions do not sit with a sick neighbor. Questions do not repair a fence. Questions do not teach a child to use a handsaw by standing beside the child, guiding the child's hands, absorbing the child's frustration, celebrating the child's first clean cut. Questions, pursued without the grounding of care, become the specific disease Berry has diagnosed in the professional intellectual class: the capacity to analyze everything and tend nothing.
Berry's answer to the twelve-year-old would be more specific and more demanding than Segal's. It would go something like this: You are for the place you are in. You are for the people who depend on you. You are for the work that only you can do, because only you are here, now, in this body, with these hands, in this community. That work cannot be delegated to a machine, because the work is not the output. The work is the relationship — between you and the thing you tend, between you and the people you serve, between your hands and the material they shape.
The distinction between Berry's answer and Segal's is not a disagreement about the value of questioning. It is a disagreement about the level at which human purpose operates. Segal locates human purpose in cognition — in the mind's capacity to wonder, to inquire, to ask what is worth building. Berry locates human purpose in practice — in the body's capacity to tend, to shape, to be present to a specific task in a specific place over a specific span of time. Cognition without practice is what Berry calls "thinking big" — the grand abstraction that imposes solutions from above without understanding the ground. Practice without cognition is what Berry calls "drudgery" — labor emptied of meaning by the absence of understanding. A complete human purpose requires both. But if forced to choose, Berry would choose practice, because practice is what connects the person to the world, and the connection is where care lives.
The industrial economy's answer to "What are people for?" has always implied a corollary: when the machine can do what the person does, the person is no longer for anything. This corollary has been implicit in every round of technological displacement, from the power loom to the spreadsheet to the robotic assembly line. Berry's objection is not that the corollary is economically wrong — in the narrow terms of the industrial economy, it may be correct that the machine is more efficient. Berry's objection is that the question has been asked at the wrong level. The industrial economy asks what people are for in terms of production. Berry asks what people are for in terms of life.
Life is not production. Production is one activity within a life, and when production becomes the measure of a life's value, the life has been reduced to a function. Berry has watched this reduction happen to farmers for sixty years. The industrial agricultural system measures a farmer's value by yield per acre. A farmer who produces less per acre than the industrial average is, by this metric, a failure — regardless of whether the farmer's soil is healthier, the farmer's community is more cohesive, the farmer's children are better educated, the farmer's marriage is stronger, the farmer's knowledge of the local ecology is deeper. None of these count. Only the yield counts. And when the yield can be produced by machines — by GPS-guided tractors, by automated irrigation, by AI-optimized planting algorithms — the farmer is, by the industrial metric, redundant.
Berry's response, repeated in essays spanning five decades, is that the metric is wrong. Not that the metric is one valid perspective among many. That the metric is wrong — factually, morally, and practically wrong. Factually, because it ignores the farmer's knowledge of the specific conditions that determine long-term productivity. Morally, because it reduces a human being to an economic function. Practically, because the industrial system that replaces the farmer with machines degrades the soil, poisons the watershed, destroys the community, and produces food that is cheap in price and expensive in every other form of cost.
The parallel to AI and knowledge work is not approximate. It is exact. The industrial technology economy measures a developer's value by output — features shipped, code committed, tickets closed. A developer who produces less output than an AI-assisted colleague is, by this metric, underperforming. The metric does not count the developer's understanding of the system, the developer's relationships with colleagues, the developer's capacity to mentor junior developers, the developer's embodied knowledge of what the codebase needs. None of these count. Only the output counts. And when the output can be produced by machines, the developer is, by the industrial metric, redundant.
Berry's answer, applied to the developer as to the farmer, is that the metric is wrong. The developer is not for output. The developer is for the care that makes output meaningful — the specific, local, embodied understanding that determines whether the output serves the system's long-term health or merely satisfies the quarter's demand.
Segal documents the moment when his senior engineer realized that "the remaining twenty percent — the judgment about what to build, the architectural instinct about what would break, the taste that separated a feature users loved from one they tolerated — turned out to be the part that mattered." Berry would agree that this is the part that matters. He would add that this part — the judgment, the instinct, the taste — did not appear from nowhere. It was deposited, layer by layer, through the eighty percent that the tool now handles. The twenty percent is the fruit. The eighty percent was the soil. Remove the soil, and the fruit will continue to appear for a time, drawing on stored nutrients. But the nutrients are not being replenished. The soil is thinning. And a culture that eats the fruit without tending the soil will eventually find that the fruit has stopped coming, and will not understand why.
What are people for? Berry's answer has not changed in sixty years, because the answer does not depend on the technology. People are for the work that machines cannot do — not because machines lack capability, but because the work Berry means is defined by the quality of the relationship between the worker and the world, and machines do not have relationships. They have functions. The function may be identical to the work. The relationship is what makes the work human.
The twelve-year-old lying in bed, wondering what she is for, does not need to be told that she is for the questions. She needs to be shown — by a parent who tends, who stays, who is present in the specific way that only a person who cares about this particular child in this particular moment can be — that she is for the care that no machine can provide. Not because the machine is not sophisticated enough. Because care requires a person. It requires stakes. It requires the specific vulnerability of a creature who can fail, who can be hurt, who has chosen to invest in something that might not succeed, and who shows up anyway, tomorrow morning, to tend what was planted yesterday.
That is what people are for. Berry has been saying it for sixty years. The machines have made it easier to hear, because the machines have stripped away everything else.
---
Wendell Berry tells a story about a teamster — a man who worked with horses — and the story contains an idea that the age of artificial intelligence has made more important than Berry could have known when he first told it.
The teamster knew the whole horse. Not the horse as a collection of systems — cardiovascular, musculoskeletal, digestive, nervous — each to be analyzed, optimized, and maintained by a separate specialist. The whole horse. The animal's temperament on a cold morning versus a warm afternoon. The way it favored its left foreleg on soft ground. The sound it made when it was tired versus the sound it made when it was anxious. The particular angle of its ears that meant it had noticed something the teamster had not yet seen. The teamster's knowledge was not anatomical. It was relational. It came from years of daily presence — feeding, grooming, working, resting, standing together in the barn while rain drummed on the roof.
Berry uses the metaphor of the whole horse to describe a kind of understanding that the industrial economy systematically destroys: holistic knowledge, the understanding of a thing as a living system rather than as a collection of analyzable parts. The industrial economy's characteristic method is decomposition — breaking a complex whole into components, optimizing each component separately, and reassembling the optimized components into what is assumed to be an optimized whole. The method works brilliantly for machines. It fails, often catastrophically, for living systems, because living systems have a property that machines do not: the relationships between the parts are as important as the parts themselves, and those relationships cannot survive decomposition.
A horse is not the sum of its systems. A farm is not the sum of its fields. A community is not the sum of its members. A codebase is not the sum of its functions. In each case, the whole possesses properties that emerge from the relationships between the parts — properties that are invisible to any analysis that examines the parts in isolation. The teamster who knows the whole horse knows something that no veterinary specialist knows, not because the teamster has more information but because the teamster has a different kind of knowledge: knowledge of the relationships, the patterns, the subtle interdependencies that only sustained, attentive, holistic engagement can reveal.
Artificial intelligence is the ultimate specialist. This is not a criticism of AI's capability — the capability is extraordinary. It is a description of AI's architecture. A large language model processes information by decomposing it into tokens, analyzing the statistical relationships between tokens, and producing outputs that are consistent with those relationships. The model can access information about every component of any system — every function in a codebase, every case in a legal database, every study in the medical literature. What the model cannot do is understand the whole. The model does not know the horse. The model knows the cardiovascular system, the musculoskeletal system, the digestive system. It knows them with a thoroughness that exceeds that of any human specialist. But it does not know the animal that contains them all, because the animal is not a collection of systems. The animal is a life.
Segal argues in *The Orange Pill* that AI frees workers to become integrators — to move across disciplinary boundaries, to see the whole picture, to exercise the judgment that connects the parts into a coherent whole. Berry would engage this argument seriously. Integration is, in principle, exactly the kind of holistic understanding Berry values. The developer who can see how the code interacts with the user experience, how the user experience interacts with the business model, how the business model interacts with the organizational culture, possesses a form of whole-horse knowledge that the specialist lacks.
The question Berry would ask is whether the integration Segal describes is genuine or simulated. Genuine integration requires deep understanding of each component — not specialist depth in every component, which is impossible, but enough direct experience with each component to understand how it behaves under stress, where it is fragile, what it needs that it is not receiving. The teamster does not need to be a veterinary cardiologist. But the teamster needs to have spent enough time with the horse to recognize when something is wrong with the heart, even if the teamster cannot name the condition.
AI-assisted integration risks producing the appearance of holistic understanding without the substance. A developer who uses Claude to generate frontend code, backend logic, and database schemas can produce a complete system without deeply understanding any of its layers. The system may work. It may even be well-architected, because Claude has been trained on vast quantities of well-architected systems. But the developer who produced it does not know the whole horse. The developer knows what each component looks like from above — from the strategic altitude Segal celebrates. The developer does not know what each component feels like from inside — the specific, embodied, local knowledge that comes from having wrestled with the component's difficulties firsthand.
Berry would call this the difference between a map and the territory. The map is useful. The map is sometimes essential. But the map is not the ground. A person who has only seen the map will make decisions that look correct from above and that fail on the ground, because the ground contains conditions the map does not represent. The map shows the contours. The ground has the soil, the drainage patterns, the root systems, the microclimate created by the tree line to the west, the pocket of clay three feet down that no survey detected. The farmer who has walked the ground knows these things. The farmer who has studied the satellite image does not.
Segal's "vector pods" — small groups whose job is to decide what should be built — are an organizational attempt to institutionalize whole-horse knowledge. Berry would grant the aspiration. He would ask how the people in the vector pods develop the ground-level understanding that makes their decisions wise rather than merely plausible. If the pod members have spent years working directly with the components they now integrate — if the designer has built interfaces, the engineer has debugged systems, the business strategist has talked to customers — then the pod possesses something approaching genuine holistic knowledge. If the pod members have spent their careers at the strategic level, directing without building, the pod possesses the map without the territory.
Berry's argument, applied consistently, produces an uncomfortable prescription: the integrator must have been a practitioner. The person who directs must have tended. The leader who makes strategic decisions about a system must have spent enough time inside the system to know it the way the teamster knows the horse — not from a manual, not from a dashboard, not from an AI-generated summary, but from the daily, patient, sometimes tedious practice of working with the thing directly.
This prescription conflicts with the trajectory Segal describes. The trajectory is toward strategic work — toward "deciding what to build" rather than building it, toward "vision" rather than implementation. Berry would not reject the value of strategic work. He would insist that strategic work divorced from practical experience is strategic work built on sand. The vision is only as good as the visionary's understanding of the ground, and the understanding of the ground comes only from having worked the ground, not from having flown over it.
The front line of this argument appears in education. Berry wrote in *Life Is a Miracle* that the university system has replaced holistic understanding with specialized knowledge, producing graduates who know more and more about less and less until they know everything about nothing. The criticism, applied to AI-era education, becomes more pointed. If AI handles the implementation, and education focuses on developing "judgment" and "integration" — the strategic capabilities Segal identifies as the new premium — then education risks producing graduates who can think about systems without having built them, who can evaluate outputs without understanding processes, who possess the vocabulary of holistic knowledge without its substance.
Berry's prescription for education, like his prescription for farming, is unfashionable: slow down. Do the work by hand before you use the tool. Understand the material before you direct the machine. Build the thing yourself, badly, laboriously, with all the friction intact, before you ask the AI to build it for you. The understanding that comes from this process — the embodied, local, specific knowledge of how things work and why they fail — is the soil from which genuine integration grows. Without it, integration is simulation. And simulation, however sophisticated, is not the whole horse.
The whole horse is not a concept. It is a relationship. It is the teamster standing in the barn on a February morning, breath visible in the cold air, running a hand along the horse's flank and knowing — not inferring, not analyzing, not consulting a database — knowing, in the specific way that only sustained presence teaches, that the horse is well today. That knowledge has no market value. No productivity metric captures it. No quarterly report records it. And it is the knowledge on which every good decision about the horse depends.
Berry's challenge to the AI-augmented organization is simple and radical: make sure someone knows the whole horse. Not the map of the horse. Not the dashboard of the horse. The horse itself, in all its irreducible, unoptimizable, living complexity.
---
In 1972, Wendell Berry published a short essay called "Think Little" in *A Continuous Harmony*, and the argument ran so contrary to the American grain that it has been largely ignored for fifty years by the people who most need to hear it: the ambitious, the capable, the people who believe that the size of the problem determines the scale of the solution.
Berry's argument is that it does not. That the most important work in the world is small work. That the grand plan, the comprehensive solution, the systems-level intervention that promises to solve everything at once, is almost always the enemy of the specific, local, patient attention that actually produces change. Thinking big is the industrial economy's signature cognitive style — the style that produced interstate highways through living neighborhoods, monoculture farms on complex ecosystems, and global supply chains that are efficient in every measurable way and fragile in every way that matters. Thinking big is what happens when abstraction replaces attention, when the model replaces the ground, when the strategist who has never walked the field decides what should grow there.
Berry's counter-proposal is not to think less. It is to think at the right scale. The right scale is the scale at which you can observe the consequences of your actions, adjust in response to what you learn, and maintain the relationship between your intervention and the system you are intervening in. For a farmer, this is the scale of a field. For a teacher, this is the scale of a classroom. For a parent, this is the scale of a family. The scale at which care is possible — not the abstract care of policy documents but the specific, attentive, daily care that makes the difference between soil that is tended and soil that is mined.
Artificial intelligence is the most powerful engine for thinking big that human beings have ever constructed. Its economics point toward scale in every dimension. The marginal cost of serving an additional user approaches zero. The training data encompasses the full breadth of human knowledge. The model can generate solutions of a generality and comprehensiveness that no individual mind can match. The gravitational pull is always toward more — more users, more applications, more domains, more capability, more reach.
Segal documents this pull honestly in *The Orange Pill*. The twenty-fold productivity multiplier he observed in Trivandrum was not a multiplier of the same work at the same scale. It was an expansion — of scope, of ambition, of what a single person or a small team could attempt. Engineers who had spent years in narrow technical lanes started reaching across boundaries. Features that would have required months of coordinated effort materialized in days. The imagination-to-artifact ratio collapsed, and the natural response to that collapse was to imagine bigger, to attempt more, to operate at a scale that the previous constraints had made impossible.
Berry would observe that the natural response is precisely the dangerous one. Not because ambition is wrong, but because the expansion of capability without the expansion of understanding produces action that exceeds comprehension. The farmer who acquires a larger tractor can plow more land. Whether the farmer should plow more land — whether the additional land can sustain the plowing, whether the soil will hold, whether the watershed will absorb the runoff — are questions the tractor does not answer. The tractor makes the plowing possible. The judgment about whether to plow requires a different kind of knowledge: the local, specific, patient knowledge of this land, in this season, under these conditions.
AI makes things possible. It does not make them wise. The distinction is Berry's central contribution to the AI discourse, and it is the contribution most consistently missed by the people who are most excited about what AI makes possible.
The Front Porch Republic essayist who wrote in December 2025 that "the question is not what is coming, but what is good" was channeling Berry directly. The industrial economy asks: What can be done? Berry asks: What should be done? And the word "should" carries, in Berry's usage, a weight that the industrial economy does not recognize — the weight of consequence, of responsibility, of the recognition that every action has effects that extend beyond the actor's intention and beyond the actor's sight.
Think little, in the context of AI, means something specific. It means resisting the tool's gravitational pull toward scale. It means asking: What is the specific problem, in the specific place, for the specific people? Not what AI can do in general, but what it should do here, now, in this classroom, for this student, in this community, for this family. The answer will be different in Lagos and San Francisco, in a rural school district and a research university, in a startup and a hospital. Berry insists that it should be different — that the difference is not a bug in the system but the most important feature of the system, because the difference reflects the specificity of the context, and the context is where people actually live.
Segal invokes the developer in Lagos as the moral case for AI democratization, and the case is genuine. The developer deserves tools. Berry would not deny this. But Berry would ask: tools designed by whom, for what context, with what assumptions about what the developer needs? The tools Segal describes — Claude Code, frontier language models, the infrastructure of Silicon Valley's AI ecosystem — were designed in San Francisco. They were trained on data that is predominantly English, predominantly Western, predominantly reflective of the knowledge economy's specific assumptions about what constitutes valuable work. They are universal tools, and Berry has spent sixty years arguing that universal tools serve universal abstractions at the expense of local realities.
The developer in Lagos does not need a universal tool. She needs a tool that serves her context — her infrastructure constraints, her community's specific needs, her knowledge of the local market and the local culture and the specific problems that the people around her face daily. A universal tool may serve some of these needs. It cannot serve all of them, because serving all of them would require the local knowledge that no universal system possesses. Think little means building tools that serve specific communities rather than all communities, even though the economics of AI push relentlessly in the opposite direction.
Berry's "think little" is not a rejection of capability. It is a discipline of attention. The discipline says: before you scale, understand. Before you generalize, specify. Before you build for millions, build for the person in front of you, and make sure what you build serves that person — not the abstracted, averaged, universalized "user" of the product manager's imagination, but the actual human being with actual needs in an actual place.
This discipline is harder than thinking big. Thinking big is intoxicating because it liberates the thinker from the messy, resistant, specific details of actual implementation in actual communities. Thinking little requires the thinker to stay in the mess, to attend to the specifics, to resist the abstraction that makes the work feel more important than it is. Berry's farmer does not think about agriculture in the abstract. The farmer thinks about this field, this year, this rain. The specificity is the discipline, and the discipline is what produces results that serve life rather than serving the thinker's ambition.
The parent at the kitchen table, the person Segal identifies as the reader he wrote *The Orange Pill* for, does not need to think big about AI. The parent needs to think little. Not: What will AI do to the economy? Not: What will the future of work look like? These questions are important. They are not the parent's questions. The parent's question is: What does my child need, tonight, at this table, in this conversation? Does my child need a lecture about the future of technology? Or does my child need something simpler — the specific presence of a person who has set down the phone, closed the laptop, and chosen to be here, in this kitchen, with this child, attending to what this child is feeling and thinking and struggling with?
Berry would say: the parent who chooses presence over information has made the more important choice. Not because information is worthless, but because presence is what the child actually needs, and the child's need is more specific, more immediate, and more important than any abstraction about the future of the economy.
Think little. Attend to the particular. Serve the person in front of you. Build for the place you are in. These are Berry's prescriptions, and they are harder to follow in the age of AI than at any previous moment in human history, because AI makes thinking big feel not just possible but obligatory. The tool can serve millions. The economics demand millions. The ambition reaches for millions. And the person in front of you — the child at the table, the student in the classroom, the neighbor who needs help — disappears into the abstraction, because the abstraction is more exciting, more scalable, and more likely to appear in a quarterly report.
Berry, from his farm in Henry County, would say: the person in front of you is not an abstraction. The person is a life. Tend it.
---
In 1958, Walker Percy published an essay called "The Loss of the Creature" in which he described a problem that has become, in the decades since, not less urgent but more: the problem of mediation. Percy's argument was that modern culture interposes so many layers between the person and the thing — expectations, representations, expert opinions, institutional framings, photographic previews — that the person never encounters the thing itself. The tourist at the Grand Canyon does not see the Grand Canyon. The tourist sees the postcard of the Grand Canyon, superimposed on the actual canyon, and measures the actual experience against the pre-consumed representation. The canyon is there. The experience is not, because the experience has been mediated into nonexistence.
Percy was a novelist. Berry is a farmer. But Berry has cited Percy's influence and extended Percy's analysis into the domain Berry knows best: the relationship between the maker and the material. The craftsperson who shapes wood encounters the wood directly. The grain resists. The knot deflects the chisel. The moisture content affects the cut. Each of these encounters is specific, surprising, instructive. The wood teaches the craftsperson something that no manual contains, because the lesson is not about wood in general. It is about this piece of wood, with this grain, this moisture, this history of growth and season and sun. The encounter is unmediated. The knowledge it produces is irreplaceable.
Mediation, in Berry's extension of Percy's argument, is any layer that interposes itself between the maker and the material in a way that reduces the directness of the encounter. A power saw mediates the relationship between the carpenter and the wood. The carpenter no longer feels the grain through the resistance of the hand saw. The information is lost. The cut may be more precise — it usually is — but the carpenter's knowledge of the wood is diminished, because the knowledge that comes through the hand is different from the knowledge that comes through the eye, and the power saw has eliminated the hand's contribution.
Berry does not argue that the power saw should be abandoned. His argument, as with every technology, is about awareness — about the willingness to acknowledge what is lost in the gaining, and to make the choice deliberately rather than by default. The carpenter who uses a power saw knowing what the power saw costs in tactile knowledge is a carpenter who may choose, for certain critical joints, to return to the hand saw. The carpenter who uses a power saw without awareness of the cost will never make that choice, because the cost is invisible, and invisible costs accumulate until they become structural.
AI is the most comprehensive mediator ever inserted between a human being and the work the human being is trying to do. This is not a polemical claim. It is a description of the technology's operation. When a developer uses Claude to write code, Claude interposes itself between the developer and the code. The developer describes the desired function. Claude produces the implementation. The developer reviews the output. At no point does the developer encounter the raw material — the specific syntax, the particular logic, the resistance of the language to the developer's intention — in the unmediated way that direct coding provides.
The output may be superior. Segal documents cases where it is. Claude produces cleaner code, handles edge cases more thoroughly, and generates implementations that would have taken the developer hours or days to produce by hand. The product is better. The encounter is thinner.
Percy would recognize the structure immediately. The developer at the code is like the tourist at the canyon. The thing is there — the code, the logic, the system — but the experience of engaging with it directly has been mediated by a layer so sophisticated, so fluent, so convincingly present, that the developer may not notice the mediation is occurring. The code appears. It works. The developer moves on. The lesson the code would have taught — the specific, surprising, instructive lesson that only direct encounter can provide — has been absorbed by the mediator.
Segal describes this dynamic when he writes about the engineer who lost ten minutes of formative struggle buried in four hours of tedious plumbing. The plumbing was tedious. The engineer did not miss it. But mixed into the tedium were moments when the system behaved unexpectedly, when the configuration revealed a connection the engineer had not previously understood, when the resistance of the material taught something no documentation could convey. Those moments were the encounter with the creature — the direct, unmediated engagement with the living complexity of the system. Claude handled the plumbing. The moments disappeared. The engineer did not know what had been lost until the consequences arrived, months later, as a diminished confidence in architectural decisions that had previously felt instinctive.
Berry would say: the moments were the point. Not the tedium that surrounded them — the tedium was real and genuinely unproductive. But the moments of surprise, of resistance, of the material teaching the maker something the maker did not expect — those moments are where understanding lives. They cannot be predicted. They cannot be scheduled. They cannot be preserved while the surrounding tedium is eliminated, because they are embedded in the tedium the way wildflowers are embedded in a meadow. You cannot mow the meadow and keep the flowers. They grow together, or they do not grow.
The loss of the creature, applied to AI-assisted creative work, takes a specific and identifiable form. The writer who uses AI to generate prose does not encounter the language in the way a writer who struggles with each sentence encounters it. The struggle is the encounter. The moment when the sentence will not come — when the words resist the meaning, when the rhythm breaks, when what the writer is trying to say refuses to be said in the way the writer is trying to say it — is the moment of maximum contact with the material. It is uncomfortable. It is frustrating. It is the specific experience that Berry, borrowing from Percy, would call the creature: the thing itself, encountered without mediation, in all its difficulty and surprise.
AI eliminates this encounter by providing sentences that work. The sentences may be excellent. They may express the meaning more clearly than the writer could have expressed it alone. But the writer has not encountered the language. The writer has encountered Claude's interpretation of the language, which is a different thing entirely — as different as the postcard is from the canyon.
A culture organized around AI-assisted production will, over time, lose the capacity for direct encounter with the materials of its own creativity. This is not a prediction about a distant future. It is a description of a process already underway, documented by Segal himself, by the Berkeley researchers, by every honest account of what it feels like to work with AI day after day. The fluency increases. The ease increases. The output increases. And something else — something harder to name, something Percy called the creature and Berry calls the direct relationship between the maker and the material — decreases.
Berry would not prescribe abandonment of the tool. He has never been an absolutist, despite the caricature. Berry would prescribe awareness. The awareness that every mediation costs something. The awareness that the cost is real even when it is invisible. The awareness that a developer who never debugs by hand, a writer who never struggles with a sentence, a student who never labors through a problem set, has been separated from the creature — from the direct encounter with difficulty that is where understanding, and ultimately mastery, and ultimately the pleasure of genuine competence, are born.
The loss of the creature is not a catastrophe. It is an erosion. It happens gradually, imperceptibly, one mediated encounter at a time. The carpenter who uses the power saw for the first time notices what is missing — the feel of the grain, the rhythm of the hand. The carpenter who has used the power saw for twenty years does not notice, because the memory of what was lost has itself been lost. The new normal is the only normal. The mediated experience is the only experience. The postcard is the only canyon.
Berry's prescription is practice. Regular, deliberate, unmediated practice. The return to the hand saw, not because the hand saw is more efficient — it demonstrably is not — but because the hand saw preserves the encounter. The return to the blank page, not because the blank page is more productive — it demonstrably is not — but because the blank page is where the writer meets the language without a mediator, and the meeting is where the writer's understanding of the language deepens, and the deepening is what makes the writer's subsequent use of AI genuinely collaborative rather than merely delegative.
Percy closed his essay with the observation that the creature can be recovered — but only through a deliberate act of divestiture, a stripping away of the mediating layers, a willingness to encounter the thing fresh, as if for the first time. Berry, writing from his farm where he encounters the soil fresh each morning, would say: the divestiture is not dramatic. It is daily. It is the choice to do something by hand that could be done by machine, not as a gesture of resistance but as an act of renewal. The act of maintaining the relationship between the maker and the material that no machine can maintain on the maker's behalf.
The creature is still there. The canyon is still there. The code is still there. The language is still there. The wood is still there. The question is whether anyone is still encountering them directly, or whether the mediation has become so total, so fluent, so comfortable, that the encounter itself has been forgotten.
Wendell Berry's essay "Health Is Membership," published in 1995, begins with a story about his brother, who was hospitalized after a heart attack. The medical system treated the heart. It treated the arteries. It treated the specific organ that had failed. What it did not treat, because it had no mechanism for treating, was the man — the farmer, the husband, the father, the member of a community in Henry County, Kentucky, whose health was inseparable from the health of the relationships, the daily practices, the mutual obligations, and the particular piece of land that constituted his life.
Berry's argument is that health is not a property of an individual organism. Health is a property of a system — a web of relationships between the person and the community, the community and the land, the land and the watershed, the watershed and the region. Pull one thread, and the others weaken. Sever one connection, and the organism may continue to function, but the functioning is diminished in ways that no individual metric can detect, because the diminishment is not in the organism. It is in the space between the organism and everything it depends on.
The argument sounds pastoral. It is, in fact, the most radical claim Berry makes, because it challenges the foundational assumption of the industrial economy: that the individual is the unit of analysis. The industrial economy measures individual productivity, individual health, individual wealth, individual capability. Berry insists that these measurements are fictions — not because the individuals do not exist, but because the individuals do not exist independently. They exist in relationship, and the quality of the relationship determines the quality of everything the individual metrics claim to measure.
Applied to AI-augmented work, Berry's argument about membership produces an analysis that neither the triumphalists nor the critics have adequately developed. The triumphalists celebrate the expansion of individual capability — the developer who can do the work of twenty, the solo founder who ships a product without a team, the designer who implements end-to-end without an engineer. These are real expansions. They are measurable. They represent genuine additions to the sum of what a single person can accomplish.
Berry would ask: at what cost to membership?
The question is not rhetorical. The cost is specific and observable. When a developer can do the work of twenty, the developer does not need nineteen colleagues. The economic logic is immediate: if one person plus AI can produce what twenty people produced before, nineteen positions become redundant. Segal acknowledges this pressure honestly in *The Orange Pill* — the quarterly conversation where the arithmetic is on the table, the investor who understands headcount reduction in the bones. Segal chose to keep the team. Berry would honor the choice and then ask the harder question: how long can any organization sustain a decision to maintain community against an economic logic that rewards its dissolution?
The economic pressure toward AI-augmented individualism is not a bug in the system. It is the system's primary output. Every efficiency gain that reduces the number of people required for a task is, simultaneously, a reduction in the number of relationships required for that task. The reduction in relationships is, in Berry's framework, a reduction in health — not the health of the product, which may improve, but the health of the community of practice that produced it.
Consider what a team provides that a solo practitioner with AI does not. A team provides the friction of disagreement — the moment when one developer says "that will break" and another says "no it won't" and the ensuing argument produces an understanding that neither possessed alone. A team provides the slow accumulation of trust — the specific knowledge that this person will catch what you miss, that this person's judgment in this particular domain is more reliable than your own, that this person will tell you the truth when the truth is uncomfortable. A team provides mentorship — the transfer of embodied knowledge from experienced practitioners to newcomers through the daily, patient, often inefficient process of working alongside each other.
AI provides none of these. AI provides answers. It provides competence. It provides the specific capability of generating correct outputs in response to well-formed inputs. What it does not provide, and what Berry would insist it cannot provide, is the mutual obligation that constitutes membership. The AI does not depend on the developer. The developer depends on the AI. The relationship is asymmetric. And asymmetric relationships, in Berry's framework, are not relationships at all. They are transactions.
The Berkeley researchers documented the dissolution of team boundaries in AI-augmented organizations. Designers started writing code. Engineers started making product decisions. The boundaries between roles blurred. From the productivity perspective, this was expansion — more capability, more flexibility, more output per person. From Berry's perspective, it was atomization. Each person, equipped with AI, became more self-sufficient and less connected. The need for each other — the specific, daily, practical need that is the foundation of community — diminished with each capability the tool provided.
Berry would note that the dissolution of need is the dissolution of membership. A community in which no one needs anyone else is not a community. It is a collection of individuals who happen to occupy the same organizational space. The collection may be productive. It is not healthy, in Berry's sense, because health requires the specific bonds that only mutual dependence creates.
Segal's account of the Napster team provides the counter-evidence that Berry's framework demands. The trust that Segal describes — the "fast trust" earned through navigating chaos together — is precisely the kind of membership Berry values. It was not produced by AI. It was produced by the specific, human, friction-rich experience of building something together under pressure, of depending on each other, of discovering through shared struggle what each person could be relied on for. The trust preceded the AI tools. The question is whether it can survive them.
Berry would predict, based on sixty years of observing what happens to communities when the economic basis for mutual dependence is removed, that the trust will erode. Not through any dramatic rupture. Through the slow, imperceptible process of disuse. When the developer no longer needs the designer's help, the developer stops asking. When the designer no longer needs the engineer's implementation, the designer stops consulting. Each person becomes more capable and more isolated, and the isolation is invisible because the capability masks it. The team still meets. The Slack channel still hums. The quarterly offsite still happens. But the substance of the connection — the daily, practical, specific dependence on each other that is what Berry means by membership — thins with each task that AI makes solitary.
The Alex Finn case — the solo builder who shipped a revenue-generating product with zero days off — is, in Berry's framework, a case study in the pathology of disconnection. The product exists. The revenue flows. The builder is alone. Not alone in the romantic sense of the solitary genius — alone in the specific sense of a person who has no one to depend on and no one who depends on them in the context of their work. The aloneness is productive. It may even be exhilarating. It is not healthy, because health is membership, and membership requires the specific vulnerability of needing other people.
Berry's prescription is not to reject AI in order to preserve artificial dependencies. The prescription is to recognize that the dependencies AI eliminates were not artificial. They were the mechanism through which communities of practice developed the mutual knowledge, mutual trust, and mutual obligation that constitute their health. When the mechanism is removed, the health must be maintained by other means — deliberately, expensively, against the economic current that rewards efficiency over connection.
Segal's decision to keep and grow the Napster team is one such deliberate maintenance. Berry would ask what specific practices sustain the team's membership now that AI has reduced the practical need for it. Not the quarterly offsite. Not the team-building exercise. The daily practices — the pair programming, the code review, the design critique, the lunch-table argument about whether the feature is good enough — that produce the specific, local, embodied knowledge of each other that is what membership actually means.
Berry wrote in "Health Is Membership" that "the community, in the fullest sense, is the smallest unit of health and that to speak of the health of an isolated individual is a contradiction in terms." The sentence applies to organizations as directly as it applies to neighborhoods. The health of a development team is not the sum of the individual developers' capabilities. It is the quality of the relationships between them — the trust, the knowledge, the mutual obligation, the willingness to tell each other the truth. AI augments individual capability. It does not augment the relationships. And the relationships, Berry insists, are where health lives.
---
When despair for the world grows in me and I wake in the night at the least sound in fear of what my life and my children's lives may be, I go and lie down where the wood drake rests in his beauty on the water, and the great heron feeds. I come into the peace of wild things who do not tax their lives with forethought of grief. I rest in the grace of the world, and am free.
Wendell Berry published this poem in 1968 in a collection called *Openings*. It has become, in the decades since, the most widely read poem by a living American writer, reprinted in anthologies and quoted in commencement addresses and pinned to refrigerator doors and shared in the small hours of the night by people who need the specific solace it provides. The poem's power is not merely literary, though it is literary. Its power is diagnostic. It names a condition that millions of people recognize without having had words for it: the condition of waking in the night, taxed by forethought of grief, unable to rest because the mind will not stop anticipating disaster.
The condition Berry describes in 1968 has become, in the age of artificial intelligence, the default setting of the human nervous system. Not for everyone. Not in every moment. But with a prevalence and an intensity that would have startled Berry himself, who wrote the poem when the most invasive technology in the average household was a television set with three channels.
Segal's *Orange Pill* is a book written from inside this condition. The author admits it directly: the 3 a.m. sessions, the inability to stop building, the exhilaration that curdles into compulsion, the recognition that "the whip and the hand that held it belonged to the same person." The book is a sustained act of forethought — of anticipating what AI will do to work, to creativity, to education, to parenting, to the fundamental question of what human beings are for. The forethought is valuable. It is also, in Berry's diagnosis, the disease.
Not the forethought itself. The inability to set it down.
Berry's poem does not argue against thinking about the future. It does not propose ignorance as a remedy for anxiety. It proposes something more specific and more radical: the deliberate entry into a state of being where forethought is absent. Not suppressed. Not medicated. Absent — because the person has placed themselves in the presence of creatures who do not possess it. The wood drake does not worry about tomorrow. The heron does not calculate risk. They exist in the present tense, and their existence in the present tense creates a space — an opening, as the collection's title suggests — into which the human being can enter and, for a time, rest.
The peace of wild things is the peace of unmediated presence. It is the opposite of everything the AI-augmented workday produces. The AI-augmented workday is a sustained exercise in forethought: What should be built? What will the market want? What does the user need? What will the competition do? What will the quarterly numbers show? The questions are valid. They are also endless. They generate more questions. The forethought feeds on itself, expanding to fill every available cognitive space, colonizing lunch breaks and elevator rides and the minutes before sleep — the specific colonization the Berkeley researchers documented as "task seepage."
Berry's prescription is not productivity management. It is not the "structured pauses" or "AI Practice" frameworks that Segal and the Berkeley researchers recommend, though Berry would not oppose them. Berry's prescription is wilder, less domesticated, harder for the industrial economy to absorb: go to the place where the wild things are. Lie down. Rest in the grace of what is, rather than the anxiety of what might be. Allow yourself, for a period that has no productive justification whatsoever, to simply be a creature among creatures, present to the world as it exists right now, without the overlay of strategic anticipation that the AI age has made continuous.
This prescription will strike the builder as impractical. Of course it will. The builder's identity is constructed around forward motion, and forward motion requires forethought, and forethought is the thing Berry is asking the builder to set down. The builder will say: "I cannot afford to stop." Berry, who has heard this objection from farmers for sixty years, would reply: "You cannot afford not to. The soil of your attention is being mined. The yields are impressive. The depletion is invisible. But the depletion is real, and if you do not rest the field, the field will stop producing, and you will not understand why, because you were too busy producing to notice the soil thinning beneath your feet."
The analogy to agricultural rest is not decorative. It is structural. Berry's farming practice includes deliberate periods of rest — fields left fallow, animals rotated to fresh pasture, the Sabbath observed not as a religious obligation alone but as an agricultural necessity. The land that is never rested is land that is being mined. The mind that never rests is a mind that is being mined. The outputs continue. The capacity diminishes. And the diminishment is invisible to every metric except the one that matters most: the quality of the person's presence to their own life.
Berry has written dozens of Sabbath poems over forty years — poems composed on Sunday mornings while walking in the woods near his farm, in the specific state of mind that the absence of work makes possible. The poems are not about rest as recovery. They are about rest as encounter — the encounter with the world as it is when you are not trying to change it. The encounter Berry calls "the grace of the world," which is available only to a creature who has temporarily relinquished the ambition to improve the world and has accepted, for this moment, the world's own terms.
Segal's book ends with a sunrise. The metaphor is of ascent, of arrival, of having climbed the tower and earned the view. Berry would appreciate the effort. He would also observe that the sunrise is available without the tower. The sunrise is available to anyone who goes outside at dawn and stands still. The sunrise does not require a journey. It requires presence. And presence — genuine, unmediated, un-augmented presence — is the thing the AI age makes hardest and most necessary.
The peace of wild things cannot be amplified. This is the sentence that separates Berry's vision from Segal's most completely. Segal's *Orange Pill* argues that AI is an amplifier, and that the question is whether you are worth amplifying. Berry would accept the metaphor and then point to the thing the metaphor cannot contain: the experiences that are diminished by amplification. The peace of wild things is one of them. The heron's beauty is not improved by a better camera. The wood drake's rest is not enhanced by a more efficient pond. The grace of the world is not scalable. It is available only at the scale of one creature, in one place, at one time, paying attention with nothing between the creature and the world.
A culture that has forgotten how to rest in this way — that has made rest synonymous with recovery, and recovery synonymous with preparation for more production — is a culture that has lost something no tool can replace. Not because the tool is bad. Because the thing that was lost is the thing the tool was supposed to serve. The production was supposed to make life better. If the production has consumed the life, the production has failed at its own purpose, regardless of the metrics.
Berry would end where he always ends: not with an argument but with an invitation. Go outside. It does not need to be the woods of Henry County, though the woods of Henry County are available and Berry would recommend them. It can be the backyard. The park. The strip of grass between the parking lot and the sidewalk where, if you look, something is growing that no one planted and no one maintains. The wild thing, persisting in the interstices of the built environment, is still there. It is still resting in its beauty on the water. It is still feeding without forethought of grief.
The invitation is to join it. Not permanently. Not as a renunciation of the tools or the work or the ambition that makes building meaningful. As a practice. As a discipline. As the specific, daily, non-productive act of allowing yourself to be a creature in a world of creatures, present to what is, unburdened for this moment by what might be.
The beaver must sleep. The dam must be left unattended for the night. The river will still be there in the morning. And the morning will be different — not because the river has changed, but because the builder who returns to it has rested, and the rest has restored something that no amount of building can produce: the capacity to be present to the world, and to care about it, and to know why the caring matters more than the building.
Berry's final word, in every essay and every poem and every novel and every letter, is not an argument. It is a benediction: I rest in the grace of the world, and am free.
---
The soil under my fingernails has always been digital. I have never farmed. I have never owned land that required my body's attention more than a few hours a month. The closest I come to Berry's Kentucky hillside is the potted lemon tree on my terrace that I forget to water more often than I care to admit. I am, by every measure Berry would apply, the person his philosophy is aimed at rather than the person it speaks from.
That is precisely why these ten chapters shook something loose in me that the tower of *The Orange Pill* could not reach.
I built *The Orange Pill* as an ascent. Five floors. A staircase. A sunrise. The metaphor carried something real — the view from the roof genuinely clarified what the ground floor could not show me. But Berry's question haunts the architecture: What is happening to the ground while you are climbing? The soil of understanding, the relationships that constitute health, the daily practice of tending something specific — Berry insists these are not the foundation you leave behind on your way up. They are the thing itself. The tower stands on them or it does not stand.
I keep returning to his line about intelligence getting us into trouble it cannot get us out of. Not because I think AI is that trouble, necessarily. Because I recognize the posture Berry is warning against — the confidence that because we can see the problem clearly, we can solve it completely. I have that confidence. It is my operating system. It is what makes me useful as a builder and dangerous as a steward.
The peace of wild things will not appear in my quarterly review. It will not improve my product roadmap. It will not make me more competitive. And yet I find myself suspecting — in the specific, uncomfortable way that Berry's writing produces suspicion in people like me — that the peace is not a luxury the productive person earns but a necessity the productive person cannot afford to skip. That my lemon tree's slow insistence on being watered on its schedule rather than mine contains a lesson about limits that no language model will ever generate, because the lesson requires a body, a season, and the specific patience of a creature that grows at the pace of growth rather than the pace of ambition.
I have not resolved the tension between Berry's world and mine. I suspect he would say the tension is not meant to be resolved. It is meant to be lived in — carefully, attentively, with the humility of a person who has learned that the system is smarter than the person working within it.
My children will inherit whatever I build or fail to build. Berry's challenge is to make sure that what they inherit includes the capacity to tend, to stay, to know a place well enough to care for it. Not just the capability to build at scale. The willingness to think little. The discipline to stand by their words. The health that comes from membership in something that needs them as much as they need it.
The grace of the world is still available. Not behind a paywall. Not after a subscription. It is outside, right now, in whatever passes for wild things in whatever place you are reading this. The wood drake does not require your optimization. The heron does not need your roadmap. They are resting in their beauty, and the invitation is open.
I am going to water the lemon tree.
— Edo Segal
Wendell Berry has spent sixty years farming the same Kentucky hillside and asking a question the technology industry has never paused long enough to consider: Does this tool preserve the capacity of the people who use it to care for the domains they tend? In this volume of *The Orange Pill* series, Edo Segal walks Berry's agrarian philosophy directly into the AI revolution — examining what happens when the friction that built understanding is optimized away, when teams dissolve into augmented individuals, and when production outpaces the care that gives production meaning. Berry's framework does not reject the tools. It asks whether we are using them or being mined by them. The distinction, as any farmer knows, is the difference between soil that grows richer and soil that blows away.
A reading-companion catalog of the 28 Orange Pill Wiki entries linked from this book — the people, ideas, works, and events that *Wendell Berry — On AI* uses as stepping stones for thinking through the AI revolution.
Open the Wiki Companion →