By Edo Segal
The thing I could not explain was why the notebook felt different.
Not better. Not worse. Different in a way that mattered and that I had no language for. I describe in *The Orange Pill* a moment where the prose Claude and I produced together sounded better than the thinking behind it — where I closed the laptop, walked to a coffee shop, picked up a pen, and wrote by hand until the friction of the slower medium forced the thinking to catch up with the words. The version that emerged was rougher. More qualified. More honest about what I did not know. And unmistakably mine.
I knew something had happened in the switch from screen to paper. I could feel it. But I could not name it. The builder's fishbowl does not contain a vocabulary for the relationship between a hand and a surface, for what is gained when a medium resists you and what is lost when it does not.
Tim Ingold gave me that vocabulary.
Ingold is a social anthropologist who has spent four decades watching people make things — potters, weavers, boat builders, hunters, basket-makers — and asking a question so basic it sounds naive: What actually happens between the maker and the material? His answer dismantles a twenty-four-hundred-year-old assumption that sits beneath every product roadmap and every prompt I have ever typed. The assumption that creation starts with a form in your mind and ends when you impose it on passive matter. Blueprint, then building. Prompt, then output.
Ingold says that sequence is exactly backward. The potter does not force clay into a predetermined shape. She enters a dialogue with the clay — follows its moisture, its plasticity, its tendency to sag — and the form emerges from the conversation. The knowledge she develops lives not in her mind but in the relationship between her hands and the material. Sever the relationship, and the knowledge does not migrate to her brain. It ceases to exist.
Read that again. The knowledge ceases to exist.
This is the lens Ingold offers the AI moment. Not a rejection of the tools — I am building with them every day and have no intention of stopping. But a precise diagnosis of what changes in the maker when the material drops out of the equation. When the dialogue becomes a monologue. When correspondence becomes instruction.
The chapters that follow will challenge assumptions you hold dear. They challenged mine. That is why they belong in this series.
— Edo Segal & Opus 4.6
Tim Ingold (1948–present) is a British social anthropologist widely regarded as one of the most original thinkers in contemporary anthropology. Born in England, he studied at Cambridge and spent the majority of his academic career at the University of Aberdeen, where he is now Professor Emeritus. His fieldwork among the Skolt Sámi of northeastern Finland shaped his early career, but his influence extends far beyond any single ethnographic tradition. Ingold's major works include *The Perception of the Environment* (2000), *Lines: A Brief History* (2007), *Being Alive* (2011), *Making: Anthropology, Archaeology, Art and Architecture* (2013), and *Correspondences* (2021). His key concepts — correspondence, enskilment, the meshwork, wayfaring, and the critique of hylomorphism — have reshaped how scholars across disciplines understand the relationships between skill, making, material engagement, and knowledge. He is a Fellow of the British Academy and the Royal Society of Edinburgh, and his work has become newly urgent in debates about what artificial intelligence means for human craft, embodied knowledge, and the nature of intelligence itself.
Every act of creation in the history of Western civilization carries a hidden assumption, so deeply embedded in the way people think about making that most makers have never noticed it operating. The assumption is this: first you conceive the form, then you impose it on material. First the blueprint, then the building. First the idea, then the artifact. Aristotle gave this assumption its name twenty-four hundred years ago — hylomorphism, from *hyle* (matter) and *morphe* (form) — and it has structured every subsequent theory of creation from Renaissance architecture to industrial engineering to modern software development. The assumption feels so natural, so obviously correct, that questioning it sounds like questioning gravity.
Tim Ingold has spent four decades questioning it.
The argument is deceptively simple. Ingold, a social anthropologist who trained at Cambridge and spent most of his career at the University of Aberdeen, observed that the hylomorphic model gets making exactly backward. The potter does not begin with a completed form in her mind and then force clay to conform to it. She begins with clay — wet, responsive, temperamental clay that has its own moisture content, its own plasticity, its own tendency to sag or crack or hold — and she enters into a dialogue with that clay, following its properties, responding to its resistance, discovering through the pressure of her hands against the wheel what the clay will and will not permit. The form that emerges is not the realization of a prior design. It is the outcome of a process of mutual responsiveness between maker and material that could not have been fully predicted before the making began.
This is not a minor philosophical quibble. It is a wholesale reframing of what creation is. In the hylomorphic model, the maker is an agent who imposes will on passive matter. In Ingold's model, the maker is a participant in a process of correspondence — his technical term for the mutual attentiveness between a living being and the world it inhabits. The form does not precede the making. The form is a property of the making itself.
Now consider the prompt. A user sits before a screen, describes what she wants in natural language — a piece of software, a business plan, a chapter of a book — and the machine produces it. The imagination-to-artifact ratio, the distance between the form in the mind and its material realization, collapses to the duration of a conversation. *The Orange Pill* celebrates this collapse as the most significant expansion of human creative capability since the invention of writing. The barrier between what a person can imagine and what a person can build has been reduced to nearly nothing. Anyone with an idea and the ability to describe it can now produce a working artifact in hours.
From Ingold's perspective, this celebration rests on the very illusion his career has been devoted to dismantling. The prompt-execute cycle is hylomorphism in its purest historical expression — purer than any previous technology has managed to achieve. The architect still had to negotiate with the properties of stone. The industrial designer still had to account for the tolerances of injection-molded plastic. The programmer still had to wrestle with the syntax and logic of a programming language that imposed its own constraints on the form of the solution. In every previous mode of making, the material pushed back. The resistance of the medium forced the maker into a dialogue, however minimal, with something other than her own intention.
The prompt-execute model eliminates this resistance almost entirely. The user describes the form. The machine produces the artifact. The material — the code, the text, the design — offers no resistance that the user must negotiate. The hylomorphic dream, twenty-four centuries old, has finally been realized: form imposed on matter without friction, without negotiation, without the inconvenient demands of a medium that has its own tendencies and its own logic.
And that, in Ingold's framework, is precisely the problem. Not because the products are inferior — they may be excellent. Not because the process is dishonest — the user genuinely conceived the intention. But because the model of creation that the prompt-execute cycle perfects is the model that most profoundly misunderstands what making actually is and what it actually does to the maker.
In his 2013 book *Making: Anthropology, Archaeology, Art and Architecture*, Ingold drew a distinction between two kinds of knowledge that bears directly on this question. The first kind is propositional knowledge — knowledge that can be stated, transmitted, stored, and retrieved. The capital of France is Paris. The boiling point of water at sea level is one hundred degrees Celsius. This is the kind of knowledge that AI handles with extraordinary competence: vast, searchable, instantly retrievable, articulable in natural language with a fluency that rivals any human expert.
The second kind is what Ingold, drawing on the phenomenological tradition, calls knowledge from the inside — the kind of understanding that develops only through sustained practical engagement with a task, a material, a domain. The potter's knowledge of clay is not primarily propositional. She could not write a manual that would transmit what she knows to a novice, because what she knows is not a set of facts about clay. It is a set of capacities — developed through thousands of hours of hands-on engagement — for sensing the clay's readiness, responding to its behavior, adjusting her pressure, her speed, her angle in real time according to feedback that arrives through her fingertips, not through her intellect. This knowledge does not exist in her mind. It exists in the relationship between her hands and the clay. Remove the clay, and the knowledge has no location. It belongs to the correspondence, not to the correspondent.
When *The Orange Pill* describes the senior engineer who "could feel a codebase the way a doctor feels a pulse," it is describing exactly this kind of knowledge — embodied, relational, developed through the sustained correspondence between a practitioner and her medium. And when it describes the same engineer's realization that "something beautiful was being lost," it is describing, whether it knows it or not, the dissolution of the correspondence in which that knowledge lived.
Ingold's foundational 1993 essay "Technology, Language, Intelligence" proposed a thought experiment that now reads as prophetic. He asked what would happen if, instead of organizing inquiry around the triad of technology, language, and intelligence — the three concepts that structure virtually all thinking about AI — scholars organized it around craftsmanship, song, and imagination. The resulting account, he predicted, would be "very different." Technology treats tools as means to predetermined ends. Craftsmanship treats them as extensions of the practitioner's bodily engagement with the world. Language, in the information-processing model, is a code for transmitting data between minds. Song is a form of participation in the sounding world, a joining of voice to the acoustic environment. Intelligence, in the computational model, is the capacity to process information. Imagination is the capacity to perceive what is not yet present — to sense the latent possibilities in a situation and respond to them creatively.
The triad of technology, language, and intelligence produces AI. The triad of craftsmanship, song, and imagination does not. And the question Ingold's reframing raises is whether the first triad, the one that produced AI, is the right frame for understanding human creation in the first place — or whether it has always been a reduction, a narrowing of what making is to only those aspects that can be mechanized.
The prompt-execute model works because it operates entirely within the first triad. The user's intention is treated as information. The machine processes that information. The output is a technological artifact. At no point in this process does craftsmanship — the embodied, sensory, relational engagement of the maker with her medium — enter the picture. At no point does song — the participatory joining of the maker's voice to something larger than herself — play a role. Imagination is present, but only in its reduced form: the capacity to conceive a desired outcome, not the capacity to sense latent possibilities in an unfolding situation and respond to them as they emerge.
This is not a failing of the technology. It is a perfect expression of the model of creation that produced it. AI could not have been built by people who understood creation as correspondence, because correspondence cannot be automated. Correspondence requires two parties, each attending to the other, each responding in ways that are conditioned by but not determined by the other's behavior. Automation requires one party — the user — who determines the outcome, and another — the machine — that executes. The asymmetry is structural. It is what makes the system work. And it is what makes the system, from Ingold's perspective, a perfection of the hylomorphic illusion rather than a transcendence of it.
There is a counterargument, and it deserves to be stated clearly before it is examined. *The Orange Pill* describes moments when the collaboration between author and AI escapes the prompt-execute model — moments of genuine surprise, when the machine offers a connection the author had not anticipated, when the output redirects the inquiry in ways neither party predicted. These moments are real. They are, in Ingold's vocabulary, moments of correspondence: the human attending to what the machine offers, the machine responding to what the human provides, and something emerging from the exchange that neither could have produced alone.
But these moments occur within a medium — natural language — that is already an abstraction from the material world. The correspondence between author and AI is a correspondence between two processors of symbolic representation. The resistance the author encounters is not the resistance of clay or wood or fiber. It is the resistance of language: the difficulty of finding the right words, the friction of articulating a half-formed thought. This is real friction. It produces real knowledge. But it is friction of a specific and limited kind — cognitive, linguistic, representational — and it exercises a specific and limited set of human capacities.
The hands are idle. The body is still. The senses, apart from vision and perhaps hearing, are uninvolved. The spatial, tactile, proprioceptive dimensions of human intelligence — the dimensions that Ingold has shown are essential to the most sophisticated forms of human knowing — are entirely absent from the correspondence.
What the hylomorphic machine has accomplished, then, is not the liberation of making from its material constraints. It is the completion of a centuries-long process of abstracting creation away from the body, away from the hand, away from the sensory engagement with a resistant material world that, Ingold argues, is where the deepest forms of human knowledge are cultivated. The imagination-to-artifact ratio has collapsed. But the ratio was never the right measure of creative depth. It was always a measure of something else entirely: the efficiency of the hylomorphic process. The speed with which form can be imposed on matter.
The question Ingold's framework demands is not how fast the ratio can be closed. It is what happens to the maker — to her knowledge, her capacities, her way of being in the world — when the ratio closes so completely that the material drops out of the equation altogether. When the maker no longer follows materials. When she commands results. When the dialogue becomes a monologue and the correspondence becomes an instruction.
The answer, Ingold's entire body of work suggests, is that something essential about making is lost — not the product, which may be indistinguishable from what a skilled practitioner would have produced, but the process, which is where the maker's knowledge, identity, and relationship to the world are continuously formed and reformed.
The hylomorphic machine does not make bad things. It makes the wrong kind of maker.
In a workshop in the Scottish Highlands, a boat builder named James runs his hand along the grain of a plank of larch. He has been building boats for thirty-seven years. If asked to explain what he is doing, he would say he is checking the wood. But what he is actually doing is far more complex than checking implies. His fingertips are reading the grain — sensing where it runs straight and where it curves, where it is tight and where it opens, where the wood will bend cleanly under steam and where it will splinter. This reading is not a metaphor. The information arriving through his fingertips is as precise, as content-rich, and as essential to the quality of the finished boat as any measurement taken with calipers or any specification retrieved from a database.
He cannot articulate most of what he knows. If pressed, he will say things like "this plank wants to go here" or "the grain is telling me to take it this way" — statements that sound mystical to anyone who has not spent decades in correspondence with wood, and that are, in fact, rigorously empirical descriptions of a knowledge that lives in the relationship between his hands and the material.
Ingold's term for this relationship is correspondence, and it is the central concept of his anthropology of making. Correspondence is not communication. Communication implies the transmission of a message from sender to receiver — information packaged, sent, decoded. Correspondence implies something different: an ongoing, mutual responsiveness between two parties, each attending to what the other offers, each adjusting in real time to what the other does. The potter corresponds with the clay. The woodworker corresponds with the grain. The blacksmith corresponds with the heated metal. In each case, the knowledge that emerges is not the property of the maker alone. It is a property of the correspondence — of the relationship between maker and material, practitioner and medium. Sever the relationship, and the knowledge does not migrate to the practitioner's mind. It ceases to exist.
This framing transforms the question of what AI does to human skill. The conventional framing, which *The Orange Pill* shares with most technology commentary, treats skill as a possession of the individual — something acquired through training, stored in the practitioner's brain and body, and deployed when needed. The senior engineer's ability to "feel a codebase the way a doctor feels a pulse" is, in this conventional view, a personal attribute: years of experience have deposited a layer of intuition that the engineer carries with her regardless of context. AI threatens this skill by automating the tasks through which it was developed. The threat is real but conceptualized as a threat to the individual: the engineer may lose her edge, may fail to develop the intuition, may find her expertise devalued.
Ingold reframes the threat. Skill is not a possession that can be lost from the individual. It is a quality of a relationship that can be dissolved. When the medium changes — when the engineer's correspondence is no longer with a codebase she has built and maintained and debugged over years, but with an AI that produces code she has not written, has not struggled with, has not intimately come to know — the old skill does not atrophy inside her. It disappears from the world, because the relationship in which it existed no longer exists.
The distinction is not academic. It changes what counts as an adequate response to the transformation. If skill is a personal possession, the remedy for its loss is retraining: teach the engineer new skills to replace the old ones. If skill is a property of a relationship, the remedy is more radical: cultivate new correspondences that are as rich, as demanding, and as formative as the ones being dissolved. The question is not "What new skills should this engineer learn?" but "What new relationships — between the engineer and her medium, between the practitioner and the stuff of her practice — need to be established, and are they adequate to the demands of the work?"
This is why the concept of "ascending friction," as articulated in *The Orange Pill*, is correct in its observation but insufficient in its analysis. The book argues persuasively that when AI removes the friction of implementation — syntax, debugging, the mechanical labor of converting design into code — the friction does not disappear. It relocates upward, to the level of judgment, vision, architectural decision-making. The surgeon who loses the tactile friction of open surgery gains the cognitive friction of interpreting a two-dimensional image of a three-dimensional space. The friction ascends. The work becomes harder at a higher level.
Ingold would not dispute the observation. Friction does relocate. But he would challenge the metaphor of ascent, because the metaphor carries an assumption that is never examined: that the higher floor is higher. That cognitive friction is superior to material friction. That the judgment required to direct an AI system is a more advanced form of the same capacity that was exercised in the hands-on making the AI replaced.
This assumption is the hylomorphic hierarchy reasserting itself. Mind over matter. Design over execution. The conceiver over the maker. The entire framework of ascending friction rests on the premise that the cognitive work of deciding what to build is more valuable — higher — than the material work of building it. And this premise is precisely what Ingold's anthropology calls into question.
Consider the boat builder again. The judgment he exercises when his fingers read the grain of a plank is not a lower form of the judgment a naval architect exercises when she reviews a computer-generated hull design. It is a different form of judgment entirely — one that engages bodily capacities (tactile sensitivity, spatial reasoning grounded in physical manipulation, the proprioceptive sense of how materials respond to force) that the architect's judgment does not require and cannot develop. These capacities are not primitive precursors to abstract thought. They are sophisticated forms of intelligence in their own right — forms that, as Ingold has argued throughout his career, Western intellectual tradition has systematically devalued because they do not conform to the model of intelligence as information processing.
In a 2019 interview, Ingold stated this position with characteristic bluntness: "The whole AI business, it seems to me, is built upon a faulty notion of intelligence — one that views it in purely cognitive, information-processing terms. For me there can be no intelligence which is not grounded in the perception and action of living beings, moving around in and perceiving their environments as they go." The claim is sweeping. It asserts that the very concept of intelligence on which AI is founded — intelligence as the processing of information, separable from the body that processes it — is wrong. Not incomplete. Not partial. Wrong.
The evidence for this claim comes not from philosophy but from Ingold's decades of ethnographic observation of skilled practitioners. Hunters who read animal tracks are not decoding information inscribed on the ground. They are perceiving — directly, bodily, through eyes trained by years of attending to the movements of animals — the traces of a life unfolding in a landscape they themselves inhabit. Their knowledge is not stored in their heads and retrieved when needed. It is constituted in the act of perception itself, in the active, skilled, attentive engagement of a perceiver with a perceived world.
The basket-weaver who produces an intricate pattern does not consult a stored design and then execute it. She follows the fibers — their tension, their flexibility, their tendency to curve — and the pattern emerges from the interaction of her hands with the material and the developing form of the basket. If she is interrupted and asked to describe the pattern she is making, she may not be able to, because the pattern is not a representation in her mind. It is a rhythm in her hands. It lives in the correspondence, not in the correspondent.
This understanding illuminates a quiet loss that *The Orange Pill* documents without fully grasping its nature. The engineer who had spent eight years on backend systems and "never written a line of frontend code" used Claude to build a complete user-facing feature in two days. The book presents this as liberation: the engineer was freed from the constraints of her specialization, able to reach into domains that had been walled off by the translation cost of learning new tools. And it was liberation, of a kind.
But the walls that had separated backend from frontend were not merely barriers of translation cost. They were also the boundaries of correspondences — the specific, hard-won relationships between a practitioner and her medium that constituted her skill. The backend engineer's knowledge of server architecture was not a transferable cognitive asset that happened to be locked behind a technical paywall. It was a quality of her ongoing correspondence with the systems she had built, maintained, debugged, and come to know with the intimacy of long acquaintance. When she moved to frontend development via AI, she did not carry this knowledge with her. She left it behind — or rather, it ceased to exist, because the correspondence in which it lived was no longer being practiced.
The new capability was real. The engineer could now produce interfaces she could not have produced before. But the mode of engagement was fundamentally different. She was no longer following materials — attending to the behavior of systems through direct, sustained, hands-on engagement. She was directing outputs — describing what she wanted and receiving it from a machine that did the making for her. The products were comparable. The maker was transformed.
Ingold's concept of enskilment clarifies the stakes. Enskilment is his term for the process by which skill develops — not through the transmission of knowledge from teacher to student, but through the cultivation of the student's own perceptual and practical capacities within an environment structured by the teacher's guidance. "'Understanding in practice' is a process of enskilment," Ingold writes, "in which learning is inseparable from doing, and in which both are embedded in the context of a practical engagement in the world."
The critical phrase is "learning is inseparable from doing." Enskilment cannot be shortcut. You cannot develop the boat builder's sensitivity to grain by reading about grain, by watching videos of grain, or by having an AI describe the properties of grain in exquisite detail. You develop it by handling wood. By making mistakes with wood. By ruining planks and wasting hours and slowly, through the friction of repeated engagement with a resistant material, cultivating the perceptual capacities that allow you to read the grain by touch.
AI offers knowledge without enskilment. It delivers the output of skilled practice without requiring the practitioner to undergo the process through which skill is developed. And because the output is what the market values — the working code, the legal brief, the engineering solution — the process appears dispensable. Who needs enskilment when you have the result?
The answer, from the perspective of correspondence, is: anyone who needs to understand what they have made. The result without the process is a product without a maker. It exists in the world, it functions, it may serve its purpose admirably. But no one has been formed by its making. No one has developed the perceptual and practical capacities that the making would have cultivated. No correspondence has been established between a human being and the stuff of her practice. The world has one more artifact and one fewer skilled practitioner, and the question that correspondence raises is whether this trade is sustainable — whether a civilization that produces artifacts without enskilment can maintain the depth of understanding required to know which artifacts are worth producing and which are not.
Bob Dylan did not draw a straight line from silence to "Like a Rolling Stone." He drew something more like what a river draws through a floodplain — a meandering, branching, self-correcting path that doubled back on itself, pooled in eddies, broke through unexpected channels, and arrived at a destination that could not have been identified from the headwaters. Twenty pages of what he called "vomit." Days of cutting. A band session in which Al Kooper was not even supposed to be there. The song emerged not from a plan but from a process — a process that included accident, exhaustion, overflow, collaboration, and the specific resistance of musical materials (chord progressions, vocal rhythms, the grain of a particular recording studio's acoustics) that shaped the final form as much as any intention Dylan carried into the work.
*The Orange Pill* uses Dylan to dismantle the myth of the solitary genius — the Romantic idea that great work flows from a single extraordinary mind, alone with its vision. The dismantling is effective. Dylan was never alone. He was accompanied by every musician he had ever heard, every poet he had ever read, every argument he had ever lost. The room was always crowded with influences. Creativity is relational, not individual. The genius is not the spring but the rapids — a stretch of turbulent water where multiple tributaries converge.
Ingold would agree with the demolition. He would then note that the reconstruction — the model of creativity that replaces the myth — still carries a hidden assumption about what kind of movement creative work is. And the assumption matters, because it determines what AI is understood to do to the creative process.
The hidden assumption is linearity of a particular kind. Even in the Rapids model — creativity as the convergence of influences rather than the eruption of individual genius — the movement of creation is understood as a movement from something to something. From influences to output. From training data to synthesis. From the vast implicit inputs to the specific, unprecedented product. The line runs from origin to destination, and the creative event is the moment of arrival.
Ingold has spent decades theorizing lines, and he distinguishes between two fundamentally different kinds. The first he calls the line of transport. A line of transport connects two predetermined points — origin and destination — and the movement along it is a movement of delivery. The postal service operates along lines of transport. So does GPS navigation. So does the prompt-execute cycle. The user originates an intention (the starting point), the machine delivers a result (the endpoint), and the line between them is the process of production — a process whose value lies entirely in the arrival, not in the journey.
The second kind is the line of wayfaring. A line of wayfaring does not connect two points. It grows through a landscape. The wayfarer does not know her destination in advance. She discovers it through the journey — by attending to the terrain, reading the slope, responding to what she encounters. Her movement is not transport from A to B. It is a continuous, improvisatory engagement with an unfolding world. The path she makes is not a connection between predetermined endpoints. It is a trace of her movement through the landscape, a record of her ongoing responsiveness to what the landscape offered.
The distinction maps directly onto the creative process, and it reveals something that the Rapids model, for all its sophistication, misses. Dylan's twenty pages of "vomit" were not a line of transport from exhaustion to song. They were a line of wayfaring — a movement through an emotional and linguistic landscape in which the destination was unknown and the path was discovered through the walking. The pages doubled back. They repeated. They followed threads that led nowhere and abandoned them. They were the movement of a wayfarer who did not know where he was going and who found his destination only by attending to the terrain he was traversing.
The cutting — the days of condensation that transformed twenty pages into six minutes — was a different kind of movement. It was closer to transport: Dylan knew, by then, approximately where the song wanted to go, and the work of cutting was the work of clearing the path between the material he had and the form he could sense. But even the cutting was not pure transport, because the form of the song continued to change as he cut, and the cutting revealed possibilities that the full twenty pages had concealed.
What matters here is not the biographical detail but the typology. Two kinds of movement produce two kinds of knowledge, two kinds of relationship to the work, two kinds of maker.
Transport produces arrival. The knowledge it generates is knowledge of the destination — the product, the result, the finished thing. The maker who arrives by transport knows what she has produced. She may evaluate it, compare it to her original intention, assess its quality against external standards. But her relationship to the work is the relationship of a commissioner to a commission: she specified, the work was delivered, she inspects.
Wayfaring produces what Ingold calls "knowledge from the inside" — the intimate familiarity with a domain that only comes from having moved through it attentively, responding to its features, getting lost in its complexities, finding one's way not by consulting a map but by cultivating a feel for the terrain. The maker who arrives by wayfaring knows not just what she has produced but how the terrain of production behaves — where the footholds are, where the ground gives way, where the unexpected openings appear. She carries the knowledge of the journey in her body and her habits, and that knowledge is available to her the next time she enters the terrain, as an expanded capacity for perception and response.
AI collaboration, in its dominant mode, converts wayfaring into transport. The user describes a destination. The machine calculates a route. The product arrives. The user inspects. At no point in this process does the user move through the problem space the way a wayfarer moves through a landscape — attending to its features, getting lost, finding unexpected paths, developing the felt knowledge that only the journey deposits.
This is what *The Orange Pill*'s engineer lost without knowing it. The four hours of daily "plumbing" — dependency management, configuration files, the mechanical connective tissue between components — contained, buried within their tedium, approximately ten minutes of wayfaring: moments when something unexpected happened in the configuration, something that forced the engineer to attend to a connection between systems she had not previously understood. Those ten minutes were the terrain. They were the landscape through which she moved, and the architectural intuition she developed — the ability to sense that something was wrong before she could articulate what — was the knowledge of a wayfarer, accumulated step by step through years of attentive movement through the problem space.
When Claude took over the plumbing, it converted those four hours from wayfaring to transport. The dependencies were resolved. The configurations were generated. The engineer arrived at the working system without having traveled through the territory that would have developed her felt understanding of how the system's pieces interrelated. The arrival was faster. The terrain knowledge was not deposited.
The loss compounds over time in ways that are invisible at the level of individual tasks but devastating at the level of professional development. Each day of wayfaring through a complex system deposits a thin layer of understanding — not propositional understanding, not the kind that can be stated and stored, but the kind that shows up as better questions, faster pattern recognition, a more acute sensitivity to the places where systems are likely to fail. These layers accumulate over years into something that senior practitioners call intuition and that Ingold would call the knowledge of a skilled wayfarer — someone who has moved through this particular terrain so many times that she can read its features the way a hunter reads tracks.
The prompt-execute cycle does not merely fail to deposit these layers. It renders the terrain invisible. The engineer who uses Claude to resolve a dependency does not see the dependency landscape. She does not know what the alternative resolutions were, does not know which approach Claude chose and why, does not know what tradeoffs were made in the resolution. She sees only the result: the working system. The terrain between the problem and the solution has been traversed, but not by her. She was transported across it.
There is a deeper consequence still, one that connects Ingold's line theory to the nature of creative discovery. Wayfaring produces not only terrain knowledge but surprise — the encounter with something unexpected that could not have been anticipated from the starting point. The wayfarer who sets out without a fixed destination is open to what the landscape offers in a way that the transported traveler, who knows where she is going and wants only to arrive efficiently, is not. The unexpected detour — the configuration that behaves strangely, the dependency that reveals a hidden connection between components, the bug that forces a rethinking of the entire architecture — is not a failure of efficiency. It is the mechanism of discovery.
Dylan's twenty pages of "vomit" were full of detours. Most of them led nowhere. A few led to lines that became the backbone of the song. He could not have known in advance which detours would prove productive, because the productivity of a detour is visible only in retrospect, only from the vantage point of the destination that the detour helped reveal. The wayfarer discovers the destination by getting lost. The transported traveler cannot get lost, because the route has been calculated. And because she cannot get lost, she cannot make the discoveries that only lostness produces.
The AI-assisted creative process is extraordinarily efficient at arriving at destinations. It is structurally impoverished at getting lost. The user who prompts Claude has, in the act of prompting, already determined the approximate destination. The machine delivers her there along the most efficient route. What she does not experience is the wandering — the twenty pages of overflow, the wrong turns, the dead ends that open unexpectedly onto views the straight path would never have afforded.
The most creative moments in *The Orange Pill*'s own AI collaboration — the laparoscopic surgery insight, the punctuated equilibrium connection — occurred when the process temporarily escaped the transport model and became something closer to wayfaring: the author described an impasse rather than a destination, and the machine responded with a direction rather than an answer, and the subsequent movement was genuinely exploratory, neither party knowing where it would lead. These moments are real, and they demonstrate that correspondence within AI collaboration is possible. But they are exceptional rather than structural. The medium gravitates toward transport. The user gravitates toward prompting for destinations. The moments of wayfaring are precious and fragile and require deliberate resistance to the medium's dominant mode.
The line of making is a line of growth, not a line of connection. It does not link a predetermined start to a predetermined finish. It grows through the landscape of the work, branching where the material offers an opening, pausing where it resists, doubling back where the path proves impassable. And the maker who follows this line — who attends to the terrain rather than commanding the route — arrives somewhere she could not have predicted, carrying knowledge she could not have acquired any other way.
The question for the age of AI is whether the makers who remain will still know how to get lost.
Fourteen billion years of cosmic history, and all the intelligence that emerged from it, can be described in two fundamentally different ways. The first treats the universe as a vast network: discrete entities — atoms, molecules, cells, organisms, minds — connected by relationships that transmit information between them. Intelligence, in this model, is a property of the connections. It lives in the spaces between nodes. The more connections, the denser the network, the richer the intelligence. This is the model that *The Orange Pill* adopts when it describes intelligence as a river flowing through increasingly complex channels, from hydrogen atoms to neurons to cultures to machines. It is the model of computational neuroscience, of information theory, of the internet itself. It is the dominant model of the twenty-first century.
The second way of describing the same history does not start with entities. It starts with movements. Not atoms that then connect, but flows that sometimes converge. Not nodes that then form networks, but lines that interweave into what Ingold calls a meshwork — a tangle of interwoven threads of movement and growth, where the threads are the primary reality and the intersections are merely the places where threads happen to cross.
The difference sounds abstract. It is not. It determines what AI is understood to be, what intelligence is understood to be, and what the relationship between human and machine is understood to look like.
In the network model, intelligence lives in nodes and travels along connections. A neuron is a node. A synapse is a connection. A person is a node. A conversation is a connection. An AI system is a new node in the network of intelligence, and its arrival increases the network's density, its computational power, its capacity for the kind of pattern-matching that produces insight. *The Orange Pill*'s claim that "intelligence is not a thing we possess but a thing we swim in" is stated in network terms: the river is the network, the swimmers are the nodes, and AI is a new, powerful swimmer that increases the river's current.
Ingold's meshwork inverts the priority. In a meshwork, there are no nodes. There are only lines — lines of movement, growth, becoming. What appear to be nodes are actually knots: temporary tangles where multiple lines of movement cross, converge, and continue on their way. A person is not a node in a social network. A person is a knot — a convergence of lines of ancestry, experience, relationship, aspiration, habit, memory, and practice that are always in motion, always extending outward, always entangled with the lines of other people and other beings and other forces.
The distinction matters because it changes what AI is. In the network model, AI is a new node — a new entity with its own processing power, connected to existing nodes through interfaces and protocols. The node has properties: speed, accuracy, scale. The network gains a powerful new member. The question is how the existing nodes relate to the new one: as partners, competitors, collaborators, threats.
In the meshwork model, AI is not a node. It is a new thread — a new line of movement that enters the tangle, interweaves with existing threads, alters the pattern of the whole weave. The thread has no independent existence apart from the meshwork. Its properties are not intrinsic but relational: they emerge from how the thread interacts with the other threads it encounters. And the question is not how the existing nodes relate to the new node, but how the new thread changes the pattern of the entire weave — what it tightens, what it loosens, what new tangles it creates, what existing tangles it resolves.
This reframing produces a different analysis of every phenomenon *The Orange Pill* describes. Consider the "orange pill" moment itself — the author's recognition, working late one night with Claude, that the distance between imagination and artifact had collapsed to the duration of a conversation. In the network model, this is a moment of node-level transformation: the individual recognizes that a new node (AI) has been connected to his existing network (his experience, his skills, his creative intentions), and the augmented network is more powerful than the original.
In the meshwork model, the moment is different. It is not the addition of a node to a network. It is the arrival of a new thread into a tangle — a thread that interweaves with the author's ongoing lines of movement (his decades of building, his parental anxieties, his aesthetic commitments, his physical habits of late-night work) and produces a new pattern. The pattern is not the old pattern plus a powerful new element. It is a genuinely new pattern, one in which the author's own lines of movement have been redirected, pulled in directions they were not going before, entangled with a thread whose properties they are still discovering.
The difference is felt, not calculated. The author does not add Claude to his toolkit the way one adds a new application to a device. He enters into a new mode of being — a new weather-world, a new atmosphere of work — that changes not just what he produces but how he moves through his days, what he attends to, what he ignores, where he feels most alive and where he feels most lost. The meshwork model captures this experiential dimension that the network model, with its tidy nodes and edges, systematically flattens.
Now consider Dylan. *The Orange Pill* describes Dylan as a node at the confluence of multiple cultural tributaries — Woody Guthrie, Robert Johnson, the Beat poets, the British Invasion. The intelligence of "Like a Rolling Stone" lives in the connections between these influences, the way the node synthesized the inputs. It is a network story: influences flow in, synthesis happens at the node, the product flows out.
Ingold's meshwork tells a different story. Dylan is not a node that receives and synthesizes. He is a knot — a tangle of lines that were already moving before they converged in him and that continued moving after the convergence. The line of delta blues did not terminate at Dylan. It passed through him and continued, transformed, into folk rock and country rock and a hundred subsequent musical movements. The line of Beat poetry did not deposit itself in Dylan's brain like a file uploaded to a server. It intertwined with his own line of movement — his restlessness, his exhaustion, his particular way of hearing language — and the intertwining produced something that belonged to neither line alone.
In the meshwork model, "Like a Rolling Stone" is not a product of synthesis. It is a knot — a particularly dense, resonant tangle of lines that happened to converge with exceptional intensity at a particular time and place. The song does not belong to Dylan in the way a product belongs to its manufacturer. It belongs to the meshwork — to the whole tangle of cultural movements, personal histories, technological conditions (the Columbia recording studio, the electric instruments, the particular microphone that captured the particular acoustic quality of that particular day) that converged to produce it.
This matters for understanding AI's role in creation because the meshwork model refuses the premise that creation has an author. In the network model, the author is the node: the entity at which inputs converge and from which outputs emerge. The question of authorship — who wrote this book, who composed this song, who created this product — is a question about which node deserves credit for the synthesis. This is why the question of AI authorship is so vexed in the network model: if the AI is a node that contributed significantly to the synthesis, where does the human's authorship end and the machine's begin?
The meshwork dissolves the question. If creation is not the product of a node but the pattern of a weave — if the song, the book, the product is a knot in a meshwork of moving lines — then the question of authorship is malformed. Nobody authored the knot. The lines produced it by converging. The human's line of movement is one thread. The AI's is another. The cultural context is another. The deadline pressure is another. The physical state of the maker at three in the morning is another. The knot is the product of all these threads, and asking which thread deserves credit is like asking which thread in a tapestry is responsible for the pattern.
This dissolution of authorship is not comfortable, and it is not the argument that *The Orange Pill* makes. The book maintains, carefully and with evident discomfort, a model of authorship in which the ideas are the human's and the expression is collaborative — a model that still depends on the network concept of the author as a node with identifiable contributions. But the discomfort is revealing. Every passage in which the author wonders whether a particular insight belongs to him or to Claude or to the collaboration is a passage in which the meshwork model is straining against the network model that the prose is trying to maintain.
The most theoretically radical implication of the meshwork model is its treatment of intelligence itself. *The Orange Pill* describes intelligence as flowing through the universe like a river — from hydrogen to neurons to algorithms — and this image captures something real: the continuity of pattern-formation across scales and substrates. But the river metaphor, for all its power, carries a network-model assumption: that intelligence is a substance that flows through channels, and the question is how wide the channels are and how fast the current runs.
In the meshwork, intelligence is not a substance. It is the quality of the weave — the density, the complexity, the responsiveness of the tangle. Intelligence is not something that nodes possess or that flows between them. It is what happens when lines of movement interweave with sufficient complexity that the meshwork begins to exhibit properties that no individual thread possesses. Consciousness, creativity, understanding — these are properties of the meshwork, not of the nodes.
This has a specific, testable consequence for thinking about AI. If intelligence is a property of the meshwork rather than of nodes, then adding a powerful new thread (AI) does not simply increase the total intelligence in the system, the way adding water to a river increases the total flow. It changes the pattern of the weave. And the pattern may become denser in some areas and thinner in others. The total "amount" of intelligence is not the relevant measure. The relevant measure is the quality of the weave — whether the new thread enriches the tangle or crowds out the older threads that gave it its specific character.
Ingold would observe that the AI thread is unusually thick and fast-moving. It interweaves with enormous breadth — touching nearly every domain of human practice simultaneously — and with enormous speed, reshaping the pattern faster than the other threads can adjust. When a new thread enters a meshwork with this kind of force, the risk is not that the meshwork will lack intelligence. It is that the older threads — the slower, thinner, more fragile lines of embodied skill, local knowledge, tacit understanding, sensory engagement with the material world — will be overwhelmed. Not destroyed. Overwhelmed: pushed to the margins of the weave, still present but no longer structurally significant, still technically existing but no longer contributing to the pattern in ways that anyone notices or values.
The framework knitters of Nottingham were threads in a meshwork. Their lines of movement — the skilled, practiced, daily engagement with thread and frame and the economics of their trade — had been structural threads for generations. The power loom was a new, thick, fast-moving thread that entered the meshwork and overwhelmed the older threads. The knitters did not disappear. Their skills did not vanish from the earth. But the meshwork reorganized around the dominant thread, and the older threads became marginal — still technically present, no longer structurally important.
This is the risk that the meshwork model identifies and the network model obscures. The network model asks whether the new node enhances or threatens the existing nodes. The meshwork model asks whether the new thread enriches or impoverishes the weave. The answers may diverge. A thread that enhances the network's total processing power may simultaneously impoverish the weave by overwhelming the slower, subtler threads that gave it texture and depth. The network gets more powerful. The meshwork gets thinner. And because we live in the meshwork — because the quality of human life is determined not by the total processing power of the network but by the richness and responsiveness of the tangle of relationships, skills, perceptions, and practices in which we dwell — the impoverishment of the weave is a loss that no amount of network enhancement can compensate for.
The question the meshwork poses to the age of AI is not *The Orange Pill*'s question — "Are you worth amplifying?" — because the meshwork does not amplify nodes. The meshwork question is: What kind of weave are we making? Which threads are thickening and which are thinning? What is the quality of the tangle — its density, its responsiveness, its capacity to sustain the life that depends on it? And are the threads we are allowing to dominate the weave the threads that produce the richest, most responsive, most life-sustaining pattern?
Or are they merely the fastest?
There is a difference between knowing where you are and knowing how to get there. The difference sounds trivial. It is not. It is the difference between two fundamentally distinct modes of being in the world, and the confusion between them is one of the most consequential errors of the technological age.
A person using GPS knows where she is. The blue dot on the screen tells her. She knows where she is going. The route line tells her. She knows how long it will take. The estimated arrival time tells her. What she does not know — what GPS structurally prevents her from knowing — is the landscape she is moving through. The rise of the hill to the west. The drainage pattern that explains why this road curves here. The smell of the tidal flat that would tell her she is near the coast even if the screen went dark. She is informed, precisely and continuously, about her position. She is ignorant, profoundly and increasingly, about her place.
Ingold built an entire theory of knowledge on this distinction. Wayfaring is his term for the mode of movement in which the traveler navigates by attending to the landscape — reading the slope, the wind, the light, the signs left by previous travelers — and discovers the path through the walking. The wayfarer does not consult a map stored in her head. She does not follow a route calculated in advance. She grows her path through the terrain the way a root grows through soil: by feeling her way, responding to what she encounters, adjusting in real time to the resistances and openings that the landscape offers. Her knowledge is not positional — it is not about where she is on a grid. It is inhabitative. It is about what it is like to be here, moving through this specific terrain, attending to these specific features, with this specific history of previous journeys available as a perceptual resource.
Transport is the other mode. The transported person is delivered from origin to destination without attending to the terrain between them. The knowledge produced by transport is knowledge of endpoints: where you started, where you arrived, how long it took. The knowledge produced by wayfaring is knowledge of the terrain itself: its textures, its rhythms, its dangers, its affordances — the whole felt quality of a landscape that can only be known by someone who has moved through it on foot, over time, with attention.
Every technology that calculates routes converts wayfarers into transported persons. This conversion is not metaphorical. It produces measurable changes in the knowledge that practitioners carry. London taxi drivers who trained for the Knowledge — the legendary requirement to memorize twenty-five thousand streets and thousands of points of interest through years of wayfaring on a moped — developed measurably enlarged posterior hippocampi, the part of the brain associated with spatial memory and navigation. Drivers who used GPS did not. The wayfaring deposited neural structure. The transport did not. The knowledge was not in the driver's conscious memory. It was in the architecture of her brain, built layer by layer through the friction of moving through a complex landscape with nothing but her own attention to guide her.
The parallel to software development is not approximate. It is precise. The programmer who builds a system by hand — writing the code, running into errors, debugging, rewriting, tracing the logic through layers of abstraction, discovering where the dependencies tangle and where the architecture holds — is a wayfarer in a computational landscape. Her knowledge of the system is not positional. She does not merely know that the system works. She knows how it works, in the felt, inhabitative sense: she has traveled through its internal terrain often enough to sense where it is solid and where it is fragile, where a change will propagate safely and where it will break something three layers down.
This knowledge cannot be transmitted. It cannot be summarized. It cannot be prompted out of an AI system, because it is not propositional knowledge at all. It is the accumulated residue of thousands of hours of wayfaring through a specific computational landscape — a residue that exists not as information stored in the programmer's memory but as perceptual capacities developed through practice. The capacity to sense that something is wrong before you can articulate what. The capacity to read a block of code and feel its fragility, the way a structural engineer walks into a building and feels the load paths without calculating them. These capacities are the programmer's equivalent of the enlarged hippocampus: neural architecture deposited by the friction of moving through complex terrain with nothing but attention to guide you.
Claude Code converts wayfaring into transport with an efficiency that no previous tool has approached. The programmer describes a destination — "build a feature that handles speaker detection" — and the machine calculates a route and delivers her there. The destination is reached. The feature works. But the terrain between the description and the working feature has been traversed by the machine, not by the programmer. She did not encounter the dependency tangles. She did not trace the logic through layers of abstraction. She did not discover where the architecture was fragile. She arrived without having traveled.
*The Orange Pill* documents this transformation with striking honesty. The engineer who spent four hours a day on "plumbing" — configuration files, dependency management, the mechanical connective tissue of a working system — lost not just the tedium when Claude took over, but the ten minutes of formative experience buried within it: moments when something unexpected happened, something that forced her to attend to a connection she had not previously understood. Those ten minutes were wayfaring. They were the brief stretches of terrain-reading embedded in hours of mechanical transit. And they were the moments that deposited the architectural intuition she later found eroding without understanding why.
The loss is invisible at the level of the individual task. Each task is completed faster, often better, with fewer errors. The transported developer produces working code more efficiently than the wayfaring developer. But efficiency is a measure of transport — of the speed and reliability with which one arrives at a predetermined destination. It is not a measure of the knowledge deposited by the journey. And the knowledge deposited by the journey is the knowledge that determines whether the developer, when faced with a novel problem — one for which no prompt is adequate, one that requires sensing the terrain rather than requesting a route — will have the perceptual resources to navigate.
Ingold observed this dynamic in a context far removed from software. Indigenous hunters in circumpolar regions do not navigate the landscape by consulting cognitive maps — internal representations of the terrain stored in memory and accessed when needed. They navigate by attending to the landscape itself: the angle of the sun, the direction of the wind, the quality of the snow, the behavior of animals, the subtle signs left by previous travelers. Their knowledge is not representational. It is perceptual. It exists not as a map in the head but as a capacity for reading the world in real time, a capacity developed through years of movement through the terrain under the guidance of more experienced wayfarers.
When younger hunters began using GPS, the elders expressed not technological anxiety but epistemological alarm. The concern was not that GPS was inaccurate. It was that GPS-using hunters would arrive at their destinations without developing the perceptual capacities — the capacity to read snow, wind, animal behavior, light — that constitute the knowledge of a wayfarer. They would be positioned correctly and perceptually impoverished. They would know their position without knowing their place.
The same epistemological alarm is audible, though rarely named as such, in the anxiety of senior software engineers watching junior colleagues build with AI. The anxiety is not primarily about job displacement, though that is real. It is about a subtler and more consequential loss: the junior developer who builds exclusively with AI will arrive at working systems without having developed the felt understanding of computational terrain that only wayfaring deposits. She will be productive. She will be efficient. She will ship code that works. And she will lack the capacity to sense when something is wrong in a system she did not build, because that capacity is the residue of wayfaring, and she has only ever been transported.
The deepest challenge Ingold's wayfaring concept poses to *The Orange Pill* is not about the loss of specific skills. It is about the nature of the knowledge that replaces them. The book argues that when implementation friction is removed, the friction ascends to the level of judgment, vision, and architectural decision-making. The senior engineer's value shifts from execution to direction: knowing what to build, not how to build it.
But Ingold's framework asks a question the ascending friction model cannot easily answer: where does the judgment come from? If judgment is the capacity to make wise decisions about what to build and how to direct the building, and if that capacity develops through the sustained experience of building itself — through the wayfaring that deposits layer after layer of terrain knowledge — then the judgment at the top of the tower depends on the climbing that the tower is designed to eliminate. The view from the higher floor is earned by the stairs, and the stairs are the very implementation work that AI removes.
The architect who has never lifted a beam may design structurally sound buildings. But the architect who has lifted beams — who knows in her body what a load feels like, how materials behave under stress, what the difference is between a joint that holds and one that will eventually fail — designs with a different quality of understanding. Her judgment is grounded in the terrain. It is the judgment of a wayfarer, not a navigator.
When Ingold states that "there can be no intelligence which is not grounded in the perception and action of living beings, moving around in and perceiving their environments as they go," he is making a claim about the nature of judgment itself. Judgment that is not grounded in perceptual experience of the domain — judgment that floats above the terrain, directing from a height that has never been earned by climbing — is judgment of a particular and impoverished kind. It may be correct in its conclusions. It may produce working products. But it lacks the ground-level sensitivity to the specific, the local, the textured, the resistant that wayfaring develops and that the most sophisticated forms of professional practice require.
The wayfarer knows something the navigator does not: what the terrain feels like from the inside. She knows which routes are dangerous not because she has been told but because she has felt the ground give way under her feet. She knows where to look for water not because a database told her but because years of attending to the landscape have taught her to read the signs — the vegetation patterns, the slope of the land, the quality of the soil — that indicate water's presence. This knowledge cannot be outsourced because it is not information. It is a way of being in the terrain, cultivated through the practice of moving through it.
The age of AI is producing a generation of navigators: people who arrive at correct destinations with extraordinary efficiency, who produce working systems and competent analyses and serviceable designs, and who have never been lost. Never encountered the unexpected. Never had to read the terrain with nothing but their own perceptual capacities to guide them. Never developed the felt understanding that only comes from having been, many times, in territory where no calculated route could help.
The question is whether a civilization of navigators can produce the kind of knowledge that only wayfarers carry — the ground-level, perceptual, embodied understanding of how things work and why they break and what to do when the map runs out. Every previous civilization has depended on this knowledge, even as it has systematically undervalued the people who carry it. The hunters, the craftspeople, the farmers, the mechanics, the nurses, the teachers — the people whose expertise lives not in their credentials but in their hands and eyes and cultivated attention — have always been the substrate on which the more visible achievements of culture rested.
AI does not threaten to eliminate this substrate. It threatens something more insidious: to make it appear unnecessary, so that the slow, expensive, friction-rich process of developing wayfaring knowledge is abandoned not because it has been replaced but because the products it used to generate can now be obtained by faster means. The products will be obtained. The knowledge will not be deposited. And the civilization will discover, at some future point of crisis — when the GPS fails, when the model hallucinates, when the system encounters a situation that no training data anticipated — that it has optimized away the very capacity it needs most: the ability to find the way when there is no route to follow.
Martin Heidegger, in a 1951 lecture delivered to an audience of architects and engineers in postwar Germany, made an argument that has been reverberating through philosophy, architecture, and anthropology ever since. The argument is deceptively simple: we do not build in order to dwell. We dwell, and out of dwelling, we build.
The distinction sounds like wordplay. It is not. It is a reversal of the entire modern understanding of the relationship between human beings and the structures they create. The modern understanding says: first you build the house, then you live in it. First the structure, then the inhabitation. Building is the means; dwelling is the end. Heidegger argued that this sequence is backward. Dwelling — the ongoing, attentive, caring engagement of a mortal being with the world — is the fundamental condition. Building is one of the ways dwelling expresses itself. The farmer who cultivates a field is not building in order to dwell. She is dwelling — attending to the earth, caring for the crops, maintaining her relationship with the land — and the cultivation is an expression of that dwelling. The vintner who tends the vine, the shepherd who tends the flock, the builder who tends the house — all are dwelling first, building second.
Ingold absorbed this insight and radicalized it. For Heidegger, dwelling was a philosophical category — a mode of being that could be described in the language of phenomenology. For Ingold, it became an ethnographic observable — something that could be seen, documented, and analyzed in the practices of actual human beings engaging with actual materials in actual places. The potter dwells in the making of pots. The boat builder dwells in the workshop. The weaver dwells in the rhythm of the shuttle. In each case, dwelling means something specific: it means being present to the work not as a means to an end but as a practice in itself, a way of engaging with the world that is valued not for what it produces but for what it is.
*The Orange Pill* is a book about building. The word saturates the text. Build products. Build companies. Build dams. Build structures that redirect the flow of intelligence toward life. The author identifies as a builder: "I am a builder who sits with the things he makes." The imperative at the book's conclusion is: "It's time to get back to building." The entire moral architecture of the argument — the beaver as the hero, the Believer and the Swimmer as cautionary figures, the ascending friction that rewards those who build at the right layer — rests on the premise that building is the highest human response to the challenge of AI.
Ingold's framework, channeled through Heidegger, asks a question that the building-centric discourse does not pose to itself: What is the quality of the builder's relationship to the building?
A builder who dwells in her work — who cares for the code she writes, tends the system she maintains, preserves the quality of the product through ongoing, attentive engagement — is in one kind of relationship to her creation. She knows it intimately. She has invested not merely labor but attention, the sustained, caring, particular kind of attention that Heidegger called *Sorge* and that Ingold describes as the fundamental posture of the maker who follows materials. She has been present to the work in the way that a gardener is present to a garden: not as an engineer managing outputs but as a dweller tending something that is alive, that changes, that responds to care and suffers from neglect.
A builder who directs AI toward predetermined outputs is in a different kind of relationship. She may care about the product. She may be deeply invested in its success. She may bring extraordinary judgment to the question of what should be built and for whom. But the mode of engagement is different. She is not tending. She is commissioning. She is not dwelling in the process of making. She is managing the process of production.
The distinction is not a hierarchy. Directing is not worse than tending. But it is different, and the difference has consequences for the kind of knowledge the builder develops, the kind of relationship she establishes with her work, and the kind of satisfaction the work provides.
Consider the beaver — the central metaphor of *The Orange Pill*'s practical philosophy. The beaver builds dams. But the dam is not the point. The point is what the dam creates: a pool, a habitat, a wetland that supports hundreds of species. And the beaver's relationship to the dam is not the relationship of a manufacturer to a product. It is the relationship of a dweller to a dwelling place. The beaver does not build the dam and walk away. She returns every day. She repairs what the current has loosened. She adds new sticks where the structure has weakened. She packs fresh mud into the gaps. The dam is not a project with a completion date. It is a practice — an ongoing expression of the beaver's dwelling in a particular place, her caring attention to the structure that makes that place habitable.
This is dwelling. And it is precisely the quality of engagement that AI's acceleration of production threatens to erode.
The threat is not that builders will stop caring about their products. Many will care deeply. The threat is subtler: that the mode of caring will shift from dwelling to managing. From the intimate, ongoing, hands-on tending of a specific practice to the efficient, arms-length direction of a production process. The shift happens not because builders choose it but because the tool makes it structurally inevitable. When the production cycle shrinks from months to hours, there is no time to dwell. The product arrives before the builder has established the kind of relationship with it that dwelling requires — the slow, repetitive, friction-rich familiarity that comes from having worked with the same material, the same system, the same codebase over months and years.
The Berkeley study of AI in the workplace documented this shift without naming it. Workers using AI expanded their scope, took on more tasks, moved faster. But the researchers also found that work "seeped into pauses" — that the brief gaps of unstructured time that had previously provided cognitive rest were colonized by AI-assisted productivity. The seepage is the opposite of dwelling. Dwelling requires unstructured time — time in which the builder is present to the work without an immediate productive goal, time in which the relationship between builder and artifact deepens through the kind of unhurried, attentive engagement that cannot be optimized or accelerated.
In the idiom of ecology, the pauses were habitat. They were the unmowed margins of the workday, the patches of unproductive ground that supported the micro-organisms of creative thought: the idle association, the half-formed question, the background processing that the brain performs when the conscious mind is at rest. When the pauses were colonized — when every gap was filled with "just one more prompt" — the habitat was destroyed. Not by malice. By opportunity. The AI provided something productive to do in every available moment, and the workers, operating under the internalized imperative to achieve, filled the gaps.
Heidegger's insight was that dwelling cannot be achieved by building more efficiently. You do not dwell by producing more products or shipping more features or completing more tasks. You dwell by being present — by attending to the world with the kind of caring, unhurried, sustained attention that allows a relationship to develop between you and the thing you are tending. And this kind of attention is precisely what the acceleration of production makes harder, not because the tools are hostile to attention but because the tools make inattention so productive that attention feels like a luxury.
*The Orange Pill* describes a moment that illustrates the tension with painful clarity. The author, writing the book on a transatlantic flight, catches himself: "I was not writing because the book demanded it. I was writing because I could not stop." The muscle that lets him imagine outrageous things had locked. The exhilaration had drained away. What remained was "the grinding compulsion of a person who has confused productivity with aliveness."
This is the opposite of dwelling. It is the opposite of the beaver tending her dam. It is production without presence — the manufacture of output by a person who is no longer in a caring relationship with the work but is instead being driven by it, whipped by the internalized imperative that converts every available moment into labor. The confession is honest, and its honesty reveals the structural tension at the heart of the book's argument: the same technology that enables the builder to build extraordinary things also enables the production-without-presence that dissolves the conditions for dwelling.
Ingold's ethnographic work on craftspeople reveals what dwelling looks like in practice and why it is worth fighting for. The boat builder in his workshop is not optimizing his productivity. He is present to the wood. He is in correspondence with the grain, the moisture, the specific qualities of this particular plank on this particular day. He will spend time that a production-focused manager would call wasted — time running his hands along the surface, time holding the plank at different angles to the light, time simply being with the material before deciding what to do with it. This time is not wasted. It is the practice of dwelling — the unhurried, attentive engagement through which the builder's relationship with the material deepens and the knowledge that only correspondence produces is deposited.
When the boat builder's grandson uses AI-assisted design software to produce boat plans, he may produce excellent plans. The software can calculate hull shapes, optimize for stability and speed, account for the properties of different materials with a precision that no human eye or hand can match. But the grandson will not know what the wood feels like. He will not have developed the sensitivity to grain that his grandfather cultivated through decades of hands-on engagement. He will not dwell in the workshop the way his grandfather did — not because he lacks the desire but because the production process no longer requires him to spend unhurried time in the presence of the material.
The loss is real but almost impossible to quantify, because dwelling does not produce measurable outputs. It produces the conditions in which outputs of a certain quality become possible — the deep familiarity, the intuitive sensitivity, the caring attention that distinguish work made by a dweller from work made by a producer. The distinction is visible in the finished artifact, but only to those who have enough experience of the domain to see it. A loaf made by a baker who dwells in her practice — who has spent years attending to the behavior of dough, the temperature of the oven, the quality of the flour — is different from a loaf made by following a recipe with precision. The recipe may be identical. The bread is not. And the difference is the difference between production and dwelling.
The imperative at the close of *The Orange Pill* — "It's time to get back to building" — deserves a Heideggerian amendment. Building is necessary. The dams must be constructed. The structures that redirect the flow of intelligence toward life must be raised and maintained. But the question that dwelling raises is prior: What is your relationship to the building? Are you dwelling in it — present, caring, attentive, tending it as a practice rather than completing it as a project? Or are you producing through it — directing outputs, managing processes, shipping results without the slow, friction-rich, intimate engagement that transforms a builder into a dweller?
The beaver does not merely build. She dwells. And the difference between a dam that holds and a dam that washes away is the difference between a structure built by a dweller and a structure assembled by a producer. The dweller returns. The dweller repairs. The dweller tends. The producer ships and moves on.
In the age of AI, the temptation to produce without dwelling is almost irresistible. The tools are so fast, the outputs so competent, the next opportunity so immediately available, that the slow work of dwelling — of being present to the work long enough for the work to teach you something — feels like an indulgence. It is not. It is the foundation on which every durable thing is built.
A thread is not a line. A thread is a material — flexible, tensile, possessing properties of its own — that becomes a line only when drawn through the specific gestures of a maker's hands. The distinction matters because it locates the origin of the line not in abstract geometry but in the practical, bodily engagement of a human being with a physical medium. The line of the thread is not imposed on the material. It emerges from the interaction of the maker's movements with the thread's properties — its elasticity, its roughness, its tendency to twist.
Weaving takes this interaction and compounds it. The weaver does not work with a single thread but with many, interlacing them according to patterns that emerge from the interaction of warp and weft, tension and slack, the properties of the fiber and the rhythm of the shuttle. The pattern is not a design imposed from outside. It is a property of the weaving itself — of the specific way these particular threads, handled by these particular hands, in this particular rhythm, interlock and resist and accommodate one another. Change the fiber, and the pattern changes. Change the rhythm, and the pattern changes. Change the tension on the warp, and the pattern changes. The weaver is not executing a plan. She is following the fabric.
Ingold has used weaving as a model for all skilled practice because weaving reveals, with unusual clarity, the essential character of what he calls textility — the quality of skilled making that arises from the interlacing of movements, materials, and attention into a coherent whole. Textility is not texture, though the words share a root. Texture is a surface property — the feel of a finished fabric. Textility is a process property — the quality of the making that produces the texture. A piece of handwoven cloth has a textility that machine-woven cloth lacks, not because the surface is different (it may be indistinguishable) but because the process that produced it was different: more responsive, more improvisatory, more dependent on the moment-to-moment judgments of a maker who was in correspondence with the material.
The concept illuminates a loss that *The Orange Pill* describes with precision and mourns without naming. The book's chapter on the Luddites is the most textility-relevant passage in the entire work. The framework knitters of Nottinghamshire and the hand-loom weavers of Yorkshire had spent years, sometimes decades, developing a particular form of expertise. Their skill was not merely technical. It was an entire way of being with material — an embodied relationship to thread and loom that constituted their identity as much as their livelihood.
Ingold's framework allows a more precise identification of what was destroyed. What the power loom eliminated was not a job or even a skill in the conventional sense. It was a textility — the specific quality of making that arose from the interlacing of the weaver's movements with the thread's properties, the loom's mechanics, and the emerging fabric's demands. The weaver was not executing a stored design. She was following the cloth — responding to the tension of the warp, adjusting the weight of the shuttle, sensing through her hands the developing quality of the weave and making continuous micro-adjustments that kept the whole process in balance.
These micro-adjustments were not visible in the finished product. A bolt of cloth woven by a master and a bolt woven by an advanced apprentice might be indistinguishable to a customer. But the weaver knew the difference, and the difference was the quality of the correspondence — the depth and responsiveness of the mutual engagement between maker and material. The master's cloth was made with a textility that the apprentice's lacked: a richer, more nuanced dialogue between hands and thread, a greater sensitivity to the developing demands of the fabric, a more refined capacity for the continuous micro-adjustments that constitute skilled practice.
The power loom did not produce cloth without textility. It produced cloth with a different textility — one in which the machine's rhythms replaced the weaver's rhythms, the machine's adjustments replaced the weaver's adjustments, and the correspondence between maker and material was mediated by a mechanism that absorbed most of the variation that the weaver's hands would have responded to. The cloth was consistent. The cloth was efficient. The cloth was, by many measures, superior. But the textility was impoverished — not in the surface quality of the product but in the depth of the engagement between the making and the maker.
This is not nostalgia. Ingold is clear that he is not arguing for a return to the hand loom any more than he is arguing for a return to stone tools. The argument is analytic, not prescriptive. When a mode of making that possesses a particular textility is replaced by a mode that possesses a different and thinner textility, something real disappears from the world — something that existed in the correspondence between maker and material and that no amount of product-quality improvement can replace. The loss is not in the product. It is in the practice. And the practice was the site of a particular kind of human knowledge, engagement, and satisfaction.
The AI moment is doing to knowledge work what the power loom did to textile work. This claim requires careful calibration, because the analogy can be pushed too far. Knowledge work is not manual labor. Code is not cloth. The programmer's correspondence with a codebase is not identical to the weaver's correspondence with a loom. The media are different, the capacities engaged are different, the products are different.
But the structural transformation is the same. In both cases, a mode of making that possessed a particular textility — a particular quality of interlacing between the maker's attention, the medium's demands, and the emerging form's feedback — is being replaced by a mode of production that generates comparable or superior products with a thinner textility. The programmer who builds by hand, writing code, encountering errors, debugging, tracing logic, struggling with syntax, is engaged in a practice that has its own textility: the interlacing of her cognitive movements with the code's behavior, her attention with the system's feedback, her evolving understanding with the artifact's emerging form. The programmer who directs Claude toward a working system may arrive at a comparable product, but the textility of the practice — the quality of the interlacing, the depth of the correspondence, the richness of the engagement between maker and medium — is diminished.
Ingold would note that textility is fractal: it operates at every scale of the practice, from the micro-level of individual keystrokes to the macro-level of architectural decisions. The programmer who types code is in tactile correspondence with the keyboard — a minimal correspondence, but a real one, a physical engagement with a physical interface that anchors the cognitive work in a bodily practice. The programmer who reads error messages is in visual correspondence with the system's feedback — a more substantive correspondence in which the maker attends to what the material is doing and adjusts her approach accordingly. The programmer who debugs is in the deepest correspondence of all — a sustained, frustrating, illuminating dialogue with a system that is not doing what she expected, a dialogue in which her understanding of the system is tested, refined, and deepened by the friction of the encounter.
Each of these levels of correspondence contributes to the overall textility of the practice. Remove one, and the textility thins. Remove several, and it thins dramatically. Remove all of them — replace the typing, the error-reading, the debugging with a single act of natural-language description followed by machine production — and what remains is not textility at all. It is direction. The maker has become a director, and the making has become a production. The product may be excellent. The practice has lost its weave.
There is an additional dimension to textility that bears directly on the question of professional identity — the question that haunts the senior engineers and experienced practitioners who feel, as *The Orange Pill* documents, that "something beautiful was being lost." The beauty they sense is not aesthetic in the conventional sense. It is the beauty of textility — the specific satisfaction that comes from a practice in which your movements, your attention, and the material's responses are interlaced into a coherent whole. The satisfaction of the weaver is not in the finished cloth. It is in the weaving — in the rhythmic, responsive, continuously adjusted engagement with the medium that constitutes the practice itself.
When *The Orange Pill*'s senior architect describes himself as "a master calligrapher watching the printing press arrive," he is describing the loss of textility. The calligrapher's art is not merely in the finished letter. It is in the stroke — the pressure, the angle, the speed, the relationship between the brush and the paper and the ink's viscosity and the maker's breath. The printing press produces better letters, more consistently, more efficiently. But it does not produce the stroke. It does not produce the textility. And for the calligrapher, the stroke was the practice.
The knowledge workers whose practice is being restructured by AI are not calligraphers. Their textility is cognitive rather than manual, symbolic rather than material. But the structural loss is analogous. The interlacing of their movements with their medium — their attention with the code's feedback, their judgment with the system's behavior, their evolving understanding with the artifact's emerging demands — was the substance of their practice. The product was the output. The textility was the practice. And the practice was where the meaning lived.
When the textility of a practice is thinned, the practitioner does not merely lose a mode of working. She loses a mode of being. The weaver who no longer weaves is not merely unemployed. She is displaced from the practice in which her identity, her knowledge, and her relationship to the world were constituted. The programmer who no longer debugs is not merely more efficient. She has been displaced from the correspondence in which her deepest professional knowledge was developed and maintained.
The question for the age of AI is not whether the products will be better. They probably will be. The question is whether the practices that produce them will retain enough textility — enough interlacing of human attention with material response, enough correspondence between maker and medium, enough of the rhythmic, responsive, continuously adjusted engagement that constitutes skilled work — to sustain the knowledge, the identity, and the satisfaction that have always depended on the richness of the weave.
The most revealing metaphor in *The Orange Pill* is not the river. It is the fishbowl.
A fishbowl, in Segal's usage, is the set of assumptions so familiar that the inhabitant has stopped noticing them — "the water you breathe, the glass that shapes what you see." Everyone swims in one. The scientist's fishbowl is shaped by empiricism. The filmmaker's by narrative. The builder's by the question "Can this be made?" Each fishbowl reveals part of the world and hides the rest. The effort that defines the best thinking, the book argues, is the effort to press your face against the glass and see the world beyond the water's refractions.
The metaphor is powerful. It captures something real about the way professional and intellectual formations constrain perception — the way a person who has spent decades inside a particular way of seeing the world comes to mistake that way of seeing for the world itself. The fishbowl is invisible because it is total. The water is unnoticed because it is everywhere.
But viewed through Ingold's anthropology of dwelling, the fishbowl reveals something that the metaphor, as deployed, does not intend. A fishbowl is not just a limitation. It is a dwelling place. And dwelling places, as Heidegger and Ingold understand them, are not merely constraints on perception. They are the conditions of perception — the structured environments within which seeing, knowing, and making become possible.
A dwelling place is not a prison. It is a home. And the difference between a prison and a home is not in the walls — both have walls — but in the inhabitant's relationship to those walls. The prisoner is confined by walls she did not choose and cannot modify. The dweller inhabits walls she has built, maintained, and shaped to suit her practice — walls that protect as much as they constrain, that create an interior in which certain kinds of work become possible precisely because the exterior has been held at bay.
The fishbowl metaphor, as commonly deployed, treats the glass as a limitation to be transcended. The best thinking presses against the glass. The orange pill cracks the fishbowl. The aspiration is to see beyond the water, to escape the distortion, to apprehend the world as it really is rather than as the fishbowl presents it. The gesture is heroic and outward-directed: break the glass, see the truth, build on the new ground.
Ingold's dwelling perspective suggests a different reading. The fishbowl is not merely a limitation. It is the accumulated result of years of dwelling — years of engaged practice within a particular domain, during which the dweller has constructed, layer by layer, the perceptual and conceptual structures that allow her to see what she sees. The scientist's fishbowl is shaped by empiricism not because empiricism is a bias but because empiricism is a hard-won perceptual achievement — a way of attending to the world that took centuries to develop and that produces a specific, valuable, irreplaceable form of knowledge. The filmmaker's fishbowl is shaped by narrative not because narrative distorts reality but because narrative is a mode of attending to the temporal structure of experience that reveals patterns invisible to other modes of attention.
The fishbowl is not the thing that prevents you from seeing. It is the thing that allows you to see — at the cost of determining what you see and what you miss. Every fishbowl is simultaneously a revelation and a concealment. The same structures that enable perception also constrain it. This is not a defect to be corrected. It is the fundamental condition of all knowing.
The orange pill moment — the recognition that the fishbowl is a fishbowl — is, in Ingold's terms, a moment of defamiliarization: the sudden, disorienting experience of seeing one's dwelling place as a construction rather than a given. The water you have been breathing is revealed as a particular kind of water. The glass that has shaped your vision is revealed as a particular kind of glass. The taken-for-granted quality of your world — the quality that dwelling requires and produces — is momentarily suspended, and you see, with a clarity that is both exhilarating and vertiginous, that you have been living inside a structure all along.
This is a valuable experience. Ingold does not deny this. Defamiliarization — the capacity to see the familiar as strange — is one of anthropology's foundational methods and one of philosophy's oldest aspirations. The capacity to step outside one's dwelling place, even briefly, and see it as a construction rather than a natural environment is a capacity that enriches thought, expands sympathy, and prevents the dogmatism that comes from mistaking one's own fishbowl for the ocean.
But Ingold adds a crucial qualification that the glass-breaking metaphor obscures. You cannot dwell outside a fishbowl. There is no view from nowhere. The person who breaks the glass does not ascend to an unmediated perception of reality. She finds herself in a new fishbowl — a new set of assumptions, a new perceptual structure, a new dwelling place that will, in time, become as invisible and as total as the one she left. The orange pill does not liberate you from the fishbowl. It relocates you to a different fishbowl, one whose assumptions are fresh enough to be visible but will, with familiarity, become invisible in their turn.
The critical question, then, is not how to escape the fishbowl — an impossibility — but how to construct a fishbowl worth dwelling in. What kind of perceptual structure do you want to inhabit? What do you want your dwelling place to reveal, and what are you willing to accept that it conceals? What are the walls made of, and do they protect the right things?
This reframing changes the meaning of the AI transformation. The book presents the orange pill as a crack in the glass — a moment when assumptions that had governed careers for decades were revealed as contingent and breakable. The vertigo is the vertigo of defamiliarization, and the imperative is to build new structures on the newly revealed ground.
Ingold's framework suggests that the building of new structures is the construction of a new fishbowl — a new dwelling place with its own assumptions, its own constraints, its own revelations and concealments. The question is whether the new fishbowl is constructed with the care and attention that dwelling requires, or whether it is assembled hastily, under the pressure of the moment, without the slow, deliberate, friction-rich process of dwelling that produces a habitable structure.
Consider the three friends on the Princeton campus. Each inhabits a fishbowl shaped by decades of dwelling in a particular practice. The neuroscientist's fishbowl is shaped by years of attending to brain scans, reading papers, conducting experiments — a dwelling place constructed through the sustained practice of scientific inquiry. The filmmaker's is shaped by years of seeing the world in sequences, of attending to cuts and transitions, of the practice of constructing narrative meaning. The builder's is shaped by decades of making things and watching what happens after — the practice of bringing artifacts into the world and living with their consequences.
When their fishbowls collide — when the three friends argue on the path — the collision is productive precisely because each fishbowl is the product of genuine dwelling. The neuroscientist's objections carry weight because they come from decades of practice within a specific discipline. The filmmaker's reframings illuminate because they come from a cultivated capacity for seeing connections between images that others miss. The builder's intuitions are worth attending to because they are grounded in the specific, practical, hard-won knowledge of what it takes to make something that works in the world.
If any of the three had constructed his fishbowl hastily — if the neuroscientist had spent a semester in a lab, or the filmmaker had watched a few films, or the builder had shipped one product — the collision would not produce insight. It would produce noise. The value of the collision depends on the depth of the dwelling that produced each fishbowl. The fishbowl has to be dense enough, structured enough, the product of enough sustained engagement with a specific practice, to have something real to offer when it meets another.
This is why Ingold's dwelling perspective produces a different anxiety about AI than the defamiliarization perspective the book primarily adopts. The defamiliarization anxiety is: Will we be able to see past our old assumptions quickly enough to navigate the new world? Can we crack the glass before the ground shifts under us?
The dwelling anxiety is deeper: Will the new fishbowls we construct be worth inhabiting? Will they be built through the slow, attentive, friction-rich process of genuine practice — the kind of dwelling that produces perceptual structures dense enough to reveal the world in all its complexity? Or will they be assembled hastily from the abundant, smooth, friction-free materials that AI provides — structures that look like fishbowls from the outside but that lack the density, the texture, the layered quality that only sustained dwelling produces?
A fishbowl built by a dweller is a different thing from a fishbowl built by a producer. The dweller's fishbowl is the accumulated residue of years of attention — attention to the material, to the practice, to the specific demands of a particular domain. Its glass is thick with the deposits of experience. It refracts the light in specific ways because it has been shaped by specific encounters with the world. It reveals what the dweller's practice has taught her to see, and it conceals what her practice has not reached. But what it reveals, it reveals with depth and precision, because the revelations come from genuine engagement with the world rather than from abstract representation.
A fishbowl built hastily — assembled from prompts and outputs, constructed in hours rather than years, lacking the layered deposits of sustained practice — may look identical from the outside. The glass may be clear. The water may be comfortable. The inhabitant may see a great deal. But what she sees will lack the depth that only dwelling produces — the felt, intimate, ground-level knowledge of a domain that comes from having moved through it as a wayfarer rather than having been transported across it.
The age of AI is a moment of collective fishbowl reconstruction. Old dwelling places are cracking. New ones must be built. The question Ingold's framework raises is not whether to build them — that is inevitable — but whether to build them with the care, the patience, the sustained attention that dwelling requires, or to assemble them with the speed and efficiency that the tools make possible and the moment seems to demand.
The quality of the fishbowl determines the quality of the seeing. And the quality of the seeing determines whether the builder, when she looks out at the world through her new glass, perceives it with the depth and richness that the moment demands — or perceives only the smooth, frictionless, rapidly produced image of a world she has never truly dwelt in.
The hand that holds a stylus makes contact with a surface. The surface resists. The stylus moves across it, leaving a mark whose character — its depth, its width, its curvature — is determined not by the hand alone but by the interaction of three things: the pressure the hand exerts, the angle at which the stylus meets the surface, and the properties of the surface itself. Smooth paper produces a different mark than rough paper. Wet clay produces a different mark than dry clay. The hand that has spent years making marks on a particular surface knows things about that surface that no description could convey — knows them in the fingertips, in the wrist, in the calibrated pressure of a gesture so habitual it has become invisible to the person performing it.
This is material engagement. It is the mode of being in the world in which the maker is in direct, sensory, bodily contact with the stuff of her practice. It is, in Ingold's framework, the ground condition of all making — the irreducible minimum of correspondence between a living being and the physical world from which all more complex forms of making grow.
In March 2025, Ingold delivered a lecture at Penn State titled "Digitization and Fingerwork." The lecture examined a specific migration: the movement of skilled work from the hand to the fingertips. For millennia, skilled manual operations — knotting, weaving, breadmaking, milking, embroidery, handwriting — depended on the full hand, the whole architecture of grip and grasp and pressure and release that makes the human hand the most sophisticated manipulative instrument in the biological world. The hand engaged with materials. The hand followed materials. The hand knew materials in a way that the conscious mind could not articulate, because the knowledge was distributed across the sensorimotor system rather than concentrated in the representational mind.
Then the hand migrated to the fingertip. The keyboard, the touchscreen, the trackpad — each reduced the full repertoire of manual skill to a narrow subset: tapping, swiping, clicking. The fingertip makes contact with a surface, but the surface is inert glass. The glass does not resist. It does not have grain or moisture or tension. It does not push back in ways that require the maker to adjust. The fingertip mediates the transmission of information in a virtual world, but it has, as Ingold stated, "no purchase in the real world of forces and materials."
This observation is not technophobia. It is a precise description of a change in the human relationship to the material world — a change whose consequences are still being calculated. When the full hand engaged with materials, the body's sensorimotor system was exercised across its entire range. Proprioception — the sense of where your body is in space and how much force your muscles are exerting — was constantly active. Tactile discrimination — the ability to distinguish between surfaces, textures, resistances — was continuously refined through practice. The body was a participant in the making, not merely a vehicle for the mind's instructions.
When the hand contracts to the fingertip on glass, these capacities atrophy. Not metaphorically. Measurably. Studies of manual dexterity among populations that have shifted from manual to digital work report declining grip strength, reduced tactile sensitivity, and diminished proprioceptive accuracy. The body that spends its days tapping glass is a body whose sensorimotor system is being narrowed to a single, repetitive, low-variability movement. The hand that once knew the grain of wood, the plasticity of clay, the tension of thread, now knows only the smooth, uniform, unresisting surface of a screen.
AI completes this migration. The prompt-execute cycle does not even require the fingertip to make contact with a material medium. It requires only that the user compose language — natural language, the most abstract of human symbolic systems — and transmit it to a machine that produces the artifact. The body is almost entirely absent from the process. The hands rest on a keyboard, but the keyboard is not a material medium in any meaningful sense; it is a transmission device, a way of converting thought into text for the machine's consumption. The maker who prompts Claude is not in contact with the material of her practice. She is in contact with language about the material of her practice, which is a fundamentally different relationship.
Consider the distance that has opened between the maker and the made. The boat builder's hand on the plank of larch: zero distance. The weaver's hand on the shuttle: zero distance. The programmer's fingers on the keyboard, typing code that she can see on the screen and run and test and debug: a small distance, mediated by the keyboard and the screen, but still a distance within which real-time feedback flows from the artifact to the maker. The coder sees the error message. She sees the unexpected output. She traces the logic and finds the bug. The feedback is not tactile, but it is immediate, specific, and rich enough to support a genuine correspondence between maker and artifact.
The AI user's language directed at a machine that produces the artifact: the distance is now categorical. The feedback is not the artifact's feedback. It is the machine's output. The user does not see the error messages, the dependency tangles, the unexpected behaviors that occur between the prompt and the product. She sees only the product. The entire landscape between her intention and its realization has been traversed by the machine, and the feedback that would have arrived at each step of a hands-on process — the friction that would have forced adjustment, learning, the deepening of understanding — has been absorbed by the machine and is invisible to the user.
Ingold is not arguing for a return to handcraft. He is making a more pointed and more relevant claim: that abstraction without material engagement produces a particular kind of knowledge, and that this kind of knowledge has specific, identifiable limitations that matter for the quality of what is made and for the capacity of the maker to know what she has made.
Knowledge produced through material engagement is grounded. It is anchored in specific, bodily encounters with specific materials in specific places and times. The boat builder's knowledge of larch is not abstract knowledge about wood in general. It is particular knowledge about this wood — its response to this humidity, its behavior under this kind of stress, its particular grain pattern and what it implies about the tree's growth history. This particularity is what makes the knowledge useful, because the problems that arise in practice are always particular: this plank, this joint, this specific conjunction of forces and materials that will never recur in exactly this form.
Knowledge produced through abstraction — through the manipulation of symbols, descriptions, representations — is connectable. It can be linked to other knowledge, combined into frameworks, applied across contexts. This is its power. The naval architect who works with mathematical models of hull dynamics can apply those models to any hull, in any water, under any conditions. The abstraction is what makes the knowledge portable.
But connectability without grounding produces a specific failure mode: the production of artifacts that are formally correct and materially wrong. The hull that is mathematically optimal and practically unseaworthy because the model did not account for a specific property of the specific water in the specific harbor where the boat will operate. The software that passes every test and fails in production because the test environment did not replicate the specific, particular, messy conditions of the real-world deployment. The legal brief that cites the right cases and makes the right arguments and misses the specific, particular, human dimension of the dispute that a lawyer who had sat with the client for hours would have caught.
These failures are not random. They are systematic. They are the predictable consequence of making from abstraction without the grounding that material engagement provides. And they are becoming more common as AI makes abstraction more productive and material engagement less necessary.
*The Orange Pill* describes this dynamic indirectly when it notes that "the prose had outrun the thinking" — that Claude produced a passage about the moral significance of democratization that was "eloquent, well-structured, hitting all the right notes" but that left the author unable to tell whether he actually believed it. The passage was formally correct. It was materially empty. The words were right and the thought was absent. The author caught this instance because he had enough experience to recognize the gap between articulate prose and genuine conviction. But the gap is structural, not incidental. It is what happens when the medium of making is pure language, when the resistance of the material — in this case, the resistance of genuine thought, which is slow and messy and does not arrive in well-structured paragraphs — has been smoothed away.
This smoothing reaches beyond any single domain. It is a civilizational shift in the relationship between human beings and the material world — a shift that has been underway for centuries but that AI accelerates to a pace at which its consequences become visible in years rather than generations. Each layer of abstraction that separates the maker from the material reduces the maker's sensory engagement with the world by one degree. The programmer is more abstract than the carpenter. The AI-assisted programmer is more abstract than the programmer. The AI-directed manager is more abstract than the AI-assisted programmer. Each step up the abstraction ladder produces knowledge that is more connectable and less grounded, more portable and less particular, more formally correct and more vulnerable to the specific, material, ground-level failures that only grounded knowledge can anticipate.
Ingold does not prescribe a remedy. He is a diagnostician, not a pharmacist. But his diagnosis implies a direction: that the makers who thrive in the age of AI will be the ones who maintain some form of material engagement alongside their abstract practice. Not as a hobby. Not as a weekend retreat from the screen. As a structural component of their professional practice — a deliberate, sustained, friction-rich engagement with the stuff of their domain that keeps their abstract knowledge grounded in the particular, the sensory, the resistant.
What this looks like will differ by domain. For the software architect, it might mean periodically writing code by hand — not for efficiency but for the grounding that the struggle provides. For the designer, it might mean building physical prototypes — not because they are cheaper than digital ones (they are not) but because the hands-on engagement with material reveals constraints and possibilities that the screen conceals. For the writer, it might mean what *The Orange Pill*'s author did when the prose outran the thinking: close the laptop, pick up a pen, and write by hand until the friction of the slower medium forces the thought to catch up with the words.
These practices will look inefficient. They will look, in the language of the smooth aesthetic, like resistance to progress. They are resistance — not to progress but to the specific kind of progress that abstracts the maker away from the material world entirely. They are the maintenance of a correspondence — between hand and world, between body and material, between maker and the resistant, particular, irreducibly physical stuff of practice — that no amount of abstract knowledge can replace and that the quality of the made thing ultimately depends upon.
The fingertips on the glass transmit information. The hands in the clay generate knowledge. The difference between transmission and generation is the difference between a civilization that produces artifacts and one that understands them.
The kite flies because of the string.
This image, which Ingold has used to illuminate the relationship between freedom and constraint in skilled practice, is the point of departure for the synthesis his framework demands. A kite without a string does not soar. It tumbles. The wind takes it in every direction, and because every direction is available, no direction is achieved. The string is the constraint that converts the wind's undifferentiated force into directed flight. Remove the string, and you do not get more freedom. You get less — the specific, purposeful freedom of flight replaced by the purposeless, chaotic freedom of tumbling.
The metaphor challenges the most fundamental assumption of the triumphalist narrative about AI: that the removal of friction is the removal of limitation, and that the removal of limitation is the expansion of creative capability. The assumption is intuitive. It is also wrong, in the precise way that the assumption "a kite without a string will fly higher" is wrong. The string is not holding the kite down. The string is what enables the kite to fly.
Ingold's entire body of work can be read as an extended meditation on what the string is made of and why it matters. The string is correspondence — the mutual responsiveness between maker and material. The string is enskilment — the slow, friction-rich cultivation of perceptual capacities through practice. The string is textility — the interlacing of movements, materials, and attention that constitutes the weave of skilled work. The string is dwelling — the sustained, caring, unhurried engagement with a specific practice that allows the practitioner to develop the kind of knowledge that cannot be transmitted, only grown.
Remove any of these strings, and the kite does not soar higher. It tumbles. The maker who is freed from material engagement does not create more freely. She creates without the constraint that gave her creation its direction and its depth. The developer freed from debugging does not write better code. She directs code production without the intimate familiarity with failure that gave her judgment its grounding. The writer freed from the struggle with language does not produce richer prose. She produces smoother prose — prose that is formally correct and materially weightless, that sounds like insight without bearing the marks of genuine struggle.
*The Orange Pill* recognizes this. Its concept of the dam — the structure that redirects the river's force toward life — is a string concept. The dams are constraints that enable rather than limit, structures that convert the undifferentiated force of AI capability into directed creative work. The beaver metaphor is, at its heart, a metaphor about the creative necessity of constraint: the pool behind the dam is valuable precisely because it is bounded, because the water's force has been redirected rather than allowed to flow unimpeded.
Ingold would endorse the dam but press further on its composition. A dam made of what? The dams that *The Orange Pill* advocates — structured pauses, sequenced work, protected mentoring time, the organizational and cultural structures that prevent AI from colonizing every available moment — are institutional dams. They are built from policies and norms, from the deliberate decisions of leaders and organizations to create spaces where the pressure of production is held at bay. These dams are necessary. They are also insufficient.
Ingold's framework suggests that the most important dams are not institutional but personal — not the structures that organizations build around their workers but the practices that individual makers maintain within their own working lives. The boat builder who runs his hand along the grain before deciding where to cut is practicing a personal dam: a deliberate, habitual engagement with material that slows the production process and enriches the maker's knowledge. The writer who puts down the laptop and picks up a pen is practicing a personal dam: a voluntary reintroduction of friction that forces the thinking to catch up with the production. The developer who periodically writes code by hand, tracing the logic through the system rather than prompting a solution, is practicing a personal dam: a maintenance of the wayfaring mode that keeps her terrain knowledge alive.
These personal dams are harder to build and harder to maintain than institutional ones, because they require the individual to resist the internal pressure of the achievement imperative — the voice that says every moment spent in friction is a moment wasted, every detour through the terrain is a detour away from the destination. The institutional dam can be mandated. The personal dam must be chosen, daily, against the grain of a culture and a toolset that rewards speed above all else.
The maker worthy of this moment is not, then, a figure defined by a single posture. She is not the Upstream Swimmer who refuses the tools, nor the Believer who accelerates without constraint, nor even the Beaver who builds institutional dams in the current. She is something more specific and more difficult: a maker who attends.
Attending is Ingold's most fundamental concept, though it does not carry the technical vocabulary of his more cited terms. To attend is to be present to the world in a way that is simultaneously receptive and active — to perceive what is happening without imposing a framework on it, while responding to what is perceived with the full repertoire of one's developed capacities. The hunter attends to the landscape. The potter attends to the clay. The weaver attends to the developing fabric. In each case, attending means something quite specific: not controlling but following, not commanding but responding, not producing but growing.
Attending is harder than directing. This is the counterintuitive truth that Ingold's framework reveals. Directing requires clarity of purpose, decisiveness, the ability to specify an outcome and drive toward it. These are valuable capacities. They are also, in the age of AI, the capacities that the tools most naturally support and amplify. Prompt with a clear intention, and the machine delivers. The clearer the intention, the better the delivery. Directing is what the tool was built for.
Attending requires the relinquishment of predetermined outcomes. It requires the willingness to be surprised — to enter the process without knowing what will emerge and to trust that the emergence, guided by the maker's cultivated capacities, will produce something that a predetermined outcome could not have anticipated. It requires patience, because emergence is slower than production. It requires tolerance for ambiguity, because the attending maker does not always know where she is going or whether what is emerging is any good. It requires, above all, the specific kind of presence that dwelling cultivates: the unhurried, caring, sustained engagement with a practice that allows the practitioner to perceive what the practice offers rather than demanding that the practice deliver what the practitioner wants.
The most creative moments in *The Orange Pill*'s own AI collaboration — the moments the author returns to with evident wonder — are moments of attending. The impasse that preceded the laparoscopic surgery insight was not solved by directing. It was solved by attending: the author described the impasse to Claude not as a prompt for a solution but as an expression of genuine uncertainty, and what emerged from the exchange was something neither party had anticipated. The tears at the beauty of an excavated idea were not the tears of a director pleased with a product. They were the tears of a maker who had attended to a process and been rewarded with an emergence he could not have commanded.
These moments demonstrate that attending within AI collaboration is possible. They also demonstrate that it is exceptional — that the medium's gravitational pull is toward directing rather than attending, toward production rather than growth, toward the prompt-execute cycle that is hylomorphism's purest expression rather than the open-ended, responsive, emergence-welcoming mode that Ingold's making demands.
The discipline of the age, then, is the discipline of continuing to attend in a medium that rewards directing. Of following materials — even when the materials are ideas and the medium is language — rather than commanding results. Of dwelling in the process rather than managing the production. Of maintaining the string that gives the kite its flight, even when the culture insists that strings are limitations and the removal of limitations is freedom.
Ingold, in a 2019 interview, declared: "I suppose you are referring here to the popular debates and hyperbolic speculation currently surrounding the idea of artificial intelligence. This is a topic I prefer to avoid." The avoidance is itself a form of attending — attending to the practices (gardening, handwriting, analog music) that his framework identifies as sites of genuine knowledge, rather than attending to the spectacle of a technology whose fundamental assumptions he rejects.
But the avoidance is also a limitation. Ingold's framework is most useful precisely where he is least willing to apply it — to the specific, lived, ground-level experience of makers who are working with AI right now, who are trying to maintain the quality of their correspondence in a medium that structurally thins it, who are trying to attend when the tools are designed for directing.
The framework does not provide a manual. It provides something more valuable: a vocabulary for what is at stake, and a standard against which the quality of making can be measured. The standard is not productivity. It is not output. It is not the speed with which the imagination-to-artifact ratio closes. The standard is the quality of the correspondence — the depth of the maker's engagement with her medium, the richness of the mutual responsiveness between the one who makes and the thing being made, the degree to which the process of making is a process of growth rather than a process of production.
By that standard, some AI-assisted work will measure well. The collaboration that produces genuine surprise, that excavates connections the maker could not have found alone, that enters the process without a predetermined outcome and emerges with something richer than either party could have predicted — this is correspondence. This is attending. This is making.
And by that standard, much AI-assisted work will measure poorly. The production of artifacts without engagement. The direction of outputs without dwelling. The accumulation of products by a maker who has never been in contact with the material of her practice, who has never followed anything, who has never been lost.
The distinction cannot be made from outside. A product made by a dweller and a product made by a director may be indistinguishable. Only the maker knows which mode she was in. Only the maker knows whether she attended or merely directed. And only the maker bears the consequences: the deepened capacity that attending cultivates, or the thinned capacity that directing, over time, produces.
Skill, Ingold writes, is "destined to carry on for as long as life does, along a line of resistance, forever undoing the closures and finalities that mechanisation" imposes. The line of resistance is not a refusal of the tools. It is the maintenance of attending — the ongoing, daily, never-completed practice of following materials, dwelling in the process, holding the string that makes the flight possible — in a world that measures only the height of the kite and never asks what keeps it in the air.
The grain of a thing. That phrase kept surfacing while I worked through Ingold's ideas, and I could not shake it.
Not the grain of wood — though that was where Ingold's framework started for me, with the boat builder running his hand along a plank of larch, reading with his fingertips what no description could convey. The phrase that haunted me was more general than that. The grain of a problem. The grain of an idea. The grain of a practice. The texture that reveals itself only to someone who has been in sustained, patient, friction-rich contact with the thing itself.
When I built Napster Station in thirty days with my team and with Claude, I felt — and I described in *The Orange Pill* — the exhilaration of the imagination-to-artifact ratio collapsing to the width of a conversation. I still feel that exhilaration. I do not retract it. The capability is real, and what it unlocks for people who previously had no path from idea to artifact is genuinely important.
But Ingold forced me to sit with a different question. Not whether the artifact was good — it was — but what happened to me in the making of it. Whether I was following the material or commanding the output. Whether I was wayfaring through the problem or being transported to the solution. Whether I was dwelling in the process or merely producing through it.
The honest answer is: both. Sometimes in the same hour. And Ingold's framework gave me the vocabulary to tell the difference — to recognize the moments when the collaboration was genuine correspondence, two participants attending to what emerged between them, and the moments when it was pure hylomorphism, me describing a form and the machine imposing it on compliant matter.
The moments of correspondence were the ones that produced the work I am proudest of. The moments of hylomorphism produced the work that shipped fastest.
There is a passage in *The Orange Pill* where I describe sitting at a coffee shop with a notebook, writing by hand, because the prose that Claude and I had produced together sounded better than it thought. That moment — the pen, the paper, the slowness, the friction of a medium that does not autocomplete — was what Ingold would call a return to material engagement. I did not have his vocabulary for it then. I have it now. What I was doing was reintroducing the string. The constraint that gives the kite its flight. The resistance that forces the thought to earn its expression rather than receiving it as a gift from a machine that does not care whether the thought is genuine.
Ingold declared, with characteristic bluntness, that the whole AI enterprise is built on a faulty notion of intelligence. I do not fully agree. But I am less confident in my disagreement than I was before I encountered his work. The question he poses — whether intelligence can exist apart from the perception and action of a living body moving through a material world — is not a question I can answer from inside the fishbowl of my own practice. It is a question that requires pressing my face against the glass and admitting I cannot see what is on the other side.
What I can see is this. The engineers in Trivandrum who made the leap from one discipline to another with Claude's help were genuinely expanded. Their capability is real. But Ingold's framework asks me to look at what was not expanded — the bodily, sensory, material engagement with the stuff of their practice that was, if anything, further contracted. They directed more. They attended less. They arrived at more destinations. They wayfared through less terrain.
Whether this trade is sustainable — whether a generation of navigators can produce the knowledge that only wayfarers carry — is the question I will carry forward from this book. I do not know the answer. Ingold, from his garden in Berlin, is certain he does. I am less certain, and my uncertainty is, I think, the more honest position for someone who is building inside the transformation rather than observing it from the outside.
But I am grateful for the vocabulary. For the distinction between production and growth. For the concept of correspondence, which names the thing I felt when the collaboration was at its best and the thing I missed when it was merely efficient. For the kite and the string — the image I now carry as a standard against which to measure my own practice.
Am I attending? Or am I merely directing?
The answer changes hour by hour. The question does not change at all.
What if the friction AI removes was never holding you back, but holding you up?
When AI collapses the distance between imagination and artifact to the length of a conversation, we celebrate the removal of friction as pure liberation. Tim Ingold, a social anthropologist who has spent four decades studying what actually happens between makers and their materials, sees something the celebration misses: the friction was never just an obstacle. It was the relationship. The potter's knowledge lives in her hands pressing against clay. The programmer's intuition lives in the hours of debugging that deposited it. Remove the resistance, and you do not free the maker. You dissolve the correspondence in which her deepest knowledge was formed.
This book applies Ingold's framework (correspondence, enskilment, wayfaring, the meshwork, the critique of hylomorphism) to the AI revolution with surgical precision. It asks not whether AI produces better artifacts, but what happens to the maker when the material drops out of the making. The answer reframes everything builders, leaders, and parents think they know about skill, knowledge, and what it means to create.
— Tim Ingold

A reading-companion catalog of the 40 Orange Pill Wiki entries linked from this book — the people, ideas, works, and events that *Tim Ingold — On AI* uses as stepping stones for thinking through the AI revolution.