By Edo Segal
The first thing I noticed was that I had stopped noticing.
Not in the dramatic way — not a collapse of attention or a moment of crisis. Something quieter. I was building with Claude Code at a pace that would have seemed impossible six months earlier, and the results were extraordinary, and I was proud of them, and somewhere in the middle of all that velocity I realized I could no longer feel the thing I was building.
I don't mean emotionally. I mean perceptually. The texture of the work had changed. When I used to write code, the errors had grain — each one told me something specific about the shape of the system I was constructing. The resistance of the syntax, the stubbornness of a dependency conflict, the particular way a null pointer exception announced itself — these were surfaces I had learned to read the way a sailor reads water. Not through analysis. Through years of contact.
Now the surfaces were smooth. The code arrived correct. The errors were resolved before I encountered them. The system worked. And I could not tell you, with the same bodily confidence I once had, what the system was made of.
That gap between producing something and perceiving something is what led me to J.J. Gibson. Gibson was a psychologist who spent his career on a single radical claim: that perception is not something the brain constructs from incomplete data. It is something the organism detects directly from the structure of its environment. The world is not ambiguous. The world is rich with information. The question is whether you are attending to it.
Gibson's concept of affordances — the idea that what an environment offers for action depends on the relationship between the environment and the specific organism encountering it — is the sharpest diagnostic tool I have found for understanding what the AI transition is actually doing to the people inside it. Not what it produces. What it develops. Or fails to develop.
The AI discourse argues endlessly about output: faster, cheaper, more, less. Gibson's lens ignores output entirely and asks a different question. What does this environment offer for perception? What textures does it present for the organism to learn from? What happens when the surfaces smooth and the friction that educated your attention disappears — not because someone took it away, but because a better tool made it unnecessary?
This book applies that lens with rigor and discomfort. It will not tell you whether AI is good or bad. It will show you what the new environment actually offers, what it has quietly removed, and what that means for the perceptual development of every builder, student, and thinker growing up inside it.
The information is in the environment. Gibson proved that. The question is whether we are still learning to see.
— Edo Segal × Opus 4.6
James Jerome Gibson (1904–1979) was an American psychologist whose work fundamentally transformed the science of perception. Born in McConnelsville, Ohio, he studied at Princeton University and Northwestern University before joining the faculty at Smith College and later Cornell University, where he spent the bulk of his career. During World War II, Gibson directed a research unit for the U.S. Army Air Forces studying pilot selection and training, an experience that profoundly shaped his thinking about how organisms perceive and navigate real environments rather than laboratory abstractions. His major works — *The Perception of the Visual World* (1950), *The Senses Considered as Perceptual Systems* (1966), and *The Ecological Approach to Visual Perception* (1979) — progressively dismantled the dominant view that perception is an internal construction from impoverished sensory data, replacing it with an ecological theory in which perception is the direct detection of information already structured in the environment. His concept of "affordances" — what an environment offers an organism for action, relative to that organism's capacities — became one of the most influential ideas in twentieth-century psychology, migrating into design theory, robotics, human-computer interaction, and artificial intelligence research. His wife and collaborator Eleanor J. Gibson made foundational contributions to the study of perceptual learning and development. Gibson's insistence that the organism-environment system, not the organism alone, is the proper unit of perceptual analysis remains a cornerstone of ecological psychology and has gained renewed urgency in the age of AI-augmented environments.
An affordance is what the environment offers the animal for action. Not what the animal thinks the environment offers. Not what a designer intended the environment to offer. What the environment actually offers, given the specific constitution of the specific organism encountering it. A horizontal, rigid, extended surface at approximately knee height affords sitting for an adult human being. It does not afford sitting for an infant, whose legs are not long enough, or for a horse, whose joints do not bend that way. The surface does not change. The affordance changes, because the affordance is not a property of the surface. It is a property of the relationship between the surface and the organism.
This distinction, which Gibson introduced in his 1979 masterwork *The Ecological Approach to Visual Perception* and which has since migrated into robotics, design theory, and artificial intelligence research far beyond anything he could have anticipated, is not a refinement of existing perceptual theory. It is a replacement. The entire tradition of perceptual psychology, from Helmholtz through the Gestaltists through the information-processing cognitivists of the mid-twentieth century, assumed that perception begins with impoverished sensory data — patches of light on the retina, pressure waves on the eardrum — and that the organism's task is to construct a meaningful representation of the world from these raw inputs. Gibson rejected every element of this account. The data are not impoverished. The organism does not construct. Representation is not the mechanism.
The implications for understanding the AI transition are immediate and severe, and they have not been adequately drawn.
Consider the professional environment of a software builder in 2024, before the threshold event described in *The Orange Pill*. That environment had an affordance structure as real and as specifiable as the affordance structure of a forest floor. The codebase afforded modification: it offered the builder the possibility of changing behavior through the manipulation of symbols according to syntactic rules the builder had learned. The documentation afforded study: it offered the possibility of acquiring knowledge about the system's architecture through sustained, friction-rich reading. The error message afforded diagnosis: it offered the possibility of narrowing a problem space through systematic hypothesis and test. The version control system afforded collaboration: it offered the possibility of coordinating changes across multiple builders without destructive interference. The IDE afforded rapid feedback: it offered the possibility of seeing the immediate consequences of a change before committing it.
Each of these affordances was real. Each was relational — it depended on the specific capabilities the builder brought to the environment. A codebase written in Rust affords modification for a Rust programmer and affords almost nothing for a designer who has never seen a lifetime expression. The affordance does not live in the code. It does not live in the programmer. It lives in the relationship between them.
And each of these affordances shaped behavior. Gibson was emphatic on this point: animals do not first perceive the world and then decide what to do in it. Perception and action are coupled. The organism perceives affordances — possibilities for action — and the perception of those possibilities is itself the initiation of action. The experienced builder does not see a function and then decide whether to optimize it. She perceives the function as affording optimization, and that perception is the beginning of the optimization. Perception is not the prelude to action. It is the first phase of action.
This coupling between perception and action meant that the builder's daily engagement with the pre-AI environment was not merely productive — not merely a sequence of tasks completed and features shipped. It was perceptually formative. Every encounter with an error message tuned the builder's sensitivity to the patterns of failure that error messages specify. Every session of reading documentation tuned her sensitivity to the architectural invariants that documentation describes. Every debugging session tuned her sensitivity to the causal structure of the system she was maintaining. Over months and years, these engagements did not simply produce knowledge. They produced perceptual expertise — the ability to see, directly and immediately, affordances that a less experienced builder could not detect.
Gibson's collaborator and wife Eleanor Gibson spent decades studying this process, which they called perceptual learning. Perceptual learning is not the acquisition of new information to be stored and retrieved. It is the progressive differentiation of the perceptual system — the organism learns to make finer and finer distinctions in what was previously an undifferentiated field. The wine taster who can distinguish Burgundy from Bordeaux has not memorized a rule. She has differentiated her perceptual system through hundreds of hours of active engagement with the stimulus. The experienced builder who can feel that a codebase is unstable before she can articulate why has undergone the same process. Her perceptual system has been differentiated through thousands of hours of engagement with the affordance structure of codebases, and that differentiation allows her to detect invariants — patterns of instability, of architectural fragility, of impending failure — that a novice cannot see, not because the novice lacks intelligence but because the novice's perceptual system has not yet been tuned to detect them.
This is what Segal describes, in different vocabulary, when he writes of understanding as a geological process: each hour of debugging deposits a thin layer, and expertise is thousands of such layers compressed into perceptual sensitivity. The metaphor is apt. But Gibson's framework makes the mechanism precise. The layers are not layers of knowledge deposited in memory. They are layers of perceptual differentiation deposited in the organism's attunement to the affordance structure of its environment. The experienced builder does not know more. She sees more. And she sees more because the environment, through years of friction-rich engagement, has educated her attention.
Now consider what happened in the winter of 2025. The arrival of AI coding tools did not merely add a new instrument to the builder's workbench. It transformed the affordance structure of the entire environment. Surfaces changed. Textures changed. What the environment offered for action changed at every level simultaneously. The codebase still existed, but it no longer afforded modification in the same way — the builder could now describe the desired modification in natural language and receive the implementation. The error message still appeared, but it no longer afforded the same diagnostic engagement — the AI could diagnose the error faster and more comprehensively than the builder's systematic hypothesis-testing. The documentation still existed, but it no longer afforded the same friction-rich study — the builder could ask the AI to summarize, explain, and apply the relevant portions in seconds.
Each of these changes is, taken individually, an improvement in efficiency. Taken together, they constitute an ecological transformation — a restructuring of the organism-environment system as fundamental as a change in the physical landscape that an animal inhabits.
When a forest is cleared and replaced by farmland, the affordance structure changes completely. The trees that afforded climbing, nesting, and shelter are gone. The undergrowth that afforded concealment is gone. The canopy that afforded shade and moisture retention is gone. In their place are open fields that afford running, plowing, and long-distance vision. An entirely different set of organisms will thrive. Not because the organisms have changed, but because the environment has changed, and affordances are relational, and a change in the environment is therefore a change in what any given organism can do.
The analogy is not loose. Gibson would insist it is structurally precise. The builder's professional environment underwent an ecological transformation. The affordances that had shaped the builder's behavior and perception for decades — the implementation affordances, the debugging affordances, the friction-rich learning affordances — were not eliminated in the way that a tool is taken away. They were restructured in the way that a landscape is restructured. The environment now offered fundamentally different possibilities for action. And because affordances shape perception, and perception shapes what the organism becomes, the restructuring of the affordance landscape was simultaneously a restructuring of the conditions under which builders develop the perceptual expertise that distinguishes the competent from the extraordinary.
Stanford University's Gibson Environment — a virtual training platform for AI agents named explicitly after Gibson — embodies this insight from the machine's side. The researchers who built it understood that an AI agent's capacity for intelligent behavior cannot be understood independently of the environment in which it develops. Sergey Levine of UC Berkeley, drawing directly on Gibson's ecological philosophy, argued that "the capacity for reinforcement learning algorithms to lead to intelligent behavior cannot be understood independently of the environment in which they are situated." The same is true for humans. The builder's intelligence is not a private possession. It is a property of the builder-environment system, and when the environment changes, the intelligence changes — not metaphorically, but actually, because the affordances that shaped the intelligence are no longer the same affordances.
The question this book poses is the ecological question, not the technological one. The technological question asks what AI can do. That question has been answered, at length and with considerable documentation, by Segal and by the thousands of builders who crossed the threshold in 2025. The ecological question asks something different and more consequential: When the affordance structure of an environment changes this rapidly and this completely, what happens to the organisms whose perception was shaped by the old structure? What new perceptual sensitivities does the new structure demand? And — the question Gibson's framework renders most urgent — does the new environment provide the conditions under which those sensitivities can develop?
A builder whose perceptual system was tuned by twenty years of friction-rich engagement with implementation affordances now inhabits an environment where those affordances have been restructured beyond recognition. She must perceive new affordances: affordances for direction, for evaluation, for judgment at a level of abstraction that the old environment rarely demanded. But the new affordances require a different kind of perceptual sensitivity, and that sensitivity can only develop through active engagement with the new environment's structure. The organism must explore. It must encounter resistance. It must discover, through trial and adjustment, what the new environment actually offers.
The question is whether the new environment provides enough resistance, enough texture, enough perceptual information to support the development of the sensitivities it demands. A smooth environment, in Gibson's framework, is an informationally impoverished environment. And if the AI-augmented building environment is smoother than its predecessor — if it offers less friction, less resistance, less of the textured engagement that drives perceptual learning — then it may demand new perceptual sensitivities while simultaneously failing to provide the conditions under which those sensitivities develop.
That paradox is the subject of this book. It cannot be resolved by celebrating what the new environment offers or mourning what it has taken away. It can only be resolved by looking, with the full sensitivity of an ecological framework designed to see what environments actually offer, at the affordance structure of the builder's new world.
---
Gibson argued, against the entirety of the perceptual psychology establishment of his era, that perception is direct. The organism does not receive impoverished sensory data and then compute a representation of the world through unconscious inference. The organism detects information that is already present in the structured energy arrays — light, sound, chemical gradients — that fill the environment. The information is there. It has always been there. The organism's task is not construction but detection.
This was not a minor theoretical adjustment. It was the demolition of a paradigm that had dominated Western thinking about perception since Helmholtz formalized it in the 1860s. The standard account went like this: The retinal image is flat, ambiguous, and insufficient. It does not contain enough information to specify the three-dimensional world. Therefore, the brain must supplement the impoverished input with stored knowledge, assumptions, and computational inference. Perception is indirect — mediated by the brain's constructive processes.
Gibson's counter-argument was systematic and devastating. The retinal image is not the stimulus for vision. The ambient optic array — the structured light that converges on any point in the environment from every direction — is the stimulus for vision. And the ambient optic array is not impoverished. It is extraordinarily rich. It contains texture gradients that specify surface distance and orientation. It contains occlusion patterns that specify which surfaces are in front of which other surfaces. It contains the optic flow generated by the observer's own movement, which specifies the layout of the environment with precision that no static image could match. The information is in the light. The organism picks it up.
The implications for understanding the history of computing interfaces — and for understanding why the natural language interface represents a qualitatively different kind of change than any prior interface transition — are profound and have not been drawn with adequate precision.
Every computing interface prior to the large language model imposed what Gibson would recognize as an obstruction between the organism and the information. An obstruction, in ecological terms, is anything that prevents the organism from picking up information that is available in the ambient array. A wall obstructs vision. A mask obstructs olfaction. A glove obstructs haptic perception. The obstruction does not destroy the information. The information is still there, in the environment, structured and available. The obstruction prevents the organism from detecting it.
The command-line interface was an obstruction of a specific and diagnostic kind. The builder had a problem. The problem was perceptually available — she could see it, describe it, understand it in the natural language of her thought. But the machine could not receive the problem in that language. The builder had to translate the problem into a formal syntax the machine could parse. This translation was not a trivial encoding. It was a transformation of the problem's structure. Aspects of the problem that were self-evident in natural language — context, ambiguity, intent, the relative importance of competing requirements — had to be either formalized into the machine's representational scheme or discarded entirely. The command-line interface was an obstruction that permitted only certain kinds of information to pass and that distorted what it did permit.
The graphical user interface reduced the obstruction. It mapped the machine's operations onto visual metaphors — files, folders, windows, buttons — that were closer to the perceptual categories humans naturally employ. The translation cost decreased. But the translation did not disappear. The builder still had to think in the machine's terms, to learn the machine's metaphors, to map her intentions onto the specific set of actions the interface made available. The desktop metaphor, however intuitive, was still a metaphor — an indirect representation of the machine's actual operations, not a direct specification of the builder's actual intentions.
The touchscreen reduced the obstruction further. It eliminated the intermediary of the mouse, allowing the builder to manipulate screen objects with the same gestures she would use to manipulate physical objects. The directness increased. But the builder was still manipulating the machine's representations, not specifying her own intentions. She could drag and pinch and swipe, but she could not say what she meant in the language in which she meant it.
Each interface transition was a reduction in the distance between the organism and the information. In Gibson's terms, each transition removed a layer of obstruction between the builder and the problem. But none of them achieved directness in the Gibsonian sense, because all of them required the builder to translate her perception of the problem into a form the machine could accept. The translation was the obstruction. And as long as the translation existed, perception was indirect — mediated by the encoding scheme of the interface.
The large language model removed the translation.
Not reduced it. Not simplified it. Removed it. For the first time in the history of computing, the builder could describe the problem in the same language in which she perceived it. Natural language — the medium in which human beings most naturally detect and communicate affordances, the medium in which perception and action are most tightly coupled — became the interface. The obstruction between the organism and the information vanished.
This is the event Segal describes in *The Orange Pill* when he writes of the machine learning to "meet you on yours," of never having to leave his own way of thinking, of the cognitive overhead of translation being abolished. The vocabulary is different. The observation is the same. What disappeared was the layer of mediation that had separated the builder from the problem for the entire history of computing.
Gibson would have recognized the significance of this immediately, and he would have recognized its ambiguity with equal speed.
The restoration of directness is, from an ecological perspective, an unqualified improvement in the perceptual relationship between the organism and its task. When the builder can describe the problem in the language of direct perception — when she can say "I need a function that detects when the user's face is oriented toward the camera and they appear to be speaking" rather than translating that intention into OpenCV calls, audio threshold calculations, and callback architectures — the coupling between perception and action tightens. The builder perceives the problem. She describes what she perceives. The machine responds. The cycle of perception-action-perception accelerates to something approaching the speed of thought.
But Gibson was never naive about directness, and his framework contains a complication that the celebratory accounts of the natural language interface consistently miss. Directness, in ecological perception, is not merely a matter of removing obstructions. It is a matter of what the organism does with the information it picks up directly. And what the organism does depends on its history of active engagement with the environment — on the perceptual learning that has tuned its sensitivity to the affordances the environment offers.
The experienced builder who describes a problem to Claude in natural language is operating at a high level of perceptual sophistication. She perceives the problem directly because her perceptual system has been differentiated through years of engagement with similar problems. She knows what to ask for because she has spent years discovering, through friction-rich trial and error, what works and what fails. Her natural language description is precise not because natural language is inherently precise but because her perceptual learning has given her the categories — the differentiated sensitivities — needed to describe the problem at the right level of specificity.
The novice who describes the same problem to Claude may also use natural language. The obstruction has been removed for both of them equally. But the novice's perceptual system has not been differentiated. She has not spent years discovering, through the specific resistance of implementation, what works and what breaks and why. Her natural language description will be less precise, not because she lacks verbal ability but because she lacks the perceptual sensitivity that would allow her to see the relevant distinctions in the problem space.
The removal of the obstruction is real. The gain in directness is real. But directness without perceptual differentiation produces a specific kind of failure that Gibson's framework predicts with uncomfortable precision: the organism picks up the information that is available but lacks the sensitivity to detect the information that matters. The builder sees the problem. She does not see the problem's structure — the invariants that specify which aspects are load-bearing, which constraints are binding, which architectural decisions will determine whether the solution scales or collapses. That structural perception was built through years of engagement with the very implementation friction that the natural language interface has removed.
DeepMind's researchers discovered a version of this paradox when they applied Gibson's affordance concept to reinforcement learning. In conventional reinforcement learning, an agent begins with the assumption that all actions are possible in all states and learns through trial and error which actions are actually available. The affordance-based approach, inspired directly by Gibson, teaches the agent to perceive which actions the environment actually affords in each state, dramatically reducing the space of failed trials. But the researchers found that affordance perception itself requires learning — the agent must explore the environment actively to discover its affordances, and this exploration requires engagement with the environment's resistance. The shortcut works only after the organism has done the preliminary work of learning what the environment offers.
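The affordance-masking idea can be sketched in miniature. The gridworld, the `affordances` function, and the two samplers below are hypothetical illustrations rather than DeepMind's implementation; the point is only that perceiving what a state affords shrinks the space of doomed trials.

```python
import random

# A 3x3 gridworld: states are (row, col). Conventional exploration samples
# from the full action set everywhere; affordance-based exploration samples
# only from what the current state actually offers.
ACTIONS = ["up", "down", "left", "right"]
MOVES = {"up": (-1, 0), "down": (1, 0), "left": (0, -1), "right": (0, 1)}
SIZE = 3

def affordances(state):
    """Actions this state actually affords: moves that stay on the grid."""
    row, col = state
    legal = []
    for action in ACTIONS:
        dr, dc = MOVES[action]
        if 0 <= row + dr < SIZE and 0 <= col + dc < SIZE:
            legal.append(action)
    return legal

def naive_sample(state, rng):
    """Trial-and-error exploration: any action, including doomed ones."""
    return rng.choice(ACTIONS)

def affordance_sample(state, rng):
    """Affordance-guided exploration: only actions the state offers."""
    return rng.choice(affordances(state))
```

In a corner state the naive sampler wastes half its trials walking into walls; the affordance sampler never does. But discovering the `affordances` function itself still requires the exploratory contact with the environment that the passage describes.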
The natural language interface restores directness. It removes an obstruction that had persisted for fifty years of computing history. But it does not restore — and cannot restore — the perceptual differentiation that was built through decades of engagement with the obstruction itself. The obstruction was not merely in the way. It was, simultaneously and paradoxically, a training ground. The translation that the interface demanded was also the exercise through which the builder's perceptual system learned to make the distinctions that expert perception requires.
This is the paradox that defines the AI moment from an ecological perspective. The obstruction has been removed, and the directness is real, and the gain is genuine. And the training that the obstruction provided has been removed along with it, and the loss is real, and the consequences for the development of perceptual expertise are not yet understood.
---
The ecologist reconstructing a vanished habitat does not begin with what the organisms did. She begins with what the environment offered. The organisms' behavior follows from the affordance structure of the place they inhabited, not the other way around. Understand what the environment afforded, and the behavior becomes intelligible. Misunderstand the environment, and the behavior looks arbitrary — a collection of habits with no ecological logic.
The pre-AI building environment vanished between December 2025 and the spring of 2026. Not gradually, in the way a forest thins over decades. Abruptly, in the way a flood restructures a riverbed. The builders who inhabited that environment for years or decades are still here, carrying perceptual systems tuned to affordances that no longer exist in the same form. Understanding what they lost — and what they carried out of the flood — requires a forensic reconstruction of the affordance structure of the environment they inhabited.
Begin with the codebase. In the pre-AI environment, the codebase was not merely a storage medium for instructions. It was a perceptual landscape. The experienced builder did not read code the way a layperson reads text, sequentially, word by word, constructing meaning through comprehension. She perceived code the way a pilot perceives terrain: structurally, holistically, in terms of the action possibilities it offered and the dangers it contained. A function that was too long afforded refactoring. A dependency chain that was too deep afforded simplification. A naming convention that was inconsistent afforded confusion — the absence of a positive affordance, the presence of a trap.
This perceptual relationship to the codebase was built through thousands of hours of direct engagement. The builder modified code. She saw the consequences of her modifications. She encountered errors that her modifications produced — errors that specified, in the precise language of the compiler or the runtime, exactly where her understanding of the system diverged from the system's actual structure. Each error was a piece of perceptual information. Each correction was a recalibration of her perceptual system. Over time, the recalibrations accumulated into something that felt like intuition but was, in Gibson's terms, highly differentiated perception — the ability to detect invariants in the codebase's structure that were invisible to the untrained eye.
The debugging session was perhaps the richest affordance in the pre-AI builder's environment, and it merits particular attention because it is the affordance most completely transformed by AI.
A debugging session in the pre-AI environment proceeded through a sequence of actions, each shaped by the specific affordances the environment offered at that moment. The error message afforded hypothesis formation: it specified, in compressed and often cryptic form, the nature of the failure. The stack trace afforded localization: it specified the sequence of function calls that produced the failure, allowing the builder to narrow the search space. The breakpoint afforded inspection: it allowed the builder to freeze the system's execution at a specific moment and examine the state of every variable, every data structure, every intermediate result.
None of these affordances operated in isolation. The debugging session was an affordance cascade — each action revealed new affordances that shaped the next action. Setting a breakpoint revealed the unexpected value of a variable, which afforded a new hypothesis, which afforded a new breakpoint at an earlier point in the execution, which revealed a different unexpected value, and so on until the cascade converged on the root cause.
This cascade was, in Gibson's terms, an extended episode of active perceptual exploration. The builder was not computing. She was exploring — moving through the system's state space the way a hiker moves through unfamiliar terrain, detecting affordances as she went, adjusting her trajectory based on what each new vantage point revealed. The engagement was embodied in the specific sense that her actions — setting breakpoints, inspecting variables, stepping through execution — were not separable from her perception. She perceived by acting. She acted based on what she perceived. The tight coupling between perception and action that Gibson identified as the fundamental character of ecological engagement was fully present in the debugging session.
And the debugging session was developmental. Each cascade through the system's state space deposited perceptual sensitivity. The builder who had debugged a race condition understood race conditions not as an abstract concept but as a perceptual category — she could detect the affordance for race-condition failure in code she had never seen before, because her perceptual system had been tuned by the specific pattern of the previous debugging cascade. The sensitivity was not stored as a rule. It was stored as a differentiation — a finer grain in the perceptual system's ability to discriminate between code that was safe and code that was dangerous.
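The race-condition pattern described above can be made concrete. The sketch below is illustrative, not drawn from the essay — the class names and thread counts are invented. The unguarded read-modify-write in `UnsafeCounter.bump` is the perceptual category the debugged builder learns to see at a glance; the lock in `SafeCounter.bump` is what removes the affordance for failure.

```python
import threading

class UnsafeCounter:
    """The shape a debugged race condition teaches the eye to detect:
    an unguarded read-modify-write on state shared across threads."""
    def __init__(self):
        self.value = 0

    def bump(self):
        self.value += 1  # read, modify, write -- three steps, no lock

class SafeCounter:
    """The same operation with the affordance for failure removed."""
    def __init__(self):
        self.value = 0
        self._lock = threading.Lock()

    def bump(self):
        with self._lock:  # the read-modify-write is now atomic
            self.value += 1

def hammer(counter, threads=4, per_thread=10_000):
    """Drive the counter from several threads at once."""
    def work():
        for _ in range(per_thread):
            counter.bump()
    ts = [threading.Thread(target=work) for _ in range(threads)]
    for t in ts:
        t.start()
    for t in ts:
        t.join()
    return counter.value
```

With `SafeCounter`, `hammer` returns exactly `threads * per_thread` every time; with `UnsafeCounter`, increments can be lost nondeterministically. That nondeterminism is precisely why the failure resists casual reproduction — and why the cascade of tracking it down deposits perceptual sensitivity rather than a memorized rule.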
Documentation afforded a different kind of perceptual engagement: slower, more linguistic, more dependent on the builder's ability to translate abstract descriptions into concrete affordance perceptions. Reading a framework's documentation was not like reading a novel. It was like reading a topographic map — a symbolic representation of a terrain the builder had not yet traversed, which became meaningful only as the builder began to move through the terrain itself and discovered the correspondence between the map's symbols and the landscape's actual structure. The documentation said "this method returns a promise." The builder did not understand what that meant until she used the method, received the promise, handled it incorrectly, received an error, handled it correctly, and felt, through the resistance of the experience, what a promise afforded and what it did not.
Stack Overflow afforded social perceptual learning — the encounter with other builders' solutions, which functioned as demonstrations of affordance perception from different vantage points. A question on Stack Overflow was, in ecological terms, a report of a failed action: "I tried this, and the environment did not respond as I expected." The answers were reports of successful affordance perception: "The environment actually affords this, not what you attempted." The tone was often dismissive. The social friction was real. But the perceptual information was genuine — each answer specified an affordance of the system that the questioner had not detected.
Code review afforded another critical perceptual development: the experience of having one's affordance perception evaluated by a more differentiated perceiver. The senior developer who reviewed a junior developer's code was not primarily checking for correctness. She was checking for affordance sensitivity — could the junior perceive the architectural affordances and constraints that the code needed to respect? The comments on a code review were, in ecological terms, attentional guidance: "Look here. See this. You missed this affordance. You misperceived that constraint." Over months and years, this guided attention tuned the junior developer's perceptual system toward the invariants that the senior developer had learned to detect.
Each of these affordances — the codebase, the debugger, the documentation, the social forum, the code review — was simultaneously instrumental and developmental. It served an immediate productive purpose, and it served a long-term perceptual purpose. The instrumental purpose could be replaced. A faster tool that produced the same output would serve the same instrumental purpose. The developmental purpose could not be replaced by a faster tool, because the development depended on the specific engagement, the specific resistance, the specific friction of the interaction between the builder and the affordance structure of the environment.
Gibson's framework insists that these two purposes are not separable in practice, even though they are distinguishable in theory. The builder did not debug for the purpose of developing perceptual expertise. She debugged because the system was broken and the error afforded diagnosis. The perceptual development was a consequence of the engagement, not its purpose. And this is precisely why the development cannot be achieved by simply assigning exercises in debugging to builders who no longer need to debug. The perceptual learning depends on the authenticity of the engagement — on the fact that the error is real, the system is genuinely broken, and the builder genuinely needs the diagnosis to proceed. Simulated resistance, like a treadmill in a gym, does not carry the same perceptual information as real resistance, like a hill on a trail, because the simulated resistance does not specify real affordances. It specifies the affordances of the simulation.
What the pre-AI environment afforded, then, was not merely the ability to build software. It afforded the development of a specific kind of perceptual expertise — the ability to see, directly and immediately, what a system offered, what it constrained, where it was fragile, and where it was robust. This expertise was the builder's most valuable possession, and it was not a possession at all, in the conventional sense. It was a relationship — a coupling between the builder's differentiated perceptual system and the environment's affordance structure. Remove the environment, and the coupling dissolves. The expertise does not vanish — the perceptual differentiations remain — but it becomes untethered, like a pilot's terrain-reading skill over a landscape whose terrain has been erased.
The pre-AI environment was not perfect. It was not designed to educate builders' attention. The friction was often merely tedious rather than instructive. The documentation was often inadequate. The social forums were often hostile. The hours lost to dependency management and boilerplate were genuinely wasted time that served no developmental purpose. Gibson's ecological framework does not require the conclusion that the old environment was optimal. It requires only the recognition that the old environment's affordance structure — including its friction, its resistance, its infuriating inefficiencies — produced a specific kind of perceptual development that cannot be assumed to occur in a radically different environment.
The habitat has been restructured. The organisms must adapt. But adaptation requires understanding what was lost in the restructuring, not for the purpose of mourning it, but for the purpose of knowing what must now be built deliberately, since it will no longer emerge as a side effect of the work itself.
---
The ecologist who has reconstructed the vanished habitat turns now to the new one. The flood has passed. The riverbed has been restructured. New surfaces have appeared. New textures. New possibilities for action where none existed before, and absences where familiar possibilities for action had been. The task is not evaluation — not yet. The task is description. What does the new environment actually offer?
Gibson's method was always descriptive before it was normative. He spent years cataloging the affordances of terrestrial environments — surfaces, substances, edges, enclosures, objects, other animals — before drawing any conclusions about which affordances served the organism well and which posed danger. The catalog came first. The ecological analysis followed from the catalog.
The AI-augmented building environment affords actions that the pre-AI environment did not. These affordances are not incremental improvements on previous affordances. They are categorically new — action possibilities that did not exist in any form before the AI tools arrived. The builder encountering them for the first time is not perceiving a faster version of what was available yesterday. She is perceiving a different landscape entirely.
The first and most fundamental new affordance is what might be called the specification affordance. The AI-augmented environment affords the direct specification of outcomes in natural language. The builder can describe what she wants the system to do — not in the formal syntax of a programming language, not in the structured format of an API call, but in the same language she would use to explain the desired behavior to a knowledgeable colleague. The environment takes this specification and produces an implementation.
This affordance has no precedent in the builder's previous environment. Prior interfaces afforded implementation — the builder could write the code that produced the behavior. The AI-augmented environment affords specification — the builder can describe the behavior, and the environment produces the code. The difference is not one of degree. It is the difference between an environment that affords carving stone and an environment that affords describing a sculpture and receiving the carved stone. The action the environment supports is fundamentally different in kind.
The specification affordance transforms the builder's perceptual orientation to the problem. In the pre-AI environment, the builder perceived problems in terms of implementation affordances: this problem affords a recursive solution, this data structure affords efficient lookup, this pattern affords reuse across modules. In the AI-augmented environment, the builder perceives problems in terms of outcome affordances: this user need affords a conversational interface, this business requirement affords a real-time dashboard, this system constraint affords a microservice architecture. The grain of perception shifts upward from the mechanics of how to the strategy of what.
Segal documents this shift when he describes his senior engineer discovering that the remaining twenty percent of his work — the judgment about what to build, the architectural instinct about what would break, the taste that separated features users loved from features they tolerated — turned out to be everything. In Gibson's terms, the engineer discovered that the affordances he had been perceiving all along — the implementation affordances — were not the most important affordances in his environment. They had merely been the most visible, because the implementation consumed so much of the available bandwidth that the higher-order affordances were obscured. The AI stripped away the obscuring layer and revealed what had been underneath: affordances for direction, evaluation, and judgment that had always been there but that the builder had never had the perceptual bandwidth to fully exploit.
The second new affordance is the iteration affordance. The AI-augmented environment affords rapid, conversational iteration between intention and artifact. The builder describes an intention. The environment produces an artifact. The builder evaluates the artifact against her intention, identifies discrepancies, describes the needed adjustments, and receives a revised artifact. The cycle time is measured in seconds or minutes, not hours or days.
This creates an affordance loop — a self-reinforcing cycle of perception, action, evaluation, and adjustment — that has no equivalent in the pre-AI environment. In the old environment, iteration was costly. Each cycle through intention-implementation-evaluation-adjustment consumed hours of implementation labor, and the labor imposed a natural limit on the number of cycles the builder could complete. Three or four iterations on a feature might consume a full day. The cost of iteration created a selection pressure toward getting the implementation right the first time, which in turn created a pressure toward extensive upfront planning, detailed specifications, and conservative architectural choices.
The iteration affordance removes this selection pressure. When iteration is cheap, the builder can afford to be wrong — to specify an intention loosely, see what the environment produces, learn from the discrepancy between what she wanted and what she got, and adjust. This is, in ecological terms, an affordance for exploratory behavior, the kind of active, probing engagement with the environment that Gibson identified as the primary mechanism of perceptual learning. The organism that explores discovers affordances it could not have predicted. The builder who iterates rapidly discovers possibilities in the problem space that she could not have perceived through upfront analysis alone.
Segal describes an instance of this when he recounts the laparoscopic surgery insight — the moment when his question about friction collided with Claude's associative capacity and produced a connection neither could have made independently. That moment was enabled by the iteration affordance. In the old environment, the cycle time between question and candidate answer was measured in days of reading and reflection. In the AI-augmented environment, it was measured in seconds. The affordance for exploratory thinking was not merely improved. It was transformed in kind, from a slow, solitary excavation to a rapid, collaborative probing.
The third new affordance is the domain-crossing affordance. The AI-augmented environment affords competent engagement with domains the builder has not trained in. The backend engineer can specify frontend behavior. The designer can specify algorithmic logic. The product manager can specify system architecture. Each of these cross-domain engagements was impossible in the pre-AI environment, because each domain had its own implementation language, and fluency in that language was the prerequisite for any productive engagement with the domain's affordances.
Gibson's concept of affordances makes the mechanism precise. In the pre-AI environment, the backend engineer did not perceive affordances in frontend code, because her perceptual system had not been differentiated to detect them. Frontend code was, for her, what an unfamiliar animal species is for an organism that has never encountered it — a part of the environment that affords nothing, because the organism-environment relationship is not established. The AI does not establish the relationship. It bypasses the need for it. The builder does not perceive frontend affordances directly. She specifies outcomes in natural language, and the AI translates those specifications into frontend implementations that she does not need to perceive at the implementation level.
This produces a genuine expansion of what the builder can do. The engineer in Trivandrum whom Segal describes — the woman who had spent eight years on backend systems and never written a line of frontend code — built a complete user-facing feature in two days. In Gibson's terms, the AI-augmented environment afforded frontend work for an organism whose perceptual system had been differentiated exclusively for backend affordances. The new environment created an affordance where none had existed.
But Gibson's framework, applied with rigor, identifies a specific limitation of the domain-crossing affordance that the celebratory accounts tend to elide. The builder who crosses into a new domain through AI mediation does not develop the perceptual sensitivity specific to that domain. She produces competent work without undergoing the perceptual differentiation that direct engagement with the domain would provide. The frontend feature works. The builder did not learn to perceive frontend affordances. She learned to specify outcomes effectively enough that the AI could handle the implementation. These are different competencies, and they produce different kinds of expertise.
The experienced frontend developer perceives affordances in user interface design that the backend engineer using AI cannot see — not because the backend engineer is less intelligent, but because perceptual differentiation requires direct engagement, and the AI-mediated builder is not directly engaged with the frontend affordance structure. She is engaged with the specification affordance, which is a different thing. She perceives possibilities for what the interface should do. She does not perceive possibilities for how the interface should feel, because that perception requires the kind of embodied engagement with layout, spacing, timing, and responsive behavior that years of direct frontend work provide.
The fourth new affordance — and the one that raises the most challenging ecological questions — is what might be called the collaborative perception affordance. The conversational interface affords a form of perception that neither the human nor the machine could achieve independently. The builder describes a half-formed intention. The machine interprets that intention through the lens of its vast training data, finds connections the builder did not see, and presents them. The builder perceives these connections, evaluates them against her own understanding, refines her intention, and describes the refinement. The machine interprets the refined intention and presents further connections. The cycle continues until something emerges that neither the builder nor the machine would have produced alone.
This is not, in Gibson's strict terms, direct perception. The builder is not picking up information from the ambient array through her own exploratory activity. She is picking up information from the machine's output, which is itself a processed and structured artifact. But it is also not the indirect, representationalist perception that Gibson spent his career opposing. The builder is not constructing an internal model from impoverished data. She is detecting patterns in a rich, responsive, dynamically structured information source that adjusts to her exploratory probes in real time. The collaborative perception affordance occupies a strange territory in Gibson's framework — neither the directness of unmediated perceptual pickup nor the indirectness of internal model construction, but something new: perception mediated by an intelligent agent that actively structures the information environment in response to the perceiver's actions.
Whether this constitutes genuine perception in Gibson's sense is a question that his framework cannot settle, because the framework was constructed before such mediating agents existed. What the framework can settle is the ecological question: What does this affordance do to the builder-environment coupling? Does it enrich the coupling, providing the builder with perceptual information she could not otherwise access? Or does it impoverish the coupling, interposing a layer of pre-processed interpretation between the builder and the raw affordance structure of the problem?
The evidence from the first months of the AI transition suggests it does both, simultaneously, in proportions that depend on the builder's existing level of perceptual differentiation. The builder with twenty years of domain expertise uses the collaborative perception affordance to extend her perception into territory she could not reach alone. The builder with twenty months of experience uses the same affordance as a substitute for perceptual development she has not yet undergone. Both are engaging with the same environment. The environment affords different things for each of them, because affordances are relational, and the organism's history determines what the environment offers.
The new affordance landscape is richer in some dimensions and poorer in others. It affords specification, rapid iteration, domain crossing, and collaborative perception. It no longer affords, in the same form or to the same degree, the friction-rich implementation engagement that built the perceptual expertise of the previous generation of builders. Whether the richness of the new affordances compensates for the impoverishment of the old ones is not a question that can be answered in the abstract. It depends on what the organism needs to become — on what kind of perceptual development the future demands. That question requires examining what has disappeared from the builder's environment with the same precision that this chapter has applied to what has appeared. The disappearance is the subject of the next chapter, and it is there that Gibson's framework delivers its most uncomfortable diagnosis.
Every ecologist understands extinction. Not the dramatic extinction of a species vanishing from the planet, though that is the version the public knows. The quieter extinction — the local extirpation, the disappearance of a species from a particular habitat while it persists elsewhere — is the one that reshapes ecosystems. When wolves disappeared from Yellowstone, the elk stopped moving. When the elk stopped moving, they overgrazed the riverbanks. When the riverbanks eroded, the streams widened and shallowed. When the streams shallowed, the beavers could not build dams. When the beavers could not build dams, the wetlands dried. The entire cascade followed from a single disappearance — not the elimination of an organism but the elimination of an affordance. The wolves afforded predation risk, and predation risk afforded movement, and movement afforded distributed grazing, and distributed grazing afforded riparian health. Remove the wolf, and the affordance cascade collapses backward through the system.
The disappearance of implementation affordances from the builder's environment is an extirpation of this kind. The affordances have not been destroyed — one can still write code by hand, debug line by line, wrestle with dependency conflicts. But they have been marginalized in the ecological sense. They are no longer the dominant affordance structure. The environment no longer channels the builder toward them as the primary mode of engagement. A builder who insists on debugging manually in an environment where the AI can diagnose the error in seconds is not exercising a skill. She is resisting an environment — and organisms that resist their environment's affordance structure expend energy without ecological return.
The specific implementation affordances that have been extirpated can be cataloged with some precision. Each entry in the catalog represents not merely a lost activity but a lost source of perceptual information — a channel through which the environment once communicated its structure to the builder's perceptual system.
The syntactic affordance: the affordance of writing code in a formal language whose rules enforce precision. Formal syntax afforded a specific perceptual discipline. The compiler's refusal to accept ambiguous instruction forced the builder to resolve ambiguities in her own thinking before the code would run. The semicolon that had to be placed. The bracket that had to close. The type that had to match. Each syntactic constraint was a surface with texture — a point where the environment resisted the builder's action and, through the resistance, specified something about the structure of the system the builder was constructing. The builder who had written ten thousand lines of Python had internalized Python's affordance structure — not as memorized rules but as perceptual sensitivities. She could see a mismatched type the way a carpenter can see a joint that is not square: immediately, without analysis, because her perceptual system had been tuned by thousands of encounters with the specific textures of the language.
When the AI writes the code, the syntactic affordance vanishes from the builder's experience. The code is produced. The syntax is correct. The builder did not engage with the constraints that correctness requires. The perceptual tuning that those constraints provided does not occur. The builder may review the AI's output and confirm that the syntax is correct, but reviewing correct syntax and producing correct syntax are perceptually different activities. The first is pattern-matching against a template. The second is active exploration of a constraint space. Gibson's framework is explicit about this distinction: perception develops through action, not through passive observation. The organism that watches another organism navigate a terrain does not develop the same perceptual sensitivity as the organism that navigates the terrain itself. Observation informs. Engagement differentiates.
The diagnostic affordance: the affordance of tracing a failure through a system's causal structure. The previous chapter described the debugging session as an affordance cascade — a sequence of actions, each revealing new affordances that shaped the next action. The cascade was a form of active perceptual exploration of the system's state space, and the exploration deposited perceptual sensitivity: the ability to detect the causal patterns that produce specific categories of failure. When the AI handles diagnosis, the cascade does not occur. The error is identified. The fix is proposed. The causal structure that produced the error — the specific sequence of states, the specific interaction between components, the specific boundary condition that was violated — is available in the AI's explanation but is not discovered through the builder's own exploratory action. The information is delivered rather than detected. And delivered information, in Gibson's framework, does not produce the same perceptual differentiation as detected information, because the detection process is what tunes the perceptual system.
The dependency affordance: the affordance of managing the relationships between components, libraries, and systems. Dependency management was perhaps the least loved aspect of the pre-AI builder's work. It was tedious, error-prone, and frequently consumed hours that the builder would have preferred to spend on substantive design. But it afforded something that its tediousness obscured: an encounter with the system's relational structure. The builder who resolved a dependency conflict discovered, through the resistance of the resolution process, how the components of her system related to each other — which depended on which, where the coupling was tight and where it was loose, where a change in one component would cascade through others. This relational knowledge was perceptual, not propositional. It was the feel of the system's architecture, built through the specific friction of manipulating its interconnections.
The documentation affordance: the affordance of studying a system's intended behavior through written descriptions. Documentation afforded slow, friction-rich learning — the kind of engagement where understanding developed through the gap between what the documentation described and what the builder discovered when she actually used the system. That gap was informative. It specified the difference between the system's design and its reality, and navigating that gap built the perceptual sensitivity to distinguish between how systems are supposed to work and how they actually work. When the AI summarizes documentation and applies it directly, the gap disappears — and with it, the perceptual information the gap contained.
Each of these disappearances is, taken individually, a gain in efficiency and a loss in perceptual development. The gain is measurable. The loss is not, because the loss is in the currency of perceptual differentiation, which is not a quantity that shows up in productivity metrics or sprint velocity or lines of code shipped. The loss shows up later, as a subtle degradation in the quality of judgment that the builder brings to higher-order decisions — decisions that depend on the kind of deep, intuition-like sensitivity that only years of friction-rich engagement can produce.
There is a temptation to argue that the lost implementation affordances were not all developmental — that much of the friction was merely tedious, that hours spent on dependency management and boilerplate configuration served no perceptual purpose. The argument has merit. Not all friction is informative. A rock in a hiking path affords a stumble; a different rock on the same path affords a handhold. The ecologist does not claim that all features of an environment are equally valuable. She claims that the affordance structure of an environment is a system, and that removing elements of that system produces consequences that are not predictable from the properties of the removed elements alone.
The Yellowstone wolves were not, taken individually, the most important species in the ecosystem. They were a regulatory affordance — their presence constrained elk behavior in ways that maintained a cascade of other affordances throughout the system. The implementation affordances of the pre-AI building environment may have served a similar regulatory function. The tedium of dependency management constrained the builder's pace in ways that created time for reflection. The friction of syntactic correctness constrained the builder's expression in ways that forced precision of thought. The frustration of debugging constrained the builder's confidence in ways that maintained epistemic humility. Remove the constraints, and the behavior they regulated changes in ways the builder did not anticipate and may not notice until the downstream consequences have become entrenched.
Gibson's student and intellectual successor Edward Reed observed that the ecological approach to perception has an inherent conservatism — not in the political sense, but in the biological sense. Ecological systems that have been stable for long periods contain affordance structures that have been tested by the organisms inhabiting them over many generations. Rapid restructuring of those affordance structures eliminates tested relationships and replaces them with untested ones. The new relationships may prove more productive than the old. They may also produce cascading failures that the old relationships prevented. The point is not that change is bad. The point is that rapid ecological restructuring carries informational risk — the risk that the system's new affordance structure will lack regulatory relationships that the old structure provided invisibly.
The AI-augmented building environment is a rapidly restructured ecology. The implementation affordances that regulated the builder's behavior — that constrained pace, forced precision, maintained humility, and deposited perceptual differentiation as a side effect of productive work — have been extirpated from the dominant affordance structure. What has taken their place is the subject of the next chapter. What the extirpation means for the builders whose perceptual systems were shaped by the old structure — and for the builders who will develop their perceptual systems entirely within the new one — is the question that Gibson's framework renders most urgent and most difficult to answer.
The wolves are gone from the valley. The elk have stopped moving. The question is what happens to the riverbanks.
---
The ecologist who catalogs what has disappeared must also catalog what has appeared. Ecological restructuring is not merely subtraction. New surfaces emerge. New textures appear. New action possibilities present themselves to organisms capable of detecting them. The flood that restructured the riverbed also deposited new sediment, created new pools, exposed new mineral formations. The habitat is different. It is not empty.
The AI-augmented building environment contains affordances that have no precedent in the pre-AI ecology. These affordances cluster around a specific cognitive activity that Gibson's framework illuminates with particular clarity: the activity of direction — the perception and specification of where to go rather than how to get there.
Gibson distinguished, in his analysis of locomotion, between two kinds of perceptual information that guide an organism's movement through the environment. The first is the information that specifies the terrain — the texture of the ground, the slope of the surface, the firmness of the substrate. This information guides the mechanics of movement: where to place the foot, how much force to apply, when to shift weight. The second is the information that specifies the destination — the path through the terrain, the gap in the barrier, the clearing beyond the thicket. This information guides the strategy of movement: which direction to go, which obstacles to avoid, which affordances to exploit.
In the pre-AI building environment, the builder's perceptual bandwidth was dominated by terrain information. The syntax, the frameworks, the dependencies, the debugging — all of this was terrain. The builder spent the majority of her perceptual effort on the mechanics of each step: how to write this function, how to resolve this conflict, how to handle this edge case. The destination information — what the system should do, who it should serve, what architectural strategy would best support its evolution — was available but perceptually recessive. The terrain occupied the foreground. The destination occupied the background. The builder could see the path, but only intermittently, in the gaps between the steps.
The AI-augmented environment reverses the figure-ground relationship. When the AI handles the terrain — the syntax, the implementation, the mechanical details of each step — the builder's perceptual bandwidth is freed for destination information. The path through the problem space, which was previously glimpsed in fragments between implementation tasks, becomes the primary object of perception. The builder sees the whole route. She perceives the strategic affordances of the problem — which approaches lead to dead ends, which open onto broader possibilities, which serve the user's actual need rather than the developer's convenience — with a clarity that was impossible when the terrain consumed her attention.
This is not a metaphor forced onto Gibson's framework. It is a direct application of his analysis of perceptual attention. Gibson argued that the organism's perceptual system is always directed toward some aspects of the ambient array and away from others, and that this directionality is shaped by the organism's current activity. The organism that is navigating rough terrain attends to the ground. The organism that is scanning for a destination attends to the horizon. Both kinds of information are available in the ambient array simultaneously. The organism's activity determines which information is picked up.
In the pre-AI environment, the builder's activity was predominantly implementation. Implementation directed her perceptual attention toward the terrain. She attended to syntax, to types, to control flow, to the specific textures of the code she was producing. Destination information — the strategic, architectural, user-facing aspects of the problem — was available but unattended, because her activity did not direct her perceptual system toward it. She perceived the destination when she deliberately paused to think strategically, but these pauses were interruptions of her primary activity, not continuations of it.
In the AI-augmented environment, the builder's activity is predominantly direction. Direction directs her perceptual attention toward the destination. She attends to what the system should do, who it should serve, how it should evolve. The terrain information is available — the AI's output contains it — but the builder does not need to attend to it with the same granularity. Her activity has shifted, and her perception has shifted with it.
Gibson would call this a reorientation of the perceptual system. The organism has not become more intelligent or less intelligent. It has become differently oriented — attending to different information in the same ambient array, perceiving different affordances in the same environment. The reorientation is genuine. The question is whether it produces more capable organisms or merely differently capable ones.
The directional affordances that have emerged in the AI-augmented environment can be specified with some precision. The evaluation affordance: the environment affords the rapid evaluation of multiple candidate solutions. The builder describes an intention, receives an implementation, evaluates it against her criteria, and either accepts, rejects, or refines. This cycle can be repeated many times in the span that a single implementation would have consumed in the pre-AI environment. The evaluation affordance transforms the builder from a producer of single solutions into a curator of multiple candidates — and curation requires a different perceptual sensitivity than production. The curator must perceive the quality differential between candidates, which requires criteria that are not purely technical. The candidates may all be technically correct. The differentiation is in fitness for purpose, in elegance, in the subtle alignment between the solution and the user's actual need. This is a perceptual skill, but it is a different perceptual skill from the one that implementation developed.
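The accept-reject-refine cycle described above can be sketched as a toy curation loop. Everything here is invented for illustration — the candidates, the scoring fields, and the `curate` helper are hypothetical — but the sketch makes the structural point: technical correctness is only the floor, and the real differentiation happens in criteria like fitness for purpose.

```python
# A toy sketch of the evaluation affordance: the builder as curator,
# ranking multiple candidate solutions against criteria that are not
# purely technical. Candidates and criteria are invented for illustration.

candidates = [
    {"name": "A", "correct": True,  "fits_user_need": 0.9, "elegance": 0.6},
    {"name": "B", "correct": True,  "fits_user_need": 0.5, "elegance": 0.9},
    {"name": "C", "correct": False, "fits_user_need": 0.8, "elegance": 0.8},
]

def curate(options):
    """Reject incorrect candidates, then rank the rest by fitness for purpose."""
    viable = [c for c in options if c["correct"]]  # correctness is the floor, not the goal
    # The perceptual skill lives above the floor: fitness first, elegance second.
    return max(viable, key=lambda c: (c["fits_user_need"], c["elegance"]))

print(curate(candidates)["name"])  # "A": all else equal, fitness for purpose wins
```

The sketch also shows why curation presupposes criteria: the `key` function is doing all the work, and it encodes exactly the evaluative sensitivity that, the chapter argues, must come from somewhere.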
The compositional affordance: the environment affords the combination of components at a level of abstraction that was previously inaccessible. The builder can specify, in natural language, the desired behavior of a complete system — frontend, backend, data layer, API, authentication — and receive a working composition of components that implements that behavior. The compositional affordance transforms the builder's perceptual engagement from the mechanics of individual components to the architecture of the whole system. She perceives the system as a composition of capabilities rather than a construction of parts. This is a genuine perceptual gain — the ability to see the forest rather than being lost among the trees.
The questioning affordance: the environment affords the asking of questions that the builder could not previously ask productively. In the pre-AI environment, the builder could ask, "How do I implement a real-time notification system?" and the answer was: spend three weeks learning WebSocket protocols, message queuing, and pub/sub architectures. The cost of the answer foreclosed the question for anyone who did not have three weeks to invest. In the AI-augmented environment, the same builder can ask the question and receive a working implementation in hours, which means the question becomes productive for a much larger population of builders. The affordance for productive questioning has expanded dramatically, and with it, the space of problems that a given builder can meaningfully explore.
Each of these directional affordances represents a genuine enrichment of the builder's environment. The builder who can evaluate multiple candidates rapidly is better positioned to find the best solution than the builder who commits to a single implementation. The builder who can compose at the system level is better positioned to see architectural possibilities than the builder who constructs component by component. The builder who can ask questions across domains is better positioned to find cross-cutting solutions than the builder who is confined to a single specialty.
But Gibson's framework identifies a specific fragility in the directional affordances that the celebratory accounts do not address. Directional affordances depend on the organism's capacity to perceive them — and that capacity is not automatic. The evaluation affordance is valuable only to the builder who possesses the criteria to evaluate. The compositional affordance is valuable only to the builder who perceives the architectural implications of different compositions. The questioning affordance is valuable only to the builder who knows which questions are worth asking.
These capacities — evaluative criteria, architectural perception, question judgment — were developed in the pre-AI environment through the very implementation affordances that have been extirpated. The builder who can evaluate a candidate solution's fitness for purpose developed that evaluative sensitivity through years of producing solutions that were fit and unfit and learning, through the resistance of deployment and user response, what fitness felt like. The builder who perceives architectural implications developed that perception through years of building and maintaining architectures that succeeded and failed. The builder who knows which questions are worth asking developed that judgment through years of asking questions that turned out to be productive and unproductive, and learning, through the friction of each outcome, how to distinguish them.
The directional affordances of the AI-augmented environment are real and valuable. They are also, in a specific ecological sense, dependent on a perceptual foundation that was built in the environment they replaced. The experienced builder who enters the AI-augmented environment brings the perceptual differentiations of the old environment with her — the evaluative sensitivity, the architectural perception, the question judgment — and these differentiations make the new affordances maximally useful. The novice who enters the AI-augmented environment without those differentiations encounters the same affordances, but cannot exploit them with the same depth, because the perception required to exploit them has not been developed.
This is the ecological asymmetry at the heart of the AI transition. The new affordances are most valuable to the organisms whose perceptual systems were shaped by the old ones. And the old affordances, which shaped those perceptual systems, are no longer available in the same form to the new generation of organisms entering the environment.
The destination is visible. The terrain has been cleared. The question is whether the next generation of builders, who never navigated the terrain, will possess the perceptual sensitivity to choose the right destination — or whether they will move with speed and confidence toward destinations they lack the differentiation to evaluate.
---
Gibson spent decades studying surfaces. Not as a metaphor. As the fundamental unit of visual perception.
The central claim of *The Ecological Approach to Visual Perception* is that the ambient optic array — the structured light that converges on any point in the environment from every direction — is structured primarily by surfaces. Surfaces reflect light. They texture the light with information about their composition, their orientation, their distance from the observer, their material properties. A surface covered with fine grain — sand, for instance — produces a texture gradient in the ambient array that specifies the surface's recession into distance with mathematical precision. The grains appear larger in the near field and smaller in the far field, and the rate of change specifies the surface's angle relative to the observer's line of sight. This information is in the light. The organism does not compute it. The organism detects it.
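The "mathematical precision" of the texture gradient can be made concrete with simple perspective geometry. The numbers and the `projected_size` helper below are a hypothetical sketch, not Gibson's own formalism: for grains of equal physical size on a receding surface, projected size falls off with distance, and the rate of that falloff is the gradient that specifies recession.

```python
# A numeric sketch of the texture-gradient claim, using pinhole-camera
# perspective: equal-sized grains project smaller with distance, and the
# lawful rate of shrinkage carries the distance information.

def projected_size(grain_size, distance, focal_length=1.0):
    """Projected size of a grain under simple perspective projection."""
    return focal_length * grain_size / distance

# Grains of identical 1 cm physical size at 2 m, 4 m, and 8 m.
sizes = [projected_size(0.01, d) for d in (2.0, 4.0, 8.0)]
print(sizes)  # each doubling of distance halves the projected size
```

The invariant — the constant ratio between successive projected sizes for a regularly textured surface — is precisely the kind of structure Gibson argued is detected rather than computed.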
The richness of the information available for perceptual pickup depends directly on the richness of the surface texture. A highly textured surface — tree bark, woven fabric, rough stone — produces a dense information structure in the ambient optic array. Distance, orientation, composition, curvature, the boundary between one surface and another: all specified by the texture gradients, the contrast patterns, the ways that texture changes as the observer moves. A smooth surface — polished glass, still water, sanded steel — produces an impoverished information structure. The absence of texture means the absence of gradients, the absence of specified distance and orientation and composition. The smooth surface reflects the light without structuring it. It gives the eye almost nothing to detect.
Gibson was not evaluating surfaces morally. He was not arguing that texture is good and smoothness is bad. He was making a perceptual claim with the precision of a physicist: texture carries information, smoothness does not. An environment rich in textured surfaces is an environment rich in perceptual information. An environment dominated by smooth surfaces is an environment in which the perceptual system has less to work with — less to detect, less to differentiate, less to learn from.
When the philosopher Byung-Chul Han diagnoses the contemporary world's obsession with smoothness — the iPhone's featureless glass, the airbrushed face, the frictionless checkout, the seamless onboarding — he is making a cultural and aesthetic argument. When Gibson's framework is applied to the same observation, the argument becomes perceptual and ecological. Smoothness is not merely an aesthetic preference. It is a restructuring of the information environment. A culture that systematically smooths its surfaces systematically reduces the perceptual information available to the organisms that inhabit it.
The AI-augmented building environment is, in specific and measurable ways, smoother than the environment it replaced.
Consider the surface of a codebase. In the pre-AI environment, the codebase was a textured surface of extraordinary richness. Its naming conventions specified the builders' conceptual models. Its comment patterns specified the history of modification. Its indentation and whitespace specified the hierarchical structure of the logic. Its inconsistencies — the function that was named differently from its neighbors, the module that used a different error-handling pattern — specified the codebase's developmental history, the places where different builders with different habits had worked, the seams where refactoring had been attempted and abandoned. Each of these textures carried perceptual information. The experienced builder reading a codebase was not merely comprehending its logic. She was perceiving its history, its architecture, its fragilities, its personalities — all specified by the textures of the code.
AI-generated code is smoother. It follows consistent naming conventions. It uses uniform error-handling patterns. It maintains regular indentation and formatting. Its comments, when present, are descriptive but generic. It lacks the irregularities, the personal signatures, the accumulated historical texture of code written by multiple humans over time. The code is correct. It is often elegant. It is also informationally impoverished compared to the textured code it replaces.
The builder who reads AI-generated code encounters fewer texture gradients. Fewer irregularities to notice. Fewer seams to inspect. Fewer places where the surface's resistance to the eye produces the specific perceptual engagement that builds architectural intuition. The code works. The code does not tell the builder, through its textures, what it has been through. And what the code has been through — the decisions, the compromises, the failures, the refactorings — was precisely the information that the experienced builder detected in the texture of human-written code and that informed her judgment about the system's reliability, its maintainability, its readiness for modification.
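The contrast between textured and smooth code can be shown directly. The following is a hypothetical illustration — neither version comes from a real codebase, and all names, comments, and history in it are invented. Both functions implement the same cached lookup; only one carries information about what the code has been through.

```python
# Hypothetical illustration: the same cache-lookup logic, written twice.
# Version 1 has human-accumulated texture; version 2 is machine-smooth.

_user_cache = {}

def fetch_profile(user_id):
    """Stand-in for a network call in this sketch."""
    return {"id": user_id, "name": f"user-{user_id}"}

# Version 1: textured. The camelCase survivor, the dated comment, and the
# defensive workaround all specify the codebase's history to a reader.
def getUser(uid):  # naming left over from the 2019 Java port
    # NOTE(2021): cache added after the Black Friday outage --
    # do not remove without load-testing the profile service.
    if uid in _user_cache:
        return _user_cache[uid]
    profile = fetch_profile(uid)
    if profile is None:
        profile = {"id": uid, "name": "unknown"}  # legacy callers choke on None
    _user_cache[uid] = profile
    return profile

# Version 2: smooth. Correct, consistent, elegant -- and informationally
# mute about how the system got this way.
def get_user(user_id):
    """Return the cached user profile, fetching it if absent."""
    if user_id not in _user_cache:
        _user_cache[user_id] = fetch_profile(user_id) or {"id": user_id, "name": "unknown"}
    return _user_cache[user_id]

print(getUser(1) == get_user(1))  # True: identical behavior, different texture
```

The two versions are behaviorally equivalent. The difference is entirely in what a reader can detect from their surfaces.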
The error surface has been smoothed as well. In the pre-AI environment, errors were textured with specificity. The segmentation fault specified a memory access violation with a distinctive pattern. The null pointer exception specified a reference failure at a particular point in the execution. The stack overflow specified a recursion depth that exceeded the system's capacity. Each error had a texture — a characteristic appearance, a typical context, a usual set of causes — and the experienced builder had learned to read these textures the way a tracker reads animal signs. The shape of the error specified the kind of failure, and the kind of failure narrowed the diagnostic search space.
When the AI handles errors, the builder encounters the resolution rather than the error. The texture of the error — its specific character, its diagnostic information, its causal signature — is processed by the AI and delivered as a fix. The fix works. The builder did not engage with the error's texture. The perceptual differentiation that comes from hundreds of encounters with that specific category of error does not occur. The smooth surface of the resolution tells the builder that the problem is solved. It does not tell the builder what the problem was, in the embodied, textured, perceptually differentiated sense that direct engagement would have provided.
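The "texture" of errors the preceding paragraphs describe can be made literal. The triggering functions below are invented for illustration, but the exception types are standard Python, and each failure mode announces itself with a distinct signature that narrows the diagnostic search space — exactly the grain that a delivered fix hides.

```python
# A sketch of error texture: three distinct failure modes, each with a
# characteristic signature. The functions are hypothetical; the
# exception types are standard Python analogs of the errors named above.

def dereference_missing(record):
    # Python's analog of the null pointer: operating on None.
    return record.name

def recurse_forever(n):
    # The stack overflow's texture: a recursion depth the system cannot hold.
    return recurse_forever(n + 1)

def read_past_end(items):
    # The out-of-bounds access: the error names the failing index.
    return items[len(items)]

signatures = {}
for label, trigger in [
    ("null-reference", lambda: dereference_missing(None)),
    ("stack-overflow", lambda: recurse_forever(0)),
    ("out-of-bounds", lambda: read_past_end([1, 2, 3])),
]:
    try:
        trigger()
    except Exception as exc:
        # Each error's type and message carry specific diagnostic grain.
        signatures[label] = type(exc).__name__

print(signatures)
# {'null-reference': 'AttributeError', 'stack-overflow': 'RecursionError',
#  'out-of-bounds': 'IndexError'}
```

Each entry in `signatures` is the kind of differentiated surface the builder once read hundreds of times; a pre-applied fix delivers none of it.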
The design surface has been smoothed. AI-generated interfaces follow established patterns with high consistency. They are professional, polished, and undifferentiated. They lack the specific rough edges — the slightly unconventional layout, the unexpected interaction pattern, the design decision that makes the user pause and think — that textured designs possess. Textured designs carry information about the designer's specific vision, about the tradeoffs she made, about the places where the standard pattern was insufficient and she improvised. Smooth designs carry less of this information. They are adequate. They are unremarkable. They specify competence without specifying personality.
Gibson would not lament these smoothings in the way that Han laments them. Gibson's framework is descriptive, not prescriptive. But the framework's description has prescriptive implications. If perception depends on texture, and if the builder's environment is becoming systematically smoother, then the builder's perceptual system is receiving less information from the environment. Not less data — the AI provides enormous quantities of data. Less perceptual information in Gibson's specific sense: less structure in the ambient array that the perceptual system can detect and differentiate.
The distinction between data and perceptual information is critical here, and it is the distinction that most discussions of AI in the workplace miss entirely. Data is quantitative. It can be transmitted, stored, processed, and retrieved. Perceptual information, in Gibson's sense, is structural. It is the pattern in the ambient array that specifies the affordances of the environment. Data can be delivered by a machine. Perceptual information must be detected by an organism, because the detection process is what tunes the organism's perceptual system to the specific structure being detected.
The AI provides more data. It provides less perceptual information, in the Gibsonian sense, because the surfaces it creates and the interactions it mediates are smoother than the surfaces and interactions they replace. The builder is better informed and less perceptually developed. She knows more and sees less. This is not a paradox. It is the predictable consequence of an environment that delivers information rather than affording its detection.
The implications extend beyond individual builders to the culture of building itself. A codebase that consists entirely of AI-generated code is a smooth surface at the organizational level. It lacks the textures — the irregularities, the personal signatures, the historical accumulations — that specify the organization's developmental history. A new builder joining such a codebase encounters a surface that is efficient, correct, and perceptually uninformative. She cannot read the codebase's history in its textures, because the textures have been smoothed away. She cannot perceive the decisions, the tradeoffs, the moments of architectural courage or cowardice that shaped the system, because those moments are not specified in the smooth surface of AI-generated code. The codebase is legible but not perceptually rich. It can be read but not felt.
Gibson's framework predicts that builders who develop exclusively in smooth environments will develop perceptual systems tuned to smooth surfaces — systems that are sensitive to the affordances that smooth environments offer (specification, iteration, composition) and insensitive to the affordances that textured environments offered (diagnosis, architectural intuition, the detection of historical and structural subtleties). These builders will not be less competent. They will be differently competent — tuned to a different affordance structure, sensitive to different information, capable of different perceptions.
Whether this difference constitutes a loss depends on what the future demands. If the future demands only the perceptual sensitivities that smooth environments develop — the ability to specify, to iterate, to compose — then the smoothing is costless. If the future demands, even occasionally, the perceptual sensitivities that only textured environments develop — the ability to diagnose novel failures, to detect architectural fragility, to feel the structural integrity of a system through engagement with its surfaces — then the smoothing carries a cost that will not be visible until the moment the missing sensitivity is needed and is not there.
The ecologist does not speculate about which scenario will obtain. She catalogs what the environment offers, what the organisms can detect, and where the gaps between offering and detection are likely to produce failure. The gap between the smooth AI-augmented environment and the textured perceptual sensitivities that some futures will demand is the most consequential gap in the ecology of the contemporary builder's world.
---
The ambient optic array is the totality of structured light that converges on any point in the environment. Gibson insisted on this formulation — the totality, not a sample — because the central error of the traditional theory was to begin with the retinal image, which is a sample, and to treat perception as the problem of reconstructing the world from an impoverished fragment. The world is not impoverished. The retinal image is impoverished only because it is a sample extracted from a rich array. Restore the organism to the full array — let it move its eyes, turn its head, walk through the environment — and the impoverishment disappears. The information is there. All of it. The organism's task is not construction but exploration: moving through the array, sampling different aspects of its structure, picking up the invariants that persist across transformations.
The richness of the ambient array is not uniform. Some environments present richer arrays than others. A dense forest presents an array of extraordinary structural complexity: overlapping surfaces at multiple distances, texture gradients that specify the curvature and distance of every trunk and branch, occlusion patterns that specify depth relationships between hundreds of objects, color gradients that specify the direction and quality of the light source. The organism in such an environment is immersed in information. Its perceptual system has more than it can process at any given moment, and the organism's exploratory behavior — where it looks, how it moves, which aspects of the array it samples — determines which of the available information is actually picked up.
A bare room with white walls presents an impoverished array. The surfaces are undifferentiated. The texture gradients are minimal. The occlusion patterns are simple. The organism in such an environment has less to detect, which means its perceptual system has less to work with, which means the exploratory behavior that drives perceptual learning has less to engage.
The information environment of the AI-augmented builder presents a paradox that Gibson's framework is uniquely equipped to describe. The environment is simultaneously richer and poorer than the one it replaced — richer in delivered information, poorer in information available for perceptual pickup. Understanding this paradox requires distinguishing between two kinds of informational richness that are routinely conflated.
The first kind of richness is content richness — the quantity and diversity of information that the environment makes available. By this measure, the AI-augmented environment is unprecedentedly rich. The builder who works with Claude has access to information drawn from the entire corpus of human technical knowledge. She can ask about any framework, any language, any architectural pattern, any algorithm, any design principle, and receive a response that synthesizes relevant knowledge with a breadth that no individual human expert could match. The content richness of the AI-augmented environment exceeds the content richness of the pre-AI environment by orders of magnitude.
The second kind of richness is structural richness — the degree to which the information is structured in a way that affords perceptual pickup through active exploration. By this measure, the AI-augmented environment is, in specific and consequential ways, poorer than the one it replaced.
Structural richness, in Gibson's framework, is a property of the ambient array, not of the information source. A library contains a vast amount of information, but the ambient optic array of a library is structured by the spines of books on shelves — a relatively impoverished perceptual environment. The same information, encountered through active engagement with the systems it describes — through building, debugging, deploying, maintaining — is structured by the rich, textured surfaces of those systems. The builder who encounters information about WebSocket protocols by reading documentation is in a content-rich, structurally poor information environment. The builder who encounters the same information by implementing a WebSocket connection, running into the specific errors that the protocol's constraints produce, and resolving those errors through systematic exploration is in a content-identical, structurally rich environment.
The AI-augmented environment collapses this distinction. The builder asks about WebSocket protocols. The AI provides an explanation and a working implementation. The content is delivered. The structural engagement — the active exploration of the protocol's constraints, the encounter with its specific resistances, the perceptual pickup of its affordances through trial and error — does not occur. The builder receives the content without undergoing the perceptual process through which the content would become structurally integrated into her understanding.
Gibson's research on exploratory behavior clarifies why this matters. He argued that the organism does not passively receive information from the ambient array. It actively explores — moving through the environment, manipulating objects, generating new samples of the array through its own actions. Each exploratory action produces a new sample, and the invariants that persist across samples are the information that specifies the environment's affordances. The organism learns to perceive by acting, and the quality of its perception depends on the quality of its exploration.
In the pre-AI building environment, the builder's exploratory behavior was rich and varied. She wrote code and observed the compiler's response. She modified a function and observed the system's behavior change. She deployed to a test environment and observed how the system performed under conditions she had not anticipated. Each of these actions produced a new sample of the system's information structure, and the invariants that persisted across samples — the patterns of behavior that remained stable despite changes in input, the architectural constraints that manifested regardless of the specific feature being built — were the perceptual information that built her expertise.
In the AI-augmented environment, the builder's exploratory behavior has changed. She describes an intention and observes the AI's response. She evaluates the response and describes a refinement. She observes the refined response. The cycle is rapid and productive, but the character of the exploration has shifted. The builder is exploring the AI's response space rather than the system's affordance space. She is learning what the AI produces in response to different specifications, which is not the same as learning what the system affords in response to different actions.
This is a subtle but consequential distinction. The AI's response space is a processed, smoothed, pre-structured representation of the system's affordance space. The invariants that persist across the AI's responses are the patterns that the AI's training has identified and reproduced — which are, by construction, the most common, most well-documented, most conventionally understood patterns. The less common patterns — the unusual failure modes, the unconventional architectural possibilities, the edge cases that only direct engagement with the system's raw affordance structure would reveal — are underrepresented in the AI's response space because they are underrepresented in the training data.
The builder who explores the AI's response space develops a perceptual sensitivity to that response space — she learns to predict what the AI will produce, to specify her intentions in ways that elicit the best responses, to evaluate the AI's output efficiently. This is a genuine perceptual skill, and it is valuable. But it is a perceptual sensitivity to a mediated, smoothed, pre-processed representation of the system, not a perceptual sensitivity to the system itself. The distinction matters when the builder encounters a situation that the AI's response space does not adequately represent — a novel failure mode, an unconventional requirement, a system behavior that falls outside the patterns the AI has been trained to reproduce.
Gibson's framework for understanding the information in event structures becomes particularly relevant here. Gibson argued that events — changes in the environment over time — carry information that static arrays do not. The event of a surface deforming under pressure specifies the surface's elasticity. The event of an object falling specifies its weight relative to the medium it falls through. The event of a system failing under load specifies the system's capacity and the nature of its limitations. Events are informationally rich because they reveal the dynamic properties of the environment — the properties that static observation cannot detect.
In the pre-AI building environment, the builder regularly encountered events: the system crashed, the deployment failed, the user reported unexpected behavior, the database hit its capacity limit. Each event was informationally rich. It specified, through its particular character, something about the system's dynamic properties that no static description could convey. The builder who had experienced a cascade failure — who had watched a single point of failure propagate through a system, taking down service after service in a sequence that revealed the system's hidden dependencies — had picked up perceptual information about distributed systems that no documentation could provide. The event itself was the information source.
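The informational richness of a cascade failure — the way the sequence of the collapse reveals dependencies that no static description names — can be sketched with a toy simulation. The service graph below is entirely invented; the point is that the failure *order* is itself the information.

```python
# A toy simulation of a cascade failure: one service goes down, and the
# failure propagates along dependency edges. The order of the collapse
# reveals the system's hidden coupling. The graph is invented.

# Each service maps to the services that depend on it (its dependents).
dependents = {
    "database": ["auth", "billing"],
    "auth": ["api"],
    "billing": ["api"],
    "api": ["web", "mobile"],
    "web": [],
    "mobile": [],
}

def cascade(root):
    """Return the order in which services fail after `root` goes down."""
    failed, order, frontier = set(), [], [root]
    while frontier:
        service = frontier.pop(0)  # breadth-first propagation
        if service in failed:
            continue
        failed.add(service)
        order.append(service)
        frontier.extend(dependents[service])
    return order

print(cascade("database"))
# ['database', 'auth', 'billing', 'api', 'web', 'mobile']
```

Watching `api` fall only after both `auth` and `billing` is the event-structure information Gibson describes: a dynamic property of the system, detectable only when the event occurs.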
In the AI-augmented environment, the builder encounters fewer events of this kind, because the AI prevents many of the failures that would produce them. The code that Claude generates is less likely to contain the errors that produce cascade failures, memory leaks, race conditions, and the other dynamic pathologies that the pre-AI builder encountered regularly. This is an improvement in system quality. It is also a reduction in the informational richness of the builder's experience, because the events that the AI prevents were sources of perceptual information about the system's dynamic properties.
The builder in the AI-augmented environment inhabits an ambient array that is content-rich and structurally impoverished. She has access to more information than any builder in history. She has less opportunity to pick it up through the active, embodied, exploratory engagement that Gibson identified as the mechanism of perceptual learning. The information is delivered to her. It is not detected by her. And the distinction between delivered information and detected information is, in Gibson's framework, the distinction between knowledge and perception — between knowing what the system does and seeing what the system affords.
The framework does not conclude from this analysis that the AI-augmented environment is worse than the one it replaced. It concludes that the two environments develop different perceptual capacities. The pre-AI environment developed the capacity to perceive through struggle — to detect the affordance structure of complex systems through direct, friction-rich, event-laden engagement. The AI-augmented environment develops the capacity to perceive through evaluation — to detect the quality differential between candidate solutions, to perceive the fitness of an implementation for a given purpose, to read the AI's output with the critical sensitivity that comes from knowing what the system needs.
Both capacities are real. Both are valuable. The ecological question is whether an organism can develop the second without first having developed the first — whether evaluation can be learned in the absence of the direct engagement that teaches the evaluator what to look for. Gibson's research on perceptual learning, and Eleanor Gibson's extensive work on the education of attention, suggest that perceptual differentiation is cumulative and directional: coarse distinctions are learned first, and fine distinctions are learned on the foundation of the coarse ones. The builder who has not learned to detect architectural affordances through direct implementation engagement may lack the coarse perceptual categories on which the fine evaluative sensitivity depends.
The ambient array is rich. It has always been rich. The organism's relationship to that richness — whether it explores and detects or receives and reviews — determines what the richness produces. The AI-augmented environment offers the richest information content in the history of building. It offers a structurally smoother, less explorable perceptual environment than the one it replaced. The builder who navigates this paradox — who uses the AI's content richness to extend her perception while maintaining the active, exploratory stance that structural richness demands — is the builder for whom the new ecology is genuinely richer than the old. The builder who surrenders exploration for reception is the builder for whom the richest information environment in history produces the most impoverished perceptual development.
The paradox does not resolve. It describes the condition. What the builder does with the condition — how she structures her engagement, what dams she builds against the current of smooth reception, what textures she deliberately introduces into her practice — is the subject of the remaining chapters.
The organism that needs food and the organism that is learning to find food are engaged in what appear to be the same activity. Both move through the environment. Both orient toward the same class of objects. Both execute sequences of action that result in the acquisition of sustenance. An observer watching from outside — measuring the distance traveled, the calories consumed, the time elapsed between the onset of hunger and the satisfaction of it — would record identical data for both organisms, provided the learner eventually succeeded.
The data would miss everything that mattered.
The organism that already knows how to find food is exploiting affordances it has previously learned to detect. Its perceptual system is differentiated to the point where the relevant environmental features — the color of ripe fruit, the texture of soil that conceals tubers, the pattern of broken branches that indicates the passage of prey — are detected rapidly and reliably. Its movements are efficient. It goes where the food is. The activity is productive: it generates output (consumed food) at a rate that sustains the organism.
The organism that is learning to find food is doing something categorically different, even though the external behavior looks similar. It is exploring. Its perceptual system is not yet differentiated to detect the relevant features. It does not know which colors indicate ripeness, which soil textures conceal food, which patterns of disturbance are diagnostic. Its movements are inefficient — not because it is incompetent, but because inefficiency is the mechanism of perceptual learning. The organism that goes directly to the food does not learn anything about the environment that it did not already know. The organism that wanders, that encounters dead ends, that digs in soil that conceals nothing, that follows trails that lead nowhere — that organism is sampling the environment's affordance structure with the breadth necessary to build the perceptual differentiations that will eventually make its behavior efficient.
Learning requires waste. This is not a moral claim. It is an ecological one. The organism that produces efficiently has already learned. The organism that is learning cannot yet produce efficiently, and the inefficiency is not a failure of the learning process. It is the learning process. The wasted effort, the wrong turns, the failed attempts — these are the exploratory actions that generate the perceptual samples from which invariants are detected and affordance sensitivity is built.
Eleanor Gibson spent decades studying this process in human infants and children. Her research on perceptual learning — published in Principles of Perceptual Learning and Development (1969) and in earlier papers co-authored with her husband — established that perceptual development proceeds through progressive differentiation. The child does not begin with a blank slate and accumulate representations. The child begins with an undifferentiated perceptual field and learns to make finer and finer distinctions within it. The infant who cannot distinguish between the phonemes of her native language becomes, through months of active listening, a child who detects those distinctions effortlessly. The learning is not the addition of new information. It is the education of attention — the tuning of the perceptual system to detect structure that was always present in the ambient array but that the undifferentiated system could not pick up.
The education of attention requires engagement with the structure to be detected. The infant learns to distinguish phonemes by hearing speech — not by hearing a description of how phonemes differ, but by encountering the phonemes themselves, repeatedly, in varied contexts, until the perceptual system has extracted the invariant features that distinguish one from another. The learning is driven by the encounter, not by the explanation. And the encounters must be active — the infant does not learn passively from ambient noise but actively from communicative exchanges in which the phonemes are embedded in the rich, textured, meaningful context of human interaction.
The parallel to the AI-augmented building environment is immediate and uncomfortable.
The pre-AI building environment afforded both learning and producing, simultaneously and inextricably. The builder who debugged a race condition was producing, in that she was fixing a bug, and learning, in that her perceptual system was being educated to detect the invariant features of race conditions. The two activities were coupled. The learning was a side effect of the production, not a separate activity that the builder had to carve out time for. The environment afforded both because the productive activity required the kind of active, exploratory, friction-rich engagement that drives perceptual differentiation. The builder could not fix the bug without exploring the system's state space, and the exploration that fixed the bug also educated her perception.
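For readers who have never felt the texture of the bug in question, a minimal sketch may help. This is an illustration assuming nothing beyond Python's standard library, with the race window widened artificially so the pathology shows itself on demand rather than once a month in production.

```python
import threading
import time

def run(with_lock: bool) -> int:
    """Ten threads each try to increment a shared counter once."""
    counter = 0
    lock = threading.Lock()

    def increment():
        nonlocal counter
        if with_lock:
            with lock:
                local = counter
                time.sleep(0.01)       # same delay, but protected
                counter = local + 1
        else:
            local = counter            # read
            time.sleep(0.01)           # widened race window
            counter = local + 1        # write: may clobber other threads

    threads = [threading.Thread(target=increment) for _ in range(10)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return counter

print(run(with_lock=True))   # 10: every increment lands
print(run(with_lock=False))  # usually far less than 10: lost updates
```

Encountering the unlocked version in the wild, watching ten increments produce three, is the kind of event the chapter argues educates perception; reading the locked version in a review is not.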
The AI-augmented environment has decoupled learning from producing. The builder can produce without the exploratory engagement that drives perceptual learning, because the AI handles the exploration. The bug is diagnosed. The fix is proposed. The production continues. The perceptual education that the diagnosis would have provided does not occur, because the builder did not perform the diagnosis. She reviewed the AI's output. And reviewing, as established in previous chapters, does not produce the same perceptual differentiation as the active exploration that reviewing replaces.
The decoupling is not absolute. The builder who works with AI still learns — she learns to specify intentions more precisely, to evaluate AI output more critically, to direct the conversation toward more productive outcomes. These are real perceptual skills, and they are developed through the active engagement of working with the tool. But they are skills of direction and evaluation, not skills of implementation and diagnosis. They are developed in the smooth response space of the AI interface, not in the textured affordance space of the system itself.
The ecological question is whether the directional and evaluative skills that the AI-augmented environment develops are sufficient to sustain the builder's overall perceptual competence — or whether they leave gaps that only the implementation and diagnostic skills of the old environment could fill.
Gibson's framework suggests that perceptual development is hierarchical. Coarse distinctions are learned first, and fine distinctions are learned on the foundation of the coarse ones. The child who has not learned to distinguish consonants from vowels cannot learn to distinguish individual consonants. The coarse distinction is the prerequisite for the fine one. The builder who has not learned to detect, through direct engagement with failing systems, the coarse categories of system failure — race conditions, memory leaks, cascade failures, deadlocks — may lack the perceptual foundation on which finer evaluative distinctions depend. The evaluation affordance of the AI-augmented environment requires the builder to perceive whether a candidate solution is fit for purpose, but fitness for purpose is a fine distinction that presupposes coarser distinctions about what kinds of things can go wrong and why. Those coarser distinctions were built through direct engagement with failure — through the debugging sessions, the deployment disasters, the late-night incident responses that the pre-AI environment afforded and the AI-augmented environment increasingly prevents.
The Gibsons' foundational 1955 paper, "Perceptual Learning: Differentiation or Enrichment?", poses the question precisely. Is perceptual learning a process of adding new features to the percept (enrichment), or is it a process of detecting features that were always present but initially went unnoticed (differentiation)? The Gibsons argued for differentiation. The features are in the environment. The organism learns to notice them. But noticing requires engagement — active, embodied, effortful engagement with the environment's structure. The organism that is shielded from the environment's complexity does not fail to learn because it lacks data. It fails to learn because it lacks the encounters that would educate its attention.
The AI-augmented environment does not deprive the builder of data. It deprives the builder of encounters — the specific, friction-rich, failure-laden encounters that drive the progressive differentiation of the perceptual system. The data is abundant. The encounters are scarce. And the perceptual learning that Eleanor Gibson documented throughout her career depends on encounters, not data.
The practical consequences are already visible, less than two years after the threshold event of December 2025. Builders who entered the profession after the AI transition — who have worked exclusively in the AI-augmented environment — display a specific perceptual profile. They are skilled at specification and evaluation. They are capable of producing working systems with impressive speed. They are less capable of diagnosing novel failures, less sensitive to architectural fragility, less attuned to the subtle signs that a system is approaching a boundary condition. These are not deficits of intelligence or effort. They are deficits of perceptual differentiation — the predictable consequence of developing in an environment that affords production without affording the exploratory encounters that build diagnostic sensitivity.
Gibson's framework does not prescribe a solution in the conventional sense. It describes the ecology and asks what the ecology produces. The ecology of the AI-augmented building environment produces organisms that are optimized for production and underequipped for the kinds of perceptual challenges that arise when production is not enough — when the system fails in ways the AI cannot diagnose, when the requirements are genuinely novel, when the architectural decision has no precedent in the training data.
These moments are rare. They are also, by definition, the moments when the builder's perceptual competence matters most — when the quality of the outcome depends not on the speed of production but on the depth of perception. They are the moments when the organism's history of exploratory engagement with the environment, or the absence of that history, becomes visible.
The builder trained exclusively in the AI-augmented environment encounters these moments without the perceptual resources that direct engagement would have provided. She is not incompetent. She is undifferentiated — her perceptual system has not been tuned to detect the invariants that specify the nature of the problem. She sees that something is wrong. She does not see what is wrong, because the what-is-wrong has a texture that her smooth environment never presented for detection.
The tension between learning and producing is not new. Every educational institution navigates it. The apprenticeship model solved it by embedding learning inside production — the apprentice learned by doing the work, and the friction of the work was the mechanism of the learning. The AI-augmented environment dissolves the apprenticeship model by removing the friction that made it work. The apprentice can now produce without doing the work, in the old sense, because the AI does the work. The production looks identical. The learning has not occurred.
Whether this matters depends on whether the perceptual development that friction provided will be needed. Gibson's framework cannot predict the future. It can describe the ecology. And the ecology it describes is one in which the most powerful tools in the history of building are developing builders whose perceptual systems are adapted to those tools — and maladapted to the conditions that arise when the tools are insufficient. The smooth environment produces smooth perception. The question is what happens when the world turns rough.
---
An ecological analysis does not end with prescription. It ends with description — the most accurate description the framework can produce of what the environment actually offers, perceived with the full sensitivity of a perceptual system trained to detect affordances rather than preferences.
Gibson's career was devoted to the proposition that the world is richer than the theories we use to describe it, and that the organism's task is not to construct a theory of the world but to perceive the world directly, in all its textured, structured, informationally dense reality. The theorist who reduces the world to a model loses the world. The perceiver who attends to the world as it actually presents itself gains something that no model can replicate: direct contact with the affordance structure of the real.
The AI-augmented building environment of 2026 is the most complex affordance landscape that human builders have ever inhabited. It is richer than the pre-AI environment in the affordances it offers for direction, evaluation, composition, and cross-domain exploration. It is poorer in the affordances it offers for the friction-rich, event-laden, texturally dense engagement that builds perceptual expertise through struggle. It is simultaneously the most empowering and the most ecologically precarious environment that builders have encountered, and the difference between the empowerment and the precariousness is not a property of the environment alone. It is a property of the relationship between the environment and the organism that inhabits it.
This is Gibson's fundamental insight, and it has never been more relevant than it is in this moment: what the environment offers depends on what the organism brings. The affordance is relational. The same environment that affords extraordinary productive capability to the builder with twenty years of friction-developed perceptual sensitivity may afford a comfortable, productive, and perceptually impoverishing experience to the builder who has never developed that sensitivity. The tool does not determine the outcome. The organism-environment coupling determines the outcome.
A description of the new environment, perceived ecologically, would proceed through the following observations.
First: the environment affords specification with unprecedented directness. The obstruction between the builder's intention and its realization has been removed. This is a genuine ecological gain, comparable to the removal of a physical barrier that had prevented an organism from accessing a rich foraging ground. The ground was always there. The barrier prevented access. The removal of the barrier is an unqualified improvement in the organism's relationship to that resource. The builder who can specify outcomes in natural language has direct perceptual access to a class of problems that was previously accessible only through the mediation of formal languages, and directness, in Gibson's framework, is the foundation of accurate perception.
Second: the environment affords exploratory iteration at a pace that transforms the character of exploration. When each exploratory probe generates a response in seconds rather than hours, the builder can sample the problem space with a density that was previously impossible. This density of sampling increases the likelihood of detecting non-obvious affordances — possibilities that would have been invisible at the coarser sampling rate of pre-AI iteration. The experienced builder using rapid iteration is, in ecological terms, an organism with enhanced exploratory capacity, and enhanced exploration produces enhanced perception. The connections that emerge from this rapid sampling — the insights that neither the builder nor the machine would have produced in isolation — are genuine perceptual discoveries, detected at the intersection of the builder's directed attention and the environment's structured response.
Third: the environment affords cross-domain engagement that was previously foreclosed by the specialization barrier. The builder whose perceptual system was differentiated for one domain can now act productively in adjacent domains, not because her perception has extended but because the AI mediates the gap between her perceptual categories and the affordances of the unfamiliar domain. This mediation is not direct perception in Gibson's strict sense — the builder does not perceive the affordances of the new domain directly but specifies outcomes that the AI translates into domain-appropriate implementations. Nevertheless, the cross-domain engagement produces a kind of compositional perception — the ability to see how different domains relate to each other, how changes in one propagate to others, how the whole system coheres — that specialization previously precluded. Compositional perception is a genuine perceptual development, even if it is not grounded in the domain-specific affordance sensitivity that direct engagement would provide.
Fourth: the environment has become smoother. Surfaces that were textured — codebases with their historical irregularities, error messages with their diagnostic signatures, debugging sessions with their cascading revelations — have been replaced by surfaces that are more uniform, more consistent, and less informationally dense in the Gibsonian sense. The smoothness is not total. The builder who directs AI toward complex system design still encounters resistance — the resistance of contradictory requirements, of architectural tradeoffs, of the irreducible difficulty of deciding what a system should be rather than how it should work. This resistance is real, and it produces its own form of perceptual learning. But it is resistance at a higher level of abstraction than the implementation resistance it replaced, and it carries different perceptual information. The builder learns to perceive strategic affordances. She does not learn to perceive the structural affordances that were specified by the textures of the pre-AI environment.
Fifth: the environment has shifted the locus of perceptual learning from detection to evaluation. In the pre-AI environment, the builder learned by detecting — by exploring the system's affordance structure directly and picking up invariants through active engagement. In the AI-augmented environment, the builder learns primarily by evaluating — by examining the AI's output and assessing its quality, fitness, and structural soundness. Evaluation is a form of perceptual activity. It requires attention, discrimination, and the kind of critical sensitivity that develops through practice. But evaluation presupposes the categories that detection provides. The evaluator must already know what good looks like — must already possess the perceptual differentiations that allow her to distinguish the fit from the unfit, the robust from the fragile, the elegant from the merely functional. These differentiations were built through the detection-based learning of the pre-AI environment. Whether they can be built through evaluation alone, without the foundation of detection, is the open ecological question that the first generation of AI-native builders will answer — not through argument but through the perceptual competence they demonstrate or fail to demonstrate when the environment demands more than the smooth surface offers.
Sixth: the environment affords a specific form of collaborative perception that has no precedent in the history of human tool use. The conversational interface between builder and AI creates an affordance loop — a self-reinforcing cycle of specification, interpretation, evaluation, and refinement — that neither the builder nor the machine could sustain alone. The loop produces perceptual discoveries. Connections emerge that the builder's attention, directed by her specific biography and expertise, could not have found on its own, and that the AI's pattern-matching could not have surfaced without the builder's directed attention. This collaborative perception occupies an uncharted territory in Gibson's framework, because Gibson's framework was built for the coupling between a single organism and its environment. The coupling between a human organism, an intelligent mediating agent, and the environment that the agent represents is something new — something that the framework can describe in pieces but cannot fully integrate without extension.
The honest description, then, is this: the AI-augmented building environment is an affordance landscape of extraordinary richness in some dimensions and significant impoverishment in others. It affords specification, rapid exploration, cross-domain composition, and collaborative perception at levels that no previous building environment could match. It does not afford, to the same degree or in the same form, the friction-rich, texturally dense, event-laden engagement that built the perceptual expertise of the previous generation of builders.
The organism's task is to perceive this landscape accurately — not to celebrate the richness or mourn the impoverishment but to detect, with the full sensitivity of a perceptual system evolved to extract information from structured environments, what the new landscape actually offers and what it actually withholds.
Gibson would not prescribe an action. He would describe the environment and trust the organism to perceive. But the framework he built — the insistence that perception is direct, that affordances are real, that the organism-environment system is the fundamental unit of analysis, that the information is in the environment if the organism knows how to look — contains, implicitly, a prescription for how to inhabit the new ecology. Explore actively. Do not substitute reception for detection. Seek out the textures that the smooth surface conceals. When the environment does not offer resistance, introduce it — not as an aesthetic preference for difficulty but as a perceptual strategy for maintaining the differentiation that only resistance can provide. Build environments that afford both production and learning, because the decoupling of the two produces organisms optimized for one and impoverished in the other.
The ecologist studies what the environment offers. She studies what the organism detects. And she studies the gap between the two — the affordances that are present but unperceived, the perceptual sensitivities that are developed but unemployed, the couplings that are possible but unrealized. It is in these gaps that the ecology's future is written — not in the affordances already exploited but in the ones that remain undetected, waiting for an organism with the sensitivity to perceive them.
The AI-augmented building environment is young. Its affordance structure is still being explored. The organisms that inhabit it are still learning what it offers — still discovering, through the slow education of attention that Gibson described, which invariants persist across transformations, which surfaces carry information, which textures specify what matters. The ecology is in its earliest phase, and the perceptual learning has barely begun.
What comes next depends on what the organisms learn to see. Not what they learn to do — the tools handle that with increasing facility. What they learn to see: the affordances they detect, the textures they attend to, the invariants they extract from the structured ambient array of an environment that is richer and smoother and more dangerous and more generous than any environment the species has previously built for itself.
Gibson spent his career insisting that the information is in the environment. It is there now — in the AI-augmented environment, in its new affordances and its lost textures and its unprecedented couplings. The information specifies what the environment offers. The organism's task is to perceive it. Not to compute it, not to construct a model of it, not to theorize about it — but to perceive it, directly, through the active, exploratory, sensitive engagement that is the organism's fundamental relationship to the world it inhabits.
The surfaces have changed. The textures have shifted. The affordances are new. The perceptual system must follow — must differentiate, must tune, must educate its attention to detect what the new environment actually offers, rather than what the old environment trained it to expect.
That education is the work of the coming years. It cannot be hurried. It cannot be bypassed. It can only be supported — by environments that afford both production and learning, by cultures that value perceptual development as highly as productive output, by individuals who understand that the most valuable thing they possess is not what they can do but what they can see.
The world is richer than any theory we use to describe it. Gibson insisted on this throughout his life. The AI-augmented world is richer than the theories we are currently using to celebrate it or to mourn it. The organism that perceives it accurately — that detects its affordances without flinching from its impoverishments, that exploits its richness without surrendering to its smoothness — is the organism that will thrive in the ecology that is being born.
The information is in the environment. It always has been. The question, now as always, is whether the organism is attending.
---
The muscle I trust least is the one that looks and decides, before I have touched anything, that I already know what I am seeing.
I have trained that muscle my entire career. Decades of building taught me pattern recognition so reflexive it felt like instinct — I could glance at a product and know where the seams were, where the weight would shift, where the user would hesitate. I mistook that speed for seeing. Gibson's framework, applied across these ten chapters, showed me the difference.
What I had developed was perceptual differentiation — a genuinely valuable sensitivity to the textures of environments I had inhabited for years. What I had not noticed was that differentiation can fossilize. The organism tuned to detect affordances in textured terrain may walk across a smooth surface and perceive nothing, not because nothing is there, but because its perceptual system is scanning for features the new environment does not present. It reads the absence of familiar texture as emptiness rather than recognizing the presence of different structure.
That is what happened to me in December 2025. The smooth surface of the AI-augmented environment arrived, and my first reaction was a hybrid of exhilaration and unease that I described in The Orange Pill without understanding its perceptual source. Gibson gave me the vocabulary. The exhilaration was the perception of genuine new affordances — specification, rapid iteration, cross-domain reach, capabilities I had wanted for decades. The unease was the absence of textures my perceptual system had been trained to require. Error messages with diagnostic grain. Code with the irregular signatures of different human minds. The resistance that told me, through the effort of overcoming it, what the system was made of. The surfaces had smoothed, and some part of me could not stop looking for the roughness that used to tell me where I stood.
Gibson's affordance concept is the sharpest lens I have encountered for seeing what the AI moment actually is, as opposed to what the various camps insist it must be. It is not a story about human obsolescence or human liberation. It is an ecological restructuring — a transformation of what the builder's environment offers for action and, consequently, a transformation of how the builder's perceptual system develops. Both the gain and the loss are real, and they are relational. They depend on what you bring. The same smooth surface that offers the experienced builder a canvas for compositional thinking may offer the novice a frictionless slide into competent output without the perceptual foundation that gives competence its depth.
The question I carry out of this book is Gibson's question, applied to my children and to the generation of builders growing up entirely inside the new environment: Will they develop the perceptual differentiation that the old environment provided as a side effect of its difficulty? Not identical differentiation — the old textures are gone and will not return. But differentiation of some kind, sensitivity to the structure of the systems they direct, the capacity to see not just what the machine produces but what the machine conceals in its smoothness.
That depends on what we build for them. Not which tools we give them — they will have the tools regardless. What environments we design for the education of their attention. Whether we build spaces that afford exploration alongside production. Whether we preserve enough texture, enough resistance, enough of the event-laden encounter with real systems, to support the perceptual learning that evaluation alone cannot provide.
The information is in the environment. Gibson spent his life making that one claim, and the claim is more consequential now than the day he wrote it. The AI-augmented environment contains information of extraordinary richness. Whether the next generation of builders learns to perceive it — to detect its affordances directly, to feel its structures through active engagement, to see what it offers and what it withholds — depends on whether we treat the environment as a landscape to be explored or a service to be consumed.
The surfaces have changed. The organism must learn to see again.
Every AI book measures what the tools produce. This one measures what the environment develops. Drawing on J.J. Gibson's ecological psychology — the radical theory that perception is not constructed inside the brain but detected directly from the structure of the world — this book examines the affordance landscape of the AI-augmented builder with forensic precision. What the new environment offers is extraordinary: specification in natural language, rapid iteration, cross-domain reach. What it has quietly removed is the textured, friction-rich, failure-laden engagement through which perceptual expertise was built for decades. Gibson's framework reveals that both the gain and the loss are real, both are relational, and neither can be understood without understanding the organism-environment coupling that determines what any tool actually does to the mind that wields it.

A reading-companion catalog of the 27 Orange Pill Wiki entries linked from this book — the people, ideas, works, and events that J.J. Gibson — On AI uses as stepping stones for thinking through the AI revolution.