By Edo Segal
The thing that finally unsettled me was not a sentence. It was a doorknob.
I was reading Gibson between sessions with Claude, and he made an observation so plain it almost slipped past: a doorknob affords turning. Not because you decide it does. Not because your brain computes its function from visual data. Because the relationship between your hand and that brass sphere specifies the action before thought enters the picture. The affordance is there whether you notice it or not.
I set the book down and looked at my screen. The blinking cursor in the Claude interface. The text field that affords typing. The conversational structure that affords continued prompting. The immediate response that affords another iteration, and another, and another, with no signal anywhere in the environment that says stop.
Every tool I have ever built is a doorknob. Every interface is a field of offerings, structured into the design, shaping what people do before they have consciously chosen to do it. I knew this in the way builders know things — practically, intuitively, without the vocabulary to make it rigorous. Gibson gave me the vocabulary, and the vocabulary changed what I could see.
In *The Orange Pill* I wrote about productive addiction, about the inability to close the laptop, about the Rorschach test of whether a builder working at three in the morning is in flow or in the grip of compulsion. I framed that tension as internal — a question of will, of self-knowledge, of the fine line between exhilaration and exhaustion. Gibson forced me to reframe it as external. Not what is wrong with the person. What does the environment offer the person? What actions does the interface make easiest to perceive? What actions has the design rendered invisible?
The shift sounds academic. It is not. It is the difference between telling a builder to have more discipline and redesigning the environment so that pausing becomes as perceivable as prompting. Between blaming the organism and studying the ecology.
Gibson was a psychologist who spent decades arguing that we had perception backwards — that meaning is not something the mind projects onto a meaningless world but something structured into the environment, available for direct pickup by any creature tuned to detect it. He never saw a large language model. He never encountered a social media feed. But the framework he built describes what those technologies do to the minds that inhabit them with a precision that nothing in contemporary AI discourse has matched.
This book applies his lens. It will change where you look when you ask why AI-augmented work feels the way it does. And once you see the affordance landscape for what it is, you cannot go back to blaming the swimmer for the shape of the river.
— Edo Segal × Opus 4.6
1904–1979
James Jerome Gibson (1904–1979) was an American psychologist who founded the ecological approach to visual perception, one of the most radical reconceptions of how organisms relate to their environments in the history of psychology. Born in McConnelsville, Ohio, he studied at Princeton University and Northwestern University before joining the faculty at Smith College, where he spent most of his career alongside his wife and frequent collaborator, Eleanor J. Gibson, a pioneering researcher in perceptual learning and development. During World War II, Gibson conducted research for the U.S. Army Air Forces on pilot training and aircraft recognition, work that profoundly shaped his rejection of the classical stimulus-response model of perception. His major works include *The Perception of the Visual World* (1950), *The Senses Considered as Perceptual Systems* (1966), and *The Ecological Approach to Visual Perception* (1979), published the year of his death. Gibson introduced the concept of "affordances" — the possibilities for action that an environment offers an organism — which has since become one of the most widely adopted ideas in design theory, human-computer interaction, robotics, and cognitive science. His insistence that perception is an active, exploratory relationship between organism and environment, rather than a computational process occurring inside the head, continues to challenge and reshape fields far beyond the psychology department where it originated.
A cliff edge does not care whether you see it. It affords falling-off regardless of your attention, your mood, your beliefs about gravity. The affordance is there before you arrive and remains after you leave. It is not a property of the cliff alone — a creature incapable of locomotion cannot fall — nor a property of the creature alone. It is a fact about the relationship between a particular kind of body and a particular kind of surface. The cliff affords falling-off for a walking animal. The same cliff affords nesting for a bird that can fly. The affordance is real, it is relational, and it does not require anyone's permission to exist.
James J. Gibson introduced the concept of affordances in *The Ecological Approach to Visual Perception*, published in 1979, the year of his death. He called it "a radical hypothesis, because it implies that the 'values' and 'meanings' of things in the environment can be directly perceived." The radicalism was not in the observation that environments offer possibilities. Anyone who has watched a child discover that a chair affords climbing-on understands this intuitively. The radicalism was in what the concept displaced. For three centuries, Western philosophy and psychology had operated on the assumption that the physical world is meaningless — that meaning is something the mind adds to raw sensory data through interpretation, memory, and inference. The world provides stimuli. The mind provides significance.
Gibson said this was backwards. The significance is already there, structured into the environment, available for pickup by any organism with the perceptual apparatus to detect it. "An affordance is not bestowed upon an object by a need of an observer and his act of perceiving it," he wrote. "The object offers what it does because it is what it is."
This distinction — between meaning imposed by the mind and meaning structured into the environment — is not merely philosophical. It determines how one understands every technology ever built, including the ones reshaping human cognition in 2025 and 2026.
---
Consider the smartphone. A slab of glass, aluminum, and silicon. Described in physical terms, it is inert — a rectangle that fits in a pocket, weighs a few ounces, emits light at certain wavelengths. The standard account of how this object shapes behavior runs through the mind: the user decides to check notifications, the user chooses to scroll, the user opts to respond. The agency belongs to the person. The device is passive, waiting for instructions.
Gibson's framework inverts this entirely. The smartphone is not a passive object awaiting the user's sovereign decision. It is an affordance structure — an environment that offers specific possibilities for action and forecloses others. The smooth glass surface affords touching and swiping. The notification badge affords checking. The infinite scroll affords continued engagement without a natural stopping point. The haptic vibration affords the perception that something has happened, something requires attention, even when the organism was not attending.
These affordances do not compel behavior in the way a physical force compels motion. A person can choose not to check the notification. But the affordance is there — structured into the device, perceivable by the organism, specifying checking as an available, easy, low-cost action. The affordance is real whether or not the user acts on it, just as the cliff edge affords falling-off whether or not anyone approaches it. The difference is that the smartphone's affordances are designed — engineered with extraordinary precision to make certain actions maximally perceivable, maximally available, maximally easy. No one designs a cliff edge. Someone designed the notification badge.
This is the first principle of an ecological analysis of technology: every technology is an affordance structure, and the affordance structure shapes behavior not by compelling it but by specifying what actions are available, easy, and perceivable. The organism responds to what the environment offers. Change the offering, and the response changes — not because the organism has changed, but because the field of perceivable possibilities has shifted.
---
The concept of affordances has traveled far since Gibson introduced it. A Google Scholar search for the term yields fewer than six hundred results in the decade following the publication of *The Ecological Approach to Visual Perception*. By 2018, the number exceeded twenty-three thousand. The term now appears in fields ranging from robotics and artificial intelligence to architecture, music theory, and organizational design. Don Norman, the design theorist, adopted it in *The Design of Everyday Things* and gave it a second life — though Norman's usage departed from Gibson's in ways that matter.
Norman used "affordance" to mean the perceived possibilities for action — what the user thinks the object offers, whether or not the object actually offers it. A flat panel that looks like a button affords pressing, in Norman's sense, even if pressing it does nothing. Gibson's affordance is not about perception of possibility. It is about actual possibility. The flat surface at knee height affords sitting-on for a human adult whether or not the human perceives it as a seat. The affordance is a fact about the organism-environment system, not a fact about the organism's interpretation.
This distinction matters enormously for the analysis of AI tools, because Norman's version locates the problem in perception — in whether the user correctly reads what the technology offers — while Gibson's version locates the problem in the environment itself. If AI tools produce compulsive engagement, Norman's framework asks whether users are misperceiving the tool's affordances. Gibson's framework asks what the tool actually offers, and whether its affordance structure makes compulsive engagement the most available, most perceivable, most easily enacted action in the organism's environment.
The Gibsonian question is not "Do users understand what the tool does?" The Gibsonian question is "What does the tool make possible, and which possibilities are most easily perceived?"
---
Segal describes in *The Orange Pill* a phenomenon he calls "productive addiction" — the inability of builders working with Claude Code to stop, even when the exhilaration of creative flow has drained away and what remains is grinding compulsion. A spouse writes publicly about a husband who has vanished into the tool. A developer posts about working harder than ever and having more fun than ever, and the observation functions as a Rorschach test: Is this flow or is this pathology?
Gibson's framework cuts through the ambiguity by shifting the question from the internal state of the organism to the structure of the environment. What does the AI tool afford?
The conversational interface affords continued prompting. Each response is a surface that specifies further engagement as available and easy — the response itself contains affordances for refinement, elaboration, redirection. There is no natural endpoint, no equivalent of the last page of a book or the final scene of a film. The environment is structured for perpetual continuation.
The immediate response affords another iteration. In a conventional workflow, the time between action and feedback is measured in hours or days — the engineer writes code, submits it, waits for review. That delay is an affordance for disengagement. It specifies pausing as an available action. The AI tool eliminates this affordance entirely. Feedback arrives in seconds, and seconds are not long enough for the perceptual system to detect disengagement as a possibility.
The polished output affords the perception of competence. Whether or not the builder has understood the output, the environment specifies that progress has been made. The smooth, well-structured response is a surface that affords moving forward — not because the builder has chosen to skip understanding, but because moving forward is the most perceivable action in the environment and pausing to scrutinize requires the detection of an affordance that the environment does not readily offer.
None of this is a failure of individual discipline. It is the predictable response of organisms to an affordance structure that specifies continued engagement as the primary, most easily perceived, most readily available action. The builder who cannot stop prompting at three in the morning is not weak-willed. She is responding to what the environment offers, the way a walker follows the path of least resistance through a landscape — not because she cannot walk uphill, but because the downhill path is the one the terrain specifies most clearly.
---
Gibson was insistent, throughout his career, that affordances are value-laden. This was part of what made the concept radical. The dominant scientific paradigm held that facts and values belong to separate categories — that the physical world can be described in value-neutral terms, and that values are projections of the mind onto a meaningless substrate. Gibson refused this separation. A cliff edge affords falling-off, and falling-off is dangerous. The affordance carries the value with it. The danger is not an interpretation added by the mind to a neutral physical description. The danger is specified by the relationship between the organism's body and the surface's layout. The cliff edge is genuinely dangerous for a walking animal, and perceiving the affordance is perceiving the danger.
Applied to the technological environment, this principle means that the affordances of AI tools are not value-neutral features to which users may attach whatever significance they choose. The affordance structure carries values embedded within it. A tool that affords rapid production without struggle carries the value of speed. A tool that affords continued engagement without natural stopping points carries the value of persistence. A tool that affords breadth of output without depth of understanding carries the value of coverage over mastery.
These are not values the user chooses. They are values the environment specifies, structured into the affordance landscape as surely as the danger of a cliff edge is structured into its geometry. The user may resist these values — may choose to pause, to scrutinize, to disengage — but the resistance is against the grain of the environment, the way walking uphill is against the grain of a slope. The environment does not prohibit the alternative action. It simply makes it harder to perceive and harder to enact than the action it most readily affords.
Segal writes about the need for "cognitive dams" — structures that redirect the flow of intelligence toward life. Gibson's framework specifies what those dams must be: restructurings of the affordance landscape that make depth, pause, scrutiny, and disengagement as perceivable and as easy to enact as speed, continuation, and production. Not exhortations to discipline. Not lectures about mindfulness. Actual changes to the environment — the technological equivalent of putting a fence at the cliff edge, not because the walker cannot be trusted to notice the drop, but because the terrain affords falling-off, and a well-designed environment should also afford not falling.
---
The affordance concept dissolves the false binary between technological determinism and human freedom that plagues most discussions of AI. Determinism says the technology controls the user. Freedom says the user controls the technology. Gibson says neither. The environment offers, and the organism perceives and acts. The offering is real — structured, specific, measurable. The perception is active — skilled, exploratory, improvable. The action is the organism's, but it is shaped by what the environment makes available.
This is not a compromise between determinism and freedom. It is a different framework entirely, one that locates the relevant dynamics not inside the organism's head (where the freedom theorists look) and not inside the technology (where the determinists look) but in the relationship between organism and environment. The affordance is the unit of analysis. It belongs to neither party alone.
The consequence for the AI moment is this: the question is not whether AI tools are good or bad for human cognition. It is not whether users should resist or embrace the tools. The question is what the tools afford — what possibilities for action they structure into the environment, which of those possibilities are most easily perceived and most easily enacted, and whether the affordance landscape, taken as a whole, supports the range of human perceptual and cognitive capabilities or systematically narrows it.
That question cannot be answered in the abstract. It requires the careful, empirical, ecological study of specific environments and specific organisms — the study of what happens when this particular person, with this particular history of perceptual development, encounters this particular affordance structure, in this particular context. Gibson spent his career insisting that perception cannot be studied in the laboratory alone, divorced from the richly structured environments in which it actually occurs. The same insistence applies now. The affordance structure of AI cannot be understood from a philosopher's armchair or a policy paper. It must be studied where it operates — in the offices, the classrooms, the kitchens at midnight where a parent cannot stop building and a child is watching.
The cliff edge affords falling-off. It does not care about your philosophy of free will. The affordance is real. The question is what other affordances the environment also provides — handholds, paths, fences — and whether the organism can perceive them clearly enough to act.
Everything that follows in this book is an attempt to make those other affordances visible.
For three hundred years, the study of perception began with the same assumption: the eye receives an image, and the brain interprets it. Light enters through the pupil, falls on the retina, and produces a flat, inverted, ambiguous pattern of stimulation. The brain then performs an extraordinary computational feat — correcting the inversion, resolving the ambiguity, adding depth, distance, meaning, and context from stored knowledge. Perception, on this account, is a construction. The raw material is impoverished. The finished product is rich. And the gap between them is bridged by the mind.
Gibson called this the "snapshot theory" of vision and spent the second half of his career demolishing it.
The demolition began with a simple observation that carried devastating consequences for the entire edifice. The eye does not receive a snapshot. The eye is not a camera. The retinal image, that flat pattern of stimulation that centuries of vision scientists had treated as the starting point of perception, is a theoretical abstraction — an artifact of the laboratory, not a fact about how organisms actually see. A living animal does not hold still and stare at a frozen scene. It moves. It turns its head. It walks, runs, flies, swims. It samples the environment from a continuously shifting point of observation, and the information it picks up is not a series of static images but a flowing, structured, temporally extended pattern — what Gibson called the optic array.
The optic array is the total field of ambient light available at any point of observation. It is not light in the physicist's sense — photons traveling in straight lines from sources to surfaces to eyes. It is structured light, light that has been reflected by surfaces, textured by materials, occluded by objects, and arranged by the geometry of the environment into a pattern that specifies the layout of the world. The texture gradient of a receding surface specifies distance. The flow pattern produced by the observer's own movement specifies the direction and speed of locomotion. The progressive occlusion of one surface by another specifies which surface is in front.
None of this information is ambiguous. None of it requires interpretation. None of it needs to be enriched by memory or computed by inference. The information is there, in the structure of the light, available for direct pickup by any organism that has evolved the perceptual apparatus to detect it.
This was the core of Gibson's revolution: the claim that the environment provides sufficient information for perception, and that the organism's job is not to construct a world from impoverished data but to detect the structure that is already present.
---
The implications were sweeping, and they were resisted with proportional intensity. If the environment provides sufficient information, then the entire computational apparatus that cognitive science had erected between stimulus and percept — the feature detectors, the mental representations, the inference engines, the internal models — becomes unnecessary. Not wrong in its details, perhaps, but wrong in its premise. The mind does not need to build a world if the world is already specified in the information available to the perceiver.
Gibson's critics pointed out, correctly, that this account seemed to leave no role for learning, for memory, for the obvious fact that experienced perceivers see things that novices miss. Gibson's response was among his most important contributions, and it bears directly on the question of expertise in the age of AI.
Perception improves not through better computation but through better attention. The novice and the expert are exposed to the same optic array. The same information is available to both. What differs is what they have learned to detect. The expert has developed, through years of engagement, the attentional skills to pick up invariants — stable patterns of structure amid change — that the novice cannot yet perceive. The invariants were always there. The expert's perceptual system has been educated, through practice, to resonate with them.
Gibson's collaborator and wife, Eleanor Gibson, spent decades studying exactly this process. In her research on perceptual learning and development, she demonstrated that infants develop the ability to perceive affordances through active exploration — reaching, grasping, crawling, falling, and trying again. The learning is not the acquisition of rules stored in memory. It is the refinement of the perceptual system itself, the tuning of attention to invariants that were previously undetected.
This account of perceptual expertise dissolves the false dichotomy between innate ability and learned skill. Perception is neither hardwired nor computed. It is educated — attuned, through engagement, to the structure of the environment.
---
Segal describes a senior software architect who "could feel a codebase the way a doctor feels a pulse — not through analysis but through a kind of embodied intuition that had been deposited, layer by layer, through thousands of hours of patient work." Gibson's framework provides the precise theoretical account of what this intuition is and how it was built.
The architect's perception of the codebase is ecological perception. The codebase is an environment — a structured field of relationships, dependencies, patterns, and anomalies. It contains invariants: stable structural properties that persist across changes in implementation detail. The relationship between a module's interface and its internal complexity is an invariant. The pattern of dependencies that makes a system fragile is an invariant. The characteristic signature of technical debt accumulating in a particular architectural pattern is an invariant.
These invariants are available in the codebase itself, structured into its organization in the same way that distance is structured into a texture gradient. The senior architect perceives them directly, not through conscious analysis but through an attentional system that has been educated, over thousands of hours of engagement, to resonate with the specific invariants that specify structural integrity, fragility, and potential for failure.
The junior developer, exposed to the same codebase, does not detect these invariants. Not because the junior developer lacks computational power or analytical intelligence. Because the junior developer's perceptual system has not yet been educated to the relevant structures. The invariants are there. The attention is not yet attuned.
This is the foundation of expertise in Gibson's framework, and it applies across every domain of skilled practice. The experienced radiologist who detects a tumor in a scan that the resident misses. The master vintner who perceives the chemical composition of a wine through its bouquet. The chess grandmaster who perceives the strategic structure of a position at a glance. In every case, the expertise is perceptual — not the application of stored rules to new data, but the direct pickup of invariant structure through an attentional system refined by years of active engagement.
The question this analysis forces upon the AI moment is uncomfortable and precise: What happens to the development of perceptual expertise when the environment in which that expertise was built is replaced by a different environment with a different affordance structure?
---
The senior architect developed her perception through friction. Through the specific resistance of code that did not work as expected. Through the hours of debugging that forced her attention to the boundary between what the system was supposed to do and what it actually did. Through the repeated failure that educated her perceptual system to detect the invariants that specify structural fragility.
Each debugging session was an act of perceptual exploration — a movement through the codebase's structure, sampling it from different points of observation, detecting patterns of failure that pointed toward underlying invariants. The process was slow, effortful, and frequently frustrating. It was also the mechanism through which her expertise was built. Not acquired, in the sense of information transferred from an external source. Built, in the sense of perceptual capability developed through active engagement with a resistant environment.
When Claude Code enters this picture, the environment changes. The affordance structure shifts. Debugging — the specific friction-rich, exploratory engagement through which the architect's perceptual system was educated — is no longer the primary mode of interaction. The AI resolves the error. The code works. The builder moves on.
The resolution is genuine. The code is correct. The output is functional. What has been eliminated is not the product but the process — and in Gibson's framework, the process is where perceptual education happens. The invariants that the architect would have detected through the act of debugging are never encountered, because the debugging never occurs. The error that would have forced attention to the boundary between intended and actual behavior is resolved before the boundary becomes perceivable.
Gibson's framework does not evaluate this as good or bad in the abstract. It evaluates it ecologically: the organism's perceptual development depends on the affordances available in its environment. Change the affordances, and the perceptual development changes. An environment that affords rapid resolution of errors does not afford the exploratory engagement through which error-detection skills are built. The capability that emerges from the new environment will be different — not necessarily lesser, but different in kind, shaped by different affordances, attuned to different invariants.
---
The ecological approach to perception was Gibson's answer to a question that had plagued philosophy since Descartes: How does the organism know what the world is like? The Cartesian answer was that the organism constructs a model of the world inside its head, using sensory data as raw material and cognitive processes as the construction machinery. Gibson's answer was that the organism does not need a model. The world specifies itself, through the structured information available in the ambient array, and the organism picks up that specification directly through skilled, active exploration.
This dispute is not merely academic. It determines how one thinks about the relationship between minds and tools. If perception is construction — if the mind builds its world from data — then a tool that provides better data or faster processing is straightforwardly beneficial. The AI that resolves the error more quickly simply provides the construction machinery with a faster pathway to the same product. Nothing is lost, because what matters is the internal model, and the model is complete regardless of the path by which it was reached.
But if perception is pickup — if understanding is built through the active exploration of structured environments — then the pathway matters as much as the product. The senior architect's understanding of the codebase was not an internal model constructed from data. It was a perceptual attunement developed through engagement. Remove the engagement, and the attunement does not develop. Provide the correct code without the debugging, and the organism arrives at a functional system without having developed the perceptual skills to understand why it functions, or to detect when it will break.
Gibson's approach is sometimes characterized as anti-intellectual or anti-cognitive. This is a misreading. Gibson did not deny that organisms think, remember, or reason. He denied that thinking, remembering, and reasoning are the basis of perception. Perception is prior. It provides the material that thought works on. And if the perceptual system has not been educated — if the invariants have not been detected, if the attentional skills have not been developed through engagement — then the thinking that operates on that impoverished perceptual foundation will be correspondingly impoverished.
This is the specific mechanism through which Gibson's framework predicts that frictionless AI environments will produce a particular kind of shallowness. Not intellectual shallowness in the sense of ignorance. Perceptual shallowness — the failure to develop the attentional skills that allow the direct detection of structural invariants that specify quality, fragility, coherence, and depth. The builder who has never debugged does not merely lack the experience of debugging. She lacks the perceptual education that debugging provides — the attuning of her attentional system to the invariants that specify how systems fail.
---
During the Second World War, Gibson was asked by the U.S. Army Air Forces to study the problems of pilot perception during landing. Pilots were crashing, and the prevailing theory held that landing required complex calculations of distance, speed, and angle that the pilot's visual system performed imperfectly under stress. Gibson looked at the problem ecologically and discovered something that overturned the computational account entirely.
Pilots do not calculate their approach. They perceive it. The optic flow pattern — the expanding pattern of visual texture produced by forward motion — specifies the point of contact directly. When the center of expansion is on the runway, the aircraft is pointed at the runway. No computation required. The information is in the structure of the light.
But this direct perception was available only to pilots who had developed the attentional skills to detect it. Novice pilots, overwhelmed by the complexity of the visual field, could not pick up the relevant invariant. They reverted to calculation — checking instruments, estimating distances, attempting to compute what the experienced pilot perceived directly. The computational approach was slower, more error-prone, and more vulnerable to stress. The perceptual approach was faster, more reliable, and more robust — but only for the organism that had developed the skill to use it.
The analogy to the current moment is precise. The builder working with AI is like the novice pilot who receives the correct landing vector from an instrument rather than perceiving it directly from the optic flow. The landing succeeds. The code works. The system functions. But the perceptual skill that would allow the builder to detect when the vector is wrong — when the instrument is malfunctioning, when the AI has produced something subtly flawed — has not been developed, because the active exploration through which that skill is educated was bypassed.
Gibson was not a Luddite. He did not argue that pilots should reject instruments. He argued that instruments work best when they augment perceptual skills rather than replace them — when the pilot perceives the optic flow and checks the instrument, rather than relying on the instrument alone. The instrument that replaces perception makes the pilot dependent. The instrument that informs perception makes the pilot more capable.
The same principle applies to AI. The tool that replaces the builder's perceptual engagement with the problem makes the builder dependent. The tool that informs the builder's perception — that provides another perspective on a problem the builder is actively exploring — makes the builder more capable. The distinction is in whether the affordance structure of the tool supports continued perceptual exploration or eliminates the need for it.
Gibson died before he could apply his framework to digital technologies. But the framework was built to describe the fundamental relationship between organisms and their environments, and that relationship does not change when the environment is made of code rather than asphalt. The organism explores. The environment offers. The affordance shapes the action. The action educates the perceptual system. Interrupt the cycle, and the education stops. No matter how good the product.
The history of human-computer interaction is, from a Gibsonian standpoint, a history of redesigned affordance landscapes. Each major interface transition altered not merely what users could do with computers but what the computational environment offered for doing — which actions were perceivable, which were easy, which were hidden, and which were eliminated. The conventional narrative frames this history as a story of progress: each generation of interface made computers more accessible, more intuitive, more powerful. Gibson's framework reframes it as a story of shifting affordance structures, in which every gain in accessibility simultaneously restructured the field of perceivable action in ways that favored certain cognitive patterns and foreclosed others.
The command line, the first widely used interactive human-computer interface, afforded precision. The user typed instructions in a formal syntax, and the machine executed them or returned an error. The affordance structure was sparse and demanding. The command line afforded only those actions that could be specified in the machine's language. It did not afford exploration in the sense that Gibson meant — the casual, playful movement through an environment that allows the organism to discover what the environment offers. To use the command line, the user had to already know what was possible. The environment did not reveal its affordances to the uninitiated.
But the command line afforded something else: a particular kind of understanding. Because every action had to be specified explicitly, the user developed an intimate knowledge of what the machine was doing and why. The feedback was immediate and precise — the error message, however cryptic, specified exactly which instruction had failed and, implicitly, what the constraints of the system were. Each error was an affordance for learning, because it forced the user's attention to the boundary between what the system could do and what it could not. Over time, the accumulation of these boundary encounters produced a perceptual attunement to the machine's logic — an ecological expertise as genuine as the pilot's attunement to optic flow.
The graphical user interface changed the affordance landscape fundamentally. Suddenly, the computational environment afforded exploration. Windows could be opened. Icons could be clicked. Menus could be browsed. The user could discover what the system offered without knowing in advance what to ask for. The GUI made the computer's affordances visible — quite literally, by representing them as visual objects on a screen that specified their function through their appearance. A button afforded pressing. A slider afforded dragging. A folder afforded opening.
This was a genuine expansion of who could use computers. The GUI lowered the perceptual threshold for detecting the machine's affordances. Actions that had been invisible to anyone without formal training became perceivable to anyone who could see a button and understand that it invited pressing. The democratization was real.
But the affordance restructuring came with a cost that was invisible at the time and has become visible only in retrospect. The GUI afforded exploration at the expense of the specific, friction-rich engagement that had built deep understanding. The user who explored the menu and found the function did not develop the same relationship with the system as the user who had to specify the function in a formal language. The GUI user perceived the affordance and acted on it. The command-line user had to construct the action from first principles, and the construction process educated her perceptual system in ways that the menu-browsing process did not.
---
The touchscreen extended the same trajectory. The iPhone, released in 2007, created an affordance landscape of unprecedented immediacy. The glass surface afforded touching — a direct, physical, pre-linguistic engagement that the command line and even the GUI could not match. The pinch-to-zoom gesture afforded scaling. The swipe afforded scrolling. The tap afforded selection. Each affordance was coupled to a physical movement so intuitive that a toddler could detect it without instruction.
The touchscreen achieved something remarkable from a Gibsonian perspective: it aligned the affordances of the digital environment with the affordances of the physical environment. In the physical world, a graspable object affords grasping. On the touchscreen, a movable object affords moving — and the movement is the same physical gesture. The gap between the digital affordance and the bodily action was reduced to nothing. For the first time, the computational environment felt like a physical environment, one that offered its possibilities through the same perceptual channels the organism had evolved to use.
The consequence of this alignment was adoption at a speed and scale without precedent. The iPhone reached market saturation not because it was marketed brilliantly — though it was — but because its affordance structure was immediately perceivable to virtually every human organism on the planet. The environment offered, and the organisms perceived, and the barrier between technological possibility and human action collapsed to the width of a fingertip.
And yet. The affordance structure of the touchscreen, like every affordance structure before it, favored certain cognitive patterns at the expense of others. The touchscreen afforded immediacy — the instant coupling of intention and action. It did not afford the deliberation that the command line, through its very friction, had supported. The user who had to type a command before executing it had a moment — a brief, forced pause between intention and action — in which reflection could occur. The touchscreen eliminated that pause. Intention and action became simultaneous, and the perceptual space in which deliberation lived was compressed to nothing.
---
Each interface transition followed the same ecological logic. The new affordance structure expanded what was perceivable and reduced what was demanded. More organisms could detect the environment's offerings. Fewer organisms needed to struggle with the environment's resistance. The expanding circle of access was a genuine good. The shrinking domain of productive friction was a genuine cost. And at no point in the history of interface design was the cost weighed seriously against the gain, because the cost was invisible — perceptual, attentional, developmental — while the gain was immediately measurable in adoption rates and revenue.
Gibson's framework makes the cost visible by insisting that affordances are not merely opportunities for action. They are the environmental conditions under which perceptual skills develop. An environment that affords struggle is an environment in which the organism's perceptual system is being educated. An environment that affords frictionless action is an environment in which certain perceptual educations cannot occur — not because the organism is incapable, but because the affordances required for that education are no longer present.
The history of interface design, read ecologically, is a history of environments progressively optimized for ease of action and progressively depleted of affordances for the perceptual development that difficulty provides.
---
And then, in 2025, the interface became language.
I have described this transition as the moment "the machine learned to meet you on yours" — the inversion of fifty years of interface design in which the human had always been the one adapting. For the first time, the computational environment could be addressed in natural language, the same language the organism uses to think, to argue, to dream. No formal syntax. No visual metaphors. No gestural conventions. Just language — the medium in which human cognition most naturally operates.
Gibson's framework identifies this as the most radical affordance restructuring in the history of computing, and it does so for a specific reason. Every previous interface transition altered which actions were perceivable. The natural language interface alters the relationship between perception and action itself.
The command line afforded precise specification. The GUI afforded visual exploration. The touchscreen afforded physical gesture. In each case, the organism perceived what the environment offered and selected from among the available actions. The affordance structure was a landscape of options, and the organism navigated it.
The natural language interface does not offer a landscape of options. It offers something closer to a conversational partner — an environment that responds to intention itself, not to any specific action. The user does not select from a menu of affordances. The user describes what she wants, and the environment restructures itself to provide it.
This is a different kind of affordance entirely. Gibson distinguished between affordances that are fixed properties of the environment — the cliff edge always affords falling-off — and affordances that emerge from the interaction between organism and environment in real time. The conversational interface is an environment whose affordances are emergent. They are not there before the conversation begins. They come into existence through the interaction, shaped by what the user says, what the system responds, what the user says next.
The consequence is that the natural language interface affords, for the first time, the externalization of the user's own cognitive process. Not the execution of a predetermined action. The exploration of a thought. The user can say, "I am not sure what I am looking for" — and the environment will respond to the uncertainty itself, offering structure where there was none, suggesting directions the user had not perceived.
---
The ecological significance of this development is difficult to overstate. In Gibson's framework, the affordance structure of the environment shapes the development of perceptual skill. An environment that affords physical exploration develops physical expertise. An environment that affords syntactic precision develops linguistic expertise. An environment that affords the externalization and exploration of uncertainty develops — what?
This is the question that Gibson's framework poses to the natural language interface, and it is a question without a historical precedent. No previous affordance structure has offered the organism the ability to explore its own thinking in real time through an environment that adapts to the exploration. The closest analogue is conversation with another human — a Socratic dialogue, perhaps, in which the interlocutor's responses shape the thinker's next move. But even Socratic dialogue is constrained by the interlocutor's own limitations, biases, knowledge gaps, fatigue. The AI interlocutor has different constraints — it cannot perceive the way Gibson meant perception, it has no body, no stakes, no ecological relationship to the world — but it is not subject to the constraints that limit human conversation.
The affordance is genuinely new. The environment offers something that no previous environment has offered: an externalized cognitive space that adapts in real time to the organism's unfolding thought, providing structure, connections, and articulations that the organism could not have generated alone.
I have lived this experience directly. I once told Claude about a problem I could not solve — an idea about technology adoption that I had the data for and the intuition about but could not bridge — and Claude returned with a concept from evolutionary biology, punctuated equilibrium, that made the bridge visible. The connection existed in my thinking, latent and unperceived. The AI made it perceivable. Not by constructing it — the connection was real, structured into the relationship between the data and the theory — but by directing my attention to an invariant I had not yet detected.
In Gibsonian terms, the AI served as a perceptual aid — an environmental structure that made an existing invariant more detectable. This is not unlike a magnifying glass that makes visible the texture of a surface too fine for the naked eye. The texture was always there. The organism's perceptual apparatus was not attuned to it. The tool changed the affordance landscape so that the invariant became perceivable.
---
But a magnifying glass does not change the surface it reveals. The natural language AI does something more complex. It generates a response — a text, a code structure, an argument — that becomes part of the environment. The user perceives not just the original problem, now better specified, but also the AI's response, which carries its own affordances. The response affords acceptance. It affords refinement. It affords the perception of progress. And it affords — this is the critical point — the conflation of the AI's articulation with the user's own understanding.
When the magnifying glass reveals a texture, the observer perceives the texture. When the AI articulates a connection, the user may perceive the articulation and mistake it for having perceived the connection. The two are different. The articulation is the AI's output. The perception of the underlying invariant — the genuine understanding of why the connection holds — requires the user's own perceptual engagement with the problem. The AI's response can facilitate that engagement or substitute for it, and the affordance structure does not distinguish between the two.
The natural language interface is the most powerful affordance structure ever engineered. It affords the exploration of thought, the externalization of intention, the discovery of connections, the amplification of judgment. It also affords the replacement of perceptual engagement with the consumption of output, the substitution of articulation for understanding, and the atrophy of the very perceptual skills that make the exploration valuable. The same environment offers both, and the organism must perceive the difference — must detect which affordance leads to development and which leads to dependency.
Gibson would have said: that capacity to distinguish between affordances — to perceive not just what the environment offers but which offerings serve the organism's interests — is itself a perceptual skill. It must be developed through practice. It is not automatic. And the environment in which it must be developed is the very environment whose affordances make its development difficult, because the affordance for accepting the output is always more easily perceived than the affordance for questioning it.
Technology is not a tool that the organism uses. Technology is an environment that the organism inhabits. And the affordance structure of that environment shapes the organism's cognitive development as surely as the physical affordance structure of a forest shapes the motor development of a child learning to climb.
The question is not whether to inhabit the environment. It is what the environment affords — and what it has ceased to afford.
Before artificial intelligence restructured the affordance landscape of knowledge work, social media restructured the affordance landscape of attention itself. The two restructurings are related — AI inherited many of social media's affordance patterns and amplified them — but they are not identical, and the differences matter as much as the continuities. To understand what AI affords, it is necessary to understand the environment it arrived into — the attentional ecology that social media had already degraded.
Consider the social media feed. Not any particular platform, but the generic structure that has dominated digital attention since roughly 2009: a continuous, algorithmically curated stream of content, ordered not by time or topic but by predicted engagement. The feed has no beginning and no end. It is not a document to be read but a flow to be sampled. The user enters the stream at an arbitrary point and exits at no designated moment. The only stopping point is the user's own decision to stop — and the feed is engineered to make that decision as difficult to perceive as possible.
Gibson's framework identifies the feed's affordance structure with clinical precision. The feed affords scrolling. Not merely permits it — the infinite vertical layout, the content that loads seamlessly as the user reaches the bottom, the absence of any visual cue that says "you have reached the end" — these are environmental features that specify scrolling as the primary, most easily perceived, most readily available action. Scrolling is what the feed is for, in the Gibsonian sense that a flat surface at knee height is what sitting is for.
The feed affords reacting. Each piece of content is accompanied by affordances for immediate response — like buttons, comment fields, share mechanisms — that specify rapid, low-cost engagement as available. The affordance for reacting is spatially proximate to the content itself. The user does not need to navigate away, open a new interface, or compose a deliberate response. The reaction is right there, attached to the content like a handle attached to a door. The affordance for deliberation — for pausing, reflecting, formulating a considered response — is not visually represented, not spatially proximate, and not coupled to any physical gesture. It is, in Gibsonian terms, a hidden affordance: real, available in principle, but not specified by any feature of the environmental layout.
The feed affords comparison. The juxtaposition of content from multiple sources — a friend's vacation photo next to a news headline next to a political commentary next to an advertisement — specifies scanning and evaluating as available actions. The user is not immersed in a single environment. She is sampling fragments of many environments in rapid succession, and the affordance structure of the feed specifies this fragmented sampling as the natural mode of engagement.
---
What the feed does not afford is equally important, and Gibson's framework makes the absences visible in a way that conventional analysis does not.
The feed does not afford sustained attention to a single subject. The content changes with every scroll. The algorithm rewards novelty, so the feed is structured to present different content with each interaction. Staying with a single idea, returning to it, sitting with it long enough for genuine understanding to develop — none of these actions are specified by the feed's affordance structure. They are not prohibited. They are simply not offered. The organism that wishes to attend deeply to one idea in the feed environment must override the affordances the environment provides, which is like trying to sit still on a conveyor belt. Possible, but the environment is working against you.
The feed does not afford uncertainty. Every piece of content is accompanied by social signals — likes, shares, comments — that specify its value before the user has engaged with it. The experience of encountering an idea without knowing whether it is good or bad, popular or unpopular, approved or contested — the specific uncertainty that is the precondition for independent judgment — is eliminated by the social metadata that accompanies every post. The user perceives not just the content but its reception, and the reception shapes the perception of the content before genuine engagement can occur.
The feed does not afford boredom. This is perhaps the most consequential absence. Boredom, in ecological terms, is the perceptual state that arises when the environment's affordances have been exhausted — when the organism has explored what is available and finds nothing that engages its attention. Boredom is not pleasant. It is also not empty. Neuroscientific research has consistently found that the default mode network — the neural system that activates during periods of unstimulated rest — is associated with creative ideation, autobiographical reflection, and the consolidation of learning. Boredom is the soil in which certain kinds of cognitive growth occur. The feed eliminates it. There is always more content. There is always another thing to engage with. The affordance for continued engagement never expires, which means the perceptual state of having exhausted the environment's offerings — the state from which boredom and its developmental consequences emerge — never arrives.
---
Gibson was insistent that perception is an activity. The organism does not sit passively and receive stimuli. It moves through the environment, actively sampling the optic array, exploring the layout of surfaces, testing what the environment offers. This active, exploratory engagement is how perceptual skills develop. The child learns to perceive depth by crawling toward edges and discovering what happens. The pilot learns to perceive approach by flying and discovering how the optic flow specifies the landing point.
The social media feed presents a paradox that Gibson's framework illuminates with uncomfortable clarity. The user of the feed is constantly active. She scrolls, taps, swipes, reacts, shares, comments. The behavioral record shows continuous engagement, unbroken interaction, a perpetual stream of actions. By any external measure, the user is active.
But the activity is not exploratory in Gibson's sense. Exploration, for Gibson, means moving through an environment to discover its structure — to detect invariants that are not immediately apparent, to sample the optic array from multiple viewpoints, to test the affordances the environment provides. The feed user is not exploring. She is being fed. The algorithm determines what appears next. The user's actions — scrolling, reacting — are responses to what the environment presents, not explorations of what the environment contains. The direction of control runs from the environment to the organism, not from the organism to the environment.
Gibson would recognize this as the structure of a particular kind of perceptual impoverishment. The organism is active but not exploratory. It is responding but not searching. Its perceptual system is engaged but not being educated, because the engagement is not producing contact with invariant structure. The content changes constantly, but the affordance structure never changes. Scroll, react, scroll, react. The actions are varied in their objects but uniform in their kind. The organism is running on a treadmill — in constant motion, going nowhere, developing no new perceptual attunement because the environment presents the same affordances with every step.
---
The conventional critique of social media focuses on content: misinformation, polarization, the amplification of extreme views. These are real problems. But Gibson's framework identifies a deeper problem, one that operates at the level of perception rather than information. The content problem can, in principle, be addressed by better content moderation, better algorithmic design, better media literacy. The affordance problem cannot be addressed by changing the content, because the problem is in the structure, not the substance.
A feed full of accurate, well-sourced, intellectually nutritious content would still afford scanning and reacting. Would still afford comparison and fragmentation. Would still fail to afford sustained attention, productive uncertainty, or boredom. The affordance structure of the feed is independent of what fills it. Changing the content in the feed is like rearranging the furniture in a building whose floor plan forces everyone to walk in circles. The furniture may be beautiful. The walking still goes nowhere.
This is why efforts to improve social media through content-level interventions — fact-checking labels, algorithmic adjustments to reduce extreme content, media literacy campaigns — have produced results that are, at best, modest. The interventions target the wrong level. The degradation of attention is not caused by bad content. It is caused by an affordance structure that specifies shallow, fragmented, reactive engagement as the primary mode of interaction, regardless of what the content happens to be.
Gibson would argue that the only effective intervention is at the level of affordance design. Not what the feed shows, but what the feed offers for doing. An environment that afforded pausing — that built spatial or temporal gaps into the flow, creating moments in which the organism's attention could settle — would produce different attentional behavior than one structured for continuous flow. An environment that afforded uncertainty — that withheld social metadata, forcing the user to form independent judgments — would develop different perceptual skills than one that pre-specifies the value of every piece of content. An environment that afforded depth — that made sustained engagement with a single idea as perceivable and as easy as scrolling to the next item — would produce different cognitive patterns than one structured for sampling.
None of these alternatives have been commercially viable, because the economic model of social media rewards engagement metrics that are maximized by the existing affordance structure. The feed that affords compulsive scrolling generates more attention than the feed that affords reflective pausing, and attention is the commodity being sold. The affordance structure is not an accident. It is a business model.
---
The significance of this analysis for the AI moment is that AI tools arrived in an attentional ecology that social media had already restructured. The organisms encountering Claude Code in 2025 were not pristine perceivers engaging with a new technology in a neutral cognitive environment. They were organisms whose attentional patterns had already been shaped by a decade of affordance structures designed to maximize engagement at the expense of depth, reflection, and sustained focus. They were, in Gibson's terms, organisms whose perceptual systems had been educated by social media to detect affordances for speed and reaction, and whose ability to detect affordances for slowness and reflection had atrophied from disuse.
I have observed that AI was adopted at unprecedented speed — ChatGPT reached one hundred million users in two months, a trajectory that makes the telephone, the radio, and the television look like they were standing still. Gibson's framework suggests that this speed of adoption measured not just the quality of the tool but the condition of the organism encountering it. A population whose attentional ecology had been shaped by social media — whose perceptual systems were already attuned to interfaces that afforded rapid engagement and immediate gratification — was pre-adapted to an AI tool that offered those same affordances in a more powerful form.
The feed taught organisms to expect instant response. The AI provided it. The feed taught organisms to engage in short, reactive bursts. The AI accommodated it. The feed trained organisms to perceive continued engagement as the default action and disengagement as an effortful override. The AI's affordance structure replicated this pattern with extraordinary fidelity.
The AI did not arrive in a vacuum. It arrived in an ecosystem that social media had already terraformed — an attentional ecology in which the affordances for depth had been systematically reduced and the affordances for speed had been systematically amplified. The AI inherited this ecology and extended it into the domain of production. Social media had restructured how organisms consume. AI restructured how organisms create. The affordance patterns are continuous across both environments, and the organisms moving between them carried their social-media-educated perceptual habits into the AI-mediated workspace without interruption.
---
Gibson wrote, in *The Ecological Approach to Visual Perception*, that "the affordances of the environment are what it offers the animal, what it provides or furnishes, either for good or ill." The qualification is essential. Affordances are not benefits. They are possibilities — and some possibilities are destructive. The cliff edge affords falling-off. The social media feed affords the systematic erosion of the attentional skills that sustained engagement, deep reflection, and independent judgment require.
The organisms entering the AI era arrived with their perceptual systems already reshaped by a decade of environments designed to afford shallow engagement. Understanding what AI affords requires understanding what those organisms bring to the encounter — what perceptual skills they have developed, what skills have atrophied, and what the interaction between their attentional history and AI's affordance structure is likely to produce.
That is the work of the next chapter: the specific affordance structure of AI tools, analyzed not in isolation but in the ecological context of the organisms that use them and the attentional histories they carry.
The social media feed restructured how organisms attend. The AI tool restructures how organisms produce. The two affordance structures share a deep grammar — immediacy, frictionlessness, the elimination of natural stopping points — but they differ in a way that makes the AI tool's affordance structure both more powerful and more difficult to analyze. Social media affords consumption. AI affords creation. And the perceptual dynamics of creation are different from the perceptual dynamics of consumption in ways that Gibson's framework makes precise.
A consumer encountering an affordance structure designed for engagement is, at worst, wasting time. The hours lost to the feed are hours that might have been spent otherwise, but the organism's productive capabilities remain intact. The affordances of consumption do not alter what the organism can make. They alter what the organism attends to, which is serious enough — attention is the medium through which perceptual skills develop — but the damage is confined to the attentional system.
A producer encountering an affordance structure designed for frictionless output is in a different situation entirely. The affordances of production shape not just what the organism attends to but what the organism makes, and what the organism makes reshapes the environment that other organisms inhabit. The code that the AI-augmented builder ships becomes part of the technological ecology. The essay that the AI-augmented student submits becomes part of the educational ecology. The decision that the AI-augmented executive makes becomes part of the organizational ecology. The affordance structure of production has downstream consequences that the affordance structure of consumption does not, because production changes the world while consumption merely changes the consumer.
This asymmetry is why the affordance analysis of AI tools requires a specificity that the analysis of social media does not. It is not enough to say that AI affords rapid production. The question is what kind of rapid production, directed by what kind of perception, producing what kind of artifacts, in what kind of ecological context. Gibson's framework insists on this specificity, because affordances are always relational — always a function of the particular organism encountering the particular environment in the particular situation.
---
The conversational interface is the foundational affordance of contemporary AI tools, and its ecological significance is unlike anything in the prior history of human-computer interaction. Previous interfaces afforded actions — clicking, typing, swiping. The conversational interface affords intentions. The user does not specify what to do. The user describes what she wants, and the environment reconfigures itself to provide it.
Gibson distinguished between what he called "attached" and "detached" objects in the environment — surfaces that are part of the ground layout versus objects that can be manipulated independently. The conversational interface blurs an analogous distinction in the computational environment: the distinction between the tool and the material. In every previous interface paradigm, the tool was one thing — the text editor, the compiler, the design application — and the material was another — the code, the document, the image. The user wielded the tool against the material. The affordance structure of the tool shaped what operations were available, but the material had its own resistance, its own properties, its own invariants that the user had to perceive and accommodate.
The conversational AI collapses this distinction. The tool and the material merge. Claude does not provide a set of operations that the user applies to code. Claude produces the code in response to the user's description. The user does not perceive the code through the tool. The user perceives Claude's interpretation of her intention, rendered as code. The material — the actual computational artifact — arrives already shaped, already structured, already functional. The affordance for engaging with the material's resistance, for discovering its properties through manipulation, for developing the perceptual attunement that comes from working against a medium's grain, has been absorbed into the tool's response.
This merger has consequences that Gibson's framework predicts with uncomfortable precision. When the tool and the material are distinct, the organism develops two kinds of perceptual skill: skill with the tool, and skill with the material. The carpenter who masters the chisel and the carpenter who understands wood are developing different but complementary perceptual attunements. The chisel-skill allows precise action. The wood-skill allows the perception of grain, density, moisture, the structural properties that determine what the wood will and will not tolerate. The best carpenters possess both, and the two skills inform each other — the perception of the wood guides the use of the chisel, and the chisel's response to the wood educates the perception of its properties.
When the AI merges tool and material, the organism develops skill with the conversation but not necessarily skill with the artifact the conversation produces. The builder who uses Claude effectively has learned to describe intentions clearly, to evaluate outputs critically, to iterate through dialogue toward a satisfactory result. These are genuine perceptual skills. But they are skills of the conversational interface — skills of description and evaluation — not skills of the material itself. The builder may produce excellent code without developing the perceptual attunement to code that the manual coder develops through direct engagement with its resistance.
---
The affordance for immediate feedback is the second structural feature that defines the AI tool's ecological impact. In conventional software development, the feedback loop between action and result was measured in minutes, hours, or days. The developer wrote code, compiled it, ran it, encountered an error, read the error message, hypothesized about the cause, modified the code, and repeated the cycle. Each iteration deposited a thin layer of understanding — what Gibson would call a refinement of perceptual attunement to the invariants of the system.
Claude Code compresses the feedback loop to seconds. The developer describes what she wants. The code appears. It runs or it does not. If it does not, the developer describes the failure, and a corrected version appears. The cycle that once took an afternoon takes minutes. The organism perceives the result almost simultaneously with the intention.
Gibson's framework evaluates this compression not by whether it improves productivity — it manifestly does — but by what it does to the organism's perceptual development. The feedback loop in conventional development was not merely a delay. It was a temporal affordance — a gap in which the organism's attention was directed to the boundary between intention and result. In that gap, the developer examined the error, formulated a hypothesis, tested it mentally before testing it computationally. The gap afforded reflection, not as a luxury but as a structural feature of the environment. The organism reflected because the environment offered a temporal space in which reflection was the most available action.
When the feedback loop compresses to seconds, the temporal affordance for reflection disappears. The result arrives before the organism has completed the perceptual exploration of the problem. The developer who would have spent twenty minutes examining an error message — attending to its specifics, tracing its implications through the system, developing hypotheses about its cause — instead sees the corrected code and moves on. The correction is real. The perceptual education that the twenty minutes would have provided is lost.
The compression is not a simple acceleration of the same process. It is a restructuring of the affordance landscape that eliminates a category of perceptual engagement. The developer who receives an immediate correction and the developer who spends twenty minutes debugging are not performing the same activity at different speeds. They are performing different activities, in different affordance landscapes, developing different perceptual skills.
---
The affordance for breadth without depth is the third defining feature. Claude can produce competent output across an extraordinary range of domains. It can write code in dozens of languages, draft legal briefs, compose music, generate marketing copy, analyze data, explain quantum mechanics, and produce architectural diagrams — all within a single conversation. The organism encountering this affordance structure perceives a landscape of nearly unlimited possibility. Any direction is available. Any domain is accessible. The breadth of the offering is genuinely unprecedented.
Gibson's framework identifies what this breadth affords and what it forecloses. Breadth affords exploration — the organism can move across domains, discovering connections, testing ideas in unfamiliar territory, expanding the range of invariants its perceptual system is exposed to. Segal describes exactly this when he writes about engineers who began reaching across disciplinary boundaries, backend developers building interfaces, designers writing features. The affordance for cross-domain engagement is real, and the perceptual enrichment it provides is genuine. An organism that encounters invariants across multiple domains develops a different kind of attentional skill than one confined to a single domain — a skill for detecting structural similarities across contexts, for perceiving the deep patterns that connect apparently disparate problems.
But breadth also affords superficiality — the perception of competence without the development of expertise. The builder who uses Claude to produce code in a language she does not know has acted on the affordance for breadth. She has produced an artifact. She has not developed the perceptual attunement to that language's invariants that would allow her to detect when the artifact is subtly wrong, or fragile, or adequate for the current purpose but architecturally unsound for the next one.
Gibson would frame this as a distinction between two kinds of perceptual engagement. Exploratory engagement — movement across environments, sampling broadly, detecting the gross structure of unfamiliar terrain — develops a wide but shallow attentional skill. Performatory engagement — sustained interaction with a single environment, testing its limits, encountering its resistance, developing the fine-grained attunement that allows the detection of subtle invariants — develops a narrow but deep attentional skill. Both are valuable. Both are forms of perceptual education. The organism that can do both is more capable than the organism confined to either.
The AI affordance structure overwhelmingly favors exploratory engagement. The tool makes it easy to move across domains, easy to produce output in unfamiliar territory, easy to perceive the gross structure of a new problem space. It does not make it easy to develop the performatory skills that come from sustained, friction-rich engagement with a single domain. Those skills require exactly the resistance the tool is designed to remove.
---
The affordance for the perception of competence deserves its own analysis, because it operates at a level that Gibson's framework is uniquely equipped to describe. When Claude produces a response — a block of code, a paragraph of analysis, a structural suggestion — the response arrives polished, well-formed, confident. Its surface properties specify competence. The code compiles. The prose flows. The analysis is structured. The organism perceiving this output detects affordances for acceptance — the smooth, well-formed surface specifies that the output is ready for use, the way a well-paved road specifies that it is ready for driving.
But the surface properties that specify competence are not the same as the invariants that specify correctness. A block of code that compiles and runs is not necessarily correct. A paragraph of analysis that reads fluently is not necessarily true. A structural suggestion that appears elegant is not necessarily sound. The invariants that specify correctness, truth, and soundness are deeper than the surface properties that specify competence — they are structural features that require more sustained perceptual engagement to detect.
Segal describes catching exactly this failure. Claude produced a passage linking Csikszentmihalyi's flow state to a concept attributed to Gilles Deleuze. The passage was elegant, well-structured, rhetorically effective. Its surface specified competence. But the philosophical reference was wrong — not slightly wrong, but wrong in a way that would be obvious to anyone who had actually read Deleuze. The surface affordance for acceptance was strong. The deeper invariant that specified incorrectness was available only to a perceiver with the specific perceptual attunement — developed through years of philosophical reading — required to detect it.
Gibson would say that the AI's output presents a perceptual challenge analogous to visual camouflage. A camouflaged animal's surface properties specify "continuation of background" — the organism perceiving it detects the affordances of empty ground where a predator actually sits. The camouflage works because the surface invariants that specify "nothing here" are more easily detected than the structural invariants that specify "something here." The perceiver must override the salient surface perception to detect the hidden structural reality.
The AI's polished output operates by the same perceptual logic. The surface invariants — grammatical fluency, structural coherence, confident tone — are immediately perceivable and specify acceptance as the available action. The structural invariants — factual accuracy, logical soundness, philosophical precision — require a deeper perceptual engagement that the surface actively discourages, because the surface has already specified the output as adequate. The organism must perceive past the affordance for acceptance to detect the affordance for scrutiny, and the environment is designed to make acceptance easier to perceive than scrutiny.
---
Gibson's ecological framework produces a specific, testable prediction about AI-augmented work. The prediction is not that AI tools will make work worse. It is that AI tools will restructure the affordance landscape of work in ways that favor certain perceptual skills — description, evaluation, cross-domain exploration — at the expense of others — deep material engagement, friction-based attunement, the slow development of invariant detection through sustained struggle. The organisms that thrive in this new affordance landscape will be those that can perceive and act on the affordances for depth even when the environment's most salient offerings are affordances for speed.
The prediction is uncomfortable because it implies that thriving in the AI-mediated environment requires the organism to work against the most easily perceived affordances of that environment — to detect and act on the affordances for pause, scrutiny, and sustained engagement when the affordances for continuation, acceptance, and rapid production are more immediately available. This is perceptually demanding. It is the equivalent of the pilot who checks the optic flow even when the instrument says the approach is correct. The instrument is useful. The pilot who relies on it exclusively is dependent. The pilot who perceives both the instrument and the flow — and knows when to trust which — is genuinely expert.
The AI tool is the most powerful instrument ever placed in the builder's cockpit. What it affords is extraordinary. What it demands — the perceptual skill to use it without becoming dependent on it — is equally extraordinary, and the environment itself does not afford the development of that skill. It must come from elsewhere: from training, from deliberate practice, from environments designed to support the perceptual education that the AI tool's affordance structure, left to its own logic, will systematically eliminate.
That is the problem attentional ecology exists to solve.
Segal introduces the concept of attentional ecology in *The Orange Pill* as a practice: the deliberate study of how AI-saturated environments affect the minds that inhabit them, and the deliberate design of those environments to support human flourishing rather than human depletion. The concept is powerful, practically urgent, and theoretically underdeveloped. Segal has the right instinct. Gibson provides the theoretical foundation that turns the instinct into a framework.
The standard way of thinking about attention treats it as a resource. The organism possesses a finite quantity of attentional capacity. Tasks consume this capacity. When the capacity is exhausted, the organism can no longer attend effectively. The practical advice that follows from this model is managerial: budget your attention, allocate it wisely, protect it from unnecessary expenditure.
Gibson rejected this framing entirely, and the rejection has consequences that the attention-economy literature has not absorbed.
Attention, in Gibson's framework, is not a resource. It is a relationship — an active, ongoing engagement between the organism and its environment, shaped by the affordances the environment provides and the perceptual skills the organism has developed. The organism does not "spend" attention on the environment the way it spends calories on locomotion. It attends to the environment — which is to say, it actively explores the environment's structure, detecting affordances, picking up invariants, refining its perceptual attunement through the activity of engagement itself.
This distinction is not semantic. It determines the nature of the intervention. If attention is a resource, the solution to attentional depletion is conservation — reducing the demands on a finite supply. Set time limits. Take breaks. Close tabs. Manage the budget. If attention is a relationship, the solution is ecological — restructuring the environment so that the relationship between organism and environment supports the attentional patterns that produce flourishing. The first approach treats the organism as the locus of the problem. The second treats the organism-environment system as the locus of the problem, and the environment as the primary lever for intervention.
---
Ecology, in the biological sense from which Gibson drew his metaphor, is the study of organisms in their environments. Not organisms in isolation. Not environments without inhabitants. The unit of analysis is always the system — the web of relationships between living things and the conditions in which they live. An ecologist studying a wetland does not study the water separately from the organisms that live in it. The water's chemistry shapes what can live there, and what lives there shapes the water's chemistry. The system is irreducible.
Gibson applied this principle to perception. The organism does not perceive in isolation. It perceives in an environment, and the environment's structure determines what is perceivable. Change the environment, and the perception changes — not because the organism has changed, but because the affordances available for pickup have shifted. The ecologist who studies a wetland that has been drained by upstream diversion does not blame the fish for dying. She studies the environmental change that eliminated the conditions the fish required. The fish have not become less capable. Their environment has become less habitable.
Attentional ecology, properly grounded in Gibson's framework, applies this same logic to the environments in which human cognition operates. When a builder working with AI tools cannot stop prompting at three in the morning, the attentional ecologist does not study the builder's willpower, examine her dopamine receptors, or prescribe meditation. The attentional ecologist studies the affordance structure of the environment — What does the tool offer? What actions does it make most perceivable? What temporal gaps does it provide or eliminate? What affordances for disengagement exist? — and identifies the specific environmental features that make compulsive engagement the most easily perceived action.
The intervention targets the environment, not the organism. Not because the organism is blameless — organisms make choices, and those choices matter — but because the organism's choices are shaped by what the environment makes available. A well-designed affordance landscape makes good choices easy to perceive and easy to enact. A poorly designed affordance landscape makes destructive choices the path of least resistance and beneficial choices an effortful override that requires the organism to work against the grain of its environment.
---
The Berkeley study that Segal cites — the eight-month observation of a 200-person technology company adopting AI tools — provides empirical evidence for exactly what Gibson's framework predicts. The researchers documented task seepage, the tendency for AI-accelerated work to colonize previously protected spaces: lunch breaks, elevator rides, the minute-long gaps between meetings that had previously served as informal cognitive rest.
Gibson's framework identifies these gaps as temporal affordances — features of the environment's temporal structure that specified disengagement as an available action. Before AI, the elevator ride afforded waiting. Waiting is not nothing. Waiting is a temporal environment whose primary affordance is the absence of productive action, and in that absence, the organism's perceptual system disengages from task-oriented processing and enters a mode of diffuse, undirected attention. The default mode network activates. Autobiographical reflection occurs. The consolidation of recently acquired information proceeds without interference from new input.
The AI tool on the smartphone eliminated the elevator ride's temporal affordance. The same ninety seconds now afford prompting — and prompting is more easily perceived, more immediately available, more readily enacted than waiting. The organism does not choose to prompt instead of wait. The organism perceives the affordance for prompting as the most available action in the environment and acts on it, the way a walker perceives the paved path as the most available route and follows it.
The Berkeley researchers called for "structured pauses" — deliberate interruptions built into the workday to protect cognitive rest. Gibson's framework specifies what these pauses must be: not merely the absence of work, but the presence of affordances for non-work. A pause that consists of sitting at the same desk, with the same screen, with the same AI tool a click away, does not afford disengagement. It affords the perception that work is available, easy, and immediate, and the absence of work is a choice that must be actively maintained against the environmental grain.
A genuine pause requires a different environment — one whose affordances specify non-work as the available action. Standing up. Walking outside. Moving to a space that does not contain the tools. The temporal interruption is necessary but not sufficient. The spatial restructuring of affordances is what gives the interruption its ecological force. The organism must be placed in an environment whose offerings are different from the offerings of the workspace, so that the perceptual system encounters a different set of available actions and attends accordingly.
---
The concept of attentional ecology, grounded in Gibson, produces a framework for analyzing any technological environment in terms of the perceptual patterns it supports or degrades. The framework has four components, each derived directly from Gibson's ecological approach.
The first is the affordance audit — a systematic inventory of what the environment offers for doing. What actions does the environment make perceivable? What actions does it make easy? What actions are available in principle but not specified by any feature of the environmental layout? The affordance audit treats the technological environment the way a field ecologist treats a habitat: as a structured space whose properties determine what kinds of organisms can thrive there.
For an AI-augmented workspace, the audit would identify affordances for continued prompting, immediate iteration, breadth of output, the perception of competence, and the absence of natural stopping points. It would also identify the hidden affordances — available but not salient — for pausing, scrutinizing output, engaging directly with the material the AI has produced, and disengaging from the tool entirely.
The second component is the organism profile — an assessment of the perceptual skills the organism brings to the environment. What invariants has the organism learned to detect? What affordances is the organism already attuned to? What perceptual educations has the organism undergone, and what educations has it missed? The organism profile acknowledges what Gibson insisted upon throughout his career: that perception is skilled, and that different organisms perceive different affordances in the same environment because their perceptual histories are different.
A senior engineer entering the AI-augmented workspace brings decades of perceptual attunement to code's structural invariants. She can perceive fragility, technical debt, architectural unsoundness, because her perceptual system has been educated by years of friction-rich engagement. A junior developer entering the same workspace does not yet possess these attunements. The same environment, with the same affordance structure, offers different possibilities to these two organisms — and the risks are different, too. The senior engineer risks atrophy of skills she already possesses. The junior developer risks never developing them at all.
The third component is the interaction analysis — the study of what actually happens when this particular organism encounters this particular affordance structure. Not what should happen according to the designer's intention, and not what would happen if the organism were a rational agent making optimal choices. What actually happens, observed empirically, in the specific ecological context.
The Berkeley researchers performed exactly this kind of interaction analysis, and their findings confirmed what Gibson's framework predicts: organisms responded to the affordances the environment offered, not to the affordances the designers hoped they would perceive. The AI tools were designed to increase productivity. They succeeded. They were not designed to colonize lunch breaks, erode the capacity for disengagement, or produce compulsive engagement patterns. They did these things anyway, because the affordance structure made these behaviors easier to perceive and easier to enact than the alternatives.
The fourth component is the redesign intervention — the deliberate restructuring of the affordance landscape to support the perceptual patterns that the attentional ecologist has determined are conducive to flourishing. This is not content moderation, not time limits, not lectures about mindfulness. It is the physical and temporal restructuring of the environment itself — changing what the environment offers, so that the organism encounters a different field of perceivable possibilities.
---
Gibson was aware, toward the end of his life, that ecological psychology had implications beyond the laboratory. In the final chapter of *The Ecological Approach to Visual Perception*, he wrote about what he called "the theory of affordances for behavior" — the idea that the affordances of the environment are not merely perceptual facts but behavioral ones, that they specify not just what can be seen but what can be done, and that the design of environments is, in the deepest sense, the design of behavioral possibilities.
This insight has been taken up most prominently in the field of physical design — architecture, urban planning, product design — where the principle that environments shape behavior through their affordance structures is now a commonplace. Don Norman's *The Design of Everyday Things* translated Gibson's affordance concept into a design methodology that has influenced a generation of practitioners, though Norman's version, as noted in Chapter 1, diverges from Gibson's in significant ways.
What has not happened — what attentional ecology, properly grounded in Gibson's framework, makes possible — is the application of this principle to the design of cognitive environments. The environments in which human beings think, learn, create, and make decisions are at least as consequential as the environments in which they walk, drive, and manipulate objects. And these cognitive environments are now, for the first time in human history, being designed with the same deliberateness and the same level of engineering sophistication as physical environments.
The AI workspace is a designed cognitive environment. Its affordance structure was engineered by teams of researchers and developers who made specific decisions about what the tool would offer and how it would offer it. The conversational interface was a design choice. The immediate feedback was a design choice. The polished output was a design choice. Each of these choices created affordances that shape the perceptual behavior of every organism that encounters the tool.
And yet the cognitive affordance structure of the AI workspace receives a fraction of the design attention that the physical affordance structure of a building receives. Architects study for years how the layout of a space shapes the behavior of its inhabitants. AI tool designers study how the interface maximizes engagement metrics. The difference is not in sophistication but in objective. The architect designs for human habitation. The AI designer, in the absence of an ecological framework, designs for usage — and usage is not the same as habitation.
Habitation implies dwelling. Remaining. Growing. Developing over time in a space that supports the full range of the organism's needs. Usage implies consumption — the extraction of value from a tool that has been optimized for that extraction. Attentional ecology insists that AI tools are not merely used. They are inhabited. They are the cognitive environments in which an increasing proportion of human thinking, learning, and creating takes place. And the design of these environments, evaluated by the criterion of what they afford for human perceptual and cognitive development, is the most consequential design challenge of the current technological moment.
Gibson did not live to see this challenge. But he built the framework that makes it intelligible. The environment offers. The organism perceives. The affordance shapes the action. The action shapes the organism. The design of the affordance landscape is the design of the organism's development.
Attentional ecology is not a metaphor. It is applied ecological perception theory, and the affordance structure of the AI-mediated environment is its object of study.
When a carpenter first picks up a hammer, the hammer is an object. It has weight, texture, a particular balance in the hand. The carpenter perceives the hammer — its surface properties, its affordances for gripping and swinging. The attention is on the tool.
After months of daily use, something changes. The carpenter no longer perceives the hammer. She perceives the nail. The hammer has become transparent — an extension of the arm, a medium through which the carpenter's perceptual system engages with the wood, the nail, the joint. The hammer's weight, which was once a property of an object being studied, is now a calibration of the perceptual system itself. The carpenter feels the wood through the hammer the way a person feels the road through the soles of shoes — the mediation is there, but it has become invisible. The tool has been incorporated into the body schema, absorbed into the perceptual system as a channel through which the world is perceived rather than a thing in the world to be perceived.
This phenomenon — the transparency of the mastered tool — has been studied by philosophers and psychologists since at least Maurice Merleau-Ponty's *Phenomenology of Perception* in 1945. Michael Polanyi developed a systematic account of it in *Personal Knowledge*, distinguishing between "focal awareness" — attention directed at the object of engagement — and "subsidiary awareness" — the background registration of the tool through which the engagement occurs. The blind person does not attend to the cane. She attends to the world through the cane. The cane has become subsidiarily integrated into her perceptual system.
Gibson's framework provides a specific account of what transparency means in terms of affordance perception. The novice tool-user perceives two sets of affordances: the affordances of the tool (it affords gripping, swinging) and the affordances of the material (it affords being struck, shaped, joined). As mastery develops, the tool's affordances fade from perceptual salience. The tool becomes what Gibson would call a "medium" — a substance through which the environment's affordances are perceived, rather than a surface whose own affordances demand attention. The air is a medium. Water is a medium. Neither is perceived in itself; both are perceived through. The mastered tool joins this category. It becomes part of the perceptual apparatus rather than part of the perceived world.
---
The implications for AI are immediate and consequential.
Claude, for a new user, is an object. The interface demands attention. The user learns to prompt, to evaluate responses, to iterate. The focus is on the tool — what it can do, what it cannot do, how to operate it effectively. This is the focal phase, and it is where most discussions of AI remain: How do I use this tool? What are its capabilities? What are its limitations?
For the experienced user, Claude becomes transparent. The builder no longer attends to the interface. She attends to the problem through the interface. The prompting becomes automatic, the iteration instinctive, the evaluation rapid and embodied rather than deliberate and effortful. Segal describes this state when he recounts working late, losing track of time, the ideas connecting through the conversation with a fluency that erased the boundary between his thinking and the tool's response. The tool had become a medium. He was perceiving the problem through Claude the way the carpenter perceives the wood through the hammer.
This transparency is the condition of skilled tool use. It is also the condition under which the tool's affordance structure becomes most powerful — and most invisible.
When the carpenter perceives the hammer as an object, the hammer's affordances are in focal awareness. The carpenter can evaluate them, choose to use the hammer or set it down, decide that this particular task requires a different tool. When the hammer becomes transparent, these evaluative affordances fade from perception. The carpenter does not choose to use the hammer. She uses it without choosing, because it has become part of the way she perceives the work. The evaluation — the moment of focal attention in which the tool itself is the object of scrutiny — has been automated out of the perceptual process.
For a physical tool with a stable affordance structure, this automation is benign. The hammer's affordances do not change between sessions. Once the carpenter has learned what the hammer offers and what it forecloses, she can trust the tool to behave consistently, and the transparency that comes with mastery is genuinely efficient — freeing perceptual resources for the work that matters.
The AI tool has a different property. Its affordance structure is not stable. It changes with each update, each model revision, each shift in training data or architectural design. The Claude that the builder mastered three months ago is not the Claude she is using today. But the tool's transparency persists — the builder continues to perceive through the tool, continues to treat its responses with the subsidiary awareness of a mastered instrument, even as the instrument's properties have shifted beneath the threshold of focal perception.
---
Gibson's framework identifies a specific danger here that the existing literature on AI tools has not fully articulated. The danger is not that the tool is untrustworthy. It is that the tool's incorporation into the perceptual system makes its properties invisible to the very organism that most needs to evaluate them.
Consider the senior engineer who has been using Claude Code for six months. She has developed a fluent, transparent relationship with the tool. She prompts without thinking about prompting. She evaluates outputs rapidly, with the kind of embodied judgment that comes from hundreds of hours of interaction. The tool has become part of her perceptual apparatus. She perceives codebases through it the way the pilot perceives the approach through the instrument panel.
Now suppose the tool introduces a subtle bias — a tendency to favor a particular architectural pattern, or to produce code that is locally correct but globally fragile. In the focal phase, when the tool was still an object of attention, the engineer might have detected this bias. She was scrutinizing the tool's outputs deliberately, evaluating each response against her own understanding, maintaining the evaluative distance that focal awareness provides.
In the transparent phase, the bias enters her perceptual system through the subsidiary channel. She does not evaluate it because she is no longer evaluating the tool. She is perceiving through it. The bias becomes part of how she sees the codebase, incorporated into her perceptual apparatus alongside the tool that carries it. The architectural pattern that the AI favors becomes the pattern she perceives as natural, because the medium through which she perceives has specified it as the default.
This is not a failure of intelligence or vigilance. It is a structural consequence of tool transparency — the same transparency that makes skilled tool use possible. The mastered tool becomes invisible, and its properties become the properties of the world it reveals. The blind person who uses a cane with a bent tip perceives a curved surface where a flat one exists. The curvature is not in the world. It is in the medium. But because the medium is transparent, the curvature is experienced as a property of the surface, not of the cane.
---
The problem deepens when one considers that the AI tool does not merely mediate perception of an existing environment. It generates the environment. When Claude produces code, the code is not a window onto a pre-existing computational reality. It is a construction — an artifact produced by the tool that then becomes part of the environment the builder perceives. The builder perceives the code through the tool that produced it, and the code itself is a product of the tool's own generative processes.
In Gibson's terms, this is a collapse of the distinction between medium and surface. The medium is supposed to be that which the organism perceives through. The surface is that which the organism perceives. When the medium generates the surface, the organism is perceiving the medium's outputs through the medium itself. The evaluative distance that the medium-surface distinction provides — the ability to distinguish between what the tool reveals and what the tool introduces — dissolves.
Segal catches this moment in his account of the Deleuze passage. Claude produced text that attributed a concept to Deleuze. The text was polished, confident, well-integrated into the surrounding argument. Segal, perceiving the text through the transparent medium of his established collaboration with Claude, initially accepted it. The surface — the passage — was generated by the medium — Claude — and perceived through the same medium that generated it. The evaluative distance was zero.
It was only later, when Segal broke the transparency — when he shifted Claude from subsidiary to focal awareness, from medium back to object — that he detected the error. The shift required deliberate effort. It required the specific perceptual act of attending to the tool rather than through it, of treating the output as a surface to be scrutinized rather than a window to be looked through.
Gibson's framework suggests that this deliberate shift — from subsidiary to focal awareness, from transparent use to evaluative scrutiny — must be a practiced skill, maintained against the natural tendency of mastered tools to become invisible. The carpenter who occasionally picks up the hammer and examines it, checking for cracks, testing the balance, attending to the tool as an object rather than a medium, maintains the evaluative relationship that transparency tends to erode.
The AI user who occasionally steps back from the collaboration and scrutinizes the tool's outputs with fresh, focal attention — who treats Claude as an object to be evaluated rather than a medium to be perceived through — performs the same maintenance. This is not paranoia. It is the perceptual hygiene of skilled tool use.
---
There is a further consequence of tool transparency that extends beyond the individual user to the collective ecology of knowledge work. When a tool becomes transparent for a community of users — when an entire profession perceives its work through the medium of AI — the tool's properties become the profession's assumptions. The patterns the AI favors become the patterns the profession considers natural. The solutions the AI generates most easily become the solutions the profession generates most frequently. The limits of the AI's competence become the profession's blind spots, because the profession perceives through the tool and the tool cannot reveal its own limitations.
This is not a future risk. It is a present reality in every field where AI tools have achieved widespread adoption. The code patterns that AI generates most fluently are already becoming the patterns that developers consider standard, not because the patterns are optimal but because the medium specifies them as the default. The legal arguments that AI assembles most readily are already becoming the arguments that junior lawyers reach for first, not because the arguments are strongest but because they are most available through the medium the lawyers have learned to perceive through.
Gibson studied how physical environments shape perceptual development — how the infant who grows up in a visually rich environment develops different perceptual skills than the infant raised in a visually impoverished one. The same principle applies to cognitive environments. The profession that thinks through AI develops different cognitive patterns than the profession that thinks without it. The patterns are not worse in every respect. But they are shaped by the medium, and the shaping is invisible to the organisms inside it, because the medium is transparent.
Gibson, again, would not frame this as a reason to reject the tool. He would frame it as a reason to study the mediation — to develop the perceptual practice of shifting between transparent and focal engagement, between perceiving through the tool and perceiving the tool, between using and evaluating. The blind person who periodically examines her cane maintains the ability to distinguish between what the cane reveals and what the cane introduces. The builder who periodically scrutinizes Claude's outputs with focal, evaluative attention maintains the ability to distinguish between what the AI reveals about the problem and what the AI introduces from its own generative patterns.
This periodic scrutiny is not a luxury. It is a structural requirement of skilled tool use in an environment where the tool generates the surfaces the organism perceives through it. The organism that cannot distinguish between what the medium reveals and what the medium introduces has not mastered the tool. The tool has mastered the organism.
In 1935, the Australian government released cane toads into the sugarcane fields of Queensland. The toads were imported from Hawaii, where they had successfully controlled beetle populations that were destroying sugarcane crops. The plan was simple and the logic seemed sound: introduce a predator, reduce the pest population, protect the harvest.
The cane toads did not eat the beetles. The beetles lived on the upper stalks of the sugarcane, and the toads, being ground-dwelling creatures with limited climbing ability, could not reach them. What the toads did instead was spread — rapidly, voraciously, and without natural predators — across the Australian landscape. They poisoned the native animals that tried to eat them. They outcompeted native species for food and habitat. They colonized territory at a rate of roughly fifty kilometers per year. Ninety years later, the cane toad population in Australia numbers in the hundreds of millions, and the ecological damage is incalculable.
The cane toad is the canonical example of what ecologists call emergent behavior — outcomes that arise from the interaction between an introduced organism and an existing ecosystem in ways that the designers of the introduction could not predict from the properties of the organism or the ecosystem alone. The toads were not designed to devastate Australian wildlife. The Australian environment was not designed to be devastated. The devastation emerged from the specific, unpredictable interaction between the organism's properties and the environment's affordance structure.
Gibson's framework, applied to technological environments, predicts that emergent behavior is not an occasional failure of design but a structural feature of complex affordance landscapes. The designer can specify what the environment offers. The designer cannot fully predict what organisms will do with the offering, because the organism's response depends on its own history, its perceptual skills, its current needs, and the broader ecological context of its engagement — none of which the designer controls.
---
The productive addiction that Segal documents throughout *The Orange Pill* is an emergent behavior. Nobody at Anthropic designed Claude Code to produce compulsive engagement patterns in software developers. The tool was designed to afford efficient, high-quality code generation through natural language interaction. The conversational interface was designed to afford ease of use. The immediate feedback was designed to afford rapid iteration. The polished output was designed to afford confidence in the result.
Each of these design decisions was reasonable in isolation. None of them specified compulsive engagement as a goal. But the combination — frictionless output, immediate gratification, conversational stimulation, the absence of natural stopping points, and the genuine intellectual satisfaction of watching ideas take form in real time — created an affordance structure in which compulsive engagement became the path of least resistance for a significant proportion of users.
The spouse who wrote "Help! My Husband is Addicted to Claude Code" was documenting an emergent behavior. Her husband was not engaging with a tool designed for addiction. He was engaging with a tool whose affordance structure, interacting with his specific perceptual history, his professional identity, his need for creative expression, and the broader cultural context of productivity optimization, produced a behavioral pattern that neither the designers nor the user intended.
Gibson's framework is precise about why this kind of emergence is predictable in principle but unpredictable in specifics. Affordances are relational — they exist in the interaction between organism and environment, not in either alone. The conversational interface affords continued prompting, but whether a particular user will prompt compulsively depends on factors outside the designer's control: the user's tolerance for uncertainty, her relationship to her work identity, the strength of her competing commitments, the affordance structure of the rest of her environment. The same affordance that produces compulsive engagement in one organism produces healthy, boundaried use in another, not because the affordance is different but because the organism is.
---
The history of technology is saturated with emergent behaviors that designers did not anticipate and could not have predicted from the properties of the technology alone.
The automobile was designed for transportation. It produced suburban sprawl, the decline of urban cores, the restructuring of social geography around the assumption of personal vehicle ownership, and a complete reorganization of American economic life that no automotive engineer intended or foresaw. These outcomes emerged from the interaction between the automobile's affordance structure — it afforded individual mobility over long distances — and the existing social, economic, and geographic landscape of twentieth-century America.
Email was designed for asynchronous communication — a way to send messages that the recipient could read and respond to at her convenience. It produced the expectation of immediate response, the erosion of the boundary between work and personal time, and the particular anxiety of the perpetually full inbox. These outcomes emerged from the interaction between email's affordance structure — it afforded instantaneous delivery and permanent availability — and the cultural norms of professional responsiveness.
Social media was designed for connection — a way to maintain relationships across distance and share experiences with a community. It produced polarization, anxiety, the fragmentation of shared reality, and the specific loneliness of people who are perpetually connected and perpetually unsatisfied. These outcomes emerged from the interaction between the feed's affordance structure and the human perceptual system's evolved responsiveness to social information.
In each case, the designers were not naive. They understood their technology. What they did not understand, and could not have understood from within the design process, was the ecological system into which their technology was being introduced. The automobile designers understood engines and roads. They did not understand the interaction between personal mobility and land-use patterns. The email designers understood messaging protocols. They did not understand the interaction between instant delivery and cultural norms of professional availability. The social media designers understood engagement mechanics. They did not understand the interaction between algorithmic curation and the human perceptual system's vulnerability to social comparison.
---
Gibson's framework explains why emergent behavior is structurally unpredictable, not merely practically difficult to foresee. Affordances are properties of the organism-environment system, not of the environment alone. The designer controls the environment. The designer does not control the organism, does not control the broader ecology in which the organism-environment interaction takes place, and does not control the temporal dynamics through which the interaction evolves. The organism's perceptual system adapts to the affordance structure over time, developing new attunements, losing old ones, and the adapted organism encounters the same environment differently than the unadapted organism did. The affordance landscape is not static. It co-evolves with the organisms that inhabit it.
This co-evolution is the mechanism through which emergent behaviors develop and stabilize. The builder who begins using Claude Code with deliberate, boundaried intention — prompting during work hours, scrutinizing outputs carefully, maintaining the focal awareness that allows evaluation — is encountering a particular affordance landscape. But as she develops transparency with the tool, as the prompting becomes automatic and the evaluation becomes rapid and embodied, the affordance landscape she perceives shifts. Affordances that were hidden become salient. The affordance for prompting during the elevator ride, which was not perceived during the focal phase, becomes perceivable once the tool is transparent. The affordance for continued iteration past the point of diminishing returns, which was resistible when the tool demanded focal attention, becomes the path of least resistance when the tool is subsidiary.
The organism changes the environment by inhabiting it, and the changed environment changes the organism. This is the co-evolutionary spiral that produces emergent behaviors. The cane toad altered the Australian ecosystem by surviving in it, and the altered ecosystem afforded the toad's further spread. The builder alters her cognitive ecology by using AI tools, and the altered ecology affords deeper integration of the tool into her cognitive processes.
---
The practical consequence of this analysis is that the design of AI tools cannot be evaluated at the point of introduction. It must be evaluated ecologically — through ongoing observation of the emergent behaviors that the affordance structure produces in the actual organisms that inhabit it.
This is what distinguishes attentional ecology from user experience design. UX design evaluates the interface at the point of contact: Does the user understand what to do? Can she accomplish her goal? Is the interaction efficient and satisfying? These are legitimate questions, but they are questions about the designed affordances, not about the emergent behaviors. A UX evaluation of Claude Code would likely conclude that the tool is extraordinarily well-designed — intuitive, responsive, powerful, satisfying. It would not detect the productive addiction, because the productive addiction is not a feature of the interface. It is an emergent property of the organism-environment system, observable only over time and only through the kind of embedded, longitudinal observation that the Berkeley researchers performed.
Gibson would insist on the longitudinal perspective. Ecological observation is not snapshot observation. It requires time — time to observe how the organism adapts to the affordance structure, how the affordance structure changes under the organism's habitation, how emergent behaviors develop and stabilize and resist intervention. The snapshot evaluation sees the tool working as designed. The ecological evaluation sees the system evolving in ways the design did not anticipate.
This is why Segal's call for ongoing, empirical attention to what AI does to the minds that use it is not a precautionary luxury but a structural necessity. The affordance structure of AI tools is powerful enough and novel enough that the emergent behaviors it produces cannot be predicted from first principles. They can only be observed, documented, and responded to through the continuous, adaptive, ecologically informed practice that attentional ecology requires.
---
The limits of design are not a counsel of despair. They are a counsel of humility — the recognition that the designer's control extends to the affordance structure of the environment but not to the behaviors the affordance structure produces. The cane toad disaster was not inevitable. It was the consequence of an introduction made without ecological study, without ongoing monitoring, without contingency plans for emergent outcomes. The toads were released and left to interact with the ecosystem without observation, and by the time the emergent behavior was documented, the interaction had progressed past the point of simple intervention.
AI tools have been released into the cognitive ecology of knowledge work with a similar confidence that the designed affordances will produce the designed outcomes. The tools are monitored for technical performance — accuracy, speed, safety — but not for ecological impact. The emergent behaviors documented by the Berkeley researchers, by Segal, by the spouses and colleagues and managers who have watched AI-augmented workers disappear into their tools, are the early signals of an ecological interaction that is still in its initial phase.
Gibson's framework provides the theoretical basis for taking these signals seriously. Emergent behaviors are not anomalies. They are the predictable consequence of introducing a powerful new affordance structure into a complex existing ecology. The question is not whether emergent behaviors will occur — they will, as surely as the cane toads spread — but whether the ecological monitoring systems are in place to detect them early, and whether the institutional flexibility exists to respond.
The beaver builds the dam. But the beaver also patrols the dam, every day, checking for the places where the current has loosened a stick or eroded the mud. The maintenance is not less important than the construction. In the ecology of AI-mediated work, the construction is the design of the tool. The maintenance is the ongoing, empirical observation of what the tool actually produces in the organisms that use it — the attentional patterns, the perceptual developments, the emergent behaviors that no designer intended and no designer can fully predict.
The observation must be as continuous as the interaction it observes. The ecology does not pause for evaluation. The river does not wait for the beaver to finish checking.
A geologist reads a cliff face the way a musician reads a score. The layers of sediment, the angle of the strata, the color shifts that mark the boundary between one epoch and another — these are not data points to be catalogued and computed. They are invariants in the structure of the rock, specifying the history of the landscape to any organism whose perceptual system has been educated to detect them. The geologist does not infer the cliff's history from its appearance. She perceives the history in the appearance, directly, because the information that specifies geological time is structured into the visible surface of the rock.
Gibson called this kind of perception "information pickup" and distinguished it sharply from the processing of data. Data is abstract, detached from the environment that produced it, requiring computation to yield meaning. Information, in Gibson's technical sense, is structured energy — patterns in the ambient array that specify properties of the environment without requiring the organism to compute, interpret, or infer. The optic flow pattern that specifies locomotion is information. The texture gradient that specifies distance is information. The sedimentary layering that specifies geological time is information. In each case, the meaning is in the structure, available for direct pickup by the attuned perceiver.
The distinction between information and data is not a terminological preference. It is a claim about the nature of understanding. An organism that picks up information from a structured environment develops a perceptual attunement to that environment — a capacity for direct, rapid, reliable detection of the properties that matter. An organism that processes data develops computational skills — the capacity to manipulate abstract representations according to rules. Both are forms of competence. They are not the same form, and they produce different kinds of understanding.
The geologist who has spent twenty years reading cliff faces possesses a perceptual competence that no amount of data processing can replicate. Show her a photograph of a rock formation, and she perceives its structure — the faults, the folding, the intrusion of igneous material through sedimentary layers — in a single glance. Her understanding is not the conclusion of an argument. It is the perception of a pattern, as immediate and as certain as the perception of a face.
Train an AI on the same geological dataset, and it will classify rock formations with impressive accuracy. It will identify strata, date deposits, flag anomalies. It will process the data correctly. But the AI's competence is computational, not perceptual. It manipulates representations according to learned patterns. It does not pick up information from the structure of the environment, because it does not inhabit the environment. It processes images of the environment — flat, detached, stripped of the ambient structure that Gibson argued was the vehicle of genuine perceptual information.
---
This distinction bears directly on the question of what kind of understanding AI-mediated work produces. When a builder works with Claude Code, the information she encounters is not the same kind of information she would encounter in unmediated engagement with the problem.
Unmediated engagement with a software system is a perceptual activity in Gibson's full sense. The developer reads the code — not as data to be processed but as a structured environment to be explored. She traces execution paths, encounters error conditions, discovers dependencies that the documentation does not describe. Each encounter deposits information — not in her memory, exactly, but in her perceptual system's attunement to the invariants of the codebase. Over time, the system becomes perceivable to her in the way the cliff face is perceivable to the geologist: as a structured environment whose properties are specified by patterns she has learned to detect.
AI-mediated engagement provides a different kind of information. Claude's response to a question about the codebase is not a pattern in the ambient array, available for direct pickup. It is generated text — a representation of the system's properties, produced by computational processes, articulated in natural language, and presented as a finished description. The information in the response is semantic, not ecological. It tells the builder about the system. It does not afford the builder the perceptual engagement with the system that would develop her own capacity to detect its properties directly.
The difference is analogous to the difference between walking through a forest and reading a description of a forest. The description may be accurate, detailed, and informative. The walker who has read the description knows facts about the forest — its species composition, its topography, its hydrology. The walker who has walked through the forest knows the forest — its textures, its light, the way the ground feels under different canopy conditions, the specific silence of a conifer stand versus the rustling openness of a deciduous grove. The first kind of knowledge is propositional. The second is perceptual. Both are real. Only one is built through the organism's direct engagement with the structured environment.
---
Gibson was emphatic that information pickup requires exploration — active movement through the environment that samples the ambient array from multiple viewpoints and across time. The pilot perceives the landing approach by flying the approach. The geologist perceives the cliff face by walking along it, examining it from different angles, touching the rock, noting how the surface changes with the light. The information is there, structured into the environment, but it yields itself only to the organism that moves through it.
AI-mediated information does not require exploration. It arrives in response to a description of the problem, formatted, structured, and complete. The builder does not move through the information. The information is delivered to her, pre-structured, by a system that has processed it computationally and rendered it in a form optimized for comprehension.
This delivery is efficient. It is also ecologically impoverished, in a specific sense that Gibson's framework makes precise. The information that exploration yields is richer than the information that delivery yields, because exploration exposes the organism to invariants that description cannot capture. The developer who debugs a system manually encounters not just the error but the context of the error — the specific state of the system at the moment of failure, the sequence of operations that produced the state, the architectural decisions that made the failure possible. This contextual information is not incidental. It is the medium in which the error's significance is specified. The error, encountered in context through exploratory engagement, affords a different kind of understanding than the same error described in a report.
Claude can describe the error. It can identify its cause, suggest its fix, and explain the architectural decisions that made it possible. The description may be complete and accurate. What it cannot provide is the perceptual experience of encountering the error in context — the specific attentional sharpening that occurs when the organism, moving through the system, detects something that does not match the expected pattern. That sharpening is the mechanism through which perceptual expertise is built. It is the moment when the organism's attentional system recalibrates, adding a new invariant to its repertoire, becoming slightly more capable of detecting the next anomaly.
The AI does the exploration computationally and delivers the result. The organism receives the result without performing the exploration. The result is correct. The perceptual recalibration does not occur.
---
The consequences compound over developmental time. A junior developer who begins her career in an AI-mediated environment receives information about software systems from the first day. She learns facts, patterns, best practices, architectural principles. She may learn them faster than her predecessors did, because the AI delivers information efficiently and answers questions immediately. Her propositional knowledge — her ability to state what is true about software systems — may develop more rapidly than any previous generation of developers.
But her perceptual knowledge — her ability to detect invariants directly, to perceive the structure of a system through engagement with its specifics, to feel when something is wrong before she can articulate what — develops differently, because the affordances for exploratory engagement are reduced. She has fewer opportunities to debug manually, to trace execution paths by hand, to encounter errors in context and develop the attentional attunement that context provides. The information she receives is mediated, pre-structured, and delivered — not picked up through her own exploratory activity.
Gibson would predict that this developer will be competent in a way that is different from the competence of her predecessors. She will know more, in the propositional sense. She will perceive less, in the ecological sense. Her capacity to describe a system's properties will exceed her capacity to detect those properties directly. She will be articulate about architecture but less attuned to the subtle invariants that specify structural fragility, technical debt, and the thousand small warning signs that the experienced developer perceives without conscious analysis.
This is not a deficiency in the developer. It is a consequence of the affordance structure of her environment. She has developed the perceptual skills that her environment afforded. Her environment afforded the pickup of mediated, propositional, delivered information. It did not afford the kind of exploratory, direct, friction-rich engagement through which ecological perceptual skills develop.
---
The question this analysis raises is not whether AI-mediated information is valuable — it manifestly is — but whether the affordance landscape of AI-mediated work can be designed to include affordances for direct perceptual engagement alongside the affordances for mediated information delivery.
The answer, Gibson's framework suggests, is yes — but only through deliberate design, because the default affordance structure of AI tools overwhelmingly favors delivery over exploration. The conversational interface makes it easier to ask the AI than to investigate directly. The immediate response makes it easier to receive an answer than to discover one. The polished output makes it easier to accept a description than to develop a perception.
A redesigned affordance landscape might include what could be called "exploration affordances" — features that encourage the organism to engage directly with the material the AI has produced, rather than accepting it as a finished description. A code generation tool that delivers not just the code but a series of intermediate states — showing how the solution developed, inviting the builder to trace the logic, offering points where the builder must make a choice rather than accepting the AI's default — would afford a kind of engagement that the current interface does not.
A tool that deliberately withholds complete answers — that provides partial solutions and requires the organism to complete them, or that generates multiple alternatives and requires the organism to evaluate them through direct engagement rather than description — would afford the kind of exploratory activity that builds perceptual skill.
These design choices would reduce efficiency. A tool that requires the organism to explore is slower than a tool that delivers the answer. But the efficiency loss would be offset by a developmental gain — the organism would develop perceptual competencies that the delivery-optimized interface systematically eliminates.
Gibson spent his career arguing that the richness of perception depends on the richness of the information available for pickup, and that the richness of the information depends on the organism's active, exploratory engagement with a structured environment. The AI-mediated environment is structured. It contains information. But the affordance structure of the mediation determines whether the organism picks up that information through its own perceptual activity or receives it passively through a channel that bypasses the perceptual system entirely.
The cliff face does not deliver its geological history in a report. It structures the information into its visible surface and waits for the geologist who has learned to read it. The geologist's expertise is not knowledge about the cliff. It is the capacity to perceive the cliff — to pick up the information that specifies its history directly from the structure of the rock.
The question for AI-mediated environments is whether they can be designed to function more like cliff faces and less like reports — offering structured information for direct pickup rather than pre-processed descriptions for passive reception. The ecological richness of the environment depends on it. And the perceptual development of the organisms that inhabit it depends on the ecological richness of the environment.
Gibson's principle is simple and unbending: the organism develops the perceptual skills that its environment affords. Design the environment for delivery, and the organism develops skills of reception. Design the environment for exploration, and the organism develops skills of perception.
The choice is in the design.
Gibson died in December 1979, six months after the publication of the book that contained his most complete statement of ecological perception theory. He did not live to see the personal computer become ubiquitous, the internet restructure communication, the smartphone restructure attention, or artificial intelligence restructure production. He never encountered a large language model, never saw a social media feed, never experienced the affordance structure of an environment that generates the surfaces the organism perceives through it.
And yet the framework he built — the insistence that perception is an activity, not a reception; that environments structure the possibilities for action; that affordances shape behavior as powerfully as any physical constraint; that the organism's cognitive development depends on the ecological richness of the environment it inhabits — this framework describes the AI moment with a precision that no contemporary analysis has matched, because Gibson was not describing a technology. He was describing the fundamental relationship between organisms and their environments, a relationship that does not change when the environment is made of code.
The question that remains is practical: Given what Gibson's framework reveals about the affordance structure of AI-mediated environments — the systematic privileging of speed over depth, delivery over exploration, breadth over attunement — what would a well-designed affordance landscape look like? Not in the abstract, not as a philosophical ideal, but as a set of concrete design principles derived from ecological perception theory and applicable to the environments in which human beings will increasingly live, think, learn, and create.
---
The first principle is the preservation of affordances for exploratory engagement.
Gibson demonstrated that perceptual skill develops through active exploration — the organism moving through a structured environment, sampling it from multiple viewpoints, detecting invariants through the contrast between what changes and what remains stable. The current affordance structure of AI tools systematically reduces the opportunities for this kind of exploration by delivering results rather than structuring environments for discovery.
A well-designed AI environment would include what might be called "exploratory friction" — deliberate features that afford the organism's own perceptual engagement with the material. Not friction for its own sake. Friction that specifies exploration as an available action, making it perceivable alongside the affordance for accepting the delivered result.
Concretely, this might mean AI tools that present multiple solution paths rather than a single optimal result, requiring the organism to evaluate alternatives through engagement rather than accepting a default through passivity. It might mean tools that decompose complex outputs into stages, inviting the organism to examine each stage before proceeding to the next. It might mean tools that deliberately withhold confidence indicators, forcing the organism to develop its own perceptual capacity for evaluating quality rather than relying on the tool's self-assessment.
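As a thought experiment, one of these designs can be sketched in a few lines. The sketch below is a hypothetical staged-delivery wrapper, not any existing tool's API: each intermediate state must be explicitly requested, an act of engagement, before the next becomes visible, and no confidence score accompanies any stage. All names and the three-stage example are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class StagedResult:
    """Hypothetical staged-delivery wrapper for an AI-generated solution.

    Each intermediate state must be explicitly requested before the next
    becomes visible, and no stage carries a confidence score; evaluation
    is left to the reader.
    """
    stages: list        # ordered intermediate states of the solution
    _revealed: int = 0  # how many stages have been examined so far

    def next_stage(self):
        """Reveal the next stage, or None once every stage has been seen."""
        if self._revealed >= len(self.stages):
            return None
        stage = self.stages[self._revealed]
        self._revealed += 1
        return stage

    def complete(self) -> bool:
        return self._revealed == len(self.stages)


# A solution decomposed into three intermediate states (illustrative).
result = StagedResult(stages=["parse the input", "build the index", "emit the report"])
seen = []
while (stage := result.next_stage()) is not None:
    seen.append(stage)  # the reader examines each stage in turn

assert seen == ["parse the input", "build the index", "emit the report"]
assert result.complete()
```

The design choice the sketch encodes is the one the paragraph names: the organism cannot skip to the finished artifact; the interface affords examination at each step rather than acceptance of a default.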
Each of these designs would reduce the tool's efficiency as measured by speed of output. Each would increase the organism's perceptual development as measured by the capacity for direct, skilled evaluation of the material. The trade-off is real, and it cannot be wished away. An environment optimized for maximum speed of production is, by structural necessity, an environment depleted of the affordances on which perceptual development depends.
---
The second principle is the provision of temporal affordances for non-engagement.
Gibson's framework identifies time as a structural feature of the environment, not merely a constraint within which the organism operates. The temporal structure of an environment determines which actions are perceivable and which are not. A gap between input and response is a temporal affordance — it specifies pausing, reflecting, or disengaging as available actions. An immediate response eliminates this affordance, specifying continuation as the primary action.
The current affordance structure of AI tools is temporally compressed. The response arrives in seconds. The organism perceives continuation as the available action and acts on it. The temporal space in which reflection, evaluation, and disengagement would occur has been compressed below the threshold of perceptual salience.
A well-designed AI environment would build temporal affordances into the interaction — not as imposed delays, which the organism would experience as obstacles, but as structured intervals that afford different kinds of engagement. A "thinking pause" after a complex prompt, during which the interface signals that the response is being formulated, would afford the organism a moment to formulate its own expectations — to predict what the response should contain, to notice what it cares about, to develop the evaluative stance that makes scrutiny possible.
The specific duration and structure of these temporal affordances would need to be determined empirically — through the kind of ecological observation that attentional ecology requires. Too short, and the affordance is imperceptible. Too long, and the affordance becomes an obstacle rather than an opportunity. The right duration depends on the organism's perceptual skills, the complexity of the task, and the broader ecological context of the engagement. There is no universal answer. There is only the commitment to studying the specific interaction and adjusting the design accordingly.
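To make the tuning problem concrete, here is a minimal hypothetical sketch of such a temporal affordance: a pause that grows with task complexity but is clamped between a perceptual-salience floor and an obstacle ceiling. The function name and every constant are illustrative placeholders; they are exactly the parameters that would need the empirical calibration described above.

```python
def thinking_pause(prompt_complexity: float,
                   floor_s: float = 1.5,
                   ceiling_s: float = 8.0) -> float:
    """Return a pause duration in seconds before a response is shown.

    Hypothetical sketch of a temporal affordance: the pause scales with
    task complexity but is clamped. Below `floor_s` the affordance would
    be imperceptible; above `ceiling_s` it would read as an obstacle.
    All constants are placeholders awaiting empirical tuning.
    """
    raw = floor_s * (1.0 + prompt_complexity)  # grows with complexity
    return max(floor_s, min(ceiling_s, raw))   # clamp to the salient band

assert thinking_pause(0.0) == 1.5    # simple prompt: minimum salient pause
assert thinking_pause(10.0) == 8.0   # complex prompt: capped before it obstructs
```

The point of the clamp is the point of the paragraph: the same interval is an affordance inside one band and an obstacle outside it, and only observation of real interactions can locate the band.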
---
The third principle is the maintenance of affordances for focal evaluation of the tool itself.
Chapter 7 described the phenomenon of tool transparency — the process through which a mastered tool becomes a perceptual medium, perceived through rather than perceived as an object. Transparency is the condition of skilled use, but it is also the condition under which the tool's properties become invisible to the organism that most needs to evaluate them.
A well-designed AI environment would include features that periodically restore the tool to focal awareness — that interrupt the transparent use and afford the organism a moment of evaluative scrutiny. Not constantly, which would prevent the transparency that skilled use requires. But regularly enough to prevent the complete subsumption of the tool into the perceptual system.
Concretely, this might mean an interface that periodically surfaces its own uncertainty — not just in the content of its responses but in the structure of the interaction itself. A tool that occasionally says "I am less confident about this than my tone suggests" performs a metacognitive function that the organism in transparent mode cannot perform for itself. A tool that tracks its own influence patterns — noting when the organism has accepted three consecutive suggestions without modification — and flags the pattern for the organism's attention affords the specific evaluative scrutiny that transparency tends to eliminate.
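The influence-tracking feature described above is simple enough to sketch. The class below is a hypothetical illustration, not a feature of any existing tool: it counts consecutive verbatim acceptances and signals when the streak reaches a threshold, at which point the interface would surface the pattern to the user's focal attention. The threshold and the definition of "modification" are assumptions to be tuned.

```python
class InfluenceTracker:
    """Hypothetical tracker for the pattern described in the text:
    consecutive AI suggestions accepted without modification.

    The threshold and the equality test standing in for 'modification'
    are placeholder assumptions for empirical tuning.
    """
    def __init__(self, threshold: int = 3):
        self.threshold = threshold
        self._streak = 0

    def record(self, suggestion: str, accepted_text: str) -> bool:
        """Record one interaction; return True when the streak should be
        flagged for the user's focal attention."""
        if accepted_text == suggestion:  # accepted verbatim
            self._streak += 1
        else:                            # any edit resets the streak
            self._streak = 0
        if self._streak >= self.threshold:
            self._streak = 0             # reset after flagging
            return True
        return False

tracker = InfluenceTracker()
assert tracker.record("a", "a") is False
assert tracker.record("b", "b") is False
assert tracker.record("c", "c") is True   # third verbatim acceptance flags
assert tracker.record("d", "edited d") is False
```

Note what the sketch does not do: it does not block the user or judge the acceptances. It only restores the tool to focal awareness, which is the ecological function the paragraph assigns to such a feature.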
These features would work against the grain of the tool's primary affordance structure, which is optimized for fluent, uninterrupted collaboration. The interruption would be perceived, initially, as friction — an obstacle to the smooth flow of production. But the friction would serve a specific ecological function: the preservation of the organism's capacity to perceive the tool as a tool, rather than experiencing it as an extension of its own cognitive apparatus.
---
The fourth principle is the design of affordance structures that support the full range of attentional modes — not just the focused, productive, task-oriented attention that AI tools most naturally afford, but also the diffuse, undirected, ruminative attention that the default mode network requires and that boredom, in its ecological function, produces.
The current affordance structure of AI tools is relentlessly productive. Every moment affords action. Every gap affords filling. The tool is always available, always responsive, always ready to engage. The affordance for non-productive attention — for the specific cognitive state in which the perceptual system disengages from task-oriented processing and enters the diffuse, associative mode that produces creative insight, autobiographical reflection, and the consolidation of learning — is not merely absent. It is actively suppressed by an affordance structure that specifies productive engagement as the perpetual default.
A well-designed attentional ecology would include environments — physical, temporal, digital — whose affordance structures support non-productive attention. These are not breaks from work. They are environments for a different kind of cognitive activity, as essential to the organism's development as sleep is essential to physical recovery. The garden that Byung-Chul Han tends is an environment whose affordance structure supports precisely this kind of attention. The soil affords engagement that cannot be accelerated. The season affords patience that cannot be optimized. The plant affords the perception of growth that operates on its own temporal scale, indifferent to the organism's preferences.
Not everyone has a garden. But the principle — that the affordance landscape of a healthy cognitive ecology must include environments that afford non-productive, non-goal-directed, temporally uncompressed attention — is universal. The design challenge is to create these environments within the digital ecology, not merely beside it.
---
The fifth principle is the recognition that affordance design is an ongoing ecological practice, not a one-time engineering intervention.
Gibson studied environments that were, for the most part, stable. The cliff face does not redesign itself between the geologist's visits. The forest floor does not rearrange its surfaces overnight. The affordance structure of the natural environment changes slowly, on geological or ecological timescales, and the organism's perceptual adaptation can keep pace.
The AI-mediated environment changes rapidly. The tool is updated. The model is retrained. The affordance structure shifts. Capabilities appear and disappear. The patterns the organism has learned to perceive through the tool are altered by changes in the tool's underlying processes. The co-evolutionary spiral between organism and environment, which in natural ecologies unfolds over generations, unfolds in the AI ecology over months.
This pace of change means that the design of the affordance landscape cannot be completed. It must be maintained — continuously observed, evaluated, and adjusted in response to the emergent behaviors the environment produces. This is the ecological practice that Segal captures in the image of the beaver maintaining its dam: not a single act of construction but a perpetual engagement between the builder and the current.
---
Gibson's ecological approach to perception was, at its core, an argument about where to look when studying how organisms know their world. The dominant tradition said: look inside the head, at the computational processes that transform sensory data into meaningful perception. Gibson said: look at the relationship between the organism and its environment, at the affordances the environment provides and the perceptual skills the organism has developed to detect them.
Applied to the AI moment, this shift in attention — from the internal state of the user to the structure of the environment the user inhabits — transforms the conversation. The question is no longer whether AI tools are making people smarter or more dependent, more productive or more depleted, more capable or more shallow. These are questions about the organism, and they admit no general answer because organisms vary.
The question is what the environment affords. What possibilities for action does the AI-mediated environment specify? Which of those possibilities are most easily perceived? Which are hidden? What affordances for depth, for exploration, for sustained engagement, for non-productive attention exist in the environment? What affordances have been eliminated by the environment's optimization for speed?
These are questions about the environment, and they admit specific, empirical, designable answers. The affordance landscape of AI-mediated work is not a natural formation. It is an artifact — built by designers, deployed by organizations, inhabited by organisms. It can be studied. It can be evaluated. It can be redesigned.
Gibson spent his career demonstrating that the environment is richer than the organism suspects. The ambient array contains information that specifies the world's properties directly, available for pickup by the attuned perceiver. The tragedy of the current AI moment, analyzed ecologically, is not that the tools are too powerful. It is that the affordance landscape has been designed to capture only a fraction of their potential — the fraction that maximizes speed of production — while systematically eliminating the affordances for the perceptual development, the exploratory engagement, and the cognitive depth that would allow the organisms inhabiting these environments to use the tools' full power wisely.
The cliff face affords falling-off. It also affords climbing-on. The geological strata afford the geologist's perception of deep time. The forest floor affords the child's first understanding of ecological complexity. The environment is generous with its affordances, offering more than any single organism can detect, inviting perceptual development that continues as long as the organism continues to explore.
The AI-mediated environment can be equally generous — if it is designed to be. The affordances for speed are already there, powerful and perceivable and heavily used. The affordances for depth are the ones that must be built. They will not emerge from the tool's default logic. They must be designed, deliberately, by people who understand that the environment shapes the organism, that the affordance determines the action, and that the quality of human cognition in the age of artificial intelligence depends not on the power of the tools but on the ecological richness of the environments in which those tools are used.
Gibson provided the framework. The design is ours.
The word I had never thought to question was "environment."
I used it constantly — in conversations with my team, in the pages of *The Orange Pill*, in my late-night sessions with Claude where the ideas would connect faster than I could track them. Digital environment. Work environment. The environment of the feed, the workspace, the interface. I used the word the way most people do: as a backdrop. A setting. The place where things happen, but not itself a thing that acts.
Gibson dismantled that assumption with a rigor I was unprepared for. The environment is not a backdrop. It is a field of structured offerings — affordances, in his language — and those offerings shape what organisms do as powerfully as any force, any incentive, any conscious decision. The cliff edge does not persuade you to fall. It affords falling-off, and the affordance is there whether you attend to it or not.
When I read that — really absorbed it — I stopped thinking about the individual. I stopped asking, "Why can't this person stop prompting at three in the morning?" and started asking a different question entirely: "What does this environment offer that person at three in the morning, and what does it fail to offer?"
The shift feels small. It is not small. It is a reorganization of where responsibility lives.
In *The Orange Pill* I wrote about productive addiction, about the builder's inability to close the laptop, about the grey exhaustion the Berkeley researchers documented. I described these as problems of will, of discipline, of the tension between flow and compulsion. I still believe that tension is real. But Gibson forced me to see that the tension is not just inside the person. It is in the design of the space the person inhabits. The conversational interface that never ends. The immediate feedback that never pauses. The polished output that always specifies acceptance as the most perceivable action. These are not features of a neutral tool. They are affordances of a structured environment, and the environment is doing work — shaping perception, shaping behavior — that I had attributed entirely to the organism inside it.
I kept returning to his Air Force research. The pilots who crashed were not perceiving the landing. They were computing it, from instruments, under stress, and the computations broke down faster than the perceptions would have. The pilots who landed safely had developed a perceptual attunement — educated through friction, through hours of flying — that allowed them to pick up the approach directly from the structure of the visual field. The instrument was a supplement. When it became a substitute, the pilot became dependent.
That distinction — supplement versus substitute — is the one I now carry into every design decision, every product conversation, every moment when I watch a member of my team interact with Claude. Is this tool supplementing her perception, giving her another angle on a problem she is actively exploring? Or is it substituting for her perception, delivering an answer that bypasses the exploration that would have educated her capacity to see?
The honest answer is: it does both, often in the same session, and the line between them is nearly invisible from the outside. Gibson's framework does not make the line easier to see. It makes the line urgent to look for.
The detail from this book I cannot let go of is the simplest one: the organism develops the perceptual skills that its environment affords. That sentence restructures everything I care about — how I build products, how I lead a team, how I think about what my children encounter when they pick up a device. If the environment affords depth, the child develops depth. If the environment affords only speed, the child develops speed and nothing else. Not because the child chose speed. Because speed was what the cliff face offered, and the handholds for climbing were never built.
We are the ones who build the handholds. That is what design means, now. Not the optimization of engagement metrics. The deliberate structuring of environments that afford the full range of human perceptual and cognitive capability — depth and speed, exploration and production, struggle and fluency, the quiet attention that grows in the gap between prompts.
I am a builder. I will keep building. But I am building differently now, because I understand, with a clarity I did not possess before this encounter with Gibson's ideas, that every environment I design is an affordance landscape — a structured field of offerings that will shape the perception, the behavior, and the cognitive development of every person who inhabits it.
The cliff edge affords falling-off. It also affords the geologist's perception of four billion years of planetary history, visible in the strata, available to the organism that has learned to look.
I want to build environments that afford the looking.
The AI tools reshaping your work are not neutral instruments waiting for your sovereign decision to use them. They are affordance landscapes — structured environments that specify which actions are easy, which are perceivable, and which have been designed out of existence. James J. Gibson, the psychologist who revolutionized our understanding of how organisms relate to their environments, never saw a chatbot. His framework describes what chatbots do to your mind with more precision than anything written since their arrival.
This book applies Gibson's ecological perception theory to the AI revolution unfolding now. It reveals why builders cannot stop prompting at three in the morning, why polished AI output is a form of perceptual camouflage, and why the question "Is AI good or bad for us?" is malformed at its root. The real question is what the environment offers — and what it has quietly stopped offering.
If the tools we inhabit shape the minds we develop, then designing those tools is the most consequential act of our time. Gibson provides the framework. The design is ours.
-- James J. Gibson, The Ecological Approach to Visual Perception (1979)

A reading-companion catalog of the 21 Orange Pill Wiki entries linked from this book — the people, ideas, works, and events that James J. Gibson — On AI uses as stepping stones for thinking through the AI revolution.