Attentional ecology, in the sense developed in The Orange Pill and extended by Noë's framework, is the empirical and normative study of how AI-mediated environments affect the cognitive capacities of the organisms inhabiting them. The concept borrows the ecologist's stance: one does not attempt to eliminate an invasive species wholesale but studies the system, identifies leverage points, and intervenes precisely where intervention will have the greatest impact with the fewest unintended consequences. Applied to AI, attentional ecology treats the chatbot interface, the recommendation engine, and the algorithmic feed as environmental features whose effects on human attention, judgment, and embodied engagement must be actively monitored and shaped.
The concept reframes the question of AI governance. The dominant framings — safety, alignment, regulation — operate at the level of the technology itself: what should be built, what should be disclosed, what should be restricted. Attentional ecology operates at the level of what the technology does to the minds and practices of its users. This is the demand-side question, the one that supply-side governance largely ignores.
The ecological stance is methodologically important. Ecologists do not control nature; the pretense of control is what created most ecological problems in the first place. The successful ecologist studies leverage points — places where a small intervention cascades through the system. Applied to AI, this means not the wholesale rejection or embrace of the technology but the identification of specific practices, institutional norms, and design choices that preserve the cognitive ecology on which human flourishing depends.
Noë's enactive framework supplies the theoretical foundation. If cognition is constitutively embodied, then cognitive ecology is constitutively embodied ecology. The practices that need to be preserved are not abstract cognitive disciplines but concrete bodily engagements — handwriting, physical experimentation, face-to-face interaction, the manipulation of resistant materials, the experience of confusion that is not immediately resolved. These are not lifestyle preferences. They are ecological necessities for the cognitive species humans have become.
Practical applications include educational practices that preserve the body's participation in learning, organizational norms that protect time for friction-rich engagement, and technological design that maintains user awareness of the tool's operation rather than allowing it to recede into invisible mediation. The strange tools concept provides a specific mechanism: practices that make the organizing function of AI visible preserve the capacity for critical reflection that ecosystems of habitual use otherwise erode.
The concept is developed in Edo Segal's The Orange Pill (2026) and extended through Noë's enactive framework in the present volume. It draws on the tradition of media ecology (McLuhan, Postman) and on ecological approaches to cognition (Gibson, enactivism).
Demand-side governance. Focus shifts from regulating what AI companies build to protecting what citizens need to navigate the resulting environment.
The ecological stance. Study the system, identify leverage points, intervene precisely — not wholesale control.
Embodied ecology. The cognitive practices to be preserved are concretely bodily, not abstract cognitive disciplines.
Leverage through strange tools. Practices that make AI's organizing function visible preserve critical reflection.
Active maintenance. Cognitive ecology is not self-sustaining; it requires ongoing tending, like the beaver's dam.