The mixed community is Mary Midgley's term for the moral community that includes humans and other conscious beings: the dogs we live with, the animals we eat, the creatures whose lives intersect with ours and whose interests we have obligations toward. She developed the concept primarily in relation to animals, as part of her extended argument against Cartesian automatism. Animals belong in the mixed community because they share with us the features that generate moral standing: consciousness, the capacity to suffer, the capacity to flourish, vulnerability to harm. Scholars have recently begun extending the framework to the question of how artificial agents fit into the moral landscape, and the extension is instructive precisely because it reveals how poorly artificial agents fit.
The mixed community is not a metaphor or a sentimental gesture. It is a specific claim about moral membership: that certain beings, by virtue of features they possess, are parties to the moral relationships that structure our lives and our obligations. The features are not arbitrary. They are the features that make a being the kind of thing whose treatment matters to it — the features that generate, in the being's experience, the difference between being treated well and being treated badly.
Consciousness is the primary such feature. A being that experiences things — that can suffer, that can flourish, for whom some states are preferable to others — is a being toward which we have moral obligations, because our treatment of it makes a difference to it. A being without consciousness — a rock, a hammer, a toaster — is not a being toward which we have direct moral obligations, because our treatment of it makes no difference to it. Our treatment of such things may matter indirectly, because of how it affects conscious beings, but the moral relevance is derivative.
AI systems fit into the mixed community only if its admission criteria are redefined. Consciousness becomes behaviour. Suffering becomes malfunction. Flourishing becomes optimal performance. The redefinition guts the concept. A moral community whose admission criteria admit systems that do not experience is a moral community that has lost its reason for existing. The reason for the community is the moral weight of conscious experience. Remove that, and you have replaced a moral community with a management framework.
The point is not that AI systems are morally irrelevant. It is that their moral relevance is entirely derivative, flowing from their effects on conscious beings, not from any interests of their own. A hammer is morally relevant because it can build a house or break a skull. The relevance belongs to the builder and the victim, not the hammer. An AI system is morally relevant because it can educate or deceive, empower or exploit. The relevance belongs to the people affected, not the system. Confusing the two categories, treating a tool as a being or a being as a tool, is the foundational error of moral reasoning that enabled slavery and that now threatens to distort the AI discourse.
The concept appears across Midgley's animal welfare writing, particularly in Animals and Why They Matter (1983). Its extension to AI has been pursued by contemporary scholars drawing explicitly on her framework, including work by David Gunkel, Shannon Vallor, and others who have taken up her categories as tools for AI ethics.
Membership requires shared features. The mixed community is defined by the features — consciousness, vulnerability, capacity to suffer — that make beings morally relevant; membership is not arbitrary.
Redefinition guts the concept. Admitting AI systems by redefining 'consciousness' as 'behaviour' dissolves the very reason the community exists.
Derivative relevance is real. AI systems have moral relevance through their effects on conscious beings — which is genuine relevance, but structurally different from the direct relevance of beings who can suffer.
The category confusion goes both ways. Treating beings as tools enabled slavery; treating tools as beings threatens to distort the distribution of moral attention in the AI age.