A community of practice is not a team, a department, or a network. It is the specific social form through which practitioners who share a domain, interact regularly, and develop a collective repertoire of stories, techniques, and standards produce knowledge that exceeds what any member possesses alone. Wenger derived the concept from Jean Lave's anthropological fieldwork and his own observations of claims processors at an American insurance company — workers whose actual expertise lived not in the training manual but in the informal community that management did not design and barely recognized. The framework has since become one of the most widely adopted in organizational theory, precisely because it names a social structure that was always doing the heavy lifting of professional learning and that most institutions had failed to see.
There is a parallel reading that begins not with the social forms of learning but with the material conditions that make communities of practice possible — and impossible. The insurance claims processors Wenger studied worked in physical proximity, constrained by the technologies of paper files and mainframe terminals that demanded co-location. Their community emerged not from some innate human tendency toward collective learning but from the specific friction of 1980s office work: the waiting at the printer, the coffee pot conversations while systems processed, the necessity of physical handoffs. These weren't incidental details but the substrate upon which community crystallized.
The AI moment doesn't just challenge communities of practice; it reveals that they were always artifacts of particular technological regimes. The solo builder with GPT-4 isn't experiencing a loss of community but the logical endpoint of thirty years of systematic workplace atomization — open offices that destroyed informal conversation, remote work that eliminated serendipitous encounter, productivity metrics that penalized the very interactions where knowledge transfer occurred. AI simply completes what lean management began: the elimination of organizational slack where communities once formed. What Wenger called 'mutual engagement' was never a choice but a byproduct of inefficiency that organizations tolerated only because they couldn't yet eliminate it. The repertoire that accumulated in those communities was valuable precisely because it was illegible to management — tacit knowledge that couldn't be extracted and codified. Now that AI can approximate that repertoire without the messy human substrate, the economic logic that barely tolerated communities of practice has no reason to preserve them.
The three constitutive elements are precise and each is tested differently by AI. The shared domain is the practice the members care about — not insurance in the abstract but claims processing as a lived daily engagement with specific problems. The community is sustained mutual interaction — the breakfast conversations, the questions asked at adjacent desks, the informal help extended across years. The practice is the accumulated shared repertoire — the stories, shortcuts, implicit standards, and collective memory that constitute knowledge-in-use.
Each element is structurally different from organizational assignment. People who share a floor but never interact are not a community of practice. People who share a domain without developing joint practice are a profession, not a community. The concept identifies a specific social formation that emerges through time and sustained engagement, and that cannot be manufactured by org chart.
The framework's diagnostic power in the AI moment comes from its precision about what gets lost when teams dissolve into solo builders. The imagination-to-artifact ratio may have collapsed, but the communities through which professional judgment was historically formed require time, proximity, and sustained joint work that solo building structurally eliminates. The builder retains whatever repertoire she absorbed from prior community membership; she cannot generate new repertoire alone.
Wenger's later work extended the concept into constellations of practice — networks of interrelated communities whose boundaries become sites of productive friction and cross-domain learning. The extension matters because organizations of any complexity are never single communities but overlapping constellations, and it is the connective tissue between communities that AI mediation most quickly erodes.
The concept emerged from ethnographic work conducted in 1989 at the Institute for Research on Learning in Palo Alto, where Wenger watched claims processors learn through informal interaction rather than through the training programs their employer had designed. The collaboration with anthropologist Jean Lave produced Situated Learning in 1991; Wenger's solo masterwork Communities of Practice: Learning, Meaning, and Identity followed in 1998.
Wenger had come to the question from an unusual direction — his 1987 book had surveyed artificial intelligence tutoring systems, treating knowledge as a commodity to be extracted, encoded, and delivered. His encounter with Lave's apprenticeship research convinced him the entire paradigm was wrong at its foundations. The trajectory matters: the AI of the 1980s that Wenger rejected has returned in a form he could not have anticipated, raising the question his framework exists to answer.
Three constitutive elements. Shared domain, mutual engagement, and collective repertoire — each necessary, none sufficient alone.
Knowledge between people. The most consequential knowledge in any complex domain lives not in individual heads but in the community's shared practice.
Emerges, not designed. Communities of practice arise organically from the conditions of work; they can be cultivated but not specified top-down.
Reproduces through participation. New practitioners become practitioners by moving from periphery toward center within a community, not by receiving transmitted knowledge.
Generates more than members know. The community's knowledge exceeds the sum of individual knowledge because it lives in the interactions themselves — the stories, the corrections, the implicit standards negotiated through use.
Critics have questioned whether the framework romanticizes informal community and underestimates power differentials within communities of practice — who gets to set standards, who gets recognized, who remains permanently peripheral. The AI moment reopens an older debate: whether 'community' is too strong a word for what forms in digital, distributed, AI-mediated work — and whether counting such shallow networks as genuine communities of practice dilutes the concept beyond usefulness.
The question of whether communities of practice remain viable in the AI era depends entirely on which aspect of Wenger's framework we're examining. On the diagnostic value of the three constitutive elements — domain, community, practice — Wenger's precision remains fully intact. The framework accurately identifies what gets lost when solo builders replace collaborative teams: not just knowledge but the social process through which professional judgment forms. Where the contrarian view largely prevails is in explaining why this loss occurs: communities of practice were indeed dependent on specific material conditions that AI systematically eliminates.
The sharpest divergence appears around causality and agency. Wenger frames communities as emergent properties of human work that organizations failed to recognize; the infrastructure reading sees them as byproducts of technological constraint that organizations actively sought to eliminate. The two readings are answering different questions: Wenger asks 'what social forms produce knowledge?' while the contrarian asks 'what material conditions permit those forms to exist?' Both are right within their frame — communities do generate knowledge through mutual engagement, and that engagement does require physical or temporal slack that efficiency pressures destroy.
The synthetic insight may be that communities of practice were always more fragile than Wenger's framework suggested — not because the theory was wrong but because it abstracted away from the material substrate. The AI moment forces us to see that 'mutual engagement' and 'shared repertoire' weren't just social phenomena but achievements that required specific technological and economic arrangements. The framework's enduring value lies not in prescribing how to preserve these communities but in precisely naming what we're losing and why it matters — even if we lack the economic power to prevent that loss.