The apprenticeship problem names the structural crisis that emerges when AI tools eliminate exactly the practices through which expertise was historically acquired and transmitted. In the pre-AI workplace, the junior developer learned by debugging under the guidance of a senior, accumulating across years the specific patterns of recognition and judgment that constituted expertise. The senior, in teaching, reinforced her own knowledge and gained the social standing that came from transmission. Both were embedded in a community of practice sustained by the cycle of giving, receiving, and reciprocating. When AI tools handle debugging more efficiently than either junior or senior, the economic rationale for the entire apprenticeship structure collapses. The junior has no incentive to practice. The senior has no incentive to teach. The chain of transmission breaks at both ends simultaneously, and the knowledge that would have been produced by the practice does not exist.
The problem is not hypothetical. Segal's account of the Trivandrum training documents precisely this dynamic: engineers became more individually capable while the collaborative practices that had previously connected specialists — code reviews, architectural discussions, negotiations between backend and frontend — were correspondingly reduced. Each individual was more capable. The connective tissue was thinner.
The structural problem is that the AI systems most in need of human oversight are simultaneously eliminating the experiences through which oversight capability develops. The practitioner who has never debugged without AI assistance cannot effectively supervise AI debugging — she lacks the pattern library that supervision requires. This is Lisanne Bainbridge's ironies of automation applied to cognitive work: automation removes the operator from the loop, then recasts the operator as a supervisor whose skills cannot develop because the supervision task denies the very engagement that would build them.
The Mauss-derived diagnosis differs from the technical diagnosis: not merely that skills atrophy, but that the gift economy of mentoring dissolves. The code review was not only a quality-control mechanism but a social institution through which expertise was transmitted and community was formed. Its disappearance severs transmission and community simultaneously.
The concept draws on Mauss's analysis of gift exchange, Bainbridge's 1983 ironies of automation, Lave and Wenger's situated learning, and the empirical research on expertise and skill acquisition associated with Anders Ericsson, Gary Klein, and others.
Both ends break. The economic rationale for teaching and for learning dissolves simultaneously when the tool makes the practice unnecessary.
Supervisory paradox. AI oversight requires the expertise AI deployment is eliminating.
Gift economy dissolution. The mentoring relationship is not only a transmission mechanism but a social institution.
Atomization of capability. Individuals may become more capable while the network of mutual dependence thins.
Generational amnesia. The next generation cannot miss what it never encountered; the loss becomes invisible.