Manufacturing turns raw material into a finished product according to a predetermined specification that is external to the material. The factory does not ask the wood what it wants to become. It shapes the wood into the chair the designer specified; the material's own properties — grain, knots, natural inclinations — are obstacles to be overcome.

Accompaniment is something else entirely. The accompanier walks alongside. She has no predetermined destination for the person being accompanied. She observes, supports, occasionally redirects, but does not control. The journey belongs to the person being accompanied. The accompanier's role is to be present — to provide the stability and attention that allow the accompanied person to take risks, make mistakes, and discover, through lived experience, who she is becoming. This distinction maps directly onto the AI-in-education debate and reveals the dominant paradigm's true character: it poses not a technological problem but a moral one.
The adaptive learning platform is a manufacturing system. It has a specification (the grade-level standard, the competency benchmark), raw material (the child, assessed relative to the specification), a process (targeted content delivery to close gaps), and a quality metric (speed and completeness of closure). Everything about this system is oriented toward producing the child it was designed to produce. The child's own inclinations — her curiosities, her resistances, her moments of unprompted interest in something the curriculum does not cover — are noise. A well-designed manufacturing system minimizes them.
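The manufacturing logic described above can be made concrete as a minimal sketch. Every name here (`SPEC`, `gap`, `run_platform`, `content_for`, `update`) is hypothetical and drawn from no real product; the point is the shape of the control loop, in which the specification is fixed before the child arrives and every deviation from it registers only as a gap to be closed.

```python
# Hypothetical sketch of the adaptive-platform control loop described above.
# All names are illustrative; no real platform's API is implied.

SPEC = {"fractions": 0.8, "decimals": 0.8}  # predetermined competency benchmarks

def gap(profile, spec):
    """Deviation from the specification, treated purely as error to minimize."""
    return {skill: max(0.0, target - profile.get(skill, 0.0))
            for skill, target in spec.items()}

def run_platform(profile, content_for, update):
    """Deliver targeted content until every gap is closed.

    Note what the loop has no vocabulary for: the child's unprompted
    interests, resistances, or curiosities. They can appear, if at all,
    only as noise in `profile` -- something the system is built to minimize.
    """
    while any(g > 0 for g in gap(profile, SPEC).values()):
        gaps = gap(profile, SPEC)
        skill = max(gaps, key=gaps.get)   # largest gap first
        item = content_for(skill)         # targeted content delivery
        profile = update(profile, item)   # re-assess against the spec
    return profile                        # "quality" = speed and completeness of closure
```

The sketch makes the essay's point visible in the loop condition itself: the system terminates exactly when the child matches the specification, and at no other moment.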
Korczak's orphanage was designed to maximize them. The children's parliament was inefficient precisely because it allowed children to pursue their own inclinations — to debate topics that interested them, to propose solutions reflecting their own understanding, to arrive at conclusions no adult had predetermined. The inefficiency was the feature, not the flaw. The children were not being manufactured into predetermined citizens. They were being accompanied through the experience of self-governance, and the experience itself — messy, slow, often frustrating — was the education.
The educator who accompanies does something no current AI system can: she withholds. She sees the child struggling and does not intervene. She watches the child making a mistake and does not correct it. She waits — not passively, but because she understands that the child's relationship with the mistake is more developmentally valuable than the correct answer. Withholding requires judgment — real-time assessment of whether the struggle is productive (building capability) or destructive (producing only frustration). It requires sensitivity to the particular child: this child's threshold, history, current emotional state. This judgment is relational. It depends on knowing the child not as a data point but as a person whose temperament and circumstances the educator has observed over weeks and months of shared life.
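The structural claim here can be stated in code. The contrast below is a hypothetical sketch, not any real tutoring system's logic: an automated hint trigger can only condition on what is in the interaction log, while the educator's decision conditions on a variable — whether this struggle, for this child, is productive — that the log does not contain.

```python
# Hypothetical contrast between a rule-based intervention trigger and the
# relational judgment described above. All names are illustrative.

def tutor_policy(errors_in_a_row: int) -> bool:
    """A typical automated trigger: intervene after N consecutive errors.

    The rule sees only the interaction log. It cannot distinguish a
    productive struggle (building capability) from a destructive one
    (producing only frustration).
    """
    return errors_in_a_row >= 2

def educator_judgment(struggle_is_productive: bool) -> bool:
    """The accompanier's call: withhold while the struggle is productive.

    The input is the crux: it is not computable from the log, because it
    rests on weeks of observing this particular child's temperament,
    history, and current emotional state.
    """
    return not struggle_is_productive
```

The asymmetry is in the function signatures, not the bodies: the first takes a log statistic, the second takes a judgment that only shared life with the child can supply.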
Seymour Papert's 1980 Mindstorms anticipated this distinction. Papert argued that computers could be powerful learning tools — not because they could teach children, but because children could use them to build things and, through the building, develop thinking. The computer was a medium for the child's own construction of knowledge, not a delivery system for predetermined content. The AI tools of 2025 are deployed almost exclusively as delivery systems rather than construction media. The medium has been misidentified as the message.
The same distinction underlies Alison Gopnik's gardener-carpenter metaphor. The carpenter has a blueprint and shapes raw material to match it. The gardener creates conditions — soil, light, water, protection from pests — and observes what grows. Korczak was a gardener. His orphanage was a garden. What the children grew into was unpredictable, often surprising, occasionally alarming — but the growth was genuine, because it belonged to the children. The AI-mediated classroom of 2026 is a carpentry shop.
This distinction was inseparable from Korczak's clinical training. As a pediatrician he observed what growth actually looks like — the irreducibly particular trajectories of individual children, the unpredictable timing of developmental achievements, the way interventions intended to accelerate development often disrupted it. He systematized the framework in How to Love a Child (1919), where he articulated the physician's posture — attending to what is there rather than imposing what should be — translated from the body to the psyche.
Specification vs. emergence. Manufacturing requires a predetermined specification; accompaniment allows the destination to emerge from the process.
Withholding as pedagogical act. The educator's decision not to intervene is as consequential as any intervention, and current AI systems are structurally incapable of this restraint.
Relational knowledge. Accompaniment depends on knowing the particular child in ways no dataset can capture.
Medium vs. delivery. Computing tools can be media for construction or systems for delivery; the same technology functions radically differently depending on which frame governs its deployment.