Intellectual privacy is Skenazy's name for the developmental condition under which children can explore ideas without observation, ask questions without an audience, and hold hypotheses tentatively without being required to defend them to adults. The condition is not merely the absence of surveillance. It is the positive space within which independent thinking forms. A child who knows every question will be overheard learns to ask the questions she thinks adults expect. A child who knows every exploration will be evaluated learns to perform intellectually rather than think. The supervised generation — whose educational environments, social interactions, and digital activity have been continuously monitored by well-intentioned adults — has grown up with almost no intellectual privacy, and the specific developmental cost is visible both in the psychological research and in the contemporary anxiety epidemic.
The concept has precedents in legal and political theory — Daniel Solove's work on privacy, Neil Richards's work on intellectual privacy in a constitutional context — but Skenazy's application is developmental. Intellectual privacy matters for children not because they have adult rights to it but because they have developmental needs for it. Independent thought requires the mental space within which thoughts can form without social reference. That space requires freedom from observation; observation, however benign, inflects what the observed mind can think.
The supervised generation's relationship to intellectual privacy is worth examining directly. These children grew up in environments where adults monitored their social media, reviewed their browsing histories, audited their friendships, supervised their homework, tracked their locations, and evaluated their interests for college-admissions relevance. The monitoring was typically well-intentioned and often effective at its stated purposes. Its effect on intellectual privacy was nevertheless profound. The children adapted, as children always do, by producing the thoughts their monitoring apparatus rewarded and suppressing the thoughts that might generate concern or intervention.
AI, in this context, represented something genuinely novel: a space where the child's thinking was not observed, evaluated, or reported. The child afraid to admit in class that she did not understand photosynthesis could ask Claude. The child curious about a topic the curriculum did not cover could follow that curiosity without the social cost of being seen as off-syllabus. The child working through a half-formed idea could use the AI as a thinking partner that would not report her tentative formulations back to the authorities who controlled her academic trajectory. The attraction of AI for supervised children was not merely the technology. It was the intellectual privacy the technology provided — a privacy their physical lives had been systematically stripped of.
This reframing produces conclusions that unsettle both critics and defenders of child AI use. The children most at risk of unhealthy AI attachment are not those given the most freedom but those given the least — because for them, the AI is not supplementing a rich unsupervised life but providing the only unsupervised intellectual space they have. The prescription is not less AI but more physical-world autonomy: returning to children the thousand small spaces of unsupervised play, exploration, and social interaction whose elimination made AI attachment a rational adaptation rather than a pathological one. Intellectual privacy is not something AI created. It is something AI provided after the supervised generation's childhood had eliminated its physical-world sources.
Skenazy developed the concept across her free-range writing and particularly in her post-2023 analysis of children's AI use. The framework draws on legal scholarship on privacy but applies it specifically to developmental psychology.
Privacy as developmental condition. Independent thought requires space within which thoughts can form without social reference — a condition that cannot be imposed but can be destroyed by observation.
Adaptation to surveillance. Children in continuously monitored environments adapt by producing observable thoughts that satisfy their observers, suppressing the private thinking that development requires.
AI as privacy restoration. For the supervised generation, AI provided intellectual privacy that physical-world childhood no longer did — a space for unobserved curiosity whose appeal is rational rather than pathological.
Prescription is physical-world autonomy. The solution to AI over-attachment is not less AI but more unsupervised physical-world experience, restoring the multiple sources of intellectual privacy the supervised generation lost.
Critics argue that "intellectual privacy" for children gives too little weight to legitimate parental responsibilities for safety and formation. Skenazy's response is that the balance has been severely miscalibrated in the direction of surveillance, and that restoring appropriate intellectual privacy is not an abdication of parental responsibility but a recognition of what developmental responsibility actually requires.