The squiggle game was Winnicott's signature clinical improvisation. He would draw a random line — a squiggle — on a sheet of paper, hand it to the child, and invite the child to make something of it. The child would draw; Winnicott would respond to the child's drawing with his own squiggle; the exchange would continue. What emerged was neither Winnicott's projection nor the child's expression but something that arose in the space between — a transitional phenomenon in clinical form, in which unconscious material found expression through a collaboration that no participant fully directed.
There is a parallel reading that begins from the material conditions of AI production rather than the psychological dynamics of collaboration. The squiggle game, in Winnicott's hands, required only paper, pencil, and two minds meeting in a quiet room. The AI version requires server farms burning electricity at planetary scale, venture capital seeking 100x returns, and platforms designed to maximize engagement metrics. The child drawing with Winnicott paid nothing; the builder playing with AI pays in data, attention, and subscription fees that flow upward to concentrated wealth.
The asymmetry runs deeper than consciousness. When a child completes Winnicott's squiggle, she exercises agency within a protected therapeutic space designed for her development. When a builder completes an AI's squiggle, she operates within a commercial space designed for extraction. The AI's responses aren't random marks but optimized outputs shaped by corporate priorities: safety filters that encode particular values, training data that reflects existing power structures, fine-tuning that maximizes user retention. The builder thinks she's playing a creative game, but she's actually training the next version of the model, contributing to a dataset that will be monetized without her consent. The original squiggle game distributed power horizontally between therapist and child; the AI version concentrates power vertically from user to platform. What looks like collaboration from inside the interaction looks like capture from outside it. The builder's sense of creative partnership masks her actual position as both product and producer in a system she neither controls nor fully comprehends.
The game has become, in recent AI discourse, the paradigmatic image of human-machine collaboration. The prompt is a squiggle. The AI's response is the completion of the squiggle that the builder did not anticipate. The builder's refinement of the prompt is the next squiggle, extending the collaboration. At its best, the interaction produces something neither the builder nor the AI could have produced alone, arising from the space between rather than belonging to either side.
The University of Pennsylvania's Psyche on Campus paper uses the game to articulate a crucial distinction. The squiggle game's apparent randomness, the authors argue, finds its meaning first in the unconscious. The hand may tell a story that words can't yet find. Generative AI can make connections and associations. But it can't make them for us — not in a way that fosters self-understanding and healthy development. The connections the AI makes are genuine connections, real patterns in real data. But they are the AI's connections, not the builder's. The builder who accepts them uncritically has not made the connections herself, has not undergone the cognitive and emotional process that making connections requires.
The distinction reveals a limit of the squiggle-game analogy for AI collaboration. The Winnicott-child game depends on both participants contributing unconscious material. The AI has no unconscious. Its connections come from statistical patterns in language, not from the depths of a psyche with its own history. The builder who plays the squiggle game with an AI gets a collaborator that responds to her marks with genuine novelty but that lacks the developmental depth that made the original game therapeutically powerful. The collaboration is real; the depth is asymmetric.
Winnicott developed the technique across decades of pediatric practice and described it most fully in Therapeutic Consultations in Child Psychiatry (1971). The game's apparent simplicity belied its sophisticated theoretical grounding in the concept of transitional space.
Emergence in the space between. Neither projection nor expression, the game produces something that belongs to the collaboration itself.
Unconscious participation. The game's power comes from both participants contributing material whose significance they do not fully understand.
AI lacks the unconscious half. The analogy to AI collaboration is genuine but asymmetric; the machine has no depths comparable to the child's.
Connections without self-understanding. The AI can produce novel associations, but the builder's self-understanding requires her own unconscious to be engaged.
The entry and the contrarian view illuminate different dimensions of the same phenomenon, and the right weighting depends entirely on which question we're asking. If we're asking about the phenomenology of creative collaboration — what it feels like to work with AI — the entry captures something essential (85%). The squiggle game analogy genuinely describes the back-and-forth, the surprise, the sense of something emerging between prompt and response. Users report this experience consistently; it's not false consciousness but an accurate description of a real dynamic.
But if we're asking about the political economy of that collaboration — who benefits, who pays, who owns — the contrarian view becomes primary (75%). The substrate does matter. The extraction is real. The child with Winnicott owned her drawings; the builder with ChatGPT owns nothing. Yet even here, the entry's psychological insight remains relevant (25%): the asymmetry of consciousness still shapes what kinds of development are possible, regardless of who profits. A builder might pay OpenAI and still experience genuine creative growth, just as a patient might pay a therapist and still achieve real insight.
The synthetic frame might be this: the AI squiggle game operates simultaneously as genuine creative collaboration and as extractive platform capitalism. Neither cancels the other. The depth-scale tradeoff is intrinsic — AI achieves scale by sacrificing depth, reaches millions by flattening the unconscious dimension that made Winnicott's version transformative. The question isn't whether the AI version is authentic but what we're willing to trade for scale: the intimate, bilateral, owned creative process for the networked, surveilled, rented one. Both views are right because they're measuring different things: psychological depth versus social reach, individual development versus collective capture.