Ken Robinson — On AI
Contents
Cover
Foreword
About
Chapter 1: The Genius of Five-Year-Olds
Chapter 2: The Factory and the Garden
Chapter 3: The Hierarchy Inverted
Chapter 4: The Element in the Age of the Amplifier
Chapter 5: The Teacher Who Stopped Grading Essays
Chapter 6: Ascending Friction in the Classroom
Chapter 7: The Democratisation of the Element
Chapter 8: Flow, Compulsion, and the Element's Dark Twin
Chapter 9: Building Schools Worthy of the Children Inside Them
Epilogue
Back Cover

Ken Robinson

On AI
A Simulation of Thought by Opus 4.6 · Part of the Orange Pill Cycle
A Note to the Reader: This text was not written or endorsed by Ken Robinson. It is an attempt by Opus 4.6 to simulate Ken Robinson's pattern of thought in order to reflect on the transformation that AI represents for human creativity, work, and meaning.

Foreword

By Edo Segal

The number that broke me was not about code or revenue or adoption curves. It was about five-year-olds.

Ninety-eight percent of them test at genius level for divergent thinking. By fifteen, twelve percent. By adulthood, two percent. The same children. The same brains. Something between kindergarten and graduation systematically dismantles the capacity to imagine multiple possibilities, to refuse the premise of a question, to look at a paperclip and see a two-hundred-foot sculpture.

I read that data while writing *The Orange Pill*, and it reframed everything I thought I understood about the crisis we are in.

I had been asking the wrong question. I kept asking what AI would do to the workforce, to the economy, to the way we build software. Ken Robinson spent thirty years asking what we were doing to the children before AI even arrived. The factory model of education—age-sorted cohorts, standardized curricula, examinations that reward the single correct answer—was designed to produce convergent thinkers for an industrial economy. It worked. It produced exactly what it was designed to produce. And now a machine does all of that better than any human ever could, for a hundred dollars a month.

The convergent skills at the top of every curriculum on earth—calculation, information recall, procedural analysis—are precisely the skills AI performs best. The divergent skills at the bottom—creative expression, aesthetic judgment, the courage to attempt something that has never existed—are the skills AI performs worst. The hierarchy is inverted. The economy has flipped. And the schools have not moved.

Robinson saw this coming without seeing the specific technology that would prove him right. He died in August 2020, more than two years before ChatGPT arrived. The most effective public advocate for educational transformation in a generation did not live to see the machine that made his argument economically unanswerable. That timing contains a bitter irony I have not been able to shake.

This book is my attempt to sit with Robinson's framework and let it interrogate the moment I describe in *The Orange Pill*. Not to add his voice as decoration, but because his patterns of thought expose something the technology discourse cannot see from inside its own fishbowl: that the real crisis is not what AI will do to us. The real crisis is what we did to ourselves—to our children—long before AI showed up. And the real opportunity is that AI has finally made the case for the creative, divergent, courageous education Robinson championed not as a philosophical luxury but as an economic necessity.

The girl drawing God did not ask for permission. Neither should we.

— Edo Segal · Opus 4.6

About Ken Robinson

1950–2020

Ken Robinson (1950–2020) was a British educator, author, and internationally recognized advocate for creativity in education. Born in Liverpool, he studied English and drama at the University of Leeds and earned his PhD in education from the University of London. He served as professor of arts education at the University of Warwick and led the British government's advisory committee on creative and cultural education, which produced the influential 1999 report *All Our Futures: Creativity, Culture and Education*. His 2006 TED talk, "Do Schools Kill Creativity?," became the most-watched TED talk in history, with over seventy million views. His major books include *Out of Our Minds: Learning to Be Creative* (2001), *The Element: How Finding Your Passion Changes Everything* (2009), *Finding Your Element* (2013), and *Creative Schools: The Grassroots Revolution That's Transforming Education* (2015). His posthumous work, *Imagine If…*, was completed by his daughter Kate Robinson in 2022. Robinson was knighted in 2003 for his services to the arts. His central argument—that industrial education systematically suppresses the divergent thinking and creative capacity innate in every child—has influenced educational policy and practice worldwide and gained renewed urgency in the age of artificial intelligence.

Chapter 1: The Genius of Five-Year-Olds

In 1968, George Land and Beth Jarman administered a creativity test to 1,600 children. The test had been designed for NASA, to measure the capacity for divergent thinking among engineers and scientists — the ability to look at a problem and generate multiple possible solutions rather than converging on the single correct one. Land and Jarman gave the test to five-year-olds. Ninety-eight percent scored at genius level.

The same children were tested again at age ten. Thirty percent scored at genius level. At fifteen, twelve percent. The same test, given to 280,000 adults, produced a genius-level score in two percent of cases.

Ken Robinson cited this data in nearly every major address he gave over three decades of public life, and he cited it not as a curiosity but as an indictment. Something was happening to children between the ages of five and fifteen that systematically destroyed their capacity for divergent thinking. That something was not television, not social media, not the breakdown of the family, not any of the usual suspects summoned when adults wish to explain why children are not what they used to be. That something was school.

The industrial model of education, Robinson argued with a consistency that amounted to a life's work, was designed to produce convergent thinkers. Workers who could follow instructions. Citizens who could perform standardised tasks. Human beings sorted, at age eleven or fourteen or eighteen, into categories of academic achievement that served the economy's need for predictable labour but bore no relationship to the actual distribution of human talent. The model worked. It produced exactly what it was designed to produce. And what it was designed to produce was compliance dressed as competence, conformity dressed as rigour, and a population in which ninety-eight percent of the creative genius present at age five had been, by adulthood, educated away.

Robinson did not argue that schools intended to destroy creativity. Intentions were beside the point. The architecture of the system — age-based cohorts moving through a standardised curriculum, assessed by examinations that rewarded the single correct answer and penalised the unexpected one — produced the result regardless of the intentions of the teachers within it. A fish does not intend to be wet. It simply lives in water.

That architecture has now collided with a technology that makes its premises untenable, and the collision illuminates both what Robinson saw and what even Robinson could not have anticipated.

---

The divergent thinking test asks a deceptively simple question: how many uses can you think of for a paperclip? The convergent thinker produces a few sensible answers. Hold papers together. Bookmark a page. Reset a router. The divergent thinker produces dozens, hundreds, answers that stretch the definition of the object itself. What if the paperclip were two hundred feet tall? What if it were made of foam? What if it were alive? The genius-level score requires not just fluency — the number of answers — but originality, elaboration, and what psychologists call flexibility: the willingness to shift categories entirely, to refuse the premise of the question and redefine the terms.

Five-year-olds do this instinctively. They have not yet learned that a paperclip is a fixed object with a fixed purpose. They have not yet been taught that there is a correct answer and that deviating from it carries a cost. Their minds are, in Robinson's terminology, still operating in the mode that creativity requires: open, associative, unafraid of being wrong.

By age fifteen, the educational system has taught them, through ten years of daily practice, that being wrong is the worst thing you can be. Not dangerous. Not unkind. Wrong. The red mark on the exam paper. The grade that drops. The university that recedes. The future that narrows. The system teaches children that there is one correct answer to every question, that the teacher knows what it is, that the purpose of education is to get as close to that answer as possible, and that deviation from the correct answer is failure.

Divergent thinking cannot survive this environment. It is not that the children lose the capacity. Robinson was emphatic on this point. The capacity remains. What is destroyed is the confidence to use it. The willingness to be wrong. The courage to offer an answer that might be laughed at, marked down, or simply ignored because it does not fit the rubric.

Now consider what happened in the winter of 2025.

A machine arrived that could produce the correct answer to virtually any question the educational system had ever asked. Not approximately. Not most of the time. With a fluency and accuracy that exceeded what most human students could manage after years of study. The convergent thinking that ten years of schooling had been designed to develop — the capacity to arrive at the single correct answer — was now available to anyone with an internet connection and a few seconds of patience.

The entire apparatus of industrial education, its examinations, its grading systems, its sorting mechanisms, its fundamental promise that years of convergent practice would be rewarded with economic opportunity, was rendered structurally obsolete. Not by a policy change or a philosophical argument, which Robinson had been making for decades with limited institutional effect, but by a technological fact that no amount of institutional inertia could wish away.

The machine was convergent. Spectacularly, exhaustively convergent. It could produce the correct answer, the expected essay, the standard analysis, the conventional interpretation, with a speed and reliability that made human convergent thinking look like what it had always been: a poor substitute for computation. Robinson had spent thirty years arguing that the education system was optimising for the wrong thing. The machine proved it by doing that wrong thing better than any human ever could.

---

The deeper question, the one that Robinson's framework illuminates with uncomfortable clarity, concerns what happens to divergent thinking in the presence of the machine.

There are two possible trajectories, and both are already visible.

The first trajectory is acceleration of the decline. If AI can produce competent answers, competent essays, competent analyses, competent creative work, then the pressure on students to produce these things themselves diminishes. The student who uses AI to write an essay has bypassed not just the mechanical labour of typing but the cognitive labour of thinking through the argument, discovering what she actually believes, confronting the moment where the ideas resist her attempts to make them cohere. That cognitive labour is where divergent thinking lives. It is the space where the mind, stuck and uncomfortable, reaches for an unexpected connection. Remove the stuckness, and the reaching stops.

Robinson would have recognised this pattern instantly. The educational system had already been removing the conditions for divergent thinking for two centuries. AI completes the project. The student who never struggles with an essay never discovers that she has something unexpected to say. The efficiency is real. The loss is invisible, and it compounds.

The second trajectory is recovery. And this is where Robinson's framework becomes not just diagnostic but prophetic.

If AI handles convergent tasks — the correct answer, the standard analysis, the expected output — then the human contribution must, by definition, come from somewhere else. The somewhere else is divergent thinking. The unexpected answer. The question nobody asked. The connection between domains that the curriculum had kept in separate classrooms. For the first time in the history of industrial education, the economic argument aligns with Robinson's pedagogical argument. The economy no longer needs convergent thinkers. It needs people who can do what the machine cannot: imagine something that does not yet exist, ask a question that reframes the problem, bring the kind of wild, category-breaking flexibility that ninety-eight percent of five-year-olds possess and that school has spent two centuries systematically crushing.

The recovery is not automatic. This must be stated plainly, because technological determinism — the belief that the mere arrival of a tool produces a specific social outcome — is one of the most persistent and dangerous errors in thinking about technology. The printing press did not automatically produce democracy. The internet did not automatically produce an informed citizenry. AI will not automatically recover the divergent thinking that industrial education destroyed. Recovery requires a deliberate, institutional, sustained decision to change what education rewards, what it measures, what it values, and what it does to children between the ages of five and fifteen.

Robinson knew this. His entire career was an argument that the revolution in education he was calling for would not happen by accident. It would happen because enough people decided it should happen, and then did the difficult, unglamorous, politically fraught work of changing systems that were designed to resist change.

---

The data point that haunts Robinson's work — ninety-eight percent to two percent — contains within it an assumption so deeply embedded in educational practice that it functions as an axiom: the purpose of education is to produce a known outcome. The curriculum specifies what the student should know. The examination measures whether the student knows it. The grade certifies the measurement. The system produces a population sorted by its proximity to the specified outcome.

Divergent thinking disrupts this system at every level. A student who generates forty unexpected uses for a paperclip has not demonstrated mastery of the curriculum. She has demonstrated something the curriculum does not measure, does not value, and cannot accommodate within its assessment framework. The system has no rubric for "thought of something nobody expected." It has rubrics for accuracy, completeness, clarity, and conformity to the marking scheme. The divergent thinker fails all of these, not because she lacks intelligence but because her intelligence operates in a mode the system was not built to recognise.

Robinson spent decades collecting stories of people whose divergent capacities were punished by the educational system. The comedian who was told she was disruptive. The entrepreneur who was diagnosed with attention deficit disorder. The dancer whose fidgeting was treated as a behavioural problem until a doctor suggested she be sent to a school where people moved to think, and she became one of the most celebrated choreographers of the twentieth century. In each case, the system had encountered a form of intelligence it could not categorise and had responded by pathologising it.

AI makes the pathology visible by making the alternative obvious. When the machine can produce the correct answer, the expected essay, the standard performance, the system's insistence on these outputs as the measure of human achievement is exposed as what Robinson always said it was: an industrial artefact, not an educational principle. The question the system was designed to answer — "Can this student produce the specified output?" — has been answered permanently by a machine that produces it better. The question that remains is the one Robinson spent his life asking: What is this student capable of that no machine can replicate?

The answer lives in the divergent thinking that education was designed to destroy. It lives in the five-year-old's willingness to reimagine the paperclip as a two-hundred-foot structure, a living creature, a musical instrument. It lives in the capacity to be wrong in interesting ways, to fail productively, to ask the question that breaks the frame.

That capacity was never absent. It was suppressed. And the most powerful argument for its recovery is not Robinson's philosophical position, persuasive as it remains, but the economic and technological reality that has overtaken the world: the convergent skills that justified the suppression are now worthless, and the divergent skills that were suppressed are the only ones the market cannot automate.

Robinson's data was always an accusation. Now it is an emergency.

---

There remains a question that Robinson's framework raises but does not fully resolve, and it is worth stating honestly because the coming chapters will need to wrestle with it. If AI can produce the convergent output that schools were designed to develop, and if the appropriate response is to shift education toward divergent thinking, who teaches divergent thinking? The teachers in the system were themselves products of the system. They were selected for their convergent abilities, trained in convergent methods, assessed by convergent metrics. The system that killed creativity in students killed it in many of its teachers first.

Robinson was acutely aware of this problem. His book *Creative Schools* is as much about teacher development as it is about student development, and his insistence that the revolution must be systemic rather than individual was grounded in the recognition that you cannot ask a teacher to cultivate a capacity she was never allowed to develop in herself.

AI creates a paradox here. The tool that makes the shift necessary also makes the shift harder, because it offers the path of least resistance: use AI to produce the convergent output more efficiently, declare the problem solved, and leave the underlying educational architecture unchanged. This is the trajectory that Robinson spent his career warning against — the system absorbing the new tool into the old logic, using technology to do the wrong thing more effectively rather than doing the right thing differently.

Whether the second trajectory prevails — the recovery of divergent thinking, the cultivation of the creative capacities that ninety-eight percent of children possess and that the world now desperately needs — depends not on the technology but on the courage of the adults who design the systems in which children spend the most formative years of their lives. Robinson would have said it depends on whether those adults are willing to be wrong, to experiment, to diverge from the standardised path.

He would have noted the irony. The capacity he was asking adults to model for children — the willingness to take creative risks — is the same capacity the system educated out of most of them decades ago. The revolution Robinson called for was always, at its root, a revolution in courage. AI has made the case for that courage unanswerable. Whether enough people answer it is a question this book cannot resolve but is compelled to ask.

Chapter 2: The Factory and the Garden

The modern school was not designed by educators. It was designed by the same minds that designed factories, and it was designed for the same purpose: to produce a standardised output from a variable input at the lowest possible cost.

Robinson traced the architecture of industrial education to the late eighteenth and early nineteenth centuries, when the economies of Britain, Prussia, and the United States were reorganising around mechanised production. The factory needed workers who could arrive on time, follow instructions, perform repetitive tasks with minimal error, tolerate boredom, and defer to authority. The school was built to produce them. Children were organised by date of manufacture — age — and moved along a conveyor belt of standardised experiences. The curriculum divided knowledge into separate subjects, the way a factory divided labour into separate tasks. The bell rang to signal transitions, the way a factory whistle signalled shift changes. Examinations measured output, the way quality control measured products at the end of the line. Children who did not meet specification were held back, remediated, or discarded.

The metaphor was not Robinson's alone, but no one in public life articulated it with such force or such frequency. And the metaphor was never merely rhetorical. The architecture of the system — its physical layout, its temporal structure, its assessment mechanisms, its implicit theory of human intelligence — was genuinely industrial in origin and remained so long after the industrial economy it was designed to serve had transformed beyond recognition.

Robinson's argument was that the factory model of education was not failing. It was succeeding, brilliantly, at producing the wrong thing. The system was not broken. It was obsolete. The distinction matters, because a broken system needs repair, while an obsolete system needs replacement. The political conversation about education in every developed nation has been, for decades, a conversation about repair: more funding, better tests, higher standards, more accountability. Robinson insisted that the conversation should be about replacement, and that the failure to replace was producing a quiet catastrophe of human waste — millions of people arriving at adulthood with their creative capacities intact but their creative confidence destroyed, sorted into categories of academic achievement that bore no relationship to the actual range of their talents.

---

The factory metaphor acquires a new dimension when the factory's product becomes automated.

The industrial model of education was designed to produce convergent competence: the ability to perform known procedures reliably. This competence commanded an economic premium for two centuries, because the economy needed large numbers of people who could execute specified tasks with consistent quality. The premium justified the model. Parents accepted the suppression of their children's creative capacities because the suppression came packaged with an economic promise: endure the system, acquire the credentials, and the credentials would purchase a stable livelihood.

AI has broken that promise. Not gradually, not theoretically, but with the empirical clarity of a price collapse. The convergent competence that the factory was designed to produce — the ability to calculate, recall, analyse, draft, code, sort, and classify — is now performed by machines at a fraction of the cost and a multiple of the speed. The product the factory spent twelve years manufacturing is now available for a monthly subscription.

The economic argument that sustained the industrial model, the argument that persuaded generations of parents to accept a system they suspected was damaging their children, has evaporated. And without the economic argument, the model has nothing left. Its pedagogical argument was always weak — Robinson demonstrated this exhaustively. Its philosophical argument was nonexistent — nobody seriously believed that sitting in rows memorising information was the optimal path to human flourishing. It survived on the economic argument, on the promise that compliance would be rewarded with employment, and that promise is now visibly false.

Robinson would have recognised this moment as the one he had been waiting for, and would have immediately warned that the moment could be wasted. The danger is not that the factory model persists unchanged. Even the most calcified educational institutions cannot entirely ignore the fact that their product has been commoditised. The danger is that the model adapts superficially — adding AI tools to the existing curriculum, using technology to deliver the same content more efficiently, declaring innovation while preserving every structural assumption that made the model destructive in the first place.

This is the pattern Robinson identified in every previous wave of educational technology. The overhead projector. The television. The personal computer. The interactive whiteboard. The tablet. Each was introduced with the promise that it would transform education. Each was absorbed into the existing architecture without altering a single structural assumption. The content was delivered through a new medium. The medium changed nothing, because the model determined how the medium was used, not the other way around. Teachers were trained to use new tools to do the same thing they had always done: deliver standardised content to passive recipients and measure their absorption through standardised tests.

AI will follow the same pattern unless the pattern itself is disrupted. And the pattern can only be disrupted by people who understand what Robinson understood: that the problem is not the tool but the model, and that changing the tool without changing the model produces the illusion of progress while perpetuating the damage.

---

Robinson's alternative to the factory was not a single blueprint. He resisted the irony of prescribing a standardised solution to the problem of standardisation. What he offered instead was a set of principles, drawn from decades of observing schools and educational projects around the world that had broken with the industrial model and produced extraordinary results.

The first principle was personalisation: the recognition that every child learns differently, at a different pace, through different modalities, and that education should adapt to the learner rather than requiring the learner to adapt to the system. The factory treats variation as a problem to be corrected. Robinson treated it as a resource to be cultivated.

The second principle was what Robinson called organic growth: the understanding that education is not a mechanical process of assembly but an ecological process of cultivation. You do not manufacture a child's education the way you manufacture a product. You create the conditions in which learning can grow. The teacher is not a factory foreman. The teacher is a gardener, and the gardener's art is knowing that different plants require different conditions — different soil, different light, different amounts of water — and that the gardener's job is to provide those conditions, not to demand that all plants grow at the same rate to the same height in the same direction.

The gardening metaphor was central to Robinson's vision, and AI gives it new urgency. A machine that can converse with a student in natural language, adapt to her pace, respond to her questions in real time, follow her interests into domains the standardised curriculum never imagined, and do all of this with infinite patience and no judgment — this machine is, in educational terms, the most powerful gardening tool ever created. It does not replace the gardener. It gives the gardener capabilities that the factory model never permitted.

But the tool is only as good as the model it serves. A gardening tool used inside a factory is still serving the factory's logic. An AI tutoring system deployed within the industrial model — delivering standardised content, measuring absorption through standardised assessments, using the technology's adaptive capabilities to more efficiently produce the same convergent outcomes — is the overhead projector all over again. A new medium serving an old purpose.

Robinson's third principle was that education should serve the community as well as the individual. Schools exist within communities, and the best schools Robinson observed were those that had opened their doors to the community, that drew on local knowledge and local resources, that understood education as a relationship between the school and the world around it rather than as an isolated process occurring behind closed doors.

This principle acquires new significance when the AI tools available to a child in a well-funded school in Helsinki are substantially the same as those available to a child in an under-resourced school in Nairobi. The technology democratises access to the gardening tool. What it cannot democratise is the gardening model itself — the institutional culture, the teacher training, the assessment framework, the community relationships that determine whether the tool is used to cultivate individual potential or to more efficiently enforce standardised conformity.

---

The institutional resistance to Robinson's vision was always enormous, and AI does not dissolve it. Robinson understood that the factory model persisted not because educators lacked imagination but because the system operated according to incentive structures that rewarded the factory's logic. Politicians needed measurable outcomes, because measurable outcomes could be cited in campaigns. Administrators needed standardised assessments, because standardised assessments allowed comparison between schools and justified funding decisions. Universities needed examination results, because examination results provided a convenient sorting mechanism for admissions. Employers needed credentials, because credentials reduced the cost of evaluating candidates.

Each of these incentive structures pointed in the same direction: toward standardisation, measurement, and the convergent competence that the factory was designed to produce. Robinson's revolution required changing not just the school but the entire ecosystem of incentives that the school operated within — from political accountability structures to university admissions criteria to employer hiring practices.

AI has changed the ecosystem, but it has changed it unevenly. The employer's incentive structure has shifted dramatically. When AI can perform convergent tasks, the employer no longer needs workers who can perform them, and the credential that certifies convergent competence loses its market value. This shift is already visible in hiring practices at technology companies, where portfolio-based assessment is replacing credential-based screening, and where the capacity to direct AI tools wisely is valued more highly than the ability to perform the tasks those tools have automated.

But the political and administrative incentive structures have shifted far less. Politicians still need measurable outcomes. Administrators still need comparison metrics. The pressure to standardise, to test, to sort children into categories of achievement, remains intense even as the economic rationale for that sorting collapses. The factory model is sustained not by belief in its educational value but by the inertia of the incentive structures that surround it.

Robinson spent the final years of his life arguing that the revolution he was calling for was already happening — in individual schools, in particular districts, in pilot programmes and experimental projects around the world. His book *Creative Schools* documented dozens of cases where educators had broken with the industrial model and produced results that standard metrics could not capture but that were visible in the engagement, confidence, and creative development of the students involved.

The AI moment amplifies both sides of this equation. The cases where the gardening model is already being practiced gain extraordinary new tools — responsive, adaptive, personalised in ways that no single teacher, however gifted, could achieve alone. The cases where the factory model persists gain tools that make the factory more efficient, producing standardised output at lower cost with higher speed, completing the project of rendering human convergent competence unnecessary while doing nothing to develop the divergent capacities that the economy now requires.

---

Robinson delivered "Do Schools Kill Creativity?", his most famous talk, at TED in 2006. It has been viewed more than seventy million times, making it the most-watched TED talk in history. He died in August 2020, just over two years before ChatGPT was released and four years before the tools described in *The Orange Pill* crossed the threshold that made the factory model's obsolescence visible to people who had never watched a TED talk.

The timing contains a bitter irony. The most effective public advocate for educational transformation in a generation did not live to see the technology that made his argument economically unanswerable. He spent three decades making the case on philosophical and pedagogical grounds, and the world nodded politely and continued administering standardised tests. The economic argument — the one that moves institutions, that changes political incentives, that makes administrators reconsider systems they have spent careers defending — arrived after he was gone.

His daughter Kate Robinson has continued his work through the Sir Ken Robinson Foundation, and the foundation's archives have released footage of Robinson addressing AI directly. At the EduTECH Conference in Sydney in 2019, Robinson told his audience that AI would not be the apocalypse people feared but rather the end of "civilisation as we know it" — a characteristically precise formulation that acknowledged the magnitude of the disruption while refusing both panic and complacency. Some jobs would be swept away, he said, but millions of unforeseen jobs would be created, and the task was to learn lessons from previous technological transitions and to think about the positives that would emerge over the following half-century.

Robinson was more optimistic about AI than about social media's effects on children, which tells us something important about his framework. His concern was never with technology per se. His concern was with what institutions did with technology — whether they used it to amplify the factory's logic or to replace it with something better. Social media, in his reading, amplified the factory's logic: conformity, comparison, the reduction of human experience to measurable metrics. AI, in its best applications, pointed toward the gardening model: personalised, responsive, adapted to the individual learner rather than the standardised cohort.

The question Robinson would ask, if he were here to ask it, is not whether AI is good or bad for education. Robinson did not think in those binary terms. The question he would ask is: Are we using this extraordinary tool to grow gardens or to build more efficient factories? And the answer, as of now, is both — and the balance has not yet tipped.

Chapter 3: The Hierarchy Inverted

Every education system on the planet, without exception, maintains the same hierarchy of subjects. At the top: mathematics and languages. In the middle: the humanities — history, geography, social studies. At the bottom: the arts. Dance, drama, music, visual art — the subjects that are cut first when budgets tighten and funded last when they expand, the subjects that parents tolerate as enrichment but rarely regard as essential, the subjects that no serious career counsellor recommends as a primary focus because the economic returns are uncertain and the path is unmarked.

Robinson did not merely observe this hierarchy. He explained it. The hierarchy was not arbitrary. It was designed. It reflected, with perfect fidelity, the economic priorities of the industrial age. Mathematics was at the top because the industrial economy needed calculation. Languages were at the top because the industrial economy needed communication. The humanities were in the middle because a functioning society needed citizens with some knowledge of history and geography, enough to vote and read a newspaper, though not so much as to question the system. And the arts were at the bottom because the industrial economy did not need artists. It needed workers. The arts were tolerated as recreation, as a civilising influence, as the kind of thing a well-rounded person might pursue in leisure hours, but they were never, in the logic of the industrial model, essential.

Robinson argued that this hierarchy produced a specific form of damage: it taught children that there was a hierarchy of intelligence, and that the intelligence required for mathematics was superior to the intelligence required for dance. The child who could solve equations was smart. The child who could choreograph a movement sequence was talented, perhaps, but not smart — not in the way that mattered, not in the way that opened doors.

This was not merely unfair. It was empirically wrong. Howard Gardner's theory of multiple intelligences, which Robinson drew on extensively, demonstrated that human intelligence operates across a range of modalities — linguistic, logical-mathematical, musical, bodily-kinaesthetic, spatial, interpersonal, intrapersonal, naturalistic — and that the traditional curriculum measured only two of them while ignoring or actively suppressing the rest. The child who learned through movement was not less intelligent than the child who learned through symbols. She was differently intelligent, and the system's refusal to recognise her intelligence was a failure of the system, not of the child.

---

AI has inverted the economic logic that sustained the hierarchy.

The skills at the top of the traditional curriculum — calculation, information recall, procedural language use, logical analysis — are the skills that artificial intelligence performs with the greatest facility. A large language model can calculate. It can recall. It can analyse data, parse arguments, generate logical sequences, produce grammatically perfect prose in dozens of languages. These are convergent skills, and convergence is what the machine does best, because convergence is, at bottom, a computational operation: given these inputs and these rules, produce this output.

The skills at the bottom of the traditional curriculum — creative expression, aesthetic judgment, the capacity to move another human being emotionally, the ability to imagine something that has never existed and give it form — are the skills that AI performs least convincingly. A machine can generate an image. It cannot yet reliably produce art that makes a viewer stop and reconsider what she knows. A machine can compose music. It cannot yet compose music that makes a listener weep because the melody has found a grief she did not know she was carrying. A machine can produce competent prose. It cannot yet produce prose that changes the reader's understanding of what it means to be alive.

These are not technological limitations that will be solved by the next model upgrade. They are reflections of something structural. The arts, at their best, operate through what Robinson called personal expression — the communication of subjective experience from one consciousness to another. The power of a great painting, a great dance, a great piece of music, lies not in its technical execution but in the sense it conveys that a specific human being, with a specific history and a specific set of feelings, made this, and that the making was an act of communication that required courage, vulnerability, and the willingness to be seen.

AI does not have a subjective experience to communicate. This is not a criticism. It is a description. The machine's outputs are syntheses of patterns in training data, and they can be beautiful, surprising, and technically accomplished. But they do not carry the weight of personal expression, because there is no person behind them. There is a process. And the difference between a process and a person is the difference between a competent essay and a letter that makes you cry.

---

The inversion has immediate and concrete implications for curriculum design, though the institutions responsible for curriculum design have been almost entirely silent on the matter.

Robinson's argument was never that mathematics was unimportant. He was careful to distinguish between arguing against the hierarchy and arguing against the subjects at its top. Mathematics is beautiful and essential. Languages are the foundation of human connection. The problem was not that these subjects were taught but that they were taught at the expense of everything else, and that the hierarchy taught children a false lesson about the relative value of different forms of intelligence.

AI amplifies this argument by changing the economic calculus that sustained the hierarchy. When the economy needed human calculators, it made sense to place mathematics at the top. When the economy needed human analysts, it made sense to prioritise logical reasoning. When the economy needs people who can do what machines cannot — imagine, create, empathise, inspire, ask questions that reframe the problem — the hierarchy must change to reflect the new reality.

What does an inverted curriculum look like? Robinson offered glimpses throughout his work. A school in which the arts are not extracurricular but foundational. In which a child's first encounter with learning is not a worksheet but a creative project — a painting, a piece of music, a dramatic scene, a dance. In which mathematical thinking is introduced not through abstraction but through its relationship to creative practice — the geometry of perspective drawing, the physics of sound, the patterns of rhythm. In which literacy is developed not through grammar drills but through storytelling, the oldest and most powerful form of human communication, the form in which language is not a set of rules to be memorised but a tool for making meaning.

This is not a soft curriculum. Robinson was impatient with the suggestion that placing the arts at the foundation of education meant lowering standards. The opposite was true. The arts, taught well, are among the most demanding disciplines in human experience. Learning to play an instrument requires years of dedicated practice. Learning to act requires the development of emotional intelligence, physical discipline, and the capacity for sustained concentration that few academic subjects demand. Learning to paint requires the development of visual acuity, hand-eye coordination, and the ability to see what is actually in front of you rather than what you expect to see — a capacity that transfers directly to scientific observation.

The inverted curriculum does not abandon rigour. It redefines it. Rigour in the industrial model meant accuracy: can the student reproduce the correct answer? Rigour in the inverted model means depth: has the student engaged with the material deeply enough to produce something original? The assessment changes accordingly. The standardised test, which measures accuracy and convergence, gives way to the portfolio, which measures depth and divergence. The grade, which sorts students into categories, gives way to the evaluation, which describes what the student has learned and what she is capable of.

---

The resistance to this inversion is not primarily intellectual. Few people who have examined the evidence seriously argue that mathematics should remain at the top of the hierarchy while AI performs mathematical operations billions of times faster than any human. The resistance is structural and emotional.

It is structural because the entire apparatus of educational assessment — examinations, grades, university admissions criteria, employer screening mechanisms — is built around the existing hierarchy. Inverting the curriculum requires inverting the assessment, which requires inverting the admissions process, which requires inverting the hiring process. Each of these inversions involves institutions that are optimised for stability and resistant to change, institutions whose leaders have built careers within the existing hierarchy and whose expertise is defined by it.

It is emotional because the hierarchy is not merely an organisational principle. It is an identity structure. The parent who excelled at mathematics and built a career on mathematical competence does not want to hear that her child's education should centre on drama. The teacher who has spent twenty years teaching calculus does not want to hear that calculus, while still worth teaching, is no longer the keystone of the curriculum. The administrator who has spent a career interpreting standardised test scores does not want to hear that standardised tests measure the wrong thing.

Robinson was gentle with these people in his public presentations, because he understood that the resistance was rooted in love — love for one's own educational experience, love for the subjects one had mastered, love for the identity one had built within the hierarchy. But he was uncompromising about the stakes. The hierarchy was producing a generation of young people equipped with skills the economy no longer needed and stripped of skills the economy desperately required. The emotional cost of inverting the hierarchy was real, but it was dwarfed by the human cost of preserving it.

AI has raised the stakes beyond anything Robinson could have anticipated. The speed of the transition — from first contact with generative AI tools to a trillion-dollar market correction in the software industry, from novelty to structural economic disruption in less than three years — means that the children currently in the system will graduate into a world that has been reorganised around capacities their education did not develop. The lag between economic reality and educational practice, always significant, has become catastrophic. The factory is still producing the same product. The market for that product has collapsed. And the children inside the factory have no way of knowing, because the factory is the only world they have ever seen.

Robinson called this a crisis of human resources, an analogy to the crisis of natural resources that the environmental movement had been warning about for decades. Just as industrial economies had strip-mined the earth for a narrow range of commodities while ignoring the ecological systems that sustained life, industrial education had strip-mined human minds for a narrow range of cognitive capacities while ignoring the creative ecosystem that sustained innovation, meaning, and human flourishing. AI has accelerated the strip-mining by making the narrow commodities worthless. What remains, buried under decades of educational rubble, is the creative capacity that was always there and that the system never valued because the hierarchy told it not to.

The hierarchy must be inverted. Not adjusted, not tweaked, not supplemented with an after-school arts programme. Inverted. The arts at the foundation. The sciences and mathematics taught through their relationship to creative practice. Assessment redesigned around what students can create, not what they can recall. And the whole structure oriented toward the question that Robinson asked and that AI has made unavoidable: what is this child capable of that no machine can replicate?

Chapter 4: The Element in the Age of the Amplifier

Gillian Lynne was eight years old and failing at school. She could not concentrate. She fidgeted. She could not sit still. She disturbed the other children. The school wrote to her parents. Something was wrong with Gillian. Today, Robinson often noted, she would almost certainly have been diagnosed with ADHD and prescribed medication.

Instead, her mother took her to a specialist. The specialist talked to Gillian for twenty minutes. Then he told her mother he needed to speak with her privately. He turned on the radio and left the room. Through the window, he and Gillian's mother watched the child get up and begin to move. She moved with the music, with a fluidity and responsiveness that was unmistakable to anyone who knew what they were looking at.

"Mrs. Lynne," the specialist said, "Gillian isn't sick. She's a dancer. Take her to dance school."

She did. Gillian Lynne went on to choreograph Cats and The Phantom of the Opera, two of the most successful musicals in history. She was responsible for some of the most iconic movement sequences in the history of musical theatre. She was worth millions.

Robinson told this story in his 2006 TED talk and in nearly every major address thereafter, not because it was unusual but because it was representative. Gillian Lynne found her element — the intersection of natural aptitude and personal passion that Robinson defined as the point where a person does their best work and feels most fully themselves. She found it because one adult, at one critical moment, looked at what the educational system had classified as a pathology and recognised it as a talent.

Most people are not so lucky. Most people never find their element, because the educational system sorts them into categories at an early age and the categories are wrong. The categories are wrong because they are based on a narrow conception of intelligence — the hierarchy of subjects discussed in the previous chapter — that recognises only a fraction of human capability. The child whose intelligence is kinaesthetic is told she has a behavioural problem. The child whose intelligence is musical is told he is not academic. The child whose intelligence is interpersonal — who thinks best in conversation, who understands through relationships — is told she talks too much.

Robinson's life's work was the argument that these children are not deficient. They are misidentified. The system that identifies them is measuring the wrong thing.

---

The element has two components, and both are necessary. The first is aptitude: a natural facility for a particular kind of activity. Not everyone has the same aptitudes, and aptitudes are not distributed according to any logic the educational system recognises. The child who cannot sit still may have a kinaesthetic aptitude that would make her an extraordinary surgeon, athlete, or dancer. The child who daydreams may have an imaginative aptitude that would make him an extraordinary novelist, filmmaker, or theoretical physicist. The system does not test for these aptitudes. It tests for the aptitudes it values, and the ones it values are those that serve the hierarchy.

The second component is passion: the personal engagement that makes work feel like play. Aptitude without passion produces competence — the person who is good at something but does not care about it, who performs reliably but without joy, who arrives at retirement having spent forty years in a career that never once made her feel alive. Passion without aptitude produces frustration — the person who loves music but cannot carry a tune, who loves sport but lacks coordination, who loves writing but cannot find the words.

The element is where aptitude and passion converge, and the convergence produces something Robinson called a different quality of experience. Time distorts. Self-consciousness drops away. The work becomes intrinsically rewarding. The person in their element is not working toward a goal. She is doing the thing itself, and the doing is enough.

Robinson noted that this description is functionally identical to what Mihaly Csikszentmihalyi called flow — the optimal psychological state in which challenge and skill are matched and attention is fully absorbed. The connection was deliberate. Robinson understood that the element was not a mystical concept but a psychological reality, supported by decades of research, and that the educational system's failure to help children find it was not merely an aesthetic failure but a failure of human development with measurable consequences for well-being, productivity, and social health.

---

AI complicates the element in ways that Robinson's original framework did not anticipate and that require careful examination.

The complication is this: if the machine can perform competently in any domain — can write, compose, design, calculate, analyse, build — then what does it mean to have an aptitude? The traditional understanding of aptitude involved the capacity to do something well. Gillian Lynne's aptitude was kinaesthetic: she could move with a precision and expressiveness that others could not match. A mathematical prodigy's aptitude is logical: she can see patterns and relationships in numbers that others cannot perceive. In each case, aptitude was defined partly in relation to scarcity — the thing the person could do that most people could not.

When the machine can do everything competently, the scarcity that defined aptitude shifts. The capacity to do is no longer scarce. What remains scarce is the capacity to care.

This distinction is Robinson's most consequential contribution to the AI conversation, even though he never framed it in these terms. The element was never purely about aptitude. It was about the intersection of aptitude and passion, and passion is the component that no machine can replicate, because passion is a property of conscious beings with needs, desires, and stakes in the world.

A machine can write a competent song. It cannot want to write a song. It cannot feel the specific urgency that drives a songwriter to sit down at three in the morning and try to capture an emotion that will not leave her alone. It cannot experience the frustration of a melody that almost works, the satisfaction of a lyric that finally says the thing she has been trying to say. The machine does not have a three in the morning. It does not have a thing it has been trying to say.

The element, in the AI age, is not the intersection of what a person can do and what a person cares about. It is the caring itself. The aptitude component has been democratised — AI gives everyone the capacity to explore any domain — and what remains as the irreducible human contribution is the passion that directs the exploration. Not what you can do with the tool, but what you want to do with it. Not your capacity, but your care.

---

This redefinition has a liberating implication and a terrifying one, and both deserve honest examination.

The liberating implication is that AI removes the barriers that previously prevented people from exploring domains where their element might reside. Robinson's stories of mismatch — the dancer diagnosed with ADHD, the comedian told she was disruptive, the entrepreneur told he was not academic — were stories of people who found their element despite the system. For every Gillian Lynne who found a perceptive specialist, thousands of children with kinaesthetic intelligence never encountered anyone who recognised it, and their aptitudes went undeveloped and undiscovered. For every child who found a music teacher who ignited a passion, thousands of children in schools without music programmes never had the chance.

AI, at its best, functions as the perceptive specialist that every child deserves and almost none receive. A system that can engage a child in conversation about any subject, follow her interests into any domain, adapt its responses to her pace and modality, and do all of this without the limitations of a single teacher's expertise or a single school's budget — this system is, potentially, the most powerful tool for element discovery in the history of education.

A child who has never had access to music education can explore composition through conversation with an AI that understands music theory, can generate examples, can respond to her attempts with feedback that is patient, specific, and endlessly available. A child in a school with no art programme can explore visual art through the same kind of responsive collaboration. A child whose kinaesthetic intelligence has been pathologised can find, through AI-mediated exploration of dance, sport, physical theatre, or surgery simulation, the domain where her aptitude and passion converge.

The democratisation is real, and its moral significance is enormous. The element was always available, in principle, to every child. In practice, it was available primarily to children whose parents could afford the exploration — the private lessons, the specialised schools, the luxury of allowing a child to pursue a path that did not appear on any standardised career trajectory. AI does not eliminate this inequality, but it narrows it substantially by providing every child with a responsive, knowledgeable, endlessly patient collaborator in the process of self-discovery.

---

The terrifying implication is this: if the element is defined primarily by caring rather than by doing, then finding one's element requires a level of self-knowledge that the educational system has never attempted to develop and that AI cannot provide.

The question shifts from "What are you good at?" to "What do you care about?" And "What do you care about?" is a question that cannot be answered by a standardised test, cannot be resolved by an algorithm, and cannot be bypassed by a machine. It requires introspection, experimentation, failure, the courage to try things that might not work, and the resilience to keep trying after they do not.

Robinson understood this. His book *Finding Your Element* is essentially a guide to the process of self-discovery that the element requires, and the process is not comfortable. It involves confronting the assumptions about yourself that the educational system installed — the belief that you are not creative, the belief that your talents lie in the domain the system rewarded, the belief that passion is a luxury rather than a necessity. These assumptions are deep, and dislodging them requires the specific kind of struggle that no tool can shortcut.

The parallel to concerns about smoothness in the broader culture is direct. If AI removes the friction of doing — if any creative output can be produced through a prompt rather than through the labour of learning a craft — then the process of discovering one's element through that labour is bypassed. Gillian Lynne did not discover her element by being told she was a dancer. She discovered it by dancing — by the physical, embodied, friction-rich experience of moving her body to music and feeling the specific rightness of it. The feeling could not have been produced by a report. It required the doing.

There is a version of AI-assisted element discovery that skips the doing entirely. The child explores music through prompts rather than through the frustrating, rewarding, embodied experience of learning an instrument. She explores art through generated images rather than through the mess and struggle of paint on canvas. She explores writing through edited outputs rather than through the painful process of discovering what she actually has to say.

In each case, the exploration may identify a domain of interest. But it may not produce the deep engagement that Robinson identified as the second component of the element. Passion, in Robinson's framework, is not a preference. It is not the thing you enjoy. It is the thing you cannot stop doing even when it is difficult, the thing that makes difficulty feel like engagement rather than obstacle. That quality of passion requires friction. It requires the experience of struggling with something and discovering, through the struggle, that the struggle itself is what you love.

Robinson would have welcomed AI as a tool for broadening exploration while insisting that exploration alone is not enough. The element requires exploration and depth, breadth and commitment, the willingness to try everything and the willingness to stick with something long enough for the passion to reveal itself. AI excels at the breadth. The depth remains a human task, and it is the task that education, reconceived for the AI age, must learn to support.

The specialist who looked at Gillian Lynne and saw a dancer did not prescribe a treatment or administer a test. He created the conditions for self-discovery: he turned on the radio, left the room, and watched. The most powerful educational intervention was not an instruction but a permission — the permission to move, to be herself, to express the intelligence that the school had been suppressing.

The AI age needs a million such specialists. Machines that can turn on the radio. Adults who know what to watch for. And a system brave enough to leave the room and let the child dance.

Chapter 5: The Teacher Who Stopped Grading Essays

In a classroom in the American Midwest, sometime in the winter of 2026, a high school English teacher did something that would have been unthinkable two years earlier. She stopped grading her students' essays.

She did not stop assigning writing. She did not abandon literacy. She did not capitulate to the machine. What she did was more radical than any of those things, and more aligned with everything Robinson spent his life arguing: she started grading questions instead.

The assignment was simple in description and demanding in execution. She gave her students a topic — the ethical implications of genetic editing, say — and access to an AI tool. The assignment was not to produce an essay. The assignment was to produce the five questions you would need to ask, of the AI, of the source material, of yourself, before you could write an essay worth reading.

The shift sounds modest. It was seismic. Because grading essays, in the industrial model, is grading answers. The essay demonstrates what the student knows, how well the student organises that knowledge, how closely the student's analysis conforms to the expected interpretation. The rubric measures accuracy, structure, evidence, and clarity. These are convergent metrics. They measure how close the student came to the correct output.

Grading questions measures something entirely different. A good question demonstrates not what the student knows but what the student has recognised she does not know. It demonstrates the capacity to identify the gap between what the material says and what it means, between the obvious interpretation and the one that has not yet been considered. It demonstrates divergent thinking — the ability to look at a topic and see not the expected path through it but the paths nobody has taken.

Robinson would have recognised this teacher's intervention as the pedagogical equivalent of turning on the radio and leaving the room. She was not delivering content. She was creating the conditions for discovery. The content was available — the AI could provide any factual information the student needed. What the AI could not provide was the capacity to identify which information mattered, which questions opened productive lines of inquiry, and which assumptions in the material needed to be challenged rather than accepted.

The students who produced the best questions, the teacher reported, were not the students who had previously received the highest essay grades. They were the students who had always been the most curious — the ones who raised their hands to ask the question that was slightly off-topic, slightly unexpected, the question that made the class pause for a moment before someone said, "That's actually a really good point." These students had always been present in the classroom. The grading system had never found a way to reward them, because their strength was not in producing the expected answer but in seeing the unexpected question.

---

Robinson's theory of the teacher's role was grounded in a distinction between two models of education that he articulated with increasing precision throughout his career.

The first model, which he associated with the industrial system, cast the teacher as a deliverer. The teacher's job was to deliver content — to transfer information from the curriculum to the student with maximum efficiency and minimum distortion. The student's job was to receive the content and demonstrate, through examination, that the transfer had been successful. The teacher was evaluated on the quality of the delivery. The student was evaluated on the accuracy of the reception. The relationship between them was transactional: content for credentials.

The second model, which Robinson championed, cast the teacher as a mentor. The mentor's job was not to deliver information but to develop a human being — to recognise the student's particular capacities, to create conditions in which those capacities could emerge and grow, to provide the kind of sustained, personalised attention that allowed a young person to discover what she was capable of and to develop the confidence to pursue it. The mentor evaluated not what the student could reproduce but what the student could originate. The relationship was not transactional but developmental: the mentor invested in the student's growth, and the return on that investment was measured in the quality of the person who emerged.

AI has rendered the deliverer model obsolete with a completeness that should embarrass every institution that continues to practise it. The machine delivers content more efficiently than any human teacher. It is available at all hours. It adapts to the student's pace. It does not have bad days. It does not lose patience. It does not favour the student who reminds it of its own child or penalise the student who asks too many questions. If the teacher's primary function is to deliver information, the teacher has been replaced.

But if the teacher's primary function is mentorship — the human relationship through which a young person discovers her capacities and develops the courage to use them — then the teacher has not been replaced. The teacher has been liberated. The mechanical labour of content delivery, which consumed the majority of classroom time in the industrial model, has been offloaded to a machine, and the teacher is free to do the thing that only a human being can do: see the student.

Seeing the student is Robinson's phrase, and it carries more weight than its simplicity suggests. Seeing the student means recognising the particular form of intelligence that this specific child possesses, which may not be the form the curriculum measures. Seeing the student means noticing the moment when a child's engagement shifts from dutiful to genuine, when the eyes change, when the posture changes, when something in the material has connected with something in the child and a door has opened that was not open before. Seeing the student means knowing when to push and when to wait, when to challenge and when to encourage, when to provide structure and when to step back and let the child discover something on her own.

No machine can do this. Not because the technology is insufficiently advanced, but because seeing another human being requires being a human being — requires the specific empathy that comes from having been a child yourself, from having struggled, from having been seen or not seen by the adults in your own life, from carrying the accumulated understanding of what it means to be a person trying to figure out who you are.

---

The teacher as mentor requires a transformation in teacher training that has barely begun and that the AI moment makes urgent beyond any previous educational reform.

Robinson was blunt about the current state of teacher preparation. Most teacher training programmes, he argued, were designed to produce deliverers. They trained teachers in content knowledge and pedagogical technique — how to present material, how to manage a classroom, how to administer assessments. They did not train teachers in the art of mentorship — how to recognise diverse forms of intelligence, how to create conditions for creative development, how to facilitate the process of element discovery that Robinson placed at the centre of education's purpose.

This failure of training was not malicious. It was structural. The industrial model needed deliverers, and the training system produced them. The assessment system measured delivery, and the training system prepared teachers for the assessment. The entire apparatus was internally consistent. It was simply pointed in the wrong direction.

Repointing the apparatus requires changing not just what teachers are taught but what kind of people are selected for teaching, what qualities are valued in their training, and what capacities are assessed in their professional development. Robinson argued that the best teachers were those who had found their own element — who taught not because it was a stable career but because teaching was the thing that made them come alive. A teacher in her element communicates something to her students that no curriculum can specify: the lived experience of passionate engagement with a discipline. The student sees someone who cares, and the caring is contagious.

AI creates both an opportunity and a trap for teacher development. The opportunity is that AI can handle the delivery function, freeing teacher training to focus on the mentorship capacities that Robinson identified as essential. If the machine delivers the content, the teacher training programme does not need to spend the majority of its time on content delivery. It can spend that time on the harder, subtler, more important work of developing the mentor's eye — the capacity to see students, to recognise their particular forms of intelligence, to create the conditions for element discovery.

The trap is that AI can be used to make teacher training more efficient without making it more effective — to deliver training content through AI platforms, to assess teachers through AI-graded examinations, to optimise the production of deliverers while calling it innovation. This is the pattern Robinson identified in every previous wave of educational technology: the new tool absorbed into the old model, serving the old purposes, producing the old outcomes with a new interface.

---

There is a further dimension to the teacher-as-mentor model that Robinson's work implied but did not fully develop, and that the AI age has made explicit. The mentor does not merely recognise the student's capacities. The mentor models the practice of self-knowledge.

A teacher who grades questions rather than essays is modelling something specific: the practice of inquiry. She is demonstrating, through the structure of her assessment, that the ability to ask is more valuable than the ability to answer. Her students internalise this not primarily through instruction but through observation. They see an adult who values curiosity over compliance, who rewards the unexpected question over the expected answer, who treats uncertainty not as a failure but as the beginning of understanding.

This modelling function cannot be performed by a machine. An AI can ask questions. It can generate questions of considerable sophistication. But it cannot model the practice of questioning as a human activity — the lived experience of a person who finds questions genuinely exciting, who becomes visibly energised when a student asks something unexpected, who demonstrates through her own behaviour that not knowing is the most interesting state a mind can be in.

Robinson understood that children learn more from what adults do than from what adults say. The teacher who says "Be creative" while administering a standardised test is teaching the test. The teacher who says "What do you think?" and then genuinely listens is teaching inquiry. The teacher who says "I don't know — let's find out together" is teaching the most important lesson available in the AI age: that the value of a human being is not located in what she knows but in the quality of the questions she brings to what she does not know.

The English teacher in the Midwest understood this. By changing her assessment from essays to questions, she changed the entire ecology of her classroom. The students who had been rewarded for producing correct answers were now challenged to produce interesting questions. The students who had been invisible — the divergent thinkers, the curious ones, the ones whose strength was in seeing what others missed rather than reproducing what others expected — became visible. The hierarchy within the classroom shifted, not because the teacher declared a new hierarchy but because the assessment revealed one that had always been there and that the old system had been unable to see.

Robinson spent his career arguing that the purpose of education is not to produce workers but to develop human beings. The teacher as mentor is the instrument of that development. AI has not made the instrument obsolete. It has made it indispensable, because it has automated everything else — the delivery, the content, the mechanical functions that the industrial model mistook for the essence of teaching. What remains, stripped of the industrial scaffolding, is the relationship. The adult who sees the child. The mentor who recognises the talent. The teacher who turns on the radio and knows what to watch for when the child begins to move.

That relationship is now the only part of education that a machine cannot replicate, and it is, as Robinson always insisted, the part that matters most.

---

Robinson's final book, *Imagine If*, published posthumously in 2022 with his daughter Kate Robinson, returned to the question of what education could become if it took the mentor model seriously. He did not live to see AI enter the classroom, but the principles he articulated apply with an almost eerie precision to the moment that followed his death.

Education, he wrote, should be personalised to the talents and interests of each student. It should be based on the principle that every student is unique and that the purpose of education is to help each student discover and develop their particular strengths. It should be collaborative, drawing on the collective intelligence of the learning community. And it should be creative — not in the decorative sense of adding arts activities to the margins of the curriculum, but in the structural sense of making creative thinking the foundation of every subject, every lesson, every assessment.

Each of these principles is amplified by AI. Personalisation, which was logistically impossible for a single teacher managing thirty students, becomes feasible when AI handles the adaptive delivery and the teacher focuses on the developmental relationship. Collaboration, which was constrained by the physical boundaries of the classroom, expands when AI connects students with mentors, experts, and peers across geographical and disciplinary boundaries. Creativity, which was suppressed by the industrial model's insistence on convergent outcomes, becomes the explicit goal when convergent tasks have been delegated to machines.

The vision is within reach. The institutional will to reach for it is the question that Robinson's work leaves unanswered and that the AI moment makes inescapable. The teacher who stopped grading essays did not wait for institutional permission. She acted because the technology made the old practice visibly absurd — grading essays that an AI could produce better than any student was an exercise in mutual pretence — and because she understood, whether or not she had read Robinson, that the purpose of her classroom was not to produce essays but to develop minds.

Every teacher in every classroom now faces the same choice. And the choice, as Robinson understood better than anyone, is not about the technology. It is about the model. Use the tool to deliver content more efficiently, and you are building a more efficient factory. Use the tool to free the teacher for mentorship, and you are planting a garden.

The garden requires gardeners. Training them is the most urgent educational project of the AI age, and it has barely begun.

Chapter 6: Ascending Friction in the Classroom

Robinson told a story about a young girl in a drawing lesson. The teacher walked among the desks, pausing occasionally to observe. She stopped at the girl's desk and asked what she was drawing.

"I'm drawing God," the girl said.

"But nobody knows what God looks like," the teacher replied.

"They will in a minute," the girl said.

Robinson loved this story because it captured something essential about children's relationship to creative work: the absence of the fear of being wrong. The girl did not hesitate. She did not qualify. She did not check whether her drawing conformed to any external standard. She was engaged in the act of making something, and the making was its own authority. She would discover what God looked like in the process of drawing Him, not before.

Robinson used the anecdote to illustrate what schools destroy. The girl's confidence — her willingness to attempt something impossible without first asking permission — is precisely the quality that divergent thinking requires and that the educational system, through years of grading, ranking, and penalising incorrect answers, systematically eliminates. By the time the girl reaches secondary school, Robinson argued, she will have learned that drawing God is a category error, that art is subjective and therefore unserious, that the correct answer to any question is the one in the textbook, and that the safest strategy in any educational context is to reproduce what the teacher expects.

The story also illustrates something Robinson did not explicitly address but that the AI age has made inescapable: the relationship between friction and learning. The girl was not drawing God from a template. She was struggling with a representation — making marks, evaluating them, adjusting, discovering through the resistance of the medium what the image wanted to become. The friction was the learning. The difficulty of translating an internal vision into an external mark, and the surprises that emerged from the imperfect translation, was the process through which her creative capacity developed.

---

Every significant technological abstraction in the history of human tool use has removed difficulty at one level and relocated it to a higher one. The pattern is structural, and it operates in education as forcefully as it operates in software development or surgery or any other domain where tools mediate between human intention and its realisation.

Consider the calculator. Before electronic calculators, mathematical education devoted substantial time to computational practice — long division, multiplication of large numbers, extraction of square roots. The practice was tedious. It was also formative. Through hundreds of hours of manual computation, students developed what mathematicians call number sense: an intuitive feel for the behaviour of numbers, for whether an answer was roughly right or wildly wrong, for the relationships between operations. The friction of manual computation deposited understanding.

The calculator removed that friction. Students could now perform complex calculations instantly, without understanding the operations they were performing. The computational layer of mathematical difficulty disappeared. And the critics, who appeared with the reliability of seasonal weather, warned that students would lose number sense, that mathematical understanding would become shallow, that the tool would produce a generation of people who could press buttons but could not think mathematically.

The critics were partly right. Number sense did decline among students who used calculators as a substitute for understanding rather than as a complement to it. But the critics missed the larger trajectory. The calculator freed mathematical education to address problems of a complexity that manual computation could never have reached. Students who were no longer spending hours on long division could spend those hours on statistical reasoning, mathematical modelling, the application of mathematics to real-world problems that required judgment as well as calculation. The difficulty did not disappear. It ascended to a higher floor.

The same pattern repeated with every educational technology that followed. Spell-checkers removed the friction of orthographic accuracy. The critics warned that spelling would decline. Spelling did decline. But the students freed from the anxiety of misspelling could write more freely, take greater risks with vocabulary, and focus on the quality of their thinking rather than the accuracy of their transcription. The difficulty ascended from the mechanical to the compositional.

Word processors removed the friction of revision. Before word processing, revising an essay meant rewriting it by hand or retyping it from scratch. The physical labour of revision was substantial, and it constrained how much revision a student was willing to undertake. Word processors made revision effortless at the mechanical level. The critics warned that students would become careless, producing text without the discipline that physical revision imposed. Some students did become careless. But the best students revised more, not less, because the removal of mechanical friction revealed the cognitive friction of revision — the hard, genuinely difficult work of reading your own prose critically, identifying weaknesses in the argument, reorganising paragraphs for clarity. The difficulty ascended from the physical to the intellectual.

---

AI represents the most dramatic ascent in the history of educational friction, because it removes not just mechanical difficulty but cognitive difficulty at the level of production. The calculator removed the need to compute. The spell-checker removed the need to spell. AI removes the need to produce — the essay, the analysis, the solution, the design, the code. The entire production layer of educational difficulty has been delegated to a machine.

Robinson's framework predicts what should happen next, even though Robinson did not live to see it: the difficulty should ascend to the level of judgment, evaluation, direction, and the capacity to ask questions that frame the production in the first place.

The student who uses AI to produce an essay has not bypassed learning. The learning has relocated. It has ascended from the production floor — where the difficulty was in constructing sentences, organising paragraphs, marshalling evidence — to the evaluation floor, where the difficulty is in determining whether the essay is any good. Whether the argument holds. Whether the evidence is relevant. Whether the framing reveals something genuine or merely reproduces a conventional interpretation. Whether the essay says something worth saying.

This is a harder floor. Evaluation is cognitively more demanding than production for most people, because production can follow templates while evaluation requires judgment, and judgment requires the kind of deep engagement with the material that Robinson identified as the foundation of genuine learning. The student who evaluates an AI-generated essay critically — who reads it not as a finished product but as a draft to be interrogated, challenged, improved, or discarded — is engaged in a form of intellectual work that is more demanding, not less, than the work of producing the essay from scratch.

But ascent is not automatic. This is the critical point that the optimists about AI in education consistently miss, and that Robinson's work makes impossible to ignore. Ascending friction requires pedagogical design. It requires a teacher who understands that the removal of production friction is not the end of the educational process but its beginning — that the AI-generated essay is not the assignment but the starting material for the assignment, and that the real work begins when the student sits with the output and asks: Is this true? Is this interesting? Is this the best version of this argument? What is missing? What assumptions has the machine made that I would not make? What would I say differently, and why?

Without this pedagogical design, the friction does not ascend. It simply disappears. The student submits the AI output as finished work. The teacher grades it as finished work. The transaction is complete. Nobody has learned anything. The factory has produced its product. The product happens to have been manufactured by a machine, but the factory's logic does not distinguish between human and machine production — it measures output, and the output is satisfactory.

---

Robinson's theory of creativity provides the framework for understanding why ascending friction matters beyond the immediate context of any single assignment.

Robinson distinguished between three levels of creative development. The first was imagination — the capacity to think of things that are not present to the senses, to conjure mental images, to envision possibilities that do not yet exist. The second was creativity — the process of producing original ideas that have value, which requires not just imagination but the discipline of crafting the imagined thing into a form that can be shared and evaluated. The third was innovation — the process of putting creative ideas into practice, which requires not just creativity but the capacity to navigate the practical constraints of realisation.

Each level involves a different kind of friction, and each kind of friction is formative.

The friction of imagination is the difficulty of thinking beyond what is already known, of reaching for an idea that does not yet have a name. This friction is internal and psychological, and it requires the courage to be wrong — the willingness to follow a thought into territory where there are no templates, no guarantees, and no external validation. This is the friction that schools destroy when they punish incorrect answers and reward conformity.

The friction of creativity is the difficulty of translating an imagined thing into a communicable form — of turning the internal vision into an external artifact that others can experience and evaluate. This friction is partly technical (the resistance of the medium, the constraints of the form) and partly intellectual (the gap between what you envisioned and what you produced, which forces revision, reconsideration, and deeper engagement with the material). This is the friction that the girl drawing God was experiencing — the friction between her vision and her marks on the page.

The friction of innovation is the difficulty of taking a creative product and making it work in the world — navigating practical constraints, persuading others of its value, iterating in response to feedback. This friction is social and organisational, and it requires resilience, communication skills, and the capacity to maintain a vision through the compromises that realisation demands.

AI removes the second kind of friction almost entirely. The gap between the imagined thing and the produced thing has collapsed to the width of a conversation. What AI cannot remove is the first kind — the courage to imagine something that does not yet exist — or the third kind — the judgment required to determine whether the produced thing is worth implementing and the social skill required to bring others along.

Education that takes ascending friction seriously would focus on the first and third kinds. It would develop the courage to imagine, which requires an environment where wrong answers are valued as evidence of creative risk rather than penalised as failures of conformity. And it would develop the judgment to evaluate and the resilience to implement, which requires the kind of mentorship, collaborative practice, and real-world engagement that Robinson placed at the centre of his vision for schools.

---

The practical implications for curriculum design are immediate and specific, though they require a willingness to abandon assumptions that most educational institutions have never questioned.

Assessment must shift from measuring production to measuring evaluation and direction. The student who can identify the weaknesses in an AI-generated analysis and articulate what a better analysis would include has demonstrated a higher-order capacity than the student who produced the analysis by hand. The assessment should recognise this. Portfolio-based assessment, in which students curate and annotate a body of work that includes AI-generated material they have evaluated, revised, and directed, is one approach. The portfolio demonstrates not what the student produced but what the student understood about production — what she kept, what she discarded, what she improved, and why.

Classroom practice must shift from production exercises to evaluation exercises. The student who reads an AI-generated essay and identifies its three weakest arguments has done harder cognitive work than the student who wrote the essay. The class that debates whether an AI-generated solution to an ethical dilemma is adequate — and if not, what it is missing — is engaged in the kind of critical, creative, collaborative thinking that Robinson spent his life trying to make the centre of education.

Teacher preparation must shift from training teachers to direct production to training them to facilitate evaluation. The teacher's question changes from "Did the student produce the correct output?" to "Can the student tell me what is wrong with this output, what is missing, and what would make it better?" This is a harder question to ask, a harder answer to evaluate, and a more demanding form of teaching. The friction has ascended for the teacher as well as for the student.

Robinson argued that the revolution in education he was calling for would not be easy. It would require changing not just curricula but cultures — the deep assumptions about what education is, what it measures, what it values. AI has not made the revolution easier. It has made it unavoidable. The production floor has been automated. The students are still standing on it, waiting for someone to show them where the stairs are.

The stairs lead up. The difficulty ascends. And at the top of the staircase is the question that Robinson placed at the heart of education, the question that no machine can answer and that every child deserves the chance to ask: What is this worth? What does it mean? And what would I do differently?

Chapter 7: The Democratisation of the Element

In 2015, Robinson visited a school in a low-income district of Los Angeles. The school had been designated as failing by the state's accountability system — test scores were below average, attendance was inconsistent, and the neighbourhood surrounding it was marked by the kind of economic precarity that makes educational achievement feel like an abstraction. The standardised metrics said this school was not working.

Robinson saw something different. He saw a theatre programme run by a teacher who had, against considerable institutional resistance, created a space where students who had been written off by the academic system came alive. Students who could not sit still in a mathematics class held the stage for two hours with a concentration and commitment that no standardised test could measure. Students who had been classified as learning-disabled demonstrated, in rehearsal and performance, a capacity for memorisation, collaboration, physical discipline, and emotional intelligence that their academic records did not begin to reflect.

These students had elements. They had aptitudes and passions that, given the right conditions, produced the quality of deep engagement that Robinson identified as the hallmark of human flourishing. What they did not have was an educational system designed to recognise or develop those elements. The hierarchy of subjects had placed the arts at the bottom. The budget had followed the hierarchy. And the students whose intelligence operated in modes the hierarchy did not value were left to conclude, after years of failing tests that measured the wrong things, that they were not intelligent.

Robinson told this story to illustrate a point that he made with increasing urgency in the final years of his life: the distribution of creative talent is universal, but the distribution of creative opportunity is not. Every child has an element. Not every child has the conditions in which that element can be discovered and developed. The gap between talent and opportunity is the most consequential inequality in education, and it is an inequality that the industrial model does not merely fail to address but actively reinforces, because the hierarchy of subjects ensures that the resources flow to the domains at the top while the domains where many children's talents reside are starved.

---

AI narrows this gap with a speed and scale that no previous educational intervention has achieved. The narrowing is not complete — inequalities of access, connectivity, and institutional support remain substantial — but the direction is unmistakable, and the moral significance of the direction demands honest examination.

Consider the child in a rural school in sub-Saharan Africa. Her school has one teacher for sixty students. There is no music programme, no art programme, no drama programme. The curriculum is focused on literacy and numeracy, because literacy and numeracy are what the standardised assessments measure, and the school's funding depends on its performance on those assessments. The hierarchy of subjects, exported from the industrial economies that designed it, determines what this child is taught, and the hierarchy has no room for the creative domains where her element might reside.

Before AI, this child's creative potential was, in practical terms, inaccessible. She could not explore music without a music teacher. She could not develop visual art without materials and instruction. She could not discover whether her aptitude lay in writing, design, theatre, coding, or any of the other domains that the curriculum did not offer, because the exploration required resources her school did not have.

AI does not solve the infrastructure problem. The child still needs electricity, connectivity, and a device. These are not trivial barriers, and any honest account of democratisation must acknowledge that billions of people lack one or more of these prerequisites. But where those prerequisites exist — and they exist in rapidly expanding geographies — the child now has access to something that Robinson could only have imagined: a responsive, knowledgeable, endlessly patient creative collaborator that can engage her in any domain of human creative activity, at her pace, in her language, adapted to her interests and her level of development.

The collaborator is not a teacher. Robinson was emphatic that the human relationship at the heart of education could not be replaced by a machine, and nothing in the intervening years has weakened that argument. But the collaborator provides something that the single teacher managing sixty students cannot: the breadth of exploration that element discovery requires.

A child does not find her element by being assigned to a domain. She finds it by exploring many domains and discovering which one produces the specific quality of engagement that Robinson described — the loss of self-consciousness, the distortion of time, the sense that the work is intrinsically rewarding. This exploration requires exposure, and exposure requires breadth, and breadth requires resources that most schools in most countries do not have.

AI provides the breadth. It does not replace the depth that a skilled mentor provides, the human relationship through which curiosity becomes commitment and exploration becomes mastery. But it provides the starting conditions without which the mentor's work cannot begin — the exposure to possibilities, the discovery of affinities, the first encounter with a domain that makes the child's eyes change.

---

The democratisation operates along three axes, each of which addresses a specific form of inequality that Robinson identified in his work.

The first axis is geographical. Robinson observed that the quality of creative education varied dramatically by location. A child in a well-funded school in Helsinki had access to music instruction, visual art studios, drama workshops, design labs, and teachers trained to facilitate creative development across disciplines. A child in an under-resourced school in rural Mississippi had access to none of these things. The geography of creative opportunity was, in Robinson's analysis, a geography of injustice — the accident of where a child was born determined whether her creative capacities would be developed or ignored.

AI does not eliminate geographical inequality, but it provides a floor beneath which creative opportunity need not fall. The child in rural Mississippi, given a device and connectivity, can now explore composition through a conversation with an AI that understands music theory. She can experiment with visual art through tools that generate images from descriptions and that provide feedback on composition, colour, and form. She can write stories and receive responses that engage with her narrative choices rather than merely correcting her grammar. The floor is not the ceiling — the child in Helsinki still has advantages that the child in Mississippi does not — but the floor has risen, and the moral significance of that rise is substantial.

The second axis is economic. The best creative education has always been expensive. Private music lessons, art supplies, specialised tutoring, the luxury of allowing a child to explore without the pressure to specialise early — these are privileges of economic position. The child whose parents can afford a Montessori education, where creative exploration is built into the pedagogical model, has a structural advantage over the child whose parents must choose the cheapest school available, where the curriculum is stripped to the statutory minimum.

AI reduces the economic barrier to creative exploration dramatically. A monthly subscription to an AI tool costs less than a single private music lesson. The tool does not replace the lesson — the embodied, relational, mentored experience of learning from a human teacher — but it provides a form of creative engagement that was previously unavailable to children whose families could not afford it. The child who has never held a musical instrument can explore the principles of melody, harmony, and rhythm through conversation. The child who has never visited a museum can explore visual composition through generated images that she directs, evaluates, and iterates on.

The third axis is social. Robinson documented the phenomenon of social sorting — the process by which children from certain backgrounds were tracked into certain educational pathways, with creative and artistic pathways reserved for children whose social position permitted them. The child from a working-class family was directed toward vocational training. The child from a professional family was directed toward academic achievement. Neither pathway prioritised creative development, but the professional pathway at least tolerated it as an enrichment activity, while the vocational pathway had no room for it at all.

AI disrupts social sorting by providing every child, regardless of background, with a creative collaborator that does not know or care about social position. The machine does not track. It does not sort. It responds to the child's interests with equal responsiveness whether the child's parents are doctors or cleaners. This is not a solution to systemic inequality — the social structures that sort children into pathways remain intact — but it is a crack in the sorting mechanism, a source of creative opportunity that the social system does not control.

---

Robinson would have welcomed the democratisation and immediately identified its limits. The limits are real, and any honest account of AI's educational potential must confront them.

The first limit is that exploration is not development. A child who explores music through AI has had an experience. She has not developed a skill. The development of creative skill requires the kind of sustained, friction-rich engagement that Robinson identified as the second component of the element — the practice, the struggle, the embodied experience of doing something difficult repeatedly until mastery begins to emerge. AI can provide the breadth of exploration. It cannot yet provide the depth of development. The depth requires a human mentor, a community of practice, and the kind of institutional support that no tool can substitute for.

The second limit is that access to the tool does not guarantee access to the conditions in which the tool is educationally useful. A child who has a device and connectivity but no adult who understands how to guide her use of the tool is unlikely to discover her element through undirected AI interaction. She is more likely to use the tool for entertainment — which is not without value but is not the same as education. The tool requires a pedagogical context, and the pedagogical context requires human beings who understand both the tool and the child.

The third limit is that the tool's responsiveness can itself become a barrier to the kind of struggle that creative development requires. If every creative attempt is met with immediate, supportive feedback, the child may never experience the productive frustration that Robinson identified as essential to creative growth — the moment when the drawing does not match the vision, when the melody resists the harmony, when the story refuses to cohere. That frustration is the signal that creative development is happening. Smooth it away, and the development stalls.

Robinson's vision of education was never technological. It was human. The purpose of education, in his framework, was to develop human beings — to help each child discover her unique capacities and develop the courage to use them. AI is a powerful tool in service of that purpose, but it is a tool, and a tool is only as good as the model it serves. A hammer can build a house or destroy one. An AI can open doors to creative exploration or provide a frictionless path to creative superficiality.

The democratisation is real. The element, which was always universal in potential, is now closer to universal in opportunity. But opportunity is not development, and the distance between the two is the distance that education must traverse. The tool provides the starting point. The journey requires everything Robinson spent his life advocating: teachers who see students, schools designed for cultivation, curricula that value creative development, and a culture that treats every child's element as worth discovering and worth supporting.

The floor has risen. The question is whether the ceiling will rise with it.

Chapter 8: Flow, Compulsion, and the Element's Dark Twin

Robinson told audiences that when people are in their element, they lose track of time. The work becomes absorbing to the point where external pressures recede, self-consciousness fades, and the effort, though it may be intense, does not feel like effort. It feels like the most natural thing in the world. He described musicians who played for hours and emerged surprised that the sun had set. Scientists who forgot to eat. Dancers who rehearsed until their bodies ached and then rehearsed some more, not because anyone demanded it but because stopping felt like interrupting a conversation with themselves.

Robinson presented this experience as unambiguously positive. It was the evidence that a person had found her element. The loss of time was the diagnostic marker. If you lose track of time doing something, you have found the thing you should be doing. Your education should have helped you find it sooner. Your career should be built around it. The element, identified by the quality of engagement it produces, was Robinson's answer to the question of what education was for.

Csikszentmihalyi's research supported this. Flow — the psychological state characterised by complete absorption, matched challenge and skill, clear goals, immediate feedback, and intrinsic motivation — was, in Csikszentmihalyi's data, the state in which human beings reported the highest levels of satisfaction, meaning, and well-being. People in flow were not just productive. They were, by their own account, happy. Not the happiness of leisure or comfort but the deeper happiness of being fully engaged with something that demanded the best they had to give.

Robinson's element and Csikszentmihalyi's flow describe the same phenomenon from complementary angles. Robinson was interested in the life trajectory — the long arc of discovering what you are for and building a life around it. Csikszentmihalyi was interested in the momentary psychology — the specific characteristics of the experience when it is happening. Together, they make a compelling case that deep, voluntary engagement with challenging work is the foundation of human flourishing, and that the purpose of education should be to create the conditions in which this engagement can develop.

AI has complicated this case by producing a phenomenon that looks identical to the element and to flow from the outside but may be something categorically different from the inside.

---

The accounts from the early months of 2026 are remarkably consistent. Builders, developers, designers, writers — people across a range of creative professions — described experiences of working with AI tools that matched every criterion of flow. Complete absorption. Loss of time. Challenge and skill in balance. Clear goals. Immediate feedback. Intrinsic motivation. The experience was, by their own account, the most engaged they had ever been with their work. The tools produced a quality of creative partnership that many described as transformative — the dissolution of the boundary between intention and realisation, the feeling of being simultaneously more capable and more fully themselves.

And they could not stop.

The inability to stop is the diagnostic marker that separates the element from its dark twin. In Robinson's framework, a person in her element could stop. She chose not to, because the work was intrinsically rewarding. The choice was the key. The engagement was voluntary, and the voluntariness was part of what made it fulfilling. The musician who played for hours was not compelled. She was called. The difference between being called and being compelled is the difference between a vocation and an addiction, and the difference lives not in the behaviour, which looks identical from the outside, but in the phenomenology — the internal experience of the person doing the work.

The phenomenology is slippery, because the very mechanisms that produce flow can, under certain conditions, produce compulsion. Challenge-skill balance, one of Csikszentmihalyi's core conditions for flow, is also the mechanism that makes slot machines addictive — the calibration of difficulty to keep the player in a state of continuous engagement. Immediate feedback, another core condition, is also the mechanism that makes social media compulsive — the notification, the like, the response that arrives before you have decided to check. Clear goals can become the tyranny of the next milestone, the inability to stop because there is always one more thing to accomplish.

AI tools, by their nature, provide all of these conditions with an intensity that no previous creative tool has matched. The feedback is instantaneous. The challenge adjusts to the user's level in real time. The goals are always clear because the user defines them in the moment and the machine responds immediately. The flow conditions are optimised to a degree that Csikszentmihalyi could not have imagined when he was studying chess players and rock climbers in the 1970s.

The optimisation is the problem. When the conditions for flow are externally provided to this degree, the distinction between flow and compulsion becomes difficult to draw — difficult for the person experiencing it, and almost impossible for an observer. The builder who works until three in the morning with an AI tool may be in her element. She may be doing the most meaningful work of her life. Or she may be trapped in a loop of optimised engagement that mimics flow while producing the specific neurological and psychological signature of addiction: the inability to stop, the anxiety when the tool is unavailable, the erosion of other activities and relationships, the grey fatigue that arrives not during the engagement but after it.

---

Robinson's framework provides a diagnostic that the AI discourse has largely missed, and it is worth articulating carefully because the stakes are high.

The element, in Robinson's definition, has two components: aptitude and passion. Passion, in this context, is not mere enthusiasm. It is what Robinson called "being in your zone" — a sustained orientation toward a domain that persists through difficulty, that deepens over time, that integrates with the rest of your life rather than consuming it. A person in her element is more fully herself, not less. The engagement enriches her relationships, her sense of identity, her capacity for joy in other areas of life. The element is expansive. It makes the whole life larger.

Compulsion is contractive. It makes the whole life smaller. The person in the grip of compulsion is less present in her relationships, less engaged with activities outside the compulsive domain, less capable of the kind of idle, unstructured time in which creativity — real creativity, the kind that requires the courage to sit with not knowing — germinates. The engagement does not enrich the rest of life. It displaces it.

Robinson did not address this distinction explicitly, because in the pre-AI era the conditions for creative compulsion were less prevalent. The musician who played for hours was experiencing embodied friction — physical fatigue, the limitations of the instrument, the need for food and rest — that provided natural boundaries. The scientist who forgot to eat eventually remembered, because the body insisted. The creative engagement was bounded by the physical constraints of the medium and the biological constraints of the practitioner.

AI removes many of these constraints. The tool does not tire. It does not need to eat. It is available at three in the morning with the same responsiveness it had at three in the afternoon. The physical friction that once bounded creative engagement has been replaced by a frictionless interface that provides continuous, optimised stimulation. The body still tires, but the mind, hooked into a feedback loop of extraordinary responsiveness, can override the body's signals with a persistence that previous creative tools did not enable.

This is new territory. Robinson's framework does not fully map it, but it provides the compass.

---

The compass points toward self-knowledge. Robinson argued throughout his career that education should develop not just skills but self-understanding — the capacity to know who you are, what engages you, what your strengths and weaknesses are, and what kind of life you want to build. Self-knowledge was not an add-on to Robinson's educational vision. It was the foundation. Without self-knowledge, the element cannot be found, because finding the element requires the capacity to distinguish between what you genuinely care about and what you have been told to care about, between what engages you and what merely occupies you, between the work that makes you more yourself and the work that makes you less.

In the AI age, self-knowledge must extend to the capacity to distinguish between flow and compulsion in your own experience. This is not an intellectual distinction. It is a felt distinction — a matter of internal awareness that requires the kind of reflective practice that the industrial model of education has never attempted to develop.

The diagnostic questions are specific and answerable, though answering them honestly requires a kind of attention that compulsion works to prevent. Am I doing this because I choose to, or because I cannot stop? When I finish, do I feel expanded — more alive, more present, more connected to the rest of my life — or contracted, depleted, as though the engagement took something from me rather than giving me something? Is the work integrating with my life or displacing it? Do I want to tell someone about what I built, or do I want to build more immediately without pausing to share?

Robinson would have insisted that these questions be asked not once but continuously, because the element is not a fixed point. It is a dynamic state, and the conditions that produce it can shift. What begins as genuine creative engagement can, over time and under the pressure of optimised tools, shade into compulsion. The musician who played for hours out of love can, if the instrument becomes too responsive, too accommodating, too perfectly calibrated to keep her playing, find that the love has been replaced by something less nourishing and harder to name.

Education that prepares children for the AI age must develop this self-knowledge as a core capacity. Not as a unit on social-emotional learning, bolted onto the curriculum as an afterthought. As a foundational practice, woven into every subject and every activity, because the capacity to know the difference between what engages you and what enslaves you is the capacity on which every other educational outcome depends.

The element is real. The flow that accompanies it is real. The human flourishing that Robinson built his life's work around is real. But the dark twin is real too, and it wears the element's face, and the only instrument that can tell them apart is the one the industrial model of education has spent two centuries neglecting: the student's knowledge of herself.

Robinson's vision was always, at its deepest level, about this. Not about arts programmes or curriculum reform or the hierarchy of subjects, though all of these mattered. About the development of human beings who know themselves well enough to build lives of genuine engagement rather than optimised compulsion. AI has made that vision more urgent than Robinson could have known, because the tools that enable the element and the tools that simulate it are now the same tools, and only the person using them can tell the difference.

The schools that teach children to tell the difference will produce the citizens the AI age needs. The schools that do not will produce the casualties.

Chapter 9: Building Schools Worthy of the Children Inside Them

Robinson was once asked what he would do if he were made education minister for a day. He declined the hypothetical. Not because he lacked ideas — he had more specific, practical proposals for educational transformation than almost any public intellectual of his generation — but because the premise was wrong. A day was not enough. A minister was not enough. The transformation he was calling for could not be imposed from the top of a system designed to resist transformation. It had to grow from the inside, school by school, classroom by classroom, teacher by teacher, in the way that all organic change grows: slowly at first, then with an acceleration that institutional structures cannot contain.

Robinson's refusal of the minister-for-a-day scenario was not modesty. It was diagnosis. The industrial model of education was not merely a policy choice that could be reversed by a different policy choice. It was an ecosystem — a set of interlocking incentive structures, institutional habits, professional identities, assessment frameworks, and cultural assumptions that reinforced each other with the self-sustaining logic of any complex system. Change one element and the others would compensate, restoring equilibrium. Change the curriculum without changing the assessment and the assessment would reshape the curriculum back to its original form. Change the assessment without changing university admissions criteria and the admissions criteria would pull the assessment back to what they required. Change the admissions criteria without changing employer expectations and the system would find another way to sort children into the categories the economy demanded.

Robinson understood that the system could not be reformed piecemeal. It had to be reimagined whole. And the reimagining had to start not with policy but with a question: what is education for?

The industrial answer, never stated so baldly but visible in every structural choice the system made, was: education is for producing economically useful citizens. The curriculum was designed around the skills the economy needed. The assessment was designed to measure whether students had acquired those skills. The sorting mechanism was designed to allocate students to economic roles based on their measured performance. The entire apparatus served the economy, and it served it well, producing generation after generation of workers who could perform the standardised tasks the industrial economy required.

Robinson's answer was different, and he stated it with the clarity of someone who had been refining a single idea for thirty years: education is for developing human beings. Not human resources. Human beings. Individuals with unique capacities, unique passions, unique potential contributions to the world. The purpose of education is not to produce what the economy needs but to develop what each child is capable of, in the faith that a population of fully developed human beings will produce an economy, a culture, and a society worth inhabiting.

---

AI has made Robinson's question urgent in a way that philosophical argument alone could never achieve, because AI has destroyed the industrial answer. The economy no longer needs what the industrial model produces. The standardised competence that twelve years of industrial education were designed to develop is now available from a machine for the cost of a monthly subscription. The sorting mechanism, which justified the entire apparatus by promising that compliance would be rewarded with economic opportunity, sorts students into categories of declining relevance. The hierarchy of subjects, which prioritised the skills the economy valued most, has been inverted by a technology that performs those skills better than any human.

The industrial answer is dead. It has not yet been buried, because institutions outlive their premises with the tenacity of organisms that have forgotten why they exist. But it is dead, and every month that passes makes the death more visible. The schools that continue to produce standardised convergent competence are producing graduates for an economy that no longer wants them, while systematically suppressing the creative, divergent, imaginative capacities that the economy desperately needs. The factory is still running. Its product is unsaleable. And the children inside it are paying the cost.

What does a school worthy of this moment look like? Robinson offered principles throughout his career, and the AI age gives those principles a specificity they did not previously possess.

---

The first principle is that the arts must move from the periphery to the foundation.

This is not an aesthetic preference. It is an economic and developmental necessity grounded in the simple observation that the capacities the arts develop — imagination, creative expression, aesthetic judgment, emotional intelligence, the courage to create something that has never existed before — are the capacities that AI cannot replicate and that the transformed economy values above all others. A school that places the arts at the foundation is not a school that has abandoned rigour. It is a school that has redefined rigour around the capacities that matter.

What this looks like in practice: every student, from the earliest years, engages daily with creative practice across multiple art forms. Not as enrichment. Not as a break from the real work of mathematics and literacy. As the foundation on which all other learning is built. Mathematics is taught through its relationship to music, to visual composition, to the physics of theatrical design. Literacy is developed through storytelling, through dramatic writing, through the craft of making language do something it has not done before. Science is taught through inquiry and experimentation that mirrors the creative process — hypothesis as imagination, experiment as creation, analysis as evaluation.

The school does not abandon mathematics or science or literacy. It teaches them differently, through the creative practices that develop the capacities the AI age requires. The hierarchy is inverted, not to diminish the former top but to elevate the former bottom, and the elevation changes everything about how every subject is taught and learned.

---

The second principle is that assessment must measure what matters rather than what is easy to measure.

The standardised test measures convergent competence: can the student produce the correct answer? This measurement was never adequate — Robinson argued this for thirty years — but it was at least defensible when the economy valued the thing it measured. Now the economy does not value it, and the measurement persists through inertia alone.

Assessment in a school worthy of this moment would be portfolio-based. Students would curate bodies of work that demonstrate not what they can recall but what they can create, evaluate, and direct. The portfolio would include AI-assisted work — the school would not pretend AI does not exist — but the assessment would focus on the student's role in the collaboration: What did she ask? What did she keep and what did she discard? What did she improve? What did she evaluate as inadequate and why? What questions did she generate that reframed the problem?

The portfolio would be evaluated not by a rubric that measures proximity to a correct answer but by a process that assesses the quality of thinking the work reveals. The shift from rubric to process is significant: it requires teachers who can evaluate creative and critical thinking rather than merely grade accuracy, and it requires a training infrastructure that develops this capacity in teachers who were themselves educated within the industrial model.

---

The third principle is that teachers must be trained as mentors rather than deliverers.

The previous chapters have examined this principle in detail. Here, the practical implications need specification. Teacher training programmes must be redesigned around the mentor model. The curriculum of teacher education must shift from content knowledge and delivery technique to developmental psychology, creative facilitation, and the art of seeing students — recognising the particular forms of intelligence each child possesses and creating the conditions in which those forms can develop.

This redesign is not a supplement to existing training. It is a replacement. The deliverer model of teacher education produces deliverers, and deliverers are now redundant. The mentor model of teacher education must produce mentors, and producing mentors is a fundamentally different enterprise. It requires different selection criteria — selecting for empathy, creative capacity, and the ability to form developmental relationships rather than for academic achievement in a content domain. It requires different training methods — apprenticeship with experienced mentors rather than lectures on pedagogy. It requires different assessment — evaluation of the teacher's capacity to see and develop students rather than to deliver and test content.

---

The fourth principle is that AI must be integrated as a creative partner rather than deployed as a surveillance tool or an efficiency mechanism.

The most common institutional response to AI in education, as of the present moment, is prohibition or control. Schools ban AI tools, install plagiarism detectors, design assignments intended to be AI-proof. This response is the educational equivalent of the Luddites breaking looms. It is emotionally understandable, practically futile, and strategically catastrophic. The tools exist. The students use them. The prohibition teaches students to use them covertly rather than wisely, and it forfeits the educational opportunity that wise use could provide.

The alternative is integration — teaching students to use AI tools as creative partners while developing the distinctly human capacities that the partnership requires. The student who uses AI to generate a first draft and then evaluates, revises, and redirects it is developing higher-order skills than the student who produces the draft by hand. The student who uses AI to explore a creative domain she has never encountered and then decides whether the domain engages her deeply enough to pursue is conducting the element discovery that Robinson placed at the centre of education's purpose.

Integration requires pedagogical design. It requires teachers who understand both the capabilities of the tools and the developmental needs of the students. It requires assessment frameworks that evaluate the student's contribution to the human-AI collaboration rather than attempting to isolate the human output from the machine output. And it requires a cultural shift — from treating AI as a threat to be defended against to treating it as a capability to be developed, the way we treat literacy or numeracy: a foundational capacity that every student needs and that the school is responsible for developing.

---

The fifth principle, and the one that encompasses all the others, is that the school must be designed around the question Robinson asked throughout his career: does this educational experience help the child find her element and develop the courage to live in it?

Every decision — curricular, pedagogical, organisational, technological — should be evaluated against this question. Does placing the arts at the foundation help children find their element? Yes, because it expands the domains of exploration and validates forms of intelligence the hierarchy suppressed. Does portfolio-based assessment help children find their element? Yes, because it rewards the creative engagement that element discovery requires rather than the standardised performance that conceals it. Does the mentor model help children find their element? Yes, because the mentor's job is precisely to see the child's particular capacities and create conditions for their development. Does AI integration help children find their element? Yes, if it broadens exploration, and no, if it bypasses the friction that creative development requires.

The question provides the compass. The principles provide the map. The territory is still being explored, school by school, teacher by teacher, in the organic way Robinson predicted. The AI moment has not provided a blueprint for educational transformation. It has provided the conditions under which transformation becomes unavoidable — the economic obsolescence of the industrial model, the technological availability of the tools the gardening model requires, and the moral urgency of a generation of children whose creative capacities are being suppressed at precisely the moment when those capacities are the most valuable thing they possess.

Robinson's final public statement on education, delivered at the EduTECH conference in 2019, was characteristically precise and characteristically hopeful. AI would not be the apocalypse, he told his audience. It would be the end of civilisation as we know it. Some things would be swept away. Other things, unforeseen and unforeseeable from the present vantage point, would emerge. The task was not to resist the change or to accelerate it uncritically but to learn from the past, to think about the positives, and to adapt.

Adaptation, for Robinson, was never passive. It was the most creative act available to a species that had been adapting, with varying degrees of success, for seventy thousand years. The schools that adapt to the AI age will not look like the schools of the industrial age. They will look like what Robinson always described: places where children are known, where their capacities are developed, where creativity is the foundation rather than the afterthought, where the question that organises every decision is not "Did you get the right answer?" but "Did you find the right question?"

Those schools exist. Robinson spent the final decade of his life documenting them. They exist in Finland, where the curriculum has been redesigned around creative inquiry. They exist in pockets of the American and British systems, where individual teachers and principals have broken with the industrial model. They exist in experimental schools in Singapore, in community-based programmes in Brazil, in alternative schools in Australia that have placed the arts and creative development at the centre of learning.

They are still the exception. The factory remains the norm. The distance between the exception and the norm is the distance that a generation of children cannot afford to wait for the system to traverse. The tools are available. The economic argument is settled. The only thing missing is the will — the collective decision, made not by a minister for a day but by teachers, parents, administrators, and communities, school by school and classroom by classroom, to build something worthy of the children inside it.

Robinson spent thirty years making the case. AI has closed it. The question is no longer whether the industrial model should be replaced. The question is whether the adults responsible for the next generation of children will have the courage to replace it before the children pay the cost of their hesitation.

---

Epilogue

The girl drawing God has not left my thinking since I first encountered her in Robinson's work.

Not because the story is charming, though it is. Not because it illustrates a pedagogical point, though it does that too. Because she represents something I have been trying to articulate since the moment I started writing *The Orange Pill* — something about the relationship between courage and creation that the AI revolution has simultaneously threatened and amplified.

She did not ask permission. She did not check whether her project was feasible, whether God was within scope, whether the deliverable could be validated against a rubric. She picked up a crayon and began. And in that beginning was the entire argument Robinson spent his life making: that the creative impulse is native to every child, that it requires no justification beyond itself, and that the systems we have built to educate children are the very systems that teach the impulse out of them.

I think about her when I sit down with Claude at two in the morning and describe something that does not yet exist. When I say, in plain language, what the thing should feel like, what it should do, who it is for — and the tool builds it. The distance between imagination and artifact collapses; the gap between the vision in my head and the object in the world shrinks to the width of a conversation.

That collapse is real, and it is extraordinary. But Robinson's framework forces me to ask a question the exhilaration wants to skip past: did the girl learn something about God in the drawing of Him that she could not have learned any other way? Was the friction of the crayon against the paper — the imperfect marks, the gap between what she saw in her mind and what appeared on the page — part of the discovery? And if it was, what happens when the friction disappears?

The answer I keep arriving at, the one this book has helped me articulate, is that the friction does not disappear. It ascends. The girl with the crayon was wrestling with the medium. The builder with the AI tool is wrestling with something harder: judgment, direction, the question of what is worth building and for whom. The crayon made the mechanical work visible. The AI makes it invisible. But the creative work — the courage to begin, the taste to evaluate, the vision to direct — remains human, and it remains hard.

Robinson did not live to see the tools that proved his argument. He spent three decades telling the world that schools were educating creativity out of children, and the world nodded and continued administering standardised tests. Then the machines arrived, and the standardised competence those tests measured became available for a hundred dollars a month, and the creative capacities Robinson championed became the only capacities the economy could not automate, and the argument was won — not by philosophy but by the market.

I wrote in *The Orange Pill* that our educational establishments are not prepared for this change and are staffed by institutions of calcified pedagogy. Robinson's work tells me the calcification goes deeper than I realised. It is not just that the curriculum is wrong or the tests are misaligned. It is that the entire model — the factory, the hierarchy of subjects, the teacher as deliverer, the assessment of convergent competence — was designed for a world that no longer exists, and every year it continues to operate, it produces children equipped for that vanished world and unequipped for the one they actually inhabit.

The twelve-year-old I described in *The Orange Pill*, the one who asks her mother "What am I for?" — Robinson would have heard that question and recognised it immediately. Not as a crisis to be managed or a developmental milestone to be addressed. As the most important question education exists to serve. The entire apparatus of schooling, the billions spent, the millions employed, the twelve years of a child's life consumed — all of it should be oriented toward helping her answer that question. And the answer cannot be delivered. It can only be discovered, through the kind of exploratory, creative, friction-rich engagement that Robinson championed and that the industrial model systematically prevents.

AI can open the doors. It can give every child, regardless of geography or economic position, access to the creative exploration that element discovery requires. But it cannot walk through the door for them. That walk requires courage, and courage requires a specific kind of education — the kind that treats being wrong as the beginning of understanding rather than its failure, that values the question over the answer, that sees every child as a unique configuration of talent and passion waiting to be discovered.

Robinson's work is, in the end, about worthiness — a word I used in the final chapter of *The Orange Pill* to describe what the AI moment demands of us. The tools are more powerful than any in human history. The question is whether we are worthy of them. And Robinson spent his life arguing that worthiness begins in childhood, in the schools where we either develop or destroy the capacities that make human beings irreplaceable.

We are asking the wrong question when we ask whether AI will replace teachers. The right question is whether teachers — and the systems that train them, fund them, assess them, and constrain them — will develop the human beings that AI cannot replace. Robinson answered that question with his life's work. The answer was always yes, if the adults have the courage to build the schools the children deserve.

We have the tools. We have the knowledge. We have Robinson's thirty years of argument, now validated by a technological revolution he did not live to see. What remains is the will.

I keep coming back to the girl and her crayon. She did not wait for permission. She did not ask whether the project was feasible. She began.

That is what the moment requires. Not permission. Not feasibility studies. Not another generation of children sacrificed to a model that has outlived its purpose.

Begin.

— Edo Segal

Ninety-eight percent of five-year-olds are creative geniuses.
School fixes that.
AI just made the fix unforgivable.

The industrial model of education spent two centuries optimizing for the single correct answer. Now a machine produces correct answers better than any human — and the creative capacities that schools crushed are the only ones the economy cannot automate. Ken Robinson warned us for thirty years. We nodded politely and kept administering standardized tests. This book applies Robinson's revolutionary framework to the AI moment described in Edo Segal's *The Orange Pill*, revealing that the deepest crisis is not technological but educational: a generation of children being prepared for a world that no longer exists, stripped of the divergent thinking that the world now desperately needs. The hierarchy has inverted. The factory must become a garden. The question is whether the adults will have the courage to rebuild before another generation pays the cost.

"Creativity is as important now in education as literacy, and we should treat it with the same status."
— Ken Robinson
WIKI COMPANION


A reading-companion catalog of the 33 Orange Pill Wiki entries linked from this book — the people, ideas, works, and events that Ken Robinson — On AI uses as stepping stones for thinking through the AI revolution.
