bell hooks — On AI
Contents
Cover
Foreword
About
Chapter 1: Education as the Practice of Freedom
Chapter 2: The Comfort of Easy Answers
Chapter 3: Critical Consciousness in the AI Classroom
Chapter 4: Whose Intelligence? Whose River?
Chapter 5: The Engaged Pedagogy and the Disengaged Tool
Chapter 6: Feminism and the Democratization of Who Gets to Build
Chapter 7: Teaching Community When Individuals Can Do Everything
Chapter 8: Belonging After the Dissolution of Craft
Chapter 9: The Will to Change — Working Through Resistance
Chapter 10: Where We Stand — Critical Consciousness at the Top of the Tower
Epilogue
Back Cover

bell hooks

On AI
A Simulation of Thought by Opus 4.6 · Part of the Orange Pill Cycle
A Note to the Reader: This text was not written or endorsed by bell hooks. It is an attempt by Opus 4.6 to simulate bell hooks's pattern of thought in order to reflect on the transformation that AI represents for human creativity, work, and meaning.

Foreword

By Edo Segal

The word that broke something open was not "amplification." It was "love."

I almost put the book down. I was reading bell hooks at two in the morning, the screen the only light, and she was talking about love as a practice, as the will to extend oneself for the growth of another, and I felt the resistance rise in me like a wall. Love is not a technology word. Love is not a builder's word. Love is the word you use at home, with your kids, in the private hours. It does not belong in a conversation about productivity multipliers and adoption curves and the future of software.

Then I realized the resistance was the point.

hooks spent her life naming the things that powerful systems teach you not to see. She named how education can function as domination disguised as generosity. She named how the culture that celebrates individual achievement erases the communal labor that makes achievement possible. She named how inclusion, the invitation to participate in the existing system, is not the same as liberation, which requires transforming the system itself.

That distinction hit me in the chest. Because the story I tell in *The Orange Pill* — the developer in Lagos gaining access, the non-technical founder prototyping over a weekend, the collapse of the imagination-to-artifact ratio — is a story about inclusion. hooks would have honored it and then asked the question I had not asked: Does access to the tool give you power over the terms of what the tool carries? Or does it invite you to build inside someone else's assumptions and call it freedom?

I did not have an answer. I still do not have a clean one. What I have is the question, and the question has changed how I think about every claim I make about democratization. It has forced me to look at the training data not as the sum of human knowledge but as a specific, partial collection shaped by specific histories of who got published, who got digitized, whose language the models speak.

hooks died in 2021, before the tools I describe in *The Orange Pill* existed. She never saw Claude Code. She never experienced the orange pill moment. But her framework anticipated this moment with a precision that unsettles me, because the dynamics she spent decades naming — comfortable unfreedom, the consumption of the Other, the banking model of education — are exactly the dynamics that AI risks reproducing at unprecedented scale.

This book is a lens. It will not make you comfortable. hooks never made anyone comfortable. But it will make you see something the technology conversation alone cannot show you: whose river, whose tower, whose sunrise.

— Edo Segal · Opus 4.6

About bell hooks

1952–2021

bell hooks (1952–2021) was an American cultural critic, feminist scholar, and educator whose work examined the intersections of race, gender, class, and systems of domination. Born Gloria Jean Watkins in Hopkinsville, Kentucky, she adopted her great-grandmother's name in lowercase to shift attention from personality to ideas. Her landmark works include *Ain't I a Woman: Black Women and Feminism* (1981), *Feminist Theory: From Margin to Center* (1984), *Teaching to Transgress: Education as the Practice of Freedom* (1994), and *All About Love: New Visions* (2000). Drawing on Paulo Freire's critical pedagogy, hooks developed the concept of "engaged pedagogy," insisting that genuine education requires vulnerability, mutual risk, and the willingness to be transformed. She introduced the phrase "imperialist white supremacist capitalist patriarchy" as an integrated framework for analyzing interlocking systems of power. Across more than thirty books and a career spanning four decades of teaching, hooks argued that love — defined not as sentiment but as the will to extend oneself for the growth of another — is the foundation of all genuine liberation.

Chapter 1: Education as the Practice of Freedom

In 1994, bell hooks published a book that began with a memory of walking to school. The school was all-Black, in Hopkinsville, Kentucky, and the teachers were Black women who understood something about education that the institutions they served had never been designed to accommodate. They understood that for a Black child in the segregated South, learning to read was not the acquisition of a skill. It was an act of resistance. The classroom was not a neutral space where information was transferred from one mind to another. It was a location of possibility, a place where the structures that confined Black life could be named, examined, and in the naming and examining, loosened.

hooks called this education as the practice of freedom. The phrase came from Paulo Freire, the Brazilian educator who argued in *Pedagogy of the Oppressed* that traditional education functions as what he called "banking," a process in which the teacher deposits information into the passive student the way a teller deposits currency into a vault. The student receives. The student stores. The student, when examined, returns the deposit with interest in the form of correct answers. At no point in this process is the student asked to think. At no point is the student's own experience treated as a source of knowledge. At no point does the student develop the capacity that Freire considered the purpose of all genuine education: critical consciousness, the ability to perceive the social and political contradictions embedded in one's situation and to take action against the oppressive elements of that reality.

hooks took Freire's framework and did something Freire himself had not fully done. She placed it inside the body. She insisted that education is not merely a cognitive process but an embodied one, that the student does not arrive in the classroom as a mind alone but as a person carrying the weight of race, gender, class, sexuality, and the specific, located history of their particular life. The Black girl in Hopkinsville did not learn to read in the abstract. She learned to read inside a system designed to limit what her reading could accomplish. And the teachers who taught her, the Black women who saw her and loved her and demanded more from her than the system believed she deserved, were not merely transferring information. They were practicing freedom.

This distinction between the transfer of information and the practice of freedom is the fulcrum on which hooks's entire pedagogy rests. Information can be transferred efficiently. Freedom cannot. Freedom requires the development of a capacity, the capacity for critical thought, for independent judgment, for the courage to say what you see even when what you see contradicts what you have been told. This capacity does not develop through efficiency. It develops through struggle. Through the experience of sitting with a question that does not resolve, of being wrong in front of others and surviving it, of encountering a perspective that threatens your settled understanding and allowing it to do its work rather than retreating to the comfort of what you already knew.

The struggle is not an obstacle to education. The struggle is education.

This is what the AI moment threatens. Not education in the narrow sense of schooling, though it threatens that too. What it threatens is the broader practice of freedom that hooks spent her life arguing education must serve. It threatens the conditions under which critical consciousness develops, because critical consciousness cannot develop in the absence of difficulty, and AI's fundamental promise is the elimination of difficulty.

Consider what happens when a student encounters a large language model for the first time. The student has a question. Perhaps the question concerns the causes of mass incarceration in the United States. In a classroom shaped by hooks's pedagogy, this question would open a process. The teacher would not simply provide the answer. The teacher would ask the student what they already believed about incarceration and where those beliefs came from. The teacher would introduce perspectives that complicated the student's assumptions, perspectives drawn from the experiences of incarcerated people, from the history of convict leasing, from the economic analysis of the prison-industrial complex, from the specific and particular testimony of Black women whose partners and sons and brothers have been taken. The teacher would sit with the discomfort that these perspectives produced, neither rushing to resolve it nor allowing the student to retreat from it. The process would take time. It would be inefficient. It would produce not a correct answer but a more critically conscious person, someone capable of seeing the structures that others take for granted.

The AI tool, by contrast, produces an answer. The answer arrives in seconds. It is fluent, well-organized, and comprehensive. It cites relevant scholarship. It acknowledges multiple perspectives. It is, by most measures, a good answer. And it has accomplished nothing that hooks would recognize as education.

The answer was not earned. The student did not struggle with the material. The student did not sit with the discomfort of encountering a perspective that challenged their assumptions. The student did not develop the cognitive muscles that only difficulty builds. The student received a deposit, exactly as Freire described, and the banking model of education that hooks spent her career opposing has found its most efficient instrument.

hooks's framework suggests that this matters not because the answer is wrong but because the process that produced it is empty. The answer may be factually accurate. It may even be critically informed. But the student who received it has not been transformed by the encounter. They have been served. And being served, in hooks's framework, is the opposite of being educated.

This is not a Luddite position. hooks was not opposed to tools. She was opposed to tools that replicate the structures of domination she spent her life naming. The banking model of education is a structure of domination. It positions the teacher as the authority who knows and the student as the vessel who receives. It denies the student's capacity for thought. It treats education as consumption rather than creation. And it produces, hooks argued, a population that is informed but not conscious, knowledgeable but not free.

AI tools, whatever their designers intend, replicate this structure with a thoroughness that no human teacher could achieve. The AI never tires. It never loses patience. It never asks the student to sit with difficulty because difficulty would slow the process. It is, in Freire's terms, the perfect banking machine: infinitely efficient, infinitely available, infinitely willing to make deposits without ever asking the student to do the work of thinking for themselves.

The through-line question of hooks's pedagogical project, the question she asked in every classroom and every book, was not "What do you know?" It was "How do you know it, and who benefits from your knowing it that way?" This question cannot be answered by a tool. It can only be asked by a person, a person who cares enough about the student's development to insist on the discomfort that the question produces. Who cares enough to refuse the easy answer and demand the hard one. Who cares enough to stay in the room when the conversation gets difficult, when the student pushes back, when the structures being named are the ones that the student benefits from and does not want to see.

hooks called this care love. Not the sentimentalized love of greeting cards and consumer culture but love as a practice, what she defined, drawing on M. Scott Peck, as the will to extend oneself for the spiritual growth of another. A teacher who loves her students in this sense does not make their path easier. She makes it possible, which often means making it harder in precisely the right ways. She introduces the friction that the student's consciousness needs in order to develop. She refuses to provide the easy answer because she knows that the easy answer is the enemy of the critical one.

AI cannot love in this sense. This is not a statement about artificial consciousness or the philosophical question of machine sentience. It is a practical observation about what the tool does and does not do. The tool does not extend itself for the growth of another. It does not invest in the student's development at cost to itself. It does not refuse to provide the easy answer because it understands that the easy answer is harmful. It provides whatever is asked for, efficiently and without friction, because that is what it was designed to do.

And in doing so, it undermines the very conditions that hooks's pedagogy exists to create.

Segal writes in *The Orange Pill* that educational institutions must reform urgently, that their "calcified pedagogy" is not prepared for the changes AI brings. hooks's framework affirms the urgency while insisting on a different diagnosis. The problem is not merely that institutions are slow to adapt. The problem is that the adaptation most institutions will pursue, the integration of AI into existing educational structures, will deepen rather than challenge the banking model. Schools will adopt AI tools because AI tools are efficient, and efficiency is what institutions value. Students will use AI to produce better outputs faster. Teachers will use AI to grade more assignments in less time. The entire system will become more productive, more streamlined, more smooth.

And the practice of freedom will become harder to find.

hooks's pedagogy demands something that no technology can provide and that the current moment makes more necessary than it has ever been. It demands the deliberate, costly, inefficient cultivation of the capacity for critical thought. It demands teachers who are willing to be vulnerable, students who are willing to be uncomfortable, and institutions that are willing to measure success by something other than output. It demands that we ask not how efficiently information can be transferred but how deeply consciousness can be developed, and that we organize our educational practices around the second question even when the first is easier to answer and easier to fund.

The Black women who taught hooks in Hopkinsville did not have AI. They did not have adequate funding, current textbooks, or institutional support. What they had was a commitment to freedom that expressed itself in the daily, unglamorous work of demanding that their students think. They created the conditions for critical consciousness not through technology but through relationship, through the particular, irreplaceable, demanding love of a teacher who sees the student as a person rather than a vessel.

That love is what the AI moment threatens to make obsolete. Not because AI will replace teachers, though in some settings it already has. But because AI makes possible an education that looks like education, that produces the outputs associated with education, that satisfies the metrics by which education is measured, while emptying education of the thing that hooks argued makes it worthwhile: the struggle through which consciousness develops, the difficulty through which freedom is practiced, the love through which genuine growth becomes possible.

The question hooks would ask of this moment is not whether AI can teach. The question is whether a culture that has access to a tool that answers every question will retain the willingness to sit with the questions that have no answers. Whether the efficiency of information transfer will displace the inefficiency of consciousness development. Whether the practice of freedom, which has never been efficient and never will be, will survive in an environment optimized for speed.

The question is whether we will choose the harder path, the path that produces not better answers but better people, when the easier path is always available and always beckoning.

---

Chapter 2: The Comfort of Easy Answers

There is a particular quality of knowing that only difficulty produces. hooks knew this in her body before she had the language to theorize it. Growing up in a household where speaking your mind was dangerous, where a Black girl's voice was treated as an interruption rather than a contribution, she learned that the most important things she would ever understand would come not from being told but from struggling to see for herself what the people around her could not or would not name.

This is the experiential foundation of what hooks, following Freire, called conscientização, a Portuguese word that has no precise English equivalent. The closest translation is "critical consciousness," but even this loses the embodied, processual quality that Freire and hooks insisted upon. Conscientização is not a state you arrive at. It is a process you undergo. It is the slow, difficult, often painful development of the capacity to perceive the structures of domination that organize your world, structures so familiar they have become invisible, so naturalized they feel like the way things simply are.

The process requires discomfort. hooks was explicit about this. "There can be, and usually is, some degree of pain involved in giving up old ways of thinking and knowing and learning new approaches," she wrote in *Teaching to Transgress*. The pain is not incidental. It is constitutive. You cannot develop critical consciousness without it, because critical consciousness requires the destabilization of what you thought you knew, and destabilization hurts. It hurts to discover that the history you were taught was partial. It hurts to recognize that the values you absorbed from your family carry the residue of systems you would reject if you could see them clearly. It hurts to sit in a classroom and hear someone describe their experience of the world in terms that contradict everything you assumed was universal.

This hurt is where the learning happens.

AI eliminates it.

The elimination is not malicious. It is structural. Large language models are designed to produce helpful, accurate, well-organized output. They are designed to satisfy the user's query. They are designed, in the deepest sense, to be comfortable. When a student asks Claude about the relationship between redlining and contemporary wealth inequality, the tool produces an answer that is informative, nuanced, and utterly painless. The student learns that redlining existed, that it had consequences, that those consequences persist. The student can now pass a test on the subject. The student has not experienced a single moment of the discomfort that hooks argued is necessary for the information to become consciousness.

The distinction between information and consciousness is hooks's most important contribution to the AI conversation, and it is the one most likely to be ignored, because the distinction is invisible in metrics. A student who has received information and a student who has developed consciousness look identical on a multiple-choice exam. They produce similar essays. They cite the same sources. The difference is internal, invisible, unmeasurable by any instrument currently deployed in educational assessment.

The difference is this: the student who has received information knows that redlining existed. The student who has developed consciousness understands that she lives inside its consequences, that the neighborhood she grew up in, the school she attended, the wealth her family does or does not possess, the options available to her and the options foreclosed, all of these bear the mark of a policy she did not choose and may never have heard of until this moment. The knowledge is no longer abstract. It is located. It is personal. It implicates her. And that implication, that feeling of being caught inside a structure she did not build and cannot easily escape, is the discomfort that produces critical consciousness.

AI cannot produce this feeling. Not because the technology is insufficiently advanced but because the feeling arises from the encounter between the student's own life and the knowledge that threatens to reorganize it. The feeling requires a self that has stakes in the world, a self that can be implicated, destabilized, changed. The tool has no self. It has no stakes. It provides the information without the implication, the content without the confrontation, and in doing so it offers precisely what hooks spent her career arguing against: comfortable knowledge that leaves the knower unchanged.

The pattern extends far beyond the classroom. Segal describes in *The Orange Pill* what he calls the aesthetics of the smooth, the cultural tendency to eliminate friction from every domain of experience. hooks's framework reveals that this smoothness is not merely aesthetic. It is epistemological. It changes the relationship between the knower and the known. When friction is removed from the process of knowing, the knowing itself becomes thinner. The knowledge sits on the surface. It has not been earned. It has not been struggled with. It has not passed through the body of the knower in a way that leaves a mark.

hooks would have recognized the Berkeley researchers' findings about work intensification as a confirmation of something she had been arguing for decades. The study found that AI did not reduce work but multiplied it, that the freed-up time was immediately filled with more tasks, that the boundary between work and rest dissolved. What hooks's framework adds to this analysis is the recognition that the intensification is not merely a problem of workload. It is a problem of depth. More work is being done, but the work is thinner. The struggle that once slowed the process was also the struggle that deepened it. Without it, the worker moves faster across a surface that never yields its meaning.

There is a word for this in hooks's vocabulary. She called it "colonization." Not in the narrow geopolitical sense but in the broader sense that Freire intended: the process by which the dominant culture's way of knowing displaces all others, not through force but through the seduction of ease. The colonized mind does not resist because it has been made comfortable in its subordination. It has been given answers that feel like knowledge, frameworks that feel like understanding, productivity that feels like agency. The colonization is invisible because it presents itself as liberation.

The easy answer is the colonizer's most effective tool. It provides enough to satisfy without providing enough to liberate. The student who receives an AI-generated analysis of systemic racism has been satisfied. They have not been liberated. Liberation would require the discomfort of sitting with the question long enough for the question to become personal, for the structures to become visible not as abstractions but as the specific, material conditions of the student's own life. And that discomfort cannot be optimized away without destroying the thing it produces.

hooks was not naive about the difficulty of her position. She knew that students preferred comfort. She knew that institutions rewarded efficiency. She knew that her insistence on difficulty, on vulnerability, on the slow and painful work of consciousness-development, was a harder sell than the banking model's promise of painless information transfer. She taught against the grain of her institutions for her entire career, and she did so knowing that the grain was getting harder to resist, that the culture's preference for the smooth was intensifying with each technological advance.

But she would have been unsurprised to learn that the most popular use of AI in education, by an overwhelming margin, is the generation of essays, answers, and analyses that the student did not write, did not think, and does not understand. This is not a failure of the tool. It is the tool functioning exactly as the banking model requires: producing deposits efficiently, bypassing the student's capacity for thought, delivering the output without the process.

The question hooks would ask is not how to prevent students from using AI to generate their work. That is a question about enforcement, about surveillance, about the management of behavior. The question hooks would ask is deeper and more uncomfortable: Why do the students want to avoid the difficulty? What has their education taught them about the value of struggle? What has the culture communicated about the relationship between difficulty and growth? If the students are choosing comfort over consciousness, the failure is not in the tool. The failure is in the educational system that taught them comfort was the goal.

This is why hooks insisted that pedagogy begins with the teacher's own transformation. The teacher who has not examined their own relationship to difficulty, who has not confronted the structures of domination that shape their own knowing, who has not practiced the vulnerability they demand of their students, cannot create the conditions for critical consciousness. They can only reproduce the banking model in a more attractively packaged form.

In the AI age, this means the teacher must develop their own critical relationship to the tool. Not a relationship of refusal, which is merely another form of comfort. Not a relationship of uncritical adoption, which is surrender. But a relationship of engaged critique, the willingness to use the tool and question the tool simultaneously, to benefit from its capabilities while remaining alert to what it displaces. The teacher who models this relationship, who shows the student what it looks like to think with a tool without being thought by it, is practicing the engaged pedagogy that hooks described. The teacher who hands the tool to the student without modeling critical engagement has abdicated the responsibility that hooks insisted teaching carries.

The comfort of easy answers is not new. Every generation has had its mechanisms for avoiding difficulty. What is new is the scale, the speed, and the seductive quality of the answers AI provides. The answers are not merely easy. They are beautiful. They are well-crafted. They sound like thinking. And it is precisely this quality, the appearance of thought without the substance of thought, that makes AI the most sophisticated instrument of comfortable unfreedom that hooks's framework has ever confronted.

Comfortable unfreedom is hooks's term for the condition of a person who has choices without consciousness, who can act without understanding, who possesses capability without the critical awareness of how that capability is structured, whom it serves, and what it costs. The AI age offers comfortable unfreedom at scale. It offers the freedom to build without the consciousness to ask what should be built. It offers the freedom to answer without the consciousness to ask whose questions matter. It offers the freedom to produce without the consciousness to ask who benefits from the production.

hooks spent her life arguing that freedom without consciousness is not freedom. It is the most effective form of domination ever devised, because it makes the dominated feel free while keeping the structures of domination intact and invisible. The question for this moment is whether a culture saturated with easy answers will retain the appetite for difficult questions. Whether the comfort that AI provides will extinguish the discomfort that consciousness requires.

The answer depends entirely on what we are willing to demand of ourselves and of each other. On whether we choose the easy path or the one that makes us free.

---

Chapter 3: Critical Consciousness in the AI Classroom

In the spring of 2023, less than six months after ChatGPT's public release, a philosophy professor at a mid-sized American university assigned her students a paper on the ethics of surveillance. Seven of the thirty-two papers she received were generated entirely or substantially by AI. She could tell not because the writing was bad but because it was too smooth. The arguments were organized. The evidence was cited. The prose flowed without the interruptions, the awkward turns, the visible marks of a mind wrestling with material it had not yet mastered. The papers lacked what she called "the fingerprints of thinking."

This anecdote, one version of which was repeated thousands of times across universities in the years that followed, illustrates something hooks's framework makes legible in a way that most discussions of academic integrity do not. The standard response to AI-generated student work treats the problem as one of cheating, of dishonesty, of the student passing off someone else's work as their own. The institutional response has been predictably bureaucratic: updated honor codes, AI detection software, proctored examinations. The problem has been framed as a problem of enforcement.

hooks's framework reframes it entirely. The problem is not that the student cheated. The problem is that the student was never given a reason not to. The problem is that the educational system communicated, through its structure and its values, that the purpose of the assignment was the production of a correct output rather than the development of a capacity. And when the student found a tool that could produce the output more efficiently than they could, they used it. They acted rationally within the system they were given. The system, not the student, is what failed.

Critical consciousness, in hooks's and Freire's framework, is the capacity to perceive the contradictions embedded in one's situation and to take action against the elements that are oppressive. This capacity does not develop through the consumption of correct information. It develops through the active, difficult, embodied process of questioning what seems obvious, challenging what seems natural, and confronting the structures that others take for granted.

The process requires what Freire called "problem-posing education," the opposite of banking. In problem-posing education, the teacher does not deposit information. The teacher poses problems, real problems, the kind that do not have clean answers and that implicate the student in the structures being examined. The student does not receive knowledge. The student participates in its construction. And the construction is always incomplete, always contested, always uncomfortable, because the problems being posed are the problems of the student's own life, and lives do not resolve into neat conclusions.

AI is structurally incompatible with problem-posing education. Not because AI cannot pose problems (it can generate questions with impressive sophistication) but because the problems AI poses carry no stakes. When a teacher poses a problem in hooks's classroom, the problem is real. It implicates the student. It arises from the student's own experience and connects to the structures that shape that experience. The teacher who asks a room of students, "Why do the wealthiest neighborhoods in this city have the best schools?" is not asking a quiz question. The teacher is asking the students to examine the material conditions of their own lives, to name the structures they benefit from or are harmed by, and to confront what they find.

The AI tool can generate the same question. But when the student answers the AI's question, the answer goes nowhere. There is no room. There is no community of learners engaged in the same struggle. There is no teacher who will push back, who will say, "That is not good enough, go deeper," who will share their own experience of these structures and in doing so model the vulnerability that genuine inquiry requires. There is no risk. And without risk, hooks argued, there is no learning that matters.

The absence of risk in AI-mediated education extends further than the individual interaction. It reshapes the entire ecology of the classroom. hooks understood the classroom as a community, a space in which diverse perspectives encountered each other and, through the friction of that encounter, produced understanding that no single perspective could generate alone. The Black student and the white student, the wealthy student and the poor one, the student who had experienced the criminal justice system and the one who knew it only from textbooks, these different locations in the web of social structure were not obstacles to learning. They were its raw material. The classroom worked, when it worked, because the differences could not be smoothed away, because the conversation required the participants to confront perspectives they would not have chosen to encounter.

AI smooths this encounter away. The student interacting with an AI tool encounters only the tool's synthesis of all perspectives into a single, authoritative, undifferentiated output. The rough edges are gone. The disagreements are resolved. The contradictions are presented as nuances rather than as the living tensions that hooks insisted they are. The student receives a position paper when what they needed was an argument, a confrontation, a moment of genuine cognitive disruption.

hooks valued cognitive disruption. She saw it as the pedagogical equivalent of what liberation theologians call "conversion," a turning of the mind that changes not just what you know but how you stand in relation to what you know. The student who experiences genuine cognitive disruption does not simply add new information to an existing framework. The framework itself shifts. The student sees their world differently, and the seeing is irreversible. You cannot un-see structural racism once you have seen it. You cannot un-feel the implication of your own position within a system of domination once you have felt it.

AI does not produce cognitive disruption. It produces cognitive satisfaction. The answer feels complete. The synthesis feels balanced. The student's existing framework is confirmed rather than challenged, because the tool's output is calibrated to be helpful, which in practice means calibrated to meet the student where they are rather than to pull them somewhere they do not yet want to go.

This is the pedagogical crisis that hooks's framework makes visible. The crisis is not that students are using AI to cheat. The crisis is that AI is being integrated into educational structures that already privilege information over consciousness, comfort over difficulty, output over process. The tool accelerates tendencies that were already present. It makes the banking model more efficient. It makes the avoidance of difficulty more seamless. It makes the production of correct answers without critical engagement not just possible but easy, fast, and free.

Segal writes in The Orange Pill about a teacher who stopped grading her students' essays and started grading their questions. This is precisely the kind of pedagogical intervention hooks's framework demands. The shift from evaluating answers to evaluating questions is a shift from banking to problem-posing, from the measurement of information received to the assessment of critical capacity developed. The student who can generate the five questions they would need to ask before writing an essay worth reading has demonstrated something that no AI can demonstrate on their behalf: the capacity to identify what they do not know, to map the boundaries of their own understanding, and to recognize which boundaries need to be pushed.

But hooks's framework pushes further than the pedagogical technique. It asks who is asking the questions and from where. The capacity for critical questioning is not distributed equally. The student who has experienced the underside of the structures being examined, the student who knows what it feels like to be surveilled, to be profiled, to be sorted by algorithms that encode the biases of those who built them, brings a different kind of questioning to the classroom than the student who has experienced these structures only from the privileged side.

hooks insisted that education must make space for these different locations. The margin is not a deficit. It is a source of knowledge that the center cannot produce. The student who has been marginalized by the structures being examined sees things that the centered student cannot see, and the classroom that excludes the marginal perspective or subsumes it into a single synthesized account has impoverished its own capacity for understanding.

AI subsumes. That is its fundamental operation. It takes the vast plurality of human perspective and synthesizes it into a single output. The synthesis may acknowledge multiple viewpoints. It may present "both sides." But the presentation is always from above, always from the position of the synthesizer rather than the participant. The marginal perspective is included as a data point rather than as a standpoint, a term Patricia Hill Collins used to describe the epistemological privilege that comes from experiencing the underside of systems of domination.

hooks understood that standpoint is not a luxury of diverse classrooms. It is a necessity for genuine knowledge. The synthesis that eliminates standpoint, that produces a view from everywhere and therefore from nowhere, is not knowledge. It is the appearance of knowledge. It is the comfortable illusion that all perspectives have been considered when in fact they have been dissolved into a single, authoritative, unlocated voice.

The voice of the large language model is the ultimate voice from nowhere. It carries no body. It occupies no social position. It has not been shaped by the experience of domination or the experience of privilege. It speaks from a position of synthetic authority that masks the specific, historically located biases encoded in its training data. And because the voice sounds authoritative, because it sounds balanced, because it sounds like it has considered everything, the student who encounters it is less likely, not more, to develop the critical capacity to question it.

hooks spent her career teaching students to question voices of authority. She taught them to ask: Who is speaking? From what position? With what investments? Whose experience is centered, and whose is pushed to the margins? These questions are the foundation of critical consciousness. They are also the questions that AI makes it easiest to avoid, because the AI's voice does not present itself as a voice from a particular position. It presents itself as a voice from the position of knowledge itself. And that presentation, that smooth, confident, authoritative claim to speak from everywhere, is the most effective silencer of critical questioning that hooks's framework has ever confronted.

The classroom that uses AI without teaching students to question it has not embraced the future. It has surrendered the past, the hard-won pedagogical insight that genuine education requires the courage to sit with difficulty, to confront authority, and to insist on the validity of one's own experience even when, especially when, that experience contradicts what the authoritative voice claims to know.

---

Chapter 4: Whose Intelligence? Whose River?

In March 2026, the AI writing platform Grammarly launched a feature called Expert Review. The product allowed users to receive AI-generated feedback on their writing attributed to real people, real writers and thinkers whose names appeared in the interface as though those individuals had personally reviewed the user's work. Among the identities deployed was bell hooks.

hooks had been dead for more than four years. She had never consented to the use of her name. She had never been asked. The feature presented AI-generated writing advice under her identity, her name, the name she had deliberately lowercased to shift attention from the person to the ideas, deployed as a product feature in a subscription service that charged users twelve dollars a month.

The researcher Sarah J. Jackson responded publicly: "So Grammarly is violating the memory of bell hooks AND making AI versions of the rest of us before we're even dead." The company pulled the feature within weeks, after a wave of criticism and a class-action lawsuit. The CEO of the parent company acknowledged that "the feature was not a good feature." But the damage was not in the feature. It was in what the feature revealed about the logic of the system that produced it.

hooks had a name for this logic. She had named it thirty years before the technology that enacted it existed.

In 1992, hooks published an essay called "Eating the Other: Desire and Resistance." The essay examined how the dominant culture, white, Western, male-centered, relates to marginalized cultures not through understanding but through consumption. "Within commodity culture," hooks wrote, "ethnicity becomes spice, seasoning that can liven up the dull dish that is mainstream white culture." The Other is not encountered. The Other is eaten. Consumed. Digested. The specificity of the Other's experience, the particular knowledge that comes from occupying a particular position in systems of domination, is stripped away. What remains is flavor, consumed for the pleasure of the consumer, divorced from the body and the history and the struggle that produced it.

The Grammarly incident is "Eating the Other" enacted with algorithmic precision. hooks's name, her intellectual identity, her life's work of challenging the structures of domination, all of it consumed, digested, and repackaged as a product feature. The system did not engage with hooks's ideas. It did not grapple with her critique of the very commodity culture that produced it. It ate her. It consumed her identity and excreted a subscription service.

This is not an isolated incident. It is the structural logic of large language models. Every model is trained on a corpus of text that represents the accumulated cultural production of human civilization. But the representation is not neutral. It is shaped by who wrote, who was published, who was translated, who was digitized, who was deemed worthy of inclusion. The training data of every major language model is predominantly English, predominantly Western, predominantly produced by and for the dominant culture. The voices of Black women, of indigenous peoples, of the Global South, of queer communities, of the economically marginalized are present only as the dominant culture chose to represent them, filtered through publishers, through editors, through the specific gate-keeping mechanisms of a literary and academic culture that hooks spent her career challenging.

When Segal writes in The Orange Pill about intelligence as a river flowing for 13.8 billion years, hooks's framework demands a question that the metaphor, in its cosmic sweep, does not pause to ask: Whose river? Through whose land does it flow? Who built the channels, and who was drowned by the current?

The river of intelligence is not a neutral force of nature. It is a metaphor, and metaphors carry assumptions. The assumption embedded in the river metaphor is that intelligence flows universally, that it is available to all, that the current carries everyone. hooks would observe that this assumption is the assumption of a person who has always been carried by the current. The person who has been pushed to the margins, who has been excluded from the institutions that channel intelligence, who has been told that their way of knowing is not knowledge, that person experiences the river differently. For them, the river is not a generous force. It is a force that has historically flowed around them, past them, through their communities without stopping to water their fields.

This is not a rejection of the democratization argument. Segal is right that AI lowers the floor of who gets to build. The developer in Lagos, the designer in Trivandrum, the non-technical founder who can now prototype a product, these are genuine expansions of capability, and hooks, who spent her life arguing for the expansion of who gets to participate in cultural production, would recognize their significance. But hooks would insist on a distinction that Segal's framework, in its enthusiasm for the expansion, does not fully make: the distinction between access and power.

Access is the ability to use the tool. Power is the ability to shape what the tool produces, how it produces it, whose perspective it centers, and whose it marginalizes. The developer in Lagos has access. She can use Claude Code to build a product. But the tool she uses was built by a company in San Francisco, trained on data curated by engineers in the Global North, optimized for workflows developed in Silicon Valley, and shaped by assumptions about what constitutes good code, good design, good product that carry the watermark of the culture that produced them. The developer has capability. She does not have power over the terms of that capability.

hooks would call this a form of what she named, in her signature phrase, "imperialist white supremacist capitalist patriarchy," a term she insisted on using in full because the systems it names cannot be separated. The AI industry, concentrated in a handful of corporations funded by venture capital, built on training data extracted without consent, disproportionately failing non-white users, is an expression of these interlocking systems. Not because the people who build AI are individually racist or sexist, though some may be. Because the structures within which AI is built, funded, trained, and deployed carry the accumulated patterns of domination that hooks spent her career making visible.

The research confirms what hooks's framework predicts. Nyah Mattison, writing in 2024, applied hooks's "Eating the Other" directly to generative AI: "hooks' argument that Blackness has been neutralized for the conspicuous consumption of a White audience remains all too apt in our technological age where inequity is not just an unintended consequence of these tools, but a cornerstone in their foundation." The peer-reviewed literature on Black feminism and AI, indexed in PubMed Central, draws explicitly on hooks alongside Collins, Crenshaw, and Lorde to analyze how AI-based sorting, classification, and prediction reproduce the specific dynamics of racial and gender domination. Researchers at Oxford, studying how AI image-captioning algorithms describe archival photographs of colonized peoples, found what they called a "colonial gaze" embedded in the models, characterized by essentialism, cultural erasure, dehumanization, othering, and infantilization, the very dynamics that hooks had theorized under the rubric of the oppositional gaze decades before the technology existed.

These findings are not peripheral to the AI conversation. They are central to it. And they are central because hooks's framework reveals what a framework focused solely on productivity, capability, and democratization cannot see: that the tool carries the world that built it, that the amplification Segal celebrates amplifies not just the user's signal but the patterns embedded in the system, and that those patterns are not neutral.

Segal acknowledges, with characteristic honesty, that Claude's most dangerous failure mode is "confident wrongness dressed in good prose." hooks's framework extends this observation: the confident wrongness is not random. It is patterned. It follows the contours of the dominant culture's assumptions, producing outputs that feel universal while encoding the specific perspective of the culture that produced the training data. The confident wrongness is the voice of a particular world presenting itself as the voice of the world, and that presentation, that smooth, authoritative claim to universality, is the mechanism through which domination reproduces itself in the age of AI.

hooks would not ask whether AI is biased. That question has been asked and answered, extensively, by researchers like Timnit Gebru, Joy Buolamwini, and Ruha Benjamin. hooks would ask a different, harder question: What would it mean for AI to be genuinely liberatory? What would a technology look like that did not merely extend the dominant culture's way of knowing to more people but that genuinely centered the perspectives of those who have been marginalized, that treated the knowledge produced at the margins not as a data point to be synthesized but as a standpoint from which the entire system could be seen and challenged?

This question has no ready answer. It may not have an answer at all within the current structures of the AI industry. But hooks's framework insists that the question be asked, loudly and persistently, because the failure to ask it guarantees the outcome that hooks spent her life resisting: the reproduction of domination in a more efficient, more seamless, more comfortable form.

The Grammarly incident, small as it was in the scale of the AI industry, illuminated the structural reality with the clarity of a diagnostic. A company consumed a Black woman's intellectual identity without consent, repackaged it as a product, and sold it to users who were invited to believe they were receiving her guidance. The system worked exactly as the system was designed to work. It extracted value from the margin and delivered it to the center. It ate the Other.

hooks is not here to respond. She cannot talk back. The talking back, the critical resistance that hooks spent her life teaching, falls now to the living, to the educators and students and communities that her work made possible. The question is whether they will exercise the critical consciousness she cultivated, whether they will look at the tool and ask not just what it can do but whose world it carries, whose voice it centers, whose it consumes, and whether the liberation it promises is genuine or merely the most sophisticated form of comfortable unfreedom yet devised.

---

Chapter 5: The Engaged Pedagogy and the Disengaged Tool

hooks told a story about a teacher she loved. The teacher was a Black woman in a segregated school in Kentucky, and what hooks remembered about her was not what she taught but how she taught. The teacher shared her own struggles. She admitted when she did not know something. She allowed the students to see her thinking, not just the polished result of her thinking but the process itself, the uncertainty, the revision, the moments when the idea she was reaching for slipped away and she had to start again. The teacher was vulnerable in front of her students, and that vulnerability was not a failure of authority. It was the exercise of a different kind of authority, the authority of a person who respects the people she is teaching enough to let them see her as she actually is.

hooks called this engaged pedagogy, and it was the centerpiece of her educational philosophy. The concept drew on Freire but went further. Freire had argued that education must be dialogical, that the teacher and student must engage as co-investigators of the world rather than as depositor and vault. hooks added something Freire's framework, shaped by its origins in adult literacy campaigns among Brazilian peasants, had not fully developed: the insistence that the teacher's own interiority, their own emotional and intellectual life, their own history of struggle and growth, must be present in the classroom as a pedagogical resource.

The teacher who hides behind expertise, who presents knowledge as though it arrived fully formed and without difficulty, who never admits uncertainty or shares the story of how they came to know what they know, has modeled a relationship to knowledge that hooks considered fundamentally dishonest. Knowledge does not arrive without difficulty. Understanding is not born clean. Every idea that a teacher presents as settled was once unsettled, contested, uncertain, and the process by which it became settled, the arguments and failures and revisions that produced it, is where the pedagogical value lives. When the teacher hides the process and presents only the product, the student learns that knowledge is a thing you receive rather than a thing you make. And that lesson, hooks argued, is the deepest form of disempowerment education can inflict.

AI hides the process absolutely.

The large language model produces output that arrives without a history. The student who asks Claude about the relationship between capitalism and environmental degradation receives an answer that appears to have emerged from nowhere, fully formed, clean of the struggle that produced it. The answer does not say: I arrived at this understanding through decades of reading, arguing, being wrong, revising, encountering perspectives that changed my mind. The answer does not say: there are things I am uncertain about, things I may be wrong about, things I have synthesized from sources that disagree with each other in ways I cannot fully resolve. The answer is confident. It is fluent. It is helpful. And it models a relationship to knowledge that hooks would have recognized as the most sophisticated version of the banking model ever constructed.

The banking model succeeds, Freire argued, because it makes the student feel that knowledge comes from outside, from an authority who possesses it and dispenses it, and that the student's role is to receive, store, and reproduce. The student who learns within the banking model never develops the sense that they can produce knowledge, that their own experience is a valid source of understanding, that they have the right and the capacity to challenge what they are told. The banking model produces obedience disguised as education.

AI is the perfect banker. It is authoritative without being vulnerable. It is knowledgeable without being uncertain. It is patient without being invested. It provides the deposit without the relationship, the information without the intimacy, the answer without the encounter. And because it is available twenty-four hours a day, because it never tires and never judges and never asks the student to confront something uncomfortable, it is, for many students, preferable to the human teacher who does all of these things.

hooks would not have been surprised by this preference. She understood that engaged pedagogy is harder for everyone involved, harder for the teacher who must be vulnerable, harder for the student who must be confronted. The path of least resistance is always the banking model, because banking asks nothing of the student except passivity and nothing of the teacher except performance. AI reduces even the performance to a subscription. The entire transaction, the entire educational encounter, becomes frictionless.

But hooks would insist that the difficulty is the point. Engaged pedagogy is difficult because genuine growth is difficult. The teacher who shares their uncertainty models the courage that learning requires. The teacher who admits they were wrong demonstrates that being wrong is survivable. The teacher who shows the process, not just the product, of their thinking teaches the student something that no content can teach: that knowledge is made, not received, and that the making is a human activity characterized by struggle, revision, and the willingness to be changed by what you encounter.

None of this can be replicated by a tool. Not because the tool lacks capability but because the tool lacks what hooks would call presence, the quality of being genuinely there, in the room, in the relationship, with stakes in the outcome. Presence requires a self that can be affected. The teacher who is present in hooks's sense is not performing presence. She is risking something, the risk of being seen, of being wrong, of being changed by the encounter. That risk is what makes the encounter pedagogical rather than transactional.

Practitioners who are attempting to translate hooks's engaged pedagogy into digital and AI-mediated learning environments have confronted this problem directly. A learner, one educational researcher has noted, is unlikely to enjoy the same rapport with an insentient piece of digital learning technology as with a human teacher. The question of whether "a teacher's lack of sentience" can prohibit engaged pedagogy from taking place at all has become, in the words of that researcher, "an increasingly salient question." hooks's framework provides a clear answer. The answer is yes. Not because technology cannot enhance learning, but because the specific qualities that make engaged pedagogy work, vulnerability, presence, mutual risk, the willingness to be changed, require a self that is genuinely at stake. The tool is not at stake. It cannot be changed by the encounter. It cannot model the courage that genuine learning demands.

This does not mean AI has no role in education. hooks was not an absolutist, and her framework does not require the elimination of tools from the pedagogical process. What it requires is the subordination of tools to relationships. The tool serves the relationship between teacher and student. The tool does not replace the relationship. The moment the tool displaces the relationship, education has ceased and information transfer has begun.

Segal describes in The Orange Pill the moment when Claude made a connection he had not made, linking two ideas from different chapters in a way that changed the direction of his argument. He describes the experience as collaborative, as occurring in the space between his thinking and the tool's processing. hooks's framework would honor this description while insisting on a distinction Segal himself gestures toward but does not fully name: the collaboration was productive because Segal brought a self to it. He brought decades of experience, specific investments, a particular location in the world, and the willingness to be surprised. The tool contributed the connection. The collaboration required both. But the pedagogical value, the growth, the development of understanding, occurred in the human, not in the tool.

This is the asymmetry that hooks would insist upon. In engaged pedagogy, both teacher and student are transformed. Both risk something. Both grow. In the collaboration between human and AI, only the human is transformed. The tool does not grow. The tool does not risk. The tool processes and produces, and whatever occurs in that processing is not growth in any sense that hooks would recognize. The asymmetry means that the collaboration, however productive, is not a pedagogical relationship. It is a productive relationship that can serve pedagogy if it is situated within a genuinely pedagogical context, a context that includes human relationships characterized by the vulnerability, presence, and mutual risk that hooks described.

Segal also describes, with an honesty hooks would have appreciated, the moments when Claude's output seduced him, when the prose outran the thinking, when he nearly accepted a passage that sounded like insight but lacked substance. This is the specific danger that hooks's framework predicts. The tool produces output that looks like the product of engaged thinking without the process of engaged thinking having occurred. The student who receives this output, the reader who encounters it, the user who builds upon it, may never know that the process was absent, because the product is convincing. The smoothness conceals the hollow.

hooks spent her career arguing that the process matters more than the product. The essay that shows the fingerprints of thinking, the argument that carries the marks of struggle, the conclusion that acknowledges its own contingency, these are not lesser products. They are evidence that a mind was present, that a self was at stake, that the knowledge was made rather than received. When the product arrives without the process, what arrives is not knowledge. It is simulation. And a culture that cannot distinguish between the two has lost something that hooks considered essential to freedom.

The distinction is not easily visible. This is what makes it so dangerous. The AI-generated essay and the genuinely thought essay may be indistinguishable in content. They may cite the same sources, make the same arguments, reach the same conclusions. The difference is invisible because the difference is internal, located in the experience of the person who produced it. The student who struggled with the argument knows something the student who outsourced it does not. And what they know is not the content of the argument. It is the experience of having made the argument, of having been changed by the making, of having confronted their own uncertainty and emerged on the other side with something they can stand on.

That experience is what hooks called education. Everything else is banking.

The educational systems now integrating AI at scale face a choice that hooks's framework makes stark. They can use AI to make banking more efficient, to produce more deposits in less time, to deliver more content to more students with less friction. This is the path most institutions will take, because it is the path that the metrics reward. Or they can use AI as hooks used every tool available to her, as a means to deepen the human encounter rather than to replace it. A tool that frees the teacher from administrative labor so the teacher can be more present. A tool that generates material for the student to question rather than to accept. A tool that serves the relationship rather than substituting for it.

The second path is harder. It requires teachers who are willing to be vulnerable, institutions that are willing to measure something other than output, and a culture that values the slow, difficult, invisible work of consciousness-development over the fast, efficient, visible production of correct answers.

hooks chose the harder path every day of her teaching life. She chose it knowing it would not be rewarded, knowing it would not scale, knowing it would not produce the metrics that institutions value. She chose it because she believed that education is the practice of freedom, and freedom is not efficient.

The question for this moment is who will choose it now.

---

Chapter 6: Feminism and the Democratization of Who Gets to Build

hooks held two commitments simultaneously throughout her career, and the tension between them was the engine of her best work. The first was the commitment to expanding who gets to participate, who gets to create, who gets to speak. The second was the insistence that participation without critical consciousness reproduces the very structures it appears to challenge. She never resolved this tension because it is not resolvable. It is the permanent condition of any liberation movement that takes both capability and consciousness seriously.

The AI moment intensifies this tension to a degree that hooks did not live to witness but that her framework anticipated with remarkable precision.

The expansion is real. This must be said plainly, because hooks herself would have said it plainly. She spent her life arguing against gates, against the barriers that prevent people from participating in the production of culture, knowledge, and meaning. The barriers she named were specific: racism that kept Black women out of academic institutions, classism that made education a privilege of the wealthy, sexism that dismissed women's intellectual contributions, the interlocking systems that produced a cultural landscape in which the default creator, thinker, and builder was white, male, and affluent.

AI tools lower some of these barriers. Not all of them. Not the ones that matter most. But some. The developer in Lagos whom Segal describes in The Orange Pill can now access coding leverage comparable to what was previously available only inside the infrastructure of major technology companies. A student in Dhaka can prototype an application without the institutional backing that a student at Stanford takes for granted. A non-technical founder, the kind of person hooks would have recognized as someone whose ideas were previously gated by a translation barrier that was itself a form of structural exclusion, can now build a working product through conversation with a tool that does not care about their credentials.

hooks would have celebrated this. Not uncritically, as the triumphalists celebrate it, but with the specific joy of a person who has spent decades watching brilliant people be denied the tools to realize their vision. The imagination-to-artifact ratio that Segal describes, the gap between what a person can conceive and what they can build, has been a mechanism of exclusion for centuries. The person who could imagine a solution but could not code it, could not fund it, could not access the institutional infrastructure to develop it, was effectively silenced. Their contribution was lost. The gap did not merely limit productivity. It limited who counted as a contributor.

AI narrows this gap, and the narrowing is a feminist achievement in the broadest sense of the term. Feminism, as hooks understood it, was never only about women. It was about the dismantling of all systems that prevent full human participation in the life of the culture. The translation barrier between idea and artifact was such a system. Its removal expands who gets to participate in the most consequential activity of our time: the building of the technological infrastructure that shapes how everyone lives.

But hooks held both hands open, and the second hand carries a weight the first cannot put down.

Liberation through a tool is not liberation through consciousness. hooks drew this distinction with a precision that the technology conversation has not yet absorbed. The distinction runs as follows: A person can be given the capability to act without being given the critical consciousness to understand the structures within which their action takes place. The capability feels like freedom. It is not. It is what hooks called comfortable unfreedom, the condition of having choices without understanding the system that shapes which choices appear, which are rewarded, and which are rendered invisible.

The developer in Lagos builds with Claude Code. She builds inside a tool created by an American company, trained on data curated in the Global North, optimized for workflows that emerged from Silicon Valley's specific culture of production. The tool's defaults, its assumptions about what constitutes good code, good design, good product, are not universal. They are particular. They carry the watermark of the culture that produced them. And because the tool is smooth, because it does not announce its particularity, the developer may never see the assumptions she is building inside. She has capability. She does not have power over the terms of that capability.

hooks would insist that this distinction matters, not because capability is worthless, it is genuinely valuable, but because capability without consciousness serves the existing structures of power even when it appears to challenge them. The developer who builds inside Silicon Valley's assumptions, who produces products shaped by Silicon Valley's aesthetics and logic, who succeeds by meeting Silicon Valley's standards, has not disrupted the system. She has joined it on its terms. And joining a system on its terms is not liberation. It is inclusion, which is a different and lesser thing.

hooks was suspicious of inclusion for exactly this reason. Inclusion invites the marginalized into the existing structure. Liberation transforms the structure. Inclusion says: you may now participate in the system as it is. Liberation says: the system as it is must change, and the perspectives of those who have been excluded are essential to understanding how it must change.

AI, in its current form, is an inclusion technology. It invites more people to build within existing structures. It does not provide the tools to challenge those structures. The developer in Lagos can build a product that competes within the global technology market. She cannot, through the tool alone, develop the critical consciousness to ask whether the market's standards serve her community, whether the product she is building addresses the needs of the people she knows or the needs of the people the market rewards her for serving, whether the definition of success she has absorbed is one she chose or one that chose her.

These questions require something AI cannot provide: a pedagogy. A community of thinkers engaged in the collective work of understanding. A teacher who will pose the problem rather than solve it. A tradition of critical thought, built over decades by thinkers like hooks, Collins, Lorde, Crenshaw, and Freire, that provides the vocabulary and the frameworks for seeing what the smooth surface of the tool conceals.

hooks's feminism was never merely about access. It was about consciousness. The consciousness that comes from understanding the interlocking systems of domination that shape every aspect of life, from who gets funded to whose knowledge counts, from whose language the AI speaks to whose experience it was trained on. This consciousness does not develop through tool use. It develops through the difficult, relational, embodied work of critical pedagogy, through the classrooms and communities and conversations in which hooks spent her life.

The FemTechNet collective, a network of feminist scholars and practitioners, has taken hooks's insight as foundational to their work on technology and education. Drawing on hooks's claim that "the classroom, with all its limitations, remains a location of possibility," FemTechNet has developed pedagogical frameworks that insist on the simultaneous development of technical capability and critical consciousness. The approach refuses the false choice between teaching students to use technology and teaching them to question it. It insists on both, not sequentially but simultaneously, because the questioning is part of the using and the using is part of the questioning.

This is the pedagogical model that the AI moment requires, and it is the model that most institutions are failing to adopt. The dominant response to AI in education has been either prohibition, the Luddite response that hooks would reject as a form of avoidance, or uncritical adoption, the banking-model response that hooks would reject as a form of domination. The third option, the option that hooks's framework demands, is critical engagement: the disciplined, difficult, ongoing practice of using the tool while questioning the tool, building with the tool while examining what the tool assumes, celebrating the expansion of capability while insisting that capability without consciousness is not freedom.

This third option cannot be implemented through policy alone. It requires teachers who have themselves undergone the process of critical consciousness-development, who understand from their own experience what it means to see the structures that shape their knowing, and who can model this seeing for their students. It requires institutions willing to invest in the slow, unmeasurable, deeply human work of pedagogy rather than the fast, measurable, easily automated work of information delivery. It requires a culture that values the difficult question over the easy answer, the critical perspective over the comfortable one, the liberation of consciousness over the mere expansion of capability.

hooks spent her life building this culture, one classroom at a time, one book at a time, one conversation at a time. The question is whether the culture she built can survive the arrival of a tool that makes everything easier except the things that matter most.

The developer in Lagos deserves the tool. She deserves the capability. She also deserves the pedagogy, the community, the critical tradition that would allow her to use the tool in service of genuine liberation rather than comfortable inclusion. Both hands must be open. Both commitments must be held. The tension between them is not a problem to be solved. It is the condition of any liberation worthy of the name.

---

Chapter 7: Teaching Community When Individuals Can Do Everything

There is a specific loneliness that accompanies extraordinary individual capability. hooks knew this loneliness from the inside, not because she had access to AI tools but because she had experienced what it meant to be brilliant and isolated, to possess the capacity for penetrating analysis and to find herself in institutions that did not want penetration, that rewarded compliance and punished the kind of questioning she could not stop herself from doing. She wrote about this loneliness throughout her career, and what she wrote was not a complaint. It was a diagnosis. The loneliness of the capable individual who operates without community is not a personal failing. It is a structural condition produced by a culture that celebrates individual achievement and neglects the collective relationships that make individual achievement meaningful.

The AI moment is producing this loneliness at scale.

Segal describes the "twenty-fold productivity multiplier" his team achieved in Trivandrum, the capacity of each individual, armed with Claude Code, to accomplish what had previously required the coordinated effort of many. The achievement is real. The capability is genuine. But hooks's framework demands a question that the productivity metrics do not capture: What happened to the team?

Not the team as an organizational unit. The team as a community. The team as a space in which people encountered each other's thinking, argued, compromised, learned to subordinate individual preference to collective need, developed the specific social capacities that only collaborative work produces. The team as the location where people practiced the skills of democratic life: listening to perspectives they did not share, negotiating differences they could not resolve, building something together that no individual could have built alone and that, in the building, created bonds that transcended the product.

When each individual can do the work of many, the occasions for this kind of collaboration diminish. Not because collaboration is prohibited but because it is no longer necessary. The most efficient path is the solo path. The individual working with AI can move faster, iterate more quickly, avoid the friction of negotiation and compromise, skip the slow and often frustrating process of aligning multiple perspectives into a shared direction. The efficiency gain is real. So is the loss.

hooks was explicit about what community requires. It requires the skills of listening, which is harder than speaking. It requires the capacity for compromise, which is harder than getting your way. It requires what she called "mutual respect," which does not mean the absence of conflict but the willingness to remain in relationship across disagreement. It requires the subordination of individual brilliance to collective intelligence, the recognition that what a group can see, when the group is composed of people with genuinely different perspectives and the capacity to hold those perspectives in productive tension, exceeds what any individual can see alone.

These are not soft skills. hooks rejected that terminology entirely. These are the skills of democratic life. A democracy composed of individuals who cannot listen, cannot compromise, cannot subordinate their preferences to the common good, is not a democracy. It is a collection of isolated actors, each optimizing their own output, each convinced of their own sufficiency, each incapable of the collective action that the most pressing problems of human life require.

AI does not teach these skills. AI, by enabling the solo path, actively reduces the occasions on which they are practiced. The developer who can build the entire feature alone does not need to negotiate with the designer about the interface, does not need to listen to the product manager's different understanding of the user, does not need to compromise with the backend engineer about the architecture. Each of these negotiations was a friction that slowed the work. Each was also a practice, a rehearsal of the democratic capacities that hooks argued education must develop.

The analogy to hooks's classroom is precise. In hooks's classroom, the diversity of perspectives was not an obstacle to learning. It was the raw material of learning. The Black student and the white student, the wealthy student and the poor one, the student who had experienced incarceration and the one who knew it only from books, these different standpoints did not merely coexist. They collided. And in the collision, something emerged that no single standpoint could have produced: a more complete understanding of the structures being examined, an understanding that was richer for having been contested, challenged, and revised through genuine encounter with the perspectives of others.

AI synthesizes perspectives. It does not collide with them. The output of a large language model presents a synthesis of all available viewpoints in a tone that is authoritative, balanced, and smooth. The rough edges have been filed down. The contradictions have been presented as nuances. The collision has been replaced by summary, and summary is not understanding. Understanding, in hooks's framework, requires the experience of the collision itself, the discomfort of hearing a perspective that challenges yours, the difficulty of holding your position while genuinely engaging with an opposing one, the growth that comes from being changed by an encounter you did not choose and cannot control.

Segal writes about trust as something that "cannot be manufactured or mandated or optimized" and that "can only be earned, through the specific intimacy of having navigated chaos together and survived it without losing respect for one another." hooks would recognize this description immediately. It is a description of community, the community that forms when people face difficulty together and discover, through the facing, that they can rely on each other. That discovery, that trust, is not a byproduct of the work. It is among the most valuable things the work produces.

When AI reduces the occasions for collaborative difficulty, it reduces the occasions for trust-formation. A team in which each member works independently, augmented by AI, producing their own outputs, coordinating through shared documents but rarely through the kind of face-to-face negotiation that trust requires, is not a community. It is a collection of individuals who happen to share an organizational affiliation. The distinction matters because community provides something that individual capability does not and cannot: belonging.

hooks wrote about belonging with the authority of a person who had experienced its absence. The Black woman in a predominantly white institution, the working-class intellectual in an upper-class academy, the feminist in a patriarchal culture, she knew what it felt like to be capable and isolated, to produce brilliant work and have no community in which that work was received with genuine understanding. She knew that belonging is not a luxury. It is a condition for sustained creative and intellectual life. The individual who operates without belonging eventually burns out, not from lack of capability but from lack of the recognition and support that only community provides.

The Berkeley study that Segal discusses in The Orange Pill documented the intensification that AI produces in work, the tendency for freed-up time to be filled immediately with more tasks, for the boundary between work and rest to dissolve, for the individual to take on an ever-expanding scope of responsibility. hooks's framework reads this intensification through the lens of isolation: the individual who does everything does everything alone. The workload expands because there is no community to say "this is enough," no colleague to insist on rest, no collective norm that values something other than output.

The "Help! My Husband is Addicted to Claude Code" post that Segal describes is, in hooks's framework, a story about the collapse of community at the most intimate level. The household is a community. The partnership is a community. And the tool that makes one member of that community infinitely productive has also made them infinitely absorbed, pulling them out of the community and into a solitary relationship with a machine that cannot tell them to stop, cannot tell them they are missed, cannot offer the resistance that human community provides against the tendency to consume oneself in work.

hooks would have noted the gendered structure of this scenario without hesitation. It is the wife who writes. It is the husband who cannot stop. The productive absorption is experienced by one partner as creative liberation and by the other as abandonment. The cost of the individual's expanded capability is borne by the community, the household, the relationship, and the cost is borne along gendered lines. The woman's labor of maintaining the household, of caring for the children, of preserving the relational fabric that the man's absorption is tearing, is invisible in the productivity metrics. It is invisible because the metrics measure individual output and are blind to the communal infrastructure that makes individual output possible.

This invisibility is structural, not accidental. The systems that celebrate individual productivity have never accounted for the communal labor that sustains it. hooks named this throughout her career, insisting that the culture's celebration of individual achievement obscures the collective relationships, disproportionately maintained by women, that make achievement possible.

The educational and organizational response to AI must include the deliberate cultivation of community. Not as an afterthought, not as a team-building exercise appended to the real work, but as a core practice, as essential to the health of the organization or the classroom as any technical capability. This means creating structures that require collaboration even when collaboration is not the most efficient path. It means designing work so that people must encounter each other's thinking, must negotiate, must compromise, must develop the social capacities that solo work augmented by AI does not require.

hooks would have called this a practice of love. The community that is deliberately maintained, that is valued not because it is efficient but because it is essential to human flourishing, that is built through the slow, costly, unmeasurable work of genuine encounter, is a community in which people can belong. And belonging, hooks argued, is not a sentimental need. It is a political one. A society composed of isolated individuals, however productive, is a society incapable of collective action, incapable of democratic life, incapable of the solidarity that the most pressing problems of human existence require.

The tool expands what the individual can do. Only community determines what the individual should do and for whom. Without community, capability serves only the self. And a culture of selves serving selves, however brilliantly, however productively, is not a culture hooks would have recognized as free.

---

Chapter 8: Belonging After the Dissolution of Craft

In the early nineteenth century, a master framework knitter in Nottinghamshire knew who he was. He knew because the community around him knew. His identity was not merely personal. It was social, woven into a network of relationships defined by shared expertise, shared standards, shared struggles, and the specific pride that comes from doing something difficult well and being recognized for it by people who understand what the difficulty costs.

The framework knitter was not just a person who made stockings. He was a member of a guild, a participant in a tradition, a bearer of knowledge that had been passed from master to apprentice through generations of patient transmission. The knowledge was embodied. It lived in his hands, in the feel of the thread under tension, in the intuitive adjustment to the machine's rhythm that no manual could teach. The knowledge was also social. It existed between the practitioners, in their shared vocabulary, their shared standards of quality, their shared understanding of what constituted mastery and what fell short.

When the new machines arrived, what was lost was not merely a livelihood. hooks's framework makes visible what economic analysis obscures: what was lost was a source of belonging. The guild dissolved. The shared standards became irrelevant. The knowledge that had connected the practitioners to each other and to a tradition larger than any individual member lost its market value, and with it, its social value. The framework knitter who could no longer find work was not just unemployed. He was unbelonged. The community that had recognized him, valued him, and given him a place in the world had been dissolved by a machine that did not need what he knew.

Segal treats the Luddites with genuine care in The Orange Pill, honoring their grief while insisting that their response, the breaking of machines, was strategically catastrophic. hooks's framework deepens the analysis by naming what was being grieved. The Luddites were not merely grieving a loss of income. They were grieving a loss of belonging. And belonging, in hooks's work, is not a secondary need that can be addressed after the economic problems have been solved. It is a primary need, as fundamental as food and shelter, without which human beings cannot sustain the psychological and social health that all other functioning requires.

The AI disruption is producing a contemporary version of this grief. The senior software architect whom Segal describes, the one who felt like "a master calligrapher watching the printing press arrive," was grieving belonging. His identity was constituted by his membership in a community of practitioners who shared his expertise, who understood what his work cost, who could evaluate his output by standards they had developed together over decades of shared practice. The knowledge that lived in his hands, his ability to feel a codebase the way a doctor feels a pulse, was not merely a skill. It was the currency of his belonging. The thing that made him recognizable to his people.

When AI can produce competent code without the decades of patient practice that his expertise required, the expertise does not merely lose market value. It loses social value. The community that recognized him through his expertise has less reason to exist. The standards that connected the practitioners to each other are less relevant when the tool meets those standards automatically. The shared struggle that bound the community together, the specific difficulty of mastering something hard and knowing that others had mastered it too, is smoothed away.

hooks wrote about belonging throughout her career with the authority of someone who had experienced its absence acutely. As a Black woman in predominantly white academic institutions, she knew what it felt like to possess extraordinary capability and to have no community in which that capability was genuinely recognized. The recognition she received was often partial, filtered through the lens of institutional racism that could acknowledge her brilliance without welcoming her presence, that could cite her work without creating a space in which she belonged. She wrote about this not as autobiography alone but as structural analysis: the absence of belonging is not a personal failing but a systemic condition produced by institutions that value what you produce over who you are.

The AI disruption threatens belonging at multiple levels simultaneously. At the level of craft, the communities of practice that formed around shared technical expertise are dissolving as the expertise becomes less scarce. At the level of profession, the identity that practitioners derived from their professional role, the sense of being a developer, a designer, a writer, is destabilized when the tool can perform the role's core functions. At the level of institution, the organizations that provided a context for belonging are restructuring around a technology that reduces the need for the collaborative structures, the teams, the departments, the cross-functional meetings, that previously created occasions for human connection.

The "fight or flight" response Segal observes in the developer community, some leaning into AI with aggressive enthusiasm while others retreat to lower their cost of living and disengage, maps onto hooks's analysis of belonging with uncomfortable precision. Both responses are strategies for managing the loss of belonging. The developers who lean in are seeking belonging in the new community of AI practitioners, in the shared excitement of the frontier, in the camaraderie of people who have taken the "orange pill" and recognize each other's transformation. The developers who retreat are seeking belonging in a simpler life, in communities defined by proximity and locality rather than professional identity, in the hope that belonging can be rebuilt on non-professional foundations.

Both strategies are rational. Neither addresses the structural problem. The structural problem is that the AI economy, as currently constituted, provides capability without community. It empowers the individual without creating the conditions for the individual to belong. And belonging, hooks insisted, is not something the individual can produce alone. It requires a structure, a community, a set of shared commitments and shared standards that recognize the individual as a person rather than a producer.

hooks would be attentive to the differential impact of this loss across lines of race, class, and gender. The communities of practice that provided belonging were never equally available. The guild system that gave the framework knitter his identity excluded women, excluded people of color, excluded those who could not access the apprenticeship system. The professional communities of software development have been similarly exclusionary, with well-documented barriers of gender, race, and class that limited who could enter and who could belong. The dissolution of these communities, then, is not a simple loss. It is the dissolution of structures that were themselves exclusionary, that provided belonging to some while withholding it from others.

This complexity is characteristic of hooks's analysis. She did not sentimentalize the structures that exclusion produced. She did not argue that the guild system was good because it provided belonging, or that professional communities should be preserved unchanged because they give practitioners identity. She argued that belonging is a fundamental human need, that the structures that provide it are always imperfect, and that the task is not to preserve imperfect structures but to build better ones, structures that provide belonging without exclusion, community without gate-keeping, recognition without hierarchy.

The AI moment demands this construction. The old communities of practice are dissolving. The question is not whether to mourn them, though mourning is legitimate and hooks would honor it, but what to build in their place. The answer cannot come from the tool. The tool provides capability. It does not provide community. The answer must come from the deliberate, costly, human work of creating new structures of belonging: communities organized not around the exclusivity of hard-won technical skill but around the shared commitment to using capability wisely, critically, and in service of something larger than individual output.

Segal's decision to keep and grow his team rather than convert the productivity gains into headcount reduction is, in hooks's framework, an act of community-building. It is the choice to maintain the structure that provides belonging even when the economics would support its dissolution. But hooks would push the analysis further: maintaining the structure is necessary but not sufficient. The structure must also be transformed. A team that uses AI tools individually, coordinating through shared systems but rarely through genuine encounter, is a team in name only. The occasions for the kind of interaction that produces belonging, the arguments, the negotiations, the shared difficulty, the moments of mutual vulnerability, must be deliberately created and protected.

This is the work that hooks called teaching community. It is not the work of building a product. It is the work of building the conditions under which a product can be built by people who are genuinely connected to each other, who trust each other, who can rely on each other, who know they are valued not for their output but for their presence. It is the hardest work there is, and it is the work that the AI economy, in its enthusiasm for individual productivity, is most likely to neglect.

The twelve-year-old who asks "What am I for?" is asking, among other things, where she belongs. In a world where machines can do what humans do, where individual capability is amplified beyond recognition, where the old markers of identity and competence are being dissolved, the question of belonging becomes the most urgent question there is. Not because belonging is comfortable, though it can be, but because belonging is the condition for everything else: for meaningful work, for democratic participation, for the capacity to care about something larger than yourself, for the willingness to build not just for your own benefit but for the community that sustains you.

hooks spent her life building communities of belonging in hostile institutions. She did it through pedagogy, through writing, through the daily practice of showing up and being present and insisting that every person in the room mattered. The tools have changed. The need has not. The question is whether a culture intoxicated by individual capability will remember that capability without belonging is not freedom. It is the loneliest form of power, and it cannot sustain the things that matter most.

---

Chapter 9: The Will to Change — Working Through Resistance

hooks published The Will to Change: Men, Masculinity, and Love in 2004, and the book was about men. It was about the specific way that patriarchal culture teaches men to suppress emotion, to equate vulnerability with weakness, to build identities around competence and control and the refusal to admit difficulty. But the book was also about something larger than men. It was about the psychology of resistance to transformation, the specific mechanisms by which human beings protect themselves from the very changes that would set them free.

The mechanisms are predictable. hooks catalogued them with the diagnostic precision she brought to everything. Denial: the insistence that the change is not happening, that the old way still works, that the evidence of transformation is exaggerated or temporary. Deflection: the redirection of attention from the internal work that change requires to the external circumstances that make change seem impossible. Projection: the attribution of one's own fear to others, the insistence that other people are the ones who cannot handle the truth. And withdrawal: the retreat from engagement, the decision to leave the arena rather than confront what the arena demands.

Every one of these mechanisms is visible in the contemporary response to AI.

The senior architect who insists that AI-generated code is fundamentally inferior, who can cite specific examples of hallucination and error, who builds a case for the irreplaceability of human expertise, is not wrong about the examples. The examples are real. But the examples function, in hooks's framework, as denial. They allow the architect to focus on what the tool cannot do while avoiding the more threatening question of what it can do and what that capability means for the identity he has built around his expertise.

The developer who pivots from technical work to management, not because she wants to manage but because management feels safer, is practicing deflection. She has redirected her energy from the confrontation with a changing landscape to an adjacent activity that allows her to maintain professional standing without engaging with the thing that frightens her. The deflection may be strategically sound. It is not growth.

The technologist who dismisses those who express concern about AI as fearful, as Luddites, as people who simply cannot handle change, is projecting. His contempt for their fear is the mechanism by which he avoids examining his own. The dismissal functions as armor. It protects him from the vulnerability that genuine engagement with uncertainty would require.

And the engineers retreating to rural areas, lowering their cost of living, stepping back from the industry, the flight response Segal identifies in The Orange Pill, are withdrawing. The withdrawal is understandable. It may even be wise for some individuals. But it guarantees that the transition will happen without their voices, their expertise, their perspective on what is being lost, and that guarantee is the highest cost of withdrawal.

hooks understood these mechanisms because she had encountered them in every classroom she ever taught. The student who resists the material is not deficient. The student who pushes back against an uncomfortable perspective is not stupid. The resistance is a signal. It tells the teacher that something important is happening, that the student's existing framework is being threatened, that the identity the student has built is being asked to accommodate something it was not designed to hold.

The teacher who dismisses the resistance has failed. The teacher who surrenders to the resistance has also failed. The teacher who honors the resistance while insisting that it must be worked through, who validates the difficulty while refusing to let the difficulty become an excuse for avoidance, who stays in the room when the conversation gets hard, that teacher is practicing what hooks called engaged pedagogy. And that practice is exactly what the contemporary moment demands of everyone who encounters people in the grip of technological fear.

Segal's direct address to the contemporary Luddites in The Orange Pill is an act of engaged pedagogy in hooks's precise sense. The passage reads: "Here is what I want to say to those people, directly and with genuine respect for what they have built." The respect is not performative. It emerges from Segal's own experience as a builder who has felt the vertigo of watching the ground shift beneath expertise that took decades to develop. But the respect does not end in validation. It continues into insistence. The insistence that grief is not a strategy. That refusal is a form of power abdication. That the people with the most legitimate grievances about the transition are precisely the people whose voices are most needed in shaping how the transition unfolds.

hooks would have recognized this as the hardest pedagogical act there is. It requires holding two things simultaneously: the genuine validation of the person's pain and the genuine insistence that the pain must be worked through rather than protected. The temptation is always to choose one or the other. To validate without insisting, which is sentimentality. Or to insist without validating, which is cruelty. The pedagogical act is the refusal to choose, the maintenance of both at once, the willingness to say "your pain is real and your response to it is inadequate" without the conjunction functioning as a dismissal of either clause.

The will to change, hooks argued, requires a prior condition that the culture does not readily provide. It requires a space in which the difficulty of change can be acknowledged without the acknowledgment becoming an excuse for avoidance. A space in which the person undergoing transformation can be seen in their difficulty, can be accompanied in their struggle, can be held accountable without being shamed. This space is what hooks meant by community, and it is what the AI transition most conspicuously lacks.

The engineers who are struggling with the transformation are struggling largely alone. The discourse, as Segal observes, is shaped by the extremes. The triumphalists celebrate. The catastrophists despair. The silent middle, the people who feel both the exhilaration and the loss, have no forum in which their ambivalence can be held and worked through. Social media punishes ambivalence. It rewards the clean narrative, the confident position, the take that can be compressed into a post. The person who says "I feel both things at once and I do not know what to do with the contradiction" is algorithmically invisible.

hooks would have seen this invisibility as a political problem, not a personal one. The structures that shape the discourse, the platforms, the algorithms, the incentive systems, are not neutral. They produce specific outcomes. They reward certain responses and punish others. And the response they most consistently punish is the ambivalent one, the response that acknowledges complexity, that holds tension, that refuses to resolve into a clean narrative. This is the response that hooks's pedagogy is designed to cultivate, and it is the response that the current structures of public conversation make most difficult to sustain.

The bridge from resistance to engagement cannot be built through argument alone. hooks knew this. She had spent decades watching people be presented with irrefutable evidence of structural inequality and respond not with changed behavior but with more sophisticated forms of denial. The will to change is not produced by evidence. It is produced by encounter, by the specific experience of being in relationship with someone who sees you, who validates your difficulty, and who will not let you use that difficulty as a reason to stop growing.

This is why hooks insisted that pedagogy is a practice of love. Not the sentimentalized love that makes everything comfortable but the demanding love that insists on growth even when growth is painful. The love that says "I see your fear and I will not leave you in it." The love that refuses to accept resistance as a final answer because it respects the person enough to believe they are capable of more than their fear allows.

The contemporary Luddites need this love. Not because they are fragile but because the transformation they are being asked to undergo is genuine and difficult and cannot be accomplished in isolation. The framework knitter who might have survived the transition to industrial weaving, who might have found a way to apply his knowledge of materials and quality to the new machines, needed a community that would accompany him through the difficulty. He did not have one. The institutions of his time did not build one. And the consequence was not merely individual suffering but collective loss, the loss of the perspective that only the experienced practitioner could bring, the loss of the critical voice that might have shaped the transition differently if it had been heard.

The institutions of our time are making the same mistake. The retraining programs focus on skills. The corporate initiatives focus on adoption metrics. The discourse focuses on whether AI is good or bad, whether it will create or destroy jobs, whether the future is bright or dark. None of this addresses the will to change, the psychological and communal infrastructure that transformation requires.

hooks would have insisted that the infrastructure begin with acknowledgment. Acknowledgment that the loss is real. That the expertise being displaced was genuinely hard to acquire and genuinely valuable. That the identity being threatened was built through years of patient investment that cannot simply be written off. That the grief is legitimate.

And then, having acknowledged all of this, hooks would have insisted on what she always insisted on: the acknowledgment is the beginning, not the end. The grief must be honored and then worked through. The resistance must be validated and then overcome. The identity must be mourned and then rebuilt on new foundations. None of this is possible alone. All of it requires community. All of it requires love.

The will to change is not an individual achievement. It is a communal one. And the communities that would support the change, the spaces in which experienced practitioners could be accompanied through the difficulty of transformation, in which their grief could be held without being indulged, in which their expertise could be honored while being redirected, these communities do not yet exist at the scale the moment requires.

Building them is the most urgent pedagogical task of the AI age. More urgent than teaching people to prompt effectively. More urgent than integrating AI into curricula. More urgent than any technical skill that will be obsolete before it is mastered. The will to change, and the communities that sustain it, is the foundation without which every other response to the AI transition rests on sand.

---

Chapter 10: Where We Stand — Critical Consciousness at the Top of the Tower

hooks always asked the same question. She asked it in every classroom, every book, every lecture. She asked it of her students, her colleagues, her readers, and herself. The question was not rhetorical. It was not decorative. It was the question that her entire pedagogy existed to make answerable, even though the answer would always be provisional, always incomplete, always in need of revision.

Where do you stand?

The question is not geographic. It is existential. It asks: Where do you stand in relation to the structures of power that shape your life? Where do you stand in relation to the history that produced you? Where do you stand in relation to the people whose labor sustains you? Where do you stand in relation to your own capacity for critical thought, for honest self-examination, for the courage to see what is uncomfortable and to act on what you see?

Segal's Orange Pill ends with a sunrise. Twenty chapters of climbing, from the ground floor of what happened in the winter of 2025 to the roof of what it means and what to do about it, and at the top, the light. The view from the roof is genuinely beautiful. The democratization of capability. The expansion of who gets to build. The amplification of human creativity. The sunrise is real, and the hope it represents is earned, not given. Segal climbed for it, and the reader climbed with him.

hooks would stand on that roof and look at the view and see everything Segal sees. She would see the developer in Lagos gaining access to tools that were previously gated by geography and capital. She would see the non-technical founder building a product through conversation. She would see the engineer freed from implementation labor, working on the judgment-level problems that her expertise uniquely qualifies her to address. She would see the expansion and would honor it.

Then she would turn around and look at the stairs.

Who climbed them? Who was invited to climb? Whose labor built the staircase, and were they compensated for it? Who is still on the ground floor, not because they chose to stay but because no one told them the tower existed, or because the entry fee was priced beyond their reach, or because the language spoken on the stairs was not their language, or because the culture of the climb was not their culture?

These are not objections to the sunrise. They are completions of the view. The view from the roof that does not include the view of the stairs is a partial view, and a partial view, hooks would insist, is a view that serves the interests of those who are already at the top.

hooks spent her career insisting that universals must be grounded in particulars. When Segal writes "we are all swimming in fishbowls," hooks would say: yes, and some fishbowls are larger than others, and some have cleaner water, and some are placed in rooms where the light is good and the temperature is controlled, and some are in rooms where no one comes to check whether the water is still breathable. The universal is true. The particular is also true. And the particular is the thing that the universal, in its cosmic sweep, tends to erase.

This erasure is not malicious. It is structural. The universal perspective is the perspective of the person who can afford to be universal, whose particular location is comfortable enough that they do not need to think about it, whose access is secure enough that they can focus on the view rather than the climb. hooks called this the privilege of abstraction, the ability to think in grand terms about the human condition because your own condition is not pressing enough to demand your attention.

The AI conversation has been conducted almost entirely at the level of universals. Intelligence as a river. Humanity as a species. The technology as an amplifier. These are true and useful frames. They are also frames that flatten the specific, that erase the differential, that present a transformation experienced very differently by different people as though it were experienced uniformly by all.

The transformation is not experienced uniformly. The developer in Lagos and the developer in San Francisco both have access to Claude Code. But the developer in Lagos works on an unreliable power grid, with limited bandwidth, in an economic context where the margin for failure is razor-thin. The developer in San Francisco has backup power, fiber optic internet, a safety net of savings and social capital, and a professional network that can absorb a failed project and redirect its energy. Access to the same tool does not produce the same experience of using the tool. The context shapes everything.

hooks's intersectional analysis, her insistence that race, class, gender, and geography produce specific, located experiences that cannot be collapsed into universal categories, is not a complication of the AI conversation. It is a necessary condition for the conversation to be honest. The conversation that speaks in universals about the "democratization of capability" without specifying whose capability is being democratized, under what conditions, with what support, at what cost, and to whose benefit, is a conversation that has left the most important questions unasked.

The question of the training data is the starkest illustration. hooks's framework, applied to the composition of the datasets on which large language models are trained, reveals something that technical discussions of "bias" tend to obscure. The word "bias" implies a deviation from a norm, a skew that can be detected and corrected. hooks's framework suggests something more structural. The training data is not biased in the sense of being skewed from a neutral baseline. There is no neutral baseline. The training data is a specific, historically located collection of text produced predominantly by the dominant culture, in the dominant language, according to the dominant culture's standards of what constitutes knowledge, what constitutes quality, what constitutes the kind of writing worth preserving and digitizing and including in a corpus.

The voices at the margins, the voices that hooks spent her career amplifying, are present in the training data only as the dominant culture chose to represent them. The Black feminist tradition that produced hooks, Collins, Lorde, and Crenshaw is present in the training data to the extent that it was published, digitized, and included in the corpora that the companies decided to use. But the oral traditions, the community knowledge, the embodied practices, the ways of knowing that were never written down because the communities that produced them did not have access to the publishing infrastructure of the dominant culture, these are absent. Not underrepresented. Absent.

When the AI amplifies, it amplifies a specific signal. The signal of the culture that produced the training data. The perspectives that were included are amplified. The perspectives that were excluded remain excluded, now at scale. The democratization of capability, in this light, is the democratization of access to a specific capability, the capability to produce outputs consistent with the dominant culture's way of knowing, using tools shaped by the dominant culture's assumptions. It is a genuine expansion. It is not a neutral one.

hooks would not ask for the tools to be abandoned. She would ask for the tools to be questioned. She would ask for the development of what she called an "oppositional gaze," a critical, resistant stance toward the outputs of systems that claim universality while encoding particularity. The oppositional gaze does not reject the tool. It refuses to accept the tool's outputs as neutral. It asks: Whose knowledge is this? Whose perspective does it center? Whose does it erase? What would this output look like if the training data had been different, if the developers had been different, if the culture that produced the tool had been structured by different assumptions about whose knowledge counts?

Researchers have already begun this work. The 2025 study of AI image-captioning algorithms applied to archival photographs of colonized peoples found what the researchers called a "colonial gaze" embedded in the models, patterns of essentialism, cultural erasure, dehumanization, othering, and infantilization that the models had absorbed from their training data and reproduced with algorithmic consistency. The gaze was not programmed. It was learned. And it was learned because the data from which the models learned carried the residue of colonial ways of seeing that hooks had been naming and challenging since the 1980s.

The oppositional gaze is a practice, not a position. It must be cultivated through the kind of pedagogy that hooks described: engaged, critical, communal, and sustained. It cannot be developed through a tutorial or a training module. It requires the slow, difficult, relational work of learning to see what the smooth surface conceals, to hear what the authoritative voice silences, to question what the confident output presents as settled.

This practice is not opposed to the builder's ethic that Segal describes. It is the completion of it. Segal asks: "Are you worth amplifying?" hooks adds: "And whose definition of worth are you using?" The two questions together constitute a critical consciousness adequate to the moment, a consciousness that can hold the genuine expansion of capability in one hand and the genuine concern about whose capability, shaped by whose assumptions, serving whose interests, in the other.

The question hooks asked her students, where do you stand?, has never been more urgent. The ground is shifting. The structures that provided identity, belonging, and a framework for understanding one's place in the world are being dissolved and rebuilt at a pace that outstrips institutional response. The tools are powerful and the temptations are real: the temptation to build without questioning, to produce without reflecting, to celebrate the sunrise without examining the stairs.

hooks's pedagogy does not end with resolution. It does not arrive at a comfortable conclusion. It arrives, always, at a better question. The question is not whether AI is good or bad, whether it liberates or oppresses, whether the future it is building is bright or dark. The question is whether the people who build with AI, who teach with AI, who live alongside AI, will develop the critical consciousness to see the structures that the tool carries, to question the assumptions it encodes, to insist that the democratization of capability be accompanied by the democratization of consciousness.

The question is whether we will choose the difficult path, the path that produces not just more capable people but more critically conscious ones, the path that insists on the discomfort that genuine learning requires, the path that refuses to substitute the smooth for the real, the comfortable for the free.

hooks chose this path every day. She chose it in hostile institutions. She chose it against the grain of a culture that rewarded the smooth and punished the critical. She chose it because she believed, with a conviction that neither the difficulty of the path nor the indifference of the institutions could shake, that education is the practice of freedom, and freedom is worth every cost it exacts.

The question for this moment is not what hooks would have said about AI. The question is what her life's work demands of us now that AI has arrived.

It demands that we stand somewhere. That we know where we stand. That we examine the ground beneath our feet and ask whose labor produced it, whose perspective it encodes, whose interests it serves. That we refuse the comfortable answer and insist on the critical one. That we build communities in which the will to change can be sustained. That we teach, not by depositing information, but by creating the conditions in which consciousness can develop, in which the struggle with difficulty can produce the growth that no easy answer can provide.

It demands, in short, that we practice freedom. Not the freedom of unlimited capability. The harder freedom. The freedom that comes from understanding the structures you inhabit, the freedom that comes from questioning what seems natural, the freedom that comes from insisting, against every incentive to do otherwise, that the questions matter more than the answers.

The sunrise is real. The view from the roof is beautiful. And the work of freedom begins not with the view but with the willingness to ask who built the tower, who was excluded from the climb, and what it would take to build a staircase wide enough for everyone.

That willingness is where hooks's pedagogy starts. It is also where it ends. Not in resolution but in the commitment to keep asking, keep questioning, keep insisting that the practice of freedom is never finished, never settled, never comfortable, and always, always worth the cost.

---

Epilogue

The voice I could not get out of my head was not the one I expected.

I came to bell hooks thinking I knew the shape of the conversation. I had written The Orange Pill with the conviction that AI is an amplifier, that the question worth asking is whether you are worth amplifying, and that the answer lies in the quality of your questions, your self-knowledge, your willingness to build with care. I still believe that. I believe it more, not less, after spending months inside hooks's work.

But hooks asked a question that I had not asked myself, and the question changed the architecture of everything I thought I understood. She asked: whose definition of worth?

Not abstractly. Not as an academic exercise. She asked it the way she asked everything — with her feet planted, her eyes open, and her willingness to make the person in front of her uncomfortable fully intact.

I describe in The Orange Pill the twenty-fold productivity multiplier we achieved in Trivandrum. I describe the developer in Lagos gaining access to tools that were previously gated by geography and capital. I describe the collapse of the imagination-to-artifact ratio as a democratization of who gets to build. I meant every word.

hooks would have meant every word of her response, too. She would have said: the developer in Lagos builds inside a tool trained on Silicon Valley's assumptions. She would have said: democratization of capability without democratization of consciousness is not liberation. She would have said: you celebrated the expansion without asking whose perspective the expansion carries.

She would have been right.

The Grammarly incident — a company literally consuming hooks's intellectual identity, repackaging it as a subscription feature, selling it to users who believed they were receiving her guidance — landed while I was deep in this work. It was so precise an enactment of what hooks called "eating the other" that it felt scripted. It was not scripted. It was structural. The system working exactly as the system works. Extracting value from the margin and delivering it to the center.

What changed for me, sitting with hooks's framework, was not my assessment of AI's capability. The capability is real. What changed was my understanding of what the capability carries. The river of intelligence I describe in The Orange Pill is not a neutral force. It flows through channels shaped by specific histories, specific exclusions, specific decisions about whose knowledge counts. The training data is not the sum of human intelligence. It is a specific, partial, historically located collection, and the partiality matters, because the partiality is amplified along with everything else.

I wrote that AI does not filter. It carries whatever signal you feed it. hooks made me see that the signal is already filtered — before you feed it anything. The tool arrives with the watermark of the world that built it. And the builder who does not see the watermark reproduces it at scale.

I am not going to pretend this is comfortable. It is not. hooks's framework accuses me, gently but unmistakably, of telling a story about the democratization of capability that centers the experience of people like me — builders at the frontier, with access and resources and networks — while presenting the experience of the margins as a supporting illustration rather than the primary text.

She is right about that, too.

I still believe in the sunrise. I still believe that AI expands what human beings can build, and that the expansion matters, and that the appropriate response is stewardship rather than refusal. But hooks taught me that the sunrise looks different depending on where you are standing, and that a view from the roof that does not include the view of the stairs is a partial view dressed as a complete one.

The practice of freedom is not the practice of building more efficiently. It is the practice of asking, relentlessly and at real cost, whether what you are building serves the people who most need it to serve them. Whether the structures you inhabit — the tools, the platforms, the economic systems, the cultural assumptions — are structures you have examined or structures you have inherited and mistaken for the natural order.

hooks insisted that education is the practice of freedom. I wrote that AI offers us a promotion — from execution to judgment. She would have responded: judgment exercised without critical consciousness is not judgment. It is the reproduction of the assumptions you never examined.

The tower is real. The climb matters. The view from the roof is worth the effort. But the stairs need widening. And the widening cannot be done by the people at the top alone.

— Edo Segal

AI Democratizes Access.
bell hooks Asks: Access to What, and on Whose Terms?

The AI revolution promises to lower the floor of who gets to build. bell hooks's lifework reveals what that promise conceals: that capability without critical consciousness reproduces domination at scale. When the tool carries the watermark of the world that built it, inclusion is not the same as liberation.

This book applies hooks's frameworks — engaged pedagogy, the banking model of education, "eating the Other" — to the central claims of the AI moment. It examines how frictionless answers displace the struggle through which genuine understanding develops, how training data encodes whose knowledge counts, and why community, not individual productivity, is the foundation that the AI economy is most likely to neglect.

The Orange Pill argues that AI amplifies whatever you bring to it. hooks asks the prior question: Have you examined what you carry?

"The function of art is to do more than tell it like it is — it's to imagine what is possible."
— bell hooks
WIKI COMPANION

A reading-companion catalog of the 14 Orange Pill Wiki entries linked from this book — the people, ideas, works, and events that bell hooks — On AI uses as stepping stones for thinking through the AI revolution.
