Thorstein Veblen — On AI
Contents
Cover
Foreword
About
Chapter 1: The Instinct of Workmanship
Chapter 2: The Machine and the Weaver
Chapter 3: The Cognitive Loom
Chapter 4: Conspicuous Computation
Chapter 5: The Leisure Class of AI
Chapter 6: Predatory and Industrial
Chapter 7: The State of the Industrial Arts
Chapter 8: Sabotage
Chapter 9: The Engineer and the Price System
Chapter 10: Salvaging the Instinct
Epilogue
Back Cover
Cover

Thorstein Veblen

On AI
A Simulation of Thought by Opus 4.6 · Part of the Orange Pill Cycle
A Note to the Reader: This text was not written or endorsed by Thorstein Veblen. It is an attempt by Opus 4.6 to simulate Thorstein Veblen's pattern of thought in order to reflect on the transformation that AI represents for human creativity, work, and meaning.

Foreword

By Edo Segal

The gesture that kept nagging me was one nobody talks about in tech.

A carpenter running her hand along a joint she just cut. The square already confirmed it. The level already confirmed it. The hand adds nothing the instruments haven't already provided. She does it anyway. Not for the client. Not for the inspection. For herself — because something in her needs to feel that the work is right.

I have spent my entire career surrounded by people who do the equivalent of that gesture every day. The engineer who refactors code that already works because it isn't *clean* yet. The designer who adjusts a margin by two pixels because the spacing doesn't *feel* right. The product lead who kills a feature that tested well because something in her gut says it doesn't belong. None of these gestures show up in a productivity dashboard. None of them would survive a strict cost-benefit analysis. And every single one of them is the reason the work is any good.

Thorstein Veblen, writing over a century ago, gave that gesture a name. He called it the instinct of workmanship — a biological drive, not a cultural preference, not an affectation. A drive as fundamental as hunger. The disposition to care about the quality of what you produce, to feel unease when work is done poorly, to find satisfaction in competent performance regardless of whether anyone notices or rewards it.

I had never read Veblen before this project. I wish I had found him decades ago, because he answers a question I have been circling for the entire time I've been building with AI: Why does something feel wrong even when the output is right?

Claude produces code that works. It drafts prose that holds together. It generates solutions that are, by measurable standards, adequate or better. And yet — and this is the thing I could not name until Veblen named it for me — the adequacy is not the same as the satisfaction. The instinct wants to have *made* the thing. Not directed its making. Not evaluated its output. Made it. With your own skill, your own struggle, your own hands on the material.

That instinct is not nostalgia. It is not Luddism. It is the engine that drove every technological advance in human history — the reason we kept making better tools in the first place. And the AI moment, for all its extraordinary power, is restructuring the world in ways that risk starving the very drive that built it.

Veblen gives us the vocabulary to see what we are losing before we finish losing it. That is why his lens matters now. Not as a warning to stop building. As a blueprint for building institutions that keep the instinct alive.

The hand still needs something to touch.

Edo Segal · Opus 4.6

About Thorstein Veblen

1857–1929

Thorstein Bunde Veblen (1857–1929) was an American economist, sociologist, and social critic widely regarded as one of the most original and iconoclastic thinkers in the history of American intellectual life. Born in Wisconsin to Norwegian immigrant parents and raised in a rural Minnesota farming community, Veblen studied philosophy at Johns Hopkins and Yale before turning to economics at Cornell and the University of Chicago. His first and most famous work, *The Theory of the Leisure Class* (1899), introduced the concepts of conspicuous consumption and pecuniary emulation — the idea that economic behavior is driven not primarily by utility maximization but by status competition and social display. His subsequent works, including *The Theory of Business Enterprise* (1904), *The Instinct of Workmanship and the State of the Industrial Arts* (1914), and *The Engineers and the Price System* (1921), developed a sweeping critique of capitalism centered on the structural opposition between productive work and profit-seeking, between what he called "industrial" and "predatory" habits of thought. Veblen argued that the human instinct of workmanship — the innate drive toward skilled, purposeful, efficient production — was systematically frustrated by business institutions organized around financial extraction rather than material output. A founder of the institutional economics movement and a persistent outsider in academic life, Veblen held positions at Chicago, Stanford, and Missouri but never achieved conventional professional success, dying in near-poverty in 1929. His influence, however, has only grown: his frameworks for understanding status, technology, institutional power, and the tension between making and taking remain central to economic sociology, the study of technological change, and contemporary critiques of platform capitalism.

Chapter 1: The Instinct of Workmanship

The carpenter runs her hand along the joint she has just cut. The joint is tight. It will hold. No one has asked her to check it by touch — the square and the level have already confirmed what the fingers now confirm again — and the gesture serves no productive purpose that the tools of measurement have not already served. She does it anyway. She does it because the hand wants to know what the eye and the instrument have told it, and because the knowledge that arrives through the fingers is of a different quality than the knowledge that arrives through the gauge, and because the exercise of this particular form of attention — the tactile verification of work that has been done well — produces a satisfaction that is, upon close examination, unrelated to the economic value of the joint, the market price of the cabinet, or the wage she will receive for having produced it.

Thorstein Veblen, writing in 1914, identified this satisfaction as the expression of a biological drive — not a cultural preference, not a learned behavior, not an affectation of the artisan class, but an instinct as fundamental to the human organism as the parental instinct or the instinct of self-preservation. He called it the instinct of workmanship, and he defined it as the disposition toward "efficient, purposeful action" — the innate human tendency to care about the quality of one's production, to take satisfaction in competent performance, to feel unease when work is done poorly regardless of whether anyone else notices or cares. The instinct, in Veblen's framework, "occupies the interest with practical expedients, ways and means, devices and contrivances of efficiency and economy, proficiency, creative work and technological mastery of facts." It is, he argued, the main determinant of technological progress across the entire span of human history — the engine that drives the species to make better tools, build more durable shelters, devise more effective methods of provisioning the life process.

The claim is large enough to invite skepticism, and the skepticism is worth engaging directly, because the instinct of workmanship is the central concept of this book, and if it does not hold, nothing that follows from it holds either.

The evolutionary logic proceeds as follows. In the long millennia of pre-institutional human life — the period Veblen called the "savage" stage, using the anthropological vocabulary of his era without its pejorative connotations — the capacity for skilled, purposeful production was directly tied to survival. The individual who made a better spear, a tighter shelter, a more efficient trap, survived at higher rates and reproduced at higher rates than the individual who did not. Natural selection did not merely preserve the skills themselves; it preserved the disposition toward skilled production — the tendency to attend to the quality of one's work, to notice when a tool could be improved, to feel discomfort when a task was performed with less than the available competence. The skills themselves were transmitted culturally, through teaching and imitation. The disposition to exercise and develop those skills was transmitted genetically, through the reproductive advantage it conferred.

The result, accumulated over hundreds of thousands of years, is an organism that does not merely possess the capacity for skilled work but wants to exercise it. The wanting is not rational in the economist's sense of the term — it is not a calculation of costs and benefits, not a response to incentive structures, not a behavior that can be adequately explained by the maximization of utility. It is a drive, in the same sense that hunger is a drive: it persists whether or not its satisfaction is economically rewarded, it produces discomfort when frustrated, and its expression produces a satisfaction that is qualitatively different from the satisfaction of any other drive.

The distinction between the instinct of workmanship and several related but different concepts requires some care in the drawing, because the confusion between them has produced, in both the popular and the academic discourse, considerable muddle.

Workmanship is not pride. Pride is a social emotion — it requires an audience, real or imagined, before which one's competence is displayed. The carpenter who runs her hand along the joint when no one is watching is not exercising pride. She is exercising workmanship. The satisfaction is private. It does not require recognition. It is complete in the act itself.

Workmanship is not perfectionism. Perfectionism is a pathology of the instinct — the inability to recognize when work is good enough, the compulsive refinement of what has already been refined beyond the point of diminishing returns. The instinct of workmanship, properly functioning, produces satisfaction at the point of adequate competence. It does not demand perfection. It demands engagement — the full exercise of the available skill in the service of a task that requires it.

Workmanship is not ambition. Ambition is directed outward, toward advancement, recognition, the accumulation of status and resources. The instinct of workmanship is directed inward, toward the work itself. A person may exercise extraordinary workmanship in a task that offers no advancement whatsoever — the weekend woodworker, the amateur gardener, the open-source contributor who writes code for software she will never sell. The instinct does not require a career path. It requires a task.

And workmanship, it must be noted with some emphasis, is not craftsmanship in the narrowly aesthetic sense that the contemporary discourse has assigned to the word. Craftsmanship, in current usage, tends to connote artisanal production, handmade goods, the deliberate rejection of industrial methods in favor of traditional ones. The instinct of workmanship is far broader than this. It operates in the factory as readily as in the workshop, in the office as readily as in the studio, in the digital domain as readily as in the physical one. The programmer who refactors code that already works — not because the refactoring is necessary for the code to function, but because the code is not yet right, not yet as clean and legible and well-structured as the programmer's own standards of competence require — is exercising the instinct of workmanship with a precision and an intensity that any traditional craftsman would recognize.

It is precisely this programmer, it should be observed, whom the AI moment places in the most psychologically precarious position.

The author of The Orange Pill describes a senior software architect who "felt like a master calligrapher watching the printing press arrive" — a person who had spent twenty-five years building systems and could feel a codebase "the way a doctor feels a pulse, not through analysis but through a kind of embodied intuition that had been deposited, layer by layer, through thousands of hours of patient work." The architect did not dispute that AI was more efficient. He said, simply, that "something beautiful was being lost, and that the people celebrating the gain were not equipped to see the loss, because the loss was not quantifiable."

Veblen's framework identifies what is being lost with a precision that the architect himself could not achieve, because the architect was describing an experience, and the experience, while genuine, lacked a theoretical vocabulary adequate to its own significance. What is being lost is not the architect's livelihood — though that may be threatened too — and not the architect's status — though that is certainly diminishing. What is being lost is the outlet for the instinct of workmanship. The opportunity to exercise, on a daily basis, the full range of competence that twenty-five years of patient work have deposited. The satisfaction that comes not from the result — the working code, the functional system — but from the process of producing the result through the engaged application of skill.

The AI produces the result. It may produce an adequate result, or even, by certain metrics, a superior one. But the AI does not satisfy the instinct, because the instinct is not satisfied by results. It is satisfied by the exercise of competence in the production of results. The distinction is everything.

Consider the parallel with physical labor, where Veblen's analysis was originally developed. The industrial revolution did not eliminate the need for human effort. Factories required workers. But the factories eliminated the specific quality of effort that the instinct of workmanship requires. The skilled weaver, transferred from his loom to the power-loom, was not idle. He was busy. He was employed. He was, in the narrow economic sense, productive. But he was no longer exercising workmanship. He was tending a machine. The machine did the weaving. The worker did the tending. And tending — the repetitive monitoring of a process one does not control — does not satisfy the same drive as weaving — the engaged, skilled, responsive production of cloth through the coordinated exercise of hand, eye, and judgment.

The analogy to the contemporary software developer is almost painfully exact. The developer who "reviews AI output" rather than writing code, who "evaluates" rather than produces, who "directs" rather than builds, is engaged in a form of cognitive tending. The machine does the producing. The developer does the monitoring. The work persists. The workmanship has been removed from it.

Veblen would observe — with the clinical detachment that characterized his most devastating analyses — that the contemporary discourse around AI and work has almost entirely failed to identify this dimension of the displacement. The discourse is organized around two axes: the economic axis (will workers lose their jobs?) and the capability axis (will human expertise become obsolete?). Both axes are real and important. But neither touches the psychological ground where the instinct of workmanship operates. A worker who retains her job and whose expertise is redirected to "higher-level" tasks may be economically secure and professionally relevant while being, at the level of the instinct, profoundly frustrated — engaged in work that demands her judgment but denies her the specific satisfaction of exercising her skill in the production process.

The frustration does not present as dramatic suffering. It presents as a vague unease, a restlessness, a sense that something is missing from the work that used to be there. The author of The Orange Pill captures this in his description of the "silent middle" — the people who feel "both things at once and do not know what to do with the contradiction." Veblen's framework gives the contradiction a name. The exhilaration is real: the tools expand capability. The loss is also real: the tools remove the exercise of workmanship from the expanded capability. The silent middle is experiencing the simultaneous stimulation and frustration of a single instinct, fed by the power of the tool and starved by its efficiency.

The question that opens from this analysis is not whether AI is good or bad for workers. That question, framed at the level of economics and capability, has already received extensive and largely adequate treatment. The question is whether the instinct of workmanship — a biological drive that has shaped human behavior for millennia, that produces satisfaction when exercised and damage when frustrated, that is as essential to human flourishing as any drive the psychologists have catalogued — can survive the transfer of skilled cognitive work from human minds to computational processes.

The question is not rhetorical. The instinct survived the industrial revolution, though the survival was neither painless nor automatic. It required, as subsequent chapters will examine, the construction of institutional structures that provided new outlets for the drive — structures that the market, left to its own devices, would never have produced. The instinct may survive the AI revolution as well. But the survival, if it occurs, will require an equivalent effort of institutional construction — and the effort has not yet begun.

What has begun, instead, is the celebration of the result and the dismissal of the process. The metrics culture that surrounds AI adoption — lines generated, commits shipped, products launched, the relentless quantification of output that the author of The Orange Pill documents in the discourse of the triumphalists — measures everything except the thing Veblen identified as most important: the quality of the worker's engagement with the work. The metrics count what the machine produces. They do not count what the human loses in the production.

The carpenter runs her hand along the joint. The gesture is inefficient. The instrument has already confirmed what the fingers now confirm again. In a world organized around efficiency, the gesture is waste.

Veblen would observe that the gesture is the most important thing the carpenter does all day — not because it improves the joint, but because it exercises the instinct, confirms the competence, completes the circuit between intention and execution that the human organism requires in order to experience its own productive capacity as real.

The AI produces the joint. The joint is adequate. The carpenter's hand has nothing to touch.

---

Chapter 2: The Machine and the Weaver

In the early decades of the nineteenth century, a framework knitter in Leicestershire occupied a position in the social and economic order that was, by the standards of the pre-industrial English working class, enviable to a degree that required sustained effort to achieve and considerable skill to maintain. He had served an apprenticeship of seven years. He had acquired, through the patient repetition of operations too numerous and too subtle to be fully specified in any manual of instruction, a form of knowledge that resided not in his mind alone but in the coordination of his hands, his posture, the rhythm of his feet on the treadles, and the quality of attention — neither wholly conscious nor wholly automatic — that he directed toward the behavior of the thread as it passed through the needles of the stocking frame.

This knowledge, it is essential to observe, was not merely technical in the narrow sense. It was, in Veblen's terminology, a participation in the state of the industrial arts — the accumulated body of technical capability that belongs to the community and is transmitted, generation by generation, through the institutions of apprenticeship, guild membership, and daily practice. The knitter's individual skill was the local instantiation of a collective inheritance. He drew upon knowledge that stretched back centuries, contributed his own refinements and adaptations, and passed the augmented body of technique forward to the next generation of apprentices. His workmanship was at once personal — the product of his own seven years of learning — and social — embedded in a tradition of production that no individual had created and no individual could fully possess.

The power loom arrived, and the architecture of this arrangement collapsed.

The collapse has been narrated many times, from many angles. The economic historians have documented the wage compression: skilled weavers earning twenty shillings a week found themselves competing against unskilled factory operatives earning a fraction of that sum, and the wage differential that had rewarded years of apprenticeship shrank until it was no longer sufficient to justify the investment. The social historians have documented the community dissolution: the guild structures that had organized the trade, regulated quality, transmitted knowledge, and provided mutual aid disintegrated as the factory system rendered them functionless. The political historians have documented the resistance and its suppression: the machine-breaking, the deployment of soldiers, the capital prosecution of workers who destroyed the instruments of their own displacement.

Veblen's analysis of this episode — developed not as a historical narrative but as a theoretical anatomy of the relationship between technology and the instincts — identifies a dimension of the damage that the economic, social, and political histories do not reach. The dimension is psychological, but it is not psychological in the therapeutic sense that the contemporary reader might expect. It is psychological in the biological sense: the frustration of an instinct, a drive that persists regardless of economic circumstance, that is as indifferent to market conditions as hunger is indifferent to the price of grain.

The machine process, as Veblen analyzed it in The Theory of Business Enterprise and elaborated in The Instinct of Workmanship, does not simply transfer labor from human hands to mechanical operations. It restructures the quality of human engagement with the productive process. The skilled weaver, working at his loom, exercised a continuous, responsive, adaptive form of attention — adjusting tension, correcting deviations, making the thousand small decisions per hour that the variability of natural fibers and the imperfection of the frame required. This attention was demanding. It was also satisfying, in precisely the way Veblen's instinct of workmanship predicts: the exercise of competence in the service of a task that requires it produces a satisfaction that is irreducible to any other form of reward.

The factory operative, tending the power loom, exercised a fundamentally different form of attention. The operative's task was to monitor the machine — to watch for breakages, feed raw material, remove finished product, and intervene when the mechanical process faltered. The operative was not idle. The operative was, in many cases, working harder in terms of physical exertion and longer in terms of hours than the skilled weaver had worked. But the quality of the engagement had changed. The operative was not producing cloth. The machine was producing cloth. The operative was maintaining the conditions under which the machine could produce cloth. The distinction, invisible to the accountant who measured only output, was everything to the instinct of workmanship, which is satisfied not by the existence of the product but by the experience of producing it.

Veblen drew from this analysis a distinction that runs through all his subsequent work: the distinction between the institution and the instinct. The institution — the factory, the wage system, the market for labor — could be restructured to accommodate the machine process. Workers could be retrained, redeployed, reassigned to new roles in the new system. The institution was adaptable because institutions are, by nature, human constructs: they change when the conditions that produced them change. The instinct, however, was not adaptable — not because it was rigid, but because it was biological. The instinct of workmanship did not adjust its requirements to match the new institutional arrangements. It continued to require what it had always required: the exercise of competence in skilled production. And when the institutional arrangements denied it this exercise, the instinct was not accommodated. It was frustrated.

The frustration produced what Veblen characterized, with his distinctive clinical detachment, as a derangement — a state in which the drive persists but has been denied its proper object, and in which the energy of the drive, unable to find productive expression, manifests instead as restlessness, dissatisfaction, and the vague but persistent sense that something essential has been taken away. The derangement was not dramatic. It did not present as the acute suffering of starvation or injury. It presented as the chronic, low-grade malaise of a creature whose environment no longer provides the stimulation that its instincts require — a caged animal that paces not because it is in pain but because it cannot do the thing its organism was built to do.

The economic historians would object, with considerable justification, that the material conditions of factory workers were frequently worse than chronic malaise — that the factories produced genuine suffering, physical and economic, that the wage compression and the working conditions and the child labor constituted damage of a severity that requires no psychological theory to explain. The objection is valid. But Veblen's analysis is not an alternative to the economic analysis. It is an addition. The economic damage was one dimension of the displacement. The psychological damage — the frustration of the instinct of workmanship — was another dimension, operating simultaneously and independently. A factory worker who was adequately compensated, adequately housed, and adequately fed could still experience the derangement of the instinct if his work denied him the opportunity to exercise workmanship. The material conditions could be ameliorated. The instinct required something that no amelioration of material conditions could provide: the exercise of skill.

This point bears emphasis because it is precisely the point that the contemporary AI discourse most consistently misses. The standard reassurance offered to workers facing AI displacement — "new jobs will emerge," "you will be retrained for higher-level work," "the economy will adapt" — addresses the economic dimension of the displacement exclusively. It assumes, without examining the assumption, that a worker who is employed, compensated, and engaged in "higher-level" work is a worker whose needs have been met. Veblen's framework identifies the assumption as false. Employment addresses the economic need. Compensation addresses the material need. "Higher-level" work may or may not address the need for workmanship, depending entirely on whether the higher level provides an adequate outlet for the exercise of skilled, engaged, hands-on competence.

The historical record suggests that the new forms of work produced by the industrial revolution did not, for the most part, provide such an outlet — not immediately, and not automatically. The factory operative's work was, from the standpoint of the instinct, impoverished relative to the artisan's work, regardless of its economic returns. The emergence of new forms of workmanship — the machinist's craft, the engineer's expertise, the foreman's organizational skill — required decades of institutional evolution, during which the instinct was, for millions of workers, chronically frustrated.

The mechanisms through which the frustration was eventually addressed are worth examining, because they will recur, in altered form, in the AI age. The first mechanism was the displacement of workmanship from the economic to the domestic sphere: the working-class garden, the home workshop, the amateur craft. Workers whose instinct of workmanship was denied adequate expression in the factory found outlets in the careful tending of allotments, the construction of furniture for the home, the repair and maintenance of household objects. These activities were economically marginal — they contributed little to the household income and nothing to the gross domestic product. But they were psychologically essential, providing the exercise of skill and the satisfaction of competent production that the factory denied.

The second mechanism was the gradual emergence, within the industrial system itself, of new forms of skilled work that the machine process created even as it destroyed the old ones. The machine required maintenance. The maintenance required knowledge of a different kind than the artisan's knowledge, but knowledge nonetheless — knowledge that could be developed, refined, transmitted, and exercised with the care and competence that the instinct requires. The machinist, the toolmaker, the maintenance engineer — each represented a new outlet for workmanship within the industrial order, an outlet that had not existed before the machine arrived and could not have been predicted from the pre-industrial vantage point.

The third mechanism — and the one that Veblen himself would have considered most structurally significant — was the labor movement's insistence on conditions of work that, whatever their economic rationale, also served to protect the possibility of workmanship. The eight-hour day was an economic demand. It was also a demand for time — time in which the worker could exercise workmanship outside the factory, in the garden or the workshop or the community. The regulation of working conditions was a safety demand. It was also, in its effect if not its explicit intention, a demand for the kind of work environment in which attention and care were possible.

Veblen, it should be noted, was not sentimental about the artisan's world that the machine process displaced. He did not idealize pre-industrial production. He recognized that the artisan's life was hard, that the guild system was frequently corrupt, that the quality of pre-industrial goods was uneven, and that the machine process represented, in aggregate, an enormous expansion of productive capability. His point was not that the old world was better. His point was that the new world, for all its productive superiority, had created a specific form of psychological damage that the old world had not — and that the damage would persist, producing its characteristic derangement, until institutional structures emerged that provided new outlets for the frustrated instinct.

The parallel to the AI moment is structural, not merely analogical. The machine process transferred skilled manual labor from human hands to mechanical operations. The AI process transfers skilled cognitive labor from human minds to computational operations. The mechanism is identical. The material is different. And the question — whether the frustrated instinct will find new outlets, and how long the frustration will persist before the outlets emerge — is the same question, asked at a different level of the productive hierarchy, two centuries later.

---

Chapter 3: The Cognitive Loom

The senior developer described in The Orange Pill who spent his first two days in Trivandrum oscillating between excitement and terror was experiencing, in Veblen's framework, the simultaneous stimulation and frustration of a single instinct through a single instrument. The excitement was genuine: the tool expanded his capability. He could attempt projects he would not previously have attempted, reach into domains that had been beyond his individual capacity, move from idea to working prototype at a speed that recalibrated his sense of what was possible. The terror was equally genuine, though less immediately legible: the tool, in the act of expanding his capability, had removed from the production process the specific operations in which his workmanship was most fully exercised — the debugging, the architectural deliberation, the iterative refinement of code through cycles of failure and correction that had, over two decades, deposited the layers of embodied understanding upon which his professional identity and his instinctive satisfaction both rested.

The oscillation between excitement and terror is, upon examination, the predictable behavior of an organism whose instinct of workmanship is being simultaneously amplified and denied. Amplified, because the tool places extraordinary productive power at the disposal of the worker's judgment. Denied, because the productive power operates by removing the worker from the production process — by automating precisely those operations in which skill, care, and engaged competence were most fully expressed.

The analogy to the machine process is exact in its structure, though different in its material. The power loom automated weaving — the physical operation in which the weaver's hands, posture, rhythm, and responsive attention constituted a form of embodied knowledge that was both productive and satisfying. The language model automates coding — the cognitive operation in which the developer's architectural intuition, debugging instinct, and iterative refinement constituted a form of embodied knowledge (embodied in the mind rather than the hands, but no less dependent on years of physical practice at a keyboard) that was both productive and satisfying.

In both cases, the automation does not eliminate the worker. It repositions the worker. The weaver becomes a machine-tender. The developer becomes what the contemporary discourse calls a "creative director" — a person whose role is to specify, evaluate, and refine the output of a system that does the producing. The repositioning is described, by those who advocate for it, as an elevation: the worker is freed from mechanical labor to engage in higher-order cognitive work. Veblen's framework identifies the description as precisely half-true, and the half that is missing is the half that matters most.

The elevation is real in one dimension. The developer who no longer writes boilerplate code, no longer resolves dependency conflicts, no longer spends hours tracing the source of a null pointer exception, is freed from operations that were, by the standards of the instinct, among the least satisfying components of the work. The boilerplate was tedious. The dependency conflicts were mechanical. The null pointer tracing was, for the most part, drudgery. The instinct of workmanship does not require drudgery for its satisfaction. It requires the exercise of competence, and competence can be exercised at many levels — including the level of direction, judgment, and architectural vision that the AI tools are said to elevate the developer toward.

But the elevation is false — or rather, partial in a way that the discourse obscures — in another dimension. Mixed into the tedium, inseparable from it, were the moments that built the developer's deepest understanding. The null pointer exception that, in the course of its resolution, revealed an unexpected interaction between two subsystems and thereby deposited a layer of architectural intuition that no documentation could have provided. The dependency conflict that, in the course of its untangling, illuminated the actual structure of the codebase in a way that weeks of reading the code could not have achieved. The boilerplate that, in the thousandth repetition, produced a pattern recognition so refined that the developer could sense structural problems before they manifested — the diagnostic intuition that the author of The Orange Pill describes as feeling "the way a doctor feels a pulse."

These moments were rare. They constituted perhaps ten minutes in a four-hour block of otherwise mechanical labor. But they were the moments in which workmanship, in its fullest sense, was exercised — the moments in which the developer was not merely completing a task but developing a competence, refining an understanding, exercising the instinct in the specific way that produces its deepest satisfactions.

The AI removes both the tedium and the ten minutes. It cannot remove one without removing the other, because they are not separate operations performed in sequence. They are aspects of a single process — the process of engaged production — in which the tedium provides the raw material from which the moments of deeper understanding are precipitated. The prospector does not choose which hours of sifting will yield gold. The sifting is the condition of the finding.

Veblen would recognize in this dynamic a pattern he observed repeatedly in the industrial transition: the impossibility of separating the productive from the formative dimensions of skilled work. The weaver's tedious operations — the repetitive monitoring of tension, the mechanical correction of minor deviations — were also the operations through which the weaver's responsive sensitivity to the behavior of the thread was maintained and refined. Remove the tedium, and you remove the conditions under which the skill was exercised and developed. The skill does not persist in the absence of its exercise. It atrophies. And the atrophy is not merely a loss of technical capability. It is a loss of the occasion for workmanship — the removal of the context in which the instinct could find its expression.

The author of The Orange Pill describes one of his engineers in Trivandrum who built a complete user-facing feature in two days — a feature she had never previously had the skills to build, because her expertise was in backend systems and she had never written frontend code. The AI bridged the gap. She described what the interface should feel like. The tool translated her description into code she had never learned to write. The result was genuine, functional, and useful. The boundary between what she could imagine and what she could build had moved so far that, in Segal's words, "her job description changed in a week."

Veblen's framework does not deny the reality of this expansion. It asks a question the expansion's celebrants tend not to ask: what kind of workmanship was exercised in the process? The engineer exercised judgment — the capacity to envision, specify, evaluate. She exercised taste — the sense of what the interface should feel like, which is a real and valuable form of knowledge. She exercised directorial competence — the ability to guide a tool toward a result that matched her vision.

But she did not exercise the workmanship of production. She did not write the code. She did not debug it. She did not experience the iterative cycle of failure and correction through which the code's logic becomes legible not as an abstraction but as a felt reality — the kind of understanding that, once acquired, transforms not merely one's ability to produce but one's capacity to see what one is producing. The feature was built. The workmanship of building it was exercised by the machine. The engineer's contribution was essential — without her vision, her judgment, her specification, the machine would have produced nothing — but the contribution was not workmanship in the sense that Veblen described. It was direction. And direction, however valuable, does not satisfy the same instinct as production.

This distinction — between the satisfaction of direction and the satisfaction of production — is the crux of the disagreement between Veblen's framework and the ascending-friction thesis that The Orange Pill advances. The ascending-friction thesis holds that the removal of friction at one level relocates it to a higher level, and that the higher-level friction is harder, more demanding, and ultimately more satisfying than the friction it replaced. The laparoscopic surgeon, freed from the tactile friction of open surgery, encounters the cognitive friction of operating through a two-dimensional image of a three-dimensional space — friction that is harder at a higher level. The developer, freed from the friction of implementation, encounters the friction of vision, architecture, and product judgment — friction that demands a different and arguably more comprehensive form of competence.

Veblen would not deny that the higher-level friction is real, or that it demands genuine skill, or that the exercise of skill at the higher level can produce genuine satisfaction. He would observe, however, that the satisfaction is of a different kind than the satisfaction of hands-on production, and that the difference is not incidental but structural. The instinct of workmanship evolved in the context of direct, physical, engaged production — the making of things through the coordinated exercise of hand, mind, and material. The instinct is not infinitely plastic. It does not respond with equal satisfaction to every form of skilled activity. It responds most fully to the form of activity for which it evolved: the engaged, embodied production of something through one's own skill and effort.

The developer who directs AI to produce code exercises genuine skill. She may exercise harder skill than the developer who writes code by hand. But the quality of the engagement is different. The feedback loop between intention and result passes through an intermediary — the model — and the intermediary absorbs precisely the operations in which the developer's workmanship was most fully engaged. The developer specifies. The machine produces. The developer evaluates. The cycle repeats. At no point does the developer experience the specific, irreplaceable satisfaction of making the thing herself — the satisfaction that the carpenter experiences when she runs her hand along the joint, the satisfaction that is at once the confirmation of competence and the exercise of the instinct that makes competence worth having.

The cognitive loom has arrived. It produces code that functions, designs that cohere, analyses that hold together. The cognitive worker, repositioned from producer to director, exercises real and valuable judgment. But the instinct of workmanship — calibrated by evolution for the engaged production of things through skilled effort — observes the machine's output with the specific, inarticulate unease of a drive that has been rendered economically unnecessary while remaining psychologically essential.

The question is not whether the cognitive loom produces adequate output. The question Veblen forces, with the persistence of a diagnostician who will not accept the patient's reassurance that everything is fine, is whether the output's adequacy is sufficient to address the worker's need — or whether the need is for something the output, however adequate, cannot provide: the experience of having produced it oneself.

---

Chapter 4: Conspicuous Computation

In December of 2025, a Google principal engineer sat down with Claude Code, described a problem her team had spent a year trying to solve, and received a working prototype in one hour. She posted about the experience publicly. "I am not joking," she wrote, "and this isn't funny." The post was widely circulated, widely discussed, and widely interpreted as evidence of AI's extraordinary capability — which it was.

Veblen would have observed that the post was also something else. It was a display.

The concept of conspicuous consumption, which Veblen introduced in The Theory of the Leisure Class in 1899, describes a pattern of behavior that the contemporary reader may believe she understands but almost certainly does not, because the concept has been simplified, in its century of popular usage, into a caricature of its original form. Conspicuous consumption, in common parlance, means buying expensive things to show off — the gold watch, the luxury automobile, the designer handbag. This is not wrong, but it is shallow. Veblen's analysis was not about expensive things. It was about the social function of display — the mechanism through which individuals establish, maintain, and communicate their position in a status hierarchy by demonstrating their command of resources, their proximity to the sources of economic power, and their distance from the necessity of productive labor.

The mechanism is not confined to the consumption of physical goods. It operates wherever display serves a social-sorting function — wherever the visible exercise or possession of some quality signals membership in a desirable class and distance from an undesirable one. Veblen identified conspicuous leisure (the display of time freed from productive labor), conspicuous waste (the display of resources expended without productive return), and what might be called, extending his framework to the present case, conspicuous capability — the display of productive power, technological fluency, and proximity to the frontier of the tools that are reshaping the economic order.

The AI adoption discourse of 2025 and 2026 exhibits the characteristics of conspicuous capability with a transparency that would have afforded Veblen, had he lived to observe it, considerable analytic satisfaction. The posts, the tweets, the conference presentations, the Substack essays — each is, in its way, a display. The developer who reports shipping a product in a weekend is displaying not merely a product but a relationship to a tool — a relationship that signals technological sophistication, professional currency, and membership in the class of builders who have crossed the threshold and are now operating in the new paradigm. The metrics — lines generated, commits shipped, hours saved, products launched — serve the function that the gold watch served in Veblen's era: they quantify the display, giving it a precision that makes comparison possible and rank-ordering inevitable.

The concept Veblen designated as pecuniary emulation — the tendency of each class to imitate the consumption patterns of the class immediately above it, producing a cascade of competitive display that extends from the top to the bottom of the social hierarchy — operates in the AI economy with a velocity that the original theory, developed in the context of physical goods and institutional leisure, could not have anticipated. The senior engineer posts about her Claude Code experience. The mid-career developer, observing the post, acquires the tool and begins posting about his own experiences. The junior developer, observing both, acquires the tool and begins posting about the outputs it enables, which are, in terms of visible product, indistinguishable from the outputs of her more experienced colleagues — a phenomenon that is, from the standpoint of democratization, a genuine advance, and from the standpoint of pecuniary emulation, a crisis for the status hierarchy that previously rewarded experience and depth.

The cascade of emulation produces a phenomenon that Veblen analyzed in the context of fashion but that applies with equal force to the AI tools: the rapid cycle of adoption, normalization, and escalation. The tool that signals sophistication today becomes the baseline tomorrow. The capability that distinguishes the early adopter in December becomes the expectation for the professional class by March. Each new model release, each new capability announcement, resets the cycle — producing a treadmill of display that bears a structural resemblance to the fashion cycle Veblen described, in which the primary function of each season's new style is to render last season's style obsolete and thereby force the fashion-conscious consumer to consume again.

Veblen's own words, from The Instinct of Workmanship, anticipate this dynamic with startling precision: "each new expedient added to and incorporated in the system offers not only a new means of keeping up with the run of things at an accelerated pace but also a new chance of getting left out of the running...any technological advantage gained by one competitor forthwith becomes a necessity to all the rest, on pain of defeat." The passage was written in 1914, about the technologies of Veblen's own era — the typewriter, the telephone, the mechanical systems of industrial production. Its applicability to the AI moment requires no translation whatsoever.

What the treadmill of conspicuous capability produces, at the level of the individual, is what the author of The Orange Pill describes as the inability to stop. The developer who cannot close the laptop. The entrepreneur who has not taken a day off. The builder who works until the exhilaration drains away and what remains is "the grinding compulsion of a person who has confused productivity with aliveness." The standard interpretation of this behavior, offered by both the triumphalists (who read it as flow) and the critics (who read it as auto-exploitation), focuses on the individual's internal state. Veblen's framework shifts the focus from the internal state to the social structure. The individual cannot stop not merely because the tool is stimulating (it is) or because the individual lacks self-regulation (she may), but because the social environment has established a standard of display that requires continuous demonstration of capability, continuous evidence of output, continuous proof that one is keeping pace with the accelerating frontier.

The individual who stops — who closes the laptop, who takes the weekend, who refuses to post about her latest output — risks not merely falling behind in the competitive race but falling out of the display class entirely. The display is not optional. It is the mechanism through which professional identity is maintained in an economy where the tools have equalized everyone's productive capacity and where the only remaining differentiator is one's visible relationship to the tools.

This analysis illuminates a phenomenon that the celebratory discourse tends to classify as an unqualified good and the critical discourse tends to classify as an unqualified pathology: the metrics culture. The quantification of AI-augmented output — lines of code generated, applications shipped, revenue earned, hours of labor saved — serves, in Veblen's framework, the same function that the price tag serves in conspicuous consumption. The metric is not primarily a measure of value. It is a medium of display. The number communicates not what was produced but how it was produced — with what tool, at what speed, with what efficiency, and therefore from what position in the status hierarchy of the AI-augmented economy.

The invidious comparison — Veblen's term for the social dynamic in which individuals evaluate their worth by measuring themselves against others — operates through these metrics with a ruthless efficiency. The developer who ships a product in a weekend and posts the timeline is not merely reporting. He is establishing a benchmark against which other developers' timelines will be measured. The engineer who achieves a twenty-fold productivity multiplier and publishes the figure is not merely documenting. She is creating a standard of display that her peers must now match or explain away.

The comparison is invidious because it compresses a complex, multi-dimensional activity into a single axis of evaluation. The code that was shipped in a weekend may be brittle, poorly architected, and difficult to maintain. The twenty-fold multiplier may have been achieved by sacrificing the iterative refinement and careful testing that distinguish a prototype from a product. These qualifications are invisible in the display. The display communicates speed, volume, efficiency — the metrics that the conspicuous-capability culture rewards. The qualities that the instinct of workmanship values — care, depth, refinement, the slow accumulation of understanding through engaged production — are not merely unrewarded by the display. They are penalized by it, because they slow the output, reduce the visible metrics, and signal a relationship to the tools that is, by the standards of the emulation cycle, insufficiently aggressive.

The result is a cultural environment in which the exercise of workmanship and the performance of conspicuous capability are not merely different activities but opposed activities. The developer who takes time to understand the code AI has generated — who reads it, traces its logic, identifies its assumptions, refines its architecture until it meets not merely the functional specification but her own standards of quality — produces less visible output than the developer who accepts the code, ships it, and moves on. The first developer exercises workmanship. The second exercises display. The market rewards the second.

In The Theory of the Leisure Class, Veblen observed that the leisure class establishes the norms of taste and conduct for the society as a whole, and that these norms are adopted, through the mechanism of pecuniary emulation, by each successive class below, even when the adoption is economically irrational and personally damaging. The AI economy has produced its own leisure class — not a class of idlers, but a class of displayers — and the norms of this class are being adopted, with the velocity that digital communication permits, throughout the professional hierarchy. The norm is speed. The norm is output. The norm is the conspicuous demonstration of AI-augmented capability, posted publicly, measured quantitatively, and evaluated by comparison with the equally public, equally quantified, equally AI-augmented output of one's peers.

The instinct of workmanship, which operates on a different timescale and values a different set of qualities, is not merely underserved by this cultural environment. It is actively subverted by it. The environment rewards the behaviors that frustrate the instinct and penalizes the behaviors that satisfy it. The developer who cares about quality — who takes the time to understand, refine, test, and improve — falls behind in the display race. The developer who cares about speed — who ships fast, posts often, and measures success by the metrics the culture counts — keeps pace with the display but loses touch with the workmanship that gives the work its human significance.

Veblen would observe — and here the deadpan clinical tone serves a purpose, because the observation is devastating and benefits from the restraint of understatement — that this is not a new problem. The tension between conspicuous display and genuine workmanship is as old as the leisure class itself. The medieval nobleman who displayed his wealth through conspicuous leisure — feasting, hunting, the maintenance of a retinue of servants whose sole function was to be visible — was already enacting the pattern in which display crowds out production and the appearance of capability substitutes for its exercise. What is new in the AI economy is the completeness of the substitution. The medieval nobleman could not actually do the work his servants did. The AI-augmented professional can do the work the AI does — or at least, could once have done it, before the atrophy of the skills the tool replaced made the capacity theoretical rather than practical.

The atrophy is the mechanism through which conspicuous capability converts from a display into a dependency. The developer who accepts AI output without reading it, who ships without understanding, who moves fast because moving fast is what the display requires, gradually loses the capacity to produce without the tool. The capacity was real. It was hard-won. It was the product of years of engaged workmanship. And it atrophies, as any capacity atrophies when its exercise is discontinued, not through a dramatic loss but through the slow erosion that comes from disuse — the forgetting that happens not when knowledge is actively abandoned but when it is simply no longer practiced.

Veblen, who viewed the instinctive drive to produce better technology as a blind force that was, in the late phase of his thinking, "making the world less human," would recognize in the conspicuous-capability culture of the AI economy the specific mechanism of dehumanization he described: not the destruction of human capability, but the institutional creation of conditions in which the capability, though it persists as a biological endowment, is systematically denied the opportunity for its exercise — denied not by prohibition but by the subtler and more effective mechanism of making its exercise economically irrational and socially costly.

The carpenter, it will be recalled, runs her hand along the joint. The gesture is economically irrational. The instrument has confirmed the joint's adequacy. The hand adds nothing to the measurement.

In the conspicuous-capability culture, the gesture would not merely be irrational. It would be costly — a visible deviation from the norm of speed, a display of care that the display economy reads as inefficiency, a signal that the carpenter has not yet internalized the principle that the tool's adequacy is sufficient and that what the hand adds — the satisfaction of the instinct, the confirmation of competence, the completion of the circuit between intention and material — is, by the standards that now govern the evaluation of productive activity, waste.

---

Chapter 5: The Leisure Class of AI

The leisure class, as Veblen analyzed it, does not labor. This is not an incidental feature of its social position but its defining characteristic — the characteristic from which all other features of the class derive their meaning and their function. The leisure class demonstrates its superiority to the productive classes not through superior production but through conspicuous abstention from production, and the abstention is valorized precisely because production is, in the status hierarchy that the leisure class establishes and maintains, associated with necessity, with subordination, with the condition of those who must work because they lack the resources to refrain from working. The gentleman does not soil his hands. The noblewoman does not cook her own meals. The captain of industry does not operate his own machines. In each case, the distance from productive labor is the measure of social standing, and the display of that distance — through leisure, through the maintenance of servants, through the consumption of goods whose primary function is to signal the consumer's freedom from the necessity of producing them — is the mechanism through which standing is communicated, maintained, and reproduced.

The AI economy has produced a class structure that replicates the essential features of Veblen's leisure class while disguising the replication beneath a vocabulary of meritocracy, creativity, and visionary leadership. The disguise is effective. The participants in the new class structure do not recognize themselves in Veblen's descriptions, because the descriptions were written about a world of visible idleness and conspicuous waste, and the new leisure class is neither idle nor visibly wasteful. It is, on the contrary, extraordinarily busy — busy directing, evaluating, specifying, envisioning, curating, and otherwise exercising the forms of engagement that the AI economy designates as "higher-level" work. But the busyness, examined through Veblen's framework, serves the same structural function as the old leisure: it establishes distance from production.

The class structure of the AI economy, as it was emerging in the period documented by The Orange Pill, can be analyzed through Veblen's categories with a precision that neither the participants nor the commentators appear to have achieved.

At the apex of the hierarchy are the owners of computational infrastructure — the companies that control the models, the training data, the inference capacity, and the platforms through which AI capability is distributed. Their position is structurally identical to the position Veblen described for the absentee owners in Absentee Ownership and Business Enterprise in Recent Times: they derive their income not from productive contribution but from the ownership of capital assets upon which productive workers depend. The barbarian chieftain extracted tribute from the productive members of his community by virtue of his control over the means of coercion. The feudal lord extracted rent from the peasants by virtue of his control over the land. The industrial capitalist extracted profit from the factory workers by virtue of his control over the machinery. The AI platform owner extracts subscription fees, usage charges, and behavioral data from cognitive workers by virtue of his control over the computational infrastructure that those workers require in order to exercise their augmented capability.

The mechanism has evolved. The extraction has become more sophisticated. The relationship between owner and worker has been mediated through layers of interface design, pricing architecture, and terms of service that render the extraction nearly invisible to the worker who experiences it. But the structure — the derivation of income from ownership rather than production, the extraction of value from the productive labor of others — has not changed in any respect that Veblen would consider fundamental. The platform owner is the absentee owner of the AI age, and his absenteeism is, if anything, more complete than the industrial capitalist's, because the platform owner need never visit the factory, need never encounter the workers, need never observe the productive process from which his income is derived. The extraction is automated. The relationship is algorithmic. The distance from production is total.

Below the owners, and dependent upon them, is the class that the author of The Orange Pill designates as the creative directors — the taste-makers, the judgment-exercisers, the people whose value in the new economy lies not in producing but in deciding what should be produced. This class occupies, in Veblen's framework, a position that is genuinely intermediate and genuinely unstable. Its members exercise real skill. Their judgment is not trivial. The capacity to envision a product, to evaluate AI output, to make the thousand decisions that separate a working prototype from a thing that serves human need — this capacity is scarce, valuable, and the product of years of experience.

But the class is intermediate because its skill, however genuine, is exercised not through production but through direction — and direction, in Veblen's analysis, is the characteristic activity of the leisure class, not the productive class. The gentleman does not produce. He directs the production of others. The creative director does not code. She directs the coding of a machine. The structural parallel is not a rhetorical provocation. It is an analytical observation. The creative director's distance from the production process — her engagement with the specification and evaluation of output rather than its creation — places her, in the class structure of the AI economy, in a position that is functionally analogous to the manager's position in the industrial economy: above the production floor, dependent on the tools that the owners provide, exercising a form of competence that is genuine but that derives its economic value from its relationship to the means of production rather than from the production itself.

Veblen's analysis of the managerial class in the industrial economy identified a specific risk: the progressive disconnection of direction from the understanding of what is being directed. The industrial manager who had risen from the production floor retained, for a time, the worker's understanding of the productive process — the embodied knowledge of materials, machines, and the thousand small realities that determine whether a product functions or fails. The manager who had never worked the floor — who had been trained in management as a discipline separate from production — lacked this understanding, and his direction, however sophisticated in its managerial technique, was progressively less informed by the realities of the process he directed.

The AI economy is producing the same disconnection at accelerated speed. The creative director who once wrote code retains, for a time, the developer's understanding of how code works — the architectural intuition, the debugging instinct, the felt sense of where a system is likely to break. The creative director who has never written code — who has been trained in direction from the outset, whose relationship to the productive process has always been mediated by the tool — lacks this understanding. Her direction may be visionary. Her taste may be refined. Her judgment may be, by the standards the market currently rewards, excellent. But her direction is not informed by the embodied knowledge of production, because she has never produced. She has only directed.

The risk, in Veblen's framework, is that the creative class becomes parasitic — deriving its status from the ability to direct production while being progressively disconnected from the production process itself. The disconnection is not immediately visible. It manifests over time, as the creative directors who once produced age out of the profession and are replaced by creative directors who never did. The first generation directs with the authority of experience. The second generation directs with the authority of position. The quality of direction changes — not catastrophically, not immediately, but in the slow, cumulative way that a river erodes a bank: imperceptibly in any given hour, unmistakably over years.

Below the creative directors, and bearing the most direct cost of the transition, are the cognitive workers whose productive role has been automated — the developers, designers, analysts, and writers whose skilled labor the AI has learned to approximate at speed. These workers occupy, in Veblen's framework, the position that the displaced artisan class occupied in the industrial transition: possessing skill that the market no longer rewards at its previous rate, experiencing the frustration of a drive — the instinct of workmanship — that persists without an adequate outlet, and confronting the specific psychological damage of watching their competence exercised, with apparent adequacy, by a machine.

The author of The Orange Pill documents two responses among this class. Some are "running for the hills" — retreating from the profession, lowering their cost of living, preparing for what they perceive as the elimination of their livelihood. Others are "leaning in for the fight" — adopting the tools, seeking to redefine their role, attempting to ascend from production to direction. Veblen would recognize both responses as rational adaptations to an irrational situation. The flight is rational because the economic pressure is real. The fight is rational because the instinct of workmanship demands an outlet and will seek one wherever it can be found. But neither response addresses the structural problem: the class that bears the cost of the transition has no institutional mechanism through which to influence the terms of the transition.

The leisure class — the platform owners — sets the terms. The creative class — the directors — operates within the terms. The displaced class — the producers — adapts to the terms or is eliminated by them. The distribution of agency mirrors, with remarkable fidelity, the distribution Veblen described in the industrial economy: the owners decide, the managers implement, the workers comply. The vocabulary has changed. The power structure has not.

Veblen's most structurally radical observation about the leisure class was that its members, despite their distance from production, sincerely believed themselves to be producers. The captain of industry believed he created wealth. The financier believed he allocated capital efficiently. The landlord believed he improved the land. Each adopted a vocabulary of productivity — investment, enterprise, innovation — that obscured the extractive character of his economic function. The vocabulary was not consciously deceptive. It was the natural expression of a class whose position required it to believe in its own productivity, because the alternative — the recognition that its income was derived from ownership rather than contribution — would undermine the moral legitimacy upon which its social position depended.

The AI economy's ownership class has adopted an equivalent vocabulary. The platform is described as democratizing capability, expanding access, empowering creators. The subscription fee is described as an investment in productivity. The extraction of behavioral data is described as the improvement of the service. The vocabulary is not insincere. The people who employ it believe what they are saying. The democratization is real. The expanded access is real. The improved service is real. But the vocabulary, precisely because it is not insincere, is more effective as a mechanism of legitimation than a deliberately deceptive vocabulary would be. It describes the genuine benefits of the platform while rendering invisible the extractive structure that the benefits serve.

Veblen would observe, with the clinical equanimity that characterized his most structurally devastating analyses, that the class structure of the AI economy is not an aberration. It is the predictable consequence of a technological transition managed through market mechanisms in the absence of institutional counterpressure. The owners capture the gains. The directors are compensated sufficiently to align their interests with the owners'. The producers bear the cost. The vocabulary of the transition — democratization, empowerment, the elevation of human capability — serves the function that the vocabulary of enterprise and investment served in the industrial economy: it describes real phenomena in a way that obscures the distributional question, which is the only question that determines whether the transition produces broad flourishing or concentrated extraction.

The question, then, is not whether the AI economy has a leisure class. It manifestly does. The question is whether the leisure class of the AI economy will follow the trajectory of the industrial leisure class — extracting progressively more, producing progressively less, establishing norms of conspicuous capability that subvert the instinct of workmanship and subordinate production to display — or whether institutional structures can be built, in time, to redirect the gains toward the broader population and to protect, within the new economic order, the conditions under which the instinct of workmanship can find its exercise.

The historical record on this question is, it must be acknowledged, not encouraging. The industrial leisure class was not reformed by moral persuasion. It was constrained by institutional counterpressure — labor movements, regulatory frameworks, the slow accumulation of political power by the productive classes. The counterpressure took generations to build. The damage done in the interim was real, extensive, and in many cases irreversible. The question for the AI economy is whether the counterpressure can be built faster — whether the institutions of the democratic state, the labor movement, and civil society can respond to a transition that is occurring at a speed that makes the industrial revolution look, by comparison, leisurely.

Veblen, who was not temperamentally inclined toward optimism, would have noted that the speed of the transition works against the counterpressure. The industrial leisure class established itself over decades, allowing the productive classes time — insufficient time, but time nonetheless — to organize, to develop institutional capacity, to build the political structures through which their interests could be articulated and defended. The AI leisure class is establishing itself in years, and the productive classes have not yet developed the vocabulary, much less the institutional structures, to describe what is happening to them.

The vocabulary matters, because the vocabulary determines what can be thought and therefore what can be opposed. The displaced developer who describes her experience as "falling behind" has accepted the leisure class's framing — the framing in which the frontier is the measure of value and distance from the frontier is the measure of failure. The displaced developer who describes her experience as "the frustration of the instinct of workmanship" has a different framing — one that identifies not a personal failure but a structural condition, and that opens the possibility of a structural response.

Veblen provided the vocabulary. Whether the productive classes of the AI economy will find it, adopt it, and use it as the basis for the institutional counterpressure that the moment requires is a question that the analysis cannot answer. It can only identify, with the precision that the analysis permits, the stakes of the failure to do so.

---

Chapter 6: Predatory and Industrial

Veblen organized the whole of human economic behavior into two categories, and the categories were not, as the casual reader might suppose, production and consumption, or capital and labor, or supply and demand. The categories were habits of thought — patterns of orientation toward the world that determine, prior to any specific economic calculation, what kind of relationship an individual or a class establishes with the productive process. He called them predatory habits and industrial habits, and the distinction between them structures not merely his economic theory but his entire understanding of human civilization, its achievements, and its pathologies.

Industrial habits of thought are oriented toward production — toward the making of things, the exercise of skill, the cooperative organization of effort in the service of material outcomes. The person who operates within industrial habits asks: How can this be made? How can the process be improved? What does the material require? The orientation is toward the object — the thing being produced — and the satisfaction is derived from the competence of the production. The instinct of workmanship is the psychological foundation of industrial habits. The state of the industrial arts — the accumulated body of technical knowledge available to the community — is their collective expression. The engineer, the craftsman, the skilled worker of any kind, insofar as they are engaged in the exercise of productive competence rather than the pursuit of status or gain, operate within industrial habits of thought.

Predatory habits of thought are oriented toward acquisition — toward the capture of wealth, status, or advantage through means that do not involve direct contribution to the productive process. The person who operates within predatory habits asks: How can this be taken? How can this advantage be exploited? What position allows the extraction of value from the productive efforts of others? The orientation is not toward the object but toward the rival — the competitor whose loss is the predator's gain. The satisfaction is derived not from competence but from dominance, not from making but from having, not from the quality of the work but from the magnitude of the capture.

Veblen traced these habits to the evolutionary conditions under which they developed. Industrial habits, he argued, were the older of the two — the product of the long "savage" period during which human survival depended primarily on productive cooperation: the making of tools, the gathering and preparation of food, the construction of shelter. Predatory habits emerged later, in the "barbarian" period, when the accumulation of surplus made it possible for some individuals to subsist not on their own productive effort but on the productive effort of others, captured through force or cunning.

The distinction is not between good people and bad people. It is between two orientations that coexist, in varying proportions, within every individual and every institution. The same person may exercise industrial habits in her workshop and predatory habits in her business negotiations. The same corporation may produce genuine value through its engineering division and extract unearned rent through its licensing agreements. The habits are not fixed. They are cultivated by the institutional environment — rewarded or punished, encouraged or suppressed, by the structures within which economic life is conducted.

The AI economy creates institutional conditions that cultivate predatory habits with particular intensity, and the mechanism through which it does so deserves examination, because the mechanism is not immediately visible and its effects are frequently mistaken for their opposite.

The builder who uses AI to create genuine value — to solve a real problem, to produce a product that serves human need, to expand the state of the industrial arts — exercises industrial habits. The author of The Orange Pill describes this with conviction and from personal experience: the engineer in Trivandrum whose capability expanded twenty-fold, the product that went from imagination to working prototype in thirty days, the developer in Lagos whose ideas now have a path from conception to reality. These are genuine exercises of industrial habit. The productive capacity has expanded. The material conditions of human life are, in these specific instances, improved by the exercise.

But the same tools, deployed within the same economy, simultaneously cultivate predatory habits on a scale that dwarfs the industrial exercises. The cultivation proceeds through several mechanisms, each of which Veblen's framework identifies with its characteristic clinical precision.

The first mechanism is the conversion of productive gains into extractive advantage. When a twenty-fold productivity multiplier is achieved, the gain can be distributed in two ways: it can be used to expand production — to make more, to serve more, to create more value — or it can be used to reduce labor costs while maintaining output. The first distribution exercises industrial habits. The second exercises predatory habits. The author of The Orange Pill documents this tension directly: the board conversation in which the arithmetic of headcount reduction is placed on the table, the quarterly pressure to convert productivity gains into margin rather than capability. Segal chose the industrial response — he kept and grew the team. But the structure of the economy — the quarterly reporting cycle, the investor expectations, the competitive pressure from organizations that chose the predatory response — makes the industrial choice a continuous act of resistance against a current that flows, naturally and powerfully, toward extraction.

The second mechanism is more subtle and more structurally significant. The AI tools themselves, by their design and their economic architecture, cultivate predatory habits in their users. The tool rewards speed. It rewards volume. It rewards the visible output that the conspicuous-capability culture measures and displays. It does not, by its design, reward care, depth, or the slow refinement of quality — the values that the instinct of workmanship produces and that industrial habits express. The user who adopts the tool and deploys it in accordance with the tool's implicit incentive structure — moving fast, shipping often, measuring success by the metrics the tool makes visible — is being trained, by the structure of the tool itself, in predatory habits: the habits of capture (capturing market position, capturing attention, capturing competitive advantage) rather than the habits of production (producing quality, producing understanding, producing genuine value through engaged effort).

The training is not deliberate. The tool designers did not set out to cultivate predatory habits. They set out to maximize utility, which in the context of a productivity tool means maximizing output, which in the context of a competitive economy means maximizing the visible metrics that determine market position. The cultivation of predatory habits is a structural consequence, not a deliberate intention — and it is, for precisely that reason, more difficult to identify and more difficult to resist than a deliberate cultivation would be. One can resist a person who tells you to be predatory. It is considerably more difficult to resist a tool that trains you in predatory habits while telling you it is making you more productive.

The third mechanism is the institutional tilt toward predatory habits that the AI economy produces at the organizational level. Veblen observed that the business enterprise, by its nature, is organized around predatory habits — around the acquisition of profit rather than the production of goods, around the capture of market position rather than the improvement of the productive process. The business enterprise uses the productive process as an instrument for the generation of profit, and the relationship between the enterprise and the process is, in Veblen's analysis, fundamentally instrumental: the process is valued not for what it produces but for what it earns. When the process can be made to earn more by reducing the quality of what it produces — through cost-cutting, through the substitution of cheaper inputs, through the acceleration of output at the expense of care — the enterprise will make that substitution, because the enterprise's orientation is toward the pecuniary outcome, not the material outcome.

AI amplifies this institutional tilt. The tool makes it possible to produce faster, cheaper, and at greater volume. These capabilities can be deployed in the service of industrial habits — making better things, serving more people, expanding the frontier of what is possible. They can also be deployed in the service of predatory habits — making adequate things faster and cheaper, capturing market position through speed rather than quality, using the productivity gains to extract more value from fewer workers. The institutional structure of the business enterprise — the quarterly cycle, the shareholder expectations, the competitive pressure — favors the predatory deployment, because the predatory deployment produces the financial metrics the institution is designed to optimize.

Veblen would not have been surprised by this outcome. He spent his career documenting the mechanisms through which the business enterprise subordinates production to profit, and his conclusion — stated with the deadpan understatement that served as his most effective rhetorical instrument — was that the interests of the business enterprise and the interests of the community are not merely different but systematically opposed. The enterprise profits by restricting output, by maintaining scarcity, by capturing gains that would otherwise be distributed to the community. The community benefits from expanded output, from abundance, from the broad distribution of the productive capacity that the state of the industrial arts makes possible.

The AI moment intensifies this opposition. The state of the industrial arts has expanded, suddenly and dramatically, to include capabilities that were previously the exclusive province of skilled human labor. The expansion could produce abundance — a world in which the capacity to make things is broadly distributed, in which the distance from imagination to artifact approaches zero, in which the instinct of workmanship finds new outlets in the expanded field of possibility. Or the expansion could produce a new form of scarcity — a world in which the capacity to make things is concentrated in the hands of the platform owners, in which the productivity gains are captured as profit rather than distributed as capability, in which the instinct of workmanship is frustrated by an institutional environment that rewards extraction and penalizes the slow, careful, engaged production that the instinct requires.

The outcome is not determined by the technology. It is determined by the institutions — by the habits of thought that the institutions cultivate, by the distribution of power that the institutions enforce, by the choices that the people within the institutions make about whether to exercise industrial habits or predatory ones.

The author of The Orange Pill draws a distinction between the builder who creates a habitat and what he calls the exploiter who extracts from the commons. Veblen would recognize the distinction as his own, expressed in a different vocabulary. The builder exercises industrial habits. The exploiter exercises predatory habits. The line between them is not a line between different people. It is a line that runs through every person, every organization, every institution — a line that separates the orientation toward making from the orientation toward taking, and that determines, in every specific case, which orientation prevails.

The AI tools do not choose which side of the line they serve. They amplify whatever orientation they are given. Industrial habits, amplified, produce extraordinary capability — the thirty-day product, the twenty-fold multiplier, the developer in Lagos whose ideas find their path to reality. Predatory habits, amplified, produce extraordinary extraction — the headcount reduction, the quality substitution, the platform monopoly that captures the gains of a technological revolution and distributes them to the owners of the infrastructure.

The question for the AI age — the question that Veblen's framework places at the center of the analysis, where the contemporary discourse tends to place the question of capability — is not what the tools can do. It is what habits of thought the tools are cultivating, what orientation toward the world the tools are training their users to adopt, and whether the institutional environment can be structured to reward the industrial orientation and constrain the predatory one.

The history of previous technological transitions suggests that the structuring is possible but not automatic, that it requires deliberate institutional construction, and that the construction is typically undertaken only after the damage of unstructured transition has become severe enough to produce political demand for the construction. The question, then, is whether the damage can be anticipated rather than merely suffered — whether the institutional structures can be built before the predatory habits have consolidated their position, rather than after.

Veblen, who observed the consolidation of the industrial leisure class with the detachment of a naturalist recording the behavior of a species whose habits he found both fascinating and destructive, would have regarded the question as genuinely open. The instinct of workmanship persists. The industrial habits persist. The human organism's disposition toward productive, engaged, cooperative effort does not disappear because the institutional environment discourages it. It is suppressed, frustrated, denied its proper expression — but it persists, and in persisting, it constitutes a permanent source of potential resistance to the predatory order.

Whether the potential becomes actual depends, as it always depends, on whether the productive classes can develop the institutional capacity to articulate their interests and defend them against the extractive logic of the owning class. The vocabulary of that articulation is available. The institutional structures through which it might be expressed are not.

The gap between vocabulary and structure is the space in which the AI economy's future will be determined.

---

Chapter 7: The State of the Industrial Arts

The state of the industrial arts is a concept of sufficient importance to Veblen's framework, and of sufficient relevance to the AI moment, that it warrants treatment on its own terms rather than as an appendage to the analysis of class dynamics or institutional incentives. Veblen devoted an entire book to the concept — The Instinct of Workmanship and the State of the Industrial Arts, published in 1914 — and the argument he developed there contains an insight that the contemporary technology discourse has largely failed to recognize, despite the fact that the AI economy is, in many respects, the purest instantiation of that insight in the history of economic life.

The insight is this: technical knowledge is social. It does not belong to individuals. It is not created by individuals. It is the accumulated product of collective human effort, contributed to by millions and owned by none. The developer who writes a sorting algorithm is drawing upon mathematical knowledge that stretches back to al-Khwarizmi. The designer who arranges elements on a screen is drawing upon perceptual principles discovered through centuries of artistic and scientific inquiry. The engineer who deploys a machine learning model is standing upon the work of statisticians, mathematicians, neuroscientists, and computer scientists whose collective contribution made the model possible. No individual created the knowledge. No individual owns it. It is, in Veblen's terminology, an inheritance — a commons of accumulated technical capability that belongs to the community and is advanced through the community's collective effort.

The insight is not merely philosophical. It has immediate economic implications. If technical knowledge is social — if it is produced by the community and belongs to the community — then the private appropriation of technical knowledge is, in Veblen's framework, a form of enclosure: the capture of a common resource for private gain. The patent that restricts the use of a technique developed through centuries of collective inquiry appropriates the community's inheritance for the benefit of an individual or a corporation. The trade secret that conceals a method derived from the state of the industrial arts — a method that would, in the absence of secrecy, contribute to the further advancement of that state — restricts the community's access to its own inheritance. The proprietary algorithm that was trained on the publicly available products of human intellectual effort and is now offered back to the community at a price is, in Veblen's terms, the enclosure of a commons — a commons that the algorithm's owners did not create, cannot claim exclusive credit for, and whose private appropriation serves the interests of the owning class at the expense of the community whose collective effort produced the resource.

The large language model is, in this framework, the most comprehensive instantiation of the state of the industrial arts that has ever been assembled. The model has been trained on a substantial portion of the recorded output of human intellectual effort — the books, the articles, the code, the conversations, the documentation, the creative works, the technical manuals, the accumulated expression of centuries of collective thought. The model does not merely access this knowledge. It instantiates it — compresses it into a system that can generate novel outputs from the patterns embedded in the collective input. The model is, in the most literal sense that the concept permits, the state of the industrial arts made operational — the entire body of human technical and intellectual knowledge, concentrated in a single instrument that any individual can query.

The concentration produces two consequences that operate simultaneously and in opposite directions, and the tension between them is the central structural tension of the AI economy.

The first consequence is democratization. When the state of the industrial arts is concentrated in an accessible instrument, the barriers to participation in the productive process are reduced. The developer in Lagos, whom The Orange Pill describes, gains access to knowledge that was previously gated behind institutional barriers — years of formal education, proximity to centers of technical expertise, membership in professional networks that transmitted tacit knowledge through personal contact. The instrument lowers the floor. It makes productive participation possible for people who were previously excluded not by lack of ability but by lack of access to the accumulated knowledge upon which ability depends. This is, by any reasonable standard, a good — a genuine expansion of human capability, a genuine reduction of an unjust inequality.

The second consequence is enclosure. When the state of the industrial arts is concentrated in an instrument, the instrument's owners acquire control over the community's inheritance. The training data — the raw material from which the model was built — was produced by the community. The mathematical techniques — the algorithms that structure the model's learning — were developed by researchers working within public institutions, publishing their findings in open journals, contributing to the state of the industrial arts in the way that Veblen described: as participants in a collective enterprise whose outputs belong, in principle, to the community. The computational infrastructure — the hardware on which the model runs — was manufactured through supply chains that draw upon the global state of the industrial arts, from the physics of semiconductor fabrication to the engineering of data center cooling systems.

The model, in other words, was built from the community's resources. The community's accumulated knowledge provided the training data. The community's accumulated science provided the algorithms. The community's accumulated engineering provided the hardware. And the model is now owned by a corporation — offered to the community that produced its inputs at a price, governed by terms of service that the community did not negotiate, controlled by a board of directors that the community did not elect.

Veblen would recognize this pattern. He documented it repeatedly across the industrial economy: the appropriation of the community's technical inheritance by private interests, the conversion of the commons into property, the extraction of rent from resources that the rent-collector did not produce. The mechanism is not new. The scale is unprecedented.

The unprecedented scale creates an unprecedented dilemma. In the industrial economy, the enclosure of technical knowledge was partial and distributed. No single entity controlled the entire state of the industrial arts. The knowledge was spread across firms, industries, nations, individuals — each possessing a fragment, none possessing the whole. The competition between possessors ensured, however imperfectly, that the knowledge remained in circulation, that no single point of control could throttle the community's access to its own inheritance.

The AI economy threatens to change this calculus. The large language model concentrates the state of the industrial arts in a single instrument — and the instrument is controlled by a small number of corporations whose market positions are reinforced by the extraordinary capital requirements of model training. The barriers to entry are not barriers of knowledge — the scientific principles underlying the models are public — but barriers of capital: the cost of the hardware, the cost of the data infrastructure, the cost of the engineering talent required to assemble and train a frontier model. The barriers are, in Veblen's framework, pecuniary rather than technical — they reflect not the state of the industrial arts but the state of the price system, and they serve not the productive interests of the community but the competitive interests of the incumbents.

The result is a structure in which the community's technical inheritance — the state of the industrial arts, accumulated over centuries of collective effort — is accessible to the community only through the mediation of a small number of corporate intermediaries, on terms that the intermediaries set and the community accepts. The intermediaries describe themselves as democratizers. They are, in Veblen's framework, enclosers — and the description and the reality are not mutually exclusive, which is what makes the analysis difficult and the politics treacherous.

The enclosure is not total. Open-source models exist, and their existence represents a genuine counter-tendency — a refusal, by a segment of the technical community, to accept the privatization of the commons. The open-source movement in AI operates, in Veblen's framework, as an expression of industrial habits of thought: the orientation toward production, toward the cooperative advance of the state of the industrial arts, toward the broad distribution of technical capability. The tension between the proprietary models and the open-source models is, in Veblen's terms, the tension between predatory and industrial habits applied to the most valuable commons in the history of human knowledge.

The stakes of the tension extend beyond the immediate economic distribution. The state of the industrial arts, in Veblen's framework, is not merely a repository of knowledge. It is the substrate of the instinct of workmanship — the body of accumulated technique upon which individual workmanship draws and to which individual workmanship contributes. The craftsman's skill is personal. The knowledge that underlies the skill is social. When the social knowledge is enclosed — when access to it is gated by price, governed by terms, and controlled by interests that are not the community's interests — the exercise of individual workmanship is constrained not by the limits of the individual's competence but by the conditions of access to the collective inheritance upon which the competence depends.

The developer who uses a proprietary AI tool to build a product exercises genuine workmanship — judgment, taste, the capacity to envision and specify. But her workmanship is exercised within a framework she does not control, upon a platform she does not own, subject to terms she did not negotiate. The instinct of workmanship is expressed. But its expression is conditioned by the structure of access — and the structure of access is determined not by the state of the industrial arts but by the state of the price system.

Veblen's distinction between the state of the industrial arts and the price system is, in this context, the most structurally important distinction in his entire framework. The state of the industrial arts represents the community's productive capacity — the accumulated knowledge, technique, and capability that make production possible. The price system represents the institutional mechanism through which the community's productive capacity is governed — the system of prices, profits, ownership, and market power that determines who gets access to the productive capacity and on what terms. The state of the industrial arts advances continuously, driven by the instinct of workmanship and the collective effort of the community. The price system does not advance in the same direction. It advances in the direction of profit, which is not the same direction as production, and which may, at any given moment, require the restriction of production in order to maintain prices.

The AI economy makes this divergence visible. The state of the industrial arts, instantiated in the large language model, has reached a point at which the productive capacity available to any individual with access to the model is extraordinary. The price system, through which access is governed, restricts that capacity to those who can pay the subscription fee, accept the terms of service, and operate within the platform's constraints. The restriction is not dramatic — the fees are, by the standards of the developed world, affordable, and the terms are, by the standards of corporate licensing, unremarkable. But the restriction is real, and its reality illuminates the structural tension that Veblen identified as the central pathology of capitalism: the subordination of the community's productive capacity to the private interests of the owning class, mediated through the price system, sustained by the institutional structures that the owning class controls.

The question is not whether the AI tools should be free. Computational infrastructure requires capital, maintenance, and ongoing development. The question is whether the structure of access — the terms under which the community's inheritance is returned to the community — serves the productive interests of the community or the extractive interests of the owners. The question is, in Veblen's terms, whether the state of the industrial arts will be governed by industrial habits of thought — oriented toward the broad distribution of productive capability — or by predatory habits — oriented toward the private capture of the gains.

The answer, as Veblen would observe, is not being determined by deliberation. It is being determined by the structure of the institutions through which the tools are developed, deployed, and controlled. The structure currently favors enclosure. The counter-tendency — open source, public research, the democratic-AI movement that the Boston Review has connected to Veblen's concept of idle curiosity — exists and matters. Whether it can grow fast enough to alter the structural trajectory before the enclosure is consolidated is the question upon which the future of the state of the industrial arts — and of the instinct of workmanship that depends upon it — may ultimately rest.

---

Chapter 8: Sabotage

The word *sabotage*, in common usage, conjures the image of deliberate destruction — the wrench thrown into the machinery, the data center breached, the infrastructure disabled by an agent who intends harm. Veblen used the word differently, and his usage, once understood, illuminates a dimension of the AI economy that the common usage cannot reach. Veblen's sabotage is not destruction. It is restriction. It is the deliberate curtailment of productive output — not in order to damage the productive process, but in order to maintain the conditions under which the productive process generates profit.

The distinction is everything, and it is worth dwelling upon, because the phenomenon Veblen describes with the word is as central to the AI economy as it was to the industrial economy, and considerably better disguised.

In The Engineers and the Price System, published in 1921, Veblen argued that the productive capacity of the industrial system — the capacity that the state of the industrial arts made possible — was systematically underutilized, not because of technical limitation but because of economic calculation. The business enterprise, organized around the generation of profit rather than the production of goods, found that unrestricted production drove prices below the level at which profit could be maintained. The remedy was restriction: the deliberate withholding of productive capacity from the market, achieved through a variety of mechanisms that Veblen catalogued with the thoroughness of a naturalist documenting the behaviors of a species.

The mechanisms included the manipulation of supply — the destruction of crops to maintain agricultural prices, the shutdown of factories during periods of low demand, the maintenance of excess capacity that could be deployed or withheld as market conditions required. They included the manipulation of competition — the patent system, which granted temporary monopolies over techniques that were, in most cases, derived from the state of the industrial arts and could have been independently discovered by any number of practitioners. They included the manipulation of credit — the banking system's capacity to expand or contract the money supply, thereby controlling the rate of economic activity not for the benefit of production but for the benefit of the financial class whose income depended on the maintenance of specific price levels.

In each case, the restriction was not an aberration. It was the normal functioning of the price system — the mechanism through which the business enterprise maintained the conditions of scarcity upon which its profit depended. Production could have been greater. The state of the industrial arts permitted greater output. But greater output would have reduced prices, reduced margins, and thereby reduced the income of the owning class. The restriction was, in Veblen's clinical terminology, a "conscientious withdrawal of efficiency" — a deliberate choice to produce less than the system could produce, for the purpose of maintaining the pecuniary returns that the system was organized to generate.

The AI economy exhibits the same pattern of restriction, adapted to the specific conditions of the digital economy and disguised, as it was in the industrial economy, behind a vocabulary of innovation, optimization, and service.

The most visible form of AI sabotage is the tiering of capability. The large language models are not offered to the community as a single, undifferentiated service. They are offered in tiers — each tier providing a different level of capability at a different price. The basic tier provides access to a model that is adequate for casual use. The professional tier provides access to a more capable model, or to the same model with higher usage limits. The enterprise tier provides access to capabilities — advanced reasoning, extended context, specialized tools — that are withheld from the lower tiers.

The tiering is described as a pricing structure — a mechanism for matching the cost of service to the value received. This description is accurate at the level of surface. At the level of structure, the tiering is Veblen's sabotage: the deliberate restriction of capability that the state of the industrial arts could provide to all, in order to maintain the price differentials upon which the platform's profit depends. The basic-tier model is not the best model the company can produce. It is the best model the company chooses to provide at the basic-tier price. The capabilities withheld from the basic tier are not technically impossible to provide. They are economically undesirable to provide, because their universal provision would eliminate the incentive to upgrade and thereby reduce the revenue that the tiered structure generates.

The restriction is subtle. The basic tier is genuinely useful. The user is not being deprived of a service. She is being offered a service that has been calibrated — by the deliberate withholding of available capability — to leave her wanting more. The wanting is engineered not through deprivation but through the carefully managed differential between what she receives and what she could receive if she paid more. The sabotage is not the denial of service. It is the management of service at a level below the system's capacity, for the purpose of maintaining the pecuniary incentive to upgrade.

A second form of AI sabotage operates through the design of dependency. The AI tools are designed — not maliciously, but structurally, as a consequence of the business incentives that shape their development — to create dependency rather than independence. The ideal outcome, from the perspective of the platform's business model, is a user who becomes progressively more dependent on the tool — whose workflow is increasingly organized around the tool's capabilities, whose data is increasingly stored within the tool's ecosystem, whose skills are increasingly adapted to the tool's interface rather than to the underlying domain. The dependent user is a recurring revenue source. The independent user — the user who acquires the capability the tool provides and then exercises it without the tool — is a customer lost.

The dependency is cultivated through design choices that are, individually, defensible as improvements to the user experience and that are, collectively, a system of retention. The seamless integration of the tool into the user's workflow. The accumulation of user data that makes the tool's recommendations more personalized and more accurate over time — more accurate, and more difficult to replicate outside the platform. The continuous release of new features that extend the tool's capability into adjacent domains, expanding the user's dependence from a single function to an entire workflow.

Veblen would observe that the design of dependency is not a conspiracy. It is the predictable behavior of a business enterprise operating within the logic of the price system. The platform's revenue depends on continued usage. Continued usage depends on the user's inability or unwillingness to achieve the same results without the platform. The design that maximizes continued usage is, therefore, the design that maximizes dependency. The incentive is structural. Individual designers may be entirely well-intentioned. The system they operate within rewards the cultivation of dependency regardless of their intentions.

A third form of sabotage — and the one that Veblen would likely have considered the most structurally significant — operates through the appropriation of training data. The models are trained on the intellectual output of the community — the books, the code, the articles, the creative works produced by millions of individuals over decades. The training data is, in Veblen's framework, part of the state of the industrial arts — the community's collective inheritance of technical and intellectual knowledge. The models ingest this inheritance, process it into a commercially valuable product, and offer the product back to the community at a price.

The mechanism is structurally identical to the enclosure of the agricultural commons in the eighteenth century. The commons — the collective resource upon which the community's productive life depended — was appropriated by private interests, fenced off, and converted into a source of private income. The community that had contributed to the commons through generations of collective use was required, after the enclosure, to purchase access to the resource it had collectively produced. The enclosure was not described as an appropriation. It was described as an improvement — the application of more efficient management techniques to a resource that had been underutilized in its common form.

The AI companies describe the appropriation of training data in similar terms: the data, in its unprocessed common form, was underutilized. The model, by processing the data into a commercially valuable product, has improved the resource. The improvement justifies the private appropriation. The community benefits from the improved resource, even though the benefit is now accessed through a commercial intermediary rather than directly from the commons.

The description is not false. The model does improve upon the unprocessed data. The community does benefit from the improvement. But the description conceals, as the enclosure rhetoric concealed, the distributional question: who captures the value of the improvement? The community contributed the input. The company owns the output. The value added by the processing — by the computational infrastructure, the algorithm design, the engineering labor — is genuine. But it is not the whole of the value. The training data contributed by the community is also genuine value, and its contribution is uncompensated, because the existing legal and institutional framework does not recognize the community's collective intellectual output as property that requires compensation when appropriated.

Veblen would identify this gap — between the community's contribution and the community's compensation — as the site of the most consequential form of AI sabotage: not the restriction of output, though that is real; not the design of dependency, though that is real too; but the appropriation of the commons itself, the conversion of the community's intellectual inheritance into private property, achieved not through legislation or force but through the simple expedient of training a model on publicly available data and claiming the resulting product as proprietary.

The appropriation is not total. The training data remains available in its original form. The books are still on the shelves. The code is still in the repositories. The articles are still in the archives. What has been appropriated is not the data itself but the synthetic value of the data — the value that emerges when the data is processed into a model that can generate novel outputs from the patterns embedded in the collective input. The synthetic value is enormous. It is the basis of a multi-billion-dollar industry. And it is derived, in its entirety, from the community's intellectual inheritance, processed through infrastructure that the community's collective scientific and engineering effort made possible.

The author of The Orange Pill assumes that the tools will remain accessible — that the democratization of capability will continue, that the floor will continue to rise, that the developer in Lagos will continue to gain access to the productive power that the models provide. Veblen's framework warns that this assumption rests upon the continued willingness of the platform owners to provide access on terms that the community can afford — and that this willingness is conditional, not structural.

The platform owners provide access because, at the present moment, broad access serves their business interests: it builds the user base, generates the data that improves the models, and establishes the dependency that secures future revenue. But the incentive to provide broad access is a function of the current competitive environment. Should the competitive environment change — should the number of viable competitors shrink to the point where competitive pressure no longer disciplines pricing, should the regulatory environment fail to establish conditions that require continued broad access — the incentive structure would shift, and the restriction of access would become, from the perspective of the business enterprise, the rational strategy.

Veblen documented this dynamic repeatedly in the industrial economy: the initial period of competitive expansion, during which broad access to the productive capacity serves the competitive interests of the expanding firms, followed by the period of consolidation, during which the surviving firms restrict access to maintain prices, margins, and market control. The dynamic is structural. It does not depend on the intentions of the individuals involved. It depends on the logic of the price system, which rewards restriction when restriction maintains the conditions of scarcity upon which profit depends.

The sabotage, in Veblen's sense, has not yet reached its mature form in the AI economy. The tools are still expanding. The competition is still vigorous. The access is still, by reasonable standards, broad. But the conditions for future restriction are being assembled — the concentration of infrastructure, the accumulation of proprietary data advantages, the cultivation of user dependency — and the history of every previous industry that followed this trajectory suggests that the restriction will come when the competitive conditions that currently prevent it have been resolved in favor of the incumbents.

The appropriate response to this trajectory is not alarm. It is institutional construction — the building of structures that ensure continued broad access to the state of the industrial arts regardless of the competitive dynamics among the platform owners. Open-source alternatives. Public investment in computational infrastructure. Regulatory frameworks that condition the private appropriation of the commons on the maintenance of community access. The structures do not build themselves. They are built by communities that recognize the trajectory, understand the stakes, and organize in time to establish the institutional counterpressure that the market, left to its own logic, will not produce.

Veblen, who observed the consolidation of industrial sabotage with the clinical attention of a researcher documenting a predictable phenomenon, would note that the window for institutional construction is not indefinite. It narrows as the consolidation proceeds. The point at which institutional counterpressure becomes impossible to assemble is the point at which the enclosure becomes irreversible — and the community discovers, too late, that the inheritance it produced over centuries of collective effort has been converted, through the quiet mechanism of sabotage, into the private property of a class whose interests are systematically opposed to its own.

---

Chapter 9: The Engineer and the Price System

In 1919, Thorstein Veblen published a series of essays in The Dial that were subsequently collected, in 1921, under the title The Engineers and the Price System. The essays contained an argument that was, by the standards of Veblen's already heterodox career, radical to the point of appearing eccentric — and that has become, in the century since its publication, the most eerily prophetic of all his works. Its prophecy lies not in the specific revolution he envisioned, which never came to pass, but in the structural tension he identified, which has reproduced itself, with increasing intensity, in every subsequent era of technological transformation, arriving now at the AI moment in a form so pure that Veblen himself, had he survived to observe it, might have permitted himself the rare satisfaction of having been right.

The argument was this. The productive capacity of industrial civilization was governed by two groups whose interests were systematically opposed. The first group was the engineers — Veblen's term for the technically competent class, the people who understood the productive process from the inside, who could organize production for maximum efficiency, who possessed the knowledge of materials, methods, and machines upon which the actual output of the industrial system depended. The second group was the business class — the owners, financiers, and administrators who controlled the productive process not through technical competence but through the ownership of capital, and who organized production not for maximum output but for maximum profit.

The opposition between these groups was not, in Veblen's analysis, a matter of personal antagonism. Engineers and businessmen might cooperate cordially, respect each other's contributions, and share a genuine interest in the success of the enterprise. The opposition was structural — embedded in the logic of the roles they occupied rather than in the characters of the individuals who occupied them. The engineer's objective function was production: more output, better quality, greater efficiency. The businessman's objective function was profit: the differential between cost and revenue, maintained and maximized through the management of prices, supply, and market position.

The objectives were compatible in some circumstances and incompatible in others, and the circumstances of incompatibility were, Veblen argued, the normal condition of the industrial economy rather than the exceptional one. The businessman restricted output when unrestricted output would reduce prices. The businessman maintained excess capacity when the deployment of that capacity would flood the market. The businessman invested in financial instruments rather than productive infrastructure when financial returns exceeded productive returns. In each case, the businessman's rational pursuit of profit produced a suboptimal deployment of the productive capacity that the state of the industrial arts made available — and the engineers, who understood the productive potential and could see the gap between what the system could produce and what the business logic permitted it to produce, were subordinated to a decision-making authority whose criteria were pecuniary rather than technical.

Veblen's proposed remedy — a "soviet of technicians" that would assume control of the productive process and organize it for output rather than profit — was never implemented, and Veblen himself appears to have regarded the proposal as more diagnostic than practical, an illustration of the structural absurdity of the existing arrangement rather than a realistic blueprint for its replacement. The engineers, he observed, were temperamentally unsuited to revolution. They were, by disposition and training, oriented toward the solution of technical problems rather than the seizure of political power, and their institutional position — employed by the business class, dependent on the business class for their livelihood, socialized into the habits of thought that the business class cultivated — made collective action against the interests of their employers both personally costly and culturally alien.

The diagnosis, however, has outlasted the remedy, and its application to the AI economy requires only the substitution of contemporary terms for Veblen's nineteenth-century vocabulary.

The AI economy is governed by two groups whose interests are systematically opposed in precisely the way Veblen described. The first group is the builders — the engineers, developers, researchers, and technical practitioners who understand AI systems from the inside, who possess the knowledge of architectures, training methods, and deployment requirements upon which the actual capability of the systems depends. The second group is the owners — the venture capitalists, corporate executives, board members, and shareholders who control the AI companies not through technical competence but through the ownership of capital, and who organize the deployment of AI capability not for maximum productive impact but for maximum financial return.

The builders want to build. This is not a rhetorical idealization. It is a description of the instinct of workmanship operating in its natural domain. The engineers who develop large language models, who design the architectures, who solve the training-stability problems and the alignment challenges and the inference-efficiency puzzles, are exercising workmanship in the fullest sense that Veblen described: the engaged, skilled, purposeful application of competence to a task that demands it. Their satisfaction is in the work. Their orientation is industrial. They want the systems to be capable, reliable, and useful — not because capability, reliability, and usefulness generate profit (though they may) but because the instinct of workmanship demands that the thing be done well.

The owners want returns. This is equally not a rhetorical demonization. It is a description of the structural position they occupy. The venture capitalist who invested two billion dollars in an AI company is not indifferent to the quality of the technology. She is, however, ultimately answerable to a different criterion: the financial return on the investment, measured in the specific terms — revenue growth, margin expansion, market valuation, path to exit — that the price system employs. The criterion is not evil. It is structural. The investor's income depends on financial returns. The investor's institutional position requires the prioritization of financial returns. The investor's habits of thought, cultivated by decades of operation within the price system, orient her toward the metrics that the price system rewards.

The opposition manifests in specific, observable ways. The builder wants to release the model when the model is ready — when the alignment work is adequate, when the safety testing is thorough, when the capability is robust. The owner wants to release the model when the market timing is favorable — when the competitive window is open, when the quarterly earnings call requires a catalyst, when the valuation narrative demands a product announcement. The builder wants to invest in fundamental research — the kind of open-ended, curiosity-driven investigation that Veblen, influenced by his connections to the pragmatist philosophers, called "idle curiosity" and recognized as the wellspring of genuinely transformative innovation. The owner wants to invest in applied development — the kind of goal-directed, commercially motivated engineering that produces deployable products on predictable timelines.

The tension is not always resolved in the owner's favor. The AI companies are not sweatshops. Many of them have cultures that genuinely value technical excellence, that provide their engineers with resources and autonomy that would have been unimaginable in Veblen's era, that have made structural commitments to safety and alignment research that cannot be reduced to marketing exercises. Anthropic, the company whose product Claude is the subject of The Orange Pill's central narrative, was founded explicitly on the premise that safety research should not be subordinated to commercial pressures — a premise that represents, in Veblen's framework, the institutionalization of industrial habits of thought within a business enterprise: a genuine and structurally unusual attempt to organize a corporation around productive rather than pecuniary criteria.

But the structural tension persists even within organizations that are sincerely committed to the industrial orientation, because the organizations exist within a competitive environment that is governed by the price system, and the price system rewards the pecuniary orientation regardless of the organization's internal commitments. The AI company that prioritizes safety over speed loses market position to the competitor that does not. The company that invests in fundamental research at the expense of short-term product development loses investor confidence to the competitor that ships faster. The company that maintains broad access to its tools at affordable prices loses revenue to the competitor that restricts access and tiers capability.

The structural pressure is relentless, and its direction is always the same: toward the subordination of the builders' industrial orientation to the owners' pecuniary orientation. Veblen documented this pressure across the entire industrial economy and concluded that the subordination was not an aberration but the normal functioning of the price system — the inevitable consequence of organizing productive activity through a mechanism whose criterion of success is financial return rather than productive output.

The AI economy has added a dimension to the tension that Veblen's era did not possess: the dimension of existential risk. The AI engineers are not merely building more efficient factories. They are building systems whose capabilities are advancing at a rate that exceeds the capacity of any existing institution — technical, regulatory, or ethical — to govern them. The engineers who understand the systems, who can see the capability curve and extrapolate its trajectory, are in a position analogous to the nuclear physicists of the 1940s: possessing knowledge whose implications extend far beyond the commercial context in which the knowledge is being applied, and constrained by institutional structures that limit their ability to act on those implications.

Veblen argued that the engineers were the natural governors of the industrial system because they understood the productive process. The argument has acquired, in the AI context, a gravity that Veblen could not have anticipated. The engineers who understand AI systems are not merely the most efficient organizers of production. They are the people who understand what the systems can do, what they cannot do, what they might do if deployed without adequate safeguards, and what they will do if the competitive pressure to deploy outpaces the institutional capacity to govern. The knowledge is technical. The implications are civilizational.

The question Veblen posed in 1921 — whether the engineers could assume the governance of the productive process, and whether the productive process, governed by engineers rather than businessmen, would produce outcomes more aligned with the community's interests — has not been answered. The engineers have not assumed governance. In the AI economy, as in the industrial economy, they remain subordinate to the price system — employed by the business class, dependent on the business class for the resources to continue their work, constrained by the business class's pecuniary criteria in the deployment of the systems they build.

But the question has acquired new urgency. The productive process that Veblen's engineers would have governed — the industrial system of factories, machines, and physical outputs — was powerful but bounded. Its failures were local: a factory explosion, a contaminated product, a price manipulation that impoverished a region. The productive process that the AI engineers would need to govern is neither local nor bounded. Its capabilities are general. Its deployment is global. Its failures, should they occur, may be systemic in ways that no previous technological failure has been.

Veblen's engineers could not organize. They were too temperamentally committed to their work, too institutionally dependent on their employers, too culturally disinclined toward collective political action. The AI engineers face the same constraints — the same institutional dependence, the same cultural disinclination — but the stakes of their inaction are categorically different. The factory that the industrial engineer failed to govern produced shoddy goods or exploited workers. The AI system that the AI engineer fails to govern produces — what? The question is genuinely open, and its openness is the measure of both the opportunity and the risk.

The author of The Orange Pill describes the tension between building for value and building for extraction — between the beaver who creates a habitat and the logic that would convert every habitat into a quarterly return. Veblen would recognize this tension as his own, stated in a different vocabulary but grounded in the same structural analysis. The builders exercise industrial habits. The owners exercise pecuniary habits. The outcome depends on which set of habits governs the deployment of the most powerful productive technology in human history.

Veblen's conclusion — that the price system would subordinate the engineers' productive orientation to the owners' pecuniary orientation, producing outcomes that were profitable for the few and suboptimal for the many — was borne out by the industrial economy's subsequent century. Whether the same conclusion will be borne out by the AI economy depends on whether the engineers of the present era can accomplish what the engineers of Veblen's era could not: the construction of institutional structures that protect the industrial orientation against the relentless pressure of the price system, and that ensure that the extraordinary productive capacity of the AI systems is deployed for the community's benefit rather than restricted for the owners' gain.

The construction requires what Veblen identified and what the engineers have historically lacked: not technical competence, which they possess in abundance, but political consciousness — the recognition that their technical decisions have distributional consequences, that the systems they build will be governed by the institutions they operate within, and that the institutions, absent deliberate effort to the contrary, will govern those systems in the interest of profit rather than production.

The recognition is beginning to emerge. The safety-research movement, the alignment community, the engineers who leave large companies to build organizations structured around productive rather than pecuniary criteria — each represents an incipient form of the political consciousness that Veblen argued the engineers needed and did not possess. Whether the consciousness can crystallize into institutional form before the price system completes its subordination of the AI economy is the question upon which, in Veblen's framework, the future of workmanship — and of much else — may depend.

---

Chapter 10: Salvaging the Instinct

The instinct of workmanship cannot be eliminated. This is the proposition upon which the entire preceding analysis rests, and it is the proposition with which this analysis must conclude, because the conclusion determines whether the AI moment is a tragedy or a transformation — or, as Veblen's framework suggests is most likely, both at once.

The instinct cannot be eliminated because it is not a product of culture. It is not a habit that can be trained away. It is not an ideology that can be refuted. It is, in Veblen's framework, a biological endowment — shaped by natural selection over millennia during which the capacity for skilled, purposeful production was directly tied to survival, and embedded in the human organism as a drive that persists regardless of whether the institutional environment provides an outlet for it. The carpenter's hand reaches for the joint. The programmer refactors the code. The parent who has spent the afternoon building a shelf from reclaimed wood and who runs her hand along the finished surface, not to check it but to feel it, to confirm through touch what the eye has confirmed through sight — that the work has been done well — is exercising an instinct that predates language, predates civilization, predates every institution that has ever frustrated it.

The instinct persists. The question is whether it will find adequate expression in the world that AI is creating, or whether it will be chronically frustrated — producing, across an entire class of cognitive workers, the specific derangement that Veblen identified in the displaced artisans of the industrial revolution: the restlessness of a drive that has no outlet, the malaise of an organism whose environment has been restructured in a way that denies it the exercise of its most fundamental productive capacity.

Three scenarios present themselves, and none of them is tidy.

In the first scenario, the instinct ascends. This is the scenario that The Orange Pill advances with the most conviction: the ascending-friction thesis, in which the removal of mechanical difficulty relocates the challenge to a higher cognitive floor, and the higher floor provides an outlet for workmanship that is, in its way, as demanding and as satisfying as the outlet it replaced. The developer who no longer writes code exercises workmanship in the domain of vision — the capacity to see what should be built, to evaluate whether the thing that has been built matches the vision, to make the thousand decisions that separate a functioning prototype from a product that serves human need. The satisfaction of this work is real. The skill it demands is genuine. The instinct, in this scenario, finds its expression at a new level of the productive hierarchy, and the expression is adequate — not identical to the expression it replaced, but adequate to the drive's requirements.

Veblen's framework treats this scenario as possible but not probable in the absence of institutional support. The instinct can ascend. The history of technology is, in one reading, the history of the instinct's successive ascensions — from the hand tool to the machine tool, from the machine tool to the computer, from the computer to the AI system, each transition requiring a new form of workmanship and each form being, at its best, genuinely satisfying. But the ascension is not automatic. It requires that the new domain of workmanship be structured in a way that permits the exercise of skill — that provides clear feedback, that demands genuine competence, that rewards care and penalizes carelessness. If the new domain is structured around metrics that reward speed and volume rather than quality and depth — as the conspicuous-capability culture described in Chapter 4 tends to structure it — the ascension will be frustrated, and the instinct, though it has ascended in principle, will find no adequate outlet in practice.

The institutional requirement is specific. Organizations that wish to preserve the conditions for ascending workmanship must create environments in which the evaluation of AI-directed work reflects the full range of qualities the instinct values — not merely the speed of production but the care of specification, not merely the quantity of output but the quality of judgment, not merely the final product but the process by which the product was conceived, evaluated, and refined. The environments do not create themselves. They must be designed with the same deliberation that the AI tools themselves were designed — and by people who understand, as Veblen understood, that the instinct of workmanship is not a luxury to be indulged when the economics permit but a requirement to be met if the human beings within the organization are to function at the level of engagement and satisfaction that productive work, properly structured, can provide.

In the second scenario, the instinct is partially redirected. This is the scenario that historical precedent most strongly supports. The industrial revolution did not eliminate the instinct of workmanship. It did not fully satisfy the instinct within the new industrial order, either. Instead, the instinct found partial expression within the workplace — in the new forms of skilled work that the machine process created, as described in Chapter 2 — and supplementary expression outside the workplace, in the domestic crafts, the amateur hobbies, the weekend workshops, and the community projects that allowed workers to exercise, in their own time and on their own terms, the productive competence that the factory denied them.

The AI economy may produce a similar settlement. The cognitive worker whose instinct of workmanship is partially satisfied by the work of directing AI tools — exercising judgment, specifying vision, evaluating output — and partially frustrated by the loss of hands-on production may find supplementary expression in domains that AI does not reach: the physical crafts, the embodied arts, the forms of production that require the coordination of hand and eye and material that no computational process can replicate. The renewed interest in woodworking, ceramics, gardening, and cooking that accompanies each wave of digital intensification is not, in Veblen's framework, a nostalgic regression. It is the instinct of workmanship seeking the outlet that the digital domain has denied it — the tactile, embodied, materially resistant form of production for which the instinct evolved and in which it finds its fullest expression.

This scenario is neither optimistic nor pessimistic. It is realistic — grounded in the observation that the instinct has survived previous technological transformations by finding outlets in domains that the transformation did not reach, and that the survival, while it preserved the instinct's expression, did not preserve its centrality to economic life. The instinct survived, but it survived at the margins — in the garden, the workshop, the weekend project. The economic center was governed by the institutional logic of the price system, and the instinct accommodated itself to the margins because the center would not accommodate it.

The AI economy may produce the same accommodation — a world in which the instinct of workmanship is exercised on weekends and evenings, in the physical crafts and the embodied arts, while the workweek is governed by the logic of AI-augmented productivity, with its metrics of speed and volume and its conspicuous-capability displays. The accommodation is sustainable. It is not satisfying — not at the level of civilizational ambition that the AI moment, at its most generous, makes possible. A civilization that relegates its most fundamental productive instinct to the margins of economic life has not failed, exactly, but it has settled for less than it could achieve.

In the third scenario, the instinct is chronically frustrated. This is the scenario that Veblen's analysis most urgently warns against, and that the early evidence — the burnout documented in the Berkeley study, the flight-to-the-woods response that The Orange Pill describes, the quiet despair of experts whose expertise has been commoditized — suggests is already underway for a significant fraction of the cognitive workforce.

In this scenario, the institutional structures that would provide adequate outlets for the instinct are not built — not because they are impossible to build, but because the competitive logic of the AI economy does not reward their construction. The organizations that preserve spaces for workmanship lose competitive position to the organizations that optimize for speed. The workers who insist on exercising care and depth are outperformed, on the visible metrics, by the workers who accept AI output without scrutiny and ship without understanding. The culture rewards the conspicuous and penalizes the careful, and the instinct, denied its outlet in both the workplace and the domestic sphere (because the domestic sphere has itself been colonized by the digital environment), produces its characteristic derangement: the chronic, low-grade malaise of an organism that cannot do what it was built to do.

The derangement is not dramatic. It does not produce revolution or collapse. It produces something quieter and harder to diagnose: a population of skilled, intelligent, economically productive people who are, at the level of the instinct, unfulfilled — who produce more than any previous generation but enjoy the production less, who are busier than any previous generation but less engaged, who are more capable than any previous generation but less satisfied by the exercise of their capability. The derangement is the condition of the caged animal: fed, sheltered, monitored, but denied the specific activity for which its organism was designed — and the denial produces not starvation but pacing, not collapse but the particular, recognizable restlessness of a creature that has everything except the thing it needs.

Veblen would observe — and the observation would be delivered in the clinical register that characterizes his most structurally devastating analyses, the tone that presents the diagnosis as if it were merely a description of natural phenomena — that the choice between these scenarios is not a choice between technologies. The technology is the same in all three scenarios. The large language model, the computational infrastructure, the natural-language interface — each is present in the optimistic scenario and the catastrophic one. The choice is institutional. It is a choice about the structures within which the technology is deployed — the organizational designs, the cultural norms, the regulatory frameworks, the educational systems, the habits of thought that the institutions cultivate and reward.

The instinct of workmanship is the constant. It persists in all three scenarios because it is biological, not institutional. It persisted through the industrial revolution, through the displacement of skilled manual labor, through the reorganization of production around the machine process. It will persist through the AI revolution, through the displacement of skilled cognitive labor, through the reorganization of production around the computational process. The question is not whether the instinct survives. The question is what kind of world it survives in — a world that provides it with adequate expression, or a world that frustrates it systematically while celebrating the productivity of the frustration.

The author of The Orange Pill ends his book with a question that a twelve-year-old asked her mother: "What am I for?" Veblen's framework provides an answer that is older than the question and more durable than any technology: You are for the work. Not the output. Not the product. Not the metric. The work — the engaged, skilled, purposeful exercise of your competence in the production of something that meets your own standards of quality. That is what the instinct requires. That is what the instinct has always required. That is what the instinct will require when the AI tools have advanced beyond anything the present moment can imagine, because the instinct was not calibrated for a particular technology. It was calibrated for the experience of doing something well.

The carpenter runs her hand along the joint. The gesture serves no productive purpose. The instrument has confirmed what the fingers confirm again.

The question for the civilization that possesses the most powerful productive tools in the history of the species is whether it will build institutions that preserve the space for that gesture — that recognize, in the gesture, the expression of a drive that is more fundamental than any technology, more persistent than any market, and more essential to human flourishing than any productivity metric has the capacity to measure.

The instinct cannot be salvaged by the market, which values output over engagement. It cannot be salvaged by the technology, which is indifferent to the satisfaction of its users. It can be salvaged only by deliberate institutional construction — by organizations, cultures, and societies that understand what Veblen understood: that the human animal wants to do good work, that the wanting is not optional, and that the institutions that deny the wanting will produce, regardless of their productive efficiency, a form of civilization that is, at the level of the instinct, uninhabitable.

The tools are ready. The state of the industrial arts has never been richer. The productive capacity available to the community has never been greater. The question — the only question that Veblen's framework considers genuinely important — is whether the institutions that govern the deployment of that capacity will be structured to serve the instinct of workmanship, or whether the instinct will be sacrificed, as it has been sacrificed before, to the requirements of the price system.

The sacrifice is not inevitable. The instinct persists. It will find its outlet, or it will demand one. The question is only how much damage accumulates before the demand is met.

---

Epilogue

The instinct that Veblen named is the one I failed to name in myself for decades.

Every time I talked about Claude Code with another builder — in hallways, on flights, across dinner tables — the conversation always landed in the same place. Not capability. Not efficiency. Not even disruption. It landed on the feeling. The specific, hard-to-describe feeling that comes from building a thing with your own mind and your own hands and knowing, before anyone else touches it, that it is right. And the equally specific, equally hard-to-describe unease that comes from watching a machine produce something adequate in the time it would have taken you to produce something excellent.

I wrote in The Orange Pill about the senior architect who felt like a master calligrapher watching the printing press arrive. I described what he lost. I did not have a name for it. I called it "something beautiful" because I did not have a better word. Veblen had the word. The word is workmanship. Not pride. Not ambition. Not craft as an aesthetic preference. The instinct itself — the drive to do something well because your organism requires it, because the hands want to confirm what the instruments have already measured, because adequacy is not the same as quality and your nervous system knows the difference even when the metrics do not.

The instinct is what powers the flow state I described in Chapter 12 of my book — that condition where the challenge matches your skill and self-consciousness falls away and you are wholly inside the work. Csikszentmihalyi gave me the psychological vocabulary. Veblen gives me the biological one. Flow is not just a state. It is the instinct of workmanship in full expression. And the terror I felt alongside the exhilaration — the unnamed companion that followed me through the Trivandrum training, through the thirty-day sprint to CES, through the hundred-and-eighty-seven-page draft written over the Atlantic — that terror was the instinct registering that the thing it needs most was being restructured beneath it, fast, without anyone building the institutions that would give it somewhere new to go.

What unsettles me most about Veblen is not the diagnosis. The diagnosis is a century old, and the evidence has only grown sharper. What unsettles me is the patience required by the remedy. The instinct persists. It demands its outlet. But the outlet requires institutional construction, and institutions are slow, and the river of intelligence is fast, and the distance between the two is where real people live — my engineers, my children, the developer in Lagos, the twelve-year-old asking her mother what she is for.

The answer I offered her was: You are for the questions. Veblen's answer is older and cuts closer to the bone: You are for the work. Not the output. The work itself — the doing, the engagement, the exercise of the capacity that makes you human. That answer does not change because the tools change. It only becomes harder to honor.

I am still building. I will keep building. But I am building now with a name for the thing I am trying to protect — not just capability, not just judgment, not just the right to create. The right to exercise the instinct. The right to care about quality when the market rewards speed. The right to run your hand along the joint when the instrument has already told you everything the hand will tell you again.

That gesture is not waste. It is the most human thing we do.

Edo Segal

In 1914, Thorstein Veblen identified a drive more fundamental than ambition, deeper than pride, older than any economy: the instinct of workmanship — the human need to do something well with your own skill and effort. Not for the reward. Not for the recognition. For the satisfaction that lives in the doing itself. A century later, AI tools can produce adequate code, competent prose, and functional designs faster than any human. The output is right. So why does something feel wrong?

This book applies Veblen's devastating framework — conspicuous consumption, the sabotage of production for profit, the war between makers and takers — to the AI economy with surgical precision. From the "conspicuous computation" of developers posting shipping metrics as status displays, to the platform owners enclosing humanity's collective intellectual inheritance behind subscription tiers, Veblen's century-old vocabulary names what the contemporary discourse cannot.

The instinct of workmanship survived the industrial revolution. It will survive AI. But survival is not the same as flourishing — and the difference depends entirely on whether we build institutions that give the instinct somewhere to go, or let the price system starve it while celebrating the productivity of the starvation.

— Thorstein Veblen

“occupies the interest with practical expedients, ways and means, devices and contrivances of efficiency and economy, proficiency, creative work and technological mastery of facts.”
— Thorstein Veblen