The industrial state Galbraith described in 1967 was organized around the manufacture of physical goods — automobiles, appliances, chemicals, steel — through processes requiring enormous capital investment, specialized labor, and management systems complex enough to coordinate thousands of workers across continental supply chains. The industrial state emerging in 2026 is organized around a different form of production: the production of inferences. Large AI companies produce not physical goods but computational predictions — pattern completions, language generation, code synthesis — through processes requiring enormous capital investment (in compute rather than factories), specialized labor (in machine learning rather than metallurgy), and management systems complex enough to coordinate training, alignment, and deployment of models whose internal operations are not fully understood even by their builders. The parallels operate at the level of economic architecture rather than surface similarity.
The first parallel is the capital barrier. General Motors in 1967 required billions in plant, equipment, and working capital. The requirement was not a contingent feature that would diminish as the industry matured; it was structural. Anthropic, OpenAI, and Google DeepMind require billions in compute infrastructure, training data, and research talent. This is not a transitional condition. Frontier models require more compute with each generation, not less. The barrier to entry is rising, not falling. The conventional wisdom about open-source models democratizing the technology is the same conventional wisdom once applied to the automobile industry in the early twentieth century, when hundreds of small manufacturers competed in an open market. The market consolidated.
The second parallel is technostructure indispensability, with the new technostructure differing from the old in a crucial respect: the concentration of its knowledge is more extreme. General Motors' technostructure numbered in the thousands. The AI technostructure at any frontier lab numbers in the hundreds, and the subset whose knowledge is genuinely indispensable — the people who understand architecture, training, and alignment at a level sufficient for governance — may number in the dozens. The concentration of indispensable knowledge in so few individuals is historically unprecedented.
The third parallel is the planning system's capacity to shape demand. General Motors did not discover Americans wanted large cars; it created the preference through the most sophisticated demand-management apparatus the world had yet seen. The AI planning system shapes demand through mechanisms different in form but identical in function: the product launch creates anticipation; the free tier creates dependency; the integration with existing workflows creates switching costs; the cultural narrative — AI as empowerment, as democratization — converts a computational tool into an identity marker.
Hunter Lewis observed that AI differs from traditional infrastructure in a crucial respect: "to build an AI product is to consume an AI product." The automobile, once purchased, belonged to the buyer. The road, once built, was available to all. The AI model remains the property of the firm that trained it — accessed through subscription, governed by terms of service, subject to changes in capability and pricing at the firm's discretion. The Software Death Cross marks the transition between industrial states as clearly as any stock chart can.
The concept is developed in Chapter 8 of the Galbraith simulation volume as a systematic application of The New Industrial State's framework to the AI economy. David Lingenfelter's 2025 analysis, tracing the technostructure's evolution "from Detroit to Silicon Valley," provided much of the conceptual groundwork, as did Antonio Ieranò's work on successive waves of information technology. The term "revised industrial state" is adopted rather than invented, reflecting Galbraith's own use of "revised sequence" and "revised version" to indicate his corrections to orthodox models.
Capital barrier as structural permanence. Frontier AI capital requirements are not transitional; they are features of the technology itself, producing permanent oligopoly rather than transient dominance.
Extreme concentration of indispensable knowledge. The AI technostructure's governance subset is smaller than in any previous industry, producing a power dynamic more extreme than anything Galbraith documented.
Demand management preserved, vehicle changed. The industrial planning system's mechanisms for shaping demand translate directly into the AI economy, with different instruments producing identical effects.
Ownership of use rather than possession. AI products are consumed rather than owned; the builder's relationship to AI capability is permanently mediated by the firm that trained the model.