CONCEPT

Infrastructure Dependency

The unseen foundation beneath every AI interaction — fabs, power plants, data centers, supply chains — whose concentration and opacity create a tenant-landlord relationship between users and providers that the democratization narrative systematically obscures.
A smartphone is a magic trick. A flat rectangle of glass and metal that responds to touch, connects to the sum of human knowledge, navigates by satellite. The user sees the trick. The engineer sees the infrastructure: cell tower, fiber-optic cable, switching station, continental backbone, transatlantic cable, data center, power plant, fuel supply chain, and — behind all of it — a semiconductor fabrication plant that cost fifteen to twenty billion dollars to construct and employs thousands of engineers in cleanroom conditions more stringent than a surgical theater. That infrastructure took decades and trillions of dollars of cumulative investment to build; the AI tools Edo Segal celebrates in You On AI sit on infrastructure of comparable scale and complexity.


The training infrastructure comes first. A frontier language model requires clusters of thousands of specialized processors — GPUs or TPUs — connected by high-bandwidth networks and powered by dedicated electrical substations. Training a single frontier model draws electrical power comparable to tens of thousands of households, sustained continuously for weeks. The hardware represents manufacturing concentration that makes the smartphone supply chain look diversified: the most advanced training chips are fabricated almost exclusively by Taiwan Semiconductor Manufacturing Company (TSMC) using lithography equipment from a single Dutch company, ASML. The entire AI training infrastructure of the Western world depends on a supply chain passing through a single island in the western Pacific.
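The household comparison can be checked on the back of an envelope. Every figure below is an illustrative assumption — cluster size, per-chip draw, data-center overhead, household load — not a number reported in the book:

```python
# Back-of-envelope check of the "tens of thousands of households" claim.
# All figures are illustrative assumptions, not measured values.

GPUS = 50_000          # assumed accelerator count for a frontier training run
WATTS_PER_GPU = 700    # assumed draw of one accelerator under load (W)
PUE = 1.2              # assumed overhead factor for cooling and networking
HOUSEHOLD_KW = 1.2     # assumed average continuous household draw (kW)

cluster_kw = GPUS * WATTS_PER_GPU * PUE / 1_000   # total cluster draw, kW
households = cluster_kw / HOUSEHOLD_KW            # household equivalence

print(f"Cluster draw: {cluster_kw:,.0f} kW")
print(f"Equivalent households (continuous): {households:,.0f}")
```

With these assumptions the cluster draws about 42 megawatts, the continuous consumption of roughly 35,000 households; different assumptions move the number, but any cluster of this class lands in the tens of thousands.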

Inference infrastructure is equally consequential and, in aggregate, more expensive. Training is a one-time cost; serving a model to millions of users requires continuous computation, power, cooling, and bandwidth. The International Energy Agency has flagged AI data centers as a significant and growing fraction of global electricity demand. In several regions, data center construction has been delayed or blocked by insufficient grid capacity. The magic is bumping against the physics of power generation and distribution.
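The one-time-versus-continuous distinction can be made concrete with a toy cost model. The dollar figures and query volume below are assumptions chosen for illustration, not reported costs:

```python
# Toy model: a one-time training expense versus per-query inference costs
# that compound with usage. All numbers are illustrative assumptions.

TRAINING_COST = 100_000_000     # assumed one-time training spend ($)
COST_PER_QUERY = 0.002          # assumed marginal cost of serving one query ($)
QUERIES_PER_DAY = 100_000_000   # assumed daily query volume at scale

daily_inference = COST_PER_QUERY * QUERIES_PER_DAY   # recurring daily spend
days_to_match = TRAINING_COST / daily_inference      # break-even point

print(f"Daily inference spend: ${daily_inference:,.0f}")
print(f"Days until cumulative inference exceeds training: {days_to_match:,.0f}")
```

Under these assumptions, cumulative inference spend overtakes the one-time training bill in about five hundred days, and it keeps compounding with usage after that.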

Transistors To Tokens

The democratization Edo Segal celebrates operates at the interface level, not the infrastructure level. The developer in Lagos accesses Claude Code through a subscription; the subscription gives her access to a capability, not to the infrastructure that produces it. She cannot train her own model, run her own inference cluster, or choose an alternative provider if the one she depends on changes pricing, terms of service, or content policies. She is, in the most literal sense, a tenant on someone else's infrastructure.

The concentration is measurable. As of 2026, the three largest cloud providers — Amazon Web Services, Microsoft Azure, and Google Cloud — control approximately two-thirds of the global cloud infrastructure market. The AI capabilities users access through subscriptions are, in almost every case, running on infrastructure owned by one of these three (or, in the case of Anthropic's Claude, hosted on one of them). Moore's semiconductor history provides the precedent: fab construction costs escalated from millions of dollars in the 1970s to tens of billions today, driving leading-edge manufacturing to consolidate into just three companies worldwide — TSMC, Samsung, and Intel — with all the geopolitical weight that implies. The same consolidation dynamic is now visible in AI, compressed onto a faster timeline.

Origin

The framework of infrastructure dependency draws on decades of analysis of the semiconductor supply chain, particularly Chris Miller's Chip War (2022), and on emerging analysis of AI infrastructure concentration by researchers at Epoch AI, the AI Now Institute, and the Center for Security and Emerging Technology. The specific framing — that interface democratization masks infrastructure concentration — is articulated in this volume as a synthesis of Moore's industrial history with contemporary observations of AI industry structure.

Key Ideas

The interface is not the infrastructure. Democratization at the user level — a hundred-dollar subscription that unlocks AI-augmented building — does not democratize the underlying infrastructure, which remains concentrated in a handful of entities.

Energy Wall

Training is capital; inference is metabolism. Training costs are one-time capital expenses; inference costs are ongoing metabolic expenses that compound with usage.

Supply chain concentration is geopolitical. The fabrication of advanced AI training chips depends on a supply chain passing through Taiwan, creating vulnerabilities no amount of software innovation can compensate for.

Dependency relationships are tenant-to-landlord. Users of AI infrastructure cannot choose alternative providers, cannot replicate the infrastructure, and have limited leverage to influence its terms.

Opacity compounds dependency. Training data, model architectures, inference optimizations, and cost structures are proprietary, leaving users without visibility into the foundation their work rests on.

In The You On AI Book

This concept surfaces across two chapters of You On AI. Each passage below links back into the book at the exact page.
Chapter 13 Friction Has Not Disappeared Page 2 · Ascending Friction
…anchored on "their bandwidth was consumed by the plumbing"
Frameworks abstracted away code structure: routing, templating, database connections. The critics said, “You will lose understanding of the architecture.” They were right again: Most framework users could not build the framework they…
The friction that matters is the friction that replaces it.
The lost depth was real. The gained breadth was larger.
Read this passage in the book →
Chapter 14 The Democratization of Capability Page 4 · Access, Not Yet Equality
…anchored on "infrastructure that billions of people do not have"
Access requires connectivity, and connectivity requires infrastructure that billions of people do not have. It requires hardware that costs more relative to local wages in Lagos than in San Francisco. It requires English-language fluency,…
AI tools lower the floor of who gets to build.
A philosophy of friction that cannot account for the rising floor has told only half the truth. The privileged half.
Read this passage in the book →

Further Reading

  1. Chris Miller, Chip War: The Fight for the World's Most Critical Technology (2022)
  2. Kate Crawford, Atlas of AI (2021)
  3. International Energy Agency reports on data center electricity consumption
  4. Epoch AI, Compute Trends Across Three Eras of Machine Learning

Three Positions on Infrastructure Dependency

From Chapter 15 — how the Boulder, the Believer, and the Beaver each read this concept
Boulder · Refusal
Han's diagnosis
The Boulder sees in Infrastructure Dependency evidence of the pathology — that refusal, not adaptation, is the correct posture. The garden, the analog life, the smartphone that is not bought.
Believer · Flow
Riding the current
The Believer sees Infrastructure Dependency as the river's direction — lean in. Trust that the technium, as Kevin Kelly argues, wants what life wants. Resistance is fear, not wisdom.
Beaver · Stewardship
Building dams
The Beaver sees Infrastructure Dependency as an opportunity for construction. Neither refuse nor surrender — build the institutional, attentional, and craft governors that shape the river around the things worth preserving.

Read Chapter 15 in the book →
