By Edo Segal
The neighborhood I almost killed was my own company.
Not through malice. Through efficiency. After the Trivandrum sprint I describe in *The Orange Pill* — twenty engineers, each suddenly capable of what twenty used to do together — I watched something happen that I celebrated before I understood it. Everyone went deep into their own conversation with the machine. The output was extraordinary. The dashboards looked incredible. The velocity metrics were the best I'd ever seen.
And then a bug shipped that three people should have caught.
Not a catastrophic bug. A subtle architectural choice that worked in isolation and created conflicts at scale — the kind of thing that, six months earlier, would have been flagged over lunch by a senior engineer who overheard a junior one describing the approach. That lunch conversation stopped happening. Not because anyone canceled it. Because the conditions that produced it — the pause, the proximity, the idle moment where one person's problem drifted into another person's expertise — had been optimized away.
I didn't have the language for what I was seeing until I read Jane Jacobs.
Jacobs spent her life watching cities, not from helicopters but from stoops. She noticed that the safest streets weren't the ones with the most police. They were the ones with the most people — different people, doing different things, at different times, for different reasons. She noticed that the most innovative economies weren't the fastest-growing ones. They were the most diverse. She noticed that when planners replaced messy, organic neighborhoods with clean, rational superblocks, the neighborhoods died. Not metaphorically. The shops closed. The streets emptied. Crime rose. The tax base collapsed.
Every one of those observations maps onto what is happening right now in the digital economy. The platforms that absorbed the open web are the superblocks. The AI tools that enable solitary production are the automobiles that emptied the sidewalks. The catastrophic money flowing into a handful of companies is the urban renewal budget that demolished the neighborhoods where innovation actually lived.
Jacobs was not anti-technology. She was anti-monoculture. She understood that vitality — urban, economic, creative — arises from conditions that efficiency-minded planners consistently destroy: the mixing of uses, the density of contact, the cheap space where experiments can fail affordably, the informal exchanges that no org chart captures.
This book applies her framework to the AI moment with a precision that surprised me. It changed how I think about what I'm building and who it's for.
The sidewalk ballet was never choreographed. It emerged from the conditions. Our job is to maintain the conditions.
— Edo Segal
Jane Jacobs (1916–2006) was an American-Canadian urbanist, author, journalist, and economic theorist whose work fundamentally reshaped how cities, economies, and communities are understood. Born in Scranton, Pennsylvania, she moved to New York City as a young woman, where she wrote for *Architectural Forum* and became an activist in Greenwich Village. Her landmark book *The Death and Life of Great American Cities* (1961) mounted a devastating critique of modernist urban planning, arguing that the top-down demolition of mixed-use neighborhoods destroyed the organic social and economic order that made cities vital. She followed it with *The Economy of Cities* (1969) and *Cities and the Wealth of Nations* (1984), in which she developed her theories of import replacement as the engine of economic growth and diversity as the structural precondition for economic resilience. Jacobs famously led the fight against Robert Moses's proposed Lower Manhattan Expressway, helping to save the neighborhoods of SoHo and Greenwich Village from demolition. In 1968 she moved to Toronto, where she continued writing and advocating for community-scale urban development until her death. Her concepts — the "sidewalk ballet," "eyes on the street," the economic necessity of old buildings, the danger of monoculture — have influenced urban planning, economics, sociology, and increasingly, thinking about digital ecosystems and platform economies.
In the spring of 1955, a city planner named Robert Moses stood before a map of lower Manhattan and drew a line through what he considered blight. The line was a highway — the Lower Manhattan Expressway — and it would have required the demolition of what Moses saw as a tangle of obsolete buildings, narrow streets, and disorderly commercial activity in neighborhoods including SoHo, Little Italy, and parts of what is now called NoHo. Moses had spent three decades remaking New York according to a single principle: that a modern city required the orderly separation of functions, the free flow of automobile traffic, and the replacement of old, mixed-use neighborhoods with rational, planned superblocks. He had already driven highways through the Bronx, displaced hundreds of thousands of residents, and demolished entire commercial districts. In each case, the logic was the same. The existing neighborhood looked, from above, like chaos. The plan looked like order. The plan was implemented. The neighborhood died.
A woman named Jane Jacobs, who lived on Hudson Street in Greenwich Village, watched Moses and understood something the planners could not see from their offices. What they called chaos was actually a form of order so intricate, so dependent on the specific, particular, daily interactions of the people who lived there, that it could not be replaced by any plan, no matter how rational it appeared on paper. The butcher who knew which customers were having a hard month. The shopkeeper who watched the children walk to school and noticed when one was missing. The old women on stoops who knew every face on the block and could tell a stranger from a neighbor at fifty feet. The combination of residences, shops, bars, small factories, warehouses, and churches that kept the sidewalks populated at all hours and made the street safe not through policing but through the ordinary presence of people going about their different business at different times for different reasons.
Jacobs spent years watching her street before she wrote about it. What she saw was a choreography — she called it the sidewalk ballet — in which the apparent randomness of urban life concealed a deeply functional social order. The hardware store owner put out his sidewalk display at eight in the morning; the bar opened at noon and kept the street populated until two in the morning; the longshoremen came and went on their own schedule; mothers walked children to school at eight-thirty and picked them up at three; the laundry and the locksmith and the deli and the tailor served different customers at different hours, and the cumulative effect was a street that was never empty, never unwatched, never given over entirely to any single use or any single population.
When she published *The Death and Life of Great American Cities* in 1961, the book was an attack on fifty years of planning orthodoxy. But it was also something more enduring than a polemic. It was an empirical demonstration that vitality — urban, economic, social — arises from conditions that planners consistently destroy: the mixing of uses that the zoning code separates, the density of population that the tower-in-a-park disperses, the short blocks that the superblock eliminates, the old buildings that the renewal program demolishes. These were not aesthetic preferences. They were functional requirements. Remove any one of them, and the neighborhood began to die. Remove all of them, as the planners routinely did, and what remained was the clean, efficient, well-ordered housing project — and the housing project, Jacobs documented with devastating specificity, was more dangerous, more isolated, and more economically dead than the "blight" it replaced.
Sixty years later, a similar pattern has played out in the digital economy, and the results look remarkably familiar to anyone who has read Jacobs carefully.
The early internet — the internet of the 1990s and early 2000s — had qualities that Jacobs would have recognized immediately. It was messy. It was diverse. It was populated by millions of small operators running individual websites, small forums, niche communities, personal blogs, amateur commerce, experimental projects of every conceivable kind. The barriers to entry were low. The variety was enormous. No single entity controlled more than a small fraction of the activity. The "streets" of this early web were populated at all hours by people with different purposes, different interests, different levels of sophistication, and the cumulative effect was an ecosystem that generated novelty, absorbed failure, and sustained the kind of casual, serendipitous knowledge exchange that Jacobs identified as the engine of urban innovation.
Then the platforms arrived.
The consolidation of the digital economy around a handful of dominant platforms — Google, Facebook, Amazon, Apple, Microsoft — followed the logic of urban renewal with eerie precision. The platforms looked at the apparent disorder of the open web and saw inefficiency. They replaced it with rational, planned environments: centralized search, centralized social interaction, centralized commerce, centralized app distribution. Each platform was, in its own domain, a superblock — a planned environment that separated functions, controlled access, and extracted value from the activity it hosted rather than distributing it to the participants.
The result was precisely what Jacobs would have predicted. The platforms killed the sidewalk ballet of the open web. The small forums, the personal blogs, the niche communities, the amateur commerce — all of it was absorbed into the platforms or starved of the traffic that had sustained it. The diversity of the digital economy narrowed. A few enormous enterprises dominated, and the innovation that diversity had produced was captured by the platforms rather than distributed to the communities that generated it. The digital equivalents of the butcher, the shopkeeper, and the old women on stoops — the small operators who had kept the web populated and various — were displaced by algorithmic feeds and platform-mediated interactions that were more efficient by every metric the planners used and less vital by every metric that actually mattered.
The data confirms the consolidation. By 2024, approximately sixty-five percent of all web traffic flowed through properties owned by just five companies. The share of online retail controlled by Amazon exceeded forty percent in the United States. More than eighty percent of social media engagement occurred on platforms owned by Meta. The long tail of small, independent digital enterprises — the equivalent of Jacobs's mixed-use blocks — had not disappeared entirely, but it had been pushed to the margins, starved of the traffic, the attention, and the economic oxygen that a diverse digital ecosystem requires to sustain itself.
The parallel to urban renewal is not metaphorical. It is structural. The same logic that led Moses to demolish neighborhoods — the conviction that centralized, planned order is superior to distributed, organic order — led the platforms to absorb the open web. The same metrics that made the highway look like progress — traffic throughput, speed, efficiency — made the platform look like progress. And the same blindness that prevented the planners from seeing what they were destroying — the intricate, functional, irreplaceable social order of the mixed-use neighborhood — prevented the platform builders from seeing what they were destroying: the diverse, innovative, self-sustaining ecology of the open digital economy.
This is where the AI moment enters the story, and this is where the analogy becomes either a cause for hope or a deeper cause for alarm.
Segal describes in *The Orange Pill* what he calls the collapse of the imagination-to-artifact ratio — the distance between a human idea and its realization. When that distance was large, only the well-resourced could build. When it collapsed to the width of a conversation with an AI tool, anyone with an idea and the will to pursue it could make something real. Segal celebrates this as the democratization of capability. He is right to celebrate it. But the celebration requires a crucial qualification that Jacobs's framework provides.
Democratization of capability is not the same as diversification of the economy. The two can coincide, but they need not. If every newly empowered builder uses the same tools, trained on the same data, accessed through the same platforms, and producing outputs that converge on the same aesthetic and functional norms, then the democratization has produced not diversity but a more widely distributed monoculture. More people are building, but they are all building with the same materials, in the same style, on the same street — and the result is the digital equivalent of a housing project: uniform, efficient, and lifeless.
Jacobs understood this distinction with a precision that most economists still miss. Diversity is not a description of how many people are participating. It is a description of how many different kinds of activity are occurring, how many different needs are being served, how many different approaches are being tried, and how much of the resulting innovation is retained locally rather than extracted by a centralized authority. A thousand builders all using the same AI to produce the same kind of output is not diversity. It is mass production with a distributed workforce. A hundred builders each solving a different problem, for a different community, with a different approach, retaining the value they create within the community they serve — that is diversity. That is what produces vitality.
The question this book addresses is not whether AI expands who gets to build. It plainly does, and the expansion is significant. The question is whether the institutional conditions exist to sustain the diversity that the expansion makes possible — or whether the same centralizing forces that absorbed the open web will absorb the AI-enabled building boom before the new digital neighborhoods can take root.
Jacobs would not have answered this question with optimism or pessimism. She would have answered it with observation. She would have looked at the actual conditions under which AI-enabled building is occurring and asked whether those conditions favor diversity or monoculture. She would have looked at who captures the value that the new builders create. She would have looked at whether the builders form communities — professional neighborhoods with their own standards, their own knowledge exchange, their own mechanisms for maintaining quality — or whether they build in isolation, each one alone with a tool, producing output that flows upward to the platform rather than outward to a community.
Those observations are what this book attempts to provide. The framework is Jacobs's: import replacement as the engine of economic growth, diversity as the measure of economic health, the sidewalk ballet as the mechanism of knowledge exchange, and the neighborhood as the unit of analysis. The subject is the digital economy in the age of AI. The argument, stated plainly, is that AI is either the most powerful tool for restoring the vitality of the digital economy since the personal computer — or it is the final instrument of the consolidation that killed it.
The outcome depends not on the technology. It depends on the structures that human beings build around it. The dams, the norms, the communities, the institutions that determine whether the gains from AI capability are distributed to the diverse builders who create value or captured by the concentrated providers who supply the tools.
Jacobs won her fight against Moses. The Lower Manhattan Expressway was never built. The neighborhoods survived. They survived not because the planning logic was refuted in the abstract but because the people who lived there organized, resisted, and built the political structures that prevented the highway from being built. The vitality of those neighborhoods — SoHo, which became an art district; Little Italy, which sustained its commercial ecology for another generation; the Village itself, which remains one of the most economically diverse and socially vital neighborhoods in New York — was preserved not by the invisible hand of the market but by the visible hands of people who understood what they had and fought to keep it.
The digital economy needs the same fight. It needs people who can see what the platforms are absorbing, what the consolidation is destroying, and what conditions must be maintained for the new wave of AI-enabled building to produce genuine diversity rather than a more widely distributed monoculture. This book is an attempt to provide the framework for that fight — not from a helicopter, but from a stoop.
---
The conventional story of economic growth goes like this: a region specializes in what it does best, exports its products to the wider market, and uses the revenue to import what it needs from other regions that have their own specializations. This is the theory of comparative advantage, and it has dominated economic thinking since David Ricardo formalized it in 1817. It is elegant. It is logical. It is also, as a description of how economies actually grow, almost entirely wrong.
Jacobs demonstrated the error in *The Economy of Cities* with her characteristic method: she looked at what actually happened, rather than what the theory said should happen. What she found was that the cities and regions that grew most robustly over long periods did not grow by exporting more efficiently. They grew by replacing their imports with locally produced substitutes. She called this process import replacement, and she argued that it was the single most important mechanism of economic development — more important than export expansion, more important than capital investment, more important than any policy intervention a government could devise.
The logic of import replacement is straightforward. A city imports bicycles from abroad. A local entrepreneur, seeing the demand, begins manufacturing bicycles locally. The city now has a bicycle factory. The factory needs parts — tires, chains, gears, seats. Some of these are imported. Another entrepreneur begins manufacturing tires locally. Now the city has a tire factory as well as a bicycle factory. The tire factory needs rubber, machinery, skilled labor. Each need is an opportunity for another local substitution. Each substitution adds a new enterprise, a new set of skills, a new capability to the local economy. The economy diversifies. It becomes more resilient, because it is no longer dependent on a single export or a single import chain. It becomes more innovative, because the density of diverse enterprises creates the conditions for unexpected combinations — the tire manufacturer who discovers a new use for rubber, the bicycle factory that develops a new gear mechanism, the machine shop that serves both and develops capabilities that neither could have predicted.
This process — gradual, messy, impossible to plan from above — is how Tokyo grew from a provincial city importing American bicycles in the 1870s to the industrial powerhouse that exported automobiles to the world a century later. It is how Manchester grew from a market town importing finished cloth to the textile capital of the industrial revolution. It is how every city that has achieved durable economic vitality has achieved it: not through the grand plan but through the accumulation of thousands of small acts of local substitution, each one adding a new node to the economy, each one making the next substitution possible.
The digital economy has, for the past two decades, operated in the opposite direction. Instead of import replacement, it has produced import dependence. Instead of local entrepreneurs creating local solutions, it has produced a universal dependence on a small number of large platforms that supply standardized tools to a global market.
Consider the situation of a marketing manager at a mid-sized company in 2020. She needs an analytics dashboard that tracks the specific metrics her team uses, in the specific format her reports require, with the specific integrations her workflow demands. The commercial options are Salesforce, HubSpot, Google Analytics, Tableau — large platforms designed for the mass market, optimized for the needs of the median user rather than the needs of any particular user. She can customize these tools, within limits. She can request modifications from her engineering team, and wait six months for a backlog slot. Or she can accept the tool as it is, adjust her workflow to match the tool's assumptions, and live with the gap between what she needs and what the platform provides.
In Jacobs's terms, she is importing. She is consuming a product made elsewhere, designed for someone else's needs, controlled by someone else's priorities. The value she creates using the tool flows partly to her and partly to the platform. The gap between the tool's generic design and her specific need is a form of economic leakage — a space where value that could be created locally is instead lost to the standardization required by the platform's business model.
Now consider what happened when AI tools made it possible for this same marketing manager to build her own analytics dashboard. Segal describes this in *The Orange Pill* — practitioners who had never written code creating functional tools through conversation with AI. The marketing manager describes what she needs in plain language. The AI produces a working prototype. She tests it, adjusts it, refines it through further conversation. Within days, she has a tool that does exactly what she needs, integrated with her specific systems, formatted for her specific reports, tracking her specific metrics.
Jacobs would recognize this immediately. This is import replacement. The marketing manager has replaced an imported product — the commercial analytics platform — with a locally produced substitute that is better suited to her specific needs. She has not just saved money or time. She has added a new capability to her local economic ecosystem. She understands her own analytics tool in a way she never understood the commercial platform. She can modify it when her needs change. She is no longer dependent on the platform vendor's roadmap, the engineering team's backlog, or the compromises inherent in a tool designed for the mass market.
But the significance extends beyond the individual act. Each import replacement creates the conditions for further import replacement. The marketing manager who has built one tool now understands, in practical terms, what building requires. She sees other imports she could replace — the reporting template, the lead-scoring algorithm, the customer segmentation model. Each replacement adds capability. Each one generates knowledge. Each one makes the next one easier. The process compounds, exactly as it compounds in Jacobs's account of urban economic development.
The teacher who builds educational tools tailored to her specific students is engaged in the same process. She has been importing educational software — products designed by companies that have never met her students, that cannot account for the specific ways her third-graders struggle with fractions or the particular sequence of concepts that works in her classroom. The commercial products are adequate. They are never precise. The gap between adequacy and precision is the space in which import replacement occurs.
When the teacher builds her own tool — a math practice application that uses the specific examples her students respond to, that adjusts difficulty based on the patterns she has observed over years of teaching, that presents concepts in the sequence she has found most effective — she is not merely customizing a product. She is creating one. She is a local entrepreneur in the digital economy, solving a local problem with local knowledge that no commercial vendor possesses.
The architect who builds structural analysis tools suited to his scale of practice follows the same pattern. The commercial structural analysis software — ETABS, SAP2000, RISA — is designed for large engineering firms working on large projects. The architect who designs residential additions and small commercial buildings does not need the full capability of these platforms. He needs something simpler, more specific, better suited to the codes and materials and typical configurations of his practice. He has been importing a product designed for someone else and adjusting his workflow to fit it. Now he can build what he actually needs.
Each of these stories is a single act of import replacement. Taken individually, each one is small — a dashboard, a math app, an analysis tool. But Jacobs's central insight about import replacement is that its significance is never in the individual act. It is in the aggregate effect. When thousands of practitioners across thousands of domains each replace a single import with a locally created substitute, the economy diversifies at a rate that no policy intervention or venture investment could achieve. The diversity is organic. It is driven by actual needs rather than market projections. It is responsive because the builders are the users. And it is resilient because no single failure — no single tool breaking, no single builder quitting — can collapse the ecosystem.
There is, however, a structural problem that Jacobs would identify immediately, and it is the problem that separates genuine import replacement from what might be called dependent building.
Genuine import replacement reduces dependency. The city that manufactures its own bicycles is less dependent on the foreign bicycle supplier. The marketing manager who builds her own dashboard is less dependent on Salesforce. The teacher who builds her own educational tools is less dependent on commercial ed-tech vendors. The replacement creates independence — not total independence, because every economy depends on imports for some things, but the specific, partial independence that diversity provides.
But the AI-enabled builder is replacing one import with a different kind of dependency. She no longer depends on Salesforce for her analytics dashboard. She now depends on Anthropic for the AI tool that enabled her to build it. She has replaced a dependency on a software vendor with a dependency on an AI provider. And the AI provider — unlike the local bicycle manufacturer in Jacobs's account — is not a local enterprise that adds to the diversity of the local economy. It is one of a handful of enormous companies, funded by catastrophic money, operating at a global scale, and extracting value from every act of building that occurs on its platform.
This is the structural tension that runs through every chapter of this book. AI enables import replacement at a scale and speed that Jacobs could not have imagined. But the import replacement occurs within, and ultimately depends upon, a supply chain that is more concentrated than any previous digital infrastructure. The builders are diversifying. The tools they build with are consolidating. Whether the diversification outpaces the consolidation — whether the new builders can retain enough independence and generate enough local economic value to sustain genuine diversity — is the question on which the vitality of the digital economy depends.
Jacobs would not have despaired at this tension. She would have observed it with the same cool empiricism she brought to watching her street. She would have noted that the early stages of import replacement always involve dependency on imported inputs — the bicycle factory imports its rubber, the tire factory imports its machinery — and that the economy grows precisely through the gradual replacement of those imported inputs with local substitutes. The question is whether the AI supply chain permits gradual replacement or whether it is structured to prevent it.
Open-source AI models, local fine-tuning, community-developed tools that reduce dependency on the major providers — these are the early stages of import replacement within the AI supply chain itself. They are small. They are marginal. They are exactly the kind of enterprise that Jacobs identified as the seedbed of economic vitality. Whether they survive and grow or are absorbed by the platforms is a question of institutional structure, not technological destiny.
The builder's independence is real. It is also conditional. The conditions are what the rest of this book examines.
---
The word "efficiency" appears in corporate strategy documents the way the word "progress" appeared in urban planning reports of the 1950s — as an unquestioned good, a metric so obviously desirable that to question it seems perverse. Efficient operations. Efficient allocation of resources. Efficient deployment of capital. The entire management science of the past half-century has been organized around the principle that efficiency is the primary measure of organizational health, and that any practice or structure that reduces efficiency is a cost to be eliminated.
Jacobs spent her career demonstrating that this principle, applied to cities, produced death. Not metaphorical death. Literal, observable, measurable economic and social death. The efficient city — the city of separated uses, uniform buildings, wide roads, and unimpeded traffic flow — was a city in which shops closed, streets emptied, crime rose, and the tax base eroded until the municipality could no longer fund basic services. The inefficient city — the city of mixed uses, varied building stock, narrow streets, dense pedestrian traffic, and a thousand small enterprises serving a thousand different needs — was the city that survived, adapted, and grew.
The distinction Jacobs drew was not between chaos and order. It was between two kinds of order. The planner's order — rational, legible, top-down — produced systems that looked organized and functioned poorly. The neighborhood's order — organic, illegible from above, bottom-up — produced systems that looked messy and functioned brilliantly. The planner could not see the neighborhood's order because it operated at a granularity below the resolution of the planner's map. The planner's map showed land use, traffic patterns, building age, and population density. It did not show the butcher who knew which customers were struggling, the shopkeeper who watched the children, the bar that kept the street safe after midnight by keeping it populated.
The same distinction now operates in the digital economy, and the AI moment has made it more consequential than at any previous point in the history of technology.
Jacobs identified four conditions that generate urban diversity. Each one has a precise digital equivalent, and each one is under simultaneous threat and potential reinforcement from AI.
The first condition is mixed primary uses. A neighborhood must serve more than one primary function. It must have residents and workers and visitors, all present at different times and for different reasons. A neighborhood that is purely residential is empty during working hours. A neighborhood that is purely commercial is empty at night. Only a neighborhood that combines uses keeps its sidewalks populated across the full cycle of the day, and only populated sidewalks support the diverse small enterprises — the deli, the laundry, the bar, the bookshop — that give the neighborhood its texture and its economic resilience.
The digital equivalent of mixed primary uses is the integration of functions within a single practitioner or small team. The engineer who also designs user interfaces. The designer who also writes functional code. The marketing manager who also builds analytics tools. The teacher who also creates educational software. AI is collapsing the boundaries between professional functions at precisely the pace Segal describes in *The Orange Pill* — the backend engineer who starts building interfaces because the tool makes it possible, the designer who starts writing features because the translation cost has disappeared.
This collapse of functional boundaries is the digital equivalent of mixed-use zoning. It produces the same benefits Jacobs identified in physical neighborhoods: denser activity, more diverse output, more opportunities for unexpected combinations. The engineer who also designs produces work that reflects both engineering constraints and design sensibility — a combination that neither a pure engineer nor a pure designer would produce. The teacher who also builds software creates tools that reflect the specific, granular knowledge of classroom practice that no commercial software developer possesses.
The second condition is short blocks. Jacobs observed that long blocks — the kind produced by superblock planning — reduced pedestrian traffic by forcing everyone onto the same routes. Short blocks created more intersections, more route options, more opportunities for people from different streets to encounter each other. The encounters were the point. Each intersection was a node where the streams of neighborhood life crossed, and each crossing was an opportunity for the casual exchange of information, commerce, and social connection that sustained the neighborhood's vitality.
The digital equivalent of short blocks is rapid feedback loops between builder and user. In the pre-AI software economy, the feedback loop between a practitioner's need and a tool's response was measured in months or years. The practitioner submitted a feature request to a commercial vendor. The vendor evaluated the request against its roadmap. If the request aligned with the vendor's strategic priorities and served a sufficient portion of the customer base, it might be implemented in a future release. The block was long — very long — and the practitioner was forced onto the vendor's route.
AI tools have shortened the block to minutes. The practitioner describes a need. The tool responds. The practitioner tests, adjusts, redescribes. Each cycle is an intersection — a point where the builder's intention meets the tool's capability and produces something that neither anticipated. The speed of this cycle is not merely convenient. It is structurally transformative. It changes the kind of building that is possible, just as short blocks change the kind of neighborhood that is possible, by increasing the density of encounters between need and response.
The third condition is buildings of varying age. Jacobs made the counterintuitive argument that old buildings — not necessarily beautiful or historically significant buildings, just old, cheap, unremarkable buildings — were essential to economic diversity because they provided low-cost space where new enterprises could start. New buildings are expensive. Their rents must be high to cover the cost of construction. Only established, profitable enterprises can afford them. The new restaurant, the experimental gallery, the startup workshop — these need old buildings, with their low rents and flexible layouts and forgiving landlords. A neighborhood of all new buildings is a neighborhood that only the already-successful can afford, which means it is a neighborhood without the experimentation from which future success emerges.
The digital equivalent of buildings of varying age is the availability of tools and platforms at different levels of cost, complexity, and commitment. A healthy digital economy needs the enterprise platforms — the Salesforces and the Workdays — that serve large organizations with complex needs. But it also needs the equivalent of cheap storefronts: low-cost, low-commitment tools that allow individual practitioners to experiment without the pressure of enterprise-scale investment. AI tools, at roughly one hundred dollars per month for the most capable individual plans, are currently serving as old buildings in the digital economy. They provide the low-cost creative space where a marketing manager can experiment with analytics, a teacher can prototype educational software, an architect can test structural analysis tools — all without the capital commitment, the specialized training, or the institutional backing that building software previously required.
The fourth condition is sufficient density of people. Jacobs argued that neighborhoods need enough people, per acre, to support the diverse enterprises that produce vitality. Low-density neighborhoods cannot support the deli, the bookshop, the bar, and the laundry simultaneously, because each enterprise needs a minimum customer base. Only when the density is sufficient can the full ecology of small enterprises sustain itself.
The digital equivalent is the density of practitioner activity — enough builders, working in enough proximity (physical or virtual), that the community sustains shared knowledge, shared standards, and the casual exchange that produces innovation. A single marketing manager building a dashboard in isolation is an act of import replacement, but it is not yet a contribution to a digital neighborhood. A hundred marketing managers, sharing their approaches, comparing their tools, evaluating each other's work, and developing shared norms for quality — that is a neighborhood. That is the density that sustains vitality.
The threat AI poses to each of these conditions is real and specific. Mixed uses are threatened when AI is deployed to deepen specialization rather than enable integration — when organizations use AI to make engineers more efficient at engineering rather than to enable engineers to also design. Short blocks are threatened when AI platforms impose their own feedback cadences — when the model's training cycle, the platform's update schedule, and the vendor's commercial priorities determine the rhythm of the builder's work rather than the builder's own needs. Old buildings are threatened when the AI providers raise prices, restrict access, or change terms in ways that eliminate the low-cost experimentation that current pricing enables. Density is threatened when AI-enabled solitary production replaces the community interactions that previously forced practitioners into proximity.
But the reinforcement AI provides to each condition is equally real and equally specific. AI enables mixed use by collapsing the translation cost between domains. It creates short blocks by providing instant feedback. It provides old buildings through accessible pricing. And it can increase density if the builders who use it form communities rather than working in isolation.
The outcome — diversity or monoculture, vitality or stagnation — depends on which set of forces prevails. The technology does not determine the outcome. The institutional conditions do. The norms, the communities, the pricing structures, the governance frameworks, the cultural expectations about how AI should be used and who should benefit from its use — these are the zoning codes of the digital economy. Whether they favor mixed use or separated use, short blocks or long blocks, old buildings or new, sufficient density or suburban sprawl — these are the questions that will determine whether the AI moment produces a vital digital economy or a more efficiently organized dead one.
---
Jacobs began her most famous passage with a time stamp. Under the heading "The Uses of Sidewalks: Safety," she described an ordinary morning on Hudson Street in Greenwich Village.
The passage is well known because it is well observed. Jacobs describes herself watching from her second-floor window as the morning unfolds. The hardware store owner sets out his display. The bodega opens. Children leave for school. A woman pauses to talk to a man on the stoop. The locksmith arrives. The grocer arranges his produce. Each person is doing something different, for a different reason, at a different pace — and the cumulative effect is a street that is safe, known, and alive. Not because anyone planned it. Not because a community board organized it. Because the mix of uses, the density of activity, and the ordinary presence of people going about their different business at different times created the conditions for what Jacobs called "an intricate ballet in which the individual dancers and ensembles all have distinctive parts which miraculously reinforce each other and compose an orderly whole."
The miracle, Jacobs insisted, was not miraculous. It was functional. The sidewalk ballet worked because the conditions were right: enough different uses to keep the street populated at all hours, enough density to ensure that someone was always watching, enough variety of enterprises that no single schedule dominated, and enough casual contact between regulars to maintain the social knowledge — who belongs here, who does not, who needs help, who is trouble — that kept the street safe without police and connected without organizers.
The sidewalk ballet was not leisure. It was not recreation. It was the mechanism through which the neighborhood sustained itself. The casual exchanges — the nod, the pause, the three-minute conversation outside the bodega, the shopkeeper who noticed an unfamiliar van and mentioned it to the barber — were the infrastructure of community life. They performed functions that no formal institution could replicate: they circulated local knowledge, maintained social norms, built trust incrementally through repeated low-stakes contact, and created the conditions in which cooperation was natural rather than mandated.
Every profession has its own sidewalk ballet, and the ballet works in the same way.
Consider the software development community before AI tools changed the nature of building. A working engineer in a mid-sized company participated in a dense, ongoing, largely informal exchange of knowledge with colleagues. The code review was the most visible form — a structured process in which one engineer examined another's work and offered critique. But the code review was the formal layer over a much richer informal economy. There was the hallway conversation where an engineer mentioned that she had spent two days on a caching problem and found an approach that worked. There was the lunch where a junior developer described a bug he could not solve and a senior colleague recognized the pattern from a different project three years earlier. There was the Slack channel where someone posted a link to a library that solved a common problem in a way no one on the team had tried. There was the conference bar where an engineer from a different company described a failure that another engineer's team was about to walk into.
None of these exchanges were planned. None of them appeared on anyone's calendar or performance review. They were the sidewalk ballet of professional life — the casual, seemingly random, actually deeply functional circulation of knowledge that kept the community's collective intelligence current and its standards maintained.
The knowledge that circulated through this ballet was a specific kind of knowledge: tacit, contextual, perishable, and irreducible to documentation. A senior engineer's sense that a particular architectural approach would cause problems at scale — a sense built from years of observing similar approaches fail in specific ways under specific conditions — could not be written into a style guide or encoded in a linting rule. It lived in the person. It transferred through conversation. It reached the people who needed it through the sidewalk ballet of professional proximity.
The Berkeley researchers whose work I cite in *The Orange Pill* documented something that Jacobs would have predicted with certainty. When AI tools accelerated individual work, the pauses in which casual exchange had previously occurred were colonized by more work. The researchers called this "task seepage" — the tendency of AI-accelerated work to fill every available gap in the working day. The gap between meetings. The walk to the coffee machine. The five minutes before a call. Each of these had served, informally and invisibly, as a node in the sidewalk ballet — a moment when a casual exchange might occur, when a piece of tacit knowledge might transfer, when a norm might be reinforced through the ordinary social pressure of colleagues observing each other's work.
When the gaps filled with AI-mediated production, the ballet lost its dancers. Not because anyone decided to stop the exchanges, but because the conditions that produced them — the pauses, the proximity, the small frictions of collaborative work that forced people into contact — were optimized away.
The teacher who builds her own educational tools with AI is engaged in an act of creation that Jacobs would celebrate. She is a local entrepreneur, solving a local problem, adding diversity to the digital economy. But if the act of building replaces the conversation she used to have with the teacher across the hall — the conversation where she described a student's struggle and her colleague suggested an approach from a different pedagogical tradition — then something has been lost that the tool cannot replace. The tool can produce educational software. The tool cannot produce the specific, contextual, experienced judgment of a colleague who teaches in the same school, knows the same students, and has spent twenty years developing intuitions about how children learn that no training data contains.
This is not an argument against the tool. It is an argument for maintaining the conditions under which the tool produces vitality rather than isolation.
Jacobs would have recognized the pattern immediately, because she had seen it before. When the automobile replaced walking as the primary mode of urban transportation, the sidewalk ballet deteriorated. Not because cars were bad, but because cars eliminated the conditions that produced the ballet: the density of pedestrian traffic, the casual encounters at intersections, the slow pace that allowed for three-minute conversations, the necessity of being physically present on the street. Cars moved people faster. They also moved people through neighborhoods rather than in them, and the neighborhoods that lost their pedestrians lost their ballet, and the neighborhoods that lost their ballet lost their safety, their social cohesion, and eventually their economic vitality.
AI is doing to professional communities what the automobile did to urban neighborhoods. It moves people faster. It enables them to produce more, reach further, build things that were previously beyond their capability. But it also enables them to bypass the casual, inefficient, apparently unproductive encounters through which professional knowledge circulates, standards are maintained, and collective intelligence is sustained.
The developer who uses Claude Code to solve a problem in minutes, without consulting a colleague, has gained speed and lost contact. The contact was not merely social. It was epistemological. It was the mechanism through which the community's knowledge stayed current, its blind spots were exposed, and its standards were enforced through the gentle, constant pressure of peers observing each other's work.
Jacobs's prescription for the physical sidewalk was not to ban cars. It was to maintain the conditions for pedestrian life: mixed uses that gave people reasons to be on the street, sufficient density to keep the street populated, and short blocks that created intersections where encounters could occur. The prescription for the professional sidewalk is analogous. The point is not to ban AI tools. It is to maintain the conditions for casual professional contact: structured collaborative time that is protected from AI-mediated solitary production, physical or virtual spaces where practitioners encounter each other outside the context of formal work, and the deliberate preservation of the pauses and frictions that produce the encounters from which collective knowledge emerges.
The organizations that do this well will sustain their collective intelligence. The organizations that optimize every pause out of the working day — that convert every gap into an opportunity for AI-mediated production — will find, over months and years, that their collective knowledge has atrophied. Not because anyone made a decision to let it atrophy. Because the conditions that sustained it were quietly, incrementally, efficiently eliminated.
The sidewalk ballet of Hudson Street did not survive because someone decided it should survive. It survived because the conditions that produced it were maintained — sometimes through deliberate policy, more often through the organic resistance of a community that valued its own way of life enough to fight for the structures that sustained it. The professional sidewalk ballet of the AI era will survive only if the communities that depend on it recognize what they stand to lose and build the structures — the norms, the protected time, the deliberate inefficiencies — that keep the ballet alive.
Efficiency says eliminate the pause. Vitality says the pause is the point. Jacobs understood this about cities fifty years before it became true of professions. The question is whether the professions will learn it in time.
Detroit in 1950 was the richest city in America per capita. Its single industry — automobile manufacturing — had produced wages, tax revenues, and civic infrastructure that were the envy of every other metropolitan area in the country. The Big Three automakers employed hundreds of thousands of workers directly and supported a vast ecosystem of suppliers, dealers, and service providers. The city's population had grown from less than three hundred thousand at the turn of the century to nearly two million. By every metric the economists used, Detroit was thriving.
Jacobs looked at Detroit and saw a city dying.
What she saw was monoculture — an economy so dominated by a single industry that the diversity from which genuine economic vitality springs had been systematically suppressed. The automobile companies were so large, so profitable, and so dominant that they absorbed the talent, the capital, and the civic attention that a diverse economy requires. Young entrepreneurs did not start restaurants or machine shops or experimental ventures. They went to work for Ford or General Motors or Chrysler, because the wages were better, the benefits were better, and the prestige was incomparable. The small enterprises that Jacobs identified as the seedbed of economic innovation — the marginal firms, the odd shops, the experimental ventures that nobody expected to succeed — could not compete for talent, for space, or for the attention of a city whose identity was inseparable from a single product.
When the automobile industry contracted, Detroit had nothing to fall back on. The decline, when it came, was not gradual. It was catastrophic. The population fell by more than half. Entire neighborhoods were abandoned. The tax base collapsed. Services deteriorated. Crime rose. The city that had been the richest in America became a national symbol of urban failure — not because its people were less capable or less industrious than the people of other cities, but because the economic structure that had produced its wealth was catastrophically fragile. A single industry, no matter how profitable, is a single point of failure. Diversity is not a luxury. It is the structural precondition for survival.
The digital economy of 2025 has its own Detroits, and AI is producing the conditions for more of them.
The monoculture risk in AI is not the risk of a single industry dominating a single city. It is the risk of a single tool — or a small number of functionally similar tools — mediating the creative and productive output of an entire economy. When the marketing manager, the teacher, the architect, the developer, the writer, the designer, the analyst, and the strategist all use the same AI system to produce their work, the diversity of their outputs converges. Not toward the best possible output in each domain, but toward the outputs that the AI system is best equipped to produce — which is to say, toward the patterns most heavily represented in the data on which the system was trained.
This convergence operates at multiple levels, each one reinforcing the others.
At the level of style, AI-mediated work develops a recognizable aesthetic. Prose produced through large language models tends toward a specific register: clear, well-structured, slightly impersonal, with a preference for parallel constructions and a tendency to resolve ambiguity rather than sustain it. Visual design produced through image generation models converges on a specific look — glossy, high-contrast, technically accomplished, and curiously similar across applications. Code produced through AI assistants follows the patterns most common in the training data, which means it follows the conventions of the largest codebases, which means it converges on the approaches favored by the largest companies. Each of these convergences is individually minor. Collectively, they represent a narrowing of the expressive range of an entire economy's creative output.
At the level of approach, AI-mediated work tends toward the solutions the model already knows. A developer who describes a problem to an AI assistant will receive a solution drawn from the patterns in the training data. If the most common approach to a given problem is approach A, the model will suggest approach A — not because approach A is the best solution, but because approach A is the most statistically likely completion given the training distribution. The developer may accept approach A without considering approaches B, C, or D, not because she lacks the capability to evaluate alternatives, but because the path of least resistance leads to the model's suggestion, and the model's suggestion feels authoritative because it arrives with confidence and in well-formed prose.
Jacobs observed exactly this dynamic in cities dominated by a single large employer. The employer's approach — to manufacturing, to management, to technology — became the city's approach. Not because alternative approaches were forbidden, but because the dominant enterprise set the terms of professional culture so thoroughly that alternatives became invisible. The young engineer in Detroit did not consider alternative manufacturing methods because the methods used by Ford were the only methods she encountered, the only methods her mentors knew, the only methods the local professional culture recognized as legitimate. The monoculture was self-reinforcing. The dominance of a single approach suppressed the visibility of alternatives, and the invisibility of alternatives reinforced the dominance.
AI monoculture operates through the same mechanism. When a tool suggests an approach, the suggestion becomes the default. The default is adopted. The adoption reinforces the default. The data generated by the adoption enters the training pipeline. The next version of the model produces the default with even greater confidence. The cycle tightens. The diversity of approaches narrows. The creative economy converges on the patterns the model knows best, and the patterns the model knows best are the patterns the economy has already converged on.
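The tightening cycle described above is, mechanically, a drift process, and a toy simulation makes the dynamic concrete. Nothing here models any real AI system; it is a minimal sketch under loud assumptions: a hypothetical "model" suggests approaches in proportion to their share of prior adoptions, and each round's adoptions become the next round's training data. All names and parameters are invented for illustration.

```python
import random
from collections import Counter

def simulate_convergence(n_approaches=10, n_adoptions=200, rounds=40, seed=0):
    """Toy sketch of the feedback loop: the 'model' suggests each approach
    in proportion to its share of prior adoptions, and each round's
    adoptions become the next round's training data. Returns, per round,
    how many approaches still hold at least 5% of adoptions."""
    rng = random.Random(seed)
    # Start with every approach equally represented in the "training data".
    weights = [1.0] * n_approaches
    history = []
    for _ in range(rounds):
        total = sum(weights)
        probs = [w / total for w in weights]
        # Practitioners adopt whatever the model is most likely to suggest.
        picks = rng.choices(range(n_approaches), weights=probs, k=n_adoptions)
        counts = Counter(picks)
        # Adopted work re-enters the training pipeline as-is.
        weights = [counts.get(i, 0) for i in range(n_approaches)]
        # Count approaches that still hold at least 5% of adoptions.
        surviving = sum(1 for c in counts.values() if c / n_adoptions >= 0.05)
        history.append(surviving)
    return history

# Prints the count of surviving approaches round by round; under pure
# drift, with no force favoring any approach, the count tends to fall.
print(simulate_convergence())
```

The point of the sketch is that nothing in the loop prefers any approach on the merits. The narrowing is produced entirely by the feedback structure: whatever is common becomes more suggested, and whatever is suggested becomes more common.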
I raise this concern in *The Orange Pill* through my engagement with Byung-Chul Han's critique of smoothness — the observation that when all output passes through the same optimization process, the texture that distinguishes one practitioner's work from another's is polished away. Jacobs's framework sharpens the concern by providing its economic mechanism. The issue is not merely aesthetic. A monoculture is not merely ugly. A monoculture is structurally fragile. An economy in which all builders use the same approaches is an economy in which all builders share the same blind spots, the same vulnerabilities, the same failure modes. When the approach fails — when the pattern the model favors turns out to be wrong, or when conditions change in ways the training data does not represent — the entire economy fails in the same way, at the same time, for the same reason.
This is not hypothetical. The history of technology is dense with examples of monoculture failure. The Irish potato famine was, at its root, a monoculture failure — a population dependent on a single crop variety that was uniformly vulnerable to a single pathogen. The 2008 financial crisis was a monoculture failure — a financial system in which the same risk models, trained on the same data, using the same assumptions, were deployed by every major institution, producing a system in which everyone was exposed to the same risk and no one could see it because no one was using a different model. The technical failures that periodically cascade through cloud computing infrastructure — when a single misconfiguration at Amazon Web Services takes down thousands of unrelated services simultaneously — are monoculture failures. The dependencies are shared. The vulnerabilities are shared. The failure, when it comes, is shared.
AI monoculture does not need to produce a single catastrophic failure to cause damage. The more insidious effect is the slow suppression of the experimental margin — the space at the edges of the economy where unusual approaches are tried, where failures are productive, where the diversity of methods that sustains long-term innovation is maintained. Jacobs called this the self-destruction of diversity, and she identified it as the most dangerous process in urban economics. A successful neighborhood attracts investment. Investment raises rents. Rising rents displace the low-margin enterprises — the experimental gallery, the startup workshop, the eccentric bookshop — that gave the neighborhood its innovative character. The neighborhood becomes more profitable and less diverse. The profitability is visible. The loss of diversity is invisible until the neighborhood stagnates, and by then the enterprises that would have prevented stagnation are gone.
The AI economy follows the same trajectory. As AI tools become more capable, they attract more users. As more users adopt the same tools, the tools' suggested approaches become more dominant. As the approaches become more dominant, the practitioners who use different approaches — who work by hand, or use different tools, or develop methods outside the model's training distribution — find themselves increasingly marginal. Their work takes longer. Their outputs look different from the norm. Their clients or employers, accustomed to the speed and polish of AI-mediated work, may value the difference less. The marginal practitioner, like the marginal enterprise in Jacobs's neighborhood, is slowly displaced — not by competition in any traditional sense, but by the gravitational pull of a dominant approach that makes all other approaches look, by comparison, inefficient.
The defense against monoculture is the same in the digital economy as it is in cities: the deliberate cultivation and protection of diversity. This does not mean rejecting AI tools. It means using them in ways that preserve rather than suppress the variety of approaches, styles, and methods that a healthy economy requires. It means valuing the work that looks different from the AI-mediated norm, not as a nostalgic gesture but as an economic necessity. It means recognizing that the smooth, polished, statistically likely output is not the best output — it is the average output, and an economy that converges on its average is an economy that has stopped innovating.
Jacobs would have no patience for the argument that the market will sort this out. The market did not sort out Detroit. The market rewarded the monoculture until the monoculture collapsed. The market rewards the AI convergence now, because the convergence is efficient, and efficiency is what the market measures. The market does not measure the loss of the experimental margin until the margin is gone and the stagnation that follows is already entrenched.
The structures that protect diversity — the equivalent of mixed-use zoning, of rent stabilization for old buildings, of the deliberate preservation of the conditions Jacobs identified as necessary for vitality — must be built by human decision, not left to market selection. In the digital economy, these structures include the open-source AI projects that provide alternatives to the dominant models, the professional communities that maintain standards independent of any single tool, the educational institutions that teach multiple approaches rather than optimizing for proficiency with a single platform, and the organizational norms that protect time for experimentation outside the AI-mediated workflow.
Detroit did not have to die. It died because no one built the structures that would have diversified its economy before the single industry that sustained it declined. The digital economy does not have to converge. It will converge unless someone builds the structures that sustain diversity before the monoculture becomes self-reinforcing.
The structures are needed now. The convergence is already underway.
---
The most important buildings in any city are not the newest ones.
Jacobs made this argument in *The Death and Life of Great American Cities* with characteristic precision, and it is one of her arguments that people most frequently misunderstand. She was not making a case for historic preservation, though historic preservation may be a byproduct. She was not arguing that old buildings are beautiful, though some are. She was making an economic argument: new ideas need old buildings because old buildings are cheap, and cheapness is the precondition for experimentation.
A new building must charge high rents. The cost of construction, the cost of financing, the cost of the land beneath it — all of these must be recovered through the rents the building charges, and the rents must be high enough to satisfy the investors who financed the construction. Only established, profitable enterprises can afford these rents. The law firm, the accounting firm, the bank, the successful restaurant chain — these can pay what new buildings charge. The experimental restaurant that may or may not work, the gallery that shows work nobody has heard of, the workshop that repairs things nobody else repairs, the community organization that runs on a shoestring — these need the building whose construction costs were paid off decades ago, whose owner is content with a modest rent, whose spaces are flexible enough to accommodate uses the original builder never imagined.
This is not sentimentality about old things. It is economics. The supply of old buildings is the supply of cheap space, and the supply of cheap space is the supply of economic possibility. When a city demolishes its old buildings — through urban renewal, through redevelopment, through the simple operation of a real estate market that favors new construction over maintenance — it does not merely change the skyline. It eliminates the economic habitat in which new enterprises start. The enterprises that would have started in the cheap spaces never start, and the innovations they would have produced never occur, and the city never misses them because you cannot miss what never existed.
AI tools are, at this moment, functioning as old buildings in the digital economy.
The comparison is precise, not metaphorical. A subscription to Claude Code costs roughly one hundred dollars per month at the individual level. For that price, a practitioner gains access to a capability that, five years ago, would have required hiring a team of developers, or purchasing enterprise software licenses costing thousands of dollars per month, or spending months learning to code. The barrier to building has been lowered to the cost of a modest utility bill. This is cheap space. This is the digital equivalent of the converted warehouse where a new enterprise can set up for a fraction of what a proper office would cost.
The marketing manager who built her analytics dashboard did not need venture capital. She did not need a technical co-founder. She did not need to convince an engineering team to prioritize her project. She needed a hundred dollars a month and the knowledge of what she wanted to build. The teacher who created educational tools for her students did not need a grant, a development team, or a partnership with an ed-tech company. She needed a hundred dollars a month and twenty years of classroom experience. The architect who built structural analysis tools did not need an enterprise license or a software engineering background. He needed a hundred dollars a month and a deep understanding of the specific calculations his practice requires.
Each of these practitioners found a cheap space in which to experiment. Each one produced something that might not work, that might need revision, that might be abandoned after a month — and that was fine, because the cost of the experiment was low enough that failure was affordable. This is exactly the economic function Jacobs identified in old buildings: they make failure affordable, and affordable failure is the precondition for innovation, because innovation requires trying things that might not work, and no one tries things that might not work when the cost of failure is catastrophic.
The explosion of AI-enabled building that I document in *The Orange Pill* — the millions of practitioners creating tools, prototypes, products, and experiments across every professional domain — is an explosion of experimentation made possible by cheap space. The old buildings are open for business. The rents are low. The tenants are arriving from every direction, with every conceivable idea, and the sheer volume and variety of what they are producing is precisely the kind of diversity that Jacobs identified as the engine of economic vitality.
But Jacobs would immediately ask a question that the celebrants of this moment tend to skip: Who owns the building?
An old building in a healthy city is typically owned by a local landlord — an individual or small company rooted in the neighborhood, with modest financial expectations and a willingness to accommodate the irregular tenants and unconventional uses that new enterprises often require. The relationship between landlord and tenant is personal, flexible, and governed by norms as much as by contract. The landlord who has known the neighborhood for decades understands that the experimental restaurant on the ground floor may fail, but the space will not stay empty for long, because the neighborhood has enough traffic and enough variety to attract the next tenant. The risk is distributed. The tolerance for failure is high.
The AI tools that currently function as old buildings are not owned by neighborhood landlords. They are owned by some of the largest and most highly capitalized companies in the history of commerce. Anthropic, OpenAI, Google, Meta, Microsoft — these are the landlords of the digital old buildings, and their relationship to their tenants is not the relationship of a neighborhood landlord to a local entrepreneur. It is the relationship of a real estate conglomerate to the occupants of a development it can reprice, redesign, or demolish at any time.
The current pricing of AI tools — the hundred dollars a month that makes experimentation affordable — is not a permanent feature of the landscape. It is a business decision, made by companies that are currently investing far more in capability than they are recovering in subscription revenue, sustained by venture capital and corporate balance sheets that are optimizing for market share rather than current profitability. When the investment phase ends and the monetization phase begins — when the companies that own the old buildings decide to charge rents that reflect the actual value of the space — the cheap space may cease to be cheap.
This has happened before in both the physical and digital economies, and the pattern is consistent. The artist lofts in SoHo were cheap because the neighborhood was unfashionable and the buildings were industrial surplus. The artists moved in. The art made the neighborhood desirable. The desirability attracted investment. The investment raised rents. The artists were displaced, and the galleries and studios that had given SoHo its character were replaced by luxury retail and high-end restaurants that could afford the new rents. The process that Jacobs described — the self-destruction of diversity through the success it produces — operated with textbook precision.
The early internet followed the same arc. The blogging platforms of the 2000s were the old buildings of the early web — low-cost, widely accessible, enabling experimentation by anyone with an idea and a connection. WordPress, Blogger, Tumblr, LiveJournal — each provided cheap space where writers, photographers, artists, and commentators could publish without institutional backing. The platforms were subsidized by venture capital, priced below cost to attract users, and governed by terms that favored the users' interests. Then the platforms matured, the business models shifted, and the terms changed. Algorithms replaced chronological feeds. Monetization requirements displaced casual publishing. Platform interests diverged from user interests. The old buildings were demolished, and the diverse, experimental, bottom-up publishing ecosystem they had supported was absorbed into the social media platforms that replaced them.
The question for AI tools is whether this cycle will repeat. The conditions for repetition are present: the tools are currently priced below cost, the pricing is sustained by investment capital rather than operating revenue, and the companies that own the tools have structural incentives to increase extraction as they mature. The conditions for a different outcome are also present: open-source AI models provide alternatives to the proprietary platforms, the cost of running AI models locally is declining, and the community of builders is large enough and technically sophisticated enough to resist lock-in if alternatives exist.
Jacobs's prescription was not to prevent old buildings from being demolished. Buildings have life cycles. Her prescription was to ensure that the supply of cheap space was continuously renewed — that as old buildings were demolished or renovated, new cheap spaces were created elsewhere, through the natural aging of the building stock, through zoning that permitted varied uses, through the organic processes that keep a city's spatial economy diverse. The digital equivalent is the continuous renewal of cheap creative space — through open-source tools, through competitive markets that prevent any single provider from capturing the entire supply, through institutional support for alternatives to the dominant platforms, and through the deliberate cultivation of the conditions that keep the cost of experimentation low.
The old buildings are open. The tenants are arriving. The experiments are underway. The question is whether the buildings will remain affordable long enough for the experiments to compound into the diverse, resilient, locally responsive digital economy that genuine vitality requires — or whether the landlords will raise the rents before the neighborhood can take root.
---
In *The Economy of Cities*, Jacobs describes a process she calls the transplant economy — the pattern in which large amounts of capital flow into a region not to support what is already growing there but to implant something foreign. A government builds a massive factory in a rural area. A development bank funds a hydroelectric dam that displaces thousands. A foreign corporation establishes a plantation that produces a single export crop. In each case, the capital arrives from outside, the project is designed from outside, the benefits flow to outside interests, and the local economy — the diverse, small-scale, self-sustaining economy that was already there — is overwhelmed.
Jacobs called this catastrophic money. Not because the money itself was malicious, but because its scale and velocity were incompatible with the organic processes through which healthy economies develop. Healthy economic development is gradual. It proceeds through the accumulation of small innovations, each building on the previous, each adding a new capability to the local ecosystem. Catastrophic money short-circuits this process. It arrives too fast, in quantities too large, directed by interests too remote from the local conditions, to produce the fine-grained, diverse, locally responsive development that genuine vitality requires.
The result of catastrophic money is typically a dual economy: a modern, well-capitalized sector that serves external markets and employs a fraction of the local population, surrounded by a neglected, undercapitalized local economy that has been starved of the attention, talent, and investment it needs to develop on its own terms. The modern sector looks like progress. It has impressive buildings, advanced technology, and high productivity. The local economy looks backward by comparison — small, slow, inefficient by every metric the development banks use. But the local economy, for all its apparent inadequacy, is the economy that serves the actual needs of the actual population. And when the modern sector contracts — when the commodity price drops, or the foreign corporation relocates, or the government project runs its course — the local economy is not strong enough to absorb the shock, because catastrophic money stunted its development at the moment when it was most needed.
The AI industry, as of 2026, is being shaped by catastrophic money at a scale that Jacobs could not have imagined but would have recognized instantly.
The numbers are public and staggering. In 2024 and 2025, investment in AI companies exceeded two hundred billion dollars, concentrated in fewer than a dozen firms. Microsoft invested thirteen billion dollars in OpenAI. Amazon invested four billion in Anthropic, then expanded its commitment. Google invested billions in its own AI efforts and in external startups. The venture capital ecosystem directed unprecedented sums toward AI companies at every stage, from seed to late growth, with a strong preference for companies building foundational models — the platforms on which all other AI activity depends.
This concentration of capital has produced exactly the dual economy that Jacobs described. The modern sector — the handful of large AI companies — is spectacularly well-capitalized, technically advanced, and growing at extraordinary speed. The local economy — the millions of small-scale builders, the professional communities, the independent tool-makers, the open-source projects — operates in the shadow of these giants, dependent on their infrastructure, vulnerable to their pricing decisions, and structurally unable to compete for the talent, the compute resources, and the institutional attention that the large companies absorb.
The mechanism through which catastrophic money suppresses local development is not usually hostile. It operates through opportunity cost. When billions of dollars flow to a handful of AI companies, the most talented engineers, researchers, and entrepreneurs follow the money. Not because they are mercenary, but because the resources available at the well-funded companies are incomparably greater than what any small-scale operation can offer. The researcher who wants to push the boundaries of what AI can do goes to the company that can afford the compute. The engineer who wants to work on frontier problems goes to the company that is defining the frontier. The entrepreneur who wants to build an AI-native business goes to the investors who are funding AI-native businesses, and those investors have strong preferences about scale, growth rate, and market approach that favor the patterns already established by the dominant companies.
The talent that flows to the large companies does not flow to the local economy. The independent tool-maker who might have built an alternative approach to AI-assisted design, or the open-source project that might have provided a competitive alternative to the proprietary models, or the professional community that might have developed domain-specific AI norms and standards — each of these lacks the resources that catastrophic money has redirected to the dominant firms. The local economy does not decline. It fails to develop. The innovations that would have occurred do not occur. The diversity that would have emerged does not emerge. The economy looks healthy because the modern sector is growing, but the growth is concentrated, fragile, and dependent on continued infusions of the same catastrophic money that produced it.
The SaaS Death Cross I analyze in *The Orange Pill* is one visible consequence of this concentration. As AI capability consolidates in a few providers, the value of the software layer — the diverse ecosystem of SaaS companies that previously served specialized needs — collapses. A trillion dollars of market value disappeared from software companies in the opening weeks of 2026. The market interpreted this as a repricing of code, and the market was partly right. But Jacobs's framework reveals a deeper structure: the Death Cross is the visible symptom of catastrophic money reorganizing the digital economy around a few dominant providers and away from the diverse, specialized, locally responsive ecosystem that the SaaS economy, for all its imperfections, had sustained.
The SaaS ecosystem was not perfectly diverse. It was dominated by large platforms — Salesforce, Workday, Adobe — that exhibited their own concentration tendencies. But it also contained thousands of smaller companies serving specialized needs: niche CRM providers for specific industries, educational tools for specific pedagogies, design platforms for specific workflows, analytics tools for specific domains. Each of these companies represented a node of diversity in the digital economy — a specialized solution, serving a specific need, maintained by people who understood the domain they served. The Death Cross is, in Jacobs's terms, the displacement of this diverse ecosystem by a more concentrated one. The small SaaS companies that served niche needs are being replaced not by other small companies, but by the general-purpose AI tools provided by the dominant firms. The replacement is more efficient. It is also less diverse. The distinction matters enormously for the long-term health of the digital economy.
The corrective to catastrophic money, in Jacobs's framework, is not the rejection of large-scale investment. Large investments produce genuine capabilities that small-scale operators cannot replicate. The frontier AI models that enable the building boom I describe required billions of dollars of investment in compute, data, and research. No local economy could have produced them. Jacobs did not argue that large-scale enterprise is inherently harmful. She argued that large-scale enterprise, when it monopolizes the conditions for economic development, stunts the diverse local economy that is the actual engine of long-term vitality.
The corrective is structural. It consists of institutions, policies, and community practices that redirect some portion of the value generated by the large-scale sector toward the diverse local economy that grows organically. In Jacobs's urban context, this meant zoning that preserved mixed uses, rent structures that maintained cheap space, and civic institutions that supported local enterprise. In the AI context, it means maintaining the supply of cheap creative space — through open-source models, through competitive pricing, through institutional support for alternatives to the dominant platforms. It means supporting the professional communities that develop domain-specific norms and standards independent of any single tool provider. It means ensuring that the knowledge generated by AI-enabled building circulates through communities rather than flowing exclusively to the platforms that capture it.
The question is whether these structures will be built. Catastrophic money is not patient. It demands returns at a pace that is incompatible with the gradual development of diverse local economies. The investors who funded the dominant AI companies expect growth, market dominance, and eventual monopoly-scale returns. These expectations are structurally incompatible with the kind of distributed, diverse, locally responsive digital economy that Jacobs's framework identifies as genuinely vital.
The tension is real. It is not resolvable through optimism or through technology. It is resolvable only through the deliberate construction of institutions that protect local economic development from the gravitational pull of concentrated capital. Jacobs understood this about cities. Whether the builders of the digital economy understand it about their own landscape is the question on which the long-term health of that economy depends.
---
Jacobs made an observation about urban safety that was, at the time, so counterintuitive that it was dismissed by the planning profession as naive. She argued that the safest streets were not the streets with the most police. They were the streets with the most people.
The logic was simple and empirical. A street with shops, residences, bars, and other mixed uses was a street that was populated at all hours. The shopkeeper who saw the street from behind her counter. The apartment dweller who glanced out the window. The bartender who knew every regular and noticed every stranger. The mother who walked her children to school and noticed when an unfamiliar car was parked on the block. None of these people were assigned to watch the street. None of them thought of themselves as performing a safety function. They were going about their business, and their business happened to keep them present, and their presence happened to keep the street safe.
Jacobs called this "eyes on the street," and she argued that it was the only mechanism of urban safety that actually worked at scale. Formal policing was necessary for extreme situations, but the day-to-day safety of a neighborhood — the quality that made people willing to walk at night, to let their children play outside, to leave their doors unlocked — depended on the informal, unorganized, continuous presence of people who watched because they were there, not because they were assigned to watch.
The mechanism worked because it was distributed, redundant, and self-maintaining. No single pair of eyes was critical. The shopkeeper could close for the day, and the bartender's shift would start. The mother could stay home, and the old man on the stoop would be there instead. The coverage was not planned. It was emergent — the natural byproduct of a neighborhood that gave enough different people enough different reasons to be present at enough different times that the street was never entirely unwatched.
Professional quality works through an analogous mechanism, and AI is disrupting it in ways that are not yet widely understood.
Consider the quality of software in a traditional development team. The formal quality mechanisms are well known: code reviews, automated testing, quality assurance processes, release management procedures. These are the police patrols of software quality — necessary, structured, and important. But anyone who has worked in software development knows that the formal mechanisms catch only a fraction of the problems. The majority of quality issues are caught, prevented, or corrected through informal mechanisms that operate below the level of any formal process.
The engineer who glances at a colleague's screen while passing her desk and notices an unusual pattern in the code. The team lead who overhears a conversation about a design decision and interjects with a concern based on a failure he observed three years ago on a different project. The junior developer who asks a question in a meeting that reveals an assumption the senior engineers had not examined. The architect who reviews a pull request and notices not a bug but a direction — a structural choice that will cause problems at scale, visible only to someone who has seen similar choices play out over years.
These are the eyes on the street of software quality. They are distributed across the team, redundant, self-maintaining, and informal. No single observation is critical. The quality of the whole depends on the cumulative effect of dozens of casual, unstructured, unplanned encounters in which practitioners observe each other's work and apply judgment that cannot be captured in a checklist or automated in a test suite.
AI-enabled solitary production removes the conditions that produce these observations. When a developer builds alone, using an AI tool as her primary collaborator, the colleagues who would have seen her work in progress do not see it. The team lead does not overhear the conversation, because the conversation is happening between the developer and a machine. The junior developer does not ask the revealing question, because the junior developer is building her own feature in her own conversation with her own AI. The architect does not notice the structural choice, because the pull request arrives fully formed and the architect reviews the output rather than observing the process.
The formal quality mechanisms — code review, testing, QA — still exist. But the informal mechanisms, the eyes on the street, have been thinned. Not eliminated. Thinned. There are fewer casual observations, fewer unplanned encounters, fewer moments in which one practitioner's judgment is applied, spontaneously and informally, to another practitioner's work.
The thinning is gradual and difficult to measure. Quality does not collapse overnight. It erodes. The problems that the informal mechanisms would have caught begin to accumulate. Each one is small. A suboptimal architectural choice here. An unexamined assumption there. A design pattern that works in isolation but creates conflicts at scale. None of these are bugs in the traditional sense. They are quality deficits that exist below the resolution of formal testing — visible only to human judgment applied in the context of casual observation.
The erosion is particularly dangerous because it is self-concealing. AI-mediated work tends to be well-formed at the surface level. Code generated through AI assistants is syntactically correct, follows established conventions, and passes automated tests with high reliability. The surface quality creates a sense of confidence that may not be warranted by the structural quality beneath it. The code looks right. It compiles. It passes tests. It ships. The problems appear later — in performance under load, in maintainability over time, in the subtle interactions between components that no automated test was designed to check, because the interactions were not anticipated by anyone who understood the system as a whole.
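The gap between surface quality and structural quality can be made concrete with a deliberately simple sketch. Nothing below comes from the book; the function names and the deduplication scenario are invented for illustration. Both versions pass the same unit test, and any automated suite would wave both through. Only a human reader who pauses over the loop notices that the first version scans a growing list on every iteration and will degrade quadratically under real-world load:

```python
# Two deduplicators that are indistinguishable to a typical unit test.
# (Hypothetical example: names and scenario invented for illustration.)

def dedupe_surface(records):
    """Looks right and passes tests, but scans a list on every iteration."""
    seen = []
    out = []
    for r in records:
        if r not in seen:      # O(n) membership scan -> O(n^2) overall
            seen.append(r)
            out.append(r)
    return out

def dedupe_structural(records):
    """Same observable behavior, but linear time via set membership."""
    seen = set()
    out = []
    for r in records:
        if r not in seen:      # O(1) average-case lookup
            seen.add(r)
            out.append(r)
    return out

# The kind of check an automated suite would run: both versions pass.
sample = [3, 1, 3, 2, 1]
assert dedupe_surface(sample) == [3, 1, 2]
assert dedupe_structural(sample) == [3, 1, 2]
```

On a five-element test fixture the two are identical; on a million records the first one stalls. That difference lives below the resolution of the test, which is exactly where the informal mechanisms — the colleague who glances at the loop in passing — used to operate.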
Jacobs saw precisely this pattern in the planned neighborhoods that replaced the organic ones. The housing projects looked safe. They had controlled access, clear sightlines, rational layouts. By every metric the planners used, they were safer than the neighborhoods they replaced. And they were, in fact, more dangerous — because the metrics the planners used did not capture the quality that actually produced safety: the distributed, informal, continuous presence of people who watched the street because the street gave them reasons to be there.
The code that passes all the tests but fails under real-world conditions. The product that looks polished but does not serve users well. The organizational decision that is well-reasoned in isolation but destructive in context. Each of these is a failure of informal quality mechanisms — a failure that formal processes were not designed to catch, because formal processes operate on the things that can be measured, and the things that matter most in the quality of complex work are often the things that cannot.
The solution is not to ban AI tools, any more than the solution to unsafe housing projects was to ban public housing. The solution is to maintain the conditions that produce informal quality oversight: the proximity of practitioners to each other's work, the pauses in which casual observation occurs, the physical or virtual spaces where unplanned encounters happen, and the organizational norms that value the time spent observing and discussing work in progress as highly as the time spent producing it.
Concretely, this means organizations must treat collaborative time as productive time, not as overhead to be minimized. It means structuring the workday so that AI-mediated solitary production is interspersed with periods of shared work — pair programming, design critiques, architectural discussions — in which practitioners observe each other's work in progress and apply the informal judgment that formal processes cannot replicate. It means resisting the temptation to convert every minute of every working day into AI-assisted output, because the minutes that produce the most valuable quality interventions are precisely the minutes that look, from the perspective of an efficiency metric, like waste.
The eyes on the street were never organized. They were never assigned. They were the natural byproduct of a neighborhood that worked. The eyes on the digital street must be cultivated more deliberately, because AI has changed the conditions that used to produce them naturally. The cultivation requires awareness of what has been lost, institutional commitment to maintaining the conditions for informal oversight, and the willingness to protect unproductive-seeming time against the relentless pressure of AI-enabled acceleration.
Jacobs spent her career arguing that the apparent inefficiencies of organic urban life were not inefficiencies at all. They were the mechanisms through which the neighborhood sustained itself. The same argument applies to the apparent inefficiencies of collaborative professional work. The hallway conversation is not waste. The unplanned observation is not distraction. The pause between tasks is not idle time. Each is a moment in which the eyes on the street do their work — quietly, informally, and indispensably.
The most common error in thinking about complex systems is the error of scale. A solution designed for the wrong scale does not merely fail to work. It actively damages the system it was designed to help. A pesticide that eliminates an insect pest on a single farm may be appropriate at the scale of the farm. Applied across an entire watershed, it destroys the aquatic ecosystem downstream. An antibiotic that cures an infection in a single patient is appropriate at the scale of the patient. Prescribed to an entire population as a preventive measure, it breeds resistant bacteria that threaten everyone. The intervention is not wrong in itself. It is wrong at that scale.
Jacobs made this observation about cities with a specificity that the planning profession has still not fully absorbed. The neighborhood is the unit at which urban vitality operates. Not the city. Not the block. The neighborhood — an area large enough to contain the diversity of uses, the density of population, and the variety of building stock that produce vitality, but small enough that the people within it can know each other's faces, maintain informal social norms, and sustain the casual exchanges that keep the community's collective knowledge current.
The city is too large. A policy applied uniformly across an entire city will destroy the specific conditions that produce vitality in particular neighborhoods, because those conditions vary from neighborhood to neighborhood. The zoning that serves an industrial waterfront does not serve a residential quarter. The density appropriate for a commercial corridor is inappropriate for a park edge. The city planner who applies a single standard across the entire city has made the error of scale — has assumed that what works at one level of granularity works at all levels, when in fact the vitality of the whole depends on the specificity of the parts.
The block is too small. A single block cannot contain enough diversity to sustain itself. It depends on neighboring blocks for complementary uses, for the traffic that supports its enterprises, for the variety that keeps its street populated at all hours. The block is a unit of construction, not a unit of vitality. Vitality emerges at the neighborhood scale, where enough blocks interact to produce the complex, self-sustaining ecology that no single block can generate alone.
The AI governance conversation, as it stands in 2026, is making the error of scale in both directions simultaneously.
At the national and international level, governments are producing regulatory frameworks — the EU AI Act, American executive orders, emerging frameworks in Asia — that operate at the scale of the city, or more precisely, at the scale of the nation. These frameworks address the supply side: what AI companies may build, what risks they must assess, what disclosures they must make. They are necessary. They are also structurally incapable of addressing the specific, granular, domain-dependent questions that determine whether AI produces vitality or monoculture in any particular professional community.
A regulation that requires AI companies to assess risk before deployment does not tell the architectural profession how to maintain quality standards when individual architects can generate structural analyses without the peer review that previously caught errors. A disclosure requirement that forces AI companies to document their training data does not tell the nursing profession how to evaluate the patient-tracking tools that nurses are building with AI assistants. A bias audit that checks the outputs of a language model for demographic fairness does not tell the teaching profession how to assess whether AI-generated educational materials serve the specific needs of specific students in specific classrooms.
These are neighborhood-scale questions. They require neighborhood-scale answers — answers developed by the professional communities that understand the specific conditions, the specific risks, and the specific opportunities that AI presents in their particular domains.
At the individual level, the current discourse treats AI adoption as a personal decision — a matter of individual productivity, individual skill development, individual career strategy. Use the tools or fall behind. Learn to prompt or become obsolete. The discourse addresses the person but not the community, the practitioner but not the profession, the builder but not the neighborhood.
The individual is too small a unit for the questions that matter most. An individual practitioner using AI tools wisely — maintaining her own standards, exercising her own judgment, building with care — is admirable but insufficient. Her quality depends not only on her own standards but on the standards of the community she works within. If the community's informal quality mechanisms erode — if the eyes on the street thin, if the sidewalk ballet slows, if the casual exchanges that sustain collective knowledge dry up — then her individual excellence operates in an increasingly impoverished context. She can maintain her own quality. She cannot, alone, maintain the ecosystem that supports it.
The neighborhood — the professional community — is the unit at which the crucial questions can be asked and answered.
Consider what a professional neighborhood looks like in concrete terms. It is a group of practitioners — perhaps a few hundred, perhaps a few thousand — who share a domain, who know enough about each other's work to evaluate it informally, who maintain shared standards through the ordinary interactions of professional life, and who develop domain-specific norms through the gradual, messy, bottom-up process that Jacobs identified as the only process that produces genuine vitality.
In architecture, such a neighborhood might be the community of small-firm practitioners in a particular region — architects who do residential additions, small commercial projects, and renovation work, who encounter each other at local AIA chapter meetings, who share knowledge about local codes and materials and contractors, and who maintain quality through the informal reputation mechanisms that operate in any community small enough for participants to know each other's work. When AI enables any architect to generate structural analyses, this neighborhood is the scale at which standards can be developed for how those analyses should be verified, what level of independent review is appropriate, and how the profession maintains the knowledge that AI-generated analyses may or may not contain.
In education, a professional neighborhood might be the teachers within a school district, or within a network of schools serving similar populations, who share pedagogical approaches, who discuss student challenges in common professional development sessions, and who develop norms for how AI-generated educational materials should be evaluated, adapted, and integrated into teaching practice. The norms that work for a school serving affluent suburban students may not work for a school serving recent immigrants in an urban district. The neighborhood is the scale at which this specificity can be maintained.
In software development, a professional neighborhood might be the community of developers working within a particular technology stack, or serving a particular industry, or contributing to a particular open-source project. These communities already maintain quality norms through code review practices, contribution guidelines, and the informal reputation systems that determine whose judgment is trusted and whose work is respected. When AI changes the nature of code production, these communities are the scale at which new norms must be developed — norms about how AI-generated code should be reviewed, what level of human understanding should accompany AI-assisted implementation, and how the community's collective knowledge is maintained when the pauses that previously sustained it are colonized by AI-mediated production.
The defining characteristic of a functioning professional neighborhood is that its members know each other's work well enough to evaluate it informally. This knowledge is the foundation of everything else — the standards, the norms, the quality oversight, the casual exchanges that sustain collective intelligence. When members of a professional community stop knowing each other's work, the community ceases to function as a neighborhood. It becomes a population — a group of individuals who share a label but not a practice, who occupy the same statistical category but do not sustain each other's quality or knowledge.
AI pushes professional communities in both directions simultaneously. It enables more people to participate in creation, which can increase the density and diversity of the neighborhood. It also enables solitary production that bypasses the community entirely, which can thin the connections that hold the neighborhood together. The outcome depends on whether the technology is used in ways that increase participation in the community or in ways that replace the community with individual capability.
The institutional structures that support professional neighborhoods are not glamorous. They are professional associations, local meetups, domain-specific conferences, mentoring programs, peer review networks, open-source communities with active governance. They are the civic institutions of the professional world — the organizations that maintain the commons, set the standards, and keep the neighborhood's collective knowledge circulating. They are chronically underfunded, undervalued, and under threat from the same efficiency logic that Jacobs identified as the enemy of urban vitality. They are slow. They are messy. They are inefficient by every metric that the technology industry values.
They are also irreplaceable.
The national regulation sets the boundary conditions. The individual practitioner makes the daily decisions. But the professional neighborhood is where the quality of the AI transition will actually be determined — where the norms are set, where the standards are maintained, where the eyes on the street do their work. The neighborhood is where vitality lives or dies. It is the scale at which the dams must be built and maintained. And it is the scale at which, currently, the least attention is being paid.
---
The story that economies tell about themselves is almost always a story about speed. The gross domestic product grew at four percent. The company doubled its revenue in eighteen months. The startup went from founding to billion-dollar valuation in three years. Speed is the metric. Acceleration is the goal. The economy that grows fastest is, by the logic of the story, the healthiest economy.
Jacobs told a different story. The healthiest economies, she argued, are not the ones that grow fastest. They are the ones that grow most diversely. And diverse growth is almost always gradual, because diversity is the product of accumulation — the slow accretion of new enterprises, new capabilities, new connections, each building on the ones that came before, each adding a small increment of resilience and possibility to the local ecosystem.
The distinction is not between fast and slow. It is between two kinds of growth. One kind — the kind that catastrophic money produces — is rapid, concentrated, and fragile. It looks impressive. It produces dramatic metrics. It creates a modern sector that glitters with capability and a local economy that has been starved by the redirection of talent, capital, and attention toward the glittering center. When the center falters, the whole structure falters with it, because the local economy never developed the diversity to absorb the shock.
The other kind of growth — the kind that import replacement produces — is gradual, distributed, and durable. It does not look impressive. Its metrics are modest. No single enterprise within it is a unicorn. No single innovation within it reshapes the industry. But the cumulative effect, over years and decades, is an economy that can survive shocks, adapt to changes, and generate the continuous stream of small innovations from which, occasionally, large innovations emerge.
Tokyo did not become an industrial powerhouse through a single dramatic investment. It became one through decades of import replacement — local entrepreneurs creating local substitutes for imported goods, each substitution adding a new capability, each capability making the next substitution possible. The process was gradual. It was messy. It could not have been planned from above, because each step depended on local knowledge, local initiative, and local conditions that no planner could have anticipated. And it produced one of the most diverse, resilient, and innovative economies in the history of the world.
The AI-enabled building boom that I describe in *The Orange Pill* has the potential to produce either kind of growth. The tools are neutral on the question. They enable rapid, concentrated building just as easily as they enable gradual, distributed building. The question is which pattern the institutional conditions favor.
Consider two scenarios. In the first, the AI building boom follows the pattern of catastrophic money. The dominant AI companies capture the majority of the value. The builders who use their tools create products that flow through the platforms, generating data and engagement and revenue that accrue primarily to the platform providers. The diversity of building activity is real, but the economic structure beneath it is concentrated. The million builders are tenants. The handful of AI companies are landlords. The rents can be raised at any time. The terms can be changed at any time. The creative space that enables the building boom is contingent on the continued goodwill and business strategy of the companies that provide it.
In this scenario, the growth looks impressive. Millions of people are building things they could not have built before. The metrics of creative output — tools created, products shipped, code generated — are spectacular. But the economic structure is fragile. The diversity is superficial — a diversity of outputs produced through a homogeneous infrastructure. When the infrastructure changes — when the pricing shifts, or the terms tighten, or the model's capabilities plateau, or a dominant provider fails — the entire ecosystem of building is disrupted simultaneously, because the builders share a single dependency.
In the second scenario, the building boom follows the pattern of import replacement. Each act of building is an act of local creation — a practitioner solving a specific problem with specific knowledge, creating a tool that serves a specific need. The tools are diverse — some built with proprietary AI, some with open-source models, some with combinations of AI and hand-coding, some with approaches that no AI suggested. The builders form communities — professional neighborhoods where knowledge circulates, standards are maintained, and collective intelligence is sustained through the casual exchanges of practitioners who know each other's work. The economic value of the building activity is distributed — retained by the builders and their communities rather than flowing primarily to the platform providers.
In this scenario, the growth is gradual. No single builder produces a unicorn. No single tool reshapes the industry. The metrics of creative output are modest compared to the first scenario, because the diversity of approaches means that each builder moves at her own pace, in her own direction, without the accelerating effect of a homogeneous infrastructure optimizing for speed. But the economic structure is resilient. The diversity is genuine — a diversity of methods, approaches, tools, and outputs that no single point of failure can disrupt. When one builder's approach fails, others continue. When one tool provider changes terms, alternatives exist. When conditions change in ways no one predicted, the distributed ecosystem adapts through the same mechanism it grew through: thousands of small adjustments by thousands of local entrepreneurs, each responding to specific conditions with specific knowledge.
The conditions that favor the second scenario are identifiable. They are the conditions Jacobs identified for urban vitality, translated into digital terms.
Mixed uses: practitioners who work across domains rather than within narrow specializations, producing diverse outputs that serve diverse needs. The AI tools that enable domain-crossing are the tools that favor this condition.
Short blocks: rapid feedback loops between builders and users, enabling quick iteration and tight responsiveness to specific needs. The conversational interface I described earlier — the ability to describe a need and see a response in minutes — is a short block. The enterprise deployment cycle that takes months is a long block.
Buildings of varying age: a digital ecosystem that includes mature platforms alongside experimental tools, established approaches alongside novel ones, legacy systems alongside fresh builds. The hundred-dollar-a-month AI subscription is an old building. The enterprise AI platform is a new one. Both are needed. The old buildings are where the experiments happen.
Sufficient density: enough builders working in enough proximity that professional communities form and sustain themselves. The density can be physical or virtual, but it must be real — not a follower count but a community of practitioners who know each other's work.
These conditions do not produce themselves. They are produced by institutional structures — the professional associations, the open-source communities, the educational programs, the governance frameworks, the cultural norms — that maintain the commons against the constant pressure of concentration.
Jacobs spent her life studying what happens when the conditions for vitality are maintained and what happens when they are destroyed. What she found, consistently, across decades of observation and analysis, was that the conditions are fragile. They can be destroyed quickly — by a highway, by a housing project, by catastrophic money, by the simple failure to maintain the structures that sustain them. They take years to rebuild. And the vitality they produce, once lost, is difficult to recover, because the enterprises that would have generated it no longer exist, and the knowledge those enterprises would have produced has been lost.
The digital economy is at the point where the conditions for vitality can still be maintained. The building boom is underway. The diversity of building activity is real. The professional communities that sustain quality and knowledge exchange still exist. The open-source alternatives to the dominant platforms are viable, if underfunded. The old buildings are still cheap.
None of this is guaranteed to last. The concentration of capital in a handful of AI companies exerts a gravitational pull toward the first scenario — toward the pattern of catastrophic money that produces rapid, impressive, fragile growth at the expense of the gradual, modest, durable growth that genuine vitality requires.
The gradual growth is not dramatic. It does not produce headlines. It does not generate the metrics that venture investors reward. It produces, instead, the quiet, continuous, distributed accumulation of capability that sustains an economy over decades — the digital equivalent of the neighborhood that Jacobs loved, alive with the activity of people solving problems they understand, for communities they belong to, with tools they control, at a pace that allows the work to compound into something durable.
That neighborhood is still possible. But only if the conditions that produce it are maintained. And the conditions are maintained not by technology, not by markets, not by the invisible hand of economic selection. They are maintained by the visible hands of people who understand what they have and choose to keep it — who build the structures, tend the institutions, and do the unglamorous work of sustaining the commons against the constant, seductive, economically rational pressure to let it be absorbed.
The building is underway. The neighborhood is forming. The question is whether it will be allowed to grow.
---
The street I kept returning to was not a real street.
It was Hudson Street as Jacobs described it — the hardware store opening at eight, the children walking to school at eight-thirty, the bar keeping the sidewalk populated until two in the morning. A street I have never walked down in the condition in which she observed it, because that version of Hudson Street no longer exists in the form she described. The rents accomplished what Robert Moses could not.
But the pattern she identified on that street — the one where safety and innovation and economic resilience all emerge from the same source, which is the ordinary presence of different people doing different things in sufficient proximity — that pattern has followed me through every chapter of this book and back into the rooms where I actually work.
In *The Orange Pill* I wrote about what happened in Trivandrum, about the twenty engineers who each became capable of what twenty engineers used to do together. I celebrated it. I stand by the celebration. But Jacobs made me see something I had been looking past. The speed was real. The capability was real. The democratization was real. And something else was also real: the hallway conversations that stopped happening when everyone was deep in their own conversation with a machine. The lunch where someone used to mention a problem and someone else used to recognize it from a different project. The accumulated, circulating, informal knowledge that kept the team's collective intelligence sharper than any individual's.
I did not notice its absence at first, because absence is the hardest thing to notice. You do not miss what you do not know you had until the gap reveals itself — in a decision that should have been caught earlier, in a pattern that no one flagged because the person who would have flagged it was building alone with headphones on.
Jacobs gave me the vocabulary for what I was seeing. Import replacement — the teacher building her own tools, the architect building his own analysis software, every practitioner becoming a local entrepreneur in the digital economy. Catastrophic money — the hundreds of billions concentrated in a handful of companies whose gravitational pull reshapes everything around them. Monoculture — the convergence that happens when everyone uses the same tool and the diversity of approaches narrows toward the statistically likely. Eyes on the street — the informal quality mechanisms that erode when the conditions that produce them are optimized away.
These are not abstractions. They are descriptions of things I watch happen every week.
The one that keeps me up is the old buildings. Right now, the tools are cheap. A hundred dollars a month buys creative space that would have cost thousands five years ago. The experiments are happening. The diversity is real. But the landlords of these old buildings are trillion-dollar companies, and the rents are subsidized by investment capital that expects returns at a scale incompatible with the gradual, distributed, locally responsive growth that Jacobs identified as the only growth that lasts.
The rents will go up. The question is whether, by the time they do, the neighborhoods will have taken root deeply enough to survive.
What I take from Jacobs is not optimism or pessimism. She had no patience for either. What I take is the discipline of observation — the insistence on watching what actually happens rather than what the theory says should happen. Watch the street. Count the uses. Note who is present at what hours. See whether the enterprises are diversifying or converging. Check whether the knowledge is circulating or pooling. Measure vitality not by output volume but by the variety of approaches being tried and the density of the community sustaining them.
The digital economy does not need a master plan. It needs a million neighborhoods, each one alive with the specific knowledge of the people who inhabit it, each one maintaining its own standards, each one contributing its own diversity to an ecosystem too complex for any single architect to design.
The sidewalk ballet is not choreographed. It never was. It emerges from the conditions. Our job is to maintain the conditions.