The Orange Pill · Chapter 19: The Software Death Cross
PART FIVE — The Long View and the View From the Roof
Chapter 19

The Software Death Cross

Page 1 · Walking Into the Same River

When I started exploring ways to publish this book, I heard an episode of The Daily from the New York Times that stopped me mid-stride. The guest was Clive Thompson, the technology journalist and author of Coders: The Making of a New Tribe and the Remaking of the World, discussing his article “Coding After Coders: The End of Computer Programming as We Know It.” I went and read the piece that afternoon, because Thompson had done something I recognized immediately: he had walked into the same river.

The river of intelligence I described in Chapter 5 does not care about publishing schedules. It does not wait for your book to be finished before it sends the next wave of evidence downstream. Thompson traveled the entire country and spoke to more than seventy-five software developers at Google, Amazon, Microsoft, and small startups. I sat in rooms in Trivandrum and Tel Aviv and San Francisco and watched the same transformation through the lens of my own team. We were swimming in the same current.

The river of intelligence does not care about publishing schedules. It does not wait for your book to be finished before it sends the next wave of evidence downstream.

The Berkeley study I discussed in Chapter 11 documented what AI did to a functioning organization. The researchers measured the first tremor. Thompson, whose reporting was published on March 12, 2026, documented the earthquake.

· · ·
Page 2 · Inside Hyperspell

Thompson opens his article with a scene that any reader of this book will recognize. He is visiting Manu Ebert, an engineer with a background in neuroscience, at the apartment where Ebert and his co-founder run their startup, Hyperspell. On the screen, Claude Code is doing most of the work. Several agents are operating in parallel—writing, testing, and supervising one another. After a few minutes, Claude flashes: “Implementation complete!”

Ebert, Thompson reports, learned to code the traditional way, writing every line by hand. Now he and his co-founder, Conor Brennan-Burke, barely touch the keyboard for that purpose. Their days are spent directing the AI—explaining what they need in ordinary English, reviewing its proposed approach, and letting the agents execute.


This is the transformation I have been describing since the Foreword. But Thompson captures something specific that I want to dwell on: the weirdness of the new relationship between the human and the machine. The market, it turns out, has been watching the same transformations we have at an individual level. And it has already started to reprice.

· · ·
Page 3 · The SaaSpocalypse

Graph adapted with Nano Banana from @Jumperz

By February 2026, a trillion dollars of market value had vanished from software companies. Workday down thirty-five percent. Adobe down twenty-five. Salesforce down twenty-five. When Anthropic published a blog post about Claude's ability to modernize COBOL, IBM suffered its worst single-day decline in more than twenty-five years. The market called it the SaaSpocalypse.

I prefer a different name for it that picked up steam on social media: The Software Death Cross.

In financial analysis, a death cross is the moment a short-term moving average drops below a long-term one. Momentum has flipped. The thing that was rising is now falling. Applied to the software industry, the metaphor is almost too precise. The falling curve is the SaaS valuation index, which peaked at 18.5 times revenue during the COVID bubble and has been compressing ever since. The rising curve is the AI market, climbing with the confidence of a technology that has found its application. Where the lines cross is where the old industry meets its new economics.
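For readers who want the mechanics, the crossover is simple to compute. The sketch below is illustrative only: the 50- and 200-period windows are the conventional choice for this signal, and the price series is synthetic, not any index discussed in this chapter.

```python
# Illustrative sketch of detecting a "death cross": the moment a short-term
# moving average drops below a long-term one. Synthetic data, conventional
# 50/200 windows; not a reconstruction of the SaaS index in the text.

def moving_average(prices, window):
    """Trailing simple moving average; None until enough history exists."""
    out = []
    for i in range(len(prices)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(prices[i + 1 - window:i + 1]) / window)
    return out

def death_crosses(prices, short=50, long=200):
    """Indices where the short-term average first drops below the long-term one."""
    s = moving_average(prices, short)
    l = moving_average(prices, long)
    crosses = []
    for i in range(1, len(prices)):
        if None in (s[i - 1], l[i - 1], s[i], l[i]):
            continue  # not enough history for both averages yet
        if s[i - 1] >= l[i - 1] and s[i] < l[i]:
            crosses.append(i)
    return crosses

# Synthetic series: a long rise followed by a steeper, sustained decline.
prices = [100 + 0.5 * t for t in range(300)] + [250 - 1.0 * t for t in range(200)]
print(death_crosses(prices))  # the short average flips below the long one partway into the decline
```

Note that the cross registers well after the actual peak: the averages smooth away the turn, which is why the signal is read as confirmation of a trend change rather than a prediction of one.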

The falling curve is easy to see because public markets make it visible in real time. The rising curve is harder to see, because most of the AI economy is still private. But the numbers that do exist are staggering.

· · ·
Page 4 · The Rising Curve

Start with the model companies. In March 2026, OpenAI closed a $122 billion funding round at an $852 billion valuation, the largest private financing in history. A month later, Anthropic was fielding offers that would value it at $800 billion, more than double the $380 billion valuation at which it had raised in February. These are private AI labs, still years from their first public filings, trading on the secondary market at valuations within shouting distance of a trillion dollars each.


Then look at the chip maker underneath them. Nvidia was worth a little over a trillion dollars in the months after ChatGPT launched in late 2022. Today it is worth close to five trillion, the most valuable company in the world.


And then there’s the infrastructure layer. The five hyperscalers building the physical substrate for all of this – Amazon, Google, Microsoft, Meta, Oracle – have collectively committed somewhere between $660 and $690 billion in 2026 capital expenditure, roughly three-quarters of it earmarked for AI. That is a 70 percent increase over 2025, and 2025 was itself a record. And it still might not be enough: the physical inputs for all of that compute, from chips to power, land, and water, are constrained by little more than what these companies can pay.

This is what the rising curve looks like when you draw it at the same scale as the falling one. A trillion dollars has left the public software industry. Something north of a trillion dollars has arrived, in capital commitments and private valuations, on the other side of the cross.

A death cross in the markets is not merely a description of what has happened. It is a forecast. Historically, when the short-term trend breaks below the long-term trend, the drop is not a spike but a sustained decline: months, sometimes years, of new lows before a floor is found. The people who price software companies for a living are not betting that Salesforce has had a bad quarter. They are betting that the business underneath Salesforce is fundamentally different from the one they were pricing in 2021, and that the repricing is not close to finished.

I think the market is right about the event and wrong about what it means. The value of SaaS didn’t disappear when highly capable AI came along. But it did move toward something more human-coded.

I think the market is right about the event and wrong about what it means.
· · ·
Page 5 · Code vs. Ecosystem

Here is what the market got right. Code, as a product, is approaching commodity pricing. The productivity numbers in Thompson's piece are the kind that used to show up in vendor marketing and get ignored. Now they are showing up in reporting from people with no incentive to exaggerate. A veteran coder at one startup told Thompson he was ten to a hundred times more productive than at any point in his thirty-year career. A twenty-five-year-old at another startup estimated twenty times. Boris Cherny, who runs Claude Code at Anthropic, told Thompson he had not written a single line of the Claude codebase by hand in months and was the most prolific committer on his team.

These numbers are not outliers. They are becoming the median.

I’ve seen this in my own work, too, which has taken on a different shape now that my late nights are coupled with a tireless thought partner. You can’t make something like Napster Station in 30 days without a tool like Claude. And now that my entire team has that kind of capability unlocked, it’s inspiring, and scary, to think about what a product sprint looks like in the months to come.

If code is commoditizing this fast, the question is not whether software companies are overvalued. The market has already answered that question. The question is which software companies were overvalued, and which were priced correctly all along.


The market is treating all of them as if they were the same company. They are not.

Nobody uses Salesforce because Salesforce is well-written code. They use Salesforce for the data that twenty years of deployment have accumulated. For the integrations into every other tool the sales organization touches. For the audit trails and the compliance certifications and the workflow assumptions baked into the muscle memory of every rep trained on the platform. The code was always the least defensible part of the product. The code was the part any competent engineer with Claude could rebuild over a weekend. The moat was everything around the code.

The code was always the least defensible part of the product. The moat was everything around the code.

The companies that die in the wake of the Death Cross will be the ones whose value was always just code. Thin applications solving narrow problems without an ecosystem around them. The companies that thrive will be the ones whose value was always above the code layer. Thompson's own reporting from inside Google makes the point inadvertently. At a firm with billions of lines of code, the AI's real contribution is not writing new code. It is figuring out what the existing code does and why, so a human can decide what to change. The ecosystem is what the AI is navigating. The ecosystem is the thing the AI did not build and cannot replace.

This is the repricing. It is not the death.

· · ·
Page 6 · The Forge and the Junior

This also explains something that should be strange, and usually isn't remarked on: developers are watching their own profession be automated, and most of them sound cheerful about it.

Thompson has the best explanation I have read for why. He quotes Anil Dash, a longtime programmer and tech executive: "In the creative disciplines, LLMs take away the most soulful human parts of the work and leave the drudgery to you. And in coding, LLMs take away the drudgery and leave the human, soulful parts to you."


That is the whole thing. Developers are not losing the part of the job they loved. They are losing the part they endured in order to reach the part they loved. The architecture decisions. The tradeoffs. The taste calls about what a system should be. Thompson writes that the job is becoming "more judging than creating," and the developers he interviewed kept comparing themselves to creative directors reviewing prototypes.

I have been making the same comparison for two years, about everyone on my team. Not just the engineers.

Thompson’s reporting also bears out the cost of this repricing, through the lives of the people working within it and trying to find a new place for themselves.

The most concerning figure in the piece is Pia Torain, a software engineer two years into her career at Point Health AI. Four months of heavy prompting, hundreds of requests a day, and she felt her ability to code slipping. "If you don't use it, you're going to lose it," she told Thompson. She stopped using the tools for a while. Now she reads the AI's output line by line before accepting it, forcing herself to understand what it is doing.

Torain had the presence of mind to notice. She had the discipline to fight for the skill she felt leaving her. Most people will not.

The senior developers Thompson interviewed were not worried for themselves. Their judgment was built. They were worried about the people coming up behind them. If the job is now less about writing code than evaluating it, where does the next generation develop the instinct to evaluate? A junior developer's pattern recognition has always come from the hours spent writing code that did not work and figuring out why. AI removes those hours. It also removes the forge.

AI removes the hours. It also removes the forge.

This is the depth-removal problem from Chapter 11, but at the scale of an industry. My engineer in Trivandrum lost the ten minutes of accidental architectural learning that were hidden inside four hours of plumbing. The industry is about to lose the equivalent across an entire generation of hires. The data is already showing it. Erik Brynjolfsson's analysis, which Thompson cites, found entry-level programming jobs down sixteen percent since 2022. Senior developer jobs are flat. The floor is eroding first, which is what you would expect if AI is replacing the tasks that used to be how juniors became seniors.

The executives Thompson interviewed all pointed to the Jevons paradox as the consolation: When something gets cheaper, we do more of it, and total demand rises. They may be right in aggregate. I suspect they are wrong about the distribution. More software will exist. More of it will be built by fewer, more senior people. The junior role that used to produce tomorrow's senior developer is the role AI is eating first, and no executive I have met has a credible answer for how the pipeline repairs itself.

I have a less defeatist reading of the hiring data. Every junior I have hired at Napster in the last year has shown up fluent in AI the way my generation showed up fluent in the internet. They do not need to be taught how to work with these tools. They are teaching the rest of us how to work with them. The executive who reads Brynjolfsson's data and concludes that AI has eliminated the need for junior hires is making the same mistake as the executive who read early internet-adoption data in 1998 and concluded they did not need to hire anyone under thirty. The new pipeline needs to be built around people who are AI-enabled, and there are few as native to this technology as the current pool of junior employees.

· · ·
Page 7 · Software Like Paper

Thompson closes his piece with a metaphor that has stayed with me. He compares the coming proliferation of software to the historical proliferation of paper. Before paper was cheap, the average colonial American saw maybe four pieces of paper a year. When paper became ubiquitous, strange things emerged that nobody predicted. Post-it notes. Zines. Receipts. The whole apparatus of a literate society, much of it unimaginable from inside the scarcity that preceded it. "More is not just more," Thompson said. "More is different."

Software is about to become like paper. Not rare. Not precious. Not a profession. Ubiquitous, disposable, summoned into being by people who will never call themselves developers. Kent Beck, who has been coding since 1972, told Thompson that working with AI is "addictive, in a slot-machine way." I know the feeling. You think it, you describe it, and it exists. The cycle of vision to artifact that used to take weeks now takes minutes. The river I have been describing for nineteen chapters is running faster than any of us expected, and it is carrying us somewhere none of us can fully see.

I find this more exciting than frightening.

But I notice that the people I trust most are split roughly evenly on which of those two feelings is the right one, and I am not sure they are wrong to be split.

· · ·