Franklin used the mortgage metaphor to describe the deferred costs that prescriptive technology imposes on the societies deploying it. The short-term gains—increased productivity, reduced labor requirements, standardized outputs—are immediate and visible. The costs are long-term and hidden: the erosion of independent judgment, the depletion of holistic understanding, the dependency created when workers can no longer function without the prescriptive system. The mortgage payments come due not when the technology is working but when it fails—when the system encounters a situation its designers did not anticipate and the workers cannot improvise because they have been trained in compliance rather than judgment. Applied to AI: the social mortgage is being accumulated now, in every interaction where a worker accepts output without understanding, in every organization measuring throughput without measuring comprehension, in every educational institution teaching students to prompt effectively without teaching them to evaluate independently. The mortgage is invisible in current accounting because current accounting measures only production-model metrics. It will appear only when the mortgage comes due, and by then the cost will be far higher than if the accounting had been honest from the beginning.
The mortgage framework is a precise economic metaphor, not a loose analogy. A mortgage allows present consumption by borrowing against future income. The consumption is real. The debt is real. And the debt must be paid—if not by the borrower, then by someone who inherits the obligation. Prescriptive technology allows present production gains by borrowing against the cognitive capital accumulated by previous generations. The gains are real. The depletion is real. And the depletion creates an obligation that future workers must meet—if they can. If they cannot, the practice collapses when it encounters challenges requiring depth that is no longer there.
Franklin identified the characteristic feature of the social mortgage: it is invisible to the metrics evaluating the technology's success. The factory owner measuring output per worker-hour does not measure the erosion of craft knowledge. The manager measuring features per sprint does not measure the narrowing of independent problem-solving capacity. The dashboard shows productivity increasing. What it does not show is the foundation depleting. The gap between what is measured and what matters is where the mortgage accumulates undetected.
The mortgage's payment schedule is unpredictable but inevitable. In industrial contexts, the payments came due during economic contractions, wars, supply-chain disruptions—moments when the system's resilience was tested and found wanting because the workers who could improvise, who possessed holistic understanding of the production process, had retired or died and had not been replaced. In cognitive work, the payments will come due when AI tools produce errors workers cannot diagnose because their diagnostic capacity has atrophied, when systems fail in ways training data did not anticipate and workers cannot reason about the unexpected because they have been trained to accept rather than to question, when organizations face novel challenges requiring deep independent thought and discover the depth is no longer there.
The mortgage metaphor also captures the distributional injustice Franklin diagnosed in every prescriptive technology. The people who take on the mortgage—the current generation of workers accepting the productivity gains without maintaining the growth-model infrastructure—are not the people who will pay it. The payment falls on the next generation, who inherit reduced cognitive capital, narrowed judgment, weakened capacity for independent thought. They did not make the decision to prioritize production over growth. They experience its consequences. This is the social character of the mortgage: the gains are captured by current decision-makers; the costs are borne by future inhabitants who had no voice in the decision.
Franklin's mortgage framework built on her study of nuclear testing—the paradigmatic case of a technology producing immediate benefits (weapons capability, political leverage) while accumulating long-term costs (atmospheric contamination, genetic damage) that would be paid by generations who did not consent to the testing. The framework extends directly to AI: the productivity gains captured now, the cognitive capital consumed now, the depletion of growth-model infrastructure now—all create obligations that future workers and citizens will either meet or collapse under. The mortgage is already large. It grows with every quarter that prioritizes output over understanding.
Deferred costs hidden by immediate gains. Productivity increases are visible and immediate; erosion of independent judgment is invisible and delayed—the mortgage accumulates undetected in the gap between what is measured and what matters.
Payments due when system fails. Not when technology is working but when it encounters situations designers did not anticipate and workers cannot improvise because they have been trained in compliance rather than judgment—resilience tests reveal depletion.
Future generation pays for present gains. The social character of the mortgage: current decision-makers capture the productivity gains; the next generation inherits depleted cognitive capital and narrowed judgment—they had no voice in the decision but bear its consequences.
Invisible to production metrics. The dashboard shows productivity increasing while the foundation depletes—the mortgage accumulates in the unmeasured growth-model dimension, becoming apparent only when independent capacity is tested and found wanting.
Already large and growing. Every quarter prioritizing output over understanding, every organization measuring throughput without comprehension, every educational institution teaching prompting without evaluation—the mortgage compounds, the payment date approaches.