Borrowed Competence — Orange Pill Wiki
CONCEPT

Borrowed Competence

The term used in Spolsky's framework for the confidence that abstraction provides — real while the abstraction holds, owed with interest when it fails — and for the specific form of professional fluency that AI-era developers are accumulating without the diagnostic strata that would let them repay the loan.

Borrowed competence is the confidence a practitioner inherits from the reliability of the tools she uses rather than from her own understanding of what those tools do. When the abstraction holds, the confidence is justified — the developer using AI-generated code feels capable, productive, liberated from mechanical labor, and the feeling maps accurately onto her output. When the abstraction leaks, the confidence becomes a liability: the developer who believed she understood the system discovers she understood the abstraction, and the system is the thing that is breaking. The interest rate on the loan is determined by the size of the gap between the abstraction level and the complexity it conceals. For AI-generated code, the gap is every layer deep, and the interest compounds silently until the moment the loan is called.

In the AI Story


The concept inverts a common intuition. Most discussions of AI-era developer productivity treat the productivity gain as a pure expansion — capability added without cost. Spolsky's framework treats it as an exchange: capability at the abstraction level traded for exposure at the underlying level. The exchange is favorable as long as the abstraction holds. When it fails, the developer must suddenly supply capability at the level where the failure lives — capability she may never have developed because the abstraction made its daily exercise unnecessary.

The metaphor of borrowing is precise. A loan has a principal (the borrowed capability), a repayment schedule (when diagnostic understanding is demanded), and an interest rate (how much more understanding the repayment requires than the original borrowing seemed to imply). A small abstraction — a single-purpose library — is a small loan: when it leaks, the repayment is minor because the gap between the library's interface and its implementation is small. A large abstraction — an entire technology stack — is a large loan: when it leaks, the repayment requires understanding across the full range of concealed layers.

Borrowed competence is structurally indistinguishable from genuine competence when the abstraction holds. This is what makes it dangerous: the practitioner, her colleagues, her manager, and her customers all perceive her as competent, because the output she produces meets the standards by which competence is evaluated. The divergence between borrowed and genuine competence is revealed only under conditions that stress the abstraction beyond its reliable domain. Until that moment, the two look identical.
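The moment an abstraction leaks can be sketched with a small, concrete example. The snippet below is a hypothetical illustration, not one drawn from Spolsky's essays: floating-point arithmetic presents the interface of real-number math, holds for everyday values, and leaks at values the underlying binary representation cannot express.

```python
import math

# Hypothetical illustration: floating-point as a leaky abstraction.
# At the interface, floats present themselves as real numbers,
# and for everyday values the abstraction holds.
assert 1.0 + 2.0 == 3.0

# The leak: 0.1 and 0.2 have no exact binary representation, so the
# "obvious" equality fails. This is the moment the loan is called.
assert 0.1 + 0.2 != 0.3

# Repaying the loan means supplying capability at the concealed layer:
# knowing to compare with a tolerance rather than exact equality.
assert math.isclose(0.1 + 0.2, 0.3)
print("abstraction held, leaked, and was repaid")
```

A practitioner who has only ever exercised the first assertion looks identical to one who understands the third; the difference appears only when an input stresses the abstraction past its reliable domain.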

The concept applies beyond software. Every profession that has adopted powerful tools — aviation with autopilots, medicine with diagnostic imaging, navigation with GPS — has produced populations of borrowed-competence practitioners whose surface performance matched the expectations of their roles and whose underlying capability eroded in the specific domains their tools handled. The professions that survived this transition built institutional mechanisms — mandatory practice, recurrent certification, simulator training — to prevent the borrowing from becoming permanent. Software engineering has not yet built equivalents.

Origin

The phrase 'borrowed competence' is used throughout this volume as a distillation of a pattern Spolsky identified across multiple essays — particularly 'The Law of Leaky Abstractions,' 'The Guerrilla Guide to Interviewing,' and 'The Development Abstraction Layer.' Spolsky did not coin the phrase, but the concept runs through his work: the insistence that fluency at a tool's interface does not constitute competence in the underlying system, and that treating the former as the latter produces predictable and avoidable failures.

Key Ideas

Confidence is borrowed from reliability. A practitioner's certainty that she can handle a system reflects the tool's track record, not her own understanding.

Borrowing compounds silently. Each successful deployment of AI-generated code increases the practitioner's reliance without developing her underlying capability.

The loan is called at the worst moment. Repayment is demanded under time pressure, at scale, with consequences that amplify the gap between borrowed and genuine competence.

Interest scales with abstraction power. More powerful abstractions produce larger loans and steeper repayment schedules.

Institutions can prevent permanent borrowing. Aviation, medicine, and navigation built practices that maintain underlying capability alongside abstraction use; software has not yet done so.

Further reading

  1. Joel Spolsky, The Guerrilla Guide to Interviewing (joelonsoftware.com, 2000)
  2. Donald Schön, The Reflective Practitioner (Basic Books, 1983)
  3. Harry Collins, Artifictional Intelligence (Polity, 2018)
  4. Shannon Vallor, Technology and the Virtues (Oxford University Press, 2016)
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.