The gap compounds generationally. Three cohorts of software developers now share the profession: those who learned before AI tools existed and built systems by hand; those who learned alongside AI tools and used them for routine work while writing complex code themselves; and those learning now, with AI tools as the default development environment, who have never written the code that runs their systems. The first cohort's diagnostic strata are deep. The second's are thinner but present. The third's are, in many cases, absent — not thin, absent — because the geological process that builds diagnostic understanding has not occurred.
The gap is self-concealing in a specific and dangerous way. A developer who has never encountered a leak she could not resolve by describing symptoms to Claude and receiving a fix believes she is competent to manage the systems she has shipped. By every visible metric — features delivered, deadlines met, promotions earned — she is competent. The gap between her visible competence and her diagnostic capability is the distance the Law of Leaky Abstractions measures, and the measurement is only taken at the moment of the leak.
The closest historical parallel is aviation. Modern aircraft abstract away vast complexity through autopilots, flight management systems, and autothrottles. Pilots monitor these systems for most of a flight. When the automation fails — unexpected disengagement, unreliable sensor data, flight regimes the automation was not designed to handle — the pilot must hand-fly. The aviation industry learned, through fatal accidents like Air France 447, that abstraction competence and underlying competence are different things and that the former does not maintain the latter. It built institutional responses: mandatory hand-flying hours, simulator training targeting automation failures, recurrent training that exercises skills daily work does not require. The software industry, in 2026, has not yet had its Air France 447 moment and has built none of the equivalent infrastructure.
The gap is organizational as much as individual. Organizations hire based on AI-augmented productivity, which rewards surface competence. The senior engineers who carry deep diagnostic capability are expensive; their daily work is less visible than the output of AI-augmented juniors; the short-term financial case for replacing them is strong; the long-term institutional case against replacing them is invisible until the moment it is decisive. This is the dynamic that produced the Y2K remediation crisis, in which organizations had to bring back retired COBOL programmers at premium rates because the expertise needed to repair their own systems had been allowed to walk out the door, and it is being reproduced at larger scale in the AI transition.
The concept of a diagnostic gap is implicit throughout Spolsky's writings, particularly in his essay 'The Guerrilla Guide to Interviewing' and his repeated insistence that hiring must test for diagnostic capability rather than surface fluency. The explicit framing of the gap as a generational, organizational phenomenon developed in 2024–2026 as practitioners began articulating what they were observing in AI-era teams: developers who could ship but could not debug, fluency without depth, confidence without understanding. The 2025 ResearchGate paper on AI and expertise formalized the dynamic: 'the simplified interfaces of AI will inevitably fail or leak, [and] practitioners who do not understand the underlying principles will be unable to diagnose or solve critical problems.'
The gap is structural, not moral. It results from abstraction doing exactly what abstraction is designed to do.
The gap is self-concealing. The developer who lacks diagnostic capability does not know she lacks it until the abstraction fails.
The gap widens with abstraction power. More powerful abstractions produce wider gaps and more consequential leaks.
The gap compounds generationally. Cohorts that never worked at the implementation level cannot pass on diagnostic intuition.
The measurement is taken at the worst moment. Diagnostic capability is invisible until it is required, and required only under stress.
Critics argue that the diagnostic gap framing is nostalgic: every generation of software developers has lacked capabilities of the previous one, and the industry has survived. Defenders respond that the current transition differs in scope and speed. Prior transitions replaced one skill with another at a pace institutions could absorb; the AI transition hollows out multiple layers of skill simultaneously, faster than the institutional mechanisms that previously bridged such gaps can adapt.