In Do Androids Dream of Electric Sheep?, kipple is the tide of junk that rises in every apartment, every abandoned building, every corner of the post-apocalyptic Earth. It reproduces itself, or seems to: leave a room empty and kipple fills it. It is the tendency of disorder to increase, of the useless to accumulate, of the dead weight of discarded objects to crowd out the living. Dick's characters fight a losing battle against kipple, which represents not merely physical clutter but the universe's fundamental tendency toward entropy — the second law of thermodynamics experienced as the daily struggle to maintain order against a force that does not rest. The term has become shorthand in technology culture for low-quality proliferation, and its relevance to the AI age is direct: when the cost of producing content approaches zero, the volume of content explodes, and most of that content is kipple — generated not because anyone needs it but because generation is trivially easy.
Dick's kipple operates according to what one of his characters describes as the First Law of Kipple: 'Kipple drives out nonkipple.' The presence of junk attracts more junk. The accumulation is self-reinforcing. A room that contains some kipple will, if left unattended, fill entirely with kipple. The only defense is constant vigilance — the ongoing, never-completed labor of sorting, discarding, maintaining the boundary between the useful and the useless. The moment attention lapses, kipple wins. This law maps directly onto information ecosystems in the AI age. When AI-generated content fills channels faster than human attention can evaluate it, the signal-to-noise ratio degrades. Not through a single catastrophic failure but through the cumulative effect of plausible, adequate, good-enough content that occupies bandwidth without providing genuine informational value.
The Orange Pill identifies this dynamic in the account of AI-generated code: when every developer can produce twenty times more code, the total volume of code in the world increases proportionally, and most of that code will be adequate rather than excellent. The adequacy is sufficient for the immediate purpose — the feature ships, the test passes — but the code enters the maintenance queue, the dependency graph, the technical debt ledger. The system must now carry it, update it, secure it, eventually refactor or retire it. The maintenance burden grows. The kipple accumulates. And the human judgment required to distinguish code that serves a purpose from code that merely occupies space becomes both more necessary and more scarce, because the same tools that produce the kipple produce the conditions under which human attention is fragmented and overwhelmed.
Dick's most disturbing kipple insight is that human institutions themselves can become kipple — structures that once served a function but now exist primarily to perpetuate their own existence. In the novel, the kipple is physical: broken appliances, obsolete magazines, packaging materials. But Dick's cultural criticism extended the concept to bureaucracies, rituals, entire systems of meaning that continue operating after their animating purpose has been lost. The AI age produces institutional kipple at an accelerating rate: governance frameworks that address yesterday's risks, educational policies designed for pre-AI workflows, employment structures built around skill-gates that AI has eliminated. These structures are not deliberately obstructive. They are kipple — the accumulated residue of previous solutions that have outlived their problems but have not yet been cleared away.
Dick coined 'kipple' for Do Androids Dream of Electric Sheep?, published in 1968. The term has no clear etymological source, though it phonetically resembles 'rubble' and 'tipple,' suggesting both debris and consumption. Dick defined it in the novel through the character J.R. Isidore, who describes kipple's behavior with the precision of a natural law: 'Kipple is useless objects, like junk mail or match folders after you use the last match or gum wrappers or yesterday's homeopape. When nobody's around, kipple reproduces itself.' The concept captured something real enough that it entered the technology lexicon as a descriptor for low-quality proliferation — 'code kipple,' 'content kipple,' 'data kipple' — wherever zero-cost replication produces accumulation that overwhelms curation.
Entropy made visible. Kipple is the second law of thermodynamics experienced as the daily condition of physical and informational space — the tendency toward disorder that can be resisted but never defeated.
Drives out nonkipple. The First Law: the presence of low-quality content attracts more, creating a self-reinforcing cycle that progressively degrades the entire environment unless actively curated.
Maintenance is never finished. The only defense against kipple is continuous sorting, discarding, and boundary-enforcement — the unglamorous work that AI's productivity gains make more necessary precisely because they make generation easier.
Digital kipple at scale. AI-generated content produces informational kipple at volumes that make physical kipple seem manageable by comparison: auto-expanded documents, generated summaries of generated reports, synthetic training data contaminating future models.
Institutional kipple. The concept extends beyond objects to structures — policies, frameworks, governance mechanisms that continue operating after their animating purpose has been lost, occupying bandwidth without providing value.