Noise is unwanted random variability in judgments that should be equivalent. Two judges reviewing the same case with the same facts impose sentences differing by years. Two underwriters evaluating identical risks price them fifty percent apart. The same pathologist on different days reaches different diagnoses on identical biopsies. Where bias is systematic error that shifts the mean, noise is scattered error that widens the distribution. Kahneman argued that organizations obsess over bias and ignore noise, despite noise being at least as damaging. AI systems are, by architecture, noiseless: given the same input under the same conditions, they produce the same output. In domains where consistency matters, this is a genuine and substantial improvement. But noise elimination has a cost the framework itself illuminates: some variability is not unwanted. It is the raw material of creative insight, and AI compresses human output toward its competent average, with consequences for the tails of the distribution where distinctive work lives.
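The distinction between a shifted mean and a widened distribution can be made concrete with the standard mean-squared-error decomposition. The sketch below is illustrative, not from the book; the true value, the bias offset, and the noise level are assumed numbers.

```python
import random
import statistics

random.seed(0)
TRUE_VALUE = 100.0  # the judgment a perfect system would produce (assumed)

# A biased but noiseless judge: always off by the same amount.
biased = [TRUE_VALUE + 10.0 for _ in range(10_000)]

# An unbiased but noisy judge: right on average, scattered around the truth.
noisy = [TRUE_VALUE + random.gauss(0, 10.0) for _ in range(10_000)]

def mse(judgments):
    """Mean squared error against the true value."""
    return statistics.fmean((j - TRUE_VALUE) ** 2 for j in judgments)

def decompose(judgments):
    """MSE splits exactly into bias^2 (shifted mean) + noise (variance)."""
    bias = statistics.fmean(judgments) - TRUE_VALUE
    noise = statistics.pvariance(judgments)
    return bias ** 2, noise

for name, js in (("biased", biased), ("noisy", noisy)):
    b2, n = decompose(js)
    print(f"{name}: MSE = {mse(js):.1f} = bias^2 {b2:.1f} + noise {n:.1f}")
```

With these parameters the two judges have comparable total error, but its composition differs entirely: the biased judge's error is all in the bias term, the noisy judge's all in the variance term.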
Noise: A Flaw in Human Judgment, co-authored with Olivier Sibony and Cass Sunstein in 2021, was the culmination of Kahneman's late-career concern with a phenomenon he believed his field had systematically neglected. The book documented noise across criminal sentencing, medical diagnosis, insurance underwriting, performance evaluation, and many other professional domains.
The book distinguishes three sources of noise. Level noise: different judges have different baselines, some harsher, some more lenient. Pattern noise: different judges respond differently to the same features of a case. Occasion noise: the same judge produces different judgments on different occasions, influenced by mood, fatigue, recent experience, and the order of cases.
Occasion noise is the most unsettling category. It means a single expert applying her best judgment to the same case on different days produces different judgments. The verdict depends, in part, on when the judge happened to see it.
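Because the three components are independent, their variances add. A small simulation makes the accounting visible; the baseline sentence and the standard deviations below are assumed for illustration, not figures from the book.

```python
import random
import statistics

random.seed(1)

# Illustrative standard deviations in months; not figures from the book.
LEVEL_SD = 6.0     # how much judges' general severity differs
PATTERN_SD = 4.0   # a judge's idiosyncratic reaction to this case
OCCASION_SD = 3.0  # mood, fatigue, case order on the day
FAIR = 36.0        # the sentence a noise-free system would impose (assumed)

judgments = []
for _ in range(20_000):
    level = random.gauss(0, LEVEL_SD)        # this judge's baseline shift
    pattern = random.gauss(0, PATTERN_SD)    # judge-by-case interaction
    occasion = random.gauss(0, OCCASION_SD)  # the day's transient influence
    judgments.append(FAIR + level + pattern + occasion)

observed = statistics.pvariance(judgments)
# Independent components: total noise variance is the sum of the three.
expected = LEVEL_SD**2 + PATTERN_SD**2 + OCCASION_SD**2
print(f"observed variance {observed:.1f}, expected {expected:.1f} months^2")
```

Note that occasion noise contributes the smallest share here, yet it is the component Kahneman found most unsettling, because it cannot be explained by who the judge is or how the case looks.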
Kahneman's prescription, controversial but consistent with the evidence as he read it, was that algorithms should replace humans in noisy judgment tasks even when the algorithms are imperfect, because removing noise is itself an improvement. He told a 2017 University of Toronto conference: "You should replace humans by algorithms whenever possible." The reasoning was simple: a noisy judgment process is, on average, worse than a consistent one, even an imperfect consistent one.
The Orange Pill reading introduces a complication. Productive noise — the random variations that lead to unexpected connections and original insights — is eliminated along with unwanted noise. The compression of creative variance is experienced as improvement, because consistency feels like competence, but the tails of the distribution where distinctive work lives are pulled toward the mean.
In group decision-making, Kahneman found that aggregating independent judgments produces more accurate decisions than letting judges influence each other, because independent errors cancel. AI collaboration removes the independence of human judgment from machine output — human thinking becomes anchored on and shaped by machine patterns, eliminating the mechanism by which independent human judgment could correct the machine's systematic tendencies.
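The mechanism can be sketched with a toy simulation; all numbers are assumed for illustration. Independent, unbiased judges average toward the truth because their errors cancel, while judges anchored on a shared machine estimate inherit its systematic error, which no amount of averaging removes.

```python
import random
import statistics

random.seed(2)
TRUE = 50.0       # the correct answer (assumed for illustration)
NOISE_SD = 10.0   # each judge's individual noise
TRIALS, GROUP = 5_000, 25

def judge():
    """One independent, unbiased but noisy human judgment."""
    return TRUE + random.gauss(0, NOISE_SD)

MACHINE = TRUE + 8.0  # a machine estimate with systematic error (assumed)

def anchored_judge():
    """A judgment pulled halfway toward the shared machine output."""
    return 0.5 * MACHINE + 0.5 * judge()

def mean_abs_error(make_group):
    """Average error of the group mean over many trials."""
    return statistics.fmean(
        abs(statistics.fmean(make_group()) - TRUE) for _ in range(TRIALS))

solo = statistics.fmean(abs(judge() - TRUE) for _ in range(TRIALS))
independent = mean_abs_error(lambda: [judge() for _ in range(GROUP)])
anchored = mean_abs_error(lambda: [anchored_judge() for _ in range(GROUP)])

print(f"solo {solo:.2f}, independent group {independent:.2f}, "
      f"anchored group {anchored:.2f}")
```

The anchored group is less noisy than any individual, yet it converges confidently near the machine's error: averaging still cancels the random component, but the shared systematic component survives intact.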
The concept emerged from Kahneman's work on clinical vs. statistical prediction going back to the 1970s, synthesizing Paul Meehl's earlier findings with decades of experimental work on judgment. The specific term "noise" was foregrounded in the 2021 book to name something the field lacked language for.
Kahneman considered the noise argument a natural extension and completion of his bias research: bias and noise together account for most of the error in human judgment, and addressing only one leaves the other unaddressed.
Noise vs. bias. Noise is random scatter; bias is systematic shift. Both reduce accuracy.
Three sources. Level noise, pattern noise, and occasion noise all contribute to total noise.
AI eliminates noise. Deterministic systems produce identical outputs for identical inputs.
Productive noise lost. Random variability is the raw material of creative insight; AI compresses it.
Independence preserves aggregate accuracy. Collaboration that anchors human thinking on machine output removes the noise-cancellation mechanism.
Whether AI collaboration compresses creative variance measurably — and whether the compression outweighs the consistency gains in creative domains — is an open empirical question. Kahneman himself argued the consistency benefits generally outweigh the variance losses; critics including artists and scientists have argued the opposite for their specific domains.