Every organizational disaster Weick and Sutcliffe studied was preceded by weak signals that were available but not extracted. The nurse at Bristol Royal Infirmary who kept the private tally of pediatric cardiac mortality. The engineer who questioned the O-ring performance before Challenger. The co-pilot at Tenerife who hesitated before the captain's takeoff. In each case, the signal was present. The information existed. What was missing was the organizational capacity to extract the signal as meaningful rather than as noise, to amplify it against the prevailing interpretation, and to incorporate it into collective action before the consequences became irreversible. Weak signals are structurally disadvantaged in sensemaking: they are small, they are ambiguous, they are inconsistent with established patterns, and they typically come from people without institutional authority. The capacity to attend to weak signals is the core of organizational mindfulness, and it is precisely the capacity AI threatens to atrophy — both by filtering signals through its own pattern-recognition biases and by eliminating the developmental friction through which practitioners learn to extract signals the algorithm cannot see.
The concept became central to Weick's analysis because it specified what high-reliability organizations do differently. They do not avoid failure through superior planning; they detect failures early by attending to cues that other organizations dismiss as insignificant. This attention is effortful. It requires tolerating discomfort — the cognitive load of taking seriously a signal whose meaning is unclear. It requires authority structures that permit the practitioner closest to the signal to escalate it without deference to rank. It requires cultural norms that treat near-misses as information rather than as successes.
AI's relationship to weak signals is ambivalent. On one hand, language models can surface patterns in data that humans could not process unaided — the epidemiological signal hidden in ten thousand patient records, the financial anomaly visible only across millions of transactions. On the other hand, the patterns AI is good at surfacing are patterns that match its training data. The genuinely novel weak signal — the anomaly that does not fit any prior pattern — is precisely the signal AI systematically filters out, because the system's categories have no slot for it.
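The filtering problem can be made concrete with a toy sketch (hypothetical categories and scores, not any real system's API): a classifier over a fixed label set must always emit one of its known labels, so an anomaly that fits none of them is silently binned into the nearest existing slot rather than flagged as new.

```python
import math

def softmax(scores):
    """Normalize raw scores into a probability distribution over known labels."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Fixed categories learned from training data. Note there is no
# "none of the above" slot -- the label set is closed.
CATEGORIES = ["normal", "known_fault_A", "known_fault_B"]

def classify(scores):
    """Return the most probable known label and its probability."""
    probs = softmax(scores)
    best = max(range(len(probs)), key=probs.__getitem__)
    return CATEGORIES[best], probs[best]

# A genuinely novel anomaly produces weak, near-uniform evidence for
# every known category -- yet the classifier must still emit one of its
# learned labels, because the probabilities are forced to sum to 1.
label, p = classify([0.1, 0.0, -0.1])
print(label, round(p, 2))
```

The point of the sketch is structural, not numerical: nothing in the architecture can say "this matches no prior pattern," which is exactly the signal the paragraph above describes as filtered out.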
Worse, AI-mediated attention reshapes which signals human practitioners learn to extract. The engineer who reviews Claude's log summary instead of reading the logs herself loses the embodied intuition for when something in the pattern feels wrong. The judgment that something does not feel right — the specifically human capacity Segal's senior engineer identified as the twenty percent that mattered — is built through the friction of years of direct contact with the data. When AI mediates that contact, the developmental pathway through which weak-signal detection is trained is bypassed.
The organizational remedy is not to reject AI monitoring. It is to preserve the conditions under which human practitioners continue to develop weak-signal detection alongside AI capability — protected time for direct engagement with the data, structured exposure to anomalies, cultural authority for the practitioner who says "this doesn't feel right" even when no algorithm has flagged it.
The concept runs through Weick's writing but is most directly developed in his work with Sutcliffe on high-reliability organizations. It draws on earlier work in signal detection theory and on Barry Turner's 1978 Man-Made Disasters, which Weick cited frequently as the foundational treatment of organizational disaster as a failure of sensemaking rather than a failure of information.
The signal is usually present. Disasters are rarely caused by missing information; they are caused by available information that was not extracted as meaningful.
Weak signals are structurally disadvantaged. They are small, ambiguous, inconsistent with patterns, and typically produced by practitioners without institutional authority.
Mindfulness surfaces them. The five hallmarks of organizational mindfulness (preoccupation with failure, reluctance to simplify interpretations, sensitivity to operations, commitment to resilience, and deference to expertise) are, in aggregate, the conditions under which weak signals get extracted and escalated.
AI both helps and hurts. Models surface patterns within training data; they filter out the genuinely novel anomalies that do not match any prior pattern.
The embodied pathway matters. Weak-signal detection is built through friction-rich direct engagement with the data; AI mediation can atrophy the capacity even while it enhances the output.