Progressive Affordance Disclosure — Orange Pill Wiki
CONCEPT

Progressive Affordance Disclosure

The Norman volume's proposed design pattern for conversational AI — revealing capabilities dynamically through dialogue rather than statically through fixed interface layouts, extending Norman's progressive disclosure into the conversational medium.

Progressive disclosure was a cornerstone of Norman's usability framework: reveal complexity to users gradually, matching their evolving expertise rather than presenting everything at once (overwhelming) or nothing at all (opaque). The classic example is the word processor that shows basic editing tools to novices and reveals advanced formatting as expertise develops. The pattern worked because the underlying capability space was discrete and hierarchical — features could be arranged from simple to advanced and disclosed accordingly. The AI interface defeats this arrangement: capabilities are unbounded and person-relative. Progressive affordance disclosure is the Norman volume's name for the design pattern that preserves the function of progressive disclosure in a medium where its traditional implementation is impossible.

In the AI Story

[Hedcut illustration: Progressive Affordance Disclosure]

The challenge progressive affordance disclosure addresses is the person-dependent nature of AI affordances. A single system can produce a basic script for a novice and a sophisticated distributed architecture for an expert. The capabilities are identical; what varies is the user's articulational capacity. A static interface layout cannot accommodate this variance — any fixed presentation either overwhelms the novice or underserves the expert.

The proposed pattern responds through conversational scaffolding. When a user makes a request, the system assesses what she has specified, what she has omitted, and what her phrasing implies about her current level of understanding. It responds not by producing an immediate output but by offering clarifying questions, relevant alternatives, or structured suggestions calibrated to expand her sense of what she could ask for — without overwhelming her with possibilities beyond her current horizon.

The novice who types "make me a website" does not receive a generic website. She receives questions: What is the website for? Who will visit it? What should they be able to do? Each question is a signifier — a perceivable cue that communicates a dimension of the problem space she may not have considered. The system is not merely responding to her request. It is teaching her how to make better requests.

The expert who types a detailed technical specification receives a different kind of response — not clarifying questions but proposed approaches, architectural alternatives, and considerations she may not have addressed. The system calibrates its disclosure to her apparent level, continuously adjusting as the conversation proceeds. This is progressive disclosure reimagined for a dynamic, conversational, emergent medium — a significantly harder design problem than static hierarchy, and one the current generation of AI systems has barely begun to address.
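The routing described above can be sketched in code. The following is a minimal, hypothetical Python illustration, not an implementation from the Norman volume: a real system would estimate the user's articulational capacity with a learned model rather than the crude keyword heuristic used here, and the questions and alternatives are invented placeholders.

```python
from dataclasses import dataclass

# Hypothetical domain vocabulary; stands in for a real signal of how
# precisely the user can specify what she wants.
SPEC_TERMS = {"latency", "schema", "throughput", "api", "failover", "sharding"}

@dataclass
class Disclosure:
    kind: str          # "clarify" for under-specified requests, "propose" for detailed ones
    items: list[str]   # clarifying questions or proposed alternatives to surface

def assess_specificity(request: str) -> float:
    """Crude proxy for articulational capacity: the share of words in the
    request that are domain-specific terms."""
    words = [w.strip(".,") for w in request.lower().split()]
    if not words:
        return 0.0
    return sum(w in SPEC_TERMS for w in words) / len(words)

def disclose(request: str, threshold: float = 0.15) -> Disclosure:
    """Route the response: an under-specified request receives clarifying
    questions (signifiers of the problem space); a detailed specification
    receives proposed alternatives and unaddressed considerations."""
    if assess_specificity(request) < threshold:
        return Disclosure("clarify", [
            "What is this for?",
            "Who will use it?",
            "What should they be able to do?",
        ])
    return Disclosure("propose", [
        "Consider a managed queue versus a self-hosted broker.",
        "Have you addressed failover across regions?",
    ])
```

On this sketch, "make me a website" scores near zero specificity and is routed to clarifying questions, while a request dense with technical terms is routed to proposed alternatives; the single threshold is where a real system would need a continuously updated user model instead.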

Origin

The concept extends Norman's traditional progressive disclosure pattern, developed in his UCSD work and elaborated in The Design of Everyday Things, into the conversational medium. The Norman volume's Chapter 2 formalizes the pattern as a response to the blank prompt problem and the person-dependent affordance challenge.

Related ideas appear in conversational agent research and in the emerging literature on adaptive user interfaces, though the Norman volume's explicit connection to Norman's original framework gives the pattern its distinctive design grounding.

Key Ideas

Capability revelation through dialogue. AI systems cannot show all their capabilities at once and cannot hide them arbitrarily. They must reveal them in response to the user's evolving engagement.

Questions as signifiers. In the conversational medium, the system's questions function as signifiers — perceivable cues that communicate the dimensions of the problem space relevant to the current task.

Calibrated disclosure. The pattern requires the system to model the user's current understanding and adjust its disclosures accordingly, neither overwhelming nor underserving.

Pedagogy as interface. The system teaches the user to make better requests over time. The interface is not merely a channel; it is a developmental partner.
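The calibrated-disclosure idea implies a user model that is revised every turn rather than fixed at the start. As a hedged sketch of one way to do that, the hypothetical class below tracks a running expertise estimate with an exponential moving average and maps it to how much capability to reveal at once; the signal, the smoothing constant, and the depth mapping are all illustrative assumptions, not anything specified by the Norman volume.

```python
class UserModel:
    """Hypothetical running estimate of a user's expertise (0.0 to 1.0),
    updated each conversational turn so that disclosure tracks the user's
    evolving engagement rather than a one-time assessment."""

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha      # how quickly the estimate tracks new evidence
        self.expertise = 0.0    # start by assuming a novice

    def observe(self, turn_signal: float) -> None:
        """turn_signal in [0, 1]: e.g. the specificity of the latest request.
        Exponential moving average: move a fraction alpha toward the signal."""
        self.expertise += self.alpha * (turn_signal - self.expertise)

    def disclosure_depth(self) -> int:
        """Map expertise to how many advanced options to reveal per turn,
        from 1 (novice: one question at a time) to 5 (expert)."""
        return 1 + int(self.expertise * 4)
```

A novice therefore begins at depth 1 and, as her requests grow more specific over successive turns, the estimate climbs and the system discloses more of the problem space at once, which is the "neither overwhelming nor underserving" balance the pattern calls for.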

Appears in the Orange Pill Cycle

Further reading

  1. Donald A. Norman, The Design of Everyday Things, rev. ed. (Basic Books, 2013).
  2. Anthony Jameson, "Adaptive Interfaces and Agents," in The Human-Computer Interaction Handbook, ed. J. Jacko and A. Sears (Lawrence Erlbaum, 2003).
  3. Nielsen Norman Group, "Progressive Disclosure," various articles, 2006–2024.
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.