Franklin's most important analytical move was to insist that technology is not what the public conversation treats it as—a collection of artifacts, devices, gadgets, programs. Technology is a practice: a way of doing things, a system of relationships between the worker, the work, and the institution within which the work occurs. The distinction sounds simple but carries radical implications. Understanding the artifact tells you what the tool can do—its capabilities, features, performance metrics. Understanding the practice tells you what the tool does to the person using it—how it reorganizes the work, who controls the process, who gains skill and who loses it, what forms of knowledge are rewarded and what forms become invisible. Applied to AI, the distinction reveals that the public conversation has been conducted almost entirely in the language of artifacts. Claude Code is an artifact. GPT-4 is an artifact. The natural language interface is an artifact. Their capabilities are remarkable and measurable. But capability is only one dimension of a technology's impact. The practice—how the technology reorganizes knowledge work—determines whether the capability produces flourishing or depletion.
Franklin developed this framework through decades of observing how technologies reorganize work. She was a materials scientist who understood that the properties of any system are determined not by components alone but by relationships between them. The same material, arranged differently, can have completely different properties: the same steel, heat-treated one way or another, comes out tough or brittle. Same composition. Different arrangement. She applied this insight to technology with devastating precision: the artifact is the components, the practice is the arrangement, and the arrangement determines the consequences. Change the practice and you change everything, regardless of whether the device looks the same.
The practice-not-artifact orientation explains why Franklin insisted on examining the real world—not demonstrations, keynotes, or quarterly presentations, but the Tuesday afternoon where tools are used by actual people under actual constraints. The demonstration world displays benefits; the real world is where costs are paid. A technology that increases output while degrading the worker's capacity for independent judgment has not succeeded—it has succeeded at one thing while failing at another, and the failure matters as much as the success because the failure determines sustainability. The worker's experience of the practice is not a secondary consideration to be weighed against the artifact's capability—it is primary evidence about the technology's actual social consequences.
Applied to AI-augmented cognitive work, the framework reveals that nearly all institutional evaluation has been conducted at the artifact level. How many lines of code generated? How many features shipped? How much faster is delivery? These measure the artifact. The practice questions go almost unasked: What happened to the worker's understanding? Who controls the process? What forms of knowledge are being valued and what forms are being silenced? The worker who feels a creeping inability to concentrate, who notices that her capacity for sustained focus has eroded since she began using AI tools for every task, who senses that something has changed in the quality of her engagement but cannot name what—she is experiencing the practice. She does not need a PhD in computer science to describe what is happening to her. She needs a framework that takes her experience seriously as evidence.
The political implication Franklin intended follows from her image of technology as the house in which we all live: a house can be designed by its inhabitants or for them by others whose interests may not align with theirs. In a democracy, the design of the house should be a collective decision. The current house of AI technology is being designed by technology companies whose incentive structures reward engagement, adoption rates, and revenue growth. The inhabitants—workers, students, parents, citizens who live inside the practice—have had almost no voice in the design. This is not conspiracy; it is the normal operation of prescriptive technology within a market economy. The companies building the tools are rewarded for capability. The organizations deploying the tools are rewarded for productivity. The workers using the tools are rewarded for output. Nobody in this chain is rewarded for asking Franklin's questions—for examining the practice rather than the artifact, for measuring what happens to the worker's understanding alongside what happens to throughput.
The practice-not-artifact distinction draws on intellectual traditions including Marx's analysis of the labor process, Mumford's study of technics and civilization, and the workplace ethnography tradition in sociology. What Franklin contributed was clarity and empirical grounding. She did not remain in philosophical abstraction—she pointed to specific factories, specific communications technologies, specific workplace reorganizations, and asked: what is this doing to the people inside it? The framework became influential precisely because it provided vocabulary for experiences workers had but could not articulate within the dominant pro-technology discourse. The framework says: your experience matters, it is evidence, and it can be examined systematically rather than dismissed as anecdotal resistance to progress.
The artifact measures capability; the practice measures consequence. What the tool can do (artifact analysis) versus what the tool does to the person using it (practice analysis)—two different questions producing different evaluations of the same technology.
The practice is where power relationships live. Who controls the process, who gains skill, who loses it, what knowledge forms are rewarded—these questions cannot be answered by examining the artifact but only by examining the system of relationships the artifact organizes.
Worker experience is primary evidence. The worker sensing that something has changed in the quality of her engagement is experiencing the practice—her testimony is not anecdotal but diagnostic, requiring frameworks that take it seriously rather than dismissing it.
The demonstration world conceals the real world. Benefits are displayed in demonstrations; costs are paid on Tuesday afternoon—honest analysis must begin where consequences are experienced by people with least power to refuse them.
Governance is practice-level intervention. Regulating the artifact (what AI companies may build) is insufficient; governing the practice (how AI tools are deployed, who controls the process, what metrics evaluate impact) is where democratic participation matters most.