Defying the Data Priests — Orange Pill Wiki

Defying the Data Priests

Crawford's 2021 Senate testimony names algorithmic governance as a new priesthood, one that concentrates power in those who mediate between the public and algorithmic processes the public cannot inspect.

Defying the Data Priests is the published version of Crawford's 2021 Senate testimony on algorithmic governance. The testimony argues that AI's inscrutability creates a new form of unaccountable power — what Crawford calls a priesthood that "peers into a hidden layer of reality that is revealed only by a self-taught AI program, the logic of which is beyond human knowing." The priesthood metaphor is not rhetorical flourish. It is a precise description of the structural position data scientists and AI engineers occupy: mediators between the public and a process the public cannot access directly, whose authority derives from their privileged relationship to an opaque source of truth.

In the AI Story

Hedcut illustration for Defying the Data Priests

The priest analogy illuminates several features of the AI-governance situation. A priest mediates between the laity and a reality the laity cannot access directly — the divine, in religious contexts. The data scientist mediates between the public and the algorithmic process the public cannot inspect or evaluate. The mediation concentrates power in the mediator. The mediator's judgments about what the hidden reality implies for specific decisions become the effective decisions, because the laity has no independent means of checking. The mediator becomes, in practice, the decision-maker, even though the formal rhetoric positions her as merely interpreting an external source.

Crawford's testimony is particularly pointed about the democratic implications. Democratic self-government requires that citizens can evaluate the basis on which decisions affecting them are made. When decisions are delegated to algorithmic systems whose logic is inscrutable, the evaluation requirement cannot be satisfied — not because citizens lack sophistication but because the system has been structured to prevent such evaluation. The opacity is not incidental. It is a feature, for the institutions deploying such systems, because it insulates their decisions from democratic accountability.

The testimony proposes several remedies, none of them simple.

Transparency requirements. Systems that affect public decisions must be inspectable.

Accountability structures. Specific humans must be identifiable as responsible for specific decisions.

Institutional separation. Decisions of democratic significance should not be fully delegated to systems whose logic exceeds the capacity of elected officials to understand or oversee.

The remedies are incomplete, and Crawford acknowledges this. The point of the testimony is to name the structure of the problem rather than to propose comprehensive solutions.

The testimony connects to Crawford's broader framework in The World Beyond Your Head and Why We Drive. The question of who evaluates whose judgment is a question about the distribution of cognitive authority, and the AI-age distribution concentrates authority in a technical class whose members are not democratically accountable to the publics affected by their work. The priesthood metaphor is Crawford's attempt to give political form to a concern that most technology discourse treats as merely technical.

Origin

Crawford delivered the testimony that became Defying the Data Priests to the United States Senate in 2021. The testimony was subsequently adapted and published, contributing to the emerging philosophical literature on algorithmic governance and democratic accountability.

Key Ideas

The priesthood analogy. Data scientists occupy a structural position analogous to religious priesthoods — mediating between publics and an opaque source of authority.

Opacity as power. The inscrutability of algorithmic systems is not incidental but structurally protective of the institutions deploying them.

Democratic evaluation requirement. Self-government requires that citizens can evaluate decision-making processes, a requirement algorithmic governance systematically fails.

Accountability structures needed. Remedies include transparency requirements, specific human accountability, and institutional separation of decisions from opaque systems.

Political framing of technical questions. The apparently technical question of algorithmic design is a political question about the distribution of cognitive authority.

Appears in the Orange Pill Cycle

Further reading

  1. Matthew B. Crawford, "Algorithmic Governance and Political Legitimacy," American Affairs (2019).
  2. Frank Pasquale, The Black Box Society (Harvard University Press, 2015).
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.