The Parliament of Networks — Orange Pill Wiki
CONCEPT

The Parliament of Networks

Latour's proposal for a governance architecture adequate to hybrid entities — deliberative structures where technical mechanism and political effect confront each other with neither claiming exclusive authority. It is the governance that AI demands and does not yet have.

The parliament of networks is the alternative to the modern constitution's division of authority between scientists and politicians. Latour's proposal, most systematically developed in Politics of Nature (2004), calls for governance structures that bring technical expertise and democratic participation into sustained confrontation — where neither side can claim the exclusive right to speak, and where the hybrid nature of the governed object is reflected in the hybrid composition of the governing body. The parliament gives non-human actants — training data, optimization targets, infrastructure dependencies — a place in the deliberative process, not by granting them votes but by making their characteristics objects of sustained political attention. For AI, this architecture is required and almost entirely absent.

In the AI Story

[Hedcut illustration: The Parliament of Networks]

The proposal emerged from Latour's engagement with ecological politics in the 1990s. Climate change, biodiversity loss, and environmental toxicity were all hybrid phenomena that the modern constitution could not govern. Scientific expertise alone could not decide what level of warming was acceptable — that was a political question involving distribution and value. Democratic deliberation alone could not decide what effects a given level of warming would produce — that was an empirical question involving atmospheric physics. A governance architecture was needed in which empirical findings and democratic values could inform each other within a single deliberative space.

The same architecture is needed for AI. The supply-side regulation of the EU AI Act — what AI companies may build, what disclosures they must make — addresses one dimension. The civic advocacy that pushes for algorithmic transparency and data rights addresses another. The professional ethics of technical communities addresses a third. But none of these brings the empirical understanding of how the mechanism works into sustained confrontation with the democratic evaluation of what effects the mechanism produces. The technical community regulates the technology without deliberating the politics; the political community regulates the effects without understanding the mechanism; and the hybrid — the technology-society assemblage that AI actually is — remains ungoverned.

Closing this gap requires institutional design of a kind that does not currently exist. Not because the ingredients are unavailable — technical expertise and democratic participation are both abundant — but because the spaces where they meet are either rare (occasional legislative hearings) or dysfunctional (comment periods on regulatory proposals). What is needed are standing bodies that continuously deliberate about specific AI systems: their training data composition, their optimization targets, their deployment contexts, their failure modes, their distributional effects. Bodies in which technical expertise informs without dictating, and democratic participation evaluates without requiring technical fluency as a prerequisite to legitimate voice.

The proposal has concrete implications. Training data composition should be a matter of public deliberation, not a proprietary secret. Optimization targets — the value choices embedded in training — should be visible and contestable. Infrastructure dependencies should be traced as political facts that shape who can build frontier models. Corporate practices like Constitutional AI, currently operating as quiet hybrid governance inside a handful of companies, should be subjected to the transparency and accountability that hybrid governance of civilizational consequence requires. None of this eliminates the technical work; it locates the technical work within a deliberative frame where the value choices embedded in technical decisions become visible and subject to democratic evaluation.

Origin

The proposal was developed in Politics of Nature: How to Bring the Sciences into Democracy (2004), Latour's most systematic attempt to design alternative governance architectures for hybrid phenomena. The book drew on his participation in French debates about ecological politics and his long-running dialogue with Isabelle Stengers about the relationship between scientific inquiry and democratic deliberation.

The 'parliament' metaphor was deliberately political. It signaled that Latour was not proposing a technical advisory committee or an expert panel but a deliberative body — one with the legitimacy conferred by democratic processes and the empirical grounding provided by sustained engagement with the specifics of what is being governed.

Key Ideas

Neither scientists nor politicians alone. Hybrid phenomena cannot be governed by authorities restricted to one side of the modern constitution. The parliament requires both kinds of participation.

Non-humans represented, not enfranchised. The parliament does not give training data the vote. It gives the characteristics of training data — biases, compositions, exclusions — a place in deliberation.

Continuous deliberation, not episodic consultation. The parliament operates as a standing body, not an occasional hearing. Hybrid phenomena require sustained governance, not periodic input.

Corporate governance already happens. Frameworks like Constitutional AI are already hybrid governance. The question is whether this governance will remain private and unaccountable or be subjected to the democratic scrutiny that hybrid governance of civilizational consequence requires.

Legitimacy from process. The parliament's authority comes from its process — the quality of the deliberation, the diversity of participation, the seriousness with which technical and political considerations inform each other — not from expertise or representation alone.

Debates & Critiques

Critics ask whether the parliament is practically achievable. Deliberative bodies are slow; AI moves fast. The technical complexity is formidable; public comprehension is limited. Who decides which questions go to the parliament? Who decides when deliberation has reached sufficient closure? The reply acknowledges these difficulties but holds that the alternative is worse: continued governance by the purified categories that produce either technical regulation without political deliberation or political advocacy without technical grounding. The parliament of networks is not utopia; it is the institutional response to problems the modern constitution cannot handle, and the absence of that response does not make the problems disappear.

Further reading

  1. Bruno Latour, Politics of Nature: How to Bring the Sciences into Democracy (Harvard University Press, 2004)
  2. Bruno Latour, Making Things Public: Atmospheres of Democracy (MIT Press, 2005)
  3. Noortje Marres, Material Participation: Technology, the Environment and Everyday Publics (Palgrave Macmillan, 2012)
  4. Archon Fung, Empowered Participation: Reinventing Urban Democracy (Princeton University Press, 2004)
  5. Isabelle Stengers, Cosmopolitics I–II (University of Minnesota Press, 2010–2011)
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.