The AI-Underdeveloped — Orange Pill Wiki
CONCEPT

The AI-Underdeveloped

Escobar's extension of his foundational insight into the AI transition: the discursive construction of a new global category of deficiency — populations classified as lacking AI capability — that mirrors Truman's 1949 construction of the underdeveloped world and legitimizes the same pattern of intervention.

The AI discourse that crystallized in the mid-2020s performs the same discursive operation that Truman performed in 1949. Billions of people who have lived through their own knowledge systems are now reclassified as lacking AI capability, awaiting the remedy that the technology apparatus is uniquely positioned to provide. The vocabulary has been modernized. The institutional actors wear different logos. But the underlying grammar — the discursive architecture that constructs a population as deficient, positions a technology as the remedy, and presents distribution as liberation — remains intact. The category is not discovered empirically. It is constructed discursively, by a framework that measures human capability against the standard of what AI tools provide.

In the AI Story

Hedcut illustration for The AI-Underdeveloped

The construction operates through specific mechanisms. Adoption rates define success. Productivity multipliers confirm the technology's value. The expansion of the user base generates data that improves the models, which increases the productivity multipliers, which accelerates the adoption rates. Within this loop, the only relevant question is how to distribute the tools more widely and more equitably. The question of whether the tools encode assumptions that conflict with the knowledge systems and governance structures of the communities to which they are distributed does not arise.

The figure who appears most prominently in this discourse is the developer in Lagos, the student in Dhaka, the builder in the Global South who possesses ideas and intelligence but lacks access to tools. The figure is invoked with genuine moral seriousness. The argument is not false — the tools do expand capability, and the expansion is measurable. But the argument operates within a discursive structure that Escobar spent his career anatomizing: the intended beneficiary appears not as a knower but as a figure of lack awaiting remedy.

The construction is maintained by a specific rhetorical move. Populations that decline to adopt AI tools, or adopt them on terms different from those the industry prescribes, are classified as backward, fearful, or technologically illiterate — the same vocabulary that development discourse deployed against communities that resisted structural adjustment or refused Green Revolution seeds. Resistance is converted into evidence of deficiency rather than read as diagnostic knowledge about the tool's inadequacy.

The AI-underdeveloped category is not homogeneous. It contains wildly different populations — knowledge workers in Bangalore facing wage arbitrage collapse, traditional artisans facing AI-generated competition, rural communities whose knowledge does not fit the training corpus, elderly populations whose literacy was formed in pre-digital contexts. The homogenization of these populations under a single category is itself the work of the apparatus, making them legible as a market while erasing the specificity of their conditions.

Origin

The concept is implicit in Escobar's extension of his postdevelopment framework into the domain of digital technology, explicit in his 2025 writings and in the collaborative work Incomputable Earth with Michal Osterweil and Kriti Sharma.

It draws on the foundational analysis of Encountering Development (1995), now applied to the specific discursive operations through which AI companies, research institutions, and technology media construct the populations they claim to serve.

Key Ideas

Discursive construction. The category of AI-underdeveloped is produced by the framework that claims to address it, not discovered as a pre-existing condition.

Productivity as metric of deficiency. Communities are classified as deficient against the standard of AI-augmented productivity, a metric whose universality is assumed rather than demonstrated.

Resistance pathologized. Communities that decline AI tools on their own terms are classified as technologically backward rather than as diagnosticians of the tool's inadequacy.

Homogenization of difference. The category treats radically different populations as a single market, erasing the specificity of their conditions and knowledge systems.

Preemption of alternatives. Framing the conversation as one about access forecloses the question of whether the tools, as designed, constitute the right form of capability.

Appears in the Orange Pill Cycle

Further reading

  1. Arturo Escobar, Michal Osterweil, and Kriti Sharma, contributions in Incomputable Earth (2025).
  2. Ramon Amaro, The Black Technical Object (Sternberg Press, 2022).
  3. Shakir Mohamed, Marie-Therese Png, and William Isaac, "Decolonial AI: Decolonial Theory as Sociotechnical Foresight in Artificial Intelligence," Philosophy & Technology 33 (2020).
  4. Payal Arora, The Next Billion Users: Digital Life Beyond the West (Harvard University Press, 2019).
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.