Male-lens investing describes the empirical pattern Wajcman documented in UK venture capital data: between 2012 and 2022, 80 percent of AI venture capital funded all-male founding teams, while all-female teams received 0.3 percent of total investment. When female-founded AI startups did secure funding, they received on average one-sixth the capital per deal. The pattern reflects neither explicit discrimination nor differences in founder quality (controlled studies have found women-founded startups produce comparable or superior returns) but a systematic bias in how the predominantly male venture ecosystem evaluates opportunity through a lens shaped by its own social position.
The term operationalizes Wajcman's broader framework of mutual shaping at the specific point where AI's future is being materially determined. Venture capital does not merely fund companies; it selects which visions of AI's future will be pursued. When the funding ecosystem systematically favors male founders, it systematically favors the problems those founders identify, the solutions they envision, and the workflows their predominantly male teams design. The resulting technologies encode the assumptions of their makers, and when deployed become the infrastructure within which everyone else must operate.
The mechanism operates through pattern-matching. Venture capital evaluation is notoriously reliant on intuitive assessment — the investor's sense of whether a founder looks like someone who can execute, whether the opportunity feels like a winner. These intuitions are shaped by previous experience, and previous experience in venture capital has been overwhelmingly male. Investors recognize competence more readily in candidates who resemble the founders they have previously funded, and the feedback loop hardens over time.
The consequence is not merely inequitable allocation but directional influence on the AI landscape. Problems that predominantly male founders find interesting get solved. Problems that predominantly female founders would find interesting — including problems related to care infrastructure, health management, and domestic coordination — receive disproportionately less attention. The AI tools that exist reflect the funding priorities of the ecosystem that produced them.
Wajcman's research extends this analysis to the demographic composition of AI research itself. Her Alan Turing Institute studies documented that women are disproportionately concentrated in lower-status AI roles — data labeling, model testing, project coordination — while higher-status architecture, strategy, and leadership roles remain disproportionately male. The funding pattern and the workforce pattern reinforce each other: the male-dominated industry produces male-dominated founders, who pattern-match to male investors, who fund male-led companies, which hire from a male-dominated talent pool.
Interventions to break the pattern are difficult precisely because the bias operates through intuition rather than explicit policy. Diversity requirements in funding, targeted capital pools for underrepresented founders, and institutional reforms to reduce intuitive pattern-matching have all been attempted with modest results. The pattern's persistence across decades of explicit effort suggests that changing it requires changing the composition of the evaluating ecosystem itself — a slow structural transformation rather than a quick policy adjustment.
The specific term male-lens investing appears in Wajcman's post-2023 public writing and interviews, though the underlying analysis draws on decades of feminist economic sociology. The UK data that grounds the argument comes from her 2023 Alan Turing Institute report on the AI workforce, which documented the funding disparity alongside occupational segregation and confidence gaps.
The framework extends earlier work by Candida Brush, Dana Kanze, and other feminist economists who documented pattern-matching biases in venture evaluation. Wajcman's contribution is to connect the funding pattern to the AI-specific question of which visions of the technology's future become material.
Funding shapes the future. Venture capital allocation determines which AI visions receive the capital to become real, making funding composition a structural determinant of the technology's trajectory.
Bias operates through intuition. Pattern-matching in evaluation favors founders who resemble previously funded founders, hardening existing demographics into evaluation criteria.
80/0.3 is not a rounding error. The scale of the UK disparity — two orders of magnitude between all-male and all-female team funding — reflects structural dynamics rather than marginal preferences.
Per-deal amounts compound the disparity. When female-founded startups do secure funding, they receive on average one-sixth the capital of comparable male-founded startups, so the gap widens at every stage of the funding pipeline.
Structural change requires ecosystem change. The bias persists through intuitive evaluation, which cannot be reformed through explicit policy alone — only through changing who does the evaluating.
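The headline figures can be decomposed with simple arithmetic: total capital share equals deal count times average deal size, so the share gap factors into a deal-count gap and a per-deal gap. A minimal sketch using only the figures cited above (the decomposition itself is illustrative and not drawn from Wajcman's report):

```python
# Figures cited in the text: UK AI venture capital, 2012-2022.
all_male_share = 0.80     # share of total investment to all-male founding teams
all_female_share = 0.003  # share to all-female founding teams
per_deal_ratio = 6        # male-founded deals average ~6x the capital per deal

# Ratio of capital shares: roughly two orders of magnitude.
share_ratio = all_male_share / all_female_share
print(f"capital-share ratio: {share_ratio:.0f}x")  # ~267x

# Since total share = deal count x average deal size, the share gap
# factors into a deal-count gap times the per-deal gap.
implied_deal_count_ratio = share_ratio / per_deal_ratio
print(f"implied deal-count ratio: {implied_deal_count_ratio:.0f}x")  # ~44x
```

On these cited figures, the roughly 267x capital-share gap would imply that all-male teams closed on the order of 44 times as many deals, on top of each deal being about six times larger.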
Venture capital industry defenders argue that the funding pattern reflects differential risk preferences between male and female founders, or differences in the sectors they pursue. Wajcman's counter is that controlled studies of equivalent pitches presented by male and female founders consistently show investor preference for male pitches, demonstrating that the pattern reflects bias in evaluation rather than differences in opportunity quality.