SB 1047 — Orange Pill Wiki
EVENT

SB 1047

California's 2024 Safe and Secure Innovation for Frontier Artificial Intelligence Models Act — the first serious attempt in the United States to impose safety obligations on the developers of the most powerful AI models.

SB 1047, introduced by California State Senator Scott Wiener in February 2024, would have required companies training the largest AI models (those costing more than $100 million to train or using more than 10^26 floating-point or integer operations of compute) to perform safety assessments before release, implement full-shutdown ("kill switch") capabilities, and provide whistleblower protections for employees who identify safety risks. The bill passed the California Legislature but was vetoed by Governor Gavin Newsom in September 2024. Lawrence Lessig, Yoshua Bengio, Geoffrey Hinton, Stuart Russell, and others endorsed it as 'the bare minimum for effective regulation of this technology.' Lessig's endorsement in a Boston Globe op-ed became a touchstone of the public debate. The bill's defeat — and the subsequent introduction of weaker successor legislation — is the paradigmatic recent case of the legislative modality of AI governance encountering the political power of the industry it attempts to govern.

In the AI Story


The bill's core provisions were modest by the standards of the European Union's AI Act. It applied only to the largest models. It required safety assessments rather than prohibiting specific capabilities. It imposed no licensing or pre-market approval regime. It did, however, include a narrow private right of action for catastrophic harms — a provision that became the focus of industry opposition.

The opposition was intense and well-funded. OpenAI, Meta, Google, and much of the venture capital community (particularly Andreessen Horowitz) argued that the bill would chill innovation, drive AI development out of California, and impose ill-defined liability on developers for downstream uses. (Anthropic, after securing amendments, offered qualified support, telling the Governor that the bill's benefits likely outweighed its costs.) Proponents countered that the bill imposed standard duties of care on the most powerful actors in an industry with potentially catastrophic risks, that the private right of action was narrower than comparable provisions in product liability law, and that the 'chilling innovation' argument was a standard rhetorical move deployed against nearly every safety regulation in industrial history.

Governor Newsom's veto message cited specific technical concerns about the bill's scope but was widely understood as reflecting the political difficulty of imposing meaningful constraints on the industry most concentrated in California. The veto accelerated debate about whether AI governance can be achieved at the state level given the mobility of AI development capital and the concentration of industry lobbying power.

Lessig's analysis of the defeat treats it as exemplary rather than exceptional. SB 1047 illustrated the structural dynamics he has diagnosed throughout his career: the legislative modality is slow, the industry modality is fast and well-resourced, and without coordinated intervention across law, norms, markets, and architecture, the slower modality loses. The weaker successor legislation eventually enacted (AB 2013) addressed transparency requirements for training data but left the safety assessment and liability provisions of SB 1047 unaddressed.

Origin

SB 1047 was drafted by Senator Wiener's office in consultation with AI safety researchers and policy experts, including Dan Hendrycks, Nathan Calvin, and others affiliated with the Center for AI Safety. The bill drew on earlier proposals from the Federation of American Scientists and on provisions of the Biden administration's 2023 executive order on AI. Introduced in February 2024, it was amended multiple times during the legislative session to address industry concerns, passed the Assembly on August 28, 2024, passed the Senate on August 29, and was vetoed by Governor Newsom on September 29, 2024.

Key Ideas

Bare minimum regulation. Lessig's endorsement framed SB 1047 as the floor of responsible governance, not the ceiling.

Scale-targeted rather than capability-prohibitive. The bill applied only to the largest models and required assessment rather than banning specific capabilities.

Whistleblower protections included. The bill incorporated elements of the right to warn framework.

Industry opposition was decisive. Well-resourced industry lobbying, combined with selective technical objections, produced the veto.

Exemplary of the governance gap. The defeat illustrates the structural dynamics that make single-modality (legal) governance insufficient against multi-modal industry power.

Appears in the Orange Pill Cycle

Further reading

  1. Lawrence Lessig, 'California's SB 1047 is the bare minimum' (Boston Globe, August 2024).
  2. Scott Wiener, SB 1047 (California State Legislature, 2024).
  3. Gavin Newsom, Veto Message for SB 1047 (September 29, 2024).
  4. Yoshua Bengio et al., Open letter supporting SB 1047 (August 2024).
Part of The Orange Pill Wiki · A reference companion to the Orange Pill Cycle.