Doom Debates

Are We A Circular Firing Squad? — with Holly Elmore, Executive Director of PauseAI US

Sep 27, 2025
Holly Elmore, Executive Director of PauseAI US and AI-risk activist, explores the dire consequences of unchecked AI development. She discusses the stark message of Eliezer Yudkowsky's new book, emphasizing the urgent need for community action. Holly examines the rationalist community's reluctance to support prominent calls for AI safety, likening it to a 'circular firing squad.' She highlights innovative activism tactics and critiques the media’s portrayal of AI risks, urging listeners to engage and signal support for crucial safety movements.
INSIGHT

The Book's Core Thesis: Catastrophe Is Likely

  • If Anyone Builds It, Everyone Dies argues current techniques will likely produce unaligned superintelligence and catastrophic outcomes.
  • Holly and Liron see the book as an accessible distillation of Eliezer's long-standing AI-risk case.
ADVICE

Advocate for Pauses Even Over Non-Extinction Harms

  • PauseAI promotes democratic deliberation and risk reduction even if you don't believe extinction is nearly certain.
  • Advocate for pauses to address gradual harms like disempowerment and mental-health exploitation.
INSIGHT

P(Doom) Depends On Feelings And Trust

  • Holly reports a subjective P(Doom) of roughly 20–40% for extreme outcomes, with wider ranges across other bad outcomes.
  • She emphasizes that emotional factors and her trust in authorities strongly shift her estimates.