
80,000 Hours Podcast

#200 – Ezra Karger on what superforecasters and experts think about existential risks

Sep 4, 2024
Ezra Karger, research director at the Forecasting Research Institute and economist at the Federal Reserve Bank of Chicago, discusses the complexities of forecasting existential risks like AI and nuclear conflict. He shares insights from the Existential Risk Persuasion Tournament, where predictions from experts and superforecasters revealed striking disparities in extinction probabilities. Karger emphasizes the importance of clear reference points for informed discussions and highlights the need for better forecasting methods to navigate uncertain futures involving advanced technology.
02:49:24

Episode guests

Ezra Karger

Podcast summary created with Snipd AI

Quick takeaways

  • The Existential Risk Persuasion Tournament used structured forecasting and persuasion exercises to reveal how differently experts and superforecasters assess catastrophic risks.
  • Superforecasters' estimates revealed a significant gap between their comparatively low risk assessments and the more alarming projections of concerned domain experts.

Deep dives

AI Risk and Human Extinction Probability

The discussion covers the existential risks posed by artificial intelligence, particularly the possibility that AI could cause human extinction. When asked about outcomes over the next millennium, participants worried about AI risk put the probability at roughly 40%, while skeptics put it at roughly 30%. The relative closeness of these long-horizon estimates points to a shared underlying concern about advanced AI, even as the two groups diverge sharply on timelines, urgency, and the likelihood of catastrophe in the nearer term.
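
To make the millennium-scale figures more concrete, here is a minimal sketch of the annual risk implied by the 40% and 30% estimates quoted above. It assumes, purely for illustration, a constant and independent risk each year, which is not a claim made in the episode:

```python
# Hedged illustration: convert a cumulative probability over a long horizon
# into the implied constant annual risk, assuming the risk is identical and
# independent every year (an illustrative simplification, not the tournament's model).
def implied_annual_risk(cumulative_prob: float, years: int) -> float:
    """Solve 1 - (1 - p)**years = cumulative_prob for p."""
    return 1 - (1 - cumulative_prob) ** (1 / years)

for label, cum in [("AI-concerned (~40% over 1,000 years)", 0.40),
                   ("AI-skeptic (~30% over 1,000 years)", 0.30)]:
    p = implied_annual_risk(cum, 1000)
    print(f"{label}: implied constant annual risk ≈ {p:.4%}")
# AI-concerned (~40% over 1,000 years): implied constant annual risk ≈ 0.0511%
# AI-skeptic (~30% over 1,000 years): implied constant annual risk ≈ 0.0357%
```

Under this simplified constant-hazard reading, the two camps' long-run numbers translate into annual risks that differ by well under a tenth of a percentage point, which helps explain why the headline disagreement centers on nearer-term timelines rather than the very long run.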
