80k After Hours

Highlights: #200 – Ezra Karger on what superforecasters and experts think about existential risks

Sep 18, 2024
Ezra Karger, an expert on superforecasting and existential risks, dives into the fascinating world of predicting future threats. He discusses why accurate forecasts are crucial for understanding existential risks and highlights the stark disparity between superforecasters and experts on extinction probabilities. The conversation addresses the ongoing disagreements about AI risks and explores how differing worldviews shape these views. Karger emphasizes the practical utility of expert forecasting in navigating these pressing global challenges.
INSIGHT

Forecasting Existential Risks

  • Forecasts are essential for decision-making, even in complex areas like existential risks.
  • Implicit forecasts influence discussions, so making them explicit improves clarity and facilitates productive debate.
INSIGHT

Extinction Risk Estimates

  • Experts estimated a 6% risk of human extinction by 2100, while superforecasters estimated 1%.
  • This gap illustrates how difficult it is to forecast low-probability events, which makes documenting these estimates all the more valuable.
INSIGHT

Disagreement on AI Risks

  • Disagreements on AI risk may stem from differing short-term expectations, long-run expectations, or fundamental worldview differences.
  • Exploring these hypotheses can help identify the root causes of disagreement and potential paths to consensus.