Long Now

Philip Tetlock: Superforecasting

Nov 24, 2015
Philip Tetlock, an academic and author known for leading the Good Judgment Project, dives into the art of forecasting. He reveals how amateur 'superforecasters' outperformed seasoned intelligence officers by applying rigorous scoring techniques. Tetlock discusses how balancing competing cognitive errors and working in teams yields better predictions. He also critiques the certainty often seen in intelligence assessments and emphasizes the need for probabilistic thinking. Lastly, he proposes using forecasting methods to enhance public debates and tackle complex policy issues.
INSIGHT

Keep Score To Improve Forecasting

  • Systematically scoring forecasts dramatically improves probabilistic accuracy compared with vague punditry.
  • Superforecasters reached ~75% probabilities for events that occurred and ~25% for those that did not; the sketch below shows how such calibration is checked.
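To make the scoring idea concrete, here is a minimal Python sketch of a calibration check: bucket forecasts by stated probability and compare each bucket's average forecast with how often the event actually occurred. The forecasts and outcomes below are invented for illustration, not data from the Good Judgment Project.

```python
# Calibration check: a well-calibrated forecaster's ~75% predictions
# should come true ~75% of the time. Data below is invented.
from collections import defaultdict

forecasts = [0.75, 0.80, 0.70, 0.25, 0.20, 0.30, 0.75, 0.25]
outcomes  = [1,    1,    1,    0,    0,    1,    1,    0]   # 1 = event occurred

buckets = defaultdict(list)
for p, y in zip(forecasts, outcomes):
    buckets[round(p, 1)].append((p, y))   # group into 10%-wide bins

for bin_p in sorted(buckets):
    pairs = buckets[bin_p]
    mean_forecast = sum(p for p, _ in pairs) / len(pairs)
    hit_rate = sum(y for _, y in pairs) / len(pairs)
    print(f"bin {bin_p:.1f}: mean forecast {mean_forecast:.2f}, observed {hit_rate:.2f}")
```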
INSIGHT

Diversity Justifies Stronger Estimates

  • Diverse independent information sources justify extremizing aggregate probability estimates.
  • Multiple independent judgments of ~70% can combine into ~85–90% confidence, as the extremizing sketch below illustrates.
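One way to make that combination precise is log-odds extremizing, a standard aggregation technique from the forecasting literature; the episode does not spell out a formula, so this is a sketch under that assumption, and the extremizing factor `a = 2.5` is an illustrative choice.

```python
# Extremizing: when several forecasters reach ~70% from independent
# evidence, the pooled estimate should be more extreme than their average.
# Recipe: average forecasts in log-odds space, scale by a factor a > 1,
# then convert back to a probability. a = 2.5 is illustrative only.
import math

def logit(p):
    return math.log(p / (1 - p))

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def extremized_pool(probs, a=2.5):
    mean_log_odds = sum(logit(p) for p in probs) / len(probs)
    return sigmoid(a * mean_log_odds)

print(round(extremized_pool([0.70, 0.70, 0.70]), 2))  # ~0.89
```

Under these assumptions, three independent 70% forecasts pool to roughly 89%, in line with the ~85–90% range above, whereas a plain average would stay at 70%.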
ADVICE

Score Forecasts With Proper Rules

  • Use proper scoring rules (such as the Brier score, sketched below) and report your true probabilistic beliefs as if you were betting money.
  • Proper scoring penalizes confident forecasts that turn out wrong, incentivizing honest probability estimates.
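A minimal sketch of the binary Brier score (Tetlock's published work uses a two-sided variant ranging from 0 to 2, but the squared-error idea is the same):

```python
# Brier score: mean squared error between forecast probability and the
# 0/1 outcome. Lower is better: 0.0 is perfect; a constant 50% forecast
# scores 0.25. It is a proper scoring rule, so expected score is best
# when you report your true belief; hedging or exaggerating both hurt.
def brier_score(forecasts, outcomes):
    return sum((p - y) ** 2 for p, y in zip(forecasts, outcomes)) / len(forecasts)

# A confident forecast that turns out wrong (0.9 on a non-event) is
# punished far more heavily than an honest, moderate 0.6.
print(brier_score([0.9], [0]))  # 0.81
print(brier_score([0.6], [0]))  # 0.36
```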