Ezra Karger, research director at the Forecasting Research Institute and economist at the Federal Reserve Bank of Chicago, discusses the complexities of forecasting existential risks like AI and nuclear conflict. He shares insights from the Existential Risk Persuasion Tournament, where predictions from experts and superforecasters revealed striking disparities in extinction probabilities. Karger emphasizes the importance of clear reference points for informed discussions and highlights the need for better forecasting methods to navigate uncertain futures involving advanced technology.
Episode length: 02:49:24
INSIGHT: XPT Overview
The Existential Risk Persuasion Tournament (XPT) provides estimates of catastrophic risks. It surveys experts, superforecasters, and the public using innovative methods.

INSIGHT: Pre-XPT Risk Estimates
Before the XPT, estimates of existential risks were scarce and unsystematic. Toby Ord's 'The Precipice' offered individual forecasts, but a broader, systematic approach was needed.

INSIGHT: Importance of Precise Definitions
Precise definitions are crucial for discussing complex topics like AI risk. The XPT used detailed definitions to facilitate productive discussion and comparison of forecasts.
The Rise and Fall of American Growth
The U.S. Standard of Living since the Civil War
Robert J. Gordon
The book describes the 100 years following 1870 as the 'Special Century,' a period of revolutionary growth and prosperity driven by innovations such as electric lighting, indoor plumbing, motor vehicles, air travel, and television. Gordon argues that this era of growth has been flatlining since 1970, marked by growing inequality, stagnating education, an aging population, and rising debt. He contends that the productivity growth of the past cannot be repeated, and identifies several 'headwinds' that will continue to slow American economic growth. The book is divided into two main parts, covering the periods from 1870 to 1940 and from 1940 to 2010, and includes detailed statistical analysis and historical anecdotes to support its arguments.
The Second Kind of Impossible
The Extraordinary Quest for a New Form of Matter
Paul Steinhardt
The Precipice
Existential Risk and the Future of Humanity
Toby Ord
In this book, Toby Ord argues that humanity is in a uniquely dangerous period, which he terms 'the Precipice,' beginning with the first atomic bomb test in 1945. Ord examines various existential risks, including natural and anthropogenic threats, and estimates that there is a one in six chance of humanity suffering an existential catastrophe within the next 100 years. He advocates for a major reorientation in how we see the world and our role in it, emphasizing the need for collective action to minimize these risks and ensure a safe future for humanity. The book integrates insights from multiple disciplines, including physics, biology, earth science, computer science, history, anthropology, statistics, international relations, and moral philosophy.
"It’s very hard to find examples where people say, 'I’m starting from this point. I’m starting from this belief.' So we wanted to make that very legible to people. We wanted to say, 'Experts think this; accurate forecasters think this.' They might both be wrong, but we can at least start from here and figure out where we’re coming into a discussion and say, 'I am much less concerned than the people in this report; or I am much more concerned, and I think people in this report were missing major things.' But if you don’t have a reference set of probabilities, I think it becomes much harder to talk about disagreement in policy debates in a space that’s so complicated like this." —Ezra Karger
They cover:
How forecasting can improve our understanding of long-term catastrophic risks from things like AI, nuclear war, pandemics, and climate change.
What the Existential Risk Persuasion Tournament (XPT) is, how it was set up, and the results.
The challenges of predicting low-probability, high-impact events.
Why superforecasters’ estimates of catastrophic risks seem so much lower than experts’, and which group Ezra puts the most weight on.
The specific underlying disagreements that superforecasters and experts had about how likely catastrophic risks from AI are.
Why Ezra thinks forecasting tournaments can help build consensus on complex topics, and what he wants to do differently in future tournaments and studies.
Recent advances in the science of forecasting and the areas Ezra is most excited about exploring next.
Whether large language models could help or outperform human forecasters.
How people can improve their calibration and start making better forecasts personally.
Why Ezra thinks high-quality forecasts are relevant to policymakers, and whether they can really improve decision-making.
And plenty more.
Chapters:
Cold open (00:00:00)
Luisa’s intro (00:01:07)
The interview begins (00:02:54)
The Existential Risk Persuasion Tournament (00:05:13)
Why is this project important? (00:12:34)
How was the tournament set up? (00:17:54)
Results from the tournament (00:22:38)
Risk from artificial intelligence (00:30:59)
How to think about these numbers (00:46:50)
Should we trust experts or superforecasters more? (00:49:16)
The effect of debate and persuasion (01:02:10)
Forecasts from the general public (01:08:33)
How can we improve people’s forecasts? (01:18:59)
Incentives and recruitment (01:26:30)
Criticisms of the tournament (01:33:51)
AI adversarial collaboration (01:46:20)
Hypotheses about stark differences in views of AI risk (01:51:41)
Cruxes and different worldviews (02:17:15)
Ezra’s experience as a superforecaster (02:28:57)
Forecasting as a research field (02:31:00)
Can large language models help or outperform human forecasters? (02:35:01)
Is forecasting valuable in the real world? (02:39:11)
Ezra’s book recommendations (02:45:29)
Luisa’s outro (02:47:54)
Producer: Keiran Harris
Audio engineering: Dominic Armstrong, Ben Cordell, Milo McGuire, and Simon Monsour
Content editing: Luisa Rodriguez, Katy Moore, and Keiran Harris
Transcriptions: Katy Moore