
#63 – Ben Garfinkel on AI Governance
Hear This Idea
The Consequences of Estimating the Chance of Human Extinction
A lot of estimates of AGI risk are in the 5 to 20% range. This seems like a sweet spot for tractability, but it's not clear if there is any suspicious clustering around that number. There absolutely are people who are closer to 99% than to the 5 to 20% range. So I don't really know.