#63 – Ben Garfinkel on AI Governance

The Consequences of Estimating the Chance of Human Extinction

A lot of estimates of AGI risk are in the 5 to 20% range. This seems like a sweet spot for tractability, but it's not clear whether there is any suspicious clustering around that number. There absolutely are people who are closer to 99% than to the 5 to 20% range. So I don't really know.
