Holly Elmore on Pausing AI, Hardware Overhang, Safety Research, and Protesting

Future of Life Institute Podcast

Exploring Varying Views on AI Safety Concerns and p(doom)

Exploring the wide range of concern about the probability of human extinction from AI, with estimates spanning 10% to 90%, and how a proposed pause in AI research could create space to reassess AI safety risks.

