
Holly Elmore on Pausing AI, Hardware Overhang, Safety Research, and Protesting
Future of Life Institute Podcast
00:00
Exploring Varying Views on AI Safety Concerns and P(doom)
Exploring varying levels of concern about the probability of human extinction from AI, with estimates ranging from 10% to 90%, and how a proposed pause in AI research could create room to reassess AI safety risks.