
Holly Elmore on Pausing AI, Hardware Overhang, Safety Research, and Protesting
Future of Life Institute Podcast
Exploring the Concept of Pausing AI Development
This chapter explores the case for a global pause on high-cost AI development, arguing that the burden of proof should shift to the developers of potentially dangerous technology to demonstrate safety. It also discusses the difficulty of enforcing such a pause through an international body, drawing parallels to historical nuclear non-proliferation efforts.