Future of Life Institute Podcast

Holly Elmore on Pausing AI, Hardware Overhang, Safety Research, and Protesting

Feb 29, 2024
A discussion of pausing frontier AI, risks during a pause, hardware overhang, safety research, the social dynamics of AI risk, and the challenges of cooperation among AGI corporations. The conversation also explores the impact on China and protesting AGI companies.
INSIGHT

Why Pause AI?

  • We are developing AI technology that we cannot control.
  • We need time to figure out how to control it and ensure each development step is safe.
INSIGHT

Conflicting Beliefs

  • Tech CEOs genuinely believe in AI's potential dangers yet proceed with development.
  • They may simply have different risk appetites and worldviews.
ANECDOTE

Shifting Public Opinion

  • Initially, pausing AI was deemed impossible.
  • Public opinion shifted after the Future of Life Institute's six-month pause letter.