Two Visions of AI Apocalypse (Robert Wright & David Krueger)

Robert Wright's Nonzero

Why Eliezer Yudkowsky Warns of AI Takeover

David summarizes Yudkowsky's core argument that superintelligent AIs could be uncontrollable and pose existential risk.
