#1011 - Eliezer Yudkowsky - Why Superhuman AI Would Kill Us All

Modern Wisdom

Machine Extrapolated Volition and Alignment Limits

Chris references machine extrapolated volition; Eliezer reflects on his past hopes and explains that alignment may be solvable in principle, but not before the first, risky attempts to build superhuman AI.
