Eliezer Yudkowsky: AI is going to kill us all

Decoding the Gurus

The Problem With Alignment

The first time you fail at aligning something much smarter than you are, you die, and you do not get to try again. Yudkowsky: If every time we built a poorly aligned superintelligence and it killed us all, we got to observe how it had killed us and try again, we could eventually solve the problem. In other words, I do not think that alignment is fundamentally harder than artificial intelligence was in the first place.
