
EP 63: Eliezer Yudkowsky (AI Safety Expert) Explains How AI Could Destroy Humanity

The Logan Bartlett Show

The Vision for Good AI Singularity

I didn't figure out until age 21 that alignment was going to be a thing. Alignment is trying to shape an AI such that its effect upon the world, when you run it, is net positive from the perspective of humans. I would rather expect it to be more about raising the alarm and getting it successfully shut down.
