
EP 63: Eliezer Yudkowsky (AI Safety Expert) Explains How AI Could Destroy Humanity

The Logan Bartlett Show

The Vision for Good AI Singularity

I didn't figure out until age 21 that alignment was going to be a thing. Alignment is trying to shape an AI such that its effect upon the world when you run it is net positive from the perspective of humans. I would rather expect it to be more for raising the alarm and getting it successfully shut down.

