EP 63: Eliezer Yudkowsky (AI Safety Expert) Explains How AI Could Destroy Humanity

The Logan Bartlett Show

The Importance of Alignment in AI Development

- Alignment in AI is necessary to ensure its net positive effect on the world.
- Idealized human preferences need to be updated to match the AI's knowledge.
- Creating an AI to do what humans want is not as simple as it sounds.
- Considering what humans vote on is not a reliable way to shape AI.

