
EP 63: Eliezer Yudkowsky (AI Safety Expert) Explains How AI Could Destroy Humanity

The Logan Bartlett Show

CHAPTER

The Difficulty of Alignment

I saw Yann LeCun say recently that alignment just isn't as hard as you think it is. Do you think he's just miseducated about the difficulty, about all the different permutations and thought that it requires?

I keep asking Yann to spell out exactly how he intends to align stuff so that I can immediately tear it apart, of course. His plans are like, well, we will just make it be submissive (literally his term). He clearly shows his unfamiliarity with the prior literature here. My 1.8-million-word BDSM decision theory Dungeons and Dragons fic has as one of its primary themes whether an entity being submissive is enough.

