

EP 63: Eliezer Yudkowsky (AI Safety Expert) Explains How AI Could Destroy Humanity

The Logan Bartlett Show

CHAPTER

The Difficulty of Alignment

I saw something recently where Yann LeCun said that alignment just isn't as hard as you think it is. Do you think that they're just miseducated about the difficulty, about all the different permutations and thought that it requires?

I keep asking Yann to spell out exactly how he intends to align stuff, so that I can immediately tear it apart, of course. His plans are like, "Well, we will just make it be submissive," literally his term. He clearly shows his unfamiliarity with the prior literature here. My 1.8-million-word BDSM decision theory Dungeons and Dragons fic has, as one of its primary themes, whether an entity being submissive is enough.

