5-minute chapter

Eliezer Yudkowsky - Why AI Will Kill Us, Aligning LLMs, Nature of Intelligence, SciFi, & Rationality

Dwarkesh Podcast

CHAPTER

The Problem With Alignment

I think this stuff is weirder and harder than people might have imagined initially. I'm not really that confident of their ability to understand if I told them, but maybe you have some folks who can understand. Anyway, I can sort of see what I'd try. These people will not try it, not the current crop, that is. And I'm not actually sure that if somebody else takes over, like the government or something, that they'd listen to me either. Now, some of the trouble here is that you have a choice of targets, and the choice matters quite a lot. One is that you look for the niceness that's in humans and you try to bring it…
