3min chapter


Eliezer Yudkowsky - Why AI Will Kill Us, Aligning LLMs, Nature of Intelligence, SciFi, & Rationality

Dwarkesh Podcast

CHAPTER

The Importance of Understanding Intelligence

I was just kind of curious if you had some sort of, uh, information about it, not as a...

I have a bunch of particular aspects of that that I understand. Could you ask a narrower question?

How important is it, in your view, to have that understanding of intelligence? Is it plausible that once the full explanation is available, our current frame around intelligence and environment turns out to be wrong?

If you understand the concept of: here is my preference ordering over outcomes; here is the complicated transformation the environment performs; and then invert the environment's transformation to project stuff high in my preference ordering back onto my options, decisions, choices, policies, actions that, when I run them through the environment, will end up in an outcome high in my preference ordering.
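The planning loop described here, a preference ordering over outcomes, a model of how the environment transforms actions into outcomes, and a search that "inverts" that transformation by picking the action whose predicted outcome ranks highest, can be sketched in a few lines. This is a toy illustration, not anything from the conversation itself; the environment model, outcomes, and preference numbers are all made up for the example.

```python
def environment(action: str) -> str:
    """Toy world model: maps an action to the outcome it produces.
    (Illustrative only; a real environment transformation is far more complex.)"""
    return {
        "spend": "fun_now",
        "save": "money_later",
        "invest": "money_much_later",
    }[action]

# Preference ordering over outcomes: higher number = more preferred.
preference = {
    "fun_now": 1,
    "money_later": 2,
    "money_much_later": 3,
}

def plan(actions):
    """'Invert' the environment by searching over actions and keeping the
    one whose predicted outcome sits highest in the preference ordering."""
    return max(actions, key=lambda a: preference[environment(a)])

print(plan(["spend", "save", "invest"]))  # -> invest
```

With only three discrete actions the "inversion" is just exhaustive search; the point of the sketch is the shape of the computation, preferences over outcomes projected back through a world model onto choices, not its efficiency.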
