2min chapter


Eliezer Yudkowsky - Why AI Will Kill Us, Aligning LLMs, Nature of Intelligence, SciFi, & Rationality

Dwarkesh Podcast

CHAPTER

The Importance of Money in AI Alignment

I don't think that... that doesn't even strike me as hope. Honestly, the way you described it seemed kind of compelling; I don't know why the possibility that it works out that way doesn't even rise to 1%.

It's going to be more dangerous than what I was dreaming about in 2003. And I think, in a very real sense, it feels to me like the people doing this stuff now have literally not gotten as far as I was in 2003. You know, I've now written out my answer sheet for that; it's on the podcast.
