2-minute chapter

Eliezer Yudkowsky - Why AI Will Kill Us, Aligning LLMs, Nature of Intelligence, SciFi, & Rationality

Dwarkesh Podcast

CHAPTER

The Importance of Predicting Text

There's a difference between saying that we should be wary and saying that there's no hope, right? I could imagine so many things that could be happening in the shoggoth's brain, especially given our level of confusion and mysticism over what is happening. You made an earlier claim which seemed much stronger than the idea that there's no hope, which is that we're going from, like, a zero percent probability to an order of magnitude greater than a zero percent probability. Why then are we so sure that whatever drives come about because of this motive are going to be incompatible with the survival and flourishing of humanity? Most drives that happen when you take a loss function and splinter it into things correlated with...
