
#368 – Eliezer Yudkowsky: Dangers of AI and the End of Human Civilization

Lex Fridman Podcast

NOTE

See Yudkowsky's "List of Lethalities" blog post.

There's still confusion about intelligence and AGI, but we should update our models rather than insist we were right all along. Seeing Bing doesn't change my understanding of what intelligence is, only of which processes can perform certain tasks. It's like realizing something can fly with wings instead of insisting it can't fly; the substance of flight itself doesn't change. We haven't had a major update on the order of discovering that the laws of physics are wrong. Instead, let's define AGI and superintelligence.

