
AI #134: If Anyone Reads It
Don't Worry About the Vase Podcast
00:00
AGI Timelines and Yudkowsky's Dragon Analogy
Parsing 5–10 year AGI-horizon claims from lab leaders, and Eliezer Yudkowsky's dragon metaphor illustrating core alignment and trajectory risks.
01:30:56


