
565: AGI: The Apocalypse Machine
Super Data Science: ML & AI Podcast with Jon Krohn
Jeremie Harris discusses the trajectory of AGI development, the concept of the Singularity, and the importance of aligning AGI with human values to mitigate existential risk. He also shares insights on becoming an AI safety expert, his background in quantum mechanics, and whether to pursue a PhD.