The Generalist

Existential Risk and the Future of Humanity: Lessons from AI, Pandemics, and Nuclear Threats | Toby Ord (Author of "The Precipice")

Jun 24, 2025
Toby Ord, a Senior Researcher at Oxford's AI Governance Initiative and author of The Precipice, dives deep into the existential risks facing humanity. He argues that we face a one-in-six chance of civilization-ending catastrophe this century. The conversation covers AI-related threats, from alignment failures to geopolitical tensions. Ord emphasizes our moral duty to future generations, reflects on the lessons we failed to learn from COVID-19, and outlines actionable steps individuals can take to help steer humanity away from potential extinction.
AI Snips
INSIGHT

Understanding Existential Risks

  • Existential risks threaten to permanently destroy humanity's long-run potential.
  • They include natural disasters like asteroid impacts and human-made threats like nuclear war or permanent civilizational collapse.
INSIGHT

Caring for Future Generations

  • Population ethics urges us to care about future generations who would never exist if extinction occurs.
  • Moral consideration should include past, present, and potential future lives when weighing existential risks.
INSIGHT

Why Human Extinction Matters

  • Extinction matters both for the immediate loss of lives and for the loss of all future potential.
  • Humanity may be cosmically unique, responsible for fostering moral progress and understanding the universe.