This $85M-Backed Founder Claims Open Source AGI is Safe — Debate with Himanshu Tyagi

Doom Debates

00:00

Exploring Human Extinction Risks

This chapter examines the probabilities of human extinction from nuclear war, pandemics, and AI threats by 2100. The discussion draws on historical examples and weighs humanity's resilience against the potential for catastrophic events. Philosophical questions about AI's capabilities and their implications for civilization are woven throughout the dialogue.
