Robert Wright & Rob Wiblin on the truth about effective altruism

80k After Hours

NOTE

Assessing Existential Risks and Long-Termism

Existential risks, understood as threats that could wipe out all of human life, are few in number. Even nuclear war, often cited as a leading candidate, might not cause true extinction. Hardcore AI doomsayers are among the few who present genuine extinction scenarios, though those scenarios are considered unlikely. Other catastrophes, such as pandemics or biological weapons attacks, are similarly unlikely to eradicate humanity entirely. So while long-termism is intriguing in theory, it may matter less in practice when deciding which problem areas deserve the most attention.
