Assessing Existential Risks and Long-Termism
Existential risks, defined as threats that could wipe out all human life, are limited in number. Even nuclear war, often cited as a major risk, might not cause true extinction. Hardcore AI doomsayers are among the few who present genuine extinction scenarios, though such scenarios are considered unlikely. Other catastrophic events, such as pandemics or biological weapons attacks, are also unlikely to eradicate humanity entirely. While long-termism is intriguing in theory, it may not meaningfully change decisions about which areas are the best to focus on.