

#229 - Mara Cortona - Why Is No One Talking About Existential Risk?
Oct 8, 2020
Mara Cortona, Executive Director of the Astropolitics Institute and Founder of Nöonaut, dives deep into the pressing but under-discussed hazards humanity faces. She highlights the alarming statistic that we have only a 1 in 3 chance of surviving the next century. Cortona urges a shift in focus from climate change to threats like poorly aligned AI and biotechnology. The conversation also touches on the complexities of morality, the necessity of collective action, and the importance of wisdom in power dynamics as we navigate the future of human survival.
X-Risks vs. Catastrophic Risks
- Existential risks (X-risks) threaten the entire species, while global catastrophic risks cause mass die-offs but not total extinction.
- Anthropogenic risks, caused by human activity, now outweigh natural X-risks like asteroids.
Technology's Double-Edged Sword
- Technology causes anthropogenic X-risks, but it's also needed to mitigate natural X-risks.
- Abandoning technology isn't viable in the long run, since natural disasters are inevitable.
Societal Pain Avoidance
- People are good at avoiding personal pain, but struggle to do the same at a societal level.
- Short-term thinking often prioritizes the economy over long-term existential risks.