The idea of existential risk is any risk that would prevent us from creating a better universe with all this value. A lot of the biggest existential risks are threats that could potentially affect the richest people, unlike global poverty. Émile described this as sort of an apex model of risk: it's for the people who are at the apex, at the top echelon of the socioeconomic hierarchy.
Paris Marx is joined by Émile P. Torres to discuss why longtermism isn’t just about long-term thinking, but provides a framework for Silicon Valley billionaires to justify ignoring the crises facing humanity so they can accumulate wealth and go after space colonization.
Émile P. Torres is a PhD candidate at Leibniz University Hannover and the author of the forthcoming book Human Extinction: A History of Thinking About the End of Humanity. Follow Émile on Twitter at @xriskology.
Tech Won’t Save Us offers a critical perspective on tech, its worldview, and wider society with the goal of inspiring people to demand better tech and a better world. Follow the podcast (@techwontsaveus) and host Paris Marx (@parismarx) on Twitter, and support the show on Patreon.
Find out more about Harbinger Media Network at harbingermedianetwork.com.
Also mentioned in this episode: