One in six is an estimate you get from putting together the total natural risks: asteroids, supervolcanoes, and stellar explosions. The anthropogenic risks are much bigger: nuclear war, climate catastrophe, pandemics that are engineered, unaligned artificial intelligence. And I would add to Toby's list widespread authoritarianism enabled by surveillance technologies. All right, I don't know what to think about these things. I do have these pessimistic moments when we start talking about them. But let me back up and be more philosophical about it for a second.
We’re pretty well-calibrated when it comes to dealing with common, everyday setbacks. But our brains aren’t naturally equipped to deal with unlikely but world-catastrophic disasters. Yet such threats are real, both natural and human-induced. We need to collectively get better at anticipating and preparing for them, at the level of political action. Andrew Leigh is an academic and author who now serves in the Parliament of Australia. We discuss how to move the conversation about existential risks from the ivory tower to implementation in real policies.
Support Mindscape on Patreon.
Andrew Leigh received his Ph.D. in Public Policy from the Kennedy School of Government at Harvard University. He is a member of the Australian House of Representatives representing Fenner. He was previously a professor of economics at Australian National University, and has served as Shadow Assistant Minister for Treasury and Charities. His recent book is What’s the Worst That Could Happen? Existential Risk and Extreme Politics.
See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.