Global Catastrophic Risks and Existential Risks
I want to do some more psychology research on global catastrophic risks and existential risks. I would really like humanity to survive to the end of the 22nd century. A nuclear war is still a major risk. But is nuclear war a true x-risk? It might mean we colonize Mars in the year 2600 rather than 2050. We never evolved the ability to be longtermist about what affects all of humanity. What will we do about it?