
How to be Critical of AI Hype Cycles and Extinction-Level Threats
The Digital Void Podcast
The Importance of Global Priorities in AI
The risk of extinction from AI should be a global priority alongside other societal-scale risks, such as pandemics and nuclear war. The inequity of humankind is overwhelming at this point and becomes more inequitable every day. We didn't provide the necessary security or monetary funds to make everybody feel safe during a global pandemic. So there's proof it didn't work: we've punted and continued to punt. If you don't think this is true, ask what happened to $2.8 billion of your dollars as it funded SpaceX.