Top AI researchers and industry leaders have been warning that superintelligent AI may cause human extinction within the next decade.
If you haven't been following all the urgent warnings, I'm here to bring you up to speed.
* Human-level AI is coming soon
* It’s an existential threat to humanity
* The situation calls for urgent action
Listen to this 15-minute intro to get the lay of the land.
Then follow these links to learn more and see how you can help:
* A longer written introduction to AI doom by Connor Leahy et al.
* AGI Ruin: A List of Lethalities — a comprehensive list by Eliezer Yudkowsky of reasons why developing superintelligent AI is unlikely to go well for humanity
* A catalogue of AI doom arguments and responses to common objections
* The largest volunteer organization focused on lobbying world governments to pause the development of superintelligent AI
* Chat with PauseAI members, see a list of projects, and get involved
---
Doom Debates’ mission is to raise mainstream awareness of imminent extinction from AGI and build the social infrastructure for high-quality debate.
Support the mission by subscribing to my Substack at DoomDebates.com and to youtube.com/@DoomDebates. Thanks for watching.