
"An artificially structured argument for expecting AGI ruin" by Rob Bensinger


The Importance of Human Survival in a World with STEM-Level AGI

This argument focuses on "human survival." From my perspective, the most important claim is that STEM-level AGI systems very likely won't value awesome cosmopolitan outcomes at all. It's not just that we'll die; it's that there probably won't be anything else of significant value that the AGI creates in our place. If we had decades to work with STEM-level AGI before catastrophe, rather than months or years, we would have far more time to act and learn. This is not, of course, to say that AGI being able to achieve a decisive strategic advantage within five years is necessary for the AGI situation to be dire.
