
"AGI Ruin: A List of Lethalities" by Eliezer Yudkowsky

LessWrong (Curated & Popular)

The Biggest Challenge from AGI Alignment

All the difficulty is in getting to "less than certainty of killing literally everyone." Trolley problems are not an interesting subproblem in all this. If there are any survivors, you solved alignment. At this point, I no longer care how it works; I'm cause-agnostic about whatever methodology you used. All I want is that we have justifiable cause to believe of a pivotally useful AGI, "this will not kill literally everyone." But if you can get as far as "less than roughly certain to kill everybody," then you can probably get down to under a 5% chance with only slightly more effort. Anybody telling you I'm asking for stricter alignment than this has…
