
"Why I think strong general AI is coming soon" by Porby
LessWrong (Curated & Popular)
Probability of Doom for AGI Development at Different Dates
The Future Fund prize that prompted me to write this post estimated the following at 15%: conditional on AGI being developed by 2070, humanity will go extinct or drastically curtail its future potential. If your timelines are relatively long, with almost all probability mass past 2050, a 15% chance of doom seems reasonable to me: while the field of AI-not-kill-everyoneism is pretty new and not yet in an ideal position, it does exist, and there's a chance it can actually do something.
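Stated as a conditional probability (my formalization, not notation from the original post; D for "humanity goes extinct or drastically curtails its future potential" and A for "AGI is developed by 2070" are labels I'm introducing):

    P(D | A) = 0.15

That is, the estimate is not an unconditional 15% chance of doom, but a 15% chance given that AGI arrives by 2070.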


