
"An artificially structured argument for expecting AGI ruin" by Rob Bensinger
LessWrong (Curated & Popular)
The Importance of Pivotal Act Loading
If AGI won't have our values by default, then the obvious response is to try to instill those values into the system. Pivotal act loading is substantially difficult: as a strong default, absent alignment breakthroughs, we won't be able to safely cause one of the first STEM-level AGI systems to want to perform an operator-intended task that helps prevent the world from being destroyed by other AGIs. I put more probability on AGI systems being safe if they aren't internally representing humans at all, with safety coming from this fact in combination with other alignment measures. And 2D: if we can't have everything right off the bat, we can't shoot for enough to…


