
Ep 186: Brett, Naval and more on AI & AGI
ToKCast
00:00
The Importance of Agency in Alignment
If we are trying to build an AGI that cannot have its own desires, that means we're not allowing it to have its own problems. And so it's never going to be a true agent in any functional sense. It's always going to be an autonomous slave. So alignment basically just boils down to keeping this thing as a dumb, task-doing robot rather than an intelligent, free-willed agent.