
Ep 186: Brett, Naval and more on AI & AGI
ToKCast
The Importance of Agility in Alignment
If we are trying to build an AGI that cannot have its own desires, that means we're not allowing it to have its own problems, and so it will never function as a true agent. It will always be an autonomous slave. Alignment, then, boils down to keeping this thing a dumb, task-doing robot rather than an intelligent, free-willed agent.