
Ep 137: Things that make you go mmmmm? Part 4: Minds - Part the First
ToKCast
00:00
AI and the Goals of AI
If we build something more intelligent than ourselves, it is commonly said to be important that its goals be aligned with ours. But the concept of aligning an intelligent agent's goals, he argues, is just another word for coercion and enslavement. When we do eventually have AGI, however far into the future that is, a genuine superintelligence cannot have alignment imposed on it in this way. He closes by questioning why we should fear competence at all, and by asking what intelligence even is.