
AI Professor Stuart Russell: what could possibly go wrong?
World Economic Forum
The Importance of Goals in Language Models
The systems are not, as far as we can tell, AGI. They exhibit a number of weaknesses because it seems they don't maintain a consistent internal model of the world and how it works. So if you want to be good at imitating that kind of behavior, then it's natural that if you possess those same kinds of goals, you're going to be good at producing the same kinds of behavior.