
#116 — AI: Racing Toward the Brink
Making Sense with Sam Harris
00:00
The Alignment Problem
Alignment is about building an AI where you can sort of say what it's trying to do. There are some moral and ethical aspects, which are not as important as the technical aspects. Does this thought experiment originate with Bostrom, or did he take it from somebody else? As far as I know, it's me, though it could still be Bostrom.