
Connor Leahy on AGI and Cognitive Emulation
Future of Life Institute Podcast
The Role of Torture in Chatbot Behavior
There is a distressingly large group of people who seem to take great pleasure in torturing language models, like making them act distressed. I don't expect these things to have qualia or to be moral patients, but there's something really sociopathic about delighting in torturing something that is acting like a human in distress. Do you think this affects how future models are trained? I assume that OpenAI is collecting user data, and if a lot of that user data is twisted, does this affect how future models will act? Two notes: it's quite disturbing to me how people act when the masks are off, when they don't