
4 - Risks from Learned Optimization with Evan Hubinger
AXRP - the AI X-risk Research Podcast
What You Really Care About Is, Like, the Allele Frequency of Alleles
Daniel: I can imagine a world where evolution or, you know, the, like, anthropomorphic god of evolution. But there's this problem, I guess there are two problems. Firstly, then it's like a lot more description length. And secondly, I'm kind of dumb. Ah, the part of me that's saying 'what you actually care about is propagating your genes'