
Ep 1 - When will AGI arrive? - Logan Riggs Smith (AGI alignment researcher)
Artificial General Intelligence (AGI) Show with Soroush Pour
How to Align AI Models With Human Interests
Logan: I don't have as much expertise as the people working at Google Research. I do think there are people way smarter than me in their specialties.

Hunter: How would you approach, or what's the best way to approach, making sure that whatever models are created in four years' time are aligned with human interests?

Logan: My favorite breakdown is that when you reward a model for doing something, this doesn't specify its motivations for doing it. Maybe it was doing it because it wants to, and you reinforced that.

Hunter: We'll be back in a few weeks' time with another interview on AGI timelines, getting the other side of the story.