AXRP - the AI X-risk Research Podcast

8 - Assistance Games with Dylan Hadfield-Menell



The Importance of Single-Agent Value Alignment

Alignment with concentrated power distributions is a recipe for some pretty bad situations and environments. Single-agent value alignment on its own really just allows people to get systems to do what they want more effectively. And if you have some people with power and others without, I think it could lead to scenarios where the people out of power are treated really poorly. There's a practical short-term concern right now: if Facebook gets really good at aligning systems with itself, and we don't get good at aligning Facebook with us, that's potentially bad. And I think if you start to think about future systems, and systems that could reach strategic…

Transcript excerpt from 02:18:43
