AI-powered
podcast player
Listen to all your favourite podcasts with AI-powered features
One way an AI system might pose an existential threat is by taking drastic actions to maximize its achievement of some objective function, such as seizing control of the power supply or the world's computers. This suggests a mitigation strategy: minimize the degree to which AI systems have large effects on the world that are not strictly necessary for achieving their objective. In this episode, Victoria Krakovna talks about her research on quantifying and minimizing side effects. Topics discussed include how one goes about defining side effects and the difficulties in doing so, her work using relative reachability and the ability to achieve future tasks as side effects measures, and what she thinks the open problems and difficulties are.
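To make the relative reachability idea concrete, here is a minimal sketch of an undiscounted, binary-reachability version of the measure on a toy deterministic environment. The environment, the function names, and the inaction baseline state are illustrative assumptions, not the paper's implementation (the paper uses discounted reachability and a stepwise baseline): the penalty is the fraction of states that were reachable from the baseline state but are no longer reachable from the agent's current state.

```python
from collections import deque

def reachable(graph, start):
    """All states reachable from `start` in a directed transition graph."""
    seen = {start}
    frontier = deque([start])
    while frontier:
        s = frontier.popleft()
        for nxt in graph.get(s, []):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

def relative_reachability_penalty(graph, current, baseline):
    """Fraction of states reachable from the baseline state but not from the
    current state. Only lost reachability is penalized, so the agent gets no
    credit for making *extra* states reachable."""
    states = set(graph) | {s for succs in graph.values() for s in succs}
    lost = reachable(graph, baseline) - reachable(graph, current)
    return len(lost) / len(states)

# Toy environment: from state A the agent can move to B (reversible)
# or to C (irreversible: there is no edge back out of C).
graph = {"A": ["B", "C"], "B": ["A"], "C": []}

# Baseline: the agent stayed at A (an inaction baseline). Moving to B loses
# no reachability; moving to C makes A and B unreachable.
print(relative_reachability_penalty(graph, current="B", baseline="A"))  # 0.0
print(relative_reachability_penalty(graph, current="C", baseline="A"))  # 0.6666666666666666
```

Subtracting a scaled version of this penalty from the task reward then discourages irreversible actions like moving to C, while leaving reversible actions like moving to B unpenalized.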
Link to the transcript: axrp.net/episode/2021/05/14/episode-7-side-effects-victoria-krakovna.html
Link to the paper "Penalizing Side Effects Using Stepwise Relative Reachability": arxiv.org/abs/1806.01186
Link to the paper "Avoiding Side Effects by Considering Future Tasks": arxiv.org/abs/2010.07877
Victoria Krakovna's website: vkrakovna.wordpress.com
Victoria Krakovna's Alignment Forum profile: alignmentforum.org/users/vika
Work mentioned in the episode:
- Rohin Shah on the difficulty of finding a value-agnostic impact measure: lesswrong.com/posts/kCY9dYGLoThC3aG7w/best-reasons-for-pessimism-about-impact-of-impact-measures#qAy66Wza8csAqWxiB
- Stuart Armstrong's bucket of water example: lesswrong.com/posts/zrunBA8B5bmm2XZ59/reversible-changes-consider-a-bucket-of-water
- Attainable Utility Preservation: arxiv.org/abs/1902.09725
- Low Impact Artificial Intelligences: arxiv.org/abs/1705.10720
- AI Safety Gridworlds: arxiv.org/abs/1711.09883
- Test Cases for Impact Regularisation Methods: lesswrong.com/posts/wzPzPmAsG3BwrBrwy/test-cases-for-impact-regularisation-methods
- SafeLife: partnershiponai.org/safelife
- Avoiding Side Effects in Complex Environments: arxiv.org/abs/2006.06547