The Singularity Discussion Series

FBL91: Connor Leahy - The Existential Risk of AI Alignment

Feb 20, 2023
53:40

This week our guest is AI researcher and founder of Conjecture, Connor Leahy, who is dedicated to studying AI alignment. Alignment research focuses on understanding how to build advanced AI systems that pursue the goals they were designed for instead of engaging in undesired behavior. Sometimes, this simply means ensuring they share human values and ethics so that our machines don't cause serious harm to humanity.

In this episode, Connor provides candid insights into the current state of the field, including the deeply concerning shortage of funding and human resources going into alignment research. Among many other things, we discuss how the research is conducted, the lessons we can learn from animals, and the kinds of policies and processes humans need to put in place if we are to prevent what Connor currently sees as a highly plausible existential threat.

Find out more about Conjecture at conjecture.dev or follow Connor and his work at twitter.com/NPCollapse

Apply for registration to our exclusive South By Southwest event on March 14th @ www.su.org/basecamp-sxsw

Apply for an Executive Program Scholarship at su.org/executive-program/ep-scholarship

Learn more about Singularity: su.org

Host: Steven Parton - LinkedIn / Twitter

Music by: Amine el Filali
