AI and Existential Risk (Robert Wright & Connor Leahy)

Robert Wright's Nonzero

The Importance of Interpretability in AI

I don't know whether strict regulation is needed, but it seems to me there should be an informed conversation that involves at least some politicians. I have trouble imagining what aligned AI would be like. So maybe we should start at that end and ask: what is the goal of Conjecture? What is your dream AGI? What properties would it have that it wouldn't have if we just let things evolve on the current trajectory?
