
12 - AI Existential Risk with Paul Christiano

AXRP - the AI X-risk Research Podcast


Are You Interested in Interpretability?

In the realm of things roughly like outer alignment, or, you know, sort of alignment where you're dealing with low-stakes, repeatable problems, what kind of solutions are you most interested in from a research perspective?

I don't have a very short answer to this question, so I guess you'll get a kind of long answer, which maybe in itself is interesting. And maybe there are also two kinds of answers I can give. One is, like, the thing that I am most animated by, that, like, I am working on myself. Another is, like, a broader slice of the kinds of things people do in the world that I am particularly excited by. Um... directions. Maybe my

