
Connor Leahy on the State of AI and Alignment Research

Future of Life Institute Podcast

CHAPTER

The Importance of Deconfusing Agency and Alignment

My understanding is that he has something like a 30% P(doom) even on the current path. From that, I deduce that he doesn't expect this to be necessary. He has this belief that if it's just neural networks, we're super screwed. We're just super, super screwed. There's nothing we can do. It's way too hard.
