Joe Carlsmith on How We Change Our Minds About AI Risk

Future of Life Institute Podcast

CHAPTER

The Importance of Scalable Alignment Techniques

A lot of the hardest problems here come specifically from what your alignment technique does with a system that is way smarter than you. So I think in general, people really need to be asking of their alignment techniques: is this scalable? The scalable part is really important. And there's a whole bucket of research into threat modeling and demos that addresses this set of concerns.
