The Nonlinear Library

AF - LLMs for Alignment Research: a safety priority? by Abram Demski

Apr 4, 2024