
19 - Mechanistic Interpretability with Neel Nanda
AXRP - the AI X-risk Research Podcast
Is There Something Weird About the Optimizer?
There's just something weird about the optimizer we're using that changes our results a bunch, because optimizers are really, really weird. Specifically, we're using AdamW, which is a variant of Adam. AdamW applies weight decay in a principled way, and the details of how it works are annoying, but very roughly, Adam kind of has momentum: it tracks all of the recent gradients and points in what's an average of those.
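For concreteness, here is a minimal sketch of a single AdamW-style update step in NumPy. This is illustrative only, not the episode's actual code; the hyperparameter names (lr, beta1, beta2, weight_decay) are the standard ones, and the form follows the usual decoupled-weight-decay formulation of AdamW.

```python
import numpy as np

def adamw_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=0.01):
    """One AdamW update. m and v are running moment estimates; t is the
    1-indexed step count. Returns the updated param, m, and v."""
    # Momentum: an exponential moving average of recent gradients,
    # so the step points in (roughly) the average recent direction.
    m = beta1 * m + (1 - beta1) * grad
    # Second moment: an EMA of squared gradients, used to scale the step.
    v = beta2 * v + (1 - beta2) * grad**2
    # Bias correction for the zero-initialized averages.
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    # Decoupled weight decay: shrink the weights directly, instead of
    # folding an L2 term into the gradient. This is the key difference
    # between AdamW and plain Adam.
    param = param - lr * weight_decay * param
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```

The point of the decoupled form is that the decay step never passes through the adaptive scaling by sqrt(v_hat), so regularization strength stays independent of the gradient history.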