Explaining Grokking Through Circuit Efficiency

Deep Papers

Balancing Cross Entropy Loss and Weight Decay in Deep Learning Models

This chapter explores the tradeoff between scaling up parameter norm to drive down cross-entropy loss on the training data and keeping parameter norm small to generalize better. It also highlights how weight decay pushes models toward generalization rather than memorization.
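As a minimal sketch of the objective this tension comes from (assuming a PyTorch-style classifier; the model, data shapes, and weight_decay value below are illustrative and not taken from the episode), the training loss can be written as cross-entropy plus an explicit L2 penalty on the parameters:

```python
import torch
import torch.nn.functional as F

def total_loss(model, inputs, targets, weight_decay=1e-2):
    """Cross-entropy fit term plus an explicit L2 (weight decay) penalty.

    The two terms pull in opposite directions: cross-entropy keeps
    shrinking as the logits (and hence the parameter norm) grow, while
    the weight-decay term penalizes large parameter norms.
    """
    logits = model(inputs)
    ce = F.cross_entropy(logits, targets)                  # fit the training data
    l2 = sum(p.pow(2).sum() for p in model.parameters())   # squared parameter norm
    return ce + weight_decay * l2

# Illustrative usage with a toy linear classifier.
model = torch.nn.Linear(10, 5)
x = torch.randn(8, 10)
y = torch.randint(0, 5, (8,))
loss = total_loss(model, x, y)
loss.backward()
```

In practice the same penalty is usually applied through the optimizer's weight_decay argument rather than added to the loss by hand; it is written out here only to make the two competing terms explicit.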
