BI NMA 04: Deep Learning Basics Panel

Brain Inspired

Do We Need to Regularize?

Desetabi: In theory, if you don't regularize at all, you still won't wildly overfit. In practice, you just pick the level of regularization that helps you perform best on your validation set, and then eventually test on a test set. So I think it's always a good idea to regularize, but what the theory tells us is why you don't do that badly if you don't. Regularization has taken the field of statistics by storm; there are card-carrying statistical theorists theorizing about this now.
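The workflow described here, sweeping regularization strengths, keeping the one that performs best on a validation set, and only then scoring the held-out test set, can be sketched with ridge regression. This is a minimal illustration; the data, split sizes, and lambda grid are all invented for the example, not taken from the episode:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear data: only 3 of 20 features matter, so some
# regularization helps, but even lambda = 0 is not catastrophic.
n, d = 60, 20
X = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[:3] = [2.0, -1.0, 0.5]
y = X @ w_true + 0.5 * rng.normal(size=n)

# Split into train / validation / test.
X_tr, y_tr = X[:30], y[:30]
X_va, y_va = X[30:45], y[30:45]
X_te, y_te = X[45:], y[45:]

def ridge_fit(X, y, lam):
    """Closed-form ridge regression: w = (X'X + lam*I)^{-1} X'y."""
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ y)

def mse(X, y, w):
    """Mean squared error of predictions X @ w against targets y."""
    return float(np.mean((X @ w - y) ** 2))

# Pick the regularization level that does best on the validation set,
# then report performance once on the held-out test set.
lams = [0.0, 0.01, 0.1, 1.0, 10.0, 100.0]
best_lam = min(lams, key=lambda lam: mse(X_va, y_va, ridge_fit(X_tr, y_tr, lam)))
w = ridge_fit(X_tr, y_tr, best_lam)
print("chosen lambda:", best_lam, "test MSE:", mse(X_te, y_te, w))
```

Note the design choice the panel is pointing at: the test set is touched exactly once, after the validation set has already fixed the regularization level.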

