Neural Network Pruning and Training with Jonathan Frankle at MosaicML

Gradient Dissent: Conversations on AI

How to Speed Up Training With ResNet-50

There are 17 different interventions into training, affecting everything from data augmentation to the inductive bias of the network. You need to apply them wisely, favoring interventions that won't slow down training, and some regularization methods turn out to have no effect. In the early part of training, nothing particularly important or interesting tends to get learned, so that phase can be made cheaper; a sketch of one such intervention follows. There is a lot of art to this as well.
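As an illustration only, here is a minimal sketch of one intervention of the kind discussed: progressive image resizing, which spends less compute on the early epochs where little fine detail is learned. The 160-to-224 resolution schedule, the `resolution_for_epoch` helper, and the `FakeData` stand-in dataset are assumptions for the sketch, not the specific recipe from the episode.

```python
import torch
import torch.nn.functional as F
import torchvision
from torch.utils.data import DataLoader
from torchvision import transforms


def resolution_for_epoch(epoch, total_epochs, min_res=160, max_res=224):
    """Linearly ramp input resolution; early epochs see smaller images."""
    frac = epoch / max(total_epochs - 1, 1)
    res = int(min_res + frac * (max_res - min_res))
    return (res // 32) * 32  # multiples of 32 match ResNet's total stride


def train(num_epochs=3):
    device = "cuda" if torch.cuda.is_available() else "cpu"
    model = torchvision.models.resnet50(num_classes=10).to(device)
    opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9,
                          weight_decay=1e-4)
    loss_fn = torch.nn.CrossEntropyLoss()

    # Stand-in dataset so the sketch runs end to end; swap in ImageNet in practice.
    dataset = torchvision.datasets.FakeData(size=64, image_size=(3, 224, 224),
                                            num_classes=10,
                                            transform=transforms.ToTensor())
    loader = DataLoader(dataset, batch_size=16, shuffle=True)

    for epoch in range(num_epochs):
        res = resolution_for_epoch(epoch, num_epochs)
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            # Downsample on the fly: lower-resolution inputs early in training
            # cut per-step compute while the network learns coarse features.
            if images.shape[-1] != res:
                images = F.interpolate(images, size=res, mode="bilinear",
                                       align_corners=False)
            opt.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            opt.step()


if __name__ == "__main__":
    train()
```

The point of the sketch is the schedule, not the training loop: any intervention that shifts work away from the early epochs trades a small amount of early-phase fidelity for wall-clock speed, which is the kind of judgment call described above.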
