Neural Network Pruning and Training with Jonathan Frankle at MosaicML

Gradient Dissent: Conversations on AI

How Much Can You Prune?

The answer is usually somewhere between 2x and 10x compression via pruning. But in some crazy cases, if people set it up right, you can prune down to 100x smaller. It does tend to be pretty consistent across random initializations and random seeds. We spent 20 or 30 years trying to do exactly that.
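To make the compression ratios concrete, here is a minimal sketch of magnitude pruning in plain NumPy: a compression factor of 10x means keeping only the largest 10% of weights by absolute value and zeroing the rest. This is a generic illustration, not the specific method discussed in the episode; the function name and parameters are made up for the example.

```python
import numpy as np

def magnitude_prune(weights, compression):
    """Zero out the smallest-magnitude weights so that only
    1/compression of them remain (e.g. compression=10 keeps ~10%)."""
    keep = max(1, int(weights.size / compression))
    # Threshold at the keep-th largest absolute value.
    threshold = np.sort(np.abs(weights), axis=None)[-keep]
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256))

pruned, mask = magnitude_prune(w, compression=10)
print(f"kept {mask.mean():.1%} of weights")  # roughly 10% remain nonzero
```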

