Comparing Rewinding and Fine-tuning in Neural Network Pruning

Paper • ICLR 2020
This research compares three techniques for retraining a neural network after pruning: fine-tuning, weight rewinding, and learning rate rewinding.

Both rewinding techniques outperform fine-tuning, offering a network-agnostic approach that matches state-of-the-art pruning methods in terms of accuracy and compression ratios.

Mentioned by

Keith Duggar, when discussing Jonathan Frankle's research on sparse networks, in the episode "The Lottery Ticket Hypothesis with Jonathan Frankle".
