

Comparing Rewinding and Fine-tuning in Neural Network Pruning
Paper • 2020
This research compares three retraining strategies for neural network pruning: fine-tuning, weight rewinding, and learning rate rewinding.
Both rewinding techniques outperform fine-tuning, offering a network-agnostic approach that matches state-of-the-art pruning methods in accuracy and compression ratio.
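As a rough illustration of how the three retraining strategies differ, the sketch below assumes a PyTorch model, an illustrative global magnitude-pruning mask, a placeholder `retrain` loop, and an assumed learning-rate schedule; it is a minimal sketch of the idea, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): contrasting fine-tuning, weight
# rewinding, and learning-rate rewinding after magnitude pruning in PyTorch.
# The model, sparsity level, schedule, and `retrain` stub are assumptions.
import copy
import torch
import torch.nn as nn

def magnitude_mask(model: nn.Module, sparsity: float) -> dict:
    """Mask that zeroes the smallest-magnitude weights globally."""
    all_w = torch.cat([p.detach().abs().flatten() for p in model.parameters()])
    threshold = torch.quantile(all_w, sparsity)
    return {n: (p.detach().abs() > threshold).float()
            for n, p in model.named_parameters()}

def apply_mask(model: nn.Module, mask: dict) -> None:
    with torch.no_grad():
        for n, p in model.named_parameters():
            p.mul_(mask[n])

def retrain(model: nn.Module, mask: dict, lr_schedule: list) -> None:
    """Placeholder: a real loop would train with the given per-epoch LRs,
    re-applying the mask after each update so pruned weights stay zero."""
    pass

model = nn.Sequential(nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))
rewind_epoch = 10
full_lr_schedule = [0.1] * 80 + [0.01] * 40   # assumed original schedule
early_state = copy.deepcopy(model.state_dict())  # in practice, saved at rewind_epoch
# ... train model to completion with full_lr_schedule ...

mask = magnitude_mask(model, sparsity=0.8)

# Fine-tuning: keep the final weights, retrain at the final (small) LR.
ft = copy.deepcopy(model); apply_mask(ft, mask)
retrain(ft, mask, lr_schedule=[0.01] * 40)

# Weight rewinding: reset weights to the rewind_epoch checkpoint and replay
# the original schedule from that point.
wr = copy.deepcopy(model); wr.load_state_dict(early_state); apply_mask(wr, mask)
retrain(wr, mask, lr_schedule=full_lr_schedule[rewind_epoch:])

# Learning-rate rewinding: keep the final weights but replay the original
# schedule from the rewind point.
lrr = copy.deepcopy(model); apply_mask(lrr, mask)
retrain(lrr, mask, lr_schedule=full_lr_schedule[rewind_epoch:])
```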
Mentioned by
Keith Duggar, when discussing Jonathan Frankle's research on sparse networks.

Episode: The Lottery Ticket Hypothesis with Jonathan Frankle