
#169 - Google's Search Errors, OpenAI news & DRAMA, new leaderboards
Last Week in AI
Optimizing Learning Rate in Training Models
Keeping the learning rate constant for the whole training run is rarely optimal. The suggestion is to start with a larger learning rate (bigger steps) early on, when the model is learning from scratch, and then gradually decay it as the model improves. Momentum-based, adaptive optimizers such as Adam build similar heuristics in, adjusting the effective step size automatically during training. A commonly cited default, half-jokingly dubbed the 'magic number' by Andrej Karpathy, is 3e-4.
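As an illustration of the "start fast, then decay" idea, here is a minimal sketch of a warmup-plus-cosine-decay schedule in plain Python. The function name, parameter names (base_lr, warmup_steps, min_lr), and the specific values are illustrative assumptions, not the exact schedule discussed in the episode.

```python
import math

def learning_rate(step, total_steps, base_lr=3e-4, warmup_steps=1000, min_lr=1e-5):
    """Linear warmup to base_lr, then cosine decay down to min_lr.
    Hypothetical schedule for illustration only."""
    if step < warmup_steps:
        # Early in training: ramp the learning rate up quickly from near zero.
        return base_lr * (step + 1) / warmup_steps
    # After warmup: gradually reduce the learning rate as the model improves.
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (base_lr - min_lr) * (1 + math.cos(math.pi * progress))

# Example: inspect the schedule at a few points in a 10,000-step run.
for s in [0, 500, 1000, 5000, 9999]:
    print(s, f"{learning_rate(s, total_steps=10_000):.2e}")
```

In practice, deep learning frameworks provide equivalent scheduler utilities; the sketch above only shows the shape of the curve being described.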