
AI Today Podcast: AI Glossary Series – Backpropagation, Learning Rate, and Optimizer

How to Optimize Your Neural Network

Kathleen: Backpropagation is an approach that helps optimize a neural network and get it to convergence. We want the neural net to learn something without taking forever or requiring a ton of data, so we want to get there quickly. An optimizer is an algorithmic function whose job is to speed up, and of course optimize, backpropagation.
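The idea Kathleen describes can be sketched in a few lines of code. This is a hypothetical minimal example, not from the episode: a single linear "neuron" trained by gradient descent, where backpropagation supplies the gradient and the learning rate scales each update (plain SGD standing in for the optimizer).

```python
def train(xs, ys, lr=0.1, epochs=100):
    """Fit w in y = w*x by gradient descent on squared error."""
    w = 0.0
    for _ in range(epochs):
        grad = 0.0
        for x, y in zip(xs, ys):
            # Forward pass: prediction from the current weight
            pred = w * x
            # Backward pass (backpropagation):
            # d(0.5 * (pred - y)^2) / dw = (pred - y) * x
            grad += (pred - y) * x
        grad /= len(xs)
        # Optimizer step: SGD scales the gradient by the learning rate
        w -= lr * grad
    return w

# Data drawn from y = 2x; the weight should converge near w = 2.
w = train([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
```

A learning rate that is too small makes convergence take "forever," as mentioned above, while one that is too large can overshoot and diverge; real optimizers like Adam adapt this step size automatically.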
