
AI Today Podcast: AI Glossary Series – Loss Function, Cost Function and Gradient Descent
AI Today Podcast
How to Adjust Weights in a Neural Network
Convergence is, almost literally, what we mean: we want all those errors to converge on the bullseye, reaching the right answer as quickly as possible. Squaring the errors heavily penalizes predictions that are far off. This, by itself, is the strategy we use to adjust the weights in an artificial neural network.
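The idea described above can be sketched in a few lines of code. This is a minimal illustration (not from the episode): gradient descent on a mean-squared-error loss for a single-weight linear model, where the weight is repeatedly nudged downhill until the error converges toward its minimum. All names here are illustrative.

```python
def train(xs, ys, lr=0.01, steps=1000):
    """Fit y = w * x by gradient descent on mean squared error."""
    w = 0.0  # initial guess for the weight
    n = len(xs)
    for _ in range(steps):
        # MSE loss: (1/n) * sum((w*x - y)^2)
        # Its gradient with respect to w:
        grad = (2.0 / n) * sum((w * x - y) * x for x, y in zip(xs, ys))
        w -= lr * grad  # step downhill: errors "converge on the bullseye"
    return w

# Data generated by y = 3x, so w should converge to about 3.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 6.0, 9.0, 12.0]
w = train(xs, ys)
```

Because the errors are squared, a prediction that is twice as wrong contributes four times as much to the loss, which is why large mistakes dominate the weight updates.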