
AI Today Podcast: AI Glossary Series – Loss Function, Cost Function and Gradient Descent
AI Today Podcast
The Terms Loss Function, Cost Function, and Gradient Descent
On today's podcast, we're going to go over the terms loss function, cost function, and gradient descent. A loss function measures the error between a single prediction and the corresponding actual value. A cost function is the aggregation of all of those losses, or errors, across the neural network during one pass of training. As we train our models, we iterate with each training cycle to lower the network's total error, that total cost, until we reach convergence; that iterative process of stepping toward lower cost is gradient descent.
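To make these three terms concrete, here is a minimal sketch, not from the episode, assuming a tiny linear model with a squared-error loss; the data, variable names, and learning rate are purely illustrative.

```python
import numpy as np

# Hypothetical toy data: inputs x and actual values y for a roughly linear relationship.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 4.1, 5.9, 8.2])

def loss(prediction, actual):
    """Loss function: error between a single prediction and the corresponding actual value."""
    return (prediction - actual) ** 2

def cost(w, b):
    """Cost function: aggregation (here, the mean) of all individual losses in one pass."""
    predictions = w * x + b
    return np.mean([loss(p, a) for p, a in zip(predictions, y)])

# Gradient descent: each training cycle nudges the parameters in the direction
# that lowers the total cost, repeating until the cost stops improving (convergence).
w, b = 0.0, 0.0
learning_rate = 0.01
for step in range(1000):
    predictions = w * x + b
    # Gradients of the mean squared-error cost with respect to w and b.
    grad_w = np.mean(2 * (predictions - y) * x)
    grad_b = np.mean(2 * (predictions - y))
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"Learned w={w:.2f}, b={b:.2f}, final cost={cost(w, b):.4f}")
```

Running this prints parameters close to the slope and intercept of the toy data, with the cost shrinking each cycle just as described above.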