
MLG 007 Logistic Regression

Machine Learning Guide

Linear Regression - The Logistic Regression Function

Step one is to predict, predict randomly. And then figure out how off we were, how bad we were. That's our error or loss function. Ours is called the log likelihood function because it uses a logarithm in the function. It starts at zero and it goes towards infinity. Y equals infinity at x equals one. The closer my guess is to zero, the closer to zero is the error. Okay, this is very confusing, and don't dwell on the details. One more time. The logistic regression function that gives you that s curve on a graph is one over one plus e to the negative theta transpose x. So linear regression is inside of that logistic function.
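As a rough illustration of the two pieces described here, this is a minimal sketch in Python (NumPy assumed; the variable names and example values are hypothetical, not from the episode). It shows the logistic function 1 / (1 + e^(-θᵀx)) wrapping a linear combination, and a log-based loss that is near zero for a correct guess and grows toward infinity for a badly wrong one.

```python
import numpy as np

def hypothesis(theta, x):
    # Logistic (sigmoid) function: squashes the linear regression
    # output theta^T x into the (0, 1) range, producing the s curve.
    return 1.0 / (1.0 + np.exp(-np.dot(theta, x)))

def log_loss(y, y_hat):
    # Log-likelihood-based error: close to 0 when the predicted
    # probability matches the true label, approaching infinity as the
    # prediction moves toward the wrong extreme.
    return -(y * np.log(y_hat) + (1 - y) * np.log(1 - y_hat))

# Step one: predict with an arbitrary (random) initial guess,
# then measure how far off we were.
theta = np.random.randn(3)         # random starting parameters (hypothetical)
x = np.array([1.0, 2.0, 0.5])      # one example's features, bias term included
y = 1                              # true label for this example
y_hat = hypothesis(theta, x)       # predicted probability
print(y_hat, log_loss(y, y_hat))
```

Gradient descent would then adjust theta to shrink this loss, but that step is outside the excerpt above.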
