Super Data Science: ML & AI Podcast with Jon Krohn

771: Gradient Boosting: XGBoost, LightGBM and CatBoost, with Kirill Eremenko

00:00

Evolution and Mathematics Behind Gradient Boosting

The chapter explores the history of boosting algorithms, focusing on gradient boosting and its development from AdaBoost. It explains the iterative process of building models that predict the errors of previous models, gradually improving accuracy by learning from mistakes. The episode also delves into the mathematics underlying gradient boosting, covering loss functions and gradients, and discusses the evolution of advanced variants such as XGBoost and LightGBM.
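The iterative process described in the chapter, where each new model is fit to the errors (for squared loss, the negative gradient) left by the ensemble so far, can be illustrated with a minimal sketch. This is a simplified, from-scratch illustration using one-dimensional decision stumps as the base learners; the function names, the learning rate, and the toy dataset are all assumptions for the example, not anything discussed verbatim in the episode.

```python
import numpy as np

def fit_stump(x, y):
    """Fit a depth-1 regression tree (a 'stump') on a 1-D feature,
    choosing the threshold that minimizes squared error."""
    best = None
    for t in np.unique(x):
        left, right = y[x <= t], y[x > t]
        if len(left) == 0 or len(right) == 0:
            continue
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, t, left.mean(), right.mean())
    _, t, left_val, right_val = best
    return lambda z: np.where(z <= t, left_val, right_val)

def gradient_boost(x, y, n_rounds=50, learning_rate=0.1):
    """Gradient boosting for squared loss: each round fits a stump
    to the current residuals (the negative gradient of the loss)."""
    base = y.mean()                      # initial prediction: the mean
    pred = np.full(len(y), base)
    stumps = []
    for _ in range(n_rounds):
        residual = y - pred              # errors of the ensemble so far
        stump = fit_stump(x, residual)   # new model predicts those errors
        pred += learning_rate * stump(x) # shrunken update of the ensemble
        stumps.append(stump)
    return lambda z: base + learning_rate * sum(s(z) for s in stumps)

# Toy data (assumed for illustration): a noisy quadratic.
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 200)
y = x ** 2 + rng.normal(0, 0.3, 200)

model = gradient_boost(x, y)
mse = np.mean((model(x) - y) ** 2)
```

Each round only nudges the ensemble by a fraction (`learning_rate`) of the new stump's prediction, which is the shrinkage idea that XGBoost, LightGBM, and CatBoost all build on, alongside many further refinements such as regularized tree construction and histogram-based splitting.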
