
Creating Career Insurance in the Age of Tech Layoffs (Matt Harrison) - KNN Ep. 136

Ken's Nearest Neighbors

CHAPTER

XGBoost - The Most Complete End-to-End Introduction

The impetus for the XGBoost book is that I do a lot of training, a lot of predictive modeling training. My book dives into that, but also goes into some of the other features and ancillary libraries around XGBoost. It's an end-to-end treatment: it starts off by talking about decision trees, how those are the basis or the nucleus of XGBoost, and builds an intuition around underfitting and overfitting. XGBoost is pretty easy to use. If you need to tune it, you can tune it, but tuning is not too hard.

00:00
Speaker 2
Oh, Matt, we're coming sort of to an end here. I definitely want to talk a little bit more about your new book on XGBoost. First, I'm just interested in what you find so exceptional about XGBoost, and whether it's exceptional enough to write a book on. And then a little bit about where people can learn more about you and your background.
Speaker 1
Yeah, the impetus for the XGBoost book is, like I said, I do a lot of training, a lot of predictive modeling training. And I think XGBoost is just in a good place right now: if you have tabular data and you want to make predictions, it does a great job out of the box. It has a lot of things going for it, and performance-wise, it's really hard to beat. It's pretty easy to use. If you need to tune it, you can tune it, but tuning is not too hard.

And so my book dives into that, but also goes into some of the other features and ancillary libraries around XGBoost, like SHAP and how to use that. There are some libraries around feature interactions, and how to use those. Also, as an educator and author, I read a lot of content, and a lot of machine learning content stops with "this is how you make a prediction." So the book takes that a little bit further: how do you integrate this into MLflow? How do you deploy it? How do you use Docker? How do you query that? Not just using the notebook, but actually doing production with it as well.

So it's end-to-end: it starts off by talking about decision trees, how those are the basis or the nucleus of XGBoost, and builds an intuition around underfitting and overfitting. Then it goes through what I think, based on what I've read and seen, is the most complete end-to-end XGBoost introduction.
Speaker 2
That's awesome. I'm a huge fan of using case studies.
