AI Snips
Boosting Explained for Time Series
- Boosting improves accuracy by combining many weak learners into a strong learner, with each round focusing on the examples the ensemble still gets wrong (see the sketch below).
- It fits time series naturally: the effects of categorical variables are removed one after another, purifying the series for forecasting.
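A minimal sketch of the boosting loop on a synthetic seasonal series: each shallow decision tree is a weak learner fit to the residuals left by the ensemble so far, so later rounds concentrate on what earlier rounds missed. The features, tree depth, and learning rate are illustrative choices, not taken from the episode.

```python
# Illustrative boosting loop on a toy seasonal series: each shallow tree
# (weak learner) is fit to the residuals of the ensemble built so far.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
t = np.arange(500)
X = np.column_stack([t % 24, (t // 24) % 7])  # hour-of-day, day-of-week
y = 3 * np.sin(2 * np.pi * X[:, 0] / 24) + (X[:, 1] >= 5) + rng.normal(0, 0.3, len(t))

prediction = np.full_like(y, y.mean())
learning_rate, trees = 0.1, []
for _ in range(100):
    residual = y - prediction                      # the part still unexplained
    tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
    trees.append(tree)
    prediction += learning_rate * tree.predict(X)  # add the weak learner's correction

print("final training MSE:", np.mean((y - prediction) ** 2))
```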
Boosted Embeddings Remove Seasonality
- Boosted embeddings strip out seasonal and categorical effects, yielding a purified time series for modeling.
- Freezing each embedding's weights while successively removing categorical impacts fits naturally into a boosting framework (a minimal sketch follows this list).
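The episode describes successively fitting an embedding for each categorical variable, freezing its weights, and subtracting its learned effect before moving to the next. The sketch below mimics that idea with one scalar PyTorch embedding per categorical variable on synthetic data; it is an illustration of the described framework under simplified assumptions, not the DeepGB implementation.

```python
# Boosting-style removal of categorical effects: fit one embedding at a time,
# freeze it, subtract its contribution, and pass the residual to the next round.
import numpy as np
import torch
import torch.nn as nn

rng = np.random.default_rng(0)
n = 24 * 30  # hourly data for 30 days
hour = np.arange(n) % 24
dow = (np.arange(n) // 24) % 7
y = 10 + 3 * np.sin(2 * np.pi * hour / 24) + 2 * (dow >= 5) + rng.normal(0, 0.5, n)

y_t = torch.tensor(y, dtype=torch.float32)
residual = y_t - y_t.mean()  # start from the de-meaned series
categoricals = {"hour": (torch.tensor(hour), 24), "dow": (torch.tensor(dow), 7)}

for name, (idx, cardinality) in categoricals.items():
    emb = nn.Embedding(cardinality, 1)             # scalar effect per category level
    opt = torch.optim.Adam(emb.parameters(), lr=0.05)
    for _ in range(300):
        opt.zero_grad()
        pred = emb(idx).squeeze(-1)
        loss = ((residual - pred) ** 2).mean()
        loss.backward()
        opt.step()
    emb.weight.requires_grad_(False)               # freeze this round's weights
    residual = residual - emb(idx).squeeze(-1).detach()  # remove its effect
    print(f"after removing {name}: residual std = {residual.std().item():.3f}")
```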
Use DeepGB As Preprocessor
- Use DeepGB to compute robust embeddings that remove nonlinear seasonal components and co-varying categorical effects.
- After the embedding step, fit any forecasting model, such as CatBoost or ARIMA, on the adjusted series; the pipeline stays modular (see the sketch below).
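Once the series has been purified, any standard forecaster can be fit on it. The snippet below assumes `adjusted` is the residual series left after the embedding step (here replaced by a stand-in) and fits a statsmodels ARIMA as one interchangeable choice; adding the removed seasonal and categorical effects back would recover forecasts on the original scale.

```python
# Modular downstream step: fit an off-the-shelf forecaster on the purified series.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
adjusted = pd.Series(rng.normal(0, 0.5, 200))  # stand-in for the purified series

model = ARIMA(adjusted, order=(1, 0, 1)).fit()
forecast_adjusted = model.forecast(steps=24)

# To get forecasts on the original scale, the removed seasonal/categorical
# effects (e.g. the frozen embedding outputs) would be added back here.
print(forecast_adjusted.head())
```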