Data Skeptic

Boosted Embeddings for Time Series

Oct 4, 2021
INSIGHT

Boosting Explained for Time Series

  • Boosting improves accuracy by combining many weak learners into a single strong learner, with each iteration focusing on the examples that are still hard to predict.
  • It fits time series naturally: successive rounds remove the effects of categorical variables, purifying the series for forecasting (a minimal sketch follows this list).
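As a rough illustration of the boosting-on-residuals idea (not code from the episode or the paper), the sketch below fits shallow trees one after another to whatever the current ensemble has not yet explained; the function name and hyperparameters are illustrative.

```python
# Minimal boosting-on-residuals sketch: each round fits a weak learner
# (a shallow tree) to the residuals of the ensemble so far, so later
# rounds concentrate on the examples that remain hard to predict.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def boost(X, y, n_rounds=50, learning_rate=0.1):
    prediction = np.full(len(y), float(np.mean(y)))     # start from the mean
    learners = []
    for _ in range(n_rounds):
        residual = y - prediction                       # what is still unexplained
        tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
        prediction += learning_rate * tree.predict(X)   # add the weak learner's correction
        learners.append(tree)
    return learners, prediction
```

In the time-series setting discussed in the episode, the weak learners in the early rounds are embeddings of categorical features rather than trees.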
INSIGHT

Boosted Embeddings Remove Seasonality

  • Boosted embeddings remove seasonal and categorical effects, yielding a purified time series for modeling.
  • Freezing the embedding weights after each round while successively removing categorical impacts fits naturally into a boosting framework (sketched below).
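A minimal sketch of that idea, assuming PyTorch; the loop structure, names, and hyperparameters are illustrative guesses rather than the paper's implementation. Each round learns a scalar effect per category (for example, month or day of week), subtracts the fitted effect from the series, and freezes the embedding before the next round.

```python
# Illustrative embedding-as-weak-learner loop: one boosting round per
# categorical feature; weights are frozen once that feature's effect
# has been removed from the (increasingly purified) series.
import torch

def remove_categorical_effects(y, categorical_columns, n_categories, epochs=200):
    residual = torch.as_tensor(y, dtype=torch.float32)
    frozen = []
    for col, n_cat in zip(categorical_columns, n_categories):
        idx = torch.as_tensor(col, dtype=torch.long)
        emb = torch.nn.Embedding(n_cat, 1)              # scalar effect per category
        opt = torch.optim.Adam(emb.parameters(), lr=0.05)
        for _ in range(epochs):
            opt.zero_grad()
            loss = torch.nn.functional.mse_loss(emb(idx).squeeze(-1), residual)
            loss.backward()
            opt.step()
        emb.weight.requires_grad_(False)                # freeze before the next round
        with torch.no_grad():
            residual = residual - emb(idx).squeeze(-1)  # remove this feature's effect
        frozen.append(emb)
    return frozen, residual                             # residual = purified series
```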
ADVICE

Use DeepGB As Preprocessor

  • Use DeepGB to compute robust embeddings that remove nonlinear seasonal components and the effects of covarying categorical variables.
  • After embedding, fit any forecasting model, such as CatBoost or ARIMA, on the adjusted time series; the two stages stay modular (see the sketch below).
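A sketch of that modular two-stage workflow, assuming the adjusted series has already been produced by an embedding step like the one above; the function and argument names here are hypothetical, and statsmodels' ARIMA stands in for whichever downstream forecaster is preferred.

```python
# Stage 2 of the (illustrative) pipeline: forecast the adjusted series
# with an ordinary model, then add the known categorical/seasonal
# component back in over the forecast horizon.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def forecast_adjusted(adjusted_series, future_categorical_effect, steps=12):
    fitted = ARIMA(np.asarray(adjusted_series), order=(1, 0, 1)).fit()
    base = fitted.forecast(steps=steps)                 # forecast of the purified series
    return base + np.asarray(future_categorical_effect[:steps])
```

Because the embedding stage only changes the target series, swapping ARIMA for CatBoost (with lag features) or any other regressor requires no change to the preprocessing.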