
The Data Scientist Show

Becoming a deep learning researcher without a PhD, graph neural networks (GNN), time series, recommender systems with Kyle Kranen - The Data Scientist Show #028

Mar 3, 2022
Exploring deep learning research with Kyle Kranen, a Deep Learning Software Engineer at Nvidia. Topics include graph neural networks (GNN), the Temporal Fusion Transformer (TFT), time series forecasting, and insights from his career journey.
01:57:10


Podcast summary created with Snipd AI

Quick takeaways

  • Temporal Fusion Transformers combine attention with a recurrent encoder-decoder to produce multi-horizon forecasts.
  • TFT adapts to non-stationary time series data better than ARIMA and XGBoost.

Deep dives

Temporal Fusion Transformers in Time Series Forecasting

Temporal Fusion Transformers (TFT) combine attention with a recurrent encoder-decoder to produce multi-horizon forecasts, i.e., predictions that extend beyond the next time step. The model incorporates variable selection that uses static information to gate the input features, which improves interpretability. TFT also uses attention to provide interpretability across both the feature and temporal axes, giving detailed insight into what drives each prediction.
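To make those pieces concrete, here is a minimal, illustrative sketch in PyTorch of the ideas described above: static-context-gated variable selection, a recurrent encoder, self-attention over time, and a multi-horizon output head. The class names, layer sizes, and toy data are assumptions made for illustration only; this is not the reference TFT implementation from the paper or from NVIDIA's libraries.

```python
# Illustrative sketch only -- not the official TFT architecture or API.
import torch
import torch.nn as nn

class VariableSelection(nn.Module):
    """Gate per-variable inputs with weights conditioned on static context."""
    def __init__(self, n_vars, d_model):
        super().__init__()
        self.proj = nn.ModuleList([nn.Linear(1, d_model) for _ in range(n_vars)])
        self.gate = nn.Linear(n_vars + d_model, n_vars)

    def forward(self, x, static_ctx):
        # x: (batch, time, n_vars); static_ctx: (batch, d_model)
        ctx = static_ctx.unsqueeze(1).expand(-1, x.size(1), -1)
        weights = torch.softmax(self.gate(torch.cat([x, ctx], dim=-1)), dim=-1)
        projected = torch.stack(
            [p(x[..., i:i + 1]) for i, p in enumerate(self.proj)], dim=-1
        )  # (batch, time, d_model, n_vars)
        # Weighted sum over variables; the weights double as a feature-importance signal.
        return (projected * weights.unsqueeze(-2)).sum(-1), weights

class TinyTFT(nn.Module):
    """Recurrent encoder plus self-attention over time, with a multi-horizon head."""
    def __init__(self, n_vars, d_model=32, horizon=12):
        super().__init__()
        self.select = VariableSelection(n_vars, d_model)
        self.encoder = nn.LSTM(d_model, d_model, batch_first=True)
        self.attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
        self.head = nn.Linear(d_model, horizon)  # predicts all horizons at once

    def forward(self, x, static_ctx):
        selected, var_weights = self.select(x, static_ctx)
        hidden, _ = self.encoder(selected)
        attended, attn_weights = self.attn(hidden, hidden, hidden)
        forecast = self.head(attended[:, -1])  # (batch, horizon)
        return forecast, var_weights, attn_weights

# Toy usage: 8 past observations of 3 variables, forecasting 12 steps ahead.
x = torch.randn(4, 8, 3)
static_ctx = torch.randn(4, 32)
model = TinyTFT(n_vars=3, horizon=12)
forecast, var_w, attn_w = model(x, static_ctx)
print(forecast.shape)  # torch.Size([4, 12])
```

In this sketch, the variable-selection weights and the attention weights are the two signals that give the feature-level and temporal interpretability discussed in the episode.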
