The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

Recurrence and Attention for Long-Context Transformers with Jacob Buckman - #750



RNNs vs. Transformers: State Imbalance Problem

Jacob contrasts the transformer's huge, ever-growing state with the RNN's tiny fixed-size state, arguing that RNNs need a larger state while transformers need theirs reduced.
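The asymmetry can be sketched numerically: an RNN compresses its entire history into one fixed-size hidden vector, while a transformer's KV cache stores keys and values for every token, growing linearly with context length. A minimal back-of-the-envelope illustration (the dimensions below are hypothetical, not from the episode):

```python
def rnn_state_size(hidden_dim: int, seq_len: int) -> int:
    # An RNN keeps a single hidden vector of size hidden_dim,
    # regardless of how many tokens it has processed.
    return hidden_dim

def transformer_state_size(hidden_dim: int, seq_len: int, n_layers: int = 1) -> int:
    # A transformer's KV cache holds one key vector and one value vector
    # per token per layer, so state grows linearly with sequence length.
    return 2 * hidden_dim * seq_len * n_layers

if __name__ == "__main__":
    for seq_len in (128, 4096, 131072):
        print(f"len={seq_len:>7}  "
              f"RNN state={rnn_state_size(1024, seq_len):>12}  "
              f"transformer state={transformer_state_size(1024, seq_len, 32):>14}")
```

At long contexts the transformer's state is orders of magnitude larger than the RNN's, which is the imbalance the segment title refers to.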
