
#031 WE GOT ACCESS TO GPT-3! (With Gary Marcus, Walid Saba and Connor Leahy)
Machine Learning Street Talk (MLST)
00:00
Exploring Neural Network Architectures: LSTMs vs. Transformers
This chapter explores the limitations of LSTMs and transformers, focusing on the challenges of forgetting and context fragmentation. The discussion highlights the possibility that scaled-up LSTMs could match transformer performance, prompting a reevaluation of how different these two architectures really are.
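To make the contrast concrete, here is a minimal sketch (not from the episode) assuming PyTorch: an LSTM carries a fixed-size recurrent state across time steps, where long-range information can be forgotten, while a transformer encoder layer attends over its whole (but bounded) context window, which is where context fragmentation enters the trade-off.

```python
import torch
import torch.nn as nn

# Toy batch: 4 sequences of length 128, embedding size 64.
x = torch.randn(4, 128, 64)

# LSTM: processes tokens step by step, carrying a fixed-size hidden state
# (h, c); long-range information must survive many state updates, which is
# where "forgetting" becomes an issue.
lstm = nn.LSTM(input_size=64, hidden_size=64, batch_first=True)
lstm_out, (h, c) = lstm(x)          # lstm_out: (4, 128, 64)

# Transformer encoder layer: every position attends to every other position
# inside the fixed-length context window, so there is no recurrent state to
# forget -- but nothing outside the window is visible, hence fragmentation.
encoder_layer = nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True)
attn_out = encoder_layer(x)         # attn_out: (4, 128, 64)

print(lstm_out.shape, attn_out.shape)
```

The shapes match, but the information flow differs: the LSTM's memory is a bottleneck of constant size, whereas the transformer's memory is the window itself.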