
Long Context Language Models and their Biological Applications with Eric Nguyen - #690
The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)
Advancements in Hyena Models
This chapter discusses the advances in the Hyena model's context length capabilities, scaling from 64,000 to 1 million tokens. Key topics include innovative attention techniques, challenges in time complexity, and the impact of synthetic language on model training and evaluation.