
Dynamic Token Merging for Efficient Byte-level Language Models with Julie Kallini - #724
The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)
Exploring GPT-2 and Impossible Languages
This chapter examines the challenges language models like GPT-2 face when processing 'impossible languages' that defy the rules of natural language. It discusses training these models from scratch on such constructed language variants and reflects on the architectural adjustments needed for better performance.