
Quantizing Transformers by Helping Attention Heads Do Nothing with Markus Nagel - #663
The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)
Advancements in Multitask Learning and Equivariant Diffusion
This chapter examines innovations in multitask learning, particularly through the lens of the EDGI paper, which introduces equivariant diffusion for planning with embodied agents. It emphasizes the integration of geometric algebra with transformer architectures to improve algorithmic efficiency and scalability in robotic applications.