Quantizing Transformers by Helping Attention Heads Do Nothing with Markus Nagel - #663

The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

Enhancing Efficiency through Multitask and Multidomain Learning

This chapter explores scalarization in multitask and multidomain learning, emphasizing its role in improving training and inference efficiency. It introduces a population-based training method that dynamically adjusts the scalar weights used to combine multiple task losses, optimizing model performance through an evolutionary algorithm.
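The episode itself contains no code, but the idea in the summary is straightforward to sketch. Below is a minimal, hypothetical NumPy illustration of scalarization with population-based training: each population member trains a shared model on a weighted sum of two task losses, alpha * L_a + (1 - alpha) * L_b, and an evolutionary exploit/explore step copies the best member and perturbs its weight. The toy two-task regression and all names (task_losses, sgd_step, alpha) are illustrative assumptions, not the method from the work discussed in the episode.

```python
# Minimal sketch of scalarization + population-based training (assumed setup,
# not the episode's actual method). NumPy only, so it runs as-is.
import numpy as np

rng = np.random.default_rng(0)

# Toy two-task regression: one shared parameter vector, two targets.
X = rng.normal(size=(256, 8))
w_true_a, w_true_b = rng.normal(size=8), rng.normal(size=8)
y_a, y_b = X @ w_true_a, X @ w_true_b

def task_losses(w):
    """Per-task mean squared errors for a shared parameter vector w."""
    return np.mean((X @ w - y_a) ** 2), np.mean((X @ w - y_b) ** 2)

def sgd_step(w, alpha, lr=0.01):
    """One gradient step on the scalarized loss alpha*L_a + (1-alpha)*L_b."""
    grad_a = 2 * X.T @ (X @ w - y_a) / len(X)
    grad_b = 2 * X.T @ (X @ w - y_b) / len(X)
    return w - lr * (alpha * grad_a + (1 - alpha) * grad_b)

# Population-based training: each member holds model parameters plus a
# scalarization weight alpha; periodically clone the best and perturb alpha.
population = [{"w": rng.normal(size=8), "alpha": rng.uniform(0.1, 0.9)}
              for _ in range(4)]

for generation in range(20):
    for m in population:
        for _ in range(25):                      # short training burst
            m["w"] = sgd_step(m["w"], m["alpha"])
        m["score"] = sum(task_losses(m["w"]))    # validation proxy
    population.sort(key=lambda m: m["score"])
    best = population[0]
    for m in population[2:]:                     # exploit: clone the best
        m["w"] = best["w"].copy()
        # explore: mutate the scalar weight (the evolutionary step)
        m["alpha"] = float(np.clip(best["alpha"] + rng.normal(0, 0.1),
                                   0.01, 0.99))

print(f"best alpha={best['alpha']:.2f}, losses={task_losses(best['w'])}")
```

In this sketch the evolutionary search tunes only the single scalar alpha; the approach generalizes to one weight per loss, with training cost that stays close to single-run training because the weights are adapted on the fly rather than swept over in separate runs.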
