577: Big Boy American Man

Connected

Mixture-of-Experts Architecture Simplified

Federico outlines mixture-of-experts (MoE) models, which activate only a subset of their parameters for each input so that very large models can run efficiently.

The highlight begins at 59:20 in the episode.
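
The episode gives only this high-level description, but the routing idea behind it is easy to sketch: a small gating network scores every expert for each token, and only the top-scoring few experts actually run. Below is a minimal, illustrative NumPy sketch, not from the episode; all names and sizes (moe_forward, D, H, E, K) are assumptions for the example.

```python
# Minimal mixture-of-experts routing sketch (illustrative only; the episode
# does not describe any particular implementation).
import numpy as np

rng = np.random.default_rng(0)

D, H, E, K = 16, 32, 8, 2  # model dim, expert hidden dim, num experts, experts used per token

# Each expert is a small two-layer MLP; only K of the E experts run per token.
experts = [(rng.normal(0, 0.1, (D, H)), rng.normal(0, 0.1, (H, D))) for _ in range(E)]
router = rng.normal(0, 0.1, (D, E))  # gating network: one score per expert per token

def moe_forward(x):
    """Route a single token vector x (shape [D]) through its top-K experts."""
    logits = x @ router                          # score every expert
    topk = np.argsort(logits)[-K:]               # indices of the K best experts
    weights = np.exp(logits[topk])
    weights /= weights.sum()                     # softmax over the selected experts only
    out = np.zeros(D)
    for w, i in zip(weights, topk):
        w1, w2 = experts[i]
        out += w * (np.maximum(x @ w1, 0) @ w2)  # run only the chosen experts
    return out

token = rng.normal(size=D)
print(moe_forward(token).shape)  # (16,) -- output matches the input shape
```

With K = 2 of E = 8 experts active, each token touches only a quarter of the expert parameters, which is the efficiency win the highlight refers to.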
