
120. Liam Fedus and Barrett Zoph - AI scaling with mixture of expert models

Towards Data Science


Is There a Paradigm of Different Inputs?

In the future, our models are sure to have a much greater degree of adaptivity, and again, that flexibility comes up as a final thing. I'm curious, for people who are thinking about doing research on mixture of experts models, or even implementing their own: what are some of the design principles that you've identified over the course of your research that might save people a lot of time?
