
120. Liam Fedus and Barrett Zoph - AI scaling with mixture of expert models

Towards Data Science


How Many Experts Are We Talking About?

We basically only want one expert most of the time. For most of our recent work, we had 64 or fewer experts. We've actually also seen it be beneficial even if you only have two experts. You don't really even need a supercomputer to get the benefits of something like this. So it almost makes you think of dense models as one end of a continuum of mixture-of-experts models, really.
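The routing scheme described here, where each token is sent to exactly one expert out of a small pool, can be sketched in a few lines. This is an illustrative example only, not code from the episode; all names and sizes are made up, and it uses plain NumPy rather than a real training framework.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: even two experts can help, as noted in the episode.
n_tokens, d_model, n_experts = 4, 8, 2

tokens = rng.normal(size=(n_tokens, d_model))
router_w = rng.normal(size=(d_model, n_experts))           # router weights
expert_w = rng.normal(size=(n_experts, d_model, d_model))  # one weight matrix per expert

# Router produces a score per expert for each token, then a softmax.
logits = tokens @ router_w
probs = np.exp(logits - logits.max(axis=1, keepdims=True))
probs /= probs.sum(axis=1, keepdims=True)

# Top-1 routing: each token picks a single expert.
choice = probs.argmax(axis=1)

out = np.empty_like(tokens)
for e in range(n_experts):
    mask = choice == e
    # Each token is processed only by its chosen expert, scaled by the gate prob.
    out[mask] = (tokens[mask] @ expert_w[e]) * probs[mask, e : e + 1]

print(out.shape)  # (4, 8): every token went through exactly one expert
```

With `n_experts = 1` the routing becomes trivial and this collapses to an ordinary dense layer, which is the sense in which dense models sit at one end of the mixture-of-experts continuum.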

