
#177 - Instagram AI Bots, Noam Shazeer -> Google, FLUX.1, SAM2

Last Week in AI


Advancements in Mixture of Experts Models

This chapter explores the complexities of mixture-of-experts (MoE) models in AI, focusing on Meta's Llama 3.1 and its introduction of multimodality. It also covers new methods for improving model performance and the importance of alignment with human values for AI safety.
