
Changelog News
LMMs are the new LLMs
Oct 16, 2023
Chip Huyen, an expert on machine learning systems, talks about the shift from LLMs to LMMs and the notable companies developing them. Herman Õunapuu reviews the Zimaboard as a home server. Bryan Braun shares his recent VSCode configuration discoveries. Swizec Teller summarizes the AI Engineer Summit.
06:28
Episode notes
Podcast summary created with Snipd AI
Quick takeaways
- Developing large multimodal models expands the capabilities of machine learning by incorporating multiple data modalities.
- The Zimaboard offers an affordable and polished alternative to Raspberry Pi for home server setups with its pre-attached case, cooling setup, and satisfactory performance.
Deep dives
Shifting to Large Multimodal Models
Chip Huyen discusses the importance of incorporating multiple modalities into machine learning models. Previously, ML models operated on a single data modality (text, image, or audio), which limited what they could do. Humans, by contrast, work across modalities: we read and write text, view images, watch videos, and listen to music. To address this limitation, large multimodal models (LMMs) are being developed by companies like DeepMind, Salesforce, Microsoft, Tencent, and OpenAI. Chip's post provides detailed insights and highlights the significance of working with multimodal data.
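To make "multimodal" concrete, here is a minimal sketch (not from the episode) of image-plus-text inference using Salesforce's openly released BLIP-2 model via the Hugging Face transformers library; the checkpoint name, image path, and prompt are illustrative assumptions.

```python
# Minimal multimodal inference sketch: image + text in, text out.
# Assumes: pip install torch transformers pillow; file path and prompt are placeholders.
from PIL import Image
from transformers import Blip2Processor, Blip2ForConditionalGeneration

checkpoint = "Salesforce/blip2-opt-2.7b"  # illustrative BLIP-2 checkpoint
processor = Blip2Processor.from_pretrained(checkpoint)
model = Blip2ForConditionalGeneration.from_pretrained(checkpoint)

image = Image.open("photo.jpg")                        # image modality
prompt = "Question: what is in this photo? Answer:"    # text modality

# The processor fuses both modalities into a single set of model inputs.
inputs = processor(images=image, text=prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=30)
print(processor.decode(output_ids[0], skip_special_tokens=True))
```

The point of the sketch is the input signature: a single model consumes an image and a text prompt together, rather than handing each modality to a separate single-mode model.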