OLMo: Everything You Need to Train an Open Source LLM with Akshita Bhagia - #674

The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

CHAPTER

Challenges in Training Large-Scale Language Models

This chapter explores the challenges of training large-scale language models, particularly the difficulty of reproducing existing results and the impact of architectural decisions such as weight tying. It highlights the value of collaborative research, the role of perplexity as an evaluation metric, and the Paloma project, which provides a comprehensive perplexity benchmarking framework. The discussion also weighs open sourcing models against security concerns, arguing that transparency fosters a healthier AI development ecosystem.
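Two techniques named in this summary lend themselves to a short illustration. The sketch below is a minimal PyTorch example, written independently of OLMo's actual codebase, showing weight tying (the output projection reuses the token embedding matrix) and perplexity (the exponential of the mean per-token cross-entropy). The names `TinyLM` and `perplexity` are illustrative assumptions, not the OLMo API.

```python
# Minimal sketch (not OLMo's code) of weight tying and perplexity evaluation.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyLM(nn.Module):
    """Just an embedding/head pair, enough to demonstrate weight tying."""
    def __init__(self, vocab_size: int, d_model: int, tie_weights: bool = True):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.lm_head = nn.Linear(d_model, vocab_size, bias=False)
        if tie_weights:
            # Weight tying: both matrices have shape (vocab_size, d_model),
            # so the head can share the embedding's parameters, saving
            # vocab_size * d_model weights.
            self.lm_head.weight = self.embed.weight

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # A real model would run transformer blocks between these two steps.
        hidden = self.embed(token_ids)
        return self.lm_head(hidden)

@torch.no_grad()
def perplexity(logits: torch.Tensor, targets: torch.Tensor) -> float:
    """Perplexity = exp(mean per-token cross-entropy)."""
    loss = F.cross_entropy(
        logits.reshape(-1, logits.size(-1)), targets.reshape(-1)
    )
    return math.exp(loss.item())

# Usage: score a batch of random token sequences.
model = TinyLM(vocab_size=50_000, d_model=256)
tokens = torch.randint(0, 50_000, (2, 16))   # (batch, seq_len)
logits = model(tokens[:, :-1])               # predict each next token
print(f"perplexity: {perplexity(logits, tokens[:, 1:]):.1f}")
```

An untrained model like this scores near the vocabulary size (the uniform-guess ceiling); Paloma's contribution, as discussed in the episode, is measuring this kind of perplexity across many distinct data domains rather than a single held-out set.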
