
Weaviate Podcast

Weights and Biases on Fine-Tuning LLMs - Weaviate Podcast #68!

Sep 20, 2023
52:09

Hey everyone! Thank you so much for watching the 68th episode of the Weaviate Podcast! We are super excited to welcome Morgan McGuire, Darek Kleczek, and Thomas Capelle! This was such a fun discussion, beginning broadly with how they see the space of fine-tuning: why you would want to do it, the available tooling, its intersection with RAG, and more!

Check out W&B Prompts! https://wandb.ai/site/prompts

Check out the W&B Tiny Llama Report! https://wandb.ai/capecape/llamac/reports/Training-Tiny-Llamas-for-Fun-and-Science--Vmlldzo1MDM2MDg0

Chapters
0:00 Tiny Llamas!
1:53 Welcome!
2:22 LLM Fine-Tuning
5:25 Tooling for Fine-Tuning
7:55 Why Fine-Tune?
9:55 RAG vs. Fine-Tuning
12:25 Knowledge Distillation
14:40 Gorilla LLMs
18:25 Open-Source LLMs
22:48 Jonathan Frankle on W&B
23:45 Data Quality for LLM Training
25:55 W&B for Data Versioning
27:25 Curriculum Learning
29:28 GPU Rich and Data Quality
30:30 Vector DBs and Data Quality
32:50 Tuning Training with Weights & Biases
35:47 Training Reports
42:28 HF Collections and W&B Sweeps
44:50 Exciting Directions for AI
