How open-source & distributed models can win AI with MosaicML’s Naveen Rao | E1754

This Week in Startups

CHAPTER

Tokens and Model Training in AI

This chapter explores the role of tokens in language processing and how prompting works in AI models. It examines the relationship between fine-tuning and pre-training, drawing parallels to human learning, and discusses strategies for conditioning model responses effectively. The conversation also weighs open-source against proprietary models and the costs of deploying large-scale AI, particularly for startups.
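To ground the discussion of tokens: language models don't see raw text, they see sequences of token IDs produced by a tokenizer. The sketch below is a toy illustration only, using a tiny hypothetical vocabulary and greedy longest-match splitting; real tokenizers (e.g. BPE, as used by most large language models) learn their subword vocabulary from data.

```python
# Toy tokenizer sketch: greedy longest-match against a fixed vocabulary.
# TOY_VOCAB is a made-up example, not any real model's vocabulary.
TOY_VOCAB = {"fine": 0, "tun": 1, "ing": 2, "pre": 3, "train": 4, " ": 5}

def toy_tokenize(text, vocab):
    """Split text into the longest vocabulary pieces, left to right."""
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest possible substring starting at i first.
        for j in range(len(text), i, -1):
            piece = text[i:j]
            if piece in vocab:
                tokens.append(piece)
                i = j
                break
        else:
            # Unknown character: fall back to a single-character token.
            tokens.append(text[i])
            i += 1
    return tokens

print(toy_tokenize("fine tuning pretrain", TOY_VOCAB))
# → ['fine', ' ', 'tun', 'ing', ' ', 'pre', 'train']
```

Note how "tuning" and "pretrain" split into subword pieces ("tun" + "ing", "pre" + "train") rather than whole words; this subword behavior is why model pricing and context limits are measured in tokens, not words.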
