How open-source & distributed models can win AI with MosaicML’s Naveen Rao | E1754

This Week in Startups

00:00

Tokens and Model Training in AI

This chapter explores the role of tokens in language processing and how prompting works in AI models. It examines the relationship between fine-tuning and pre-training, drawing parallels to human learning, and discusses strategies for conditioning model responses effectively. The conversation also covers the trade-offs between open-source and proprietary models and the costs of deploying large-scale AI, particularly for startups.
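The chapter refers to tokens without showing what they look like in practice. The sketch below illustrates how a prompt is converted into integer token IDs, the unit in which models process text and in which training and API costs are usually measured. It assumes the open-source tiktoken library and its cl100k_base encoding purely for illustration; neither is mentioned in the episode.

```python
# Minimal sketch: turning text into tokens before a model sees it.
# Assumption: the tiktoken library and "cl100k_base" encoding are
# illustrative choices, not tools named in the episode.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")

prompt = "How can open-source models compete with proprietary ones?"
tokens = enc.encode(prompt)          # text -> list of integer token IDs
print(tokens)                        # the IDs the model actually consumes
print(len(tokens), "tokens")         # prompt length as the model counts it
print(enc.decode(tokens) == prompt)  # decoding round-trips back to the text
```

Counting tokens this way is how prompt length, context-window limits, and per-token costs are typically reasoned about when budgeting a large-scale AI deployment.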
