How AI Is Built

#022 The Limits of Embeddings, Out-of-Domain Data, Long Context, Finetuning (and How We're Fixing It)


Evolution of Text Embedding Models: Innovations and Techniques

This chapter traces the progression of text embedding models, from early work like Google's Universal Sentence Encoder to modern contrastive training built on triplet loss. It highlights the key innovations, such as larger batch sizes and hard negatives, that have significantly improved embedding quality.
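To make the triplet-loss idea concrete, here is a minimal sketch in plain Python (not the specific method discussed in the episode): the loss pushes an anchor closer to its positive than to any negative by at least a margin, and "hard negative" mining picks the negative currently closest to the anchor. The function names and the margin value are illustrative assumptions.

```python
import math

def cosine_distance(u, v):
    # 1 - cosine similarity; smaller means the vectors are more alike.
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / (norm_u * norm_v)

def triplet_loss(anchor, positive, negatives, margin=0.2):
    # Hard-negative mining: of all candidate negatives (e.g. the rest
    # of a large batch), score against the one nearest the anchor.
    d_pos = cosine_distance(anchor, positive)
    d_neg = min(cosine_distance(anchor, n) for n in negatives)
    # Hinge: zero once the positive is closer than the hardest
    # negative by at least `margin`.
    return max(0.0, d_pos - d_neg + margin)
```

Larger batches help precisely because `negatives` grows with batch size, so the minimum in the mining step is more likely to find a genuinely hard negative.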

