#022 The Limits of Embeddings, Out-of-Domain Data, Long Context, Finetuning (and How We're Fixing It)

How AI Is Built

Evolution of Text Embedding Models: Innovations and Techniques

This chapter traces the progression of text embedding models, from early models such as Google's Universal Sentence Encoder to modern approaches trained with triplet loss. It highlights the key innovations, such as larger batch sizes and hard negatives, that have significantly improved the performance of text embedding models.
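The triplet-loss and hard-negative ideas mentioned above can be sketched as follows. This is a minimal illustrative example, not code from the episode: the function name, margin value, and vectors are assumptions, and the "hard negative" is simply the in-batch negative closest to the anchor.

```python
import numpy as np

def triplet_loss(anchor, positive, negatives, margin=0.2):
    """Triplet loss with in-batch hard negative mining.

    anchor, positive: 1-D embedding vectors.
    negatives: 2-D array, one negative embedding per row.
    The 'hard' negative is the one closest to the anchor, which
    gives the most informative (largest) loss for that anchor.
    """
    d_pos = np.linalg.norm(anchor - positive)          # anchor-positive distance
    d_negs = np.linalg.norm(negatives - anchor, axis=1)  # all anchor-negative distances
    d_hard = d_negs.min()                              # hardest negative: smallest distance
    # Hinge: penalize only when the positive is not at least
    # `margin` closer to the anchor than the hardest negative.
    return max(0.0, d_pos - d_hard + margin)

# Toy usage: the positive is near the anchor, negatives are farther away,
# so with the default margin the loss is zero (the triplet is already "easy").
a = np.array([1.0, 0.0])
p = np.array([0.9, 0.1])
n = np.array([[0.0, 1.0], [1.0, 1.0]])
print(triplet_loss(a, p, n))
```

Larger batches help precisely because `negatives` grows: with more in-batch candidates, the minimum distance in the mining step is more likely to find a genuinely hard negative.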
