#80: Is AWS Bedrock the OpenAI killer, with Randall Hunt

Real World Serverless with theburningmonk

00:00

Tokenization Process and Vector Stores

This chapter covers the tokenization process of the GPT transformer model, the limitations of earlier language models, and the use of vector stores for document storage and retrieval. It also explores context management in AWS Bedrock and how models are trained with parameter-efficient fine-tuning.
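The retrieval idea mentioned above can be sketched in miniature. This is an illustrative toy, not anything from the episode: it uses a whitespace tokenizer and bag-of-words count vectors, whereas real systems (including the GPT models discussed) use subword tokenizers such as BPE and learned dense embeddings. All document strings and function names here are invented for the example.

```python
import math
from collections import Counter

def tokenize(text):
    # Toy whitespace tokenizer; GPT models use byte-pair encoding instead.
    return text.lower().split()

def embed(tokens, vocab):
    # Bag-of-words count vector over a fixed vocabulary
    # (a stand-in for a learned embedding model).
    counts = Counter(tokens)
    return [counts[w] for w in vocab]

def cosine(a, b):
    # Cosine similarity between two vectors; 0.0 if either is all zeros.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# A tiny in-memory "vector store": each document is stored
# alongside its embedding, built once at index time.
docs = [
    "bedrock manages model context for you",
    "vector stores index document embeddings for retrieval",
    "fine tuning updates a small set of parameters",
]
vocab = sorted({w for d in docs for w in tokenize(d)})
index = [(d, embed(tokenize(d), vocab)) for d in docs]

def retrieve(query):
    # Embed the query and return the most similar stored document.
    qv = embed(tokenize(query), vocab)
    return max(index, key=lambda pair: cosine(qv, pair[1]))[0]

print(retrieve("how do vector stores help document retrieval"))
# → vector stores index document embeddings for retrieval
```

The same embed-at-index-time, embed-the-query, nearest-neighbor pattern is what production vector stores implement at scale with approximate search.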
