How to Use Llama Index to Connect Your Data to Language Models
The central mission of LlamaIndex is to connect your data to language models. At its core, it's conditioning: you provide the right conditioning variable for your input, use it to build a prompt for the language model, and get back a response. As I was playing around with how to stuff a big call transcript into GPT-3's prompt window, I ran into a limit: the prompt size is around 4,000 tokens, but the call transcript can be a lot bigger than that. So then I thought, what's the best way to structure this data so that we can still utilize the prompt window of GPT-3, but also maintain some sort of external RAM or hard drive?
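The idea described above can be sketched in plain Python. This is a minimal toy illustration, not LlamaIndex's actual API: the helper names (`chunk_transcript`, `retrieve`, `build_prompt`) and the word-based token estimate are assumptions for illustration. The transcript is split into chunks that fit under the prompt budget (the "external hard drive"), and at query time only the most relevant chunk is loaded into the prompt window.

```python
MAX_PROMPT_TOKENS = 4000  # approximate GPT-3 prompt budget mentioned above

def rough_token_count(text: str) -> int:
    # Crude word-based estimate; a real tokenizer would be more accurate.
    return len(text.split())

def chunk_transcript(transcript: str, budget: int = MAX_PROMPT_TOKENS // 2):
    """Greedily pack sentences into chunks that stay under the token budget."""
    chunks, current = [], []
    for sentence in transcript.split(". "):
        if current and rough_token_count(" ".join(current + [sentence])) > budget:
            chunks.append(". ".join(current) + ".")
            current = []
        current.append(sentence)
    if current:
        chunks.append(". ".join(current))
    return chunks

def retrieve(chunks, query: str) -> str:
    """Pick the chunk sharing the most words with the query (toy retrieval)."""
    query_words = set(query.lower().split())
    return max(chunks, key=lambda c: len(query_words & set(c.lower().split())))

def build_prompt(context: str, question: str) -> str:
    # Only the retrieved chunk enters the prompt window, not the whole transcript.
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
```

In use, you would chunk the full call transcript once, store the chunks externally, and per question retrieve one chunk and send `build_prompt(chunk, question)` to the model; real systems use embeddings rather than word overlap for the retrieval step.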