Maximizing RAG Applications with Long Context Windows
Growing interest in long context windows for Large Language Models (LLMs) is expected to benefit RAG applications. Because a longer context window lets more retrieved documents be passed directly to the LLM, there may be less emphasis on re-ranking after retrieval. However, even effectively infinite context windows are unlikely to disrupt RAG significantly in the next few years, especially for corpora of tens of millions of documents, which still far exceed any practical context budget.
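The trade-off above can be sketched with a toy example. Everything here is an assumption for illustration: the term-overlap retriever, the word-count "token" budget, and the documents themselves stand in for a real embedding-based pipeline. The point is only that a larger context budget lets a coarse retriever pass more candidates straight to the model, whereas a small budget forces careful ranking of the few documents that fit.

```python
def retrieve(query_terms, docs, k):
    # Toy retriever: score each document by term overlap with the query
    # (a stand-in for vector search plus an optional re-ranking stage).
    scored = sorted(docs, key=lambda d: len(query_terms & set(d.split())),
                    reverse=True)
    return scored[:k]

def fill_context(candidates, token_budget):
    # Pack candidates into the prompt until the context budget is spent.
    # Word count stands in for a real tokenizer.
    prompt, used = [], 0
    for doc in candidates:
        cost = len(doc.split())
        if used + cost > token_budget:
            break
        prompt.append(doc)
        used += cost
    return prompt

docs = [
    "long context windows help rag",
    "retrieval then rerank then generate",
    "vector search over millions of documents",
    "cats enjoy sunny windows",
]
query = {"context", "rag", "retrieval", "documents"}

# Small window: only the top-ranked candidates fit, so ranking quality matters.
small = fill_context(retrieve(query, docs, k=2), token_budget=10)

# Large window: every candidate fits, so a coarse retriever may suffice.
large = fill_context(retrieve(query, docs, k=len(docs)), token_budget=100)
```

In the small-budget case only two documents make it into the prompt, so a mistake in ranking directly costs answer quality; in the large-budget case all four fit, including marginal ones, and the burden of filtering shifts to the LLM itself. At the scale of tens of millions of documents, retrieval is still needed to narrow the candidates in the first place.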