
Are long context windows the end of RAG?

The Stack Overflow Podcast


Introduction

This chapter examines Google's Gemini 1.5 model, whose context window can hold up to roughly 700,000 words, and discusses what that capacity means for retrieval-augmented generation and how such long-context models might be used in practice.

