
2024 in Post-Transformers Architectures (State Space Models, RWKV) [LS Live @ NeurIPS]

Latent Space: The AI Engineer Podcast


Exploring the Relevance of RAG and Infinite Context in State Space Models

This chapter discusses the significance of Retrieval-Augmented Generation (RAG) in shaping future models, particularly in relation to the idea of "infinite context." It draws on personal experiences with RAG, examines the role of embedding model quality and the integration of external data sources into language models, and encourages the audience to think critically about long-context prompts.
