
2024 in Post-Transformers Architectures (State Space Models, RWKV) [LS Live @ NeurIPS]

Latent Space: The AI Engineer Podcast

CHAPTER

Exploring the Relevance of RAG and Infinite Context in State-Based Models

This chapter discusses the significance of Retrieval-Augmented Generation (RAG) in shaping future models, particularly in light of the 'infinite context' promised by state space models. It draws on personal experiences with RAG, the role of embedding model quality, and the integration of external data sources into language models, and it encourages the audience to think critically about long-context prompts.
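To make the RAG pattern under discussion concrete, here is a minimal sketch: embed external documents, retrieve the ones closest to the query, and splice them into the model's prompt. It is not from the episode; the embed() function is a hypothetical stand-in for a real embedding model, whose quality is exactly the factor the chapter highlights.

```python
import numpy as np

def embed(text: str, dim: int = 64) -> np.ndarray:
    # Hypothetical placeholder: hash tokens into a fixed-size bag-of-words
    # vector. A real system would call an embedding model here.
    vec = np.zeros(dim)
    for token in text.lower().split():
        vec[hash(token) % dim] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    # Rank documents by cosine similarity to the query embedding and
    # keep the top k as retrieved context.
    q = embed(query)
    scores = [float(q @ embed(d)) for d in documents]
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

docs = [
    "State space models keep a fixed-size recurrent state.",
    "RWKV combines RNN-style inference with transformer-style training.",
    "Embedding quality determines what a retriever can surface.",
]
question = "Why does embedding quality matter for RAG?"
context = retrieve(question, docs)
prompt = "Context:\n" + "\n".join(context) + "\n\nQuestion: " + question
print(prompt)
```

Even with unbounded context windows, the retriever (and therefore the embedding model) still decides what the language model gets to see, which is why the chapter treats RAG and long context as complementary rather than competing.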
