
Lost in the Middle: How Language Models Use Long Contexts

Deep Papers


The Pros and Cons of Pushing the Context to the Start

The paper discusses promising directions for improving models, which essentially come down to pushing the relevant context to the start of the prompt or retrieving fewer documents. But what I think we'll really see built on top of this relates to my concern about the transformer architecture: the more we understand it, the better we'll know how to inject the right context in the right location.
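The two mitigations mentioned above can be sketched as a small re-ranking step before prompt assembly. This is a minimal illustration, not the paper's method; the function name, scores, and cutoff `k` are hypothetical:

```python
# Hypothetical sketch: keep only the top-k retrieved documents (retrieve
# fewer) and place the highest-scoring ones first in the prompt (push
# relevant context to the start), where models attend most reliably.

def order_for_prompt(docs_with_scores, k=3):
    """Return up to k documents, most relevant first.

    docs_with_scores: list of (document_text, relevance_score) pairs.
    """
    ranked = sorted(docs_with_scores, key=lambda pair: pair[1], reverse=True)
    return [doc for doc, _score in ranked[:k]]

# Illustrative retrieval results with made-up relevance scores.
docs = [
    ("background on topic A", 0.2),
    ("highly relevant passage", 0.9),
    ("partially relevant passage", 0.5),
]
prompt_docs = order_for_prompt(docs, k=2)
# prompt_docs == ["highly relevant passage", "partially relevant passage"]
```

Truncating to `k` documents addresses the paper's observation that adding more context can hurt, while the sort ensures the most important passage lands at a position the model actually uses.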

