
Scaling LLMs and Accelerating Adoption with Aidan Gomez at Cohere

Gradient Dissent: Conversations on AI

The Future of Language Models

The sequence length constraint could be problematic enough that it actually pushes us off of transformers. That is, scaling quadratically in the sequence length is a huge problem.

I would think, then, that you would predict model performance will continue to scale consistently as compute is added, basically forever. And how do you think about data constraints as model complexity grows? Is data going to be the next constraint for building very large LLMs?

Yeah, it already really is.
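To make the quadratic term concrete, here is a minimal back-of-the-envelope sketch (not from the episode; the d_model of 512 and the function name are illustrative assumptions): the QKᵀ score matrix in standard scaled dot-product attention has seq_len × seq_len entries, so doubling the context length roughly quadruples attention compute and memory.

```python
def attention_cost(seq_len: int, d_model: int = 512) -> tuple[int, int]:
    """Rough FLOP and memory counts for the score matrix of one
    scaled dot-product attention layer (hypothetical d_model)."""
    # Q @ K^T: (seq_len x d_model) @ (d_model x seq_len)
    # -> seq_len^2 entries, each costing ~2*d_model multiply-adds
    score_flops = 2 * seq_len * seq_len * d_model
    score_entries = seq_len * seq_len  # attention matrix size
    return score_flops, score_entries

# Doubling the sequence length roughly quadruples both figures.
for n in (1_024, 2_048, 4_096):
    flops, entries = attention_cost(n)
    print(f"seq_len={n:>5}: ~{flops:.1e} FLOPs, {entries:.1e} score entries")
```

This quadratic growth is why long-context work often turns to sparse, linear, or recurrent attention variants, the kind of pressure that, as noted above, might eventually push the field off of transformers.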

