
Scaling LLMs and Accelerating Adoption with Aidan Gomez at Cohere

Gradient Dissent: Conversations on AI


The Future of Language Models

The sequence length constraint could be problematic enough that it actually might push us off of transformers. That is, scaling quadratically in the sequence length is a huge problem.

I would think, then, that you would predict model performance will continue to scale consistently as compute is added, basically forever. And how do you think about data constraints as model complexity grows? Is data going to be the next constraint for building very large LLMs?

Yeah, it already really is.
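For context on the quadratic-scaling remark: below is a minimal sketch (not Cohere's code) of vanilla scaled dot-product self-attention. The score matrix it builds has shape (n, n) for a sequence of n tokens, so time and memory grow as O(n²) in sequence length, which is the constraint being discussed.

```python
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    """Vanilla scaled dot-product self-attention.

    x: (n, d) token embeddings; Wq/Wk/Wv: (d, d) projections.
    The scores matrix below is (n, n), so cost is O(n^2) in n.
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv               # each (n, d)
    scores = q @ k.T / np.sqrt(k.shape[-1])        # (n, n): the quadratic term
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ v                              # (n, d)

# Doubling the sequence length quadruples the number of attention scores:
rng = np.random.default_rng(0)
d = 64
Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
for n in (512, 1024):
    x = rng.standard_normal((n, d))
    print(n, "tokens ->", n * n, "attention scores")
    _ = self_attention(x, Wq, Wk, Wv)
```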

