Scaling LLMs and Accelerating Adoption with Aidan Gomez at Cohere

Gradient Dissent: Conversations on AI

CHAPTER

The Importance of Scalability in Neural Networks

So another experience that I've had with neural networks is that it seems like the exact details of the architecture often don't matter. And we end up kind of picking these architectures based on trying to replicate previous papers and not wanting to mess something up. How fundamental do you think transformers are? And maybe to make it a more well-formed question, if you ran back history a thousand times, how many of those times do you think you would get the transformer architecture specifically?

Yeah. That's a really interesting thought experiment. I don't think it's that fundamental. All you really need to do is saturate compute in a good way. You need to come up with an architecture that
