The Importance of Scalability in Neural Networks
So another experience that I've had with neural networks is that it seems like the exact details of the architecture often don't matter, and we end up picking these architectures based on trying to replicate previous papers and not wanting to mess something up. How fundamental do you think transformers are? And maybe to make it a more well-formed question: if you ran back history a thousand times, how many of those times do you think you would get the Transformer architecture specifically?

Yeah, that's a really interesting thought experiment. I don't think it's that fundamental. All you really need to do is saturate compute in a good way. You need to come up with an architecture that