
Is scaling all you need for AI Large Language Models? Scaling laws and the Inverse Scaling Challenge. Featuring Ian McKenzie, FAR AI Research Scientist
Orchestrate all the Things
Is There a Parallel to the Bitter Lesson in AI?
I'm unsure whether adding a lot of detail through architectural improvements will outperform just making models bigger. Are there architectural improvements we can make, versus just waiting for even more compute? In one sense, it would be nice to be able to put more of our own structure on these systems rather than just making them bigger and seeing what happens. It's another question how competitive that will be with continuing to scale.