
#60 – Jaime Sevilla on Trends in Machine Learning
Hear This Idea
What Are Scaling Laws Saying?
The relationship between performance and these inputs is surprisingly regular so far, such that if you just throw more compute and more data at a system, you get a reliable improvement in performance. So given this kind of regularity, can we somehow anticipate when we are going to see advancements in other fields? That part is more speculative. Got it. And coming back to something you mentioned earlier: scaling laws. Can you remind me exactly what scaling laws are saying? Yeah, so scaling laws are this regularity between the compute that is being used to train the system and the data that is used to train the system.
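As an illustration of the kind of regularity being described here (this sketch is not from the episode itself): one commonly cited form of a neural scaling law, the Chinchilla fit from Hoffmann et al. (2022), models training loss as a sum of power-law terms in parameter count and dataset size. The constants below are the published Chinchilla values and are assumptions for illustration only; treat this as a sketch of the functional form, not Sevilla's own analysis.

```python
# Sketch of a Chinchilla-style scaling law: loss as a function of
# model size N (parameters) and dataset size D (training tokens).
# Constants are the fit reported by Hoffmann et al. (2022); they are
# illustrative assumptions, not figures taken from this episode.

def chinchilla_loss(n_params: float, n_tokens: float) -> float:
    E = 1.69                # irreducible loss term
    A, alpha = 406.4, 0.34  # parameter-count (model size) term
    B, beta = 410.7, 0.28   # dataset-size term
    return E + A / n_params**alpha + B / n_tokens**beta

# Throwing more compute at the problem, split between more parameters
# and more data, gives a smooth, predictable drop in loss:
for n, d in [(1e9, 2e10), (1e10, 2e11), (1e11, 2e12)]:
    print(f"N={n:.0e}, D={d:.0e} -> loss ~ {chinchilla_loss(n, d):.3f}")
```

The point of the snippet is just the shape of the relationship: performance improves as a smooth power law in compute and data, which is what makes the extrapolations discussed in the episode possible.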