What Are Scaling Laws Saying?
The relationship between performance and these inputs is surprisingly regular so far: if you just throw in more compute and more data, you get a reliable improvement in performance. So, given this regularity, can we somehow anticipate when we are going to see advancements in other fields? That part is more speculative. Got it. And, as you showed us earlier, you mentioned scaling laws. Can you tell me exactly what scaling laws are saying? Yeah, so scaling laws are this regularity between the compute that is used to train the system and the data that is used to train the system.
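The regularity the speaker describes can be sketched with a parametric power-law loss curve. The functional form below follows the commonly cited Chinchilla-style fit, but all the constants here are hypothetical and chosen only for illustration, not taken from the conversation:

```python
# Illustrative sketch of a scaling-law loss fit (constants are hypothetical):
#   L(N, D) = E + A / N**alpha + B / D**beta
# where N is the model's parameter count and D is the number of training tokens.

def predicted_loss(n_params: float, n_tokens: float,
                   E: float = 1.7, A: float = 400.0, B: float = 410.0,
                   alpha: float = 0.34, beta: float = 0.28) -> float:
    """Predicted training loss under a power-law scaling fit."""
    return E + A / n_params**alpha + B / n_tokens**beta

# Scaling up parameters and data together reliably lowers the predicted loss:
small = predicted_loss(1e8, 2e9)    # 100M params, 2B tokens
large = predicted_loss(1e10, 2e11)  # 10B params, 200B tokens
assert large < small
```

The point of the sketch is the shape, not the numbers: because each term decays as a smooth power of compute or data, performance improvements from scaling up are predictable in advance, which is exactly the "reliable improvement" being discussed.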