The Importance of Scaling Machine Learning Models
As you make models bigger, you have to make datasets bigger. There's a whole class of training methods we call distillation, where the input to the tiny ML training isn't a dataset; it's what another, much bigger model thinks about things. So it's very early days for this field. I love that we can talk about the aesthetics of our technical solutions here. Behind the scenes, you're essentially taking what the large model knows and narrowing it down into a small model. How much data do you need now, given that we've got access to things we maybe didn't have before?
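To make the idea concrete, here is a minimal sketch of that kind of distillation, assuming a PyTorch setup. The teacher and student architectures, the temperature, and the random inputs are all placeholders, not anything from the episode; the point is only that the small model is trained against the big model's soft predictions rather than against a labeled dataset.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical models: a larger "teacher" and a small "student" (stand-ins only).
teacher = nn.Sequential(nn.Linear(32, 256), nn.ReLU(), nn.Linear(256, 10))
student = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
temperature = 4.0  # softens the teacher's output distribution

def distillation_step(x):
    """One training step: the student learns to match the teacher's soft predictions."""
    with torch.no_grad():
        teacher_logits = teacher(x)          # "what the big model thinks" about this input
    student_logits = student(x)
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# The inputs can be unlabeled or even synthetic -- the teacher supplies the targets.
for _ in range(100):
    distillation_step(torch.randn(64, 32))
```

Note that the student never sees ground-truth labels in this sketch, which is the sense in which "the input to the tiny ML training isn't a dataset."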