
BI 123 Irina Rish: Continual Learning

Scaling Scalable Networks

When you don't have enough data, then using priors or inductive biases from the domain is extremely useful. If those regularization constraints, or priors, or inductive biases are right, they help tremendously. And that's where you could use specific architectures and so on. But the way things look right now, as you scale, priors and inductive biases become less and less important. Well, it still depends on certain things. There are many important questions here about how to do it, such as scaling the model so it can actually capture the amount of information while you scale the data. There are smarter ways to do that and less smart ways.
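The point about scaling the model along with the data is usually made quantitative with neural scaling laws, which fit validation loss as a power law of dataset (or model) size. Below is a minimal sketch of such a fit; it is not from the episode, and the loss values and the specific functional form L(D) = a * D^(-b) + c are illustrative assumptions.

```python
# A minimal sketch, not from the episode: the made-up loss numbers and the
# specific power-law form below are assumptions, used only to illustrate the
# kind of data-scaling analysis the quote alludes to.
import numpy as np
from scipy.optimize import curve_fit

def scaling_law(D, a, b, c):
    """Validation loss as a power law in dataset size D, with irreducible floor c."""
    return a * D ** (-b) + c

# Hypothetical validation losses measured at increasing dataset sizes.
D = np.array([1e5, 3e5, 1e6, 3e6, 1e7, 3e7])
loss = np.array([2.69, 2.28, 1.95, 1.72, 1.53, 1.41])

# Fit the three parameters of the assumed power law to the measurements.
(a, b, c), _ = curve_fit(scaling_law, D, loss, p0=[10.0, 0.2, 1.0], maxfev=10000)
print(f"fit: L(D) = {a:.2f} * D^(-{b:.3f}) + {c:.2f}")

# Extrapolate: what loss would another 10x of data buy, under this fit?
print(f"predicted loss at D = 3e8: {scaling_law(3e8, a, b, c):.2f}")
```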

