161: Leveraging Generative AI Models with Hagay Lupesko

CHAPTER

The Scaling Laws of Machine Learning

A paper published by DeepMind basically talks about the scaling laws, which all refer to very similar architectures. A model like OPT or even GPT-3 was actually under-trained for its size. You can actually take a smaller model with fewer parameters, train it on more data, and it will perform just as well as, or even better than, a bigger model trained on less data.
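As a rough illustration (not from the episode), DeepMind's compute-optimal result is often summarized as a rule of thumb of roughly 20 training tokens per model parameter. The sketch below applies that assumed 20:1 ratio to a few example model sizes; both the ratio and the sizes are illustrative assumptions, not figures quoted in the episode.

```python
# Rough sketch of the compute-optimal ("Chinchilla-style") heuristic:
# train on roughly 20 tokens per parameter. The 20:1 ratio is an assumed
# simplification of the paper's fitted scaling law, used only for illustration.

TOKENS_PER_PARAM = 20  # assumed heuristic, not an exact figure

models = {
    "1B":   1e9,
    "7B":   7e9,
    "175B": 175e9,  # GPT-3 scale
}

for name, params in models.items():
    optimal_tokens = TOKENS_PER_PARAM * params
    print(f"{name} params -> ~{optimal_tokens / 1e9:,.0f}B training tokens")
```

For context, GPT-3 (175B parameters) was trained on roughly 300B tokens, far short of what this heuristic would suggest, which is the sense in which such models were "under-trained"; DeepMind's 70B-parameter Chinchilla, trained on about 1.4T tokens, outperformed considerably larger models.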

