
5. Charlie Snell on DALL-E and CLIP
The Inside View
Scaling Laws
GPT-3 requires fewer training samples to get to a lower loss, or the same loss. People might say GPT-3 is overfitting, but in a technical sense it really can't overfit, because it didn't even see most of the data points more than once. So yeah, that was something that impressed me about scaling laws. I would say I'm generally pro scaling. There's a difference between being pro scaling and being optimistic about scaling.