
5. Charlie Snell on DALL-E and CLIP
The Inside View
Scaling Laws
GPT-3 requires fewer training samples to get to a lower loss, or the same loss. People might say GPT-3 is overfitting, but in a technical sense it really can't overfit, because it didn't even see all the data points more than once. So yeah, that was something that impressed me about scaling laws. I would say I'm generally pro scaling. There's a difference between being pro scaling and being optimistic about scaling.


