
Exhaustion of High-Quality Data Could Slow Down AI Progress in Coming Decades
The Data Exchange with Ben Lorica
The Scaling Laws of a Model
The study was motivated by certain things you saw, certain scaling laws, which suggested that the models we have right now don't really use that much data. Chinchilla corrected that, and Google published a new set of scaling laws for NLP.
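
For context, the Chinchilla result referenced here can be summarized by a parametric loss curve and a rough data-budget rule. The sketch below is not from the episode: the constants are the approximate fits reported in Hoffmann et al. (2022), and the function names are purely illustrative.

```python
# Illustrative sketch of the Chinchilla-style parametric scaling law
# (Hoffmann et al., 2022). Constants are the approximate published fits;
# function names are for illustration only, not from the episode.

def chinchilla_loss(n_params: float, n_tokens: float) -> float:
    """Predicted training loss L(N, D) = E + A / N^alpha + B / D^beta."""
    E, A, B = 1.69, 406.4, 410.7      # irreducible loss and fitted scale terms
    alpha, beta = 0.34, 0.28          # fitted exponents for params (N) and tokens (D)
    return E + A / n_params**alpha + B / n_tokens**beta

def compute_optimal_tokens(n_params: float, tokens_per_param: float = 20.0) -> float:
    """Rough compute-optimal data budget: roughly 20 training tokens per parameter."""
    return tokens_per_param * n_params

if __name__ == "__main__":
    n = 70e9                              # a 70B-parameter model
    d = compute_optimal_tokens(n)         # ~1.4 trillion tokens
    print(f"Tokens for compute-optimal training: {d:.2e}")
    print(f"Predicted loss: {chinchilla_loss(n, d):.3f}")
```

The point of the correction is visible in the data term: driving loss down requires growing the token count D alongside the parameter count N, which is why earlier, parameter-heavy training recipes understated how much data frontier models actually need.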