Is There a Higher Level of Performance in Large Language Models?
The J.A.A. report estimates how much compute a human uses up to adulthood. Can we ask the same question about how much data I was "trained on," say, before my 18th birthday? We did this calculation back then, just for fun, internally, and it comes out to about 140 million words. So it sounds like large language models now, even the very biggest ones, are trained on far more words than humans need to reach a higher level of performance right now.
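For listeners who want to see the arithmetic behind that figure, here is a minimal back-of-envelope sketch. The words-per-day rate and the LLM corpus size are assumptions chosen for illustration and do not come from the episode; only the roughly 140 million word total is from the transcript.

```python
# Back-of-envelope sketch of the "words a human is trained on by 18" estimate.
# Assumed figures (not from the transcript): ~21,000 words heard/read per day,
# and a frontier-LLM training corpus on the order of 10 trillion tokens.

WORDS_PER_DAY = 21_000        # assumed daily exposure to language
YEARS_TO_ADULTHOOD = 18
DAYS_PER_YEAR = 365

human_words = WORDS_PER_DAY * YEARS_TO_ADULTHOOD * DAYS_PER_YEAR
print(f"Words 'trained on' by age 18: ~{human_words / 1e6:.0f} million")
# -> ~138 million, close to the ~140 million figure mentioned above

LLM_TRAINING_TOKENS = 10e12   # assumed corpus size, for illustration only
print(f"Assumed LLM corpus is ~{LLM_TRAINING_TOKENS / human_words:,.0f}x larger")
# -> roughly 72,000x more words than a human encounters before adulthood
```

Under these assumed numbers, the gap is several orders of magnitude, which is the point being made: today's largest models see vastly more text than a human does before reaching adult-level performance.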