Getting 10,000 GPT-3s in Half a Year?
GPT-3 takes roughly 1,000 A100s' worth of compute to train, and a rough estimate puts the price of an A100 at about $12,500. So if Google or Meta, or any company of that kind, wanted, say, 1,000 GPT-3s, they'd have to buy about ten months' worth of NVIDIA's current production. That would cost $10 to $15 billion, but Google made $70 billion in revenue in quarter two of this year. So you're still talking about something like five months of production, at a price these hyperscalers can afford. Not years. And once you've trained one, you can run it on, you know, far sparser resources.
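A minimal back-of-envelope sketch of the arithmetic behind that quote, using only the figures stated above (the 1,000-GPT-3 buyer, the variable names, and the per-quarter comparison are illustrative assumptions, not from the source):

```python
# Rough cost estimate for training many GPT-3-scale models, using the
# transcript's own figures; all names and the 1,000-model scenario are
# illustrative assumptions.

a100_price_usd = 12_500      # ~price of one A100 (from the transcript)
a100s_per_gpt3 = 1_000       # ~A100-equivalents to train one GPT-3 (from the transcript)
num_models = 1_000           # hypothetical buyer wants 1,000 GPT-3-scale runs

total_gpus = a100s_per_gpt3 * num_models        # 1,000,000 A100s
total_cost_usd = total_gpus * a100_price_usd    # ~$12.5 billion

google_quarterly_revenue_usd = 70e9             # Google's Q2 revenue cited above

print(f"GPUs needed:   {total_gpus:,}")
print(f"Hardware cost: ${total_cost_usd / 1e9:.1f}B")
print(f"Share of one quarter of Google revenue: "
      f"{total_cost_usd / google_quarterly_revenue_usd:.0%}")
```

Running this gives roughly $12.5B, which sits inside the $10 to $15 billion range quoted in the transcript and is a fraction of one quarter of Google's revenue.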