How Much Compute Is Needed to Train the Largest Machine Learning Models?
Since 2012, the amount of computational power used to train our largest machine learning models has grown by over a billion times. This growth in training compute is a large part of why models like GPT-3 can perform tasks they weren't specifically trained for. It's hard to say whether these trends will continue, but they speak to incredible gains over the past decade in what machine learning can do.
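As a rough illustration of what a billion-fold increase implies, the sketch below computes the doubling time for training compute under the assumption of a factor of about 10^9 over roughly ten years (2012 onward); the exact growth factor and time span are assumptions for the arithmetic, not figures taken from a specific dataset.

```python
import math

# Back-of-the-envelope calculation (illustrative assumptions):
# if training compute for the largest models grew by ~1e9x over ~10 years,
# how often did it double?
growth_factor = 1e9
years = 10

doublings = math.log2(growth_factor)            # ~29.9 doublings
doubling_time_months = years * 12 / doublings   # ~4 months per doubling

print(f"Implied doublings: {doublings:.1f}")
print(f"Implied doubling time: {doubling_time_months:.1f} months")
```

Under these assumptions, training compute would have doubled roughly every four months, which gives a sense of how fast a billion-fold increase accumulates over a decade.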