The Step Change of GPUs
Ten years ago, cheap GPUs were used to train networks with lots of units using backpropagation. Nowadays, it's not so clear why we need such a large number of layers in the first place. Was that simply down to giving it more data, and that it had enough neurons, enough computational depth? What was it, David, that really was the step change?