Designing Neural Network Architectures That Are Efficient in the Memory Space
A lot of neural networks are characterized in terms of FLOPs, right? FLOPs being the number of floating point operations. Now it turns out that FLOPs are really not a good indicator of how efficient a particular network is. A better indicator is the activation memory that is used by the model. RegNet is an architecture family that's particularly good on both FLOPs and memory.
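One way to make this concrete is to measure the activation footprint directly. The sketch below is my own illustration, not something from the episode: it registers forward hooks on every leaf module of a PyTorch model and sums the elements of each output tensor. The torchvision model choices, the 224x224 input, and the float32 sizing are all assumptions made for the example.

```python
# A minimal sketch of comparing activation memory across models via forward
# hooks. Model choices and float32 sizing are illustrative assumptions.
import torch
import torch.nn as nn
from torchvision.models import regnet_y_400mf, resnet50


def count_activation_elems(model: nn.Module, x: torch.Tensor) -> int:
    """Sum the number of output elements produced by every leaf module."""
    total = 0
    hooks = []

    def hook(_module, _inputs, output):
        nonlocal total
        if torch.is_tensor(output):
            total += output.numel()

    for m in model.modules():
        # Hook only leaf modules so intermediate containers aren't double-counted.
        if len(list(m.children())) == 0:
            hooks.append(m.register_forward_hook(hook))

    with torch.no_grad():
        model(x)
    for h in hooks:
        h.remove()
    return total


if __name__ == "__main__":
    x = torch.randn(1, 3, 224, 224)  # assumed input size, for illustration
    for name, model in [("resnet50", resnet50()),
                        ("regnet_y_400mf", regnet_y_400mf())]:
        elems = count_activation_elems(model.eval(), x)
        # 4 bytes per element for float32 activations.
        print(f"{name}: ~{elems * 4 / 1e6:.1f} MB of float32 activations")
```

Note that summing every leaf output is a rough proxy: a real runtime frees some activations before others are produced, so treat the numbers as relative comparisons between architectures rather than absolute memory requirements.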