The Curse of Dimensionality
The curse of dimensionality refers, generally, to the inability of algorithms to keep certifying a certain performance as the data becomes more complex. And the curse can also come from the computational side, in the sense that if I keep adding parameters and parameters to my training model, I might have to solve an optimization problem that becomes exponentially harder. So why do we think that geometric deep learning is at least an important piece to overcoming this curse? We are basically making the hypothesis class, if you want, smaller. That said, there's still some path to go, right? As we were describing, just a bunch
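To make the statistical side of the curse concrete, here is an illustrative textbook-style bound (assumed for this sketch, not a figure quoted in the conversation): approximating a generic 1-Lipschitz function f on the unit cube [0,1]^d to uniform error \epsilon requires on the order of

    n(\epsilon, d) \approx \left(\frac{1}{\epsilon}\right)^{d}

samples. At \epsilon = 0.1, for instance, that is roughly 10^2 samples in two dimensions but about 10^{10} in ten. Restricting the hypothesis class, for example to functions that respect a known symmetry group as geometric deep learning does, is one way to tame that exponential growth, which is the sense in which "making the hypothesis class smaller" addresses the curse.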