
Matrix Factorization For k-Means
Data Skeptic
Using Softmaxes in Deep Learning
The confidence gets higher as the magnitude of the input goes to infinity. And yet, this is one of these very weird properties of using softmaxes in deep learning. This is where the magic happens, because this is the final layer to which you apply the softmax, and then you already get your results. You don't do any more mapping in this embedded space; the classes are equal to cones.
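The scaling property described above can be illustrated with a small sketch (my own example, not from the episode): scaling the logits by a positive factor never changes which class the softmax picks, but it drives the winning probability toward 1, so the decision regions are cones through the origin.

```python
import numpy as np

def softmax(z):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(z - z.max())
    return e / e.sum()

logits = np.array([2.0, 1.0, 0.5])

for scale in [1.0, 10.0, 100.0]:
    p = softmax(scale * logits)
    # The argmax (predicted class) is invariant under positive scaling,
    # while the top probability approaches 1 as the scale grows.
    print(f"scale={scale:>5}: argmax={p.argmax()}, max prob={p.max():.6f}")
```

Running this shows the predicted class staying fixed while the confidence climbs toward 1, which is the "weird property" mentioned: confidence reflects the norm of the input, not just its direction.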