How to Minimize the Entropy of Noise in a Generative Model
If we want to prevent mode collapse, then we should try to maximize the entropy. So I thought it was a really nice solution.

How is the entropy computed? Is that expensive to compute?

You don't need to compute the entropy. That's a nice thing there too, because we're doing gradient descent. Ultimately, all you need is the gradients, approximately.

Yeah, so you can tractably approximate the gradients, because you are already in a situation where you have a sample from the posterior. And so you need to unroll HMC to get, like, two burn-in samples and then two samples to approximate the expectation.
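The mechanism described here can be sketched as a short HMC chain started from an existing posterior sample: a couple of burn-in transitions, then a couple of kept samples whose average approximates the expectation that appears in the entropy-gradient term. The snippet below is a minimal illustration of that idea, not the speakers' actual implementation; the target log-density, the step size, and the integrand being averaged are placeholder assumptions.

```python
import numpy as np

def hmc_step(x, logp, grad_logp, step_size=0.1, n_leapfrog=10, rng=None):
    """One Hamiltonian Monte Carlo transition targeting exp(logp)."""
    rng = np.random.default_rng() if rng is None else rng
    p0 = rng.standard_normal(x.shape)            # resample momentum
    x_new, p = x.copy(), p0.copy()
    # Leapfrog integration of the Hamiltonian dynamics.
    p += 0.5 * step_size * grad_logp(x_new)
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p
        p += step_size * grad_logp(x_new)
    x_new += step_size * p
    p += 0.5 * step_size * grad_logp(x_new)
    # Metropolis accept/reject keeps the chain exact for the target.
    log_accept = (logp(x_new) - 0.5 * p @ p) - (logp(x) - 0.5 * p0 @ p0)
    return x_new if np.log(rng.uniform()) < log_accept else x

def short_chain_expectation(x0, integrand, logp, grad_logp,
                            n_burnin=2, n_samples=2, rng=None):
    """Approximate E[integrand(x)] under the target with a handful of HMC
    samples, starting from an existing posterior sample x0 (so only a couple
    of burn-in transitions are needed, as in the discussion above)."""
    x = x0
    for _ in range(n_burnin):                    # two burn-in samples
        x = hmc_step(x, logp, grad_logp, rng=rng)
    vals = []
    for _ in range(n_samples):                   # two samples for the average
        x = hmc_step(x, logp, grad_logp, rng=rng)
        vals.append(integrand(x))
    return np.mean(vals, axis=0)

# Toy check with a standard-normal target: estimate E[x**2] (roughly 1 per
# coordinate). In the setting discussed, the integrand would instead be the
# gradient term needed for the entropy-maximizing update.
logp = lambda x: -0.5 * x @ x
grad_logp = lambda x: -x
x0 = np.zeros(2)                                 # stand-in for an existing posterior sample
estimate = short_chain_expectation(x0, lambda x: x ** 2, logp, grad_logp,
                                   rng=np.random.default_rng(0))
```

With only two kept samples the estimate is noisy, but that is the point being made: a stochastic gradient is enough for gradient descent, so the entropy itself is never computed.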