The Best Intuition for the Right-Hand Term
In a normal autoencoder, without this term, the model could basically place the z's anywhere in vector space. So when we apply this KL divergence, we're pulling the posterior distribution over the latents so it can't spread out too much. This regularizes the posterior, q(z|x), by pulling it towards the prior. That's the best intuition for the right-hand term.
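A minimal sketch of that KL penalty, assuming the usual VAE setup of a diagonal-Gaussian posterior q(z|x) = N(mu, sigma^2) and a standard-normal prior N(0, I) (the function name and parameterization via log-variance are illustrative, not from the source):

```python
import numpy as np

def kl_to_standard_normal(mu, logvar):
    """Closed-form KL( N(mu, sigma^2) || N(0, I) ), summed over latent dims.

    This is the regularizing right-hand term in the VAE objective: it
    penalizes latents for drifting far from the prior (large mu) or for
    spreading out / collapsing in variance (logvar far from 0).
    """
    return -0.5 * np.sum(1.0 + logvar - mu**2 - np.exp(logvar))

# If the posterior exactly matches the prior (mu = 0, sigma = 1),
# the penalty is zero; moving the mean away from the origin increases it.
print(kl_to_standard_normal(np.zeros(4), np.zeros(4)))
print(kl_to_standard_normal(np.array([2.0]), np.array([0.0])))
```

Without this term the encoder is free to scatter codes arbitrarily; with it, every posterior is pulled toward the same prior, which is what keeps the latent space dense enough to sample from.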