
ICLR 2024 — Best Papers & Talks (ImageGen, Vision, Transformers, State Space Models) ft. Durk Kingma, Christian Szegedy, Ilya Sutskever

Latent Space: The AI Engineer Podcast

NOTE

Variational Autoencoders and the Reparameterization Trick

Variational autoencoders map each input onto a distribution over the latent space by replacing the single bottleneck vector with two vectors: one for the mean and one for the standard deviation. Training combines a reconstruction loss with a KL-divergence term that keeps the learned distribution close to a standard Gaussian. Because gradients cannot flow through a sampling node, the reparameterization trick rewrites the latent vector as z = mu + sigma * epsilon, where mu and sigma are deterministic outputs of the encoder and epsilon is noise drawn from a standard Gaussian that needs no gradient computation, enabling end-to-end training.
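As a concrete illustration, here is a minimal PyTorch sketch of this setup. The layer sizes, the flattened-input assumption, and names like mu_head and logvar_head are illustrative choices, not details given in the episode.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class VAE(nn.Module):
    """Minimal VAE: the encoder outputs a mean and a log-variance
    instead of a single bottleneck vector."""
    def __init__(self, input_dim=784, hidden_dim=400, latent_dim=20):
        super().__init__()
        self.enc = nn.Linear(input_dim, hidden_dim)
        self.mu_head = nn.Linear(hidden_dim, latent_dim)
        self.logvar_head = nn.Linear(hidden_dim, latent_dim)
        self.dec_hidden = nn.Linear(latent_dim, hidden_dim)
        self.dec_out = nn.Linear(hidden_dim, input_dim)

    def encode(self, x):
        h = F.relu(self.enc(x))
        return self.mu_head(h), self.logvar_head(h)

    def reparameterize(self, mu, logvar):
        # Reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I).
        # Gradients flow through mu and sigma; eps is pure noise.
        std = torch.exp(0.5 * logvar)
        eps = torch.randn_like(std)
        return mu + std * eps

    def decode(self, z):
        return torch.sigmoid(self.dec_out(F.relu(self.dec_hidden(z))))

    def forward(self, x):
        mu, logvar = self.encode(x)
        z = self.reparameterize(mu, logvar)
        return self.decode(z), mu, logvar

def vae_loss(recon_x, x, mu, logvar):
    # Reconstruction term plus the closed-form KL divergence
    # KL(N(mu, sigma^2) || N(0, I)) pulling the latents toward a standard Gaussian.
    recon = F.binary_cross_entropy(recon_x, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon + kl
```

Sampling eps with torch.randn_like keeps the randomness outside the computation graph, so backpropagation only ever differentiates through mu and sigma, which is exactly what makes end-to-end training possible.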
