
[06] Yoon Kim - Deep Latent Variable Models of Natural Language

The Thesis Review


Exploring Variational Inference in Generative Models

This chapter examines variational inference as a method for learning deep generative models of language, focusing on the difficulties of maximum likelihood training and posterior inference. It covers techniques such as amortized variational inference and continuous latent variable models for tasks like sentence generation, discussing their limitations and the need for structured supervision. The conversation also addresses interpretability concerns in language modeling and weighs the potential advantages of deep latent variable models over traditional methods.
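To make the amortized variational inference idea mentioned above concrete, here is a minimal sketch of a Monte Carlo ELBO estimate for a model with a standard normal prior and a diagonal-Gaussian amortized posterior, using the reparameterization trick. This is an illustrative toy, not code from the thesis or episode; the function names (`elbo_estimate`, `log_p_x_given_z`) and the scalar-Gaussian setup are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def elbo_estimate(x, enc_mu, enc_log_var, log_p_x_given_z, n_samples=1000):
    """Monte Carlo estimate of the ELBO for a model with prior
    p(z) = N(0, I) and amortized posterior
    q(z|x) = N(enc_mu, diag(exp(enc_log_var))).

    ELBO(x) = E_q[log p(x|z)] - KL(q(z|x) || p(z)) <= log p(x)
    """
    std = np.exp(0.5 * enc_log_var)
    # Reparameterization: z = mu + sigma * eps with eps ~ N(0, I),
    # so samples are a differentiable function of the encoder outputs.
    eps = rng.standard_normal((n_samples, enc_mu.shape[0]))
    z = enc_mu + std * eps
    # Expected reconstruction term, estimated by sampling.
    recon = np.mean([log_p_x_given_z(x, zi) for zi in z])
    # KL(q || p) has a closed form for diagonal Gaussians vs. N(0, I).
    kl = 0.5 * np.sum(np.exp(enc_log_var) + enc_mu**2 - 1.0 - enc_log_var)
    return recon - kl
```

For a linear-Gaussian toy likelihood p(x|z) = N(x; z, 1), the true marginal is p(x) = N(x; 0, 2), and plugging in the exact posterior q(z|x) = N(x/2, 1/2) makes the estimated ELBO match log p(x) up to Monte Carlo noise, which is the tightness property that motivates learning good inference networks.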

