[13] Adji Bousso Dieng - Deep Probabilistic Graphical Modeling

The Thesis Review

CHAPTER

How to Marry Recurrent Neural Networks and Topic Models

The ELBO that people might be familiar with from VAEs is derived using the KL divergence. Why specifically do we use the KL divergence? Is it for historical reasons or computational reasons?

It's mainly computational. You're not bound to any particular divergence. So I would say computation is the main reason why people have stuck with the ELBO. And then I would say the thesis is structured as coming up with different models and then coming up with different algorithms. Yes.
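For context on that exchange, here is a standard sketch of how the ELBO falls out of the KL divergence; the notation, with $q(z \mid x)$ as the approximate posterior and $p(z \mid x)$ as the true posterior, is mine and not from the episode:

$$\log p(x) = \mathrm{ELBO}(q) + \mathrm{KL}\big(q(z \mid x) \,\|\, p(z \mid x)\big)$$

$$\mathrm{ELBO}(q) = \mathbb{E}_{q(z \mid x)}\big[\log p(x, z) - \log q(z \mid x)\big]$$

Because $\mathrm{KL}(q \,\|\, p) \ge 0$, the ELBO is a lower bound on $\log p(x)$, and the expectation under $q$ can be estimated by sampling from $q$, which is the computational convenience the answer refers to: other divergences define valid variational objectives, but they are typically harder to estimate and optimize.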
