2 min chapter

[13] Adji Bousso Dieng - Deep Probabilistic Graphical Modeling

The Thesis Review

CHAPTER

How to Marry Recurrent Neural Networks and Topic Models

The ELBO that people might be familiar with from VAEs is derived using the KL divergence. Why specifically do we use the KL divergence? Is it for historical reasons or computational reasons?

It's mainly computational. You're not bound to any particular divergence, so I would say computation is the main reason why people have stuck with the ELBO. And then I would say the thesis is structured as coming up with different models and then coming up with different algorithms. Yes.
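For context beyond the transcript: the ELBO falls out of a standard decomposition of the log marginal likelihood. A minimal sketch, assuming a variational posterior q_\phi(z \mid x) and model p_\theta(x, z):

\log p_\theta(x) = \underbrace{\mathbb{E}_{q_\phi(z \mid x)}\!\left[\log p_\theta(x, z) - \log q_\phi(z \mid x)\right]}_{\text{ELBO}} + \mathrm{KL}\!\left(q_\phi(z \mid x) \,\|\, p_\theta(z \mid x)\right)

Since \mathrm{KL} \ge 0, the first term lower-bounds \log p_\theta(x), so maximizing the ELBO tightens the bound. The computational appeal mentioned in the clip: in the usual rearrangement \mathbb{E}_{q_\phi}[\log p_\theta(x \mid z)] - \mathrm{KL}(q_\phi(z \mid x) \,\|\, p(z)), the KL term has a closed form for common choices such as a Gaussian q_\phi and a Gaussian prior.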
