How to Marry Recurrent Neural Networks and Topic Models
The ELBO that people might be familiar with from VAEs is derived using the KL divergence. Why specifically do we use the KL divergence? Is it for historical reasons or computational reasons?

It's mainly computational. You're not bound to any particular divergence, so I would say computation is the main reason why people have stuck with the ELBO.

And then I would say the thesis is structured as coming up with different models and then coming up with different algorithms for them.

Yes.
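For context, a minimal sketch of the standard identity behind this point (general VAE background, not specific to the thesis discussed here): the ELBO falls out of decomposing the log marginal likelihood using the KL divergence between the approximate posterior $q_\phi(z \mid x)$ and the true posterior $p_\theta(z \mid x)$.

```latex
\log p_\theta(x)
  = \underbrace{\mathbb{E}_{q_\phi(z \mid x)}\!\left[
      \log \frac{p_\theta(x, z)}{q_\phi(z \mid x)}
    \right]}_{\text{ELBO}}
  \;+\;
  \underbrace{\mathrm{KL}\!\left(
      q_\phi(z \mid x) \,\|\, p_\theta(z \mid x)
    \right)}_{\geq\, 0}
```

Because the KL term is non-negative, the ELBO lower-bounds $\log p_\theta(x)$, and maximizing it only requires expectations under $q_\phi$ rather than the intractable true posterior. That tractability is the computational convenience referred to above, even though other divergences could in principle be used.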