
Episode 16: Yilun Du, MIT, on energy-based models, implicit functions, and modularity
Generally Intelligent
I'm Not Sure Why It Gets Disentangled.
Kendrin: I'm not completely sure why it gets disentangled. The objective itself doesn't actually encourage you to get disentangled factors. We use the same network architecture for the energy function, so we use the exact same weights. Camia: It's much easier to explain the scene by associating multiple objects together into one thing than it is to isolate individual objects, when your objects look so diverse from each other.
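The shared-weight setup described above can be sketched in code. The following is a minimal, hypothetical NumPy illustration (not Yilun Du's actual implementation): a single small energy function, with one set of weights, is applied to every latent factor, and the total energy of the scene is the sum of the per-factor energies.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shared-weight energy function: a tiny two-layer MLP
# E_theta(x, z) -> scalar. The exact same weights score every latent
# factor, mirroring the shared-architecture point in the interview.
x_dim, z_dim, hidden, n_factors = 8, 4, 16, 3
W1 = rng.normal(size=(x_dim + z_dim, hidden))
W2 = rng.normal(size=(hidden, 1))

def energy(x, z):
    """Score how well one latent factor z explains the input x."""
    h = np.maximum(np.concatenate([x, z]) @ W1, 0.0)  # ReLU hidden layer
    return float(h @ W2)  # scalar energy (lower = better explanation)

x = rng.normal(size=x_dim)                 # one observed scene
zs = rng.normal(size=(n_factors, z_dim))   # one latent code per factor

# Total energy of the scene: the sum over factors, each scored by the
# same network, so the factors compete to explain the same input.
total = sum(energy(x, z) for z in zs)
print(total)
```

Because every factor is scored by identical weights, no single factor can specialize by memorizing a private slice of the input; any division of labor between factors has to emerge from the latents themselves, which is one plausible reason the factors end up capturing distinct aspects of the scene.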