
Episode 16: Yilun Du, MIT, on energy-based models, implicit functions, and modularity

Generally Intelligent


I'm Not Sure Why It Gets Disentangled.

Kendrin: I'm not completely sure why it gets disentangled. The objective itself doesn't actually encourage you to get disentangled factors. We use the same network architecture for the energy function, so we use the exact same weights.

Camia: It's much easier to explain the scene by associating multiple objects together into one thing than it is to isolate individual objects when your objects look so diverse from each other.
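The setup being described, a single energy function whose weights are shared across all components, with per-component energies summed to score the whole scene, can be sketched roughly as follows. The module names, dimensions, and the simple MLP are illustrative assumptions, not details from the episode.

```python
# Minimal sketch (assumptions, not the episode's actual code): one shared-weight
# energy network scores an input against each latent component, and the scene
# energy is the sum of the per-component energies.
import torch
import torch.nn as nn

class EnergyFn(nn.Module):
    """Shared-weight energy function E(x, z) -> one scalar per example."""
    def __init__(self, image_dim: int, latent_dim: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(image_dim + latent_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor, z: torch.Tensor) -> torch.Tensor:
        # Concatenate the input with one latent component and score it.
        return self.net(torch.cat([x, z], dim=-1)).squeeze(-1)

def scene_energy(energy_fn: EnergyFn, x: torch.Tensor, zs: torch.Tensor) -> torch.Tensor:
    """Apply the same energy function to K latent components (zs: [B, K, D]) and sum."""
    return torch.stack(
        [energy_fn(x, zs[:, k]) for k in range(zs.shape[1])], dim=0
    ).sum(dim=0)

# Example: a flattened 64x64x3 image decomposed into 4 latent components.
x = torch.randn(8, 64 * 64 * 3)
zs = torch.randn(8, 4, 16)
energy = scene_energy(EnergyFn(64 * 64 * 3, 16), x, zs)  # shape: [8]
```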

