
Zach Furman
Author of the essay 'Deep Learning as Program Synthesis', which is narrated in this episode. The essay draws on mechanistic interpretability, Solomonoff induction, and singular learning theory to argue that deep learning finds compositional algorithms.
Best podcasts with Zach Furman
Ranked by the Snipd community

Jan 24, 2026 • 1h 12min
"Deep learning as program synthesis" by Zach Furman
Zach Furman, mechanistic interpretability researcher and author of the essay 'Deep Learning as Program Synthesis', presents the hypothesis that deep networks search for simple, compositional programs. He traces evidence from grokking, vision circuits, and induction heads; explores apparent paradoxes of approximation, generalization, and convergence; and sketches how SGD and representational structure could favor program-like solutions.
