
LessWrong (30+ Karma) “Resampling Conserves Redundancy & Mediation (Approximately) Under the Jensen-Shannon Divergence” by David Lorell
Oct 31, 2025
David Lorell, an information theory researcher and LessWrong contributor, discusses the Jensen-Shannon divergence. He explains why this divergence succeeds where KL divergence fell short in proving that resampling (approximately) conserves redundancy and mediation. With a focus on clarity, David outlines the key definitions and theorems behind his proof and shares the practical implications of his findings. Listeners will hear how these mathematical tools connect to real-world applications.
AI Snips
A Past Proof Was Corrected
- David Lorell recounts that a prior proof claiming resampling conserves redundancy was shown to be incorrect by Jeremy Gillen and Alfred Harwood.
- That correction motivated the new JS-based proof presented in this episode.
Why Jensen-Shannon Works Here
- The Jensen-Shannon (JS) divergence succeeds where KL divergence failed in proving the resampling bounds.
- JS's metric-like properties and its interpretation as sample-distinguishability make it well suited to bounding factorization error (see the definition below).
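For reference, the standard textbook definition of the Jensen-Shannon divergence (not a formula quoted from the episode) is the average KL divergence of each distribution to their mixture:

$$
M = \tfrac{1}{2}(P + Q), \qquad
\mathrm{JSD}(P \,\|\, Q) = \tfrac{1}{2}\, D_{\mathrm{KL}}(P \,\|\, M) + \tfrac{1}{2}\, D_{\mathrm{KL}}(Q \,\|\, M).
$$

Unlike KL divergence, JS divergence is symmetric in its arguments and bounded above by $\log 2$ (in nats), which is part of what makes it better behaved for bounding factorization error.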
JS Has Useful Mathematical Intuition
- The square root of JS divergence is a metric, which KL divergence is not, giving useful mathematical structure for proving bounds.
- JS divergence equals the mutual information between a sample and a selector variable, so it intuitively measures how distinguishable the two distributions are from a single sample (see the identity below).
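The mutual-information identity behind this snip is a standard property of JS divergence (stated here for reference, not derived in the episode): let a fair coin choose which distribution the sample is drawn from, and JS divergence is exactly the information the sample carries about that coin:

$$
Z \sim \mathrm{Bernoulli}(\tfrac{1}{2}), \qquad
X \mid Z = 0 \sim P, \quad X \mid Z = 1 \sim Q, \qquad
\mathrm{JSD}(P \,\|\, Q) = I(X; Z).
$$

Together with the fact that $\sqrt{\mathrm{JSD}}$ satisfies the triangle inequality, this gives the metric structure and the distinguishability interpretation discussed in the episode.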
