
David Lorell
Author of the article narrated in the featured episode, presenting a proof that resampling approximately conserves redundancy and mediation under the Jensen–Shannon divergence. Contributor to LessWrong posts on information-theoretic properties of latent variables.
Best podcasts with David Lorell
Ranked by the Snipd community

Oct 31, 2025 • 9min
“Resampling Conserves Redundancy & Mediation (Approximately) Under the Jensen-Shannon Divergence” by David Lorell
David Lorell, an expert in information theory and contributor to LessWrong, delves into the Jensen-Shannon divergence. He explains why this metric, rather than the more common KL divergence, is the right tool for showing that resampling approximately conserves redundancy and mediation. With a focus on clarity, David walks through the key definitions and theorems behind his proof and shares the practical implications of his findings. Listeners will be drawn in by the interplay between the mathematics and its real-world applications.
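For listeners who want the metric in hand, here is a minimal sketch of the Jensen-Shannon divergence for discrete distributions; the function names and example distributions are illustrative, not drawn from the episode.

```python
import numpy as np

def kl_divergence(p, q):
    """KL(P || Q) for discrete distributions; assumes q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # terms with p == 0 contribute nothing
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def js_divergence(p, q):
    """JSD(P, Q) = 1/2 KL(P || M) + 1/2 KL(Q || M), where M = (P + Q) / 2.
    Unlike KL, it is symmetric and bounded by 1 bit (with log base 2)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

# Example: two distributions over three outcomes.
p = [0.7, 0.2, 0.1]
q = [0.1, 0.3, 0.6]
print(js_divergence(p, q))  # symmetric: equals js_divergence(q, p)
```

Because the mixture M has support wherever either distribution does, the JS divergence stays finite and bounded even when P and Q put mass in different places, which is one reason it can be better behaved than KL divergence in approximation arguments.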

Feb 9, 2024 • 12min
[HUMAN VOICE] "A Shutdown Problem Proposal" by johnswentworth, David Lorell
In this podcast, johnswentworth and David Lorell propose a solution to the shutdown problem in AI using a subagent architecture built on negotiation between utility-maximizing subagents. They discuss the design of an agent composed of multiple subagents and the importance of corrigibility. They also explore alignment problems, ontological issues, the design of utility functions, and the challenges of bridging the theory-practice gap.
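As a rough illustration of the negotiation idea, here is a toy sketch assuming the composite agent executes only plans that every subagent weakly prefers to the status quo; all names and utility values here are hypothetical and simplify the actual proposal considerably.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Subagent:
    name: str
    utility: Callable[[str], float]  # maps a candidate plan to this subagent's utility

def negotiate(subagents: List[Subagent], plans: List[str], status_quo: str) -> Optional[str]:
    """Toy negotiation: a plan is adopted only if every subagent weakly
    prefers it to the status quo; among acceptable plans, pick the one
    maximizing total utility. Returns None if nothing beats the status
    quo for everyone."""
    acceptable = [
        plan for plan in plans
        if all(s.utility(plan) >= s.utility(status_quo) for s in subagents)
    ]
    if not acceptable:
        return None
    return max(acceptable, key=lambda plan: sum(s.utility(plan) for s in subagents))

# Hypothetical example: one subagent cares about the task, the other about
# remaining shutdownable; only plans acceptable to both can be executed.
task_agent = Subagent(
    "task",
    lambda plan: {"work": 2.0, "work_and_disable_button": 3.0, "idle": 0.0}[plan],
)
shutdown_agent = Subagent(
    "shutdown",
    lambda plan: {"work": 0.0, "work_and_disable_button": -10.0, "idle": 0.0}[plan],
)
print(negotiate([task_agent, shutdown_agent],
                ["work", "work_and_disable_button"], status_quo="idle"))
# -> "work": disabling the shutdown button is vetoed by the shutdown subagent
```

The design choice the sketch highlights is the veto: no single subagent's utility function needs to encode corrigibility on its own, because plans that undermine shutdownability fail to clear every subagent's bar.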


