

Entropy: Gaining Knowledge by Admitting Ignorance
Dec 3, 2018
In a fascinating discussion, Alexander Schekochihin, a plasma physicist and Professor of Theoretical Physics, dives into the role of ignorance in understanding complex physical systems. He argues that acknowledging our limitations can be a powerful tool for making predictions about the universe. The conversation explores entropy as a measure of uncertainty, tying in insights from Boltzmann and Shannon. Schekochihin also addresses 'good enoughism' in statistics, showing how embracing incompleteness can lead to greater insight in physics.
Systems and Probabilities
- Systems have many possible states called microstates, each with an associated probability.
- Knowing these probabilities lets us predict averages and the statistics of measurement outcomes.
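The idea above can be sketched in a few lines: once each microstate has a probability, the predicted average of a measured quantity is just the probability-weighted sum over states. The states and probabilities below are illustrative, not taken from the episode.

```python
# Hypothetical microstates, each with a value of some measurable
# quantity (say, energy) and an assigned probability.
states = [0.0, 2.0, 4.0]     # value of the quantity in each microstate
probs = [0.5, 0.25, 0.25]    # probability of finding the system there

# Predicted average: sum over microstates of value * probability.
average = sum(e * p for e, p in zip(states, probs))
print(average)  # 0.0*0.5 + 2.0*0.25 + 4.0*0.25 = 1.5
```

Individual measurements still scatter around this value; the probabilities only let us predict the average and the spread, not any single outcome.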
Probabilities as Likelihoods
- Probabilities in physics can be seen as likelihoods based on partial information, not just frequencies.
- Assigning probabilities fairly means considering only known information and admitting ignorance of the rest.
Shannon's Entropy Discovery
- Claude Shannon, a communication engineer, mathematically defined entropy as a measure of uncertainty.
- His formula, H = -Σ pᵢ log pᵢ, is the unique measure of uncertainty (up to a constant factor) satisfying natural requirements of continuity, symmetry, and additivity.
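A minimal sketch of Shannon's measure and two of the properties mentioned above, symmetry and additivity, using base-2 logarithms so that uncertainty is counted in bits:

```python
import math

def shannon_entropy(probs):
    """H(p) = -sum(p_i * log2(p_i)): uncertainty of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Symmetry: relabeling (reordering) the outcomes leaves the uncertainty unchanged.
assert shannon_entropy([0.5, 0.25, 0.25]) == shannon_entropy([0.25, 0.5, 0.25])

# Additivity: for two independent systems, the entropy of the joint
# distribution equals the sum of the individual entropies.
p = [0.5, 0.5]
q = [0.25, 0.75]
joint = [pi * qj for pi in p for qj in q]
assert math.isclose(shannon_entropy(joint),
                    shannon_entropy(p) + shannon_entropy(q))

print(shannon_entropy([0.5, 0.5]))  # a fair coin carries exactly 1 bit
```

The `if p > 0` guard reflects the convention 0 log 0 = 0: impossible outcomes contribute no uncertainty.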