Entropy in Probability Distributions
In probability, there is an entropy that you can calculate. If the distribution is kind of peaked or concentrated at one point, then in that sense its entropy is zero. So if I have a function that spits out a number on a probability distribution, how do I know that I can interpret that number as some kind of, you know, amount of information conveyed? Are there properties that entropy satisfies? You know, can I list them? Is that list sufficient to tell me, oh, let me pull out my checklist, hey, you're also entropy, right? You might want to know about that.
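The quantity the speaker is describing is the Shannon entropy, H(p) = -Σᵢ pᵢ log pᵢ, which is zero for a distribution concentrated on a single outcome and maximal for a uniform one. As a minimal illustrative sketch (the transcript itself names no code; the function name and example distributions below are my own), this is what that calculation looks like:

```python
import math

def shannon_entropy(p, base=2):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), using the convention 0*log(0) = 0."""
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# A distribution concentrated entirely on one outcome: entropy is zero.
print(shannon_entropy([1.0, 0.0, 0.0]))            # 0.0

# A uniform distribution over 4 outcomes: maximal entropy, log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0
```

The "checklist" question in the transcript refers to characterizing entropy by the properties it satisfies, rather than by this particular formula.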