The Importance of Entropy in Mathematics
Entropy is defined as the negative sum over x of p(x) times log p(x), right? So Shannon entropy. And then there are entropy rates and so forth for Markov processes. But you know what's so funny? We don't have any really good measures of entropy in mathematics. All the measures we use, Lebesgue measure for instance, are essentially uniform measures.
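To make the spoken formulas concrete, here is a minimal Python sketch of Shannon entropy and of the entropy rate of a Markov chain, with the minus sign restored; the function names and the example transition matrix are illustrative choices, not anything stated in the episode.

import math

def shannon_entropy(p, base=2):
    # H(X) = -sum_x p(x) * log p(x); terms with p(x) == 0
    # contribute nothing, by the convention 0 * log 0 = 0.
    return -sum(px * math.log(px, base) for px in p if px > 0)

def entropy_rate(P, base=2, iters=1000):
    # Entropy rate of an ergodic Markov chain with transition matrix P:
    # H = -sum_i mu_i * sum_j P[i][j] * log P[i][j],
    # where mu is the stationary distribution, found here by
    # naive power iteration (mu_new[j] = sum_i mu[i] * P[i][j]).
    n = len(P)
    mu = [1.0 / n] * n
    for _ in range(iters):
        mu = [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]
    return sum(mu[i] * shannon_entropy(P[i], base) for i in range(n))

# A fair coin carries 1 bit per flip; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.47

# A 2-state chain that tends to stay in its current state
# has a correspondingly low entropy rate.
P = [[0.9, 0.1],
     [0.2, 0.8]]
print(entropy_rate(P))  # ~0.55

The entropy rate weights each row's entropy by how often the chain visits that state in the long run, which is why a "sticky" chain, one that rarely switches states, produces fewer bits per step than its most uncertain row would suggest.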