In probability, there is an entropy that you can calculate. If the distribution is kind of peaked or concentrated at one point, then in that sense its entropy is zero. So if I have a function that spits out a number on a probability distribution, how do I know that I can interpret that number as some type of, you know, amount of information conveyed? Are there properties that entropy satisfies? You know, can I list them? Is that list sufficient to tell me: oh, let me pull out my checklist. Hey, you're also entropy, right? You might want to know about that.
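The quantity being discussed here is Shannon entropy, which is indeed characterized (up to a constant factor) by a short checklist of axioms — continuity, maximality at the uniform distribution, and additivity over composed experiments (the Shannon–Khinchin axioms). A minimal sketch of the two behaviors mentioned above, assuming base-2 logarithms and the convention 0·log 0 = 0:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), with 0*log(0) = 0."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

# A distribution concentrated entirely on one outcome conveys no information:
print(shannon_entropy([1.0, 0.0, 0.0]))   # 0.0

# A uniform distribution over 4 outcomes is maximally uncertain:
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 (bits)
```

The function name and the choice of bits (log base 2) are illustrative conventions, not anything specific to the conversation; natural logs would give entropy in nats instead.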
Mathematics is often thought of as the pinnacle of crisp precision: the square of the hypotenuse of a right triangle isn’t “roughly” the sum of the squares of the other two sides, it’s exactly that. But we live in a world of messy imprecision, and increasingly we need sophisticated techniques to quantify and deal with approximate statistical relations rather than perfect ones. Modern mathematicians have noticed, and are taking up the challenge. Tai-Danae Bradley is a mathematician who employs very high-level ideas — category theory, topology, quantum probability theory — to analyze real-world phenomena like the structure of natural-language speech. We explore a number of cool ideas and what kinds of places they are leading us to.
Support Mindscape on Patreon.
Tai-Danae Bradley received her Ph.D. in mathematics from the CUNY Graduate Center. She is currently a research mathematician at Alphabet, visiting research professor of mathematics at The Master’s University, and executive director of the Math3ma Institute. She hosts an explanatory mathematics blog, Math3ma. She is the co-author of the graduate-level textbook Topology: A Categorical Approach.
See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.