
S4E02 Underachievers, Overachievers, & Maximum Likelihood Estimation
Quantitude
How Do We Translate This Into the Maximum Probability World?
Much of what we talk about today relates to normal-theory maximum likelihood: we are going to assume this normal distribution. All of us have seen that beautiful expression for it: one over sigma root two pi, times e to the negative (x minus mu) squared over two sigma squared. Nice. Now what that does is, it's just like a line: if you have a plus b times x, and you drop in points across x, it draws a line. This is exactly the same. It's a sausage maker: you drop in values of x, and it draws out this beautiful normal distribution. So here's my question for you: how do you define what that is to your class?
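The "sausage maker" idea above can be sketched in a few lines of code (a minimal illustration, not from the episode): you feed values of x into the normal density formula, and out come the heights that trace the bell curve.

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Normal density: 1/(sigma*sqrt(2*pi)) * exp(-(x - mu)^2 / (2*sigma^2))."""
    return (1.0 / (sigma * math.sqrt(2.0 * math.pi))) * math.exp(
        -((x - mu) ** 2) / (2.0 * sigma ** 2)
    )

# Drop in values of x; out comes the height of the curve at each point.
for x in [-2, -1, 0, 1, 2]:
    print(f"x = {x:+d}  density = {normal_pdf(x):.4f}")
```

Just as a + b*x maps each x to a point on a line, this function maps each x to a point on the standard normal curve, peaking at the mean and symmetric around it.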