Bayes's Theorem
Bayes's theorem says you have a prior probability for some theory or some proposition to be true. To get the posterior probability, conditionalized on new data that came in, you multiply that prior by the likelihood that that data would be coming in if your proposition were true, and then normalize it. So when Murray says, doesn't that require the Bayesian to be 100% confident in the evidence? I would say no. It requires that the Bayesian needs to be very, very accurate and honest about the probability of thinking they've gotten that evidence if the theory were correct. Now, how do you do that? I think that's the
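The update described above (multiply the prior by the likelihood, then normalize) can be sketched in a few lines of Python. This is a minimal illustration, not anything from the conversation itself; the function name and the example numbers are made up.

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior via Bayes's theorem:
    P(H|D) = P(D|H) * P(H) / P(D),
    where the normalizer is
    P(D) = P(D|H) * P(H) + P(D|not H) * (1 - P(H))."""
    unnormalized = likelihood_if_true * prior
    evidence = unnormalized + likelihood_if_false * (1 - prior)
    return unnormalized / evidence

# Hypothetical example: prior of 0.5, and the observed data is
# twice as likely if the theory is true (0.8) than if it is false (0.4).
posterior = bayes_update(0.5, 0.8, 0.4)
print(posterior)  # 0.666..., i.e. the data shifted us toward the theory
```

Note that the speaker's point lands in the second argument: what matters is not certainty about the data, but an honest estimate of `likelihood_if_true`, the probability of seeing that evidence if the theory were correct.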