Bayes's theorem says you have a prior probability for some theory or some proposition to be true. To get the posterior probability, conditionalized on new data that came in, you multiply that prior by the likelihood that that data would be coming in if your proposition were true, and then normalize it. So when Murray says, doesn't that require the Bayesian to have a hundred percent confidence in the evidence? I would say no. It requires that the Bayesian be very, very accurate and honest about the probability of thinking they've gotten that evidence if the theory was correct. Now, how do you do that? I think that's the
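The update described above — multiply the prior by the likelihood of the data under each hypothesis, then normalize — can be sketched in a few lines. This is a minimal illustration with made-up numbers, not anything from the episode; the function name and probabilities are my own.

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior probability that the hypothesis is true, given the data.

    prior: P(hypothesis) before seeing the data
    likelihood_if_true: P(data | hypothesis true)
    likelihood_if_false: P(data | hypothesis false)
    """
    # Multiply each prior by the likelihood of the data under that hypothesis...
    weight_true = prior * likelihood_if_true
    weight_false = (1 - prior) * likelihood_if_false
    # ...then normalize so the posteriors sum to one.
    return weight_true / (weight_true + weight_false)

# Illustrative example: a 50/50 prior, and data that is twice as likely
# if the theory is true (0.8) than if it is false (0.4).
posterior = bayes_update(prior=0.5, likelihood_if_true=0.8, likelihood_if_false=0.4)
print(posterior)  # 0.4 / (0.4 + 0.2) = 2/3
```

Note that the likelihoods here are exactly the quantities the excerpt says the Bayesian must be honest about: how probable the observed evidence would be under each hypothesis.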
Welcome to the August 2022 Ask Me Anything episode of Mindscape! These monthly excursions are funded by Patreon supporters (who are also the ones asking the questions). We take questions asked by Patreons, whittle them down to a more manageable number — based primarily on whether I have anything interesting to say about them, not whether the questions themselves are good — and sometimes group them together if they are about a similar topic.
Here is a link to the Mindscape Big Picture Scholarship. Please consider donating!
Support Mindscape on Patreon.
See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.