Bayes's theorem says you have a prior probability for some theory or some proposition to be true. To get the posterior probability, conditionalized on new data that came in, you multiply that prior by the likelihood that that data would be coming in if your proposition were true, and then normalize it. So when Murray says, doesn't that require that the Bayesian has to be a hundred percent confident in the evidence? I would say no. It requires that the Bayesian needs to be very, very accurate and honest about the probability of thinking they've gotten that evidence if the theory were correct. Now, how do you do that? I think that's the
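The update described above (multiply the prior by the likelihood, then normalize) can be sketched in a few lines of Python. The function name and the numbers here are illustrative assumptions, not from the transcript:

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior probability of the theory given the new data.

    prior: P(theory) before the data came in
    likelihood_if_true: P(data | theory true)
    likelihood_if_false: P(data | theory false)
    """
    # Multiply prior by likelihood for each hypothesis...
    weighted_true = prior * likelihood_if_true
    weighted_false = (1 - prior) * likelihood_if_false
    # ...then normalize so the two posteriors sum to 1.
    return weighted_true / (weighted_true + weighted_false)

# Hypothetical example: even prior odds, and data twice as likely
# if the theory is true than if it is false.
posterior = bayes_update(0.5, 0.8, 0.4)
print(posterior)  # 2/3: the data shifts belief toward the theory
```

Note that the update never requires certainty about the evidence itself; what matters, as the speaker says, is being accurate and honest about the likelihoods plugged in.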
