2min snip

#110 Unpacking Bayesian Methods in AI with Sam Duffield

Learning Bayesian Statistics

NOTE

Moving from Classical Bayesian Statistics to Mini Batch Era

In the era of big data, Bayesian inference is shifting from classical full-batch methods to mini-batch approaches. Traditional samplers such as Metropolis-Hastings evaluate the likelihood over the entire dataset at every step, whereas mini-batch techniques use random subsets of the data to speed up inference. The posteriors library is recommended for exploring mini-batch posterior sampling, which is particularly beneficial for large-scale models such as language models and big neural networks. Although convergence checks are harder in this setting, even crude Bayesian approximations offer benefits, such as out-of-distribution detection, improved attribution performance, and continual learning, that may not be achievable with traditional training methods.
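The mini-batch sampling idea can be sketched with stochastic gradient Langevin dynamics (SGLD), a standard mini-batch posterior sampler. The toy conjugate-normal model, step size, and NumPy implementation below are illustrative assumptions for this note, not code from the posteriors library:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: N observations from N(true_mu, 1).
N = 10_000
true_mu = 2.0
data = rng.normal(true_mu, 1.0, size=N)

# Toy model: prior mu ~ N(0, 10^2), likelihood x_i ~ N(mu, 1).
def grad_log_prior(mu):
    return -mu / 10.0**2

def grad_log_lik(mu, batch):
    # Gradient of the log-likelihood over one mini-batch.
    return np.sum(batch - mu)

# SGLD (Welling & Teh, 2011): each step uses a mini-batch gradient,
# rescaled by N / batch_size to estimate the full-data gradient,
# plus injected Gaussian noise matched to the step size.
step = 1e-4
batch_size = 100
mu = 0.0
samples = []
for t in range(5_000):
    batch = rng.choice(data, size=batch_size, replace=False)
    grad = grad_log_prior(mu) + (N / batch_size) * grad_log_lik(mu, batch)
    mu += 0.5 * step * grad + rng.normal(0.0, np.sqrt(step))
    if t >= 1_000:  # discard burn-in
        samples.append(mu)

# The retained samples approximate the posterior over mu,
# which concentrates near the data mean (~2.0 here).
posterior_mean = float(np.mean(samples))
```

Each iteration touches only 100 of the 10,000 data points, which is what makes this style of sampler attractive for large neural networks, at the cost of the exact detailed balance that full-batch methods like Metropolis-Hastings provide.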
