
#110 Unpacking Bayesian Methods in AI with Sam Duffield

Learning Bayesian Statistics

00:00

Utilizing the Posterior in Bayesian Methods for AI

This chapter explores applications of the posterior in Bayesian methods within AI, particularly the transition from full-batch to mini-batch settings to handle large datasets efficiently. It emphasizes the advantages of Bayesian approximations in large-scale models like neural networks for tasks such as out-of-distribution detection and continual learning. It also discusses approximate inference, the Laplace approximation, and stochastic gradient descent in the context of Bayesian methods, aiming to make these techniques more accessible for research and practical applications.
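To make the mini-batch-plus-Laplace idea concrete, here is a minimal sketch of the recipe: find a MAP estimate with mini-batch stochastic gradient descent, then place a Gaussian posterior approximation centred at that point with covariance given by the inverse Hessian. The logistic-regression model, synthetic data, and hyperparameters below are illustrative assumptions, not taken from the episode.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary-classification data (a hypothetical stand-in for a large dataset)
N, D = 1000, 3
X = rng.normal(size=(N, D))
true_w = np.array([1.5, -2.0, 0.5])
y = (rng.random(N) < 1 / (1 + np.exp(-X @ true_w))).astype(float)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Mini-batch SGD to find the MAP estimate under a Gaussian prior (variance 1):
# each mini-batch gradient is rescaled by N / batch_size so it is an unbiased
# estimate of the full-data negative log-posterior gradient.
w = np.zeros(D)
lr, batch_size, prior_var = 0.1, 50, 1.0
for step in range(2000):
    idx = rng.choice(N, batch_size, replace=False)
    p = sigmoid(X[idx] @ w)
    grad = (N / batch_size) * X[idx].T @ (p - y[idx]) + w / prior_var
    w -= lr * grad / N

# Laplace approximation: Gaussian centred at the MAP estimate, with covariance
# equal to the inverse Hessian of the negative log-posterior at that point.
p_all = sigmoid(X @ w)
H = X.T @ (X * (p_all * (1 - p_all))[:, None]) + np.eye(D) / prior_var
cov = np.linalg.inv(H)
std = np.sqrt(np.diag(cov))  # per-parameter posterior uncertainties
```

The posterior standard deviations in `std` are what downstream uses like out-of-distribution detection rely on: inputs whose predictions vary a lot under the Gaussian posterior are flagged as uncertain. For neural networks the full Hessian is intractable, so diagonal or Kronecker-factored approximations are typically substituted.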
