Learning Bayesian Statistics

#90, Demystifying MCMC & Variational Inference, with Charles Margossian

Sep 6, 2023
Charles Margossian, a computational mathematician, discusses the differences between MCMC and variational inference (VI). They explore common beginner questions, practical applications of Bayesian methods in pharmacometrics and epidemiology, and the challenges of fitting mechanistic models of drug absorption and effect. They also touch on nested Laplace approximations and the complexity of Bayesian methods and data analysis.
INSIGHT

MCMC as Approximation

  • MCMC, like variational inference, is an approximate method in practice, because computation is finite.
  • MCMC becomes exact only asymptotically: with enough iterations it can reach arbitrary precision.
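The point above can be seen in a toy chain. The sketch below (not from the episode; all numbers are illustrative) runs a random-walk Metropolis sampler on a standard normal target from a deliberately bad starting point, showing that the Monte Carlo estimate only approaches the true mean as iterations grow:

```python
import math
import random

def metropolis_normal(n_iter, seed=0):
    """Random-walk Metropolis targeting a standard normal (illustrative sketch)."""
    rng = random.Random(seed)
    x = 3.0  # deliberately bad start: early draws are biased
    samples = []

    def log_p(z):
        return -0.5 * z * z  # log density up to an additive constant

    for _ in range(n_iter):
        prop = x + rng.gauss(0.0, 1.0)          # symmetric proposal
        if math.log(rng.random()) < log_p(prop) - log_p(x):
            x = prop                             # accept
        samples.append(x)
    return samples

# Monte Carlo error shrinks as iterations grow; exactness only in the limit.
long_chain = metropolis_normal(50_000)
err_long = abs(sum(long_chain) / len(long_chain))  # true mean is 0
```

With 50,000 iterations the estimate of the mean lands close to zero, but any finite run carries some residual error, which is the sense in which MCMC is approximate in practice.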
INSIGHT

MCMC Bias

  • MCMC is not unbiased in practice: chains do not start from the stationary distribution and never exactly reach it in finite time.
  • Convergence diagnostics like R-hat help ensure the bias is sufficiently small.
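To make the R-hat idea concrete, here is a minimal sketch of the basic (non-split) Gelman-Rubin statistic. This is a simplification: modern implementations (e.g. `arviz.rhat`) use split chains and rank normalization, which this toy version omits.

```python
import random
import statistics

def r_hat(chains):
    """Basic Gelman-Rubin R-hat across multiple chains (simplified sketch)."""
    n = len(chains[0])
    means = [statistics.fmean(c) for c in chains]
    w = statistics.fmean(statistics.variance(c) for c in chains)  # within-chain variance
    b = n * statistics.variance(means)                            # between-chain variance
    var_hat = (n - 1) / n * w + b / n                             # pooled variance estimate
    return (var_hat / w) ** 0.5

rng = random.Random(1)
# Four well-mixed chains drawn from the same distribution: R-hat near 1.
good = [[rng.gauss(0, 1) for _ in range(1000)] for _ in range(4)]
# Chains stuck around different modes: R-hat well above 1 flags the problem.
bad = [[rng.gauss(mu, 1) for _ in range(1000)] for mu in (0, 0, 5, 5)]
```

Values near 1 suggest the chains agree and the residual bias is likely small; values clearly above 1 indicate the chains have not converged to the same distribution.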
ADVICE

Choosing VI or MCMC

  • When choosing between VI and MCMC, consider the variational family and the objective function.
  • The mean-field approximation suits high-dimensional problems because its cost scales linearly in the number of parameters.
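A small worked example of the trade-off above (illustrative numbers, not from the episode): for a Gaussian target, minimizing KL(q || p) over a factorized (mean-field) Gaussian q matches the target means but sets each factor's precision to the corresponding diagonal entry of the target's precision matrix, so correlated targets get their marginal variances underestimated. The family needs only two parameters per coordinate, which is where the linear scaling comes from.

```python
# Mean-field Gaussian approximation to a correlated 2-D Gaussian target.
rho = 0.8                        # target correlation (made-up value)
cov = [[1.0, rho], [rho, 1.0]]   # target covariance

# Invert the 2x2 covariance to get the precision matrix.
det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
prec = [[cov[1][1] / det, -cov[0][1] / det],
        [-cov[1][0] / det, cov[0][0] / det]]

# Optimal mean-field variances under KL(q || p): inverse diagonal precision.
q_var = [1.0 / prec[0][0], 1.0 / prec[1][1]]
true_marginal_var = [cov[0][0], cov[1][1]]
# q_var is smaller than true_marginal_var whenever rho != 0.
```

The stronger the correlation, the worse the variance underestimation, which is one reason the choice of variational family matters as much as the choice between VI and MCMC.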