Learning Bayesian Statistics

#136 Bayesian Inference at Scale: Unveiling INLA, with Haavard Rue & Janet van Niekerk

Jul 9, 2025
Haavard Rue, a professor and the mastermind behind Integrated Nested Laplace Approximations (INLA), joins Janet van Niekerk, a research scientist specializing in its application to medical statistics. They dive into the advantages of INLA over traditional MCMC methods, highlighting its efficiency with large datasets. The conversation touches on computational challenges, the significance of carefully chosen priors, and the potential of integrating GPUs for future advancements. They also share insights on using INLA for complex models, particularly in healthcare and spatial analysis.
AI Snips
ANECDOTE

Janet's INLA Journey Begins

  • Janet van Niekerk joined Haavard Rue's research group after reaching out post-PhD, starting her journey with INLA.
  • She initially struggled to understand latent Gaussian models but grew deeply involved in INLA applications and development.
INSIGHT

INLA vs MCMC Explained

  • INLA uses deterministic mathematical approximations to Bayesian posteriors rather than sampling from them as MCMC does (see the sketch after this list).
  • This makes computation much faster while still providing posterior means and credible intervals.
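A rough sketch of the approximation scheme (the standard published INLA recipe, not quoted from the episode): with latent field x and hyperparameters θ, INLA builds a Laplace-type approximation to the hyperparameter posterior and then obtains each latent marginal by numerically integrating over a small grid of θ values,

\[
\tilde{\pi}(\theta \mid y) \propto \left. \frac{\pi(x, \theta, y)}{\tilde{\pi}_G(x \mid \theta, y)} \right|_{x = x^{*}(\theta)},
\qquad
\tilde{\pi}(x_i \mid y) \approx \sum_{k} \tilde{\pi}(x_i \mid \theta_k, y)\, \tilde{\pi}(\theta_k \mid y)\, \Delta_k,
\]

where \(\tilde{\pi}_G(x \mid \theta, y)\) is a Gaussian approximation to the full conditional of the latent field and \(x^{*}(\theta)\) is its mode. Everything reduces to optimization and low-dimensional numerical integration, which is why no Monte Carlo sampling is needed.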
INSIGHT

Latent Gaussian Models Fit INLA

  • INLA is designed specifically for latent Gaussian models, where all the model effects share a joint Gaussian prior (see the model sketch after this list).
  • This covers a broad class of models, including GAMs, random effects, time series, survival, and spatial models.
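For reference (a standard formulation of the model class, not quoted from the episode), a latent Gaussian model has the three-stage hierarchical form

\[
\theta \sim \pi(\theta), \qquad
x \mid \theta \sim \mathcal{N}\!\big(0,\, Q(\theta)^{-1}\big), \qquad
y_i \mid x, \theta \sim \pi(y_i \mid \eta_i, \theta),
\]

with a linear predictor \(\eta_i = \beta_0 + \sum_j \beta_j z_{ij} + \sum_k f_k(u_{ik})\) collecting the fixed effects \(\beta\) and the smooth or random effects \(f_k\); together these form the jointly Gaussian latent field x. Choosing different effect types \(f_k\) and likelihoods recovers GAMs, random-effects models, time series, survival, and spatial models.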