
Learning Bayesian Statistics #136 Bayesian Inference at Scale: Unveiling INLA, with Haavard Rue & Janet van Niekerk
Jul 9, 2025

Haavard Rue, a professor and the mastermind behind Integrated Nested Laplace Approximations (INLA), joins Janet van Niekerk, a research scientist specializing in its application to medical statistics. They dive into the advantages of INLA over traditional MCMC methods, highlighting its efficiency with large datasets. The conversation touches on computational challenges, the significance of carefully chosen priors, and the potential of integrating GPUs for future advancements. They also share insights on using INLA for complex models, particularly in healthcare and spatial analysis.
Episode notes
Janet's INLA Journey Begins
- Janet van Niekerk joined Haavard Rue's research group after reaching out post-PhD, starting her journey with INLA.
- She initially struggled to understand latent Gaussian models but grew deeply involved in INLA applications and development.
INLA vs MCMC Explained
- INLA uses deterministic mathematical approximations for Bayesian posteriors, avoiding sampling like MCMC.
- This yields much faster computation while still delivering posterior summaries such as means and credible intervals.
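To make the "deterministic approximation instead of sampling" idea concrete, here is a toy Python sketch (not the R-INLA API; the function name and the Beta-Bernoulli example are illustrative assumptions) of a Laplace approximation: the posterior is replaced by a Gaussian centred at its mode, with variance taken from the curvature of the log-density there. INLA nests refinements of this same idea inside latent Gaussian models.

```python
import math

def laplace_beta_posterior(successes, trials, a0=1.0, b0=1.0):
    """Gaussian (Laplace) approximation to a Beta posterior.

    Toy illustration only: with a Beta(a0, b0) prior and binomial data,
    the exact posterior is Beta(a, b); we approximate it by a Gaussian
    at the mode, using the negative second derivative of the
    log-density as the precision.
    """
    a = a0 + successes
    b = b0 + trials - successes
    mode = (a - 1) / (a + b - 2)
    # -d^2/dp^2 of (a-1)*log(p) + (b-1)*log(1-p), evaluated at the mode
    precision = (a - 1) / mode**2 + (b - 1) / (1 - mode)**2
    sd = 1 / math.sqrt(precision)
    return mode, sd

# 7 successes in 10 trials, uniform prior
mode, sd = laplace_beta_posterior(7, 10)
lo, hi = mode - 1.96 * sd, mode + 1.96 * sd
print(f"Laplace mean~{mode:.3f}, sd~{sd:.3f}, 95% interval~({lo:.3f}, {hi:.3f})")
```

No random draws are involved: the "credible interval" comes directly from the analytic Gaussian fit, which is why such approximations can be orders of magnitude faster than running an MCMC chain.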
Latent Gaussian Models Fit INLA
- INLA is specifically designed for latent Gaussian models where model effects have joint Gaussian priors.
- This covers a broad class including GAMs, random effects, time series, survival, and spatial models.
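A latent Gaussian model has the generic structure "observations | latent field ~ some likelihood, latent field ~ joint Gaussian". The following hedged Python sketch (pure stdlib; the random-walk prior and Poisson likelihood are assumptions chosen for illustration, not anything specific from the episode) simulates one such model, of the kind INLA is designed to fit:

```python
import math
import random

random.seed(42)
n = 50

# Latent Gaussian field: a first-order random walk, a common smooth-effect
# prior in the latent Gaussian model class (e.g. for temporal trends).
x = [0.0]
for _ in range(n - 1):
    x.append(x[-1] + random.gauss(0.0, 0.1))

def poisson_sample(lam):
    # Knuth's algorithm; adequate for the small rates used here.
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= L:
            return k
        k += 1

# Non-Gaussian observations linked to the latent field via a log link.
y = [poisson_sample(math.exp(1.0 + xi)) for xi in x]
print(y[:10])
```

The key point is the layering: the likelihood (Poisson here) can be non-Gaussian, but every latent effect (intercept, trend, spatial or random effect) gets a joint Gaussian prior, which is exactly the structure that makes the nested Laplace approximations tractable.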
