Learning Bayesian Statistics

#144 Why is Bayesian Deep Learning so Powerful, with Maurizio Filippone

Oct 30, 2025
Maurizio Filippone, an associate professor at KAUST and leader of the Bayesian Deep Learning Group, dives into the fascinating world of Bayesian function estimation. He explains why Gaussian Processes are still crucial for function estimation and how deep Gaussian Processes introduce flexibility for complex tasks. Maurizio discusses practical strategies like Monte Carlo Dropout for uncertainty quantification in neural networks, the trade-offs between model complexity and interpretability, and the role of Bayesian methods in modern generative models.
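The Monte Carlo Dropout strategy mentioned above can be illustrated with a minimal numpy sketch (not from the episode; the tiny network, its random weights, and the dropout rate are arbitrary assumptions): keep dropout active at prediction time, run several stochastic forward passes, and read the spread of the outputs as uncertainty.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny two-layer network with fixed random weights.
W1 = rng.normal(size=(1, 32))
W2 = rng.normal(size=(32, 1))

def forward(x, p_drop=0.5):
    """One stochastic forward pass: dropout stays ON at test time."""
    h = np.tanh(x @ W1)
    mask = rng.random(h.shape) > p_drop   # fresh Bernoulli dropout mask
    h = h * mask / (1.0 - p_drop)         # inverted-dropout rescaling
    return h @ W2

x = np.array([[0.3]])
samples = np.stack([forward(x) for _ in range(200)])  # T stochastic passes
mean = samples.mean(axis=0)   # predictive mean
std = samples.std(axis=0)     # spread across passes = uncertainty estimate
```

Because each pass draws a new dropout mask, the 200 outputs disagree, and their standard deviation serves as a cheap uncertainty estimate without changing the network's training procedure.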
INSIGHT

GPs And Bayesian Nets Are Unified

  • Deep Gaussian processes and Bayesian neural networks converge conceptually: you can view deep GPs as special cases of Bayesian neural nets.
  • Scalable GP inference techniques transfer naturally to Bayesian deep learning and bridge the two paradigms.
INSIGHT

Composition Adds Flexible Function Classes

  • Deep GPs compose Gaussian processes so the resulting function class becomes far richer and non-Gaussian marginally.
  • Composition yields flexibility similar to deep nets, enabling nonstationary, input-dependent behaviors without handcrafting kernels.
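The composition idea above can be sketched in a few lines of numpy (an illustrative assumption, not code from the episode): draw one GP sample, then feed its outputs in as the inputs of a second GP. The marginals of the composed function are no longer Gaussian, even though each layer is.

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf_kernel(a, b, lengthscale=1.0):
    """Squared-exponential kernel between two 1-D input arrays."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def sample_gp(inputs, lengthscale=1.0):
    """Draw one sample path of a zero-mean GP evaluated at `inputs`."""
    K = rbf_kernel(inputs, inputs, lengthscale) + 1e-6 * np.eye(len(inputs))
    return rng.multivariate_normal(np.zeros(len(inputs)), K)

x = np.linspace(-3, 3, 100)
h = sample_gp(x)   # layer 1: a GP draw warps the input space
f = sample_gp(h)   # layer 2: a second GP evaluated at the warped inputs
```

Where `h` varies slowly, `f` inherits long-range correlations; where `h` moves quickly, `f` wiggles fast. That input-dependent behavior is exactly the nonstationarity the snip describes, obtained without handcrafting a nonstationary kernel.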
ADVICE

Pick Models By What You Know About Functions

  • Choose GPs when you know the function properties you want to encode, like lengthscale or smoothness.
  • Use data-driven deep models when you have lots of data or lack prior structural knowledge about the function.
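The first branch of the advice above, encoding known function properties directly, can be sketched as standard GP regression where the lengthscale is a deliberate modeling choice (the toy data and hyperparameter values here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

def rbf(a, b, ell):
    """Squared-exponential kernel; `ell` encodes expected wiggliness."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

# Hypothetical noisy observations of a smooth function.
X = np.linspace(0, 5, 10)
y = np.sin(X) + 0.1 * rng.normal(size=10)

ell = 1.0          # prior belief: features vary on a unit lengthscale
noise = 0.1 ** 2   # assumed observation noise variance

K = rbf(X, X, ell) + noise * np.eye(10)
Xs = np.linspace(0, 5, 50)
Ks = rbf(Xs, X, ell)

mean = Ks @ np.linalg.solve(K, y)                       # posterior mean
cov = rbf(Xs, Xs, ell) - Ks @ np.linalg.solve(K, Ks.T)  # posterior covariance
```

Here the smoothness assumption lives in the kernel, not in the data: changing `ell` changes what the model believes before seeing any observations, which is precisely what makes GPs attractive when such structural knowledge is available.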