
Learning Bayesian Statistics BITESIZE | Why is Bayesian Deep Learning so Powerful?
Nov 5, 2025
Join Maurizio Filippone, a Bayesian machine learning researcher specializing in Gaussian processes, as he unpacks the magic of deep Gaussian processes. He explains how composing GPs enhances flexibility and offers insights into modeling complex data. Discover practical approximations for implementing Deep GPs in TensorFlow, and learn when to use them over traditional deep neural networks. Maurizio also shares how to map neural networks to GP-like behavior for better interpretability and uncertainty quantification. It's a fascinating dive into the future of machine learning!
Gaussian Processes As Distributions Over Functions
- A Gaussian process (GP) is a distribution over functions derived from an infinite linear model via the kernel trick.
- Drawing from a GP yields function samples whose smoothness and variability are set by the covariance kernel (see the sketch after this list).
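
To make this concrete, here is a minimal NumPy sketch (not code from the episode; the kernel form and parameter values are illustrative choices) of drawing function samples from a GP prior with a squared-exponential kernel:

```python
# Minimal sketch: sampling functions from a GP prior with an RBF kernel.
import numpy as np

def rbf_kernel(x1, x2, variance=1.0, lengthscale=0.5):
    """Squared-exponential kernel k(x, x') = s^2 exp(-(x - x')^2 / (2 l^2))."""
    sq_dists = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

rng = np.random.default_rng(0)
x = np.linspace(-3, 3, 200)               # input locations
K = rbf_kernel(x, x)                      # prior covariance over f(x)
K += 1e-8 * np.eye(len(x))                # jitter for numerical stability
samples = rng.multivariate_normal(np.zeros(len(x)), K, size=5)
# Each row of `samples` is one function drawn from the GP prior;
# shrinking `lengthscale` makes the samples wiggle faster, which is the
# "behavior set by the covariance kernel" described above.
```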
Composing GPs To Gain Expressivity
- Deep Gaussian processes stack GPs by feeding the output of one GP as the input to another, creating compositions of random functions.
- Composing GPs yields far richer, non-Gaussian marginals and models more complex, nonstationary behavior (sketched in code below).
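
A hypothetical two-layer sketch of that composition, reusing the RBF kernel from above; the layer lengthscales are arbitrary choices, not values from the episode:

```python
# Sketch: one prior sample from a two-layer "deep GP", built by feeding a
# draw from one GP in as the input locations of a second GP.
import numpy as np

def rbf_kernel(x1, x2, variance=1.0, lengthscale=0.5):
    sq_dists = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

def sample_gp(rng, x, **kernel_kwargs):
    K = rbf_kernel(x, x, **kernel_kwargs) + 1e-8 * np.eye(len(x))
    return rng.multivariate_normal(np.zeros(len(x)), K)

rng = np.random.default_rng(1)
x = np.linspace(-3, 3, 300)
h = sample_gp(rng, x, lengthscale=1.0)    # layer 1: h = f1(x), f1 ~ GP
y = sample_gp(rng, h, lengthscale=0.3)    # layer 2: y = f2(f1(x)), f2 ~ GP
# y is a draw from the composition f2(f1(.)): its marginals are no longer
# Gaussian, and regions where f1 varies quickly make y look nonstationary.
```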
Prefer Composition Over Kernel Engineering
- Use function composition rather than hand-designing complex kernels to capture nonstationarity and position-dependent behavior.
- Composition is often easier and more powerful than combinatorial kernel engineering; the sketch after this list shows nonstationarity emerging from a simple composed warp.
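
As one illustration of composition standing in for kernel engineering, this sketch (my own example, not from the episode) composes a fixed input warp w(x) = x^3 with an ordinary stationary RBF kernel, yielding position-dependent behavior without hand-designing a nonstationary kernel:

```python
# Sketch: nonstationarity via input warping. The warp w(x) = x**3 is an
# arbitrary illustrative choice; a deep GP would learn such a warp instead.
import numpy as np

def rbf_kernel(x1, x2, variance=1.0, lengthscale=0.5):
    sq_dists = (x1[:, None] - x2[None, :]) ** 2
    return variance * np.exp(-0.5 * sq_dists / lengthscale**2)

rng = np.random.default_rng(2)
x = np.linspace(-1.5, 1.5, 200)
w = x**3                                   # warp compresses inputs near x = 0
K = rbf_kernel(w, w) + 1e-8 * np.eye(len(x))
f = rng.multivariate_normal(np.zeros(len(x)), K)
# Viewed as a function of the original x, f has covariance k(w(x), w(x')):
# slow variation near x = 0 and fast variation near the ends, a
# position-dependent behavior no single stationary RBF kernel can express.
```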
