Learning Bayesian Statistics

BITESIZE | Is Bayesian Optimization the Answer?

Aug 27, 2025
In this discussion, Max Balandat, a key figure in Bayesian optimization and an advocate for open-source culture at Meta, shares insights on the integration of BoTorch with PyTorch. He highlights the flexibility and user-friendly nature of GPyTorch for handling optimization challenges with large datasets. Max explores the advantages of using neural networks as feature extractors in high-dimensional Bayesian optimization and emphasizes the importance of open-source collaboration in advancing research in this dynamic field.
ADVICE

Keep Models Differentiable For Efficient Optimization

  • Use PyTorch tensors when plugging external posteriors into BoTorch to maintain compatibility.
  • Expect to lose differentiability, and with it gradient-based acquisition optimization, if you feed in non-PyTorch samples (see the sketch below).
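A minimal sketch of what staying in PyTorch buys you, assuming BoTorch's stock SingleTaskGP surrogate and the analytic ExpectedImprovement acquisition (the toy data and shapes are illustrative, not from the episode): because the candidate point is a torch tensor, the acquisition value stays on the autograd graph and its gradient with respect to the candidate is available, which is exactly what BoTorch's gradient-based optimizers rely on.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.acquisition import ExpectedImprovement

# Toy training data as double-precision torch tensors (the usual BoTorch convention).
train_X = torch.rand(10, 2, dtype=torch.double)
train_Y = train_X.sum(dim=-1, keepdim=True).sin()

model = SingleTaskGP(train_X, train_Y)
acq = ExpectedImprovement(model, best_f=train_Y.max())

# Candidate as a torch tensor with requires_grad=True: the acquisition value
# remains differentiable with respect to the candidate location.
X_cand = torch.rand(1, 1, 2, dtype=torch.double, requires_grad=True)
acq(X_cand).backward()
print(X_cand.grad)  # a gradient, not None: differentiability is preserved

# By contrast, posterior samples handed over as NumPy arrays arrive detached
# from any autograd graph, so acquisition optimization falls back to
# derivative-free methods.
```

If your surrogate lives outside PyTorch, you can still wrap its predictions as torch tensors for compatibility, but anything computed outside autograd contributes no gradients to the optimization.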
ADVICE

Start With PyTorch And Built-In Models

  • Start with a PyTorch model to get the smoothest out-of-the-box experience with BoTorch.
  • If you prefer not to bring your own model, use BoTorch's packaged GP implementations (a sketch of this workflow follows the list).
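As a companion to the advice above, here is a minimal sketch of the out-of-the-box loop using the standard BoTorch entry points (SingleTaskGP, ExactMarginalLogLikelihood, fit_gpytorch_mll, ExpectedImprovement, optimize_acqf); the toy objective and bounds are placeholders for your own problem.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition import ExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

# Toy objective on [0, 1]^2; in practice this is your expensive black box.
def objective(X):
    return -(X - 0.5).pow(2).sum(dim=-1, keepdim=True)

train_X = torch.rand(8, 2, dtype=torch.double)
train_Y = objective(train_X)

# Packaged GP surrogate with default kernel and priors, fit by maximizing
# the exact marginal log likelihood.
model = SingleTaskGP(train_X, train_Y)
mll = ExactMarginalLogLikelihood(model.likelihood, model)
fit_gpytorch_mll(mll)

# Optimize an analytic acquisition function over the box bounds.
acq = ExpectedImprovement(model, best_f=train_Y.max())
bounds = torch.tensor([[0.0, 0.0], [1.0, 1.0]], dtype=torch.double)
candidate, acq_value = optimize_acqf(
    acq, bounds=bounds, q=1, num_restarts=5, raw_samples=32
)
print(candidate)  # next point to evaluate
```

From here the loop repeats: evaluate the objective at the candidate, append the result to the training data, refit the GP, and optimize the acquisition again.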
INSIGHT

PyTorch Made Differentiable Research Practical

  • Choosing PyTorch let the developers leverage existing autograd rather than reimplementing gradients.
  • Proximity to PyTorch developers at Meta made integration and ecosystem membership easier.