Learning Bayesian Statistics

#139 Efficient Bayesian Optimization in PyTorch, with Max Balandat

Aug 20, 2025
Max Balandat, who leads the modeling and optimization team at Meta, discusses Bayesian optimization and the BoTorch library. He explains how BoTorch's tight integration with PyTorch gives researchers flexibility, and the conversation covers adaptive experimentation, the impact of LLMs on optimization, and the importance of communicating uncertainty clearly to stakeholders. Max also reflects on his transition from academia to industry, highlighting the value of collaboration in research.
AI Snips
INSIGHT

Flexible Research-Focused Optimization

  • BoTorch targets researchers and advanced users who need modular, flexible Bayesian optimization tools.
  • Its unique value is full PyTorch differentiability and Monte Carlo acquisition functions for rapid experimentation.
ADVICE

Prefer Differentiable MC Acquisition

  • Use Monte Carlo acquisition functions when you can sample differentiably from your model posterior.
  • This lets you swap surrogate models quickly and apply gradient-based optimization to the acquisition function itself (see the sketch after this list).
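
A minimal sketch of that workflow using BoTorch's public API (the toy training data and search bounds are placeholders, not from the episode): fit a SingleTaskGP surrogate, wrap it in the Monte Carlo acquisition function qExpectedImprovement, and optimize the acquisition with optimize_acqf, which runs gradient-based multi-start optimization. This assumes a recent BoTorch version that provides fit_gpytorch_mll.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from botorch.acquisition import qExpectedImprovement
from botorch.optim import optimize_acqf
from gpytorch.mlls import ExactMarginalLogLikelihood

# Toy observations on the unit square (placeholder data for illustration).
train_X = torch.rand(20, 2, dtype=torch.double)
train_Y = (train_X * 6.28).sin().sum(dim=-1, keepdim=True)

# Fit a GP surrogate by maximizing the exact marginal log likelihood.
model = SingleTaskGP(train_X, train_Y)
mll = ExactMarginalLogLikelihood(model.likelihood, model)
fit_gpytorch_mll(mll)

# Monte Carlo acquisition: qEI averages improvement over posterior samples
# drawn with the reparameterization trick, so the estimate is differentiable.
acqf = qExpectedImprovement(model=model, best_f=train_Y.max())

# Gradient-based, multi-start optimization of the acquisition
# over a joint q-batch of 2 candidate points.
candidates, acq_value = optimize_acqf(
    acq_function=acqf,
    bounds=torch.tensor([[0.0, 0.0], [1.0, 1.0]], dtype=torch.double),
    q=2,
    num_restarts=10,
    raw_samples=128,
)
print(candidates)
```

Because the surrogate only needs to expose a differentiable posterior, swapping SingleTaskGP for another model leaves the acquisition and optimization code unchanged.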
INSIGHT

Differentiability As A Design Anchor

  • Writing BoTorch in PyTorch enables end-to-end differentiability and easy integration with other PyTorch modules.
  • That choice accelerates research and lets users plug in encoders or pretrained networks seamlessly (see the sketch after this list).
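
A small illustration of that end-to-end differentiability (a sketch with placeholder data, not from the episode): acquisition values are ordinary PyTorch tensors, so autograd can compute gradients with respect to the candidate inputs, which is what makes gradient-based acquisition optimization and composition with other differentiable modules possible.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.acquisition import qExpectedImprovement

# Placeholder surrogate (default hyperparameters, no fitting needed here).
train_X = torch.rand(10, 2, dtype=torch.double)
train_Y = train_X.sum(dim=-1, keepdim=True)
model = SingleTaskGP(train_X, train_Y)
acqf = qExpectedImprovement(model=model, best_f=train_Y.max())

# A batch of one q-batch with 3 candidate points, tracked by autograd.
X = torch.rand(1, 3, 2, dtype=torch.double, requires_grad=True)

# The MC estimate is built from reparameterized posterior samples,
# so gradients flow all the way back to the candidate locations.
acqf(X).sum().backward()
print(X.grad)  # d(acquisition value)/dX
```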