Learning Bayesian Statistics

#142 Bayesian Trees & Deep Learning for Optimization & Big Data, with Gabriel Stechschulte

Oct 2, 2025
Gabriel Stechschulte is a software engineer specializing in Bayesian methods and optimization. He discusses the power of Bayesian Additive Regression Trees (BART) for uncertainty quantification and its re-implementation in Rust, which improves performance on big data. Gabriel explores how BART contrasts with other models, its strengths in avoiding overfitting, and its integration into optimization frameworks for decision-making. He also emphasizes the importance of open-source communities, encouraging newcomers to contribute actively.
ADVICE

Scale Bayes With Pragmatic Strategies

  • Reduce dataset size with aggregation before full Bayesian inference when data are massive.
  • Use variational inference or GPU-accelerated frameworks like NumPyro when aggregation isn't feasible.
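A minimal sketch of the aggregation idea from the advice above (data and group names invented for illustration): collapsing individual Bernoulli outcomes into binomial counts per group leaves the likelihood, and hence the posterior, unchanged while shrinking the dataset from one row per observation to one row per group.

```python
from collections import defaultdict

# Hypothetical raw data: one row per observation (group, binary outcome).
raw = [("a", 1), ("a", 0), ("a", 1), ("b", 0), ("b", 0), ("a", 1), ("b", 1)]

# Aggregate to (successes, trials) per group: a binomial likelihood on these
# counts is equivalent to a Bernoulli likelihood on the raw rows, so full
# Bayesian inference on the aggregated table gives the same posterior.
counts = defaultdict(lambda: [0, 0])  # group -> [successes, trials]
for group, outcome in raw:
    counts[group][0] += outcome
    counts[group][1] += 1

print(dict(counts))  # {'a': [3, 4], 'b': [1, 3]}
```

The same trick applies whenever the likelihood factorizes over sufficient statistics; when it doesn't, that is where variational inference or a GPU-backed framework such as NumPyro comes in.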
ANECDOTE

Rust Reimplementation Origin Story

  • Gabriel reimplemented PyMC BART's particle Gibbs sampler in Rust to speed up inference and integrate with Python.
  • He started by translating the Python code into Rust, then optimized hotspots rather than rewriting from scratch.
INSIGHT

BART Is Boosting With Bayesian Uncertainty

  • BART combines many trees like boosting but provides a posterior over functions, giving principled uncertainty estimates.
  • That uncertainty makes BART suitable as a surrogate in decision and optimization workflows where confidence matters.
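A minimal sketch of the uncertainty idea above (the draw values are invented): because BART yields posterior draws of the predicted function rather than a single fit, a point prediction and a credible interval at any test point fall out of the same set of samples.

```python
import statistics

# Hypothetical posterior draws of f(x) at one test point, e.g. collected
# from a BART-style sum-of-trees sampler (values invented for illustration).
draws = [2.9, 3.1, 3.0, 3.4, 2.8, 3.2, 3.0, 2.7, 3.3, 3.1]

# Point prediction: posterior mean over the draws.
mean = statistics.fmean(draws)

# Roughly 80% credible interval: empirical 10th and 90th percentiles.
deciles = statistics.quantiles(draws, n=10)
lo, hi = deciles[0], deciles[-1]
print(f"mean={mean:.2f}, 80% CI=({lo:.2f}, {hi:.2f})")
```

It is this interval, not just the mean, that a decision or optimization loop (e.g. an acquisition function over a surrogate) consumes when confidence matters.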