
Learning Bayesian Statistics #142 Bayesian Trees & Deep Learning for Optimization & Big Data, with Gabriel Stechschulte
Oct 2, 2025

Gabriel Stechschulte is a software engineer specializing in Bayesian methods and optimization. He discusses the power of Bayesian Additive Regression Trees (BART) for uncertainty quantification and its re-implementation in Rust, which improves performance on big data. Gabriel explains how BART compares with other tree-ensemble models, why it tends to resist overfitting, and how its uncertainty estimates can be integrated into optimization frameworks for decision-making. He also emphasizes the importance of open-source communities, encouraging newcomers to contribute actively.
Scale Bayes With Pragmatic Strategies
- Reduce dataset size with aggregation before full Bayesian inference when data are massive.
- Use variational inference or GPU-accelerated frameworks like NumPyro when aggregation isn't feasible.
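The aggregation idea can be sketched with a toy example (my own illustration, not code from the episode): for Bernoulli outcomes grouped by a categorical feature, collapsing a million rows into per-group success/trial counts preserves the likelihood exactly, so a conjugate Beta prior yields the same posterior from the tiny aggregated table as from the raw data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "big" dataset: 1M Bernoulli outcomes across 3 groups
# (hypothetical data; the group column stands in for any categorical feature).
true_rates = np.array([0.2, 0.5, 0.8])
groups = rng.integers(0, 3, size=1_000_000)
y = rng.binomial(1, p=true_rates[groups])

# Aggregate to sufficient statistics: per-group successes and trials.
# The binomial likelihood of (successes, trials) equals the product of the
# row-level Bernoulli likelihoods, so no information is lost in the reduction.
successes = np.bincount(groups, weights=y)
trials = np.bincount(groups).astype(float)

# Conjugate Beta(1, 1) prior gives a closed-form posterior per group,
# computed from 3 aggregated rows instead of 1M raw ones.
alpha_post = 1.0 + successes
beta_post = 1.0 + (trials - successes)
post_mean = alpha_post / (alpha_post + beta_post)
print(post_mean)  # posterior means near the true rates
```

The same trick applies before running a full sampler: a model written against the aggregated binomial counts mixes far faster than one looping over every raw row.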
Rust Reimplementation Origin Story
- Gabriel reimplemented PyMC BART's particle Gibbs sampler in Rust to speed up inference and integrate with Python.
- He started by translating Python code into Rust and then optimized hotspots rather than rewriting from scratch.
BART Is Boosting With Bayesian Uncertainty
- BART combines many trees like boosting but provides a posterior over functions, giving principled uncertainty estimates.
- That uncertainty makes BART suitable as a surrogate in decision and optimization workflows where confidence matters.
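Why posterior uncertainty matters in optimization can be sketched with Thompson sampling over posterior draws — a generic pattern, not Gabriel's specific implementation. Given draws of a surrogate's predicted objective at candidate points (the kind of output a BART-style posterior provides; the numbers below are made up), sampling one draw and maximizing it explores uncertain candidates in proportion to their chance of being best, which a point estimate cannot do.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical surrogate posterior at 5 candidate points: normal summaries
# of 1000 posterior draws of the objective (stand-ins for BART draws).
post_mean = np.array([1.0, 1.2, 1.5, 1.4, 0.9])
post_sd = np.array([0.05, 0.05, 0.05, 0.8, 0.05])  # candidate 3 is uncertain
draws = rng.normal(post_mean, post_sd, size=(1000, 5))

# Thompson sampling: for each posterior draw, propose the argmax.
# Using the mean alone we would always pick index 2; the uncertain
# index 3 is still proposed often because it might be the best.
picks = draws.argmax(axis=1)
freq = np.bincount(picks, minlength=5) / len(picks)
print(freq)  # mass concentrated on indices 2 and 3
```

This is the sense in which a posterior over functions, rather than a single fitted function, makes a surrogate useful for sequential decision-making.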