

BITESIZE | Exploring Dynamic Regression Models, with David Kohns
Jun 18, 2025
In this engaging discussion, David Kohns, a researcher at Aalto University specializing in probabilistic programming, shares his insights on the future of Bayesian statistics. He explores the complexities of time series modeling and the importance of setting informative priors. The conversation highlights innovative tools like normalizing flows that streamline Bayesian inference. David also delves into how AI can assist prior elicitation, making Bayesian methods more accessible while stressing that practitioners still need to understand what their priors encode.
AI Snips
Expert Priors Needed in Time Series
- Setting priors from expert knowledge about predictive summary statistics remains an underdeveloped area of Bayesian time series modeling.
- There is significant potential in multivariate time series modeling where variables like inflation and GDP are modeled jointly with informed priors.
When Does Selection Matter?
- Whether variable or component selection is needed in a model depends heavily on the quality of the priors, such as R-squared priors (see the sketch after this list).
- Good priors can sometimes reduce the need for selection even in complex causal analysis scenarios.
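For concreteness, here is a minimal sketch of what an R-squared-style shrinkage prior can look like, written in NumPyro (my choice of library, not something specified in the episode). It follows the general R2D2-type construction: a Beta prior on the expected R² is translated into a global signal-to-noise scale, which a Dirichlet then splits across coefficients; all hyperparameter values are illustrative assumptions.

```python
import jax.numpy as jnp
import numpyro
import numpyro.distributions as dist

def r2_prior_regression(X, y=None):
    """Linear regression whose coefficient scales are driven by a prior on R^2.

    Sketch of an R2D2-style prior: a Beta prior on the share of variance
    explained is turned into a global scale, and a Dirichlet splits that
    scale across the individual coefficients.
    """
    p = X.shape[1]

    sigma = numpyro.sample("sigma", dist.HalfNormal(1.0))      # residual scale
    r2 = numpyro.sample("r2", dist.Beta(2.0, 2.0))             # prior belief about model fit
    tau2 = numpyro.deterministic("tau2", r2 / (1.0 - r2))      # implied signal-to-noise ratio
    phi = numpyro.sample("phi", dist.Dirichlet(jnp.ones(p)))   # share of variance per predictor

    beta = numpyro.sample("beta", dist.Normal(0.0, sigma * jnp.sqrt(tau2 * phi)))
    numpyro.sample("y", dist.Normal(jnp.dot(X, beta), sigma), obs=y)

# Sampling would then proceed as usual, e.g.:
#   import jax
#   from numpyro.infer import MCMC, NUTS
#   mcmc = MCMC(NUTS(r2_prior_regression), num_warmup=500, num_samples=1000)
#   mcmc.run(jax.random.PRNGKey(0), X, y)
```

The point of the construction is that prior beliefs are stated on an interpretable predictive quantity (how much variance the model should explain) rather than directly on coefficient scales.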
Use Normalizing Flows for Efficiency
- Explore normalizing flows and amortized Bayesian inference to simplify complex computations in large-scale Bayesian estimation.
- These methods can make Bayesian computational workflows much easier to run, and the trained approximations can be reused across models (see the sketch below).
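One concrete way to experiment with flow-based posterior approximations today is NumPyro's autoguides: the snippet below fits the model from the previous sketch with a block neural autoregressive flow guide via stochastic variational inference. The library and guide class are my illustrative choice, and this shows plain flow-based VI rather than the fully amortized workflows discussed in the episode.

```python
import jax
from numpyro.infer import SVI, Trace_ELBO
from numpyro.infer.autoguide import AutoBNAFNormal
from numpyro.optim import Adam

# Reuses r2_prior_regression (and data X, y) from the sketch above.
guide = AutoBNAFNormal(r2_prior_regression)                 # normalizing-flow variational family
svi = SVI(r2_prior_regression, guide, Adam(1e-3), Trace_ELBO())

svi_result = svi.run(jax.random.PRNGKey(0), 5_000, X, y)    # optimize the ELBO for 5,000 steps

# Draw approximate posterior samples from the trained flow.
posterior = guide.sample_posterior(
    jax.random.PRNGKey(1), svi_result.params, sample_shape=(1_000,)
)
```

Because the flow is just a parametric map, the same fitting code can be pointed at a different model with no changes beyond swapping the model function, which is part of what makes these workflows reusable.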