
Learning Bayesian Statistics

Latest episodes

Jul 24, 2024 • 1h 26min

#111 Nerdinsights from the Football Field, with Patrick Ward

Guest Patrick Ward discusses applying Bayesian statistics in sports analytics, challenges in communicating concepts to non-statistical decision-makers, predicting the impact of training load on athlete performance, analyzing player tracking data, and creating a comprehensive Bayesian model in team sports.
Jul 10, 2024 • 1h 12min

#110 Unpacking Bayesian Methods in AI with Sam Duffield

Sam Duffield discusses leveraging Bayesian methods in AI, focusing on mini-batch techniques, approximate inference, thermodynamic computing, and the Posteriors Python package. He simplifies complex concepts for non-expert audiences and highlights the role of temperature in Bayesian models, stochastic gradient MCMC, and uncertainty quantification for improved predictions.
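Stochastic gradient MCMC, one of the mini-batch techniques mentioned above, is easy to sketch. Below is a minimal stochastic gradient Langevin dynamics (SGLD) loop on a toy conjugate model; this is a generic illustration with made-up constants, not the API of the Posteriors package.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: y_i ~ Normal(theta, 1) with a Normal(0, sqrt(10)) prior on theta.
# With N large, full-gradient MCMC is wasteful; SGLD uses a mini-batch
# estimate of the log-posterior gradient plus injected Gaussian noise.
N = 10_000
data = rng.normal(2.0, 1.0, size=N)

def grad_log_post(theta, batch):
    # Unbiased gradient estimate: grad log prior + (N / |batch|) * batch term.
    return -theta / 10.0 + (N / len(batch)) * np.sum(batch - theta)

eps = 1e-5            # step size (fixed here; usually decayed in practice)
theta, samples = 0.0, []
for _ in range(5_000):
    batch = rng.choice(data, size=100, replace=False)
    # Scaling this noise by sqrt(T) instead would target the tempered
    # posterior p(theta | y)^(1/T): the "temperature" knob discussed above.
    theta += 0.5 * eps * grad_log_post(theta, batch) + np.sqrt(eps) * rng.normal()
    samples.append(theta)

print(np.mean(samples[1_000:]))   # close to the true mean, 2.0
```

The temperature Sam mentions enters exactly at the noise term: injecting noise scaled by sqrt(T) samples from a tempered posterior rather than the Bayesian one.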
Jun 25, 2024 • 1h 11min

#109 Prior Sensitivity Analysis, Overfitting & Model Selection, with Sonja Winter

Sonja Winter, an Assistant Professor at the University of Missouri specializing in Bayesian methods for educational research, dives into intriguing discussions. She elaborates on the importance of prior sensitivity analysis for robust findings and how Bayesian techniques elegantly address missing data issues. The conversation also tackles the challenges of overfitting in structural equation modeling, emphasizing the need for caution in model selection. Winter's insights highlight the transformative potential of Bayesian approaches in navigating complex educational data.
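To make prior sensitivity analysis concrete, here is a minimal sketch, assuming a toy conjugate Normal-Normal model rather than anything from Winter's research: the same data are reanalyzed under several prior scales, and the spread of the resulting posteriors is the sensitivity diagnostic.

```python
import numpy as np

# Conjugate Normal-Normal model: y_i ~ Normal(mu, sigma^2) with sigma known
# and prior mu ~ Normal(0, tau^2). The posterior is available in closed form,
# so we can watch directly how conclusions move as the prior scale tau varies.
rng = np.random.default_rng(42)
sigma, n = 1.0, 20
y = rng.normal(0.5, sigma, size=n)        # small sample, so the prior matters

for tau in (0.1, 1.0, 10.0):
    post_var = 1.0 / (1.0 / tau**2 + n / sigma**2)
    post_mean = post_var * y.sum() / sigma**2
    print(f"tau = {tau:5.1f} -> posterior mean {post_mean:.3f}, sd {post_var**0.5:.3f}")
```

If the posterior mean barely moves between tau = 0.1 and tau = 10, the findings are robust to the prior; large swings signal exactly the fragility the episode warns about.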
Jun 14, 2024 • 1h 18min

#108 Modeling Sports & Extracting Player Values, with Paul Sabin

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me

Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work!
Visit our Patreon page to unlock exclusive Bayesian swag ;)

Takeaways:
- Convincing non-stats stakeholders in sports analytics can be challenging, but building trust and confirming their prior beliefs can help in gaining acceptance.
- Combining subjective beliefs with objective data in Bayesian analysis leads to more accurate forecasts.
- The availability of massive data sets has revolutionized sports analytics, allowing for more complex and accurate models.
- Sports analytics models should consider factors like rest, travel, and altitude to capture the full picture of team performance.
- The impact of budget on team performance in American sports and the use of plus-minus models in basketball and American football are important considerations in sports analytics (see the code sketch after these show notes).
- The future of sports analytics lies in making analysis more accessible and digestible for everyday fans.
- There is a need for more focus on estimating distributions and variance around estimates in sports analytics.
- AI tools can empower analysts to do their own analysis and make better decisions, but it's important to ensure they understand the assumptions and structure of the data.
- Measuring the value of certain positions, such as midfielders in soccer, is a challenging problem in sports analytics.
- Game theory plays a significant role in sports strategies, and optimal strategies can change over time as the game evolves.

Chapters:
00:00 Introduction and Overview
09:27 The Power of Bayesian Analysis in Sports Modeling
16:28 The Revolution of Massive Data Sets in Sports Analytics
31:03 The Impact of Budget in Sports Analytics
39:35 Introduction to Sports Analytics
52:22 Plus-Minus Models in American Football
01:04:11 The Future of Sports Analytics

Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary, Blake Walters, Jonathan Morgan and Francesco Madrisotti.

Links from the show:
- LBS Sports Analytics playlist: https://www.youtube.com/playlist?list=PL7RjIaSLWh5kDiPVMUSyhvFaXL3NoXOe4
- Paul's website: https://sabinanalytics.com/
- Paul on GitHub: https://github.com/sabinanalytics
- Paul on LinkedIn: https://www.linkedin.com/in/rpaulsabin/
- Paul on Twitter: https://twitter.com/SabinAnalytics
- Paul on Google Scholar: https://scholar.google.com/citations?user=wAezxZ4AAAAJ&hl=en
- Soccer Power Ratings & Projections: https://sabinanalytics.com/ratings/soccer/
- Estimating player value in American football using plus–minus models: https://www.degruyter.com/document/doi/10.1515/jqas-2020-0033/html
- World Football R Package: https://github.com/JaseZiv/worldfootballR

Transcript:
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
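A minimal sketch of the regularized plus-minus idea referenced in these notes, with simulated data and hypothetical constants (this is the generic RAPM-style setup, not Paul's actual model): each stint's lineup becomes a signed indicator row, and ridge regression, which is the MAP estimate under independent Gaussian priors on player effects, shrinks noisy player values toward zero.

```python
import numpy as np

# Toy regularized plus-minus model. Each row ("stint") encodes who is on the
# court: +1 for home players, -1 for away players. The target is the point
# differential for that stint. The ridge solution below is the MAP estimate
# under independent Normal(0, 1/lam) priors on player effects, which is why
# plus-minus models are naturally framed in Bayesian terms.
rng = np.random.default_rng(1)
n_players, n_stints = 30, 2000
true_effect = rng.normal(0.0, 1.0, n_players)       # hypothetical player values

X = np.zeros((n_stints, n_players))
for i in range(n_stints):
    home = rng.choice(n_players, 5, replace=False)
    away = rng.choice(np.setdiff1d(np.arange(n_players), home), 5, replace=False)
    X[i, home], X[i, away] = 1.0, -1.0

y = X @ true_effect + rng.normal(0.0, 5.0, n_stints)  # noisy point differential

lam = 25.0  # shrinkage strength (noise variance / prior variance ratio)
beta = np.linalg.solve(X.T @ X + lam * np.eye(n_players), X.T @ y)

print("top-5 estimated players:", np.argsort(beta)[::-1][:5])
```

Replacing the ridge solve with explicit Gaussian priors in a probabilistic programming language yields full posterior distributions over player values, the "variance around estimates" the takeaways call for.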
May 29, 2024 • 1h 22min

#107 Amortized Bayesian Inference with Deep Neural Networks, with Marvin Schmitt

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me

In this episode, Marvin Schmitt introduces the concept of amortized Bayesian inference, where the upfront training phase of a neural network is followed by fast posterior inference (a toy sketch of the idea appears after these show notes).

Marvin will guide us through this new concept, discussing his work in probabilistic machine learning and uncertainty quantification, using Bayesian inference with deep neural networks. He also introduces BayesFlow, a Python library for amortized Bayesian workflows, and discusses its use cases in various fields, while also touching on the concept of deep fusion and its relation to multimodal simulation-based inference.

A PhD student in computer science at the University of Stuttgart, Marvin is supervised by two LBS guests you surely know — Paul Bürkner and Aki Vehtari. Marvin's research combines deep learning and statistics to make Bayesian inference fast and trustworthy. In his free time, Marvin enjoys board games and is a passionate guitar player.

Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero, Will Geary and Blake Walters.

Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

Takeaways:
- Amortized Bayesian inference combines deep learning and statistics to make posterior inference fast and trustworthy.
- Bayesian neural networks can be used for full Bayesian inference on neural network weights.
- Amortized Bayesian inference decouples the training phase and the posterior inference phase, making posterior sampling much faster.
- BayesFlow is a Python library for amortized Bayesian workflows, providing a user-friendly interface and modular architecture.
- Self-consistency loss is a technique that combines simulation-based inference and likelihood-based Bayesian inference, with a focus on amortization.
- The BayesFlow package aims to make amortized Bayesian inference more accessible and provides sensible default values for neural networks.
- Deep fusion techniques allow for the fusion of multiple sources of information in neural networks.
- Generative models that are expressive and have one-step inference are an emerging topic in deep learning and probabilistic machine learning.
- Foundation models, which have a large training set and can handle out-of-distribution cases, are another intriguing area of research.

Chapters:
00:00 Introduction to Amortized Bayesian Inference
07:39 Bayesian Neural Networks
11:47 Amortized Bayesian Inference and Posterior Inference
23:20 BayesFlow: A Python Library for Amortized Bayesian Workflows
38:15 Self-Consistency Loss: Bridging Simulation-Based Inference and Likelihood-Based Bayesian Inference
41:35 Amortized Bayesian Inference
43:53 Fusing Multiple Sources of Information
45:19 Compensating for Missing Data
56:17 Emerging Topics: Expressive Generative Models and Foundation Models
01:06:18 The Future of Deep Learning and Probabilistic Machine Learning

Links from the show:
- Marvin's website: https://www.marvinschmitt.com/
- Marvin on GitHub: https://github.com/marvinschmitt
- Marvin on LinkedIn: https://www.linkedin.com/in/marvin-schmitt/
- Marvin on Twitter: https://twitter.com/MarvinSchmittML
- The BayesFlow package for amortized Bayesian workflows: https://bayesflow.org/
- BayesFlow Forums for users: https://discuss.bayesflow.org
- BayesFlow software paper (JOSS): https://joss.theoj.org/papers/10.21105/joss.05702
- Tutorial on amortized Bayesian inference with BayesFlow (Python): https://colab.research.google.com/drive/1ub9SivzBI5fMbSTwVM1pABsMlRupgqRb?usp=sharing
- Towards Reliable Amortized Bayesian Inference: https://www.marvinschmitt.com/speaking/pdf/slides_reliable_abi_botb.pdf
- Expand the model space that we amortize over (multiverse analyses, power scaling, …): "Sensitivity-Aware Amortized Bayesian Inference" https://arxiv.org/abs/2310.11122
- Use heterogeneous data sources in amortized inference: "Fuse It or Lose It: Deep Fusion for Multimodal Simulation-Based Inference" https://arxiv.org/abs/2311.10671
- Use likelihood density information (explicit or even learned on the fly): "Leveraging Self-Consistency for Data-Efficient Amortized Bayesian Inference" https://arxiv.org/abs/2310.04395
- LBS #98 Fusing Statistical Physics, Machine Learning & Adaptive MCMC, with Marylou Gabrié: https://learnbayesstats.com/episode/98-fusing-statistical-physics-machine-learning-adaptive-mcmc-marylou-gabrie/
- LBS #101 Black Holes Collisions & Gravitational Waves, with LIGO Experts Christopher Berry & John Veitch: https://learnbayesstats.com/episode/101-black-holes-collisions-gravitational-waves-ligo-experts-christopher-berry-john-veitch/
- Deep Learning book: https://www.deeplearningbook.org/
- Statistical Rethinking: https://xcelab.net/rm/

Transcript:
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
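Here is a deliberately tiny sketch of the amortization idea, assuming PyTorch, a Gaussian posterior head instead of BayesFlow's normalizing flows, and raw observations instead of learned summary networks; it is not the BayesFlow API. The expensive part, simulating and training, happens once; afterwards, posterior inference for any new dataset is a single forward pass.

```python
import torch
import torch.nn as nn

# Generative model: theta ~ N(0, 1), x_1..x_10 | theta ~ N(theta, 1).
def simulate(batch_size, n_obs=10):
    theta = torch.randn(batch_size, 1)
    x = theta + torch.randn(batch_size, n_obs)
    return theta, x

# The network maps a dataset to a Gaussian approximation of the posterior
# over theta: amortization in its simplest form.
net = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(5_000):             # upfront training phase
    theta, x = simulate(256)
    mu, log_sigma = net(x).chunk(2, dim=-1)
    # Negative log-likelihood of the true theta under the predicted posterior;
    # minimizing this in expectation drives the network toward the posterior.
    loss = (log_sigma + 0.5 * ((theta - mu) / log_sigma.exp()) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# "Inference" on a new dataset is now a single forward pass, no MCMC required.
x_new = 0.8 + torch.randn(1, 10)
mu, log_sigma = net(x_new).chunk(2, dim=-1)
# For this conjugate model the exact posterior is
# N(n * x_bar / (n + 1), 1 / (n + 1)), so the approximation can be checked.
print(mu.item(), log_sigma.exp().item())
```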
May 16, 2024 • 1h 17min

#106 Active Statistics, Two Truths & a Lie, with Andrew Gelman

Andrew Gelman, an acclaimed statistician and author, discusses his new book, Active Statistics. He explores the significance of engaging teaching methods that emphasize storytelling and active participation in statistics education. Gelman critiques traditional grading systems in the U.S. and France, highlighting how cultural perspectives shape learning experiences. The conversation also delves into challenges in teaching causal analysis and the importance of innovative strategies in making Bayesian statistics accessible to all.
May 2, 2024 • 1h 15min

#105 The Power of Bayesian Statistics in Glaciology, with Andy Aschwanden & Doug Brinkerhoff

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me

In this episode, Andy Aschwanden and Doug Brinkerhoff tell us about their work in glaciology and the application of Bayesian statistics in studying glaciers. They discuss the use of computer models and data analysis in understanding glacier behavior and predicting sea level rise, and a lot of other fascinating topics.

Andy grew up in the Swiss Alps, and studied Earth Sciences, with a focus on atmospheric and climate science and glaciology. After his PhD, Andy moved to Fairbanks, Alaska, and became involved with the Parallel Ice Sheet Model, the first open-source and openly-developed ice sheet model.

His first PhD student was no other than… Doug Brinkerhoff! Doug did an MS in computer science at the University of Montana, focusing on numerical methods for ice sheet modeling, and then moved to Fairbanks to complete his PhD. While in Fairbanks, he became an ardent Bayesian after "seeing that uncertainty needs to be embraced rather than ignored". Doug has since moved back to Montana, becoming faculty in the University of Montana's computer science department.

Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero and Will Geary.

Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

Takeaways:
- Computer models and data analysis play a crucial role in understanding glacier behavior and predicting sea level rise.
- Reliable data, especially on ice thickness and climate forcing, are essential for accurate modeling.
- The collaboration between glaciology and Bayesian statistics has led to breakthroughs in understanding glacier evolution forecasts.
- There is a need for open-source packages and tools to make glaciological models more accessible.
- Glaciology and ice sheet modeling are complex fields that require collaboration between domain experts and data scientists.
- The use of Bayesian statistics in glaciology allows for a probabilistic framework to understand and communicate uncertainty in predictions (a toy calibration sketch follows these show notes).
- Real-time forecasting of glacier behavior is an exciting area of research that could provide valuable information for communities living near glaciers.
- There is a need for further research in understanding existing data sets and developing simpler methods to analyze them.
- The future of glaciology research lies in studying Alaskan glaciers and understanding the challenges posed by the changing Arctic environment.

Chapters:
00:00 Introduction and Background
08:54 The Role of Statistics in Glaciology
31:46 Open-Source Packages and Tools
52:06 The Power of Bayesian Statistics in Glaciology
01:06:34 Understanding Existing Data Sets and Developing Simpler Methods

Links from the show:
- Andy's website: https://glaciers.gi.alaska.edu/people/aschwanden
- Doug's website: https://dbrinkerhoff.org/
- Andy on GitHub: https://github.com/aaschwanden
- Doug on GitHub: https://github.com/douglas-brinkerhoff/
- Andy on Twitter: https://twitter.com/glacierandy?lang=fr
- Andy on Google Scholar: https://scholar.google.com/citations?user=CuvsLvMAAAAJ&hl=en
- Doug on Google Scholar: https://scholar.google.com/citations?user=FqU6ON8AAAAJ&hl=en
- LBS #64, Modeling the Climate & Gravity Waves, with Laura Mansfield: https://learnbayesstats.com/episode/64-modeling-climate-gravity-waves-laura-mansfield/
- Parallel Ice Sheet Model: www.pism.io
- PISM on GitHub: https://github.com/pism/pism
- Greenland View of Three Simulated Greenland Ice Sheet Response Scenarios: https://svs.gsfc.nasa.gov/4727/

Transcript:
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
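As a toy illustration of the probabilistic framework described above, here is a hypothetical one-parameter calibration: a made-up "glacier retreat" simulator inverted with Bayes' rule on a grid. Real ice sheet models such as PISM are vastly richer; only the inferential logic carries over.

```python
import numpy as np

# Hypothetical toy calibration: a simulator maps an uncertain physical
# parameter k (say, a flow/melt coefficient) to an observable (terminus
# retreat in meters), and noisy observations are inverted with Bayes' rule.
def simulator(k, years=np.arange(10)):
    return 12.0 * k * years          # linear toy physics, purely illustrative

k_true, obs_sigma = 0.7, 5.0
rng = np.random.default_rng(3)
obs = simulator(k_true) + rng.normal(0.0, obs_sigma, 10)

k_grid = np.linspace(0.0, 2.0, 400)
log_prior = np.zeros_like(k_grid)    # flat prior over a plausible range
log_lik = np.array([
    -0.5 * np.sum((obs - simulator(k)) ** 2) / obs_sigma**2 for k in k_grid
])
log_post = log_lik + log_prior
post = np.exp(log_post - log_post.max())
post /= np.trapz(post, k_grid)       # normalized posterior density on the grid

print(f"posterior mean k = {np.trapz(k_grid * post, k_grid):.3f} (true {k_true})")
```

The output is a full distribution over the parameter, not a single tuned value: the "embrace uncertainty rather than ignore it" stance Doug describes.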
Apr 16, 2024 • 1h 31min

#104 Automated Gaussian Processes & Sequential Monte Carlo, with Feras Saad

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me

GPs are extremely powerful… but hard to handle. One of the bottlenecks is learning the appropriate kernel. What if you could learn the structure of GP kernels automatically? Sounds really cool, but also a bit futuristic, doesn't it?

Well, think again, because in this episode, Feras Saad will teach us how to do just that! Feras is an Assistant Professor in the Computer Science Department at Carnegie Mellon University. He received his PhD in Computer Science from MIT, and, most importantly for our conversation, he's the creator of AutoGP.jl, a Julia package for automatic Gaussian process modeling.

Feras discusses the implementation of AutoGP, how it scales, what you can do with it, and how you can integrate its outputs in your models.

Finally, Feras provides an overview of Sequential Monte Carlo and its usefulness in AutoGP, highlighting the ability of SMC to incorporate new data in a streaming fashion and explore multiple modes efficiently.

Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell and Gal Kampel.

Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

Takeaways:
- AutoGP is a Julia package for automatic Gaussian process modeling that learns the structure of GP kernels automatically.
- It addresses the challenge of making structural choices for covariance functions by using a symbolic language and a recursive grammar to infer the expression of the covariance function given the observed data (a tiny caricature of this search follows these show notes).
- AutoGP incorporates sequential Monte Carlo inference to handle scalability and uncertainty in structure learning.
- The package is implemented in Julia using the Gen probabilistic programming language, which provides support for sequential Monte Carlo and involutive MCMC.
- Sequential Monte Carlo (SMC) and involutive MCMC are used in AutoGP to infer the structure of the model.
- Integrating probabilistic models with language models can improve interpretability and trustworthiness in data-driven inferences.
- Challenges in Bayesian workflows include the need for automated model discovery and scalability of inference algorithms.
- Future developments in probabilistic reasoning systems include unifying people around data-driven inferences and improving the scalability and configurability of inference algorithms.

Chapters:
00:00 Introduction to AutoGP
26:28 Automatic Gaussian Process Modeling
45:05 AutoGP: Automatic Discovery of Gaussian Process Model Structure
53:39 Applying AutoGP to New Settings
01:09:27 The Biggest Hurdle in the Bayesian Workflow
01:19:14 Unifying People Around Data-Driven Inferences

Links from the show:
- Sign up to the Fast & Efficient Gaussian Processes modeling webinar: https://topmate.io/alex_andorra/901986
- Feras' website: https://www.cs.cmu.edu/~fsaad/
- LBS #3.1, What is Probabilistic Programming & Why use it, with Colin Carroll: https://learnbayesstats.com/episode/3-1-what-is-probabilistic-programming-why-use-it-with-colin-carroll/
- LBS #3.2, How to use Bayes in industry, with Colin Carroll: https://learnbayesstats.com/episode/3-2-how-to-use-bayes-in-industry-with-colin-carroll/
- LBS #21, Gaussian Processes, Bayesian Neural Nets & SIR Models, with Elizaveta Semenova: https://learnbayesstats.com/episode/21-gaussian-processes-bayesian-neural-nets-sir-models-with-elizaveta-semenova/
- LBS #29, Model Assessment, Non-Parametric Models, And Much More, with Aki Vehtari: https://learnbayesstats.com/episode/model-assessment-non-parametric-models-aki-vehtari/
- LBS #63, Media Mix Models & Bayes for Marketing, with Luciano Paz: https://learnbayesstats.com/episode/63-media-mix-models-bayes-marketing-luciano-paz/
- LBS #83, Multilevel Regression, Post-Stratification & Electoral Dynamics, with Tarmo Jüristo: https://learnbayesstats.com/episode/83-multilevel-regression-post-stratification-electoral-dynamics-tarmo-juristo/
- AutoGP.jl, a Julia package for learning the covariance structure of Gaussian process time series models: https://probsys.github.io/AutoGP.jl/stable/
- Sequential Monte Carlo Learning for Time Series Structure Discovery: https://arxiv.org/abs/2307.09607
- Street Epistemology: https://www.youtube.com/@magnabosco210
- You Are Not So Smart podcast: https://youarenotsosmart.com/podcast/
- How Minds Change: https://www.davidmcraney.com/howmindschangehome
- Josh Tenenbaum's lectures on computational cognitive science: https://www.youtube.com/playlist?list=PLUl4u3cNGP61RTZrT3MIAikp2G5EEvTjf

Transcript:
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
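The structure search at the heart of AutoGP can be caricatured in a few lines. The sketch below is Python rather than Julia, scores an exhaustive enumeration of depth-two kernel compositions instead of running AutoGP's sequential Monte Carlo over a recursive grammar, and fixes all kernel hyperparameters at made-up values; it only illustrates the "grammar of kernels ranked by marginal likelihood" idea.

```python
import numpy as np
from itertools import product

# Base kernels with fixed, illustrative hyperparameters.
def rbf(x1, x2, ell=1.0):
    return np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / ell**2)

def periodic(x1, x2, p=1.0, ell=1.0):
    d = np.abs(x1[:, None] - x2[None, :])
    return np.exp(-2 * np.sin(np.pi * d / p) ** 2 / ell**2)

def linear(x1, x2):
    return x1[:, None] * x2[None, :]

base = {"RBF": rbf, "Per": periodic, "Lin": linear}
ops = {"+": np.add, "*": np.multiply}

def log_marginal(K, y, noise=0.1):
    # Standard GP log marginal likelihood, log N(y | 0, K + noise^2 I).
    L = np.linalg.cholesky(K + noise**2 * np.eye(len(y)))
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.log(np.diag(L)).sum() - 0.5 * len(y) * np.log(2 * np.pi)

# Synthetic series with a seasonal component plus a trend.
x = np.linspace(0, 4, 60)
y = np.sin(2 * np.pi * x) + 0.3 * x + 0.1 * np.random.default_rng(0).normal(size=60)

scores = {}
for (n1, k1), (n2, k2), (op_name, op) in product(base.items(), base.items(), ops.items()):
    K = op(k1(x, x), k2(x, x))
    scores[f"{n1} {op_name} {n2}"] = log_marginal(K, y)

print(max(scores, key=scores.get))   # e.g. "Per + Lin" for trend + seasonality
```

AutoGP goes much further: it also infers the hyperparameters and maintains a particle approximation to the posterior over structures, which is what lets it quantify uncertainty about the kernel itself.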
Apr 5, 2024 • 1h 15min

#103 Improving Sampling Algorithms & Prior Elicitation, with Arto Klami

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me

Changing perspective is often a great way to solve burning research problems. Riemannian spaces are such a perspective change, as Arto Klami, an Associate Professor of computer science at the University of Helsinki and member of the Finnish Center for Artificial Intelligence, will tell us in this episode.

He explains the concept of Riemannian spaces, their application in inference algorithms, how they can help sample Bayesian models, and their similarity with normalizing flows, which we discussed in episode 98.

Arto also introduces PreliZ, a tool for prior elicitation, and highlights its benefits in simplifying the process of setting priors, thus improving the accuracy of our models.

When Arto is not solving mathematical equations, you'll find him cycling, or around a good board game.

Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser and Julio.

Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

Takeaways:
- Riemannian spaces offer a way to improve computational efficiency and accuracy in Bayesian inference by considering the curvature of the posterior distribution.
- Riemannian spaces can be used in Laplace approximation and Markov chain Monte Carlo algorithms to better model the posterior distribution and explore challenging areas of the parameter space.
- Normalizing flows are a complementary approach to Riemannian spaces, using non-linear transformations to warp the parameter space and improve sampling efficiency.
- Evaluating the performance of Bayesian inference algorithms in challenging cases is a current research challenge, and more work is needed to establish benchmarks and compare different methods.
- PreliZ is a package for prior elicitation in Bayesian modeling that facilitates communication with users through visualizations of predictive and parameter distributions (a minimal quantile-matching sketch follows these show notes).
- Careful prior specification is important, and tools like PreliZ make the process easier and more reproducible.
- Teaching Bayesian machine learning is challenging due to the combination of statistical and programming concepts, but it is possible to teach the basic reasoning behind Bayesian methods to a diverse group of students.
- The integration of Bayesian approaches in data science workflows is becoming more accepted, especially in industries that already use deep learning techniques.
- The future of Bayesian methods in AI research may involve the development of AI assistants for Bayesian modeling and probabilistic reasoning.

Chapters:
00:00 Introduction and Background
02:05 Arto's Work and Background
06:05 Introduction to Bayesian Inference
12:46 Riemannian Spaces in Bayesian Inference
27:24 Availability of Riemannian-Based Algorithms
30:20 Practical Applications and Evaluation
37:33 Introduction to PreliZ
38:03 Prior Elicitation
39:01 Predictive Elicitation Techniques
39:30 PreliZ: Interface with Users
40:27 PreliZ: General Purpose Tool
41:55 Getting Started with PreliZ
42:45 Challenges of Setting Priors
45:10 Reproducibility and Transparency in Priors
46:07 Integration of Bayesian Approaches in Data Science Workflows
55:11 Teaching Bayesian Machine Learning
01:06:13 The Future of Bayesian Methods with AI Research
01:10:16 Solving the Prior Elicitation Problem

Links from the show:
- LBS #29, Model Assessment, Non-Parametric Models, And Much More, with Aki Vehtari: https://learnbayesstats.com/episode/model-assessment-non-parametric-models-aki-vehtari/
- LBS #20, Regression and Other Stories, with Andrew Gelman, Jennifer Hill & Aki Vehtari: https://learnbayesstats.com/episode/20-regression-and-other-stories-with-andrew-gelman-jennifer-hill-aki-vehtari/
- LBS #98, Fusing Statistical Physics, Machine Learning & Adaptive MCMC, with Marylou Gabrié: https://learnbayesstats.com/episode/98-fusing-statistical-physics-machine-learning-adaptive-mcmc-marylou-gabrie/
- Arto's website: https://www.cs.helsinki.fi/u/aklami/
- Arto on Google Scholar: https://scholar.google.com/citations?hl=en&user=v8PeLGgAAAAJ
- Multi-source probabilistic inference group: https://www.helsinki.fi/en/researchgroups/multi-source-probabilistic-inference
- FCAI web page: https://fcai.fi
- Probabilistic AI summer school lectures: https://www.youtube.com/channel/UCcMwNzhpePJE3xzOP_3pqsw
- Keynote: "Better priors for everyone" by Arto Klami: https://www.youtube.com/watch?v=mEmiEHsfWyc&ab_channel=ProbabilisticAISchool
- Variational Inference and Optimization I by Arto Klami: https://www.youtube.com/watch?v=60USDNc1nE8&list=PLRy-VW__9hV8s--JkHXZvnd26KgjRP2ik&index=3&ab_channel=ProbabilisticAISchool
- PreliZ, a tool-box for prior elicitation: https://preliz.readthedocs.io/en/latest/
- AISTATS paper that presents the new computationally efficient metric in the context of MCMC: https://researchportal.helsinki.fi/en/publications/lagrangian-manifold-monte-carlo-on-monge-patches
- TMLR paper that scales up the solution for larger models, using the metric for sampling-based inference in deep learning: https://openreview.net/pdf?id=dXAuvo6CGI
- Riemannian Laplace approximation (to appear in AISTATS'24): https://arxiv.org/abs/2311.02766
- Prior Knowledge Elicitation: The Past, Present, and Future: https://projecteuclid.org/journals/bayesian-analysis/advance-publication/Prior-Knowledge-Elicitation-The-Past-Present-and-Future/10.1214/23-BA1381.full

Transcript:
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
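The core move in prior elicitation, turning a domain judgment into distribution parameters, can be sketched without special tooling. Below is a minimal quantile-matching example with SciPy, assuming a hypothetical elicited statement that 90% of the prior mass should sit between 1 and 6; PreliZ automates and visualizes workflows like this, but the snippet is a generic illustration, not PreliZ's API.

```python
import numpy as np
from scipy import optimize, stats

# Fit a Gamma prior whose 5% and 95% quantiles land at the elicited values
# (1 and 6 here, made up for the example), by minimizing the squared
# quantile mismatch over the shape and scale parameters.
def loss(params):
    a, scale = np.exp(params)                 # exponentiate to stay positive
    q05, q95 = stats.gamma.ppf([0.05, 0.95], a, scale=scale)
    return (q05 - 1.0) ** 2 + (q95 - 6.0) ** 2

res = optimize.minimize(loss, x0=np.log([2.0, 1.0]))
a, scale = np.exp(res.x)
print(f"Gamma(alpha={a:.2f}, scale={scale:.2f})")
print("check 5%/95% quantiles:", stats.gamma.ppf([0.05, 0.95], a, scale=scale))
```

Plotting the fitted prior and its predictive implications, which is the part PreliZ handles for you, is what makes the elicitation loop with domain experts fast and reproducible.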
Mar 20, 2024 • 1h 9min

#102 Bayesian Structural Equation Modeling & Causal Inference in Psychometrics, with Ed Merkle

Explore Bayesian Structural Equation Modeling (SEM) in psychometrics with Ed Merkle. Learn about the significance of BSEM, challenges in model estimation, and the blavaan package in R. Discover the role of Bayesian methods in forecasting and the wisdom of crowds. Dive into the complexities of prior distributions, model convergence, and the development of blavaan. Gain insights on Bayesian workflow challenges, model placement in research papers, and future developments in Bayesian psychometrics. The conversation closes with ideas for revitalizing math education and fostering interest in STEM fields.
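blavaan is an R package built on Stan, so as a rough illustration here is a minimal one-factor Bayesian confirmatory factor model in PyMC, with simulated data and hypothetical priors. It shows the kind of structure Bayesian SEM estimates (latent variables, loadings, residual variances); it is not Merkle's blavaan code.

```python
import numpy as np
import pymc as pm

# Simulate three observed indicators loading on a single latent trait.
rng = np.random.default_rng(7)
n = 300
eta_true = rng.normal(size=n)                        # latent trait
lam_true = np.array([1.0, 0.8, 1.2])                 # true loadings
Y = eta_true[:, None] * lam_true + rng.normal(0, 0.5, (n, 3))

with pm.Model() as cfa:
    eta = pm.Normal("eta", 0.0, 1.0, shape=n)        # latent variable, unit scale
    lam = pm.HalfNormal("lam", 1.0, shape=3)         # loadings, sign-constrained
    sigma = pm.HalfNormal("sigma", 1.0, shape=3)     # residual sds per indicator
    pm.Normal("Y", mu=eta[:, None] * lam, sigma=sigma, observed=Y)
    idata = pm.sample(1000, tune=1000, chains=2)

# Posterior means of the loadings should land near lam_true.
print(idata.posterior["lam"].mean(dim=("chain", "draw")).values)
```

Fixing the latent scale (here via a unit-variance trait and positive loadings) is the identification choice every SEM, Bayesian or not, has to make explicit.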
