
Learning Bayesian Statistics

Latest episodes

May 16, 2024 • 1h 17min

#106 Active Statistics, Two Truths & a Lie, with Andrew Gelman

Andrew Gelman, an acclaimed statistician and author, discusses his new book, Active Statistics. He explores the significance of engaging teaching methods that emphasize storytelling and active participation in statistics education. Gelman critiques traditional grading systems in the U.S. and France, highlighting how cultural perspectives shape learning experiences. The conversation also delves into challenges in teaching causal analysis and the importance of innovative strategies in making Bayesian statistics accessible to all.
May 2, 2024 • 1h 15min

#105 The Power of Bayesian Statistics in Glaciology, with Andy Aschwanden & Doug Brinkerhoff

Proudly sponsored by PyMC Labs, the Bayesian Consultancy. Book a call, or get in touch!
My Intuitive Bayes Online Courses
1:1 Mentorship with me

In this episode, Andy Aschwanden and Doug Brinkerhoff tell us about their work in glaciology and the application of Bayesian statistics in studying glaciers. They discuss the use of computer models and data analysis in understanding glacier behavior and predicting sea level rise, among many other fascinating topics.

Andy grew up in the Swiss Alps and studied Earth Sciences, with a focus on atmospheric and climate science and glaciology. After his PhD, Andy moved to Fairbanks, Alaska, and became involved with the Parallel Ice Sheet Model, the first open-source and openly developed ice sheet model.

His first PhD student was none other than… Doug Brinkerhoff! Doug did an MS in computer science at the University of Montana, focusing on numerical methods for ice sheet modeling, and then moved to Fairbanks to complete his PhD. While in Fairbanks, he became an ardent Bayesian after "seeing that uncertainty needs to be embraced rather than ignored". Doug has since moved back to Montana, becoming faculty in the University of Montana's computer science department.

Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, William Benton, James Ahloy, Robin Taylor, Chad Scherrer, Zwelithini Tunyiswa, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Ian Moran, Paul Oreto, Colin Caprani, Colin Carroll, Nathaniel Burbank, Michael Osthege, Rémi Louf, Clive Edelsten, Henri Wallen, Hugo Botha, Vinh Nguyen, Marcin Elantkowski, Adam C. Smith, Will Kurt, Andrew Moskowitz, Hector Munoz, Marco Gorelli, Simon Kessell, Bradley Rode, Patrick Kelley, Rick Anderson, Casper de Bruin, Philippe Labonde, Michael Hankin, Cameron Smith, Tomáš Frýda, Ryan Wesslen, Andreas Netti, Riley King, Yoshiyuki Hamajima, Sven De Maeyer, Michael DeCrescenzo, Fergal M, Mason Yahr, Naoya Kanai, Steven Rowland, Aubrey Clayton, Jeannine Sue, Omri Har Shemesh, Scott Anthony Robson, Robert Yolken, Or Duek, Pavel Dusek, Paul Cox, Andreas Kröpelin, Raphaël R, Nicolas Rode, Gabriel Stechschulte, Arkady, Kurt TeKolste, Gergely Juhasz, Marcus Nölke, Maggi Mackintosh, Grant Pezzolesi, Avram Aelony, Joshua Meehl, Javier Sabio, Kristian Higgins, Alex Jones, Gregorio Aguilar, Matt Rosinski, Bart Trudeau, Luis Fonseca, Dante Gates, Matt Niccolls, Maksim Kuznecov, Michael Thomas, Luke Gorrie, Cory Kiser, Julio, Edvin Saveljev, Frederick Ayala, Jeffrey Powell, Gal Kampel, Adan Romero and Will Geary.

Visit https://www.patreon.com/learnbayesstats to unlock exclusive Bayesian swag ;)

Takeaways:
- Computer models and data analysis play a crucial role in understanding glacier behavior and predicting sea level rise.
- Reliable data, especially on ice thickness and climate forcing, are essential for accurate modeling.
- The collaboration between glaciology and Bayesian statistics has led to breakthroughs in glacier evolution forecasts.
- There is a need for open-source packages and tools to make glaciological models more accessible.
- Glaciology and ice sheet modeling are complex fields that require collaboration between domain experts and data scientists.
- The use of Bayesian statistics in glaciology provides a probabilistic framework to understand and communicate uncertainty in predictions.
- Real-time forecasting of glacier behavior is an exciting area of research that could provide valuable information for communities living near glaciers.
- There is a need for further research in understanding existing data sets and developing simpler methods to analyze them.
- The future of glaciology research lies in studying Alaskan glaciers and understanding the challenges posed by the changing Arctic environment.

Chapters:
00:00 Introduction and Background
08:54 The Role of Statistics in Glaciology
31:46 Open-Source Packages and Tools
52:06 The Power of Bayesian Statistics in Glaciology
01:06:34 Understanding Existing Data Sets and Developing Simpler Methods

Links from the show:
Andy's website: https://glaciers.gi.alaska.edu/people/aschwanden
Doug's website: https://dbrinkerhoff.org/
Andy on GitHub: https://github.com/aaschwanden
Doug on GitHub: https://github.com/douglas-brinkerhoff/
Andy on Twitter: https://twitter.com/glacierandy?lang=fr
Andy on Google Scholar: https://scholar.google.com/citations?user=CuvsLvMAAAAJ&hl=en
Doug on Google Scholar: https://scholar.google.com/citations?user=FqU6ON8AAAAJ&hl=en
LBS #64, Modeling the Climate & Gravity Waves, with Laura Mansfield: https://learnbayesstats.com/episode/64-modeling-climate-gravity-waves-laura-mansfield/
Parallel Ice Sheet Model: www.pism.io
PISM on GitHub: https://github.com/pism/pism
Greenland View of Three Simulated Greenland Ice Sheet Response Scenarios: https://svs.gsfc.nasa.gov/4727/

Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
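One takeaway above, using a probabilistic framework to communicate uncertainty in predictions, can be made concrete with a toy forward-uncertainty sketch. Everything below (the melt-sensitivity parameter, the forward model, all the numbers) is invented purely for illustration; real ice-sheet work, e.g. with PISM, uses far richer physics and actual calibration data:

```python
# Toy of what "embracing uncertainty" buys you: push a distribution over an
# uncertain model parameter (a made-up melt sensitivity) through a made-up
# forward model instead of a single best guess, and report a predictive
# interval rather than one number.
import numpy as np

rng = np.random.default_rng(7)

def sea_level_contribution_mm(melt_sensitivity, warming_c=2.0, years=50):
    # Hypothetical forward model: contribution grows with warming and time.
    return melt_sensitivity * warming_c * years

# Imagined posterior belief about the sensitivity (mm / degree C / year):
# lognormal around 0.3 with a sizeable spread, as if from calibration.
sensitivity = rng.lognormal(mean=np.log(0.3), sigma=0.4, size=10_000)
contrib = sea_level_contribution_mm(sensitivity)

lo, mid, hi = np.percentile(contrib, [5, 50, 95])
print(f"median {mid:.0f} mm, 90% interval [{lo:.0f}, {hi:.0f}] mm")
```

The point of the interval, rather than the single median, is exactly the "communicate uncertainty" message from the episode: downstream decisions can weigh the plausible worst case, not just the best guess.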
Apr 16, 2024 • 1h 31min

#104 Automated Gaussian Processes & Sequential Monte Carlo, with Feras Saad

GPs are extremely powerful… but hard to handle. One of the bottlenecks is learning the appropriate kernel. What if you could learn the structure of GP kernels automatically? Sounds really cool, but also a bit futuristic, doesn't it?

Well, think again, because in this episode, Feras Saad will teach us how to do just that! Feras is an Assistant Professor in the Computer Science Department at Carnegie Mellon University. He received his PhD in Computer Science from MIT, and, most importantly for our conversation, he's the creator of AutoGP.jl, a Julia package for automatic Gaussian process modeling.

Feras discusses the implementation of AutoGP, how it scales, what you can do with it, and how you can integrate its outputs into your models.

Finally, Feras provides an overview of Sequential Monte Carlo and its usefulness in AutoGP, highlighting the ability of SMC to incorporate new data in a streaming fashion and explore multiple modes efficiently.

Takeaways:
- AutoGP is a Julia package for automatic Gaussian process modeling that learns the structure of GP kernels automatically.
- It addresses the challenge of making structural choices for covariance functions by using a symbolic language and a recursive grammar to infer the expression of the covariance function given the observed data.
- AutoGP incorporates sequential Monte Carlo inference to handle scalability and uncertainty in structure learning.
- The package is implemented in Julia using the Gen probabilistic programming language, which provides support for sequential Monte Carlo and involutive MCMC.
- Sequential Monte Carlo (SMC) and involutive MCMC are used in AutoGP to infer the structure of the model.
- Integrating probabilistic models with language models can improve interpretability and trustworthiness in data-driven inferences.
- Challenges in Bayesian workflows include the need for automated model discovery and scalability of inference algorithms.
- Future developments in probabilistic reasoning systems include unifying people around data-driven inferences and improving the scalability and configurability of inference algorithms.

Chapters:
00:00 Introduction to AutoGP
26:28 Automatic Gaussian Process Modeling
45:05 AutoGP: Automatic Discovery of Gaussian Process Model Structure
53:39 Applying AutoGP to New Settings
01:09:27 The Biggest Hurdle in the Bayesian Workflow
01:19:14 Unifying People Around Data-Driven Inferences

Links from the show:
Sign up to the Fast & Efficient Gaussian Processes modeling webinar: https://topmate.io/alex_andorra/901986
Feras' website: https://www.cs.cmu.edu/~fsaad/
LBS #3.1, What is Probabilistic Programming & Why use it, with Colin Carroll: https://learnbayesstats.com/episode/3-1-what-is-probabilistic-programming-why-use-it-with-colin-carroll/
LBS #3.2, How to use Bayes in industry, with Colin Carroll: https://learnbayesstats.com/episode/3-2-how-to-use-bayes-in-industry-with-colin-carroll/
LBS #21, Gaussian Processes, Bayesian Neural Nets & SIR Models, with Elizaveta Semenova: https://learnbayesstats.com/episode/21-gaussian-processes-bayesian-neural-nets-sir-models-with-elizaveta-semenova/
LBS #29, Model Assessment, Non-Parametric Models, And Much More, with Aki Vehtari: https://learnbayesstats.com/episode/model-assessment-non-parametric-models-aki-vehtari/
LBS #63, Media Mix Models & Bayes for Marketing, with Luciano Paz: https://learnbayesstats.com/episode/63-media-mix-models-bayes-marketing-luciano-paz/
LBS #83, Multilevel Regression, Post-Stratification & Electoral Dynamics, with Tarmo Jüristo: https://learnbayesstats.com/episode/83-multilevel-regression-post-stratification-electoral-dynamics-tarmo-juristo/
AutoGP.jl, a Julia package for learning the covariance structure of Gaussian process time series models: https://probsys.github.io/AutoGP.jl/stable/
Sequential Monte Carlo Learning for Time Series Structure Discovery: https://arxiv.org/abs/2307.09607
Street Epistemology: https://www.youtube.com/@magnabosco210
You Are Not So Smart podcast: https://youarenotsosmart.com/podcast/
How Minds Change: https://www.davidmcraney.com/howmindschangehome
Josh Tenenbaum's lectures on computational cognitive science: https://www.youtube.com/playlist?list=PLUl4u3cNGP61RTZrT3MIAikp2G5EEvTjf

Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
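For intuition on what "learning the structure of GP kernels" means, here is a deliberately tiny sketch: score a handful of composite kernels by log marginal likelihood and keep the best. This brute-force search is a stand-in, not AutoGP's actual algorithm (AutoGP.jl explores a recursive kernel grammar with sequential Monte Carlo, in Julia); the data and the candidate set below are invented for the demo, using scikit-learn's kernel algebra:

```python
# Toy GP kernel-structure search: build a small "grammar" of composite
# kernels (sums and products of base kernels), fit each, and rank them by
# log marginal likelihood.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ExpSineSquared, WhiteKernel

rng = np.random.default_rng(42)
X = np.linspace(0, 6, 80)[:, None]
# Synthetic series: periodicity + slow trend + noise.
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * X[:, 0] + 0.05 * rng.standard_normal(80)

base = {"RBF": RBF(1.0), "Periodic": ExpSineSquared(1.0, 1.0)}
candidates = dict(base)
candidates["RBF + Periodic"] = base["RBF"] + base["Periodic"]
candidates["RBF * Periodic"] = base["RBF"] * base["Periodic"]

scores = {}
for name, kernel in candidates.items():
    gp = GaussianProcessRegressor(kernel=kernel + WhiteKernel(1e-2),
                                  normalize_y=True)
    gp.fit(X, y)  # optimizes the kernel hyperparameters
    scores[name] = gp.log_marginal_likelihood_value_

best = max(scores, key=scores.get)
print(best, round(scores[best], 1))
```

AutoGP's key twist over this sketch is that it keeps a *posterior over structures* (via SMC particles) rather than a single winner, which is what lets it quantify structural uncertainty and absorb streaming data.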
Apr 5, 2024 • 1h 15min

#103 Improving Sampling Algorithms & Prior Elicitation, with Arto Klami

Changing perspective is often a great way to solve burning research problems. Riemannian spaces are such a perspective change, as Arto Klami, an Associate Professor of computer science at the University of Helsinki and a member of the Finnish Center for Artificial Intelligence, will tell us in this episode.

He explains the concept of Riemannian spaces, their application in inference algorithms, how they can help sample Bayesian models, and their similarity with normalizing flows, which we discussed in episode 98.

Arto also introduces PreliZ, a tool for prior elicitation, and highlights its benefits in simplifying the process of setting priors, thus improving the accuracy of our models.

When Arto is not solving mathematical equations, you'll find him cycling, or around a good board game.

Takeaways:
- Riemannian spaces offer a way to improve computational efficiency and accuracy in Bayesian inference by considering the curvature of the posterior distribution.
- Riemannian spaces can be used in Laplace approximation and Markov chain Monte Carlo algorithms to better model the posterior distribution and explore challenging areas of the parameter space.
- Normalizing flows are a complementary approach to Riemannian spaces, using non-linear transformations to warp the parameter space and improve sampling efficiency.
- Evaluating the performance of Bayesian inference algorithms in challenging cases is a current research challenge, and more work is needed to establish benchmarks and compare different methods.
- PreliZ is a package for prior elicitation in Bayesian modeling that facilitates communication with users through visualizations of predictive and parameter distributions.
- Careful prior specification is important, and tools like PreliZ make the process easier and more reproducible.
- Teaching Bayesian machine learning is challenging due to the combination of statistical and programming concepts, but it is possible to teach the basic reasoning behind Bayesian methods to a diverse group of students.
- The integration of Bayesian approaches in data science workflows is becoming more accepted, especially in industries that already use deep learning techniques.
- The future of Bayesian methods in AI research may involve the development of AI assistants for Bayesian modeling and probabilistic reasoning.

Chapters:
00:00 Introduction and Background
02:05 Arto's Work and Background
06:05 Introduction to Bayesian Inference
12:46 Riemannian Spaces in Bayesian Inference
27:24 Availability of Riemannian-Based Algorithms
30:20 Practical Applications and Evaluation
37:33 Introduction to PreliZ
38:03 Prior Elicitation
39:01 Predictive Elicitation Techniques
39:30 PreliZ: Interface with Users
40:27 PreliZ: General Purpose Tool
41:55 Getting Started with PreliZ
42:45 Challenges of Setting Priors
45:10 Reproducibility and Transparency in Priors
46:07 Integration of Bayesian Approaches in Data Science Workflows
55:11 Teaching Bayesian Machine Learning
01:06:13 The Future of Bayesian Methods with AI Research
01:10:16 Solving the Prior Elicitation Problem

Links from the show:
LBS #29, Model Assessment, Non-Parametric Models, And Much More, with Aki Vehtari: https://learnbayesstats.com/episode/model-assessment-non-parametric-models-aki-vehtari/
LBS #20, Regression and Other Stories, with Andrew Gelman, Jennifer Hill & Aki Vehtari: https://learnbayesstats.com/episode/20-regression-and-other-stories-with-andrew-gelman-jennifer-hill-aki-vehtari/
LBS #98, Fusing Statistical Physics, Machine Learning & Adaptive MCMC, with Marylou Gabrié: https://learnbayesstats.com/episode/98-fusing-statistical-physics-machine-learning-adaptive-mcmc-marylou-gabrie/
Arto's website: https://www.cs.helsinki.fi/u/aklami/
Arto on Google Scholar: https://scholar.google.com/citations?hl=en&user=v8PeLGgAAAAJ
Multi-source probabilistic inference group: https://www.helsinki.fi/en/researchgroups/multi-source-probabilistic-inference
FCAI web page: https://fcai.fi
Probabilistic AI summer school lectures: https://www.youtube.com/channel/UCcMwNzhpePJE3xzOP_3pqsw
Keynote, "Better priors for everyone" by Arto Klami: https://www.youtube.com/watch?v=mEmiEHsfWyc&ab_channel=ProbabilisticAISchool
Variational Inference and Optimization I by Arto Klami: https://www.youtube.com/watch?v=60USDNc1nE8&list=PLRy-VW__9hV8s--JkHXZvnd26KgjRP2ik&index=3&ab_channel=ProbabilisticAISchool
PreliZ, a tool-box for prior elicitation: https://preliz.readthedocs.io/en/latest/
AISTATS paper that presents the new computationally efficient metric in the context of MCMC: https://researchportal.helsinki.fi/en/publications/lagrangian-manifold-monte-carlo-on-monge-patches
TMLR paper that scales up the solution for larger models, using the metric for sampling-based inference in deep learning: https://openreview.net/pdf?id=dXAuvo6CGI
Riemannian Laplace approximation (to appear in AISTATS '24): https://arxiv.org/abs/2311.02766
Prior Knowledge Elicitation: The Past, Present, and Future: https://projecteuclid.org/journals/bayesian-analysis/advance-publication/Prior-Knowledge-Elicitation-The-Past-Present-and-Future/10.1214/23-BA1381.full

Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
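The prior elicitation idea discussed in the episode can be illustrated without any special tooling: translate a domain statement ("I'm 90% sure the value lies between 0 and 10") into concrete hyperparameters by matching quantiles. Packages like PreliZ automate and visualize this workflow; the scipy-based sketch below shows only the underlying idea and is not PreliZ's API:

```python
# Turn an elicited interval statement into Normal prior hyperparameters by
# solving for the sigma whose central `mass` interval equals [lower, upper].
from scipy import stats
from scipy.optimize import brentq

lower, upper, mass = 0.0, 10.0, 0.90

# For a symmetric interval, center the Normal at the midpoint...
mu = (lower + upper) / 2

# ...and find the sigma that puts exactly `mass` probability in the interval.
def coverage_gap(sigma):
    d = stats.norm(mu, sigma)
    return (d.cdf(upper) - d.cdf(lower)) - mass

sigma = brentq(coverage_gap, 1e-6, 100.0)
print(mu, round(sigma, 3))  # sigma = (upper - mu) / z_0.95 ≈ 3.04
```

The same quantile-matching trick works for asymmetric distributions (Gamma, LogNormal) with two-parameter root finding, which is where an elicitation tool's visual feedback starts to pay off.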
Mar 20, 2024 • 1h 9min

#102 Bayesian Structural Equation Modeling & Causal Inference in Psychometrics, with Ed Merkle

Explore Bayesian Structural Equation Modeling (SEM) in psychometrics with Ed Merkle. Learn about the significance of Bayesian SEM, challenges in model estimation, and the blavaan package in R. Discover the role of Bayesian methods in forecasting and wisdom crowdsourcing. Dive into the complexities of prior distributions, model convergence, and the development of blavaan. Gain insights on Bayesian workflow challenges, model placement in research papers, and future developments in Bayesian psychometrics. The conversation also touches on revolutionizing math education and fostering interest in STEM fields.
Mar 16, 2024 • 12min

How to find black holes with Bayesian inference

Listen to the full episode: https://learnbayesstats.com/episode/101-black-holes-collisions-gravitational-waves-ligo-experts-christopher-berry-john-veitch/
Watch the interview: https://www.youtube.com/watch?v=ZaZwCcrJlik
Mar 14, 2024 • 9min

How can we even hear gravitational waves?

Listen to the full episode: https://learnbayesstats.com/episode/101-black-holes-collisions-gravitational-waves-ligo-experts-christopher-berry-john-veitch/
Watch the interview: https://www.youtube.com/watch?v=ZaZwCcrJlik
Mar 7, 2024 • 1h 10min

#101 Black Holes Collisions & Gravitational Waves, with LIGO Experts Christopher Berry & John Veitch

In this episode, we dive deep into gravitational wave astronomy with Christopher Berry and John Veitch, two senior lecturers at the University of Glasgow and experts from the LIGO-Virgo collaboration. They explain the significance of detecting gravitational waves, which are essential for understanding black hole and neutron star collisions. This research not only sheds light on these distant events but also helps us grasp the fundamental workings of the universe.

Our discussion focuses on the integral role of Bayesian statistics, detailing how they use nested sampling to extract crucial information from the subtle signals of gravitational waves. This approach is vital for parameter estimation and for understanding the distribution of cosmic sources through population inference.

Concluding the episode, Christopher and John highlight the latest advancements in black hole astrophysics and tests of general relativity, and touch upon the exciting prospects and challenges of the upcoming space-based LISA mission.

Takeaways:
- Gravitational wave analysis involves using Bayesian statistics for parameter estimation and population inference.
- Nested sampling is a powerful algorithm used in gravitational wave analysis to explore parameter space and calculate the evidence for model selection.
- Machine learning techniques, such as normalizing flows, can be integrated with nested sampling to improve efficiency and explore complex distributions.
- The LIGO-Virgo collaboration operates gravitational wave detectors that measure distortions in space and time caused by black hole and neutron star collisions.
- Sources of noise in gravitational wave detection include laser noise, thermal noise, seismic motion, and gravitational coupling.
- The LISA mission is a space-based gravitational wave detector that aims to observe lower-frequency gravitational waves and unlock new astrophysical phenomena.
- Space-based detectors like LISA can avoid ground-based noise and observe a different part of the gravitational wave spectrum, providing new insights into the universe.
- The data analysis challenges for space-based detectors are complex, as they require fitting multiple sources simultaneously and dealing with overlapping signals.
- Gravitational wave observations have the potential to test general relativity, study the astrophysics of black holes and neutron stars, and provide insights into cosmology.

Links from the show:
Christopher's website: https://cplberry.com/
John's website: https://www.veitch.me.uk/
Christopher on GitHub: https://github.com/cplb/
John on GitHub: https://github.com/johnveitch
Christopher on LinkedIn: http://www.linkedin.com/in/cplberry
John on LinkedIn: https://www.linkedin.com/in/john-veitch-56772225/
Christopher on Twitter: https://twitter.com/cplberry
John on Twitter: https://twitter.com/johnveitch
Christopher on Mastodon: https://mastodon.scot/@cplberry@mastodon.online
John on Mastodon: https://mastodon.scot/@JohnVeitch
LIGO website: https://www.ligo.org/
LIGO GitLab: https://git.ligo.org/users/sign_in
Gravitational Wave Open Science Center: https://gwosc.org/
LIGO Caltech Lab: https://www.ligo.caltech.edu/page/ligo-data
Exoplanet, Python package for probabilistic modeling of time series data in astronomy: https://docs.exoplanet.codes/en/latest/
Dynamic Nested Sampling with dynesty: https://dynesty.readthedocs.io/en/latest/dynamic.html
Nessai, nested sampling with artificial intelligence: https://nessai.readthedocs.io/
LBS #98, Fusing Statistical Physics, Machine Learning & Adaptive MCMC, with Marylou Gabrié: https://learnbayesstats.com/episode/98-fusing-statistical-physics-machine-learning-adaptive-mcmc-marylou-gabrie/
bayeux, JAX models with state-of-the-art inference methods: https://jax-ml.github.io/bayeux/
LBS #51, Bernoulli's Fallacy & the Crisis of Modern Science, with Aubrey Clayton: https://learnbayesstats.com/episode/51-bernoullis-fallacy-crisis-modern-science-aubrey-clayton/
Aubrey Clayton's Probability Theory Lectures based on E.T. Jaynes' book: https://www.youtube.com/playlist?list=PL9v9IXDsJkktefQzX39wC2YG07vw7DsQ_

Transcript
This is an automatic transcript and may therefore contain errors. Please get in touch if you're willing to correct them.
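To make the nested-sampling takeaway concrete, here is a minimal sketch on a toy one-dimensional problem where the evidence is known in advance (~0.5). The brute-force rejection step used to replace live points is hopelessly naive for real gravitational-wave posteriors; production samplers like dynesty or nessai replace it with smarter constrained proposals, but the evidence bookkeeping is the same:

```python
# Minimal nested sampling: likelihood N(0, 0.1) with a Uniform(-1, 1) prior,
# so the true evidence Z = 0.5 * (mass of N(0, 0.1) in [-1, 1]) ≈ 0.5.
import numpy as np

rng = np.random.default_rng(0)

def log_like(theta):
    return -0.5 * (theta / 0.1) ** 2 - np.log(0.1 * np.sqrt(2 * np.pi))

n_live, n_iter = 200, 1000
live = rng.uniform(-1, 1, n_live)   # live points drawn from the prior
logL = log_like(live)
logZ, logX_prev = -np.inf, 0.0

for i in range(1, n_iter + 1):
    worst = np.argmin(logL)
    logX = -i / n_live                                # expected log prior volume
    logw = np.log(np.exp(logX_prev) - np.exp(logX))   # width of this shell
    logZ = np.logaddexp(logZ, logw + logL[worst])     # accumulate evidence
    # Replace the worst point with a prior draw above the likelihood
    # threshold (simple rejection; fine only for this toy problem).
    while True:
        cand = rng.uniform(-1, 1)
        if log_like(cand) > logL[worst]:
            live[worst], logL[worst] = cand, log_like(cand)
            break
    logX_prev = logX

# Termination: add the remaining live points' contribution.
logZ = np.logaddexp(logZ, logX_prev + np.log(np.mean(np.exp(logL))))
print(np.exp(logZ))  # should land near 0.5
```

The evidence Z is exactly the quantity used for model selection in the takeaways above (e.g. comparing waveform models), and the sequence of discarded points, weighted by their shell widths, doubles as posterior samples for parameter estimation.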
Mar 1, 2024 • 11min

The Role of Variational Inference in Reactive Message Passing

Exploring variational inference in reactive message passing for continuous posterior updates, its efficacy in teaching statistics, commercializing research tools for industrial use, and trade-offs in Bayesian inference architecture for real-time signal processing applications.
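The "continuous posterior updates" idea can be seen in the simplest possible setting: a conjugate Gaussian model where each streaming observation triggers a cheap local update, with yesterday's posterior acting as today's prior. Reactive message-passing systems like RxInfer generalize this pattern to full factor graphs (in Julia); the Python sketch below is only the one-node intuition, with invented numbers:

```python
# Streaming conjugate update for an unknown Gaussian mean: each arriving
# observation refines the posterior in O(1), no batch refitting needed.
import numpy as np

rng = np.random.default_rng(1)

sigma = 0.5                      # known observation noise
mu_post, var_post = 0.0, 10.0    # broad prior N(0, 10) on the unknown mean

for y in rng.normal(2.0, sigma, size=500):   # data stream, true mean = 2
    # Precision-weighted update: posterior <- prior x likelihood(y).
    prec = 1.0 / var_post + 1.0 / sigma**2
    mu_post = (mu_post / var_post + y / sigma**2) / prec
    var_post = 1.0 / prec

print(round(mu_post, 2), var_post)   # mean near 2, variance shrinking ~1/n
```

Because each update is local and constant-time, the same pattern scales to real-time settings like the speech denoising and position tracking discussed in the episode, once the single node is replaced by a graph of such messages.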
Feb 28, 2024 • 9min

Reactive Message Passing in Bayesian Inference

Exploring reactive message passing in Bayesian inference for real-time data scenarios with unknown structures, including applications like denoising speech and real-time position tracking systems. Discussing the unique characteristics of RxInfer, a Bayesian inference tool inspired by the free energy principle, and the efficiency and speed advantages of the Julia programming language.
