
Learning Bayesian Statistics

Latest episodes

Sep 24, 2020 • 57min

#24 Bayesian Computational Biology in Julia, with Seth Axen

Do you know what proteins are, what they do and why they are useful? Well, be prepared to be amazed! In this episode, Seth Axen will tell us about the fascinating world of protein structures and computational biology, and how his work as a Bayesian modeler fits into that!

Passionate about mathematics and statistics, Seth is finishing a PhD in bioinformatics at the Sali Lab of the University of California, San Francisco (UCSF). His research interests span the broad field of computational biology: using computer science, mathematics, and statistics to understand biological systems. His current research focuses on inferring protein structural ensembles. Open-source development is also very dear to his heart, and indeed he contributes to many open-source packages, especially in the Julia ecosystem. In particular, he develops and maintains ArviZ.jl, the Julia port of ArviZ, a platform-agnostic Python package to visualize and diagnose your Bayesian models. Seth will tell us how he became involved in ArviZ.jl, what its strengths and weaknesses are, and how it fits into the Julia probabilistic programming landscape.

Oh, and as a bonus, you’ll discover why Seth is such a fan of automatic differentiation, aka « autodiff » — I actually wanted to edit this part out, but Seth strongly insisted I keep it. Just kidding of course — or, am I… ? (If autodiff is new to you, there’s a small illustrative sketch after the links below.)

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

Links from the show:
Seth’s website: http://sethaxen.com/
Seth on Twitter: https://twitter.com/sethaxen
Seth on GitHub: https://github.com/sethaxen
ArviZ.jl -- Exploratory analysis of Bayesian models in Julia: https://arviz-devs.github.io/ArviZ.jl/dev/
PyCon 2020 -- Colin Carroll -- Getting started with automatic differentiation: https://www.youtube.com/watch?v=NG21KWZSiok

Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran and Paul Oreto.
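As a side note that isn’t from the episode: here is a tiny, purely illustrative forward-mode autodiff sketch in Python using dual numbers — every value carries its derivative, so ordinary arithmetic propagates exact derivatives alongside the function value. Real autodiff libraries are far more general (reverse mode, control flow, array code), but the core idea fits in a few lines.

```python
from dataclasses import dataclass

# Minimal forward-mode autodiff with dual numbers:
# each value stores (value, derivative) and arithmetic updates both.
@dataclass
class Dual:
    val: float  # function value
    der: float  # derivative with respect to the input variable

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other, 0.0)
        # Product rule: (u * v)' = u' * v + u * v'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

    __rmul__ = __mul__

def derivative(f, x):
    """Evaluate f and its exact derivative at x in one forward pass."""
    return f(Dual(x, 1.0)).der

# f(x) = 3x^2 + 2x, so f'(2) = 6*2 + 2 = 14
print(derivative(lambda x: 3 * x * x + 2 * x, 2.0))  # -> 14.0
```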
Sep 10, 2020 • 59min

#23 Bayesian Stats in Business and Marketing Analytics, with Elea McDonnell Feit

If you’ve studied at a business school, you probably didn’t attend any Bayesian stats course there. Well, it isn’t like that in every business school! Elea McDonnell Feit does integrate Bayesian methods into her teaching at the business school of Drexel University, in Philadelphia, US. Elea is an Assistant Professor of Marketing at Drexel, and in this episode she’ll tell us which methods are the most useful in marketing analytics, and why.

Indeed, Elea develops data analysis methods to inform marketing decisions, such as designing new products and planning advertising campaigns. Often faced with missing, unmatched or aggregated data, she uses MCMC sampling, hierarchical models and decision theory to decipher all this.

After an MS in Industrial Engineering at Lehigh University and a PhD in Marketing at the University of Michigan, Elea worked on product design at General Motors and was most recently the Executive Director of the Wharton Customer Analytics Initiative.

Thanks to all these experiences, Elea loves teaching marketing analytics and Bayesian and causal inference at all levels. She even wrote the book R for Marketing Research and Analytics with Chris Chapman, published by Springer.

In summary, I think you’ll be pretty surprised by how Bayesian the world of marketing is…

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

Links from the show:
Elea’s website: http://eleafeit.com/
R for Marketing Research and Analytics: http://r-marketing.r-forge.r-project.org/
Elea’s Tutorials & Online Courses: http://eleafeit.com/teaching/
Elea on Twitter: https://twitter.com/eleafeit
Elea on GitHub: https://github.com/eleafeit
Tutorial on Conjoint Analysis in R: https://github.com/ksvanhorn/ART-Forum-2017-Stan-Tutorial
Test & Roll app: https://testandroll.shinyapps.io/testandroll/
Test & Roll Paper -- Profit-Maximizing A/B Tests: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3274875
Principal Stratification for Advertising Experiments: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3140631
CausalImpact R package: https://google.github.io/CausalImpact/CausalImpact.html
Chapter on Data Fusion in marketing: https://link.springer.com/referenceworkentry/10.1007/978-3-319-05542-8_9-1
Statistical Analysis with Missing Data (Little & Rubin): https://onlinelibrary.wiley.com/doi/book/10.1002/9781119013563
R-Ladies Philly YouTube channel: https://www.youtube.com/channel/UCPque9BaFV9p0hcgImrYBzg

Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran and Paul Oreto.
Aug 26, 2020 • 1h 7min

#22 Eliciting Priors and Doing Bayesian Inference at Scale, with Avi Bryant

If, like me, you’ve been stuck in a 40-square-meter apartment for two months, you’re going to be pretty jealous of Avi Bryant. Indeed, Avi lives on Galiano Island, Canada, not very far from Vancouver, surrounded by forest and overlooking the Salish Sea. In this natural and beautiful — although slightly deer-infested — spot, Avi runs The Gradient Retreat Center, a place where writers, makers, and coders can take a week away from their regular lives and focus on creative work.

But it’s not only to envy him that I invited Avi on the show — it’s to talk about Bayesian inference in Scala, prior elicitation, how to deploy Bayesian methods at scale, and how to enable Bayesian inference for engineers.

While working at Stripe, Avi wrote Rainier, a Bayesian inference framework for Scala. Inference is based on variants of the Hamiltonian Monte Carlo sampler, and the implementation is similar to, and targets the same types of models as, both Stan and PyMC3. As Avi says, depending on your background, you might think of Rainier as aspiring to be either "Stan, but on the JVM", or "TensorFlow, but for small data".

In this episode, Avi will tell us how Rainier came to life, how it fits into the probabilistic programming landscape, and what its main strengths and weaknesses are.

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

Links from the show:
Avi on Twitter: https://twitter.com/avibryant
Avi on GitHub: https://github.com/avibryant
Rainier -- Bayesian Inference in Scala: https://rainier.fit/
The Gradient Retreat: https://gradientretreat.com/
Facebook's Prophet: https://facebook.github.io/prophet/
BAyesian Model-Building Interface (Bambi) in Python: https://bambinos.github.io/bambi/
BRMS -- Bayesian regression models using Stan: https://paul-buerkner.github.io/brms/
Using Bayesian Decision Making to Optimize Supply Chains -- Thomas Wiecki & Ravin Kumar: https://twiecki.io/blog/2019/01/14/supply_chain/

Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran and Paul Oreto.
Aug 13, 2020 • 1h 2min

#21 Gaussian Processes, Bayesian Neural Nets & SIR Models, with Elizaveta Semenova

Elizaveta Semenova, a postdoctoral researcher in Bayesian machine learning, discusses her work on Gaussian processes for studying the spread of malaria and for fitting dose-response curves in pharmaceutical tests. She also talks about her latest paper on Bayesian neural networks for drug toxicity prediction, and the interesting link between BNNs and Gaussian processes.
Jul 30, 2020 • 1h 4min

#20 Regression and Other Stories, with Andrew Gelman, Jennifer Hill & Aki Vehtari

Join Andrew Gelman, a statistics and political science professor at Columbia, Jennifer Hill from NYU specializing in causal questions, and Aki Vehtari, an expert in computational modeling from Aalto University, as they dive into the enchanting world of regression analysis. They share insights on their writing journey, offer ten tips to enhance regression modeling, tackle the challenges of statistical significance, and reveal the power of storytelling in data education. Plus, there's a whimsical discussion about exploring Mars!
Jul 3, 2020 • 1h

#19 Turing, Julia and Bayes in Economics, with Cameron Pfiffer

Do you know Turing? Of course you do! Along with Soss and Gen, it’s one of the blockbuster libraries for probabilistic programming in Julia. And in this episode, Cameron Pfiffer will tell us all about it — how it came to life, how it fits into the probabilistic programming landscape, and what its main strengths and weaknesses are.

Cameron did some Rust and some Python, but he especially loves coding in Julia. That’s also why he’s one of the core developers of Turing.jl. He’s also a PhD student in finance at the University of Oregon and did his master’s in finance at the University of Reading. His interests are pretty broad, from cryptocurrencies, algorithmic and high-frequency trading, to AI in financial markets and anomaly detection – in a nutshell, he’s a fan of topics where technology is involved.

As he’s the first economist to come on the show, I also asked him how Bayesian the field of economics is, why he thinks economics is quite unique among the social sciences, and how economists think about causality — I later learned that this topic is pretty controversial!

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

Links from the show:
Bayesian Econometrics on Cameron's Blog: http://cameron.pfiffer.org/2020/03/24/bayesian-econometrics/
Cameron on Twitter: https://twitter.com/cameron_pfiffer
Cameron on GitHub: https://github.com/cpfiffer
Turing.jl -- Bayesian inference in Julia: https://turing.ml/dev/
Gen.jl -- Programmable inference embedded in Julia: https://www.gen.dev/
Soss.jl -- Probabilistic programming via source rewriting: https://github.com/cscherrer/Soss.jl
The Julia Language -- A fresh approach to technical computing: https://julialang.org/
What is Probabilistic Programming -- Cornell University: http://adriansampson.net/doc/ppl.html
Mostly Harmless Econometrics Book: http://www.mostlyharmlesseconometrics.com/

Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran and Paul Oreto.
Jun 26, 2020 • 8min

#SpecialAnnouncement: Patreon Launched!

I hope you’re all safe! Some of you also asked me if I had set up a Patreon so that you could help support the show, and that’s why I’m sending this short special episode your way today. I had thought about it, but I wasn’t sure there was a demand for this. Apparently, there is one — at least a small one — so, first, I wanna thank you and say how grateful I am to be in a community that values this kind of work!

The Patreon page is now live at patreon.com/learnbayesstats. It starts as low as 3€ and you can pick from 4 different tiers:

"Maximum A Posteriori" (3€): Join the Slack, where you can ask questions about the show, discuss with like-minded Bayesians and meet them in person when you travel the world.
"Full Posterior" (5€): Previous tier + your name in all the show notes, and I'll express my gratitude to you in the first episode to go out after your contribution. You also get early access to the special episodes, which I'll make at an irregular pace and which will include panel discussions, book releases, live shows, etc.
"Principled Bayesian" (20€): Previous tiers + every 2 months, I'll ask my guest two questions voted on by "Principled Bayesians". I'll probably do that with a poll in the Slack channel, open only to "Principled Bayesians", and I'll put the top two questions to the guest on the show.
"Good Bayesian" (200€, only 8 spots): Previous tiers + every 2 months, you can come on the show and ask the guest one question yourself, no vote needed — that's why I can't have too many people in that tier.

Before telling you the best part: I already have a lot of ideas for exclusive content and options, but I first need to see whether you're as excited as I am about them. If I see you are, I'll be able to add new perks to the tiers! So give me your feedback about the current tiers, or about any benefits you'd like to see there but don't see yet! BTW, you have a new way to do that now: sending me voice messages at anchor.fm/learn-bayes-stats/message!

Now, the icing on the cake: until July 31st, if you choose the "Full Posterior" tier (5€) or higher, you get early access to the very special episode I'm planning with Andrew Gelman, Jennifer Hill and Aki Vehtari about their upcoming book, Regression and Other Stories. To top it off, there will be a promo code in the episode to buy the book at a discounted price — now, that is an offer you can't turn down!

Alright, that is it for today — I hope you’re as excited as I am about this new stage in the podcast’s life! Please keep the emails, the tweets, the voice messages and the carrier pigeons coming with your feedback, questions and suggestions.

In the meantime, take care and I’ll see you in the next episode — episode 19, with Cameron Pfiffer, who’s the first economist to come on the show and who’s a core developer of Turing.jl. We’re gonna talk about the Julia probabilistic programming landscape, Bayes in economics and causality — it’s gonna be fun ;) Again, patreon.com/learnbayesstats if you want to support the show and unlock some nice perks. Thanks again, I am very grateful for any support you can bring me!

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

Links from the show:
LBS Patreon page: patreon.com/learnbayesstats
Send me voice messages: anchor.fm/learn-bayes-stats/message
Jun 18, 2020 • 58min

#18 How to ask good Research Questions and encourage Open Science, with Daniel Lakens

Daniel Lakens, an experimental psychologist at Eindhoven University of Technology, dives into the art of crafting effective research questions and experimental designs. He sheds light on the importance of open science and how it can reshape funding and publishing practices. The discussion also tackles the ongoing reproducibility crisis in psychology and the value of acknowledging flawed research. Lakens champions transparency and collaboration, advocating for better statistical education to enhance the credibility of scientific findings.
Jun 4, 2020 • 52min

#17 Reparametrize Your Models Automatically, with Maria Gorinova

Have you already encountered a model that you know is scientifically sound, but that MCMC just wouldn’t sample? The model would take forever to run — if it ever ran — and you would be greeted with a lot of divergences at the end. Yeah, I know, my stress levels start rising too whenever I hear the word « divergences »…

Well, you’ll be glad to hear there are tricks to make these models run, and one of these tricks is called re-parametrization — I bet you’ve already heard about the poorly-named non-centered parametrization? (If you haven’t, there’s a small illustrative sketch after the links below.)

Well, fear no more! In this episode, Maria Gorinova will tell you all about these model re-parametrizations! Maria is a PhD student in Data Science & AI at the University of Edinburgh. Her broad interests range from programming languages and verification to machine learning and human-computer interaction. More specifically, Maria is interested in probabilistic programming languages, and in exploring ways of applying program-analysis techniques to existing PPLs in order to improve the usability of the language or the efficiency of inference.

As you’ll hear in the episode, she thinks a lot about the language aspect of probabilistic programming, and works on the automation of various “tricks” in probabilistic programming: automatic re-parametrization, automatic marginalization, and automatic, efficient model-specific inference.

As Maria also has experience with several PPLs like Stan, Edward2 and TensorFlow Probability, she’ll tell us what she thinks a good PPL design requires, and what the future of PPLs looks like to her.

Our theme music is « Good Bayesian », by Baba Brinkman (feat MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !

Links from the show:
Maria on the Web: http://homepages.inf.ed.ac.uk/s1207807/index.html
Maria on Twitter: https://twitter.com/migorinova
Maria on GitHub: https://github.com/mgorinova
Automatic Reparameterisation of Probabilistic Programs (Maria's paper with Dave Moore and Matthew Hoffman): https://arxiv.org/abs/1906.03028
Stan User's Guide on Reparameterization: https://mc-stan.org/docs/2_23/stan-users-guide/reparameterization-section.html
HMC for hierarchical models -- Background on reparameterization: https://arxiv.org/abs/1312.0906
NeuTra -- Automatic reparameterization: https://arxiv.org/abs/1903.03704
Edward2 -- A library for probabilistic modeling, inference, and criticism: http://edwardlib.org/
Pyro -- Automatic reparameterization and marginalization: https://pyro.ai/
Gen -- Programmable inference: http://probcomp.csail.mit.edu/software/gen/
TensorFlow Probability: https://www.tensorflow.org/probability/
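To make the trick concrete, here is a minimal sketch — not from the show — of a centered vs. non-centered hierarchical model, written in PyMC3 (the Python library mentioned in episode #22) with purely illustrative toy data and variable names. The non-centered version samples standard-normal offsets and rescales them, which is usually what rescues a hierarchical model that is drowning in divergences.

```python
import numpy as np
import pymc3 as pm

# Toy hierarchical data (illustrative only): J groups, one noisy estimate per group.
J = 8
y = np.array([5.0, -2.0, 3.0, 7.0, 0.5, -1.0, 4.0, 2.0])
sigma = np.array([4.0, 3.0, 5.0, 4.0, 3.0, 4.0, 5.0, 3.0])

# Centered parametrization: the group effects theta are drawn directly around mu
# with scale tau. When tau is small or poorly identified, this "funnel" geometry
# often produces divergences.
with pm.Model() as centered:
    mu = pm.Normal("mu", mu=0.0, sigma=5.0)
    tau = pm.HalfNormal("tau", sigma=5.0)
    theta = pm.Normal("theta", mu=mu, sigma=tau, shape=J)
    pm.Normal("y_obs", mu=theta, sigma=sigma, observed=y)

# Non-centered parametrization: sample standard-normal offsets and rescale them,
# so the sampler explores a geometry that no longer depends on tau.
with pm.Model() as non_centered:
    mu = pm.Normal("mu", mu=0.0, sigma=5.0)
    tau = pm.HalfNormal("tau", sigma=5.0)
    theta_raw = pm.Normal("theta_raw", mu=0.0, sigma=1.0, shape=J)
    theta = pm.Deterministic("theta", mu + tau * theta_raw)
    pm.Normal("y_obs", mu=theta, sigma=sigma, observed=y)

# Same model, different parametrization: the non-centered version typically
# samples faster and without divergence warnings.
with non_centered:
    trace = pm.sample(1000, tune=1000, target_accept=0.9)
```

The Stan User's Guide chapter linked above walks through the same transformation in Stan syntax.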
May 21, 2020 • 1h 8min

#16 Bayesian Statistics the Fun Way, with Will Kurt

Will Kurt, lead Data Scientist at Hopper, shares insights on Bayesian statistics, his journey from a Boston librarian to a data scientist, and the value of Bayesian inference. He discusses the mind projection fallacy, logistic regression, upcoming plans, and promoting critical thinking in society.
