Navigating Uncertainty in Statistical Analysis
This chapter explores the intricacies of statistical significance and p-values, shedding light on the heated debates surrounding these concepts. It critiques the common reliance on weak priors in regression analysis and reflects on the authors' own efforts to reconcile classical and Bayesian statistical methods.
Once upon a time, there was an enchanted book filled with hundreds of little plots, applied examples and linear regressions — the prettiest creature that was ever seen. Its authors were excessively fond of it, and its readers loved it even more. This magical book had a nice blue cover made for it, and everybody aptly called it « Regression and Other Stories »!
As in every good fairy tale, this one has its share of villains — the traps where statistical methods falter and fail you; the terrible confounders, lurking in the dark; the ill-measured data that haunt your inferences! But once you defeat these monsters, you’ll be able to think about, build and interpret regression models.
This episode will be filled with stories — stories about linear regressions! Here to narrate these marvelous statistical adventures are Andrew Gelman, Jennifer Hill and Aki Vehtari — the authors of the brand new Regression and Other Stories.
Andrew is a professor of statistics and political science at Columbia University. Jennifer is a professor of applied statistics at NYU. She develops methods to answer causal questions related to policy research and scientific development. Aki is an associate professor in computational probabilistic modeling at Aalto University, Finland.
In this episode, they tell us why they wrote this book and who it is for, and they give us their 10 tips to improve your regression modeling! We also talk about the limits of regression, and about going to Mars…
More good news: until October 31st, 2020, you can go to http://www.cambridge.org/wm-ecommerce-web/academic/landingPage/GoodBayesian2020 and buy the book at a 20% discount by entering the promo code “GoodBayesian2020” at checkout!
That way, you’ll make up your own stories before going to sleep and dream of a world where we can easily generalize from sample to population, and where multilevel regression with poststratification is bliss…
Our theme music is « Good Bayesian », by Baba Brinkman (feat. MC Lars and Mega Ran). Check out his awesome work at https://bababrinkman.com/ !
Links from the show:
Thank you to my Patrons for making this episode possible!
Yusuke Saito, Avi Bryant, Ero Carrera, Brian Huey, Giuliano Cruz, Tim Gasser, James Wade, Tradd Salvo, Adam Bartonicek, William Benton, Alan O'Donnell, Mark Ormsby, Demetri Pananos, James Ahloy, Jon Berezowski, Robin Taylor, Thomas Wiecki, Chad Scherrer, Vincent Arel-Bundock, Nathaniel Neitzke, Zwelithini Tunyiswa, Elea McDonnell Feit, Bertrand Wilden, James Thompson, Stephen Oates, Gian Luca Di Tanna, Jack Wells, Matthew Maldonado, Ian Costley, Ally Salim, Larry Gill, Joshua Duncan, Ian Moran and Paul Oreto.