
Learning Bayesian Statistics #78 Exploring MCMC Sampler Algorithms, with Matt D. Hoffman
Mar 1, 2023
Introduction
00:00 • 3min
Learning Bayesian Statistics - Matt Hoffman
02:51 • 4min
Latent Dirichlet Allocation - What About You?
06:39 • 4min
MCMC Methods That Take Advantage of Hardware Acceleration
10:22 • 6min
What's the Common Thread to Machine Learning?
16:05 • 2min
When Do You Need Fully Bayesian Inference in a Decision Making Model?
17:41 • 2min
The Bayesian Framework and Deep Learning
20:01 • 2min
Generalized Hamiltonian Monte Carlo
21:56 • 5min
Generalized HMC
27:07 • 4min
Is MEADS Complementary to HMC?
31:13 • 4min
Is There a Barrier to Entry?
34:50 • 2min
Using Nested R-hat to Compare Chains
36:50 • 3min
Is It Necessary to Tune the Parameters of Your Sampler?
39:50 • 2min
The Biggest Hurdle in the Bayesian Workflow?
41:32 • 4min
Is GHMC a Substitute for HMC?
45:50 • 3min
Is There Really a Place for Bayesian Methods in Machine Learning?
48:51 • 4min
I'd Like to See a Future Where the Formalism Is Simple and Easy to Communicate
53:05 • 3min
Is There a Food-Based Name for the Samplers?
55:58 • 2min
What's the Biggest Problem in the Show?
57:49 • 2min
Learning Bayesian Stats - Part 2
59:42 • 3min
