

The Thesis Review
Sean Welleck
Each episode of The Thesis Review is a conversation centered around a researcher's PhD thesis, giving insight into their history, revisiting older ideas, and providing a valuable perspective on how their research has evolved (or stayed the same) since.
Episodes
Mentioned books

Sep 25, 2020 • 1h 1min
[08] He He - Sequential Decisions and Predictions in NLP
He He is an Assistant Professor at New York University. Her research focuses on enabling reliable communication in natural language between machines and humans, including topics in text generation, robust language understanding, and dialogue systems.
Her PhD thesis is titled "Sequential Decisions and Predictions in NLP", which she completed in 2016 at the University of Maryland. We talk about the intersection of language with imitation learning and reinforcement learning, her work in the thesis on opponent modeling and simultaneous translation, and how it relates to recent work on generation and robustness.
Episode notes: https://cs.nyu.edu/~welleck/episode8.html
Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html
Support The Thesis Review at www.buymeacoffee.com/thesisreview

Sep 11, 2020 • 1h 4min
[07] John Schulman - Optimizing Expectations: From Deep RL to Stochastic Computation Graphs
John Schulman, a Research Scientist and co-founder of OpenAI, co-leads efforts in reinforcement learning, focusing on algorithms that learn through trial and error. He shares insights on the evolution from TRPO to PPO and the intricate role of stochastic computation graphs. Schulman discusses the challenges of generalization in RL and how OpenAI Five leveraged these techniques for Dota victories. The conversation also touches on navigating AI alignment challenges and the significance of integrating human intuition into machine learning.

Aug 28, 2020 • 1h 6min
[06] Yoon Kim - Deep Latent Variable Models of Natural Language
Yoon Kim, a Research Scientist at the MIT-IBM Watson AI Lab, shares insights into his research on deep latent variable models and natural language processing. He discusses uncovering latent structures in language, including vector representations and grammar induction. Yoon explores the complexities of variational inference in generative models and the challenges faced in training these models. Additionally, he reflects on his coding practices and the role of luck and opportunity in navigating academia, emphasizing the importance of inclusivity in tech.

Aug 14, 2020 • 1h 12min
[05] Julian Togelius - Computational Intelligence and Games
Julian Togelius, an Associate Professor at NYU and co-director of the Game Innovation Lab, dives into the exciting world of AI in gaming. He discusses how video games serve as a testing ground for AI techniques and the evolution of intelligent behaviors in NPCs. Togelius highlights the significance of procedural content generation and its role in enhancing AI capabilities. He also shares insights on unconventional research paths, training AI in racing games, and the intricacies of neural networks, all while driving innovation in game design.

Jul 31, 2020 • 1h 45min
[04] Sebastian Nowozin - Learning with Structured Data: Applications to Computer Vision
Sebastian Nowozin, a researcher at Microsoft Research Cambridge, delves into the fascinating world of probabilistic deep learning and its ties to computer vision. He discusses the significance of structured data and innovative loss functions in image analysis. Listeners will learn about the evolution from traditional methods in object recognition to advanced, machine learning-driven techniques. The conversation also covers the challenges and insights in Bayesian deep learning, highlighting the importance of stability in models and sound programming practices in the field.

Jul 17, 2020 • 1h 25min
[03] Sebastian Ruder - Neural Transfer Learning for Natural Language Processing
Sebastian Ruder is currently a Research Scientist at DeepMind. His research focuses on transfer learning for natural language processing, and making machine learning and NLP more accessible.
His PhD thesis is titled "Neural Transfer Learning for Natural Language Processing", which he completed in 2019. We cover transfer learning from philosophical and technical perspectives, and talk about its societal implications, focusing on his work on sequential transfer learning and cross-lingual learning.
Episode notes: https://cs.nyu.edu/~welleck/episode3.html

Jul 3, 2020 • 1h 16min
[02] Colin Raffel - Learning-Based Methods for Comparing Sequences
Colin Raffel is currently a Senior Research Scientist at Google Brain, and soon to be an assistant professor at the University of North Carolina. His recent work focuses on transfer learning and learning from limited labels.
His thesis is titled "Learning-Based Methods for Comparing Sequences, with Applications to Audio-to-MIDI Alignment and Matching", which we discuss along with the connections to his later work, and plans for the future.
Episode notes: https://cs.nyu.edu/~welleck/episode2.html

Jun 18, 2020 • 58min
[01] Gus Xia - Expressive Collaborative Music Performance via Machine Learning
Gus Xia is an assistant professor at New York University Shanghai. His research explores machine learning for music, with a goal of building intelligent systems that understand and extend musical creativity and expression.
His PhD thesis is titled "Expressive Collaborative Music Performance via Machine Learning", which we discuss in depth along with his ongoing research at the NYU Shanghai Music X Lab.
- Gus Xia's homepage: https://www.cs.cmu.edu/~gxia/
- Thesis: http://reports-archive.adm.cs.cmu.edu/anon/ml2016/CMU-ML-16-103.pdf

Jun 13, 2020 • 2min
[00] The Thesis Review Podcast - Introduction
A short introduction to The Thesis Review podcast by host Sean Welleck.