
The Thesis Review

Latest episodes

Sep 25, 2020 • 1h 1min

[08] He He - Sequential Decisions and Predictions in NLP

He He is an Assistant Professor at New York University. Her research focuses on enabling reliable communication in natural language between machines and humans, including topics in text generation, robust language understanding, and dialogue systems. Her PhD thesis is titled "Sequential Decisions and Predictions in NLP", which she completed in 2016 at the University of Maryland. We talk about the intersection of language with imitation learning and reinforcement learning, her work in the thesis on opponent modeling and simultaneous translation, and how it relates to recent work on generation and robustness. Episode notes: https://cs.nyu.edu/~welleck/episode8.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview
Sep 11, 2020 • 1h 4min

[07] John Schulman - Optimizing Expectations: From Deep RL to Stochastic Computation Graphs

John Schulman is a Research Scientist and co-founder of OpenAI. John co-leads the reinforcement learning team, researching algorithms that safely and efficiently learn by trial and error and by imitating humans. His PhD thesis is titled "Optimizing Expectations: From Deep Reinforcement Learning to Stochastic Computation Graphs", which he completed in 2016 at Berkeley. We talk about his work on stochastic computation graphs and TRPO, how it evolved into PPO and how it's used in large-scale applications like OpenAI Five, as well as his recent work on generalization in RL. Episode notes: https://cs.nyu.edu/~welleck/episode7.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview
Aug 28, 2020 • 1h 6min

[06] Yoon Kim - Deep Latent Variable Models of Natural Language

Yoon Kim is currently a Research Scientist at the MIT-IBM Watson AI Lab, and will be joining MIT as an assistant professor in 2021. Yoon's research focuses on machine learning and natural language processing. His PhD thesis is titled "Deep Latent Variable Models of Natural Language", which he completed in 2020 at Harvard University. We discuss his work on uncovering latent structure in natural language, including continuous vector representations, tree structures, and grammars. We cover learning and variational inference methods that he developed during his PhD, and he offers a look at where latent variable models will be heading in the future. Episode notes: https://cs.nyu.edu/~welleck/episode6.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at www.buymeacoffee.com/thesisreview
Aug 14, 2020 • 1h 12min

[05] Julian Togelius - Computational Intelligence and Games

Julian Togelius is an Associate Professor at New York University, where he co-directs the NYU Game Innovation Lab. His research is at the intersection of computational intelligence and computer games. His PhD thesis is titled "Optimization, Imitation, and Innovation: Computational Intelligence and Games", which he completed in 2007. We cover his work in the thesis on AI for games and games for AI, and how it connects to his recent work on procedural content generation. Episode notes: https://cs.nyu.edu/~welleck/episode5.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html Support The Thesis Review at https://www.buymeacoffee.com/thesisreview
Jul 31, 2020 • 1h 45min

[04] Sebastian Nowozin - Learning with Structured Data: Applications to Computer Vision

Sebastian Nowozin is currently a Researcher at Microsoft Research Cambridge. His research focuses on probabilistic deep learning, consequences of model misspecification, understanding agent complexity in order to improve learning efficiency, and designing models for reasoning and planning. His PhD thesis is titled "Learning with Structured Data: Applications to Computer Vision", which he completed in 2009. We discuss the work in his thesis on structured inputs and structured outputs, which involves beautiful ideas from polyhedral combinatorics and optimization. We talk about his recent work on Bayesian deep learning and the connections it has to ideas that he explored during his PhD. Episode notes: https://cs.nyu.edu/~welleck/episode4.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html
Jul 17, 2020 • 1h 25min

[03] Sebastian Ruder - Neural Transfer Learning for Natural Language Processing

Sebastian Ruder is currently a Research Scientist at DeepMind. His research focuses on transfer learning for natural language processing, and making machine learning and NLP more accessible. His PhD thesis is titled "Neural Transfer Learning for Natural Language Processing", which he completed in 2019. We cover transfer learning from philosophical and technical perspectives, and talk about its societal implications, focusing on his work on sequential transfer learning and cross-lingual learning. Episode notes: https://cs.nyu.edu/~welleck/episode3.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html
Jul 3, 2020 • 1h 16min

[02] Colin Raffel - Learning-Based Methods for Comparing Sequences

Colin Raffel is currently a Senior Research Scientist at Google Brain, and soon to be an assistant professor at the University of North Carolina. His recent work focuses on transfer learning and learning from limited labels. His thesis is titled "Learning-Based Methods for Comparing Sequences, with Applications to Audio-to-MIDI Alignment and Matching", which we discuss along with the connections to his later work, and plans for the future. Episode notes: https://cs.nyu.edu/~welleck/episode2.html Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html
Jun 18, 2020 • 58min

[01] Gus Xia - Expressive Collaborative Music Performance via Machine Learning

Gus Xia is an assistant professor at New York University Shanghai. His research explores machine learning for music, with a goal of building intelligent systems that understand and extend musical creativity and expression. His PhD thesis is titled "Expressive Collaborative Music Performance via Machine Learning", which we discuss in depth along with his ongoing research at the NYU Shanghai Music X Lab. - Gus Xia's homepage: https://www.cs.cmu.edu/~gxia/ - Thesis: http://reports-archive.adm.cs.cmu.edu/anon/ml2016/CMU-ML-16-103.pdf Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter, and find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html
Jun 13, 2020 • 2min

[00] The Thesis Review Podcast - Introduction

[00] The Thesis Review Podcast - Introduction by Sean Welleck
