
The Thesis Review

Latest episodes

Jan 8, 2022 • 1h 5min

[38] Andrew Lampinen - A Computational Framework for Learning and Transforming Task Representations

Andrew Lampinen is a research scientist at DeepMind. His research focuses on cognitive flexibility and generalization. Andrew’s PhD thesis is titled "A Computational Framework for Learning and Transforming Task Representations", which he completed in 2020 at Stanford University. We talk about cognitive flexibility in brains and machines, centered on his thesis work on meta-mapping. We cover a lot of interesting ground, including complementary learning systems and memory, compositionality and systematicity, and the role of symbols in machine learning. - Episode notes: https://cs.nyu.edu/~welleck/episode38.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview
Dec 21, 2021 • 1h 9min

[37] Joonkoo Park - Neural Substrates of Visual Word and Number Processing

Joonkoo Park is an Associate Professor and Honors Faculty in the Department of Psychological and Brain Sciences at UMass Amherst. He leads the Cognitive and Developmental Neuroscience Lab, which focuses on the developmental mechanisms and neurocognitive underpinnings of our knowledge about number and mathematics. Joonkoo’s PhD thesis is titled "Experiential Effects on the Neural Substrates of Visual Word and Number Processing", which he completed in 2011 at the University of Michigan. We talk about numerical processing in the brain, starting with nature vs. nurture and the learned versus built-in aspects of neural architectures. We then discuss the difference between word and number processing, types of numerical thinking, and symbolic vs. non-symbolic numerical processing. - Episode notes: https://cs.nyu.edu/~welleck/episode37.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview
Nov 30, 2021 • 1h 2min

[36] Dieuwke Hupkes - Hierarchy and Interpretability in Neural Models of Language Processing

Dieuwke Hupkes is a Research Scientist at Facebook AI Research and the scientific manager of the Amsterdam unit of ELLIS. Dieuwke's PhD thesis is titled "Hierarchy and Interpretability in Neural Models of Language Processing", which she completed in 2020 at the University of Amsterdam. We discuss her work on which aspects of hierarchical compositionality and syntactic structure can be learned by recurrent neural networks, how these models can serve as explanatory models of human language processing, what compositionality actually means, and a lot more. - Episode notes: https://cs.nyu.edu/~welleck/episode36.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview
Nov 6, 2021 • 1h 16min

[35] Armando Solar-Lezama - Program Synthesis by Sketching

Armando Solar-Lezama is a Professor at MIT and the Associate Director & COO of CSAIL. He leads the Computer Assisted Programming Group, which focuses on program synthesis. Armando’s PhD thesis is titled "Program Synthesis by Sketching", which he completed in 2008 at UC Berkeley. We talk about program synthesis and his work on Sketch, how machine learning's role in program synthesis has evolved over time, and more. - Episode notes: https://cs.nyu.edu/~welleck/episode35.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview
Oct 20, 2021 • 1h 8min

[34] Sasha Rush - Lagrangian Relaxation for Natural Language Decoding

Sasha Rush, an Associate Professor at Cornell Tech and a researcher at Hugging Face, delves into the intricacies of Natural Language Processing. He shares insights from his PhD thesis on Lagrangian Relaxation and its relevance today. The conversation touches on the balance between discrete and continuous algorithms, the evolution of coding practices, and the importance of community in open-source innovations. Additionally, they explore navigating depth and breadth in academia and the necessity of risk-taking in research for true innovation.
Oct 1, 2021 • 1h 13min

[33] Michael R. Douglas - G/H Conformal Field Theory

Michael R. Douglas is a theoretical physicist, Professor at Stony Brook University, and Visiting Scholar at Harvard University. His research focuses on string theory and the relations between theoretical physics and mathematics. Michael's PhD thesis is titled "G/H Conformal Field Theory", which he completed in 1988 at Caltech. We talk about working with Feynman, Sussman, and Hopfield during his PhD days, the superstring revolutions and string theory, and machine learning's role in the future of science and mathematics. - Episode notes: https://cs.nyu.edu/~welleck/episode33.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview
Sep 16, 2021 • 1h 27min

[32] Andre Martins - The Geometry of Constrained Structured Prediction

Andre Martins is an Associate Professor at IST and VP of AI Research at Unbabel in Lisbon, Portugal. His research focuses on natural language processing and machine learning. Andre’s PhD thesis is titled "The Geometry of Constrained Structured Prediction: Applications to Inference and Learning of Natural Language Syntax", which he completed in 2012 at Carnegie Mellon University and IST. We talk about his thesis work on structured prediction in NLP, and discuss its connections to his later work on sparsity, sparse communication, and more. - Episode notes: https://cs.nyu.edu/~welleck/episode32.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview
Aug 29, 2021 • 1h 34min

[31] Jay McClelland - Preliminary Letter Identification in the Perception of Words and Nonwords

Jay McClelland is a Professor in the Psychology Department and Director of the Center for Mind, Brain, Computation and Technology at Stanford. His research addresses a broad range of topics in cognitive science and cognitive neuroscience, including Parallel Distributed Processing (PDP). Jay's PhD thesis is titled "Preliminary Letter Identification in the Perception of Words and Nonwords", which he completed in 1975 at the University of Pennsylvania. We discuss his thesis work on the word superiority effect, how it led to the Interactive Activation model, the path to Parallel Distributed Processing and the connectionist revolution, and distributed vs. rule-based and symbolic approaches to modeling human cognition and artificial intelligence. - Episode notes: https://cs.nyu.edu/~welleck/episode31.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview
Aug 14, 2021 • 1h 3min

[30] Dustin Tran - Probabilistic Programming for Deep Learning

Dustin Tran, a research scientist at Google Brain, specializes in probabilistic programming and deep learning. He discusses his PhD thesis on integrating probabilistic modeling with deep learning, highlighting the innovative Edward library and new inference algorithms. The conversation dives into the evolution of AI tools like TensorFlow, emphasizing their democratizing impact. Dustin also shares insights on transitioning from PhD to research roles, the importance of addressing uncertainty in neural networks, and the balance between academic benchmarks and practical advancements.
Aug 1, 2021 • 1h 17min

[29] Tengyu Ma - Non-convex Optimization for Machine Learning

Tengyu Ma is an Assistant Professor at Stanford University. His research focuses on deep learning and its theory, as well as various topics in machine learning. Tengyu's PhD thesis is titled "Non-convex Optimization for Machine Learning: Design, Analysis, and Understanding", which he completed in 2017 at Princeton University. We discuss theory in machine learning and deep learning, including the 'all local minima are global minima' property and overparameterization, as well as the perspectives theory offers on understanding deep learning. - Episode notes: https://cs.nyu.edu/~welleck/episode29.html - Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter - Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html - Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview
