Andrew Lampinen is a research scientist at DeepMind. His research focuses on cognitive flexibility and generalization.
Andrew completed his PhD in 2020 at Stanford University; his thesis is titled "A Computational Framework for Learning and Transforming Task Representations".
We talk about cognitive flexibility in brains and machines, centered on his thesis work on meta-mapping. We cover a lot of interesting ground, including complementary learning systems and memory, compositionality and systematicity, and the role of symbols in machine learning.
- Episode notes: https://cs.nyu.edu/~welleck/episode38.html
- Follow the Thesis Review (@thesisreview) and Sean Welleck (@wellecks) on Twitter
- Find out more info about the show at https://cs.nyu.edu/~welleck/podcast.html
- Support The Thesis Review at www.patreon.com/thesisreview or www.buymeacoffee.com/thesisreview