

[38] Andrew Lampinen - A Computational Framework for Learning and Transforming Task Representations
Jan 8, 2022
Andrew Lampinen, a research scientist at DeepMind with a PhD from Stanford, discusses the intersection of cognitive flexibility and machine learning. He delves into meta-mapping, exploring how it improves task adaptability and zero-shot generalization. The conversation extends to the differences between human and artificial learning, emphasizing the importance of contextual grounding for symbols in AI systems. Lampinen also shares insights on transitioning from academia to industry and on balancing personal life with research commitments.
AI Snips
Human Creativity From Minimal Resources
- Andrew Lampinen loves how humans build complex systems from minimal resources.
- Kids play with sticks to create entire fantasy worlds, showcasing human creativity.
Teaching Math Shaped Research
- Andrew taught math as an undergraduate and noticed students struggled with abstract concepts.
- He observed that working through concrete examples was crucial for grasping theorems.
Understanding Cognitive Flexibility
- Cognitive flexibility means adapting learned knowledge to new situations that differ from the original.
- Neural networks excel at learning but struggle to adapt that knowledge as flexibly as humans do.