
797: Deep Learning Classics and Trends, with Dr. Rosanne Liu
Super Data Science: ML & AI Podcast with Jon Krohn
Exploring Failure in Machine Learning Research and Parameter-Efficient Techniques for Large Language Models
The chapter covers the concept of a failure CV, the ML Collective, and the speaker's machine learning research at Google DeepMind. It focuses on training dynamics, model capacity, scaling, intrinsic dimension, and low-rank adaptation (LoRA) for parameter-efficient fine-tuning of large language models. The discussion also touches on the role of parameter-efficient techniques in training large language models, balancing curiosity-driven and goal-driven research, and the differences between ML engineers and researchers.
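As a rough illustration of the low-rank adaptation idea mentioned above (this is not code from the episode; the class name, rank, and alpha values are illustrative assumptions), the sketch below wraps a frozen pretrained linear layer with a small trainable low-rank update, so only the rank-r matrices are fine-tuned:

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Wrap a frozen linear layer with a trainable low-rank update: W x + (B A) x."""
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False  # pretrained weights stay frozen
        # A is small random, B starts at zero so the adapter is a no-op at initialization
        self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x):
        # Original output plus the scaled low-rank correction
        return self.base(x) + (x @ self.A.T @ self.B.T) * self.scale
```

With a rank of 8 on a 4096-by-4096 layer, the trainable parameters drop from roughly 16.8M to about 65K, which is the parameter-efficiency argument behind LoRA-style fine-tuning.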