

Alex Tamkin on Self-Supervised Learning and Large Language Models
Nov 11, 2021
Chapters
Introduction
00:00 • 3min
AI and the Future of Photography
02:55 • 2min
How to Keep Doing Research
04:27 • 2min
The Paradigm Shift in Deep Learning
06:22 • 2min
How to Scale Up AI
08:03 • 5min
The Benefits of ImageNet for Language Modeling
12:59 • 3min
Viewmaker Networks: Learning Views for Unsupervised Representation Learning
16:15 • 3min
The Future of Neural Networks for Contrastive Learning
19:35 • 3min
The Benefits of Self-Supervised Learning
22:06 • 3min
The Importance of Input Dependent Views
25:05 • 2min
The Journey of Augmentation in Supervised Learning
26:50 • 3min
The Power of Practice
30:19 • 2min
DABS: A Benchmark for Self-Supervised Learning
31:54 • 5min
The Principles of Self-Supervised Learning
36:44 • 2min
The Importance of Pre-Training Objectives
39:03 • 3min
How to Train a Model to Do Two Domains
41:50 • 2min
The Future of Wearable Sensors
44:10 • 2min
The Interconnectedness of Machine Learning
45:58 • 3min
The Complexity of Large Language Models
48:47 • 5min
The Importance of Consequences in Research
53:31 • 2min
How to Foster a Healthy and Inclusive Research Culture
55:23 • 4min
The Importance of Hanging Out
59:36 • 2min
The Importance of Mental Well-Being
01:01:10 • 2min
How to Be a Better Mentor for Undergraduates
01:03:31 • 2min
How to Stay Sane During the Pandemic
01:05:36 • 2min
Gradient Podcast: How I Learned to Photograph
01:07:43 • 3min