

Episode 02: Sarah Jane Hong, Latent Space, on neural rendering & research process
Jan 7, 2021
Chapters
Introduction
00:00 • 2min
How Did You Initially Get Into Research?
01:32 • 3min
How to Read a Paper
05:00 • 3min
I'm Not Enough in the Weeds
08:30 • 2min
How Have You Improved Your Efficacy as a Researcher?
10:18 • 3min
Stochastic Spatial Networks for Low Block Distortion
13:19 • 2min
The Simple Version of the Algorithm
15:25 • 3min
Transfer Learning Is Really Good for Fast Iteration Time
18:33 • 4min
Latent Resolution in a Neural Rendering Environment
22:07 • 2min
How Do You Think You Can Generate Wolverine?
24:09 • 3min
Machine Learning - Is Scaling a Good Idea?
26:44 • 2min
Using Multi-Level Transformers Over LSTMs?
29:03 • 3min
The Future of Visually Grounded Language?
32:19 • 4min