
Setting the stage for 2023
Eye On A.I.
Unsupervised Pre-Training Is Important for Deep Learning
Deep learning really got going in about 2006, when we discovered that if you train stacks of autoencoders, or restricted Boltzmann machines, one hidden layer at a time, and then you fine-tune it, it works much better. People then did things like speech, and vision on ImageNet, where they said you don't need the pre-training; you can just train the whole thing supervised. That was fine for a while, but then when they got even bigger data sets and even bigger networks, people went back to this unsupervised pre-training, which is what BERT is doing now.
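The greedy layer-wise recipe described above can be sketched in a few lines of numpy. This is a minimal illustration, not any specific published implementation: each layer is a tied-weight autoencoder trained to reconstruct the codes of the layer below, and the resulting stack of encoders would then initialize a supervised network for fine-tuning. The function and variable names here are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_autoencoder(X, n_hidden, epochs=300, lr=0.1):
    """Greedy step: fit one tied-weight autoencoder X -> H -> X_hat."""
    n, d = X.shape
    W = rng.normal(0.0, 0.1, (d, n_hidden))
    losses = []
    for _ in range(epochs):
        H = sigmoid(X @ W)             # encode
        err = H @ W.T - X              # decode with tied weights, take error
        losses.append(0.5 * np.mean(err ** 2))
        dH = (err @ W) * H * (1.0 - H) # backprop through the encoder
        W -= lr * (X.T @ dH + err.T @ H) / n  # tied-weight gradient step
    return W, losses

# toy unlabeled data standing in for speech frames or image patches
X = rng.random((32, 8))

W1, l1 = train_autoencoder(X, 4)   # layer 1 trained on raw inputs
H1 = sigmoid(X @ W1)
W2, l2 = train_autoencoder(H1, 2)  # layer 2 trained on layer-1 codes
H2 = sigmoid(H1 @ W2)

# H2 (plus W1, W2 as initial weights) would now seed a supervised
# network that is fine-tuned end to end with backprop on labels.
```

The key point the speaker makes is the ordering: each layer learns from unlabeled data alone, and the labels only enter at the final fine-tuning stage.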