
107 - Multi-Modal Transformers, with Hao Tan and Mohit Bansal
NLP Highlights
Using Pre-Trained Transformers to Do Long Tail Image Processing
Namig: I think we could be even a little bit more specific and more general than this. I think Hao had a good example in one of our emails, like, how he remembers a couple of months ago, he had some nice visual versus textual clusters. And so perhaps you can get some leverage out of using some pre-trained language representations to do better on long-tail image phenomena.
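To make the idea concrete, here is a minimal sketch (not from the episode) of using pre-trained language representations to score long-tail image categories, in the style of a CLIP model from Hugging Face transformers. The model name, label strings, and image path are illustrative assumptions, not anything the speakers specify.

```python
# Sketch: let a pre-trained text encoder stand in for scarce labeled
# images of rare ("long-tail") categories by matching image features
# against text embeddings of category descriptions.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Long-tail categories with few or no training images; the text
# encoder's pre-trained knowledge supplies the class representation.
labels = [
    "a photo of an axolotl",
    "a photo of a pangolin",
    "a photo of a dog",
]

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(text=labels, images=image,
                   return_tensors="pt", padding=True)

with torch.no_grad():
    outputs = model(**inputs)

# Similarity logits between the image and each text label,
# normalized into a distribution over categories.
probs = outputs.logits_per_image.softmax(dim=-1)
print(dict(zip(labels, probs[0].tolist())))
```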