
129 - Transformers and Hierarchical Structure, with Shunyu Yao
NLP Highlights
Why Isn't Absolute Position Encoding So Important?
No encoding fared worse on, like, WikiText-2 and some others. We only tried it on, like, language modeling on WikiText, and it's not like a total failure, it's just slightly worse, right? So I'm wondering if redundancy has anything to do with that. And another thing we're thinking is that maybe in natural language processing, the absolute position isn't that important, as evidenced by the success of relative position encoding.
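The contrast the guest draws is between giving the model each token's absolute index versus only the distance between tokens. Below is a minimal NumPy sketch of the two schemes discussed: a sinusoidal absolute encoding added to token embeddings, and a per-distance bias added to the attention logits (in the spirit of Shaw et al. / T5-style relative encodings). All function names, shapes, and hyperparameters here are illustrative assumptions, not details from the episode.

```python
import numpy as np

def sinusoidal_absolute_encoding(seq_len, d_model):
    """Absolute scheme: one fixed vector per position, added to token embeddings."""
    pos = np.arange(seq_len)[:, None]              # (seq_len, 1)
    dim = np.arange(d_model)[None, :]              # (1, d_model)
    angle = pos / np.power(10000.0, (2 * (dim // 2)) / d_model)
    enc = np.zeros((seq_len, d_model))
    enc[:, 0::2] = np.sin(angle[:, 0::2])
    enc[:, 1::2] = np.cos(angle[:, 1::2])
    return enc

def relative_position_bias(seq_len, max_distance=8, rng=None):
    """Relative scheme: a (here randomly initialized) scalar bias per clipped
    query-key offset, added to attention logits, so only distances matter."""
    if rng is None:
        rng = np.random.default_rng(0)
    bias_table = rng.normal(size=2 * max_distance + 1)
    offsets = np.arange(seq_len)[:, None] - np.arange(seq_len)[None, :]
    offsets = np.clip(offsets, -max_distance, max_distance) + max_distance
    return bias_table[offsets]                     # (seq_len, seq_len)

if __name__ == "__main__":
    seq_len, d_model = 6, 16
    token_emb = np.random.default_rng(1).normal(size=(seq_len, d_model))

    # Absolute: the model is told "this token sits at index 3".
    x_abs = token_emb + sinusoidal_absolute_encoding(seq_len, d_model)

    # Relative: attention only sees "the key is 2 positions left of the query".
    q = k = token_emb  # stand-in projections, kept trivial for brevity
    logits = q @ k.T / np.sqrt(d_model) + relative_position_bias(seq_len)
    print(x_abs.shape, logits.shape)
```

The point the snippet gestures at is that the relative bias never exposes absolute indices at all, yet models built this way work well, which is consistent with absolute position being only mildly helpful for language modeling.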