
129 - Transformers and Hierarchical Structure, with Shunyu Yao

NLP Highlights


Why Isn't the Absolute Position Encoding So Important?

The absolute position encoding fared worse on, like, WikiText-2 and some others. We only tried it on, like, language modeling on WikiText, and it's not like a total failure, it's just slightly worse, right? So I'm wondering if redundancy has anything to do with that. And another thing we are thinking of is maybe in natural language processing the absolute position isn't that important, as evidenced by the success of relative position encodings.
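The episode itself doesn't include code; the sketch below is only a rough NumPy illustration of the distinction being drawn. Absolute encodings add a position-dependent vector to each token embedding, while relative schemes (in the style of Transformer-XL or T5 biases) shift the attention logits by an amount that depends only on the offset i - j. Function names, shapes, and the random bias table are illustrative assumptions, not taken from the paper or the conversation.

```python
import numpy as np

def sinusoidal_absolute_encoding(seq_len, d_model):
    """Absolute positions: each index i gets a fixed vector added to its embedding."""
    pos = np.arange(seq_len)[:, None]          # (seq_len, 1)
    dim = np.arange(d_model)[None, :]          # (1, d_model)
    angle = pos / np.power(10000, (2 * (dim // 2)) / d_model)
    enc = np.zeros((seq_len, d_model))
    enc[:, 0::2] = np.sin(angle[:, 0::2])
    enc[:, 1::2] = np.cos(angle[:, 1::2])
    return enc

def relative_position_bias(seq_len, max_distance=8):
    """Relative positions: attention logits get a bias depending only on i - j.
    The bias table would be learned in practice; random values stand in here."""
    bias_table = np.random.randn(2 * max_distance + 1)
    i = np.arange(seq_len)[:, None]
    j = np.arange(seq_len)[None, :]
    rel = np.clip(i - j, -max_distance, max_distance) + max_distance
    return bias_table[rel]                     # (seq_len, seq_len)

def attention_scores(q, k, rel_bias=None):
    """Scaled dot-product logits, optionally shifted by a relative-position bias."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    if rel_bias is not None:
        scores = scores + rel_bias
    return scores

if __name__ == "__main__":
    seq_len, d_model = 6, 16
    x = np.random.randn(seq_len, d_model)      # token embeddings
    # Absolute: position information is baked into the inputs themselves.
    x_abs = x + sinusoidal_absolute_encoding(seq_len, d_model)
    print(attention_scores(x_abs, x_abs).shape)                           # (6, 6)
    # Relative: inputs stay position-free; the bias only encodes offsets.
    print(attention_scores(x, x, relative_position_bias(seq_len)).shape)  # (6, 6)
```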
