129 - Transformers and Hierarchical Structure, with Shunyu Yao

NLP Highlights

The Differences Between the Recurrent Mechanism and the Self-Attention Mechanism

There are differences in how, like, recurrent networks and self-attention networks process this. For a shorter string, you actually have less memory throughout the transformer than for, like, a longer string. And one difference is that with a recurrent net you can actually do this processing in kind of like an online algorithm or streaming algorithm kind of way. It seems like an interesting kind of, like, property of the architecture. Sure.
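
To make the memory contrast concrete, here is a minimal sketch, not from the episode: the dimension `d`, the parameter matrices, and both step functions are illustrative assumptions. A recurrent net compresses everything seen so far into a fixed-size state, so it can consume tokens one at a time in O(d) memory like a streaming algorithm, while a self-attention layer keeps keys and values for every past token, so its memory grows linearly with the length of the string.

```python
# Illustrative sketch (not from the episode): contrasting the memory
# profiles discussed above. All shapes and parameters are arbitrary.
import numpy as np

d = 8  # hidden/model dimension (assumed for illustration)
rng = np.random.default_rng(0)

# Random weight matrices, just for shape-correctness.
W_h, W_x = rng.normal(size=(d, d)), rng.normal(size=(d, d))
W_k, W_v, W_q = (rng.normal(size=(d, d)) for _ in range(3))

def rnn_step(state, token):
    """One recurrent step: memory is O(d), independent of length."""
    return np.tanh(W_h @ state + W_x @ token)

def attention_step(cache, token):
    """One self-attention step with a KV cache: memory is O(t * d)."""
    cache.append((W_k @ token, W_v @ token))  # cache grows each token
    q = W_q @ token
    keys = np.stack([k for k, _ in cache])
    vals = np.stack([v for _, v in cache])
    scores = keys @ q / np.sqrt(d)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ vals

state, cache = np.zeros(d), []
for token in rng.normal(size=(16, d)):
    state = rnn_step(state, token)   # state stays a single d-vector
    _ = attention_step(cache, token) # cache now holds one entry per token
```

After the loop, the recurrent state is still one `d`-dimensional vector, while the attention cache holds sixteen key/value pairs, which is the shorter-string-uses-less-memory point made in the snippet.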
