
129 - Transformers and Hierarchical Structure, with Shunyu Yao

NLP Highlights


The Differences Between the Recurrent and Self-Attention Mechanisms

There are differences in how, like, recurrent networks and self-attention networks process this. For a shorter string, you actually have less memory throughout the transformer than for, like, a longer string. And one difference is that for RNNs, you can actually do this processing in kind of an online-algorithm or streaming-algorithm kind of way, which seems like an interesting property of that architecture. Sure.
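A minimal NumPy sketch of the memory contrast described above (all weights and dimensions are illustrative, not from the episode): an RNN consumes a stream while carrying only a fixed-size hidden state, whereas causal self-attention must keep a key/value cache that grows with the length of the string.

```python
import numpy as np

d = 8  # hidden / key dimension (illustrative)

def rnn_stream(tokens):
    """Streaming RNN: memory is a single fixed-size hidden state,
    regardless of how many tokens have been seen."""
    W = np.eye(d) * 0.5  # toy recurrence weights
    h = np.zeros(d)
    for x in tokens:
        h = np.tanh(W @ h + x)  # state size never grows
    return h

def attention_stream(tokens):
    """Causal self-attention over a stream: the cache of past
    tokens grows linearly with sequence length."""
    cache = []
    out = None
    for x in tokens:
        cache.append(x)
        K = np.stack(cache)              # (t, d): all tokens so far
        scores = K @ x / np.sqrt(d)      # attend from x to the past
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()
        out = weights @ K                # (d,)
    return out, len(cache)

tokens = [np.random.randn(d) for _ in range(16)]
h = rnn_stream(tokens)                   # constant memory: shape (d,)
out, cache_len = attention_stream(tokens)  # cache_len grows to 16
```

The RNN's memory footprint is O(1) in sequence length, which is what makes the online/streaming processing possible; the attention cache is O(n), so a longer string really does mean more memory inside the transformer.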

