AI Transformers in context, with Aleksa Gordić

London Futurists

CHAPTER

Transformers Are Much Better at Modeling Long-Term Dependencies

In 2017, a group of researchers from Google published a paper called "Attention Is All You Need" that introduced the so-called transformer. Since then, it has turned out to be much better than LSTMs. And it doesn't have anything to do with Transformers the movie, or robots, or whatnot. The original application of the transformer in that paper was machine translation, that is, translating from one human language into another.
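The core operation behind the transformer's ability to model long-range dependencies is attention, the mechanism named in that paper's title. A minimal sketch of scaled dot-product attention (the function and variable names here are generic illustrations, not from the episode) shows the key idea: every token can look directly at every other token, however far apart, instead of passing information step by step as an LSTM does.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each query attends to every key at once, so token i can draw
    on token j regardless of the distance between them in the sequence."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # (seq_q, seq_k) pairwise similarities
    # softmax over the key dimension turns scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a weighted mix of the values

# Toy self-attention example: 4 tokens, 8-dimensional embeddings
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8)
```

In a real transformer this runs in parallel across multiple heads and is combined with learned projection matrices, but the dependency-modeling advantage over LSTMs comes from this direct all-pairs interaction.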
