
AI Transformers in context, with Aleksa Gordić

London Futurists

00:00

Transformers Are Much Better at Modeling Long Term Dependencies

In 2017, a group of researchers from Google published a paper called "Attention Is All You Need" that introduced the so-called transformer. Since then, it has turned out to be much better than LSTMs. To be clear, it doesn't have anything to do with Transformers the movie, or robots, or whatnot. The original application of the transformer in that paper was machine translation, that is, translating from one human language into another.
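The "attention" named in the paper title is the mechanism that lets transformers model long-range dependencies: every token directly compares itself against every other token, rather than passing information step by step as an LSTM does. As a rough illustration only (not from the episode), here is a minimal numpy sketch of scaled dot-product attention; the function name and toy shapes are illustrative assumptions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal sketch of attention from 'Attention Is All You Need'.

    Q, K, V: arrays of shape (num_tokens, d_k). Every query attends
    to every key directly, so token 1 can influence token N in one
    step, regardless of how far apart they are in the sequence.
    """
    d_k = K.shape[-1]
    # Similarity of each query with each key, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over keys: each row becomes a probability distribution
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Output is a weighted mix of the value vectors
    return weights @ V

# Toy example: 4 tokens, each an 8-dimensional vector (self-attention)
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8): one mixed vector per input token
```

The key contrast with an LSTM is that the distance between two tokens never matters here: the score matrix connects all pairs in a single matrix multiply, which is why transformers handle long-term dependencies so much better.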
