AI Transformers in context, with Aleksa Gordić

London Futurists

How Can Transformers Do That Better Than Long Short-Term Memory?

Why is the transformer better at handling these long-term dependencies than what was previously available? That's where the concept of attention comes into the picture. And I want to ask you: how can transformers do that better than long short-term memory? Because I understand that was also an idea for preserving context over a period of time.

Exactly. Here's an example of what I mean by a long-term dependency: if I were to ask you, "Tell me more about that," you'd have to figure out what "that" refers to.
