
AI Transformers in context, with Aleksa Gordić

London Futurists


How Can Transformers Do That Better Than Long Short-Term Memory?

Why is the transformer better at handling these long-term dependencies than what was previously available? That's where the concept of attention comes into the picture. And I want to ask you: how can transformers do that better than long short-term memory? Because I understand that was also an idea to preserve the context over a period of time. Exactly. Here's an example of what I mean by a long-term dependency: if I were to ask you, "tell me more about that," you'd have to figure out what "that" refers to.
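The contrast with an LSTM is worth making concrete. An LSTM must carry information about "that" forward through its recurrent state, one token at a time, so the signal can fade over a long gap; attention instead lets every token score every other token directly, however far apart they sit in the sequence. Below is a minimal, illustrative sketch of scaled dot-product self-attention in NumPy; the function and variable names are our own, not from the episode.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Every query scores every key in one matrix product, so a token can
    # attend to a referent hundreds of positions away in a single step,
    # with no recurrent chain for the signal to decay through.
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                          # (seq, seq) pairwise relevance
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                                       # context-mixed token vectors

# Toy self-attention: Q = K = V = the same 4 tokens with 8-dim embeddings.
rng = np.random.default_rng(0)
tokens = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(tokens, tokens, tokens)
print(out.shape)  # (4, 8): one updated vector per token

Because the pairwise scores are computed in a single step, the distance between "that" and its referent costs nothing extra; in a recurrent model the same link would have to survive every intermediate state update.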

