EP72 Joscha Bach on Minds, Machines & Magic

The Jim Rutt Show

Recurrences in Deep Learning Architectures

Deep learning architectures are mostly feed-forward, though they're now adding some simple recurrence. I always ask myself what they're missing by not having these feedback loops. It seems to be a way to think about embeddings into a space of features in a more general way, and it is this notion of attention and self-attention that binds features together across the dimensions into a relational graph. This allows you, for instance, to generate a text in which a noun and a pronoun are associated over a very large distance, or where the initial part of the text mentions by name the person who performs a scientific experiment.
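
To make the attention point concrete, here is a minimal sketch of single-head scaled dot-product self-attention in NumPy; the function, matrix names, and toy data are illustrative assumptions, not anything stated in the episode. Each row of the attention matrix records how strongly one token attends to every other token, so a pronoun near the end of a text can be tied directly to a name near the beginning, with no penalty for the distance between them.

```python
# Minimal sketch of single-head scaled dot-product self-attention (NumPy).
# Matrix names, sizes, and random data are illustrative assumptions only.
# The point: the attention matrix is a full seq_len x seq_len relational
# graph, so any token can link directly to any other, however far apart.
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """One attention head over a sequence of token embeddings X (seq_len x d)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv                 # queries, keys, values
    scores = Q @ K.T / np.sqrt(K.shape[-1])          # relevance of every token pair
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability for softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # each row: one token's links to all others
    return weights @ V, weights                      # mixed values + the relational graph

rng = np.random.default_rng(0)
seq_len, d = 50, 16                                  # e.g. a name at token 3, a pronoun at token 48
X = rng.normal(size=(seq_len, d))                    # stand-in token embeddings
Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
out, attn = self_attention(X, Wq, Wk, Wv)
print(attn.shape)                                    # (50, 50): distance-independent connectivity
```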

