
Yoshua Bengio: equipping AI with higher level cognition and creativity

The Robot Brains Podcast


Is There a Difference in Attention Architectures?

In 2014, when we introduced the modern form of attention that is in transformers these days, we were quite aware that human attention is more like this hard, stochastic phenomenon where you choose one thing or the other. It's just that we didn't have the algorithms to conveniently train a system with stochastic hard attention. And I'm now thinking we can design much better algorithms for learning to attend in a way that makes stochastic, hard decisions, just like with conscious attention.
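The contrast Bengio draws can be sketched in a few lines of NumPy: soft attention (as in transformers) computes a differentiable weighted average over all positions, while stochastic hard attention samples a single position from the same softmax distribution. This is an illustrative sketch, not code from the episode; the function names and shapes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_attention(query, keys, values):
    # Soft (differentiable) attention: weight every value by a softmax
    # over scaled query-key dot products, then average.
    scores = keys @ query / np.sqrt(query.shape[0])
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ values, weights

def hard_attention(query, keys, values, rng):
    # Stochastic hard attention: sample ONE position from the same
    # softmax distribution and return only that value.
    _, weights = soft_attention(query, keys, values)
    idx = rng.choice(len(weights), p=weights)
    return values[idx], idx

keys = rng.standard_normal((5, 4))     # 5 positions, key dim 4
values = rng.standard_normal((5, 3))   # 5 positions, value dim 3
query = rng.standard_normal(4)

soft_out, w = soft_attention(query, keys, values)
hard_out, idx = hard_attention(query, keys, values, rng)
print("soft weights:", np.round(w, 3))
print("hard choice :", idx)
```

The soft version admits ordinary backpropagation, while the hard version's discrete sampling step is what made training inconvenient in 2014 (it requires estimators such as REINFORCE or relaxations rather than plain gradients).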

