Yoshua Bengio: equipping AI with higher level cognition and creativity

The Robot Brains Podcast

CHAPTER

Is There a Difference in Attention Architectures?

In 2014, when we introduced the modern form of attention that is in transformers these days, we were quite aware that human attention is more like this hard, stochastic phenomenon where you choose one thing or the other. It's just that we didn't have the algorithms to conveniently train a system with stochastic hard attention. And I'm now thinking we can design much better algorithms for learning to attend in a way that makes stochastic hard decisions, just like with conscious attention.
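The contrast Bengio draws can be sketched in a few lines: soft attention (as in transformers) takes a differentiable weighted average over all values, while stochastic hard attention samples a single value from the same softmax distribution. This is a minimal illustrative sketch, not code from the episode; the function names and toy inputs are assumptions for illustration.

```python
import numpy as np

def softmax(scores):
    # Numerically stable softmax over a 1-D score vector.
    w = np.exp(scores - scores.max())
    return w / w.sum()

def soft_attention(scores, values):
    # Soft attention: a differentiable blend of ALL values,
    # weighted by the softmax of the scores.
    return softmax(scores) @ values

def hard_attention(scores, values, rng):
    # Stochastic hard attention: sample ONE index from the
    # softmax distribution and return only that value.
    idx = rng.choice(len(values), p=softmax(scores))
    return values[idx]

rng = np.random.default_rng(0)
scores = np.array([0.1, 2.0, -1.0])
values = np.array([[1.0, 0.0],
                   [0.0, 1.0],
                   [1.0, 1.0]])

soft = soft_attention(scores, values)       # a mixture of all rows
hard = hard_attention(scores, values, rng)  # exactly one row of `values`
```

The practical difference Bengio points to is trainability: the soft version backpropagates directly, whereas the sampling step in the hard version is non-differentiable and needs other estimators (e.g. score-function or straight-through methods) to train.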

