Scaling LLMs and Accelerating Adoption with Aidan Gomez at Cohere

Gradient Dissent: Conversations on AI

The Importance of Attention

Attention is such an evocative word. It is hard not to extrapolate it to our own brains and things like that when you use the word "attention." And yet the math of a transformer is super simple. Sometimes I wonder, if the math had come from a different path, it might not have been called attention. Do you think there's any truth to that? Like, how fundamental is attention as a thing?

Yeah, I think the other way of describing it is a soft lookup, right? A soft lookup. Attention is something we can understand; we can grok it.
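The "soft lookup" framing above can be made concrete with a short sketch. This is a minimal, illustrative implementation of scaled dot-product attention in NumPy (not code from the episode): a hard lookup would return the single value whose key best matches the query, while attention returns a weighted average of all values, with weights given by a softmax over query-key similarities.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: shift by the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(query, keys, values):
    # "Soft lookup": instead of selecting the one best-matching key,
    # weight every value by how well its key matches the query.
    scores = query @ keys.T / np.sqrt(keys.shape[-1])  # similarity to each key
    weights = softmax(scores)                          # weights sum to 1
    return weights @ values                            # blend of all values

rng = np.random.default_rng(0)
q = rng.normal(size=(4,))     # one query vector
K = rng.normal(size=(5, 4))   # five keys
V = rng.normal(size=(5, 3))   # five associated values
out = attention(q, K, V)      # shape (3,): a soft mix of the rows of V
```

A hard (dictionary-style) lookup would be `V[np.argmax(q @ K.T)]`; the softmax makes that selection differentiable, which is what lets the whole thing be trained by gradient descent.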
