The Importance of Attention
Attention is such an evocative word. It's hard not to extrapolate it to our own brains and things like that when you use the word attention. And yet the math of a transformer is super simple. Sometimes I wonder, if the math had come from a different path, it might not have been called attention. Do you think there's any truth to that? Like, how fundamental attention is as a thing?

Yeah, I think the other way of describing it is as a soft lookup, right? Soft-lookup attention we can understand. We can grok it.
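The "soft lookup" framing can be made concrete with a short sketch (my own illustration, not from the episode): scaled dot-product attention for a single query is just a softmax-weighted average over values, i.e. a dictionary lookup that returns a blend of matches instead of one exact hit. The function name `soft_lookup` and the toy data are assumptions for this example.

```python
import numpy as np

def soft_lookup(query, keys, values):
    """Scaled dot-product attention for one query vector:
    a 'soft' dictionary lookup returning a similarity-weighted
    blend of the values rather than a single exact match."""
    d = keys.shape[-1]
    scores = keys @ query / np.sqrt(d)       # similarity of query to each key
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ values                  # weighted average of values

# Toy example: three key/value pairs; the query is nearest the first key,
# so the output is a blend that leans toward the first value.
keys = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]])
values = np.array([[10.0], [20.0], [30.0]])
query = np.array([1.0, 0.1])
out = soft_lookup(query, keys, values)
```

As the query aligns more sharply with a single key (larger dot products), the softmax saturates and the soft lookup approaches an ordinary hard lookup of that key's value.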