The Power of Graph Attention
This is not a new area for us, but with the dominance of transformers in recent years, it makes sense to look at them in the space of graphs. One of the big challenges is that the transformer architecture, particularly self-attention, scales quadratically with sequence length, which makes it expensive on long sequences. So this research set out to propose methodologies to get around that computational barrier so that attention can be applied at a larger scale.
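The quadratic cost mentioned above comes from the n-by-n score matrix that vanilla self-attention materialises. A minimal NumPy sketch (not code from the work discussed; the function name and shapes are illustrative) makes the bottleneck visible:

```python
import numpy as np

def self_attention(X):
    """Naive scaled dot-product self-attention over n items (tokens or graph nodes).

    X: array of shape (n, d). The score matrix below is (n, n), so both
    time and memory grow quadratically with n -- the barrier the episode
    describes for long sequences and large graphs.
    """
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)                    # (n, n) -- the quadratic step
    # Numerically stable softmax over each row.
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X                               # (n, d) attended output

X = np.random.default_rng(0).normal(size=(6, 4))
out = self_attention(X)
print(out.shape)  # (6, 4): output keeps the input shape, but the
                  # intermediate score matrix was (6, 6)
```

Doubling n quadruples the size of `scores`, which is why sparse or approximate attention schemes are needed to scale attention to large graphs.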