

#71 - ZAK JOST (Graph Neural Networks + Geometric DL) [UNPLUGGED]
Mar 25, 2022
Zak Jost, an applied scientist at AWS and creator of the Welcome AI Overlords YouTube channel, dives deep into graph neural networks and geometric deep learning. He discusses the intricacies of message passing and the balance between top-down and bottom-up approaches. Zak highlights the importance of equivariant subgraph aggregation networks and addresses the challenges of over-smoothing in GNNs. He also introduces his upcoming GNN course, emphasizing community engagement and collaborative learning.
Geometric Deep Learning Blueprint
- Geometric deep learning provides a blueprint for many deep learning architectures, including RNNs, CNNs, GNNs, and transformers.
- The blueprint stacks layers that are equivariant to the domain's symmetry transformations, interleaved with local pooling, and ends with a final globally invariant layer.
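The blueprint above can be sketched with NumPy for the graph case, where the symmetry is node permutation. The layer and readout below are illustrative assumptions (a simple mean-aggregation message-passing layer and sum pooling), not the specific architectures discussed in the episode:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: adjacency matrix A (5 nodes) and node features X.
A = np.array([[0, 1, 0, 0, 1],
              [1, 0, 1, 0, 0],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [1, 0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(5, 3))
W = rng.normal(size=(3, 3))

def equivariant_layer(A, X, W):
    """One message-passing layer: average neighbour features, then
    apply a shared linear map and ReLU. Relabelling the nodes
    relabels the output rows the same way (permutation equivariance)."""
    A_hat = A + np.eye(len(A))            # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(1))   # mean aggregation
    return np.maximum(D_inv @ A_hat @ X @ W, 0.0)

def invariant_readout(H):
    """Global sum pooling: identical output for any node ordering."""
    return H.sum(axis=0)

H = equivariant_layer(A, X, W)   # equivariant body of the network
z = invariant_readout(H)         # globally invariant graph representation
```

Stacking several such layers (with local pooling between them, for coarsening) and finishing with the invariant readout follows the blueprint's equivariant-then-invariant structure.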
Graph Rewiring
- Graph rewiring techniques can improve GNN performance by optimizing the graph structure for message passing.
- Removing or adding edges can mitigate over-squashing and enable efficient information transmission.
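One simple rewiring heuristic, assumed here for illustration (densifying with 2-hop edges; it is not necessarily the method discussed in the episode), shortens message paths so information needs fewer hops to travel:

```python
import numpy as np

def rewire_two_hop(A):
    """Add an edge between every pair of nodes joined by a 2-hop path.
    Shorter paths mean messages pass through fewer bottlenecks,
    mitigating over-squashing on long, thin graphs."""
    A2 = (A @ A > 0).astype(float)   # pairs reachable in exactly 2 hops
    np.fill_diagonal(A2, 0)          # drop self-loops
    return np.clip(A + A2, 0, 1)

# Path graph 0-1-2-3: nodes 0 and 3 start 3 hops apart.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
R = rewire_two_hop(A)
# Rewiring adds edges 0-2 and 1-3, so node 0 now reaches node 3 in 2 hops.
```

In practice rewiring is a trade-off: adding edges eases information flow but discards some of the original graph structure, which is why more careful methods optimize which edges to add or remove.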
Equivariance vs. Invariance
- Equivariant layers respect symmetry by shuffling their outputs in lockstep with any shuffle of the inputs, while invariant layers produce the same output regardless of input order.
- Convolutional layers in CNNs exemplify (translation) equivariance, while global pooling layers exemplify invariance.
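The distinction can be checked numerically for the permutation symmetry relevant to GNNs. The two functions below are illustrative assumptions, not from the episode; `equivariant` commutes with a permutation matrix `P`, while `invariant` absorbs it:

```python
import numpy as np

X = np.arange(12, dtype=float).reshape(4, 3)   # 4 items, 3 features each
P = np.eye(4)[[3, 0, 2, 1]]                    # a permutation matrix

def equivariant(X):
    # Per-item transform plus the set mean: shuffling the input rows
    # shuffles the output rows identically, i.e. f(P @ X) == P @ f(X).
    return 2.0 * X + X.mean(axis=0)

def invariant(X):
    # Sum pooling: the output ignores row order, i.e. g(P @ X) == g(X).
    return X.sum(axis=0)
```

This is the same pattern as in CNNs, with translations in place of permutations: convolution shifts its output when the input shifts, while global pooling erases the shift entirely.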