
Taco Cohen

Researcher in geometric deep learning and equivariant architectures; discusses the limits of group-theoretic approaches, motivations for moving to monoids and categories, and aligning neural networks with algorithmic reasoning.

Top 3 podcasts with Taco Cohen

Ranked by the Snipd community
69 snips
Dec 22, 2025 • 44min

Making deep learning perform real algorithms with Category Theory (Andrew Dudzik, Petar Veličković, Taco Cohen, Bruno Gavranović, Paul Lessard)

This discussion features Andrew Dudzik, a mathematician specializing in category theory; Taco Cohen, a researcher in geometric deep learning; and Petar Veličković, an expert in graph neural networks. They discuss why LLMs struggle with basic math, attributing the failures to reliance on pattern recognition rather than genuine algorithmic computation. The conversation proposes category theory as a framework for moving AI from trial-and-error toward a more scientific, principled approach, and explores equivariance, compositional structure, and the potential for unifying diverse machine learning perspectives.
64 snips
Sep 19, 2021 • 3h 33min

#60 Geometric Deep Learning Blueprint (Special Edition)

Joining the discussion are Petar Veličković from DeepMind, renowned for his work on graph neural networks, Taco Cohen from Qualcomm AI Research, specializing in geometric deep learning, and Joan Bruna, an influential figure in data science at NYU. They delve into geometric deep learning, exploring its foundations in symmetry and invariance. The conversation highlights innovative mathematical frameworks, the unification of geometries, and their implications in AI. Insights on dimensionality, algorithmic reasoning, and historical perspectives on geometry further enrich this engaging dialogue.
Dec 21, 2020 • 58min

Natural Graph Networks with Taco Cohen - #440

Taco Cohen is a Machine Learning Researcher at Qualcomm Technologies, known for his work on equivariant networks and video compression. In this conversation, he introduces his paper on Natural Graph Networks and the concept of 'naturality,' which proposes that relaxed constraints can lead to more versatile architectures. Taco shares insights on the integration of symmetries from physics in AI, recent advances in efficient GCNNs for mobile, and innovative techniques in neural compression that significantly enhance data efficiency.
