DataFramed

#313 Developing Better Predictive Models with Graph Transformers with Jure Leskovec, Pioneer of Graph Transformers, Professor at Stanford

Aug 4, 2025
Jure Leskovec, a Professor at Stanford and an expert in graph transformers, discusses revolutionary AI advances in predictive modeling. He highlights how graph transformers can capture complex data relationships directly, drastically reduce model-building time, and improve predictive accuracy. Leskovec shares insights on transforming relational databases into graph structures and on how pre-trained models are democratizing machine learning for non-experts. In his view, these innovations may transform data scientists' roles, shifting their focus from data preparation to impactful business decisions.
INSIGHT

Enterprise Data as Graphs

  • Enterprise data is inherently structured in interconnected tables, forming complex graphs rather than simple sequences.
  • Effective predictive models need to understand these graph structures to forecast future actions accurately (a rough sketch of the table-to-graph mapping follows this list).
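
A minimal sketch, not taken from the episode, of the table-to-graph mapping this insight describes: each row of a relational table becomes a node, and each foreign-key reference becomes an edge. The table and column names (customers, orders, customer_id) are invented for illustration.

```python
import pandas as pd

# Two hypothetical relational tables linked by a foreign key.
customers = pd.DataFrame({"customer_id": [1, 2], "region": ["EU", "US"]})
orders = pd.DataFrame({
    "order_id": [10, 11, 12],
    "customer_id": [1, 1, 2],   # foreign key into customers
    "amount": [30.0, 12.5, 99.0],
})

# Nodes: one per row of each table, typed by table name.
nodes = (
    [("customer", cid) for cid in customers["customer_id"]]
    + [("order", oid) for oid in orders["order_id"]]
)

# Edges: one per foreign-key reference (order -> customer).
edges = [
    (("order", row.order_id), ("customer", row.customer_id))
    for row in orders.itertuples()
]

print(len(nodes), "nodes,", len(edges), "edges")  # 5 nodes, 3 edges
```

A graph transformer would then run attention over these typed nodes and edges rather than over a single flattened feature table.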
INSIGHT

Graph Transformers Revolutionize ML

  • Predictive modeling hasn't kept pace with the AI revolution the way language and vision models have.
  • Graph transformers bring learning directly from raw data, plus pre-training, to predictive modeling, eliminating manual feature engineering and per-task training (see the sketch after this list).
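
A rough, hypothetical sketch of the workflow shift this insight describes; PretrainedGraphTransformer and its predict method are invented placeholders rather than a real library API. The point is only the contrast: hand-built, per-task feature pipelines versus pointing a pre-trained model at the raw entity graph and naming the prediction target.

```python
# Traditional workflow (commented out): engineer features per task, then train a task-specific model.
# features = build_features(customers, orders)   # weeks of manual feature engineering
# model = train_classifier(features, labels)     # one bespoke model per prediction task

# Pre-trained graph-transformer workflow, sketched with invented names.
class PretrainedGraphTransformer:
    """Placeholder for a pre-trained relational/graph foundation model."""

    def predict(self, graph, target: str):
        # A real model would run attention over the entity graph; this stub
        # just returns a dummy score per customer node.
        return {node: 0.5 for node in graph["nodes"] if node[0] == "customer"}

graph = {
    "nodes": [("customer", 1), ("customer", 2), ("order", 10)],
    "edges": [(("order", 10), ("customer", 1))],
}

model = PretrainedGraphTransformer()
scores = model.predict(graph, target="will_convert_in_30_days")
print(scores)  # e.g. {('customer', 1): 0.5, ('customer', 2): 0.5}
```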
ANECDOTE

Real-World Success Stories

  • Reddit used graph transformers to compress years of model improvement into weeks, greatly enhancing its ad targeting.
  • Databricks used the technology to predict sales-lead conversion events such as meeting scheduling, uncovering predictive signals that are hard to find by hand.