DataFramed

#313 Developing Better Predictive Models with Graph Transformers with Jure Leskovec, Pioneer of Graph Transformers, Professor at Stanford

Aug 4, 2025
Jure Leskovec, a Stanford professor and pioneer of graph transformers, discusses recent AI advances in predictive modeling. He explains how graph transformers model complex data relationships directly, drastically reducing model training time and improving predictive accuracy. Leskovec shares how relational databases can be transformed into graph structures and how pre-trained models are democratizing machine learning for non-experts. He suggests these innovations may reshape the data scientist's role, shifting focus from data preparation to impactful business decisions.
ADVICE

Shift Data Scientist Focus

  • Data scientists should focus on business impact rather than data cleaning or feature engineering.
  • Graph transformer platforms enable building accurate models in hours instead of weeks, freeing time for innovation.
INSIGHT

Databases as Graphs for AI

  • Relational databases naturally form heterogeneous temporal graphs connecting entities like users, products, and transactions.
  • Graph transformers generalize attention mechanisms to these graphs, enabling learning directly from raw interconnected data; the sketch below illustrates the table-to-graph mapping.
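
As a concrete illustration of the table-to-graph mapping, here is a minimal sketch using HeteroData from PyTorch Geometric, the graph learning library from Leskovec's group. The users/products/transactions schema follows the episode's example, but the row counts, features, and edge names are invented for illustration:

```python
import torch
from torch_geometric.data import HeteroData

# Toy row counts standing in for three relational tables.
num_users, num_products, num_transactions = 3, 2, 4

data = HeteroData()

# Each table becomes a node type; random vectors stand in for
# encoded column values (age, price, amount, ...).
data['user'].x = torch.randn(num_users, 8)
data['product'].x = torch.randn(num_products, 8)
data['transaction'].x = torch.randn(num_transactions, 8)

# Foreign-key columns become typed edges:
# transactions.user_id -> users, transactions.product_id -> products.
txn_ids = torch.arange(num_transactions)
user_fk = torch.tensor([0, 1, 1, 2])
product_fk = torch.tensor([0, 0, 1, 1])
data['transaction', 'made_by', 'user'].edge_index = torch.stack([txn_ids, user_fk])
data['transaction', 'contains', 'product'].edge_index = torch.stack([txn_ids, product_fk])

# Event timestamps make the graph temporal, so a model can be
# restricted to information available before prediction time.
data['transaction'].time = torch.tensor([1, 2, 3, 4])

print(data)
```

A graph transformer then runs attention over these typed edges rather than over token sequences.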
INSIGHT

Scaling Graph Transformers

  • Graph transformers scale up to tens of billions of records and dozens of tables, suitable for large enterprise datasets.
  • Optimal model building involves selectively including tables to balance accuracy against computational cost, as in the sketch after this list.
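
The episode does not spell out how tables are chosen; a simple greedy forward search is one way to sketch the idea. `train_and_eval` here is a hypothetical callback that builds a graph from the chosen tables, trains a model, and returns a validation score:

```python
def select_tables(candidates, train_and_eval, max_tables=5):
    """Greedy forward selection over relational tables.

    `train_and_eval(tables)` is a hypothetical callback: build a graph
    from `tables`, train a model, and return a validation score
    (higher is better). Real platforms may use smarter, schema-aware
    search; this only illustrates the accuracy/cost trade-off.
    """
    selected, best = [], train_and_eval([])
    while len(selected) < max_tables:
        remaining = [t for t in candidates if t not in selected]
        if not remaining:
            break
        # Score each candidate table when added to the current set.
        scored = {t: train_and_eval(selected + [t]) for t in remaining}
        table = max(scored, key=scored.get)
        if scored[table] <= best:
            break  # no remaining table improves the score; stop here
        selected.append(table)
        best = scored[table]
    return selected, best

# Example with a mock scorer that rewards two specific tables:
mock = lambda ts: 0.5 + 0.1 * len({'transactions', 'users'} & set(ts))
print(select_tables(['users', 'products', 'transactions', 'logs'], mock))
```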