
#313 Developing Better Predictive Models with Graph Transformers with Jure Leskovec, Pioneer of Graph Transformers, Professor at Stanford

DataFramed

Revolutionizing Predictive Modeling with Pre-Trained Foundations

This chapter explores the shift in machine learning from task-specific models to pre-trained foundation models that operate directly on raw data. It highlights the benefits of this approach, including reduced training time and improved accuracy, both of which matter for real-time AI-driven decision-making.
