1min snip


#434 – Aravind Srinivas: Perplexity CEO on Future of AI, Search & the Internet

Lex Fridman Podcast

NOTE

Importance of Scale and Data Quality in Model Training

Model training has shifted toward very large models, with billions of parameters trained on datasets of trillions of tokens, which puts a premium on both data quality and data quantity, and on evaluation against reasoning benchmarks. Attention alone was not enough: the breakthroughs in language models came from scaling attention up together with parallel computation, the transformer architecture, and unsupervised pre-training. RLHF is also significant, serving as a valuable addition on top of the pre-training process.
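The attention mechanism the note refers to can be sketched as scaled dot-product attention: each query scores every key, the scores are softmax-normalized, and the output is a weighted mix of the values. This is a minimal illustrative sketch using only the standard library; the function names and toy shapes are my own, not from the episode.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention.

    Q, K, V: lists of d-dimensional vectors (one per token).
    Returns one output vector per query: a softmax-weighted
    average of the value vectors.
    """
    d = len(K[0])
    out = []
    for q in Q:
        # Score the query against every key, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        weights = softmax(scores)
        # Output = convex combination of value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out
```

Because every position's output can be computed independently, the loop over queries parallelizes trivially on accelerators — the "parallel computation" the snip pairs with attention.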

