2min snip


Commoditizing the Petaflop — with George Hotz of the tiny corp

Latent Space: The AI Engineer Podcast

NOTE

Improving Models: Training Time vs Size

A model can be improved by training it for longer or by making it bigger, but neither option is always interesting or necessary. Very large models benefit research labs, but startups and individual users rarely need that many weights. Focusing on inference rather than training can yield better results, and the world should have a much higher ratio of inference to training. Blurring the line between inference and training is a positive development, and on-device fine-tuning is desirable. Comma is exploring parameter-efficient fine-tuning so a car can adapt to situations like a flat tire; being parameter-efficient is crucial for running in cars.
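The parameter-efficient fine-tuning mentioned above can be illustrated with a minimal low-rank-adapter (LoRA-style) sketch: the pretrained weight matrix stays frozen, and only a small low-rank correction is trained on-device. This is an assumption about the general technique, not Comma's actual implementation; all names here are illustrative.

```python
import numpy as np

# LoRA-style sketch: W is frozen; only the low-rank factors A and B train.
rng = np.random.default_rng(0)
d_in, d_out, rank = 512, 512, 8

W = rng.normal(size=(d_in, d_out))        # frozen pretrained weight
A = rng.normal(size=(d_in, rank)) * 0.01  # trainable low-rank factor
B = np.zeros((rank, d_out))               # trainable; zero-init so the delta starts at 0

def forward(x):
    # Effective weight is W + A @ B, but we never materialize the full delta.
    return x @ W + (x @ A) @ B

x = rng.normal(size=(4, d_in))
y = forward(x)

full_params = W.size
lora_params = A.size + B.size
print(f"trainable fraction: {lora_params / full_params:.4f}")  # 8192 / 262144
```

With rank 8 on a 512×512 layer, the trainable parameters are about 3% of the full matrix, which is what makes this style of fine-tuning plausible on embedded hardware in a car.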

