Inside AI's GPU Economy & Infrastructure Boom | Sam Hogan, Kuzco

Training vs Inference in AI

This chapter examines the key differences between training and inference in AI models, emphasizing how much more compute-intensive training is than the comparatively lightweight inference process. It discusses the shifting balance of AI compute usage, noting a growing share going to inference and the opportunities this creates for better resource management. The chapter also explores projects aimed at putting underutilized GPU capacity to work and at integrating cryptocurrency payments into data center operations.
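The training-versus-inference cost gap mentioned above can be made concrete with a back-of-envelope FLOP count. The sketch below assumes a single dense layer and uses the common rule of thumb that a backward pass costs roughly twice a forward pass (one extra matmul each for input gradients and weight gradients); the function names and sizes are illustrative, not from the episode.

```python
# Rough FLOP accounting for one dense (fully connected) layer,
# illustrating why a training step costs more than an inference pass.

def forward_flops(batch: int, d_in: int, d_out: int) -> int:
    # One matmul: (batch x d_in) @ (d_in x d_out)
    # -> about 2 * batch * d_in * d_out FLOPs (multiply + add per element).
    return 2 * batch * d_in * d_out

def training_step_flops(batch: int, d_in: int, d_out: int) -> int:
    # Backward pass needs two more matmuls of similar size
    # (gradients w.r.t. inputs and w.r.t. weights), so a full
    # forward + backward step is roughly 3x the forward cost.
    return 3 * forward_flops(batch, d_in, d_out)

fwd = forward_flops(32, 1024, 1024)
train = training_step_flops(32, 1024, 1024)
print(train // fwd)  # prints 3: training step ~= 3x an inference pass
```

On top of this per-step ratio, training repeats the step over many epochs on the full dataset, while inference runs a single forward pass per request, which is why serving workloads can be spread across smaller or otherwise underutilized GPUs.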
