
Bonus episode - Clouds and AI inference, with Cirrascale's CEO David Driggers
DCD Zero Downtime: The Bi-Weekly Data Center Show
00:00
Navigating the GPU Landscape in AI Inference
This chapter explores the scale of modern data center operations, focusing on GPU deployments for both inference and training. It covers the competitive dynamics between major players such as NVIDIA and AMD, the growing demand for AI compute, the challenges of running large AI models, and the trade-offs in GPU hardware choices for efficient performance.