DCD Zero Downtime: The Bi-Weekly Data Center Show

Bonus episode - Clouds and AI inference, with Cirrascale's CEO David Driggers

Aug 15, 2025
Join David Driggers, CEO and CTO of Cirrascale, a trailblazer in cloud services, as he discusses the outlook for cloud operations in 2025 and how Cirrascale navigates a competitive landscape dominated by hyperscalers. The conversation covers the growing AI inference market, highlighting the importance of GPU deployment and the challenges enterprises face in optimizing AI: balancing efficient resource management against budget constraints. David also reflects on the evolving role of AI in healthcare and programming, emphasizing the need for sustainable business models.
ANECDOTE

From Hardware Maker To Cloud Operator

  • Cirrascale began as a hardware company designing high-performance 8-GPU servers in 2012 and pivoted to cloud services about eight years ago.
  • Customers lacked the data center capacity to host those systems, so Cirrascale kept the hardware and managed it for them remotely.
INSIGHT

Training Vs Inference Deployment Patterns

  • Training workloads are tightly coupled, usually in one data center, while inference is loosely coupled and distributed across regions.
  • Inference needs multi-datacenter deployments for latency and redundancy, unlike training.
INSIGHT

Inference Is About Fast Enough And Profit

  • Inference is about being "as fast as necessary" and cost-efficient, rather than maximally fast.
  • Training is a cost center, while inference becomes a profit center for companies.