

#228 Rodrigo Liang: How SambaNova Systems Is Disrupting AI Inference
Jan 1, 2025
Rodrigo Liang, co-founder and CEO of SambaNova Systems and an expert in high-performance chip design, dives deep into the world of AI inference technology. He shares how SambaNova's record-breaking models deliver exceptional speed and efficiency. The discussion highlights the shift from AI training to real-time applications and the importance of power efficiency. Rodrigo also explores the competitive landscape against giants like NVIDIA, advocating for a new era of scalable, user-friendly AI solutions that empower enterprises without costly infrastructure.
AI Snips
Inference Competition
- AI inference is the new competitive battleground, not training.
- SambaNova's new inference chip and API service challenge NVIDIA's dominance.
Saudi Aramco Partnership
- SambaNova powers Saudi Aramco's internal AI model, MetaBrain.
- This showcases their capability for private, secure, on-premise AI deployment.
Single-Rack Efficiency
- Single-rack efficiency is key to scaling AI models affordably.
- SambaNova's approach avoids costly infrastructure overhauls such as liquid cooling.