
Tech Disruptors: SambaNova CEO Liang on Powering AI Efficiently
Nov 17, 2025
Rodrigo Liang, Co-founder and CEO of SambaNova Systems, shares insights from his extensive background in semiconductor engineering. He discusses how SambaNova is pioneering air-cooled, energy-efficient AI infrastructure, enabling businesses to scale AI rapidly and sustainably. Liang explains the shift from AI training to inference, emphasizing energy efficiency as a key factor for future AI solutions. He also touches on the increasing demand for sovereign AI deployments and the barriers created by habit and legacy software in the industry.
AI Snips
Sell The Same Stack In Three Shapes
- Offer infrastructure in multiple shapes: cloud API, on-prem racks, and managed sovereign clouds to meet varied customer needs.
- Use the same core hardware and software across delivery forms to simplify operations and speed adoption.
Energy Is The Primary AI Bottleneck
- AI demand now splits into hyperscale, sovereign, SaaS replatforming, and new agent applications, all driving huge infrastructure needs.
- Liang frames the core constraint as energy in -> tokens out, making power the primary bottleneck for scale.
Inference Is The New Dominant Workload
- The market has flipped this year, with inference overtaking training as the dominant workload.
- Liang argues that inference's scale and regional deployment requirements make power-efficient hardware crucial to winning capacity.
