
AI and the Future of Work: Artificial Intelligence in the Workplace, Business, Ethics, HR, and IT for AI Enthusiasts, Leaders and Academics 364: Inside the AI Infrastructure Race: TensorWave CEO Darrick Horton on Power, GPUs and AMD vs NVIDIA.
Dec 1, 2025
Darrick Horton, CEO and co-founder of TensorWave, shares insights from his journey in AI infrastructure. He discusses the strategic decision to choose AMD over NVIDIA, explaining how this choice created a competitive edge in a GPU-constrained market. Horton explains why training clusters offer greater versatility than inference clusters and why NeoClouds like TensorWave can innovate faster than hyperscalers. He warns that power limits, more than chip supply, could constrain AI scaling, and reflects on the importance of green energy and the future of hardware architecture.
Power Is The Real Bottleneck
- Power, not GPUs, is the primary constraint for scaling AI infrastructure today.
- Darrick Horton predicts that AI demand will outstrip available power sometime between 2027 and 2029.
Skunk Works Sparked An Infrastructure Career
- Darrick Horton left university early to join Lockheed Martin's Skunk Works, where he worked on nuclear fusion.
- That experience sparked his long-term interest in infrastructure, energy, and computing.
Bet On Practical Alternatives
- Evaluate alternatives to dominant vendors when customers value cost, access, and simplicity over brand.
- Choose platforms with real production capacity and strong software investment to avoid supply constraints.
