
a16z Podcast
Chasing Silicon: The Race for GPUs
Aug 2, 2023
Guido Appenzeller, a seasoned infrastructure expert and former CTO of Intel's Data Center Group, discusses the soaring demand for AI hardware. He highlights the significant supply challenges in GPU production and the competitive strategies companies are using to secure compute resources. The conversation navigates the decision between cloud services and in-house infrastructure, emphasizing the role of open source and the increasing importance of data efficiency. Expect insights on future AI model trends and the cost implications of compute.
21:40
Quick takeaways
- The demand for AI hardware currently outstrips supply, leading to challenges for companies in accessing the compute capacity they need.
- When sourcing compute for AI applications, companies must weigh factors beyond cost and compare offers from multiple providers to find the most suitable option.
Deep dives
The Challenge of Meeting Demand for AI Hardware
The demand for AI hardware currently outstrips supply by roughly a factor of 10, leaving many companies unable to access the compute capacity they need. Bottlenecks in both chip fabrication and the assembly of the cards themselves contribute to the shortage. While companies like Intel and Nvidia could increase production, doing so is complex and slow because new foundries take years and billions of dollars to build. Major investments in new semiconductor fabrication plants are under way, but scaling these facilities will also take time. In the meantime, companies negotiate with cloud providers to secure capacity, often through deals that require long-term commitments and exclusivity.