

20VC: AI Chip Wars: How Cerebras Plans to Topple NVIDIA's Dominance | Why We Have Not Reached Scaling Laws in AI | What Happens to the Cost of Inference | How We Underestimate China and Shouldn't Sell To Them with Andrew Feldman
Mar 24, 2025
Andrew Feldman, Co-founder and CEO of Cerebras, shares deep insights into the AI chip landscape. He discusses how NVIDIA's strengths have become liabilities and why claims that AI has hit the limits of its scaling laws are misleading. On the cost of inference, Feldman highlights the inefficiency of today's algorithms and the need for a shift in AI architecture. He argues that we underestimate China's technological advances and critiques current U.S. hardware export-control policies. Expect a thought-provoking analysis of AI's future and market dynamics.
AI Snips
AI's Computational Challenge
- AI presents a new computational problem that demands a different type of processor.
- The bottleneck is data movement and memory bandwidth, not complex calculations (see the rough sketch below).
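A back-of-the-envelope way to see this point is a roofline-style comparison of how many FLOPs a model performs per byte of weights it reads versus how many FLOPs an accelerator can do per byte it fetches. The sketch below is illustrative only: the hardware figures and model size are assumed numbers, not vendor specs or anything quoted in the episode.

```python
# Illustrative sketch: why batch-1 LLM decode is bound by data movement,
# not arithmetic. All hardware and model numbers below are assumed figures.

def arithmetic_intensity_decode(n_params: float, bytes_per_param: int = 2) -> float:
    """FLOPs performed per byte of weights read for one decode step.

    One decode step does roughly 2 * n_params FLOPs (a multiply-add per weight)
    while streaming every weight from memory once.
    """
    flops = 2.0 * n_params
    bytes_moved = n_params * bytes_per_param
    return flops / bytes_moved            # ~1 FLOP per byte at fp16

# Assumed accelerator figures (orders of magnitude for illustration):
peak_flops = 500e12                        # peak fp16 throughput, FLOP/s
mem_bandwidth = 2e12                       # off-chip memory bandwidth, bytes/s

machine_balance = peak_flops / mem_bandwidth          # FLOPs available per byte fetched
model_intensity = arithmetic_intensity_decode(70e9)   # assumed 70B-parameter model

print(f"machine balance : {machine_balance:.0f} FLOPs/byte")
print(f"model intensity : {model_intensity:.0f} FLOPs/byte")
# Because the model's intensity is far below the machine balance, the chip spends
# most of its time waiting on memory: the workload is bandwidth-bound, so moving
# data faster matters more than adding FLOPs.
```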
Importance of Data Movement in AI Chips
- AI chips must excel at moving data, which is more crucial than performing complex calculations.
- Cerebras focuses on efficient data movement to build faster, less power-hungry AI computers.
GPU's Architectural Weakness
- GPUs were designed for graphics and pair their compute with slow, high-capacity off-chip memory, which limits them in AI.
- That former strength has become a weakness compared to wafer-scale chips built around fast on-chip SRAM (see the rough comparison below).
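Under the same bandwidth-bound assumption, a rough upper bound on generation speed is memory bandwidth divided by the bytes of weights streamed per token. The sketch below contrasts off-chip DRAM-class bandwidth with on-chip SRAM-class bandwidth; both figures and the model size are assumed, illustrative orders of magnitude rather than measured specs.

```python
# Rough, assumed comparison of decode speed when generation is bandwidth-bound:
# tokens/s <= memory bandwidth / bytes of weights read per token.

def max_tokens_per_second(bandwidth_bytes_s: float, n_params: float,
                          bytes_per_param: int = 2) -> float:
    """Upper bound on decode rate if every weight is streamed once per token."""
    return bandwidth_bytes_s / (n_params * bytes_per_param)

model_params = 70e9                        # assumed 70B-parameter model

dram_bandwidth = 3e12                      # TB/s-class off-chip memory (assumed)
sram_bandwidth = 2e15                      # PB/s-class on-wafer SRAM (assumed)

print(f"DRAM-class : {max_tokens_per_second(dram_bandwidth, model_params):,.0f} tokens/s")
print(f"SRAM-class : {max_tokens_per_second(sram_bandwidth, model_params):,.0f} tokens/s")
# The ratio simply tracks the bandwidth gap: as long as the workload stays
# memory-bound, on-chip bandwidth, not peak FLOPs, sets the ceiling on
# generation speed.
```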