

877: The Neural Processing Units Bringing AI to PCs, with Shirish Gupta
Apr 8, 2025
Shirish Gupta, Director of AI Product Management at Dell, draws on two decades of experience in global technology. He discusses the transformative potential of neural processing units (NPUs) for AI applications, emphasizing their efficiency for on-device processing. Gupta highlights the advantages of moving AI workloads from the cloud to local devices, enabling faster and more private computation. He also introduces Dell's Pro AI Studio toolkit, showcasing real-world applications that improve industrial efficiency and even aid first responders in life-or-death situations.
NPU Efficiency
- NPUs are purpose-built for matrix math, making them extremely power-efficient for AI tasks.
- This efficiency is crucial for PCs, where battery life is a major concern.
NPU for Inference
- NPUs currently excel at inference, not training, making them suitable for AI applications in production.
- GPUs remain essential for training and fine-tuning AI models.
AI PC Advantages
- Consider moving AI workloads from the cloud to local devices (the AI PC approach) for several benefits.
- These include faster performance, individualized learning, increased privacy, and cost savings.