

921: NPUs vs GPUs vs CPUs for Local AI Workloads, with Dell’s Ish Shah and Shirish Gupta
Sep 9, 2025
Ish Shah and Shirish Gupta from Dell Technologies share their expertise in AI hardware innovation. They explore the competitive landscape of NPUs versus GPUs and the advantages of using Windows for AI development. Listeners learn about Dell's cutting-edge products, including the new Pro Max mobile workstation with a discrete NPU. The conversation delves into optimizing local versus cloud AI workloads, decision-making in hardware investments, and the importance of future-proofing technology for evolving AI applications.
AI Snips
Windows Still Dominates Developer Environments
- Windows remains the dominant developer OS and is standard in enterprises, so many productivity tools and IDEs run best there.
- Use WSL2 to run a native Linux kernel on Windows when you need Linux tooling and production parity (a small detection sketch follows below).
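Not something shown in the episode, but as a rough illustration of the WSL2 point: a minimal Python sketch that checks whether a script is running under WSL2, native Linux, or Windows, so tooling can branch for production parity. The function name is hypothetical; the check relies on the "microsoft" marker that WSL kernels include in their release string.

```python
# Minimal sketch: detect whether the current interpreter is running on
# Windows, under WSL2, or on native Linux, so scripts can branch accordingly.
import platform
import sys


def runtime_environment() -> str:
    """Return 'windows', 'wsl2', or 'linux' for the current interpreter."""
    if sys.platform == "win32":
        return "windows"
    release = platform.uname().release.lower()
    # WSL2 kernels report a release like '5.15.x-microsoft-standard-WSL2'.
    if "microsoft" in release:
        return "wsl2"
    return "linux"


if __name__ == "__main__":
    print(f"Running under: {runtime_environment()}")
```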
NPUs Optimize AI Workloads For Efficiency
- NPUs are purpose-built accelerators optimized for neural workloads and deliver much better performance per watt.
- They make on-device AI feasible, especially for battery-sensitive mobile and laptop scenarios (see the provider-selection sketch below).
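As a hedged illustration of how a developer typically targets an NPU for local inference (not a workflow described in the episode): a minimal ONNX Runtime sketch that prefers an NPU-backed execution provider and falls back to CPU. The QNNExecutionProvider shown here ships only in NPU-specific ONNX Runtime builds, and the model path is a placeholder.

```python
# Minimal sketch: request an NPU-backed execution provider in ONNX Runtime
# and fall back to CPU if it is unavailable. Provider choice and model path
# are illustrative assumptions, not details from the episode.
import numpy as np
import onnxruntime as ort

MODEL_PATH = "model.onnx"  # placeholder model file

# Preference order: try the NPU provider first, then CPU.
preferred = ["QNNExecutionProvider", "CPUExecutionProvider"]
available = ort.get_available_providers()
providers = [p for p in preferred if p in available] or ["CPUExecutionProvider"]

session = ort.InferenceSession(MODEL_PATH, providers=providers)
print("Active providers:", session.get_providers())

# Run one inference with dummy input matching the model's first input,
# substituting 1 for any dynamic (non-integer) dimensions.
inp = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
x = np.random.rand(*shape).astype(np.float32)
outputs = session.run(None, {inp.name: x})
print("Output shapes:", [o.shape for o in outputs])
```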
Colonoscopy Inference Demo On A Laptop
- Dell demoed a discrete NPU in the Pro Max mobile workstation to run medical imaging inference locally on colonoscopy video.
- The demo processed live video on-device with no internet uplink or cloud queueing required.
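For context, a minimal sketch of the general pattern the demo describes: frame-by-frame inference over local video with no network calls. The library choice (OpenCV plus ONNX Runtime), model file, video file, and input size are assumptions for illustration, not details from Dell's demo.

```python
# Minimal sketch: frame-by-frame inference over a local video with no network
# calls. Model file, video file, and input size are placeholders.
import cv2
import numpy as np
import onnxruntime as ort

# Prefer an NPU-backed provider if present, otherwise fall back to CPU.
providers = [p for p in ("QNNExecutionProvider", "CPUExecutionProvider")
             if p in ort.get_available_providers()]
session = ort.InferenceSession("endoscopy_model.onnx", providers=providers)  # placeholder model
input_name = session.get_inputs()[0].name

cap = cv2.VideoCapture("colonoscopy_clip.mp4")  # placeholder local video file
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Resize and normalize the BGR frame into a 1x3x224x224 float32 tensor.
    resized = cv2.resize(frame, (224, 224)).astype(np.float32) / 255.0
    tensor = np.transpose(resized, (2, 0, 1))[np.newaxis, ...]
    scores = session.run(None, {input_name: tensor})[0]
    # scores would feed an overlay or alerting step; all processing stays
    # on the machine and no frames are uploaded anywhere.
cap.release()
```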