

Scaling AI with Storage Efficiency - with Shawn Rosemarin of Pure Storage
May 19, 2025
Join Shawn Rosemarin, Vice-President of R&D in Customer Engineering at Pure Storage, as he dives into the vital role of storage efficiency in scaling AI. He discusses the transition from outdated systems to modern flash storage, highlighting how this shift reduces energy consumption and enhances scalability. Shawn also tackles historical challenges in AI adoption, emphasizing the importance of data quality and management strategies. Discover why sustainable infrastructure is key for future-proofing AI applications and optimizing operations.
Mainstream AI Depends on Data Quality
- AI adoption shifted from academic circles to the mainstream once ChatGPT made the technology broadly accessible.
- Trust in AI answers depends critically on the data quality and governance behind the scenes.
Storage Must Feed GPUs Fast
- GPUs are the most expensive component of AI infrastructure, and storage must feed them data fast enough to keep them fully utilized.
- Slow storage leaves costly GPU capacity idle, wasting the investment.
Right to Be Forgotten Challenges
- If a person exercises their legal right to be forgotten, their data must be removed from all AI training data and inference pipelines.
- Complying requires real-time data reconfiguration to satisfy regulations and avoid fines.