AI and the Future of Work: Artificial Intelligence in the Workplace, Business, Ethics, HR, and IT for AI Enthusiasts, Leaders and Academics

366: Inside the Age of Inference: Sid Sheth, CEO and Co-Founder of d-Matrix, on Smaller Models, AI Chips, and the Future of Compute

Dec 8, 2025
In this discussion, Sid Sheth, CEO and co-founder of d-Matrix, shares insights from his extensive semiconductor experience. He emphasizes that AI inference represents a transformative opportunity, driving productivity gains unlike any previous technology shift. Sid explains why smaller, more efficient models are key to AI's scalability and why a talent shortage could impede progress. He also discusses the need for purpose-built AI chips and highlights d-Matrix's approach to integrating into existing data centers.
AI Snips
ADVICE

Reinvent Your Career With AI

  • Reinvent yourself by learning and adopting AI tools to stay relevant.
  • Embrace AI as augmentation rather than competition to unlock new career opportunities.
INSIGHT

Why Smaller Models Matter

  • Distillation keeps much of a large model's capabilities in a smaller, more efficient model (a minimal sketch follows this list).
  • Those smaller models unlock widespread adoption because they fit a far wider range of applications and infrastructure.
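
The episode mentions distillation only at a high level; as an illustration, here is a minimal sketch of standard knowledge distillation (not d-Matrix's specific method). The function name, temperature `T`, and blending weight `alpha` are illustrative assumptions: a small "student" model is trained to match a large "teacher" model's output distribution, so much of the teacher's behavior survives in a smaller network.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend soft-target matching against the teacher with ordinary cross-entropy.

    Hypothetical helper for illustration; T and alpha are typical but arbitrary choices.
    """
    # Soft targets: compare the student's and teacher's softened distributions
    # via KL divergence; the T*T factor keeps gradient scale comparable.
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    # Hard targets: the usual supervised loss on the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```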
INSIGHT

Reasoning Models Will Drive Compute Growth

  • Demand for compute will grow as new agentic and reasoning use cases surface.
  • Early reasoning models and agents will drive much higher token and compute consumption.