
Pioneers of AI

With Groq, Jonathan Ross is taking AI inference to new speeds

Apr 9, 2025
Jonathan Ross, founder of Groq and a leader of Google's original TPU effort, discusses the rapid evolution of AI chips built for fast inference. He emphasizes that quicker processing not only improves user experience but also cuts energy and compute costs. The conversation covers how Groq is shaking up the chip market and the global dynamics of AI inference. Ross also highlights the emerging role of prompt engineering and its potential to democratize access to AI, while raising concerns about data safety.
34:44


Podcast summary created with Snipd AI

Quick takeaways

  • Groq's AI chips speed up inference, cutting response times and energy costs for more efficient AI applications.
  • The democratization of AI technology by Groq aims to empower startups and smaller companies, ensuring equitable access to advanced AI resources.

Deep dives

The Rise of Low Latency AI

The demand for fast AI is growing as users shift from slow systems to high-speed solutions, much like the move from dial-up to broadband internet. Groq, a company focused on AI chips it calls LPUs (language processing units), prioritizes low-latency performance to deliver rapid inference. Inference, where a trained model applies what it has learned to produce outputs, is crucial for applications like chatbots that must respond in real time. By emphasizing speed and efficiency, Groq aims to secure its position in a competitive AI market and differentiate itself from slower alternatives.
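To make the latency point concrete, here is a minimal sketch of how one might measure time-to-first-token, a common metric for the responsiveness Ross describes, against an OpenAI-compatible streaming endpoint. The BASE_URL, MODEL, and API_KEY values are placeholders assumed for illustration, not details from the episode.

```python
import time
import json
import requests

# Placeholders (assumptions for illustration only): point these at any
# OpenAI-compatible chat-completions endpoint you have access to.
BASE_URL = "https://example-inference-host/v1"
MODEL = "example-model"
API_KEY = "YOUR_API_KEY"


def time_to_first_token(prompt: str) -> float:
    """Send a streaming chat request and return seconds until the first token arrives."""
    headers = {
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,  # ask the server to send tokens as they are generated (SSE)
    }
    start = time.monotonic()
    with requests.post(f"{BASE_URL}/chat/completions",
                       headers=headers, json=payload, stream=True, timeout=30) as resp:
        resp.raise_for_status()
        for raw_line in resp.iter_lines():
            if not raw_line:
                continue
            line = raw_line.decode("utf-8")
            if not line.startswith("data: ") or line == "data: [DONE]":
                continue
            chunk = json.loads(line[len("data: "):])
            delta = chunk["choices"][0].get("delta", {})
            if delta.get("content"):
                # First generated token has arrived; this gap is the latency
                # users perceive before a chatbot starts "typing".
                return time.monotonic() - start
    raise RuntimeError("Stream ended before any token was received")


if __name__ == "__main__":
    print(f"Time to first token: {time_to_first_token('Say hello.') * 1000:.0f} ms")
```

Lower time-to-first-token is what makes a chatbot feel instant rather than sluggish, which is the user-experience gap the episode compares to dial-up versus broadband.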
