Eye On A.I.

#222 Andrew Feldman: How Cerebras Systems Is Disrupting AI Inference

Nov 28, 2024
Andrew D. Feldman, co-founder and CEO of Cerebras Systems, discusses the impact of the company's wafer-scale engine on AI inference. He highlights record-breaking inference speeds, the industry's shift from GPUs to custom architectures, the importance of fast inference in enterprise workflows, and the competitive landscape with giants like OpenAI. Feldman also touches on climate initiatives involving AI and the company's partnerships with supercomputing centers. Discover how Cerebras is reshaping the future of AI.
INSIGHT

AI Shift: Novelty to Productivity

  • Training creates AI, while inference uses it.
  • Inference is becoming crucial as AI shifts from novelty to productivity.
ANECDOTE

Inference and Internet Speed

  • Andrew Feldman compares the rise of inference to the evolution of internet speed and its impact on businesses like Netflix.
  • Just as faster internet made streaming viable, faster inference will create new markets.
INSIGHT

Edge vs. Core Computing

  • Consumer applications will move to the edge, while enterprise applications need a different compute class.
  • Edge computing complements core computing; it doesn't replace it.