The Daily AI Show

How to Fix AI's Major Traffic Jam

Oct 1, 2025
The hosts dive into the pressing issue of AI's 'traffic jam' caused by outdated chip infrastructure. They reveal that a staggering 75% of energy in chips is wasted on data movement rather than computation. Exciting solutions like co-packaged optics aim to minimize latency and power use by relocating optical engines closer to the chip. They envision a future where photonic compute could revolutionize speed and efficiency. With major companies racing ahead, the discussion emphasizes the need for innovative architectural changes to avoid bottlenecks in data centers.
INSIGHT

Data Centers Could Match A Country's Power

  • Global data center electricity use could hit 945 TWh by 2030, roughly Japan's annual demand.
  • This scale reveals AI's energy needs are a systemic infrastructure problem, not just model inefficiency.
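To get a feel for that scale, a quick back-of-envelope conversion turns annual energy use into continuous power draw. This is a minimal sketch using only the 945 TWh figure from the episode; the hours-per-year constant ignores leap years.

```python
# Back-of-envelope: what does 945 TWh/year look like as continuous power draw?
TWH_PER_YEAR = 945        # projected global data center electricity use by 2030
HOURS_PER_YEAR = 8760     # 365 * 24 (ignoring leap years)

# TWh -> GWh (x1000), then divide by hours to get average GW
avg_power_gw = TWH_PER_YEAR * 1000 / HOURS_PER_YEAR
print(f"Average continuous draw: {avg_power_gw:.0f} GW")  # prints "Average continuous draw: 108 GW"
```

Roughly 108 GW of round-the-clock demand, on the order of a hundred large power plants running continuously.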
INSIGHT

Chips Are Powerful Cities On Copper Streets

  • Chips resemble cities: compute units are the skyscrapers, but copper interconnects are the narrow streets where congestion builds.
  • Moving electrons across copper creates heat and limits throughput, causing a data 'traffic jam' in modern chips.
INSIGHT

Most Chip Energy Moves Data, Not Math

  • Approximately 75% of chip energy is spent moving data, not computing.
  • Replacing copper interconnects with photonics can cut latency, heat, and power use by orders of magnitude.
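A rough sketch makes the data-movement claim concrete. The per-operation energy figures below are illustrative order-of-magnitude numbers commonly cited for older silicon process nodes, not values from the episode; real costs vary by chip and memory hierarchy.

```python
# Illustrative per-operation energy costs in picojoules (assumed, order-of-magnitude
# figures for older process nodes; actual values vary widely by node and design).
ENERGY_PJ = {
    "fp32_multiply": 3.7,      # compute: one 32-bit float multiply
    "sram_read_64b": 5.0,      # data movement: on-chip cache read
    "dram_read_64b": 1300.0,   # data movement: off-chip DRAM access over copper
}

# How many multiplies could you do for the energy of one off-chip fetch?
ratio = ENERGY_PJ["dram_read_64b"] / ENERGY_PJ["fp32_multiply"]
print(f"One DRAM fetch costs about {ratio:.0f}x one multiply")  # ~351x
```

With a single off-chip fetch costing hundreds of multiplies' worth of energy, it is easy to see how moving data, not math, comes to dominate a chip's power budget.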