Heavy Networking

HN793: A Deep Dive Into High-Performance Switch Memory

Aug 22, 2025
LJ Wobker, a Principal Engineer at Cisco with expertise in high-performance memory, shares insights on the intricacies of switch memory. He discusses the differences between TCAM, SRAM, DRAM, and HBM, and the trade-offs in managing these resources for networking. The conversation covers how Ethernet frames are processed, the challenges of implementing TCAM technology, and the evolution of routing practices. Wobker emphasizes the need for strategic memory management to optimize network performance and the value of community engagement in navigating these complexities.
INSIGHT

Packet Parsing Drives Memory Needs

  • A switch parses packets, matches on fields, and performs lookups to decide forwarding and transformations.
  • Every added feature increases parsing, state, and lookup complexity, all of which must be backed by on-chip or off-chip memory.
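The parse/match/lookup flow above can be sketched as follows. This is an illustrative model only: the field names, tables, and actions are assumptions for the example, not any vendor's pipeline or API. Note how enabling one extra feature (an ACL) adds another table the pipeline must consult on every packet.

```python
def parse(frame: bytes) -> dict:
    """Extract the header fields a switch would match on (Ethernet II)."""
    return {
        "dst_mac": frame[0:6].hex(),
        "src_mac": frame[6:12].hex(),
        "ethertype": int.from_bytes(frame[12:14], "big"),
    }

# Each enabled feature adds another table the pipeline must consult,
# and each table consumes lookup memory (TCAM/SRAM) on the chip.
MAC_TABLE = {"aabbccddeeff": "port3"}          # L2 forwarding state
ACL_TABLE = [({"ethertype": 0x0806}, "trap")]  # feature: punt ARP to CPU

def forward(frame: bytes) -> str:
    """Decide the forwarding action for one frame: ACL first, then L2."""
    fields = parse(frame)
    for match, action in ACL_TABLE:
        if all(fields.get(k) == v for k, v in match.items()):
            return action
    return MAC_TABLE.get(fields["dst_mac"], "flood")
```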
ADVICE

Use The SDK To Bridge OS And Chip

  • Use an SDK/driver layer to manage hardware tables and map OS intents to chip memory.
  • Keep the software layer lean and aware of hardware constraints to avoid impossible operations.
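A minimal sketch of such an SDK-style shim, assuming a bounded hardware table; the class, capacity, and error names are hypothetical, not a real vendor SDK. The point is the rejection path: the software layer knows the chip's limits and refuses operations the hardware can never program.

```python
class HwTableFullError(Exception):
    """Raised when an OS intent exceeds what the chip can hold."""

class RouteSdk:
    """Maps OS route intents onto a fixed-size chip table (illustrative)."""

    def __init__(self, capacity: int):
        self.capacity = capacity  # hardware limit the SDK must enforce
        self.table = {}           # stand-in for the ASIC's FIB memory

    def add_route(self, prefix: str, next_hop: str) -> None:
        # Reject impossible operations up front instead of silently
        # accepting state the hardware cannot honor.
        if prefix not in self.table and len(self.table) >= self.capacity:
            raise HwTableFullError(f"FIB full at {self.capacity} entries")
        self.table[prefix] = next_hop
```

Updating an existing entry stays allowed even when the table is full, since it consumes no new slot.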
INSIGHT

DRAM: Capacity Over Speed

  • DRAM is big, cheap, and relatively slow; it's the commodity memory used for non-time-critical storage.
  • Use DRAM where deep capacity matters over per-packet latency or deterministic timing.
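The capacity-versus-latency trade-off can be expressed as a back-of-the-envelope placement rule: put a table in the cheapest memory that still meets its per-packet latency budget. The access times and relative costs below are rough assumptions for illustration, not datasheet values.

```python
# (name, approximate access time in ns, relative cost per bit)
MEMORIES = [
    ("SRAM", 1, 100),   # fast, expensive, shallow
    ("HBM", 50, 10),    # middle ground
    ("DRAM", 80, 1),    # big, cheap, relatively slow
]

def place(latency_budget_ns: int) -> str:
    """Pick the lowest-cost memory whose access time fits the budget."""
    candidates = [m for m in MEMORIES if m[1] <= latency_budget_ns]
    return min(candidates, key=lambda m: m[2])[0]
```

Under these assumed numbers, a deep buffer with a relaxed budget lands in DRAM, while a per-packet lookup with a tight budget is forced into SRAM.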