Practical AI

Building a deep learning workstation

Nov 17, 2020
Discover the ins and outs of building a deep learning workstation, from hardware choices like GPUs to optimizing performance. The hosts discuss the balance between custom builds and pre-built systems while addressing hardware shortages. Learn about crucial considerations like motherboard design, cooling solutions, and the nuances of network connectivity. They also share insights on development workflows using TensorFlow and PyTorch, plus tips on effectively upgrading your setup. It's a treasure trove of information for AI enthusiasts!
AI Snips
ANECDOTE

Workstation Build and GPU Scarcity

  • Daniel Whitenack built a deep learning workstation with two GPUs for NLP and speech tasks.
  • Like many buyers in late 2020, he ran into scarcity of NVIDIA's 30-series GPUs.
ADVICE

Motivation for Building a Workstation

  • Weigh ongoing cloud computing costs against the one-time cost of building a workstation.
  • For long, intensive training runs, owning the hardware can be more cost-effective than renting cloud GPUs.
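The cost trade-off above can be sketched as a quick break-even estimate. All dollar figures below are hypothetical placeholders, not numbers from the episode:

```python
# Back-of-the-envelope break-even estimate: at how many GPU-hours does
# owning a workstation beat renting cloud GPUs?
# All prices are hypothetical, not figures quoted in the episode.

def break_even_hours(workstation_cost: float,
                     cloud_rate_per_hour: float,
                     power_cost_per_hour: float = 0.0) -> float:
    """Hours of GPU training at which the build pays for itself."""
    savings_per_hour = cloud_rate_per_hour - power_cost_per_hour
    if savings_per_hour <= 0:
        raise ValueError("cloud rate must exceed local running cost")
    return workstation_cost / savings_per_hour

# Example: a $3,000 two-GPU build vs. a $2.50/hr cloud instance,
# assuming roughly $0.10/hr in electricity for the workstation.
hours = break_even_hours(3000, 2.50, 0.10)
print(f"Break-even after ~{hours:.0f} GPU-hours")
```

With these placeholder numbers the build pays for itself after about 1,250 GPU-hours of training, which is why long, sustained workloads tip the balance toward a local workstation.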
ANECDOTE

Workstation Components and Expandability

  • Daniel based his build on blog posts by Jeff Chen and Curtis Northcutt, aiming for expandability.
  • He chose a Gigabyte Aorus motherboard that supports up to four GPUs, installing two initially to leave room for expansion.