Ep 11: Future of AI Computing and How to Build & Nurture Hardware Teams with Jim Keller, Tenstorrent

Computer Architecture Podcast

How to Build a GPU Training Machine

The human brain seems to be intelligent, and people estimate it performs 10 to the 18th to 10 to the 22nd operations, depending on who you ask. So we have a computer about this big, right, which is an average, you know, intelligent-operation computer. And to build that today with GPUs would take 100,000 or something. Moore's law fixed it. The Fortran computer of the '70s is 0.1 square millimeters now. It took a really big computer to run a simple Fortran program, but now that computer fits on half a millimeter of silicon.
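The "100,000 GPUs" figure in the quote is a back-of-envelope division. As a rough sketch (all numbers here are assumptions for illustration, not figures from the episode): if the brain performs somewhere between 10^18 and 10^22 operations per second, and one GPU delivers on the order of 10^15 operations per second, the ratio gives the GPU count.

```python
# Back-of-envelope check of the estimate in the quote.
# All constants below are illustrative assumptions, not from the episode.

BRAIN_OPS_LOW = 1e18   # assumed lower estimate of brain ops/s
BRAIN_OPS_HIGH = 1e22  # assumed upper estimate of brain ops/s
GPU_OPS = 1e15         # assumed throughput of one GPU (~1 petaflop/s)

# Number of GPUs needed to match brain-scale compute at each bound
gpus_low = BRAIN_OPS_LOW / GPU_OPS
gpus_high = BRAIN_OPS_HIGH / GPU_OPS

print(f"GPUs needed: {gpus_low:.0f} to {gpus_high:.0f}")
```

With a mid-range estimate of 10^20 operations per second, this works out to 100,000 GPUs, which is where the figure in the quote lands.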

Transcript excerpt (from 30:51)
