
Ep 11: Future of AI Computing and How to Build & Nurture Hardware Teams with Jim Keller, Tenstorrent


Is There a Future for Training?

When humans think, we don't train all the time. Like Ilya at OpenAI points out, when you do something really fast, it only goes through about six layers of neurons. That's inference. And then the really interesting thing we humans mostly do is more like generative stuff: you have some set of inputs, you go through your inference network that generates something, and then you look at what it produced. Then you make a decision to do it again. It seems like there are going to be breakthroughs in how we do training, because humans do multiple kinds of training. Something exciting is going to happen there.

Transcript excerpt from 47:45
