Sara Hooker: Cohere For AI, the Hardware Lottery, and DL Tradeoffs

The Gradient: Perspectives on AI

Is the Hardware Lottery Overindexing?

In some ways it does seem like we have more expansiveness in terms of the hardware architectures available, but the goals are becoming pretty condensed. One other thing I notice is that it seems to be indexing pretty hard on the types of models that are popular today. Ideas like capsule nets, for instance, maybe didn't pan out because they didn't fit well with existing architectures. And a lot of hardware vendors these days are making claims like "we can train GPT-2 really fast," that is, architectures like the transformer that have already become very popular on the machine learning side and exploit things like parallelism on GPUs. And all this means
