The Importance of Scale in LLMs
There are clear performance differences between different models of the same size. In some sense, we want to experiment a little bit more with the smaller models for a while because they're so much easier to use. You can run inference on a smaller model on a single GPU, whereas with a 50-billion-parameter model, you need something like an inference platform to use it. But we have a lot to learn there, and that's part of our research right now.
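To make the single-GPU point concrete, here is a minimal sketch of what inference on a smaller model can look like, using the Hugging Face transformers library; the library choice and the placeholder model name are assumptions for illustration, not anything the speaker specifies. A roughly 50-billion-parameter model would not fit in one GPU's memory this way and typically needs weight sharding across devices or a dedicated serving platform.

```python
# Sketch: single-GPU inference with a small causal language model.
# Model name is a placeholder; any checkpoint small enough to fit in one
# GPU's memory works the same way.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # hypothetical small model for illustration

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # half precision to reduce memory footprint
).to("cuda")                    # the entire model fits on a single device

inputs = tokenizer("Scale matters in LLMs because", return_tensors="pt").to("cuda")
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=50)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

By contrast, serving a model in the tens of billions of parameters usually means splitting the weights across several GPUs (tensor or pipeline parallelism) behind an inference server, which is the extra infrastructure the speaker alludes to.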