How Much Compute Do You Have for Your Language Models?
We're currently very dubious towards large language models. That doesn't mean we're opposed to multimodal models or anything like that; we just think RL is pretty scary for various reasons. We kind of think trying to develop alternatives to RL is a pretty good way forward, actually. But that aside: Li yu dn, he's the lead author of MAGMA, which was a paper that combined language models with images, and it worked really, really well. And so you mentioned a bit about the compute part. But I'm curious, because you didn't talk much about the [unclear] part.