The Bearish Signal on Low-Rank Adaptation
The latest discussion is about low-rank adaptation for fine-tuning: instead of updating all of, say, 12 billion parameters, you only update a low-rank projection of the weights during fine-tuning. The catch is that you need some shared way of expressing where these adapter weights should go in the model, so that you can merge one fine-tune with another and do all the other things that would make the approach really good. There's no open standard that everybody can use that solves that problem. A rough sketch of the idea follows below.
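
To make the "update only a low-rank projection" idea concrete, here is a minimal sketch of a LoRA-style adapter wrapped around a frozen linear layer, assuming a PyTorch setup. The class name LoRALinear and the rank and alpha values are illustrative assumptions, not the API of any particular library.

```python
# Minimal LoRA-style adapter sketch (illustrative, not a specific library's API).
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        # Freeze the original full-rank weights; only the low-rank factors train.
        for p in self.base.parameters():
            p.requires_grad = False
        # Effective weight is W + (alpha / rank) * B @ A, where A and B are low rank.
        self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen base path plus the trainable low-rank update.
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)


if __name__ == "__main__":
    layer = LoRALinear(nn.Linear(1024, 1024), rank=8)
    trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
    total = sum(p.numel() for p in layer.parameters())
    print(f"trainable {trainable:,} of {total:,} parameters")
```

Because only A and B are trainable, the adapter touches a tiny fraction of the parameters; merging a fine-tune back into the base weights amounts to adding scale * B @ A to the frozen weight matrix. The interoperability problem mentioned above is that there is no agreed-upon, open format for naming which layers an adapter targets, so merging or composing adapters from different sources stays ad hoc.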