Facebook's Pre-Trained Model for Text Language Enhancement
Facebook's LLaMA runs offline, so you don't need a constant data connection to use it. The pre-trained model that Facebook released comes in four different sizes, if I remember correctly: a 7 billion parameter model, a bigger one with 13 billion, another with 30 billion, and another with 65 billion. It seems that the more parameters the model has, the more interesting its results are.
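For context, here is a minimal sketch of what that offline use looks like in practice, assuming you have the llama-cpp-python package installed and a local copy of the 7B weights converted to GGUF format (the model path below is hypothetical):

```python
# Minimal sketch of running LLaMA fully offline with llama-cpp-python.
# Assumes the 7B weights were downloaded and converted to GGUF locally;
# the model path is hypothetical.
from llama_cpp import Llama

# Load the model from disk -- no network access is needed after this point.
llm = Llama(model_path="./models/llama-7b.Q4_K_M.gguf", n_ctx=512)

# Run a single completion locally on your own machine.
output = llm("Q: What is the capital of France? A:", max_tokens=32, stop=["Q:"])
print(output["choices"][0]["text"])
```

The larger checkpoints (13B, 30B, 65B) load the same way; the trade-off is that they need proportionally more memory and compute on the local machine.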