Is There a Mismatch Between Pre-Training and Data Selection?
There's a mismatch currently with the pre-training phase, where we just do language modelling, so causal or masked language modelling. But I think there are extensions of these algorithms where you can think not about the labelling part, but about the pre-training part. So maybe I can pick the parts of a large corpus that I should be pre-training on now, because I know what downstream task I care about.

India: I see. You're working on dynamic adversarial data collection as well with some folks, where you have a model in the loop and a human is trying to fool the model. And maybe tie it back to this long-term vision of having language models interacting with
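The idea of picking parts of a corpus to pre-train on given a known downstream task can be sketched in miniature. The snippet below is a toy illustration, not a method from the conversation: it ranks corpus documents by token overlap with a handful of downstream task examples and keeps the top-k for continued pre-training. Real task-aware data selection would use learned relevance scores or embedding similarity; the function names and the overlap heuristic here are illustrative assumptions.

```python
from collections import Counter

def overlap_score(doc_tokens, task_counts):
    # Toy relevance score: count token overlap between a corpus
    # document and the downstream task examples. A stand-in for a
    # learned selector or embedding-similarity score.
    doc_counts = Counter(doc_tokens)
    return sum(min(doc_counts[t], c) for t, c in task_counts.items())

def select_pretraining_subset(corpus, task_examples, k):
    # Rank corpus documents by relevance to the downstream task and
    # keep the top-k as the continued pre-training subset.
    task_counts = Counter(
        tok for ex in task_examples for tok in ex.lower().split()
    )
    ranked = sorted(
        corpus,
        key=lambda doc: overlap_score(doc.lower().split(), task_counts),
        reverse=True,
    )
    return ranked[:k]

# Hypothetical corpus and task description, for illustration only.
corpus = [
    "The stock prices rose after the earnings report",
    "The recipe calls for two cups of flour",
    "Quarterly revenue beat analyst estimates",
]
task = [
    "classify sentiment in financial earnings news",
    "is this revenue report positive",
]
subset = select_pretraining_subset(corpus, task, k=2)
# The finance-related documents rank above the cooking one.
```

The design point matches the quote: the selection signal comes from the downstream task, but the selected data is still used for ordinary language-model pre-training, not for labelling.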