The Future of Language Models
A lot of these large language models are trained on so much data that it's just not plausible for a human to ever experience that much language in their whole lifetime. In ten years, a person hears not even a billion tokens, maybe more like a hundred million. So if I want to train a model on that amount of text and say, well, I should be able to say something interesting about the patterns that a learner could acquire from that amount of data, that's a very different kind of research question from what you can do when you have a hundred billion tokens of text.