How to Cluster a Tokenizer
The dataset that the models were trained on has over 100 million tokens. The clustering you did was automatic, right? You didn't force a cluster on, say, the newline thing — that cluster emerged on its own. And then you looked at the different clusters. How many clusters did you find in total, approximately?
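A minimal sketch of the kind of automatic clustering discussed here: token embeddings are grouped with an unsupervised algorithm such as k-means, so groupings like a "newline/whitespace" cluster emerge on their own rather than being assigned by hand. The model name, the use of input embeddings as token vectors, and the cluster count are illustrative assumptions, not details taken from the conversation.

```python
import numpy as np
from sklearn.cluster import KMeans
from transformers import AutoTokenizer, AutoModel

# Hypothetical choices for illustration only.
model_name = "gpt2"
n_clusters = 50

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Use the model's input embedding matrix as one vector per vocabulary token.
embeddings = model.get_input_embeddings().weight.detach().numpy()

# Unsupervised clustering: no cluster labels are imposed in advance.
kmeans = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(embeddings)

# Inspect a few tokens from the first clusters to see what each one captured
# (e.g. a cluster dominated by newline and whitespace tokens).
for c in range(3):
    token_ids = np.where(kmeans.labels_ == c)[0][:10]
    print(c, tokenizer.convert_ids_to_tokens(token_ids.tolist()))
```

The number of clusters is a free parameter here; in practice it would be chosen by inspecting the resulting groups or with a criterion such as silhouette score.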