
MLG 019 Natural Language Processing 2

Machine Learning Guide


The Number of Grams in a Language Model

In language models, in NLP, we work with things called n-grams. A token, for the most part, is a gram. If you're splitting your document into one-grams, single grams, those are called unigrams. You can also split your document into bigrams, that is, two-grams: n being two in n-gram, bigram. Now notice there's overlap: each bigram overlaps with the next bigram. Why would you do this? Why would you split into bigrams, or even trigrams? And n can be any number. Well, as you'll see in this next part on language modelling, the number of grams can increase the accuracy of your language model.
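The overlapping split described above can be sketched in a few lines of Python. This is an illustrative helper, not code from the episode; the example sentence is made up:

```python
def ngrams(tokens, n):
    """Slide a window of size n over the token list.
    Consecutive n-grams overlap by n - 1 tokens."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

tokens = "the cat sat on the mat".split()

print(ngrams(tokens, 1))  # unigrams: [('the',), ('cat',), ('sat',), ...]
print(ngrams(tokens, 2))  # bigrams:  [('the', 'cat'), ('cat', 'sat'), ...]
print(ngrams(tokens, 3))  # trigrams: [('the', 'cat', 'sat'), ...]
```

Note how each bigram shares one token with the next, e.g. `('the', 'cat')` is followed by `('cat', 'sat')` — that is the overlap mentioned above.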
