
The Limits of NLP

Data Skeptic

CHAPTER

Using Attention Masks in a Language Model

When you have a model that uses attention, one of the pieces that you can manipulate is what people call the attention mask. So for example, if I'm predicting a sequence and I want to predict the next token, but I don't want to allow the model to look at the future tokens, I should modify the attention mask so it can only look at tokens from the past. The reason this is important in our paper is that a language model, which is just trained for next-step prediction, uses this causal attention mask, where you don't allow the model to attend to the future. But you can tweak that architecture a bit and basically allow the model to attend fully, without any mask.
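To make that concrete, here is a minimal sketch (not from the episode or the paper) of the two masking choices described above, written in PyTorch. The helper name `attention` and the tensor shapes are illustrative assumptions, not anything the speaker specifies.

```python
# Illustrative sketch, not from the episode: causal vs. unmasked attention.
import torch
import torch.nn.functional as F

def attention(q, k, v, mask=None):
    # Scaled dot-product attention; `mask` is True where a query
    # position is allowed to attend to a key position.
    scores = q @ k.transpose(-2, -1) / (k.size(-1) ** 0.5)
    if mask is not None:
        # Disallowed positions get -inf so softmax gives them zero weight.
        scores = scores.masked_fill(~mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v

seq_len, d = 5, 8  # toy sizes, chosen arbitrarily
q = k = v = torch.randn(seq_len, d)

# Causal mask: position i may only attend to positions j <= i (the past).
causal_mask = torch.tril(torch.ones(seq_len, seq_len, dtype=torch.bool))
out_causal = attention(q, k, v, mask=causal_mask)

# Dropping the mask lets every position attend to the full sequence,
# including future tokens -- the architectural tweak described above.
out_full = attention(q, k, v, mask=None)
```

The only difference between the two calls is the mask: the lower-triangular boolean matrix enforces next-token-style causal attention, while passing no mask gives the fully bidirectional attention the speaker mentions.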
