2min chapter

"Age of Miracles" cover image

Anton Teaches Packy AI | E1

"Age of Miracles"

CHAPTER

The Position Encoding in a Recurrent Model

In the GPT-3 playground screen, the regular text that I type at the top, before you see all the green highlighted stuff below, is the input. That's the input we're talking about here, and it gets turned into a vector of numbers. The next thing that happens is the positional encoding, and this is actually a really important step. In a recurrent model like an LSTM, you don't need a positional encoding; the position is implicit, because if you're on the nth step, you know you're talking about the nth position. Here, the model gets the whole text at once.
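A minimal sketch of that step, assuming the sinusoidal positional encoding from the original Transformer paper: token embeddings for the whole sequence arrive at once, so a position-dependent vector is added to each one. GPT-3 itself uses learned position embeddings rather than this fixed formula, so treat the exact scheme and the dimensions below as illustrative.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> np.ndarray:
    """Build a (seq_len, d_model) matrix of fixed sinusoidal position vectors."""
    positions = np.arange(seq_len)[:, np.newaxis]    # (seq_len, 1)
    dims = np.arange(d_model)[np.newaxis, :]         # (1, d_model)
    # Each pair of embedding dimensions oscillates at a different frequency.
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates
    encoding = np.zeros((seq_len, d_model))
    encoding[:, 0::2] = np.sin(angles[:, 0::2])      # even dimensions: sine
    encoding[:, 1::2] = np.cos(angles[:, 1::2])      # odd dimensions: cosine
    return encoding

# Hypothetical token embeddings for a 5-token input (random, for illustration).
seq_len, d_model = 5, 16
token_embeddings = np.random.randn(seq_len, d_model)

# Because the whole sequence is processed in parallel (unlike an LSTM stepping
# through positions one at a time), position has to be injected explicitly.
model_input = token_embeddings + sinusoidal_positional_encoding(seq_len, d_model)
print(model_input.shape)  # (5, 16)
```

Without that added term, shuffling the input tokens would leave the model's view of the sequence unchanged, which is exactly the problem the positional encoding solves.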
