
Anton Teaches Packy AI | E1
"Age of Miracles"
00:00
The Position Encoding in a Recurrent Model
In the GPT-3 playground screen, the regular text that I type at the top, before you see all the green highlighted stuff below, is the input. That's the input we're talking about here. And that gets turned into numbers, a vector of numbers. Then the next thing that happens is the positional encoding. This is actually a really important thing. In a recurrent model like an LSTM, you don't need the positional encoding; the position is implicit, because if you're on the nth step, you know you're talking about the nth position. Here we've got the whole text at once, so the model has to be told explicitly where each token sits.
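To make the idea concrete, here is a minimal sketch of one way to add positional information: the fixed sinusoidal encoding from the original Transformer paper. GPT-style models typically learn their position embeddings instead, and the array shapes and variable names below are illustrative assumptions, but the core move is the same: compute a position-dependent vector for every slot in the sequence and add it to the token embeddings, since the model otherwise sees the whole text at once with no notion of order.

```python
import numpy as np

def sinusoidal_positional_encoding(seq_len, d_model):
    """Fixed sinusoidal positional encoding (Vaswani et al., 2017).

    Gives each position a unique d_model-dimensional vector so the
    model can tell token order, even though (unlike an RNN) it sees
    the entire sequence in one shot.
    """
    positions = np.arange(seq_len)[:, np.newaxis]            # (seq_len, 1)
    dims = np.arange(d_model)[np.newaxis, :]                 # (1, d_model)
    angle_rates = 1 / np.power(10000, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates                          # (seq_len, d_model)
    encoding = np.zeros((seq_len, d_model))
    encoding[:, 0::2] = np.sin(angles[:, 0::2])               # even dims: sine
    encoding[:, 1::2] = np.cos(angles[:, 1::2])               # odd dims: cosine
    return encoding

# Hypothetical token embeddings for a 5-token prompt, model width 16
token_embeddings = np.random.randn(5, 16)
model_input = token_embeddings + sinusoidal_positional_encoding(5, 16)
```

In an RNN you would never need this step, because position falls out of the step index; here it has to be injected into the input itself before the rest of the model runs.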