
Ep 169: Livestream & Happy New Year
ToKCast
00:00
ChatGPT: The Self-Attention Transformer
ChatGPT uses a new mode of generating predictions: it uses this thing called the self-attention transformer, which is a highly mathematical, very sophisticated way of weighting certain guesses that it makes, with the proviso that they're not genuine guesses. There are two things to say about the way ChatGPT and self-attention networks work. There were these things called recurrent neural networks, which were able to make predictions, but they didn't work well for things like language translation. A group of very smart people published a paper called "Attention Is All You Need", also called the transformer paper, and people have been using this transformer, this new algorithm, to create things like ChatGPT.
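The "weighting of guesses" described above can be sketched as scaled dot-product self-attention: each token's output is a weighted average over all tokens, with weights computed from pairwise similarity. This is an illustrative NumPy sketch only; a real transformer adds learned query/key/value projections, multiple heads, and positional information, all omitted here.

```python
import numpy as np

def self_attention(X):
    """Minimal scaled dot-product self-attention (illustrative sketch).

    X: (seq_len, d) array of token embeddings. For simplicity, the
    queries, keys, and values are all X itself (no learned projections).
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                         # pairwise token similarity
    scores = scores - scores.max(axis=-1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)  # softmax over each row
    return weights @ X, weights                           # weighted mix of token vectors

# Three toy "token" vectors:
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
out, w = self_attention(X)
# Each row of w sums to 1: every token's output is a weighted
# average over all the tokens in the sequence.
```

Each row of `w` is a probability distribution over the sequence, which is the sense in which the transformer "weights" its candidate continuations.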