
#306 – Oriol Vinyals: Deep Learning and Artificial General Intelligence
Lex Fridman Podcast
Tokenize Text in a Sequence? (37:45)
Text may actually be a good driver to enhance the data, right? So then these connections might be made more easily. And also, I believe there's some new research on the way in which you prepare the data. Tokenization is the entry point to make all the data look like a sequence. We break down anything into these puzzle pieces, and then we just model what this puzzle looks like when you lay it down in a line, so to speak, in a sequence.
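To make the "puzzle pieces" idea concrete, here is a minimal sketch of byte-level tokenization in Python. This is not the tokenizer used in the models discussed in the episode (those typically use learned subword vocabularies such as BPE or SentencePiece); it only illustrates the interface Vinyals describes: any data is encoded into a flat sequence of integer IDs, that sequence is what gets modeled, and it can be decoded back into the original data.

```python
# Minimal sketch: bytes as the "puzzle pieces". Real tokenizers learn larger
# pieces (subwords), but the interface is the same: encode -> sequence of IDs,
# decode -> original data.

def encode(data: bytes) -> list[int]:
    """Lay the data down in a line: one integer ID (0-255) per byte."""
    return list(data)

def decode(ids: list[int]) -> bytes:
    """Reassemble the puzzle pieces back into the original bytes."""
    return bytes(ids)

text = "Tokenization is the entry point."
ids = encode(text.encode("utf-8"))
print(ids[:10])  # e.g. [84, 111, 107, 101, 110, 105, 122, 97, 116, 105]
assert decode(ids).decode("utf-8") == text
```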


