
#306 – Oriol Vinyals: Deep Learning and Artificial General Intelligence
Lex Fridman Podcast
Tokenize Text in a Sequence?
Text may actually be a good driver to enhance the data, right? So then these connections might be made more easily. And also, I believe there's some new research, or a new way in which you prepare the data. Tokenization is the entry point to make all the data look like a sequence. We break down anything into these puzzle pieces, and then we just model what this puzzle looks like when you lay it down in a line, so to speak, in a sequence.
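The idea described above, breaking anything into "puzzle pieces" laid out in a line, can be sketched as a toy tokenizer. This is a minimal illustration, not the actual tokenizer Vinyals refers to; real systems (e.g. BPE or SentencePiece) use learned subword units rather than whitespace splitting, and the `tokenize` function and its vocabulary scheme here are hypothetical.

```python
# Toy sketch: tokenization turns raw text into a flat sequence of
# integer IDs ("puzzle pieces" laid down in a line).
# Splits on whitespace and grows the vocabulary on the fly;
# real tokenizers use learned subword units instead.

def tokenize(text, vocab):
    ids = []
    for piece in text.split():
        if piece not in vocab:
            vocab[piece] = len(vocab)  # assign the next free ID
        ids.append(vocab[piece])
    return ids

vocab = {}
seq = tokenize("the model sees the data as a sequence", vocab)
print(seq)  # repeated pieces ("the") map to the same ID
```

Once every modality is reduced to such an ID sequence, a single sequence model can be trained on all of it, which is the point being made in the transcript.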