
Using AI to Take Bio Farther
Raising Health
Transformers - A New Way to Process Inputs and Outputs
LSTM, long short-term memory, maybe gives you some way of looking into the past a bit, right? Transformers are something radically different. At every step, what we're doing is not moving through the input. We're basically incrementally revising representations of every word or every pixel or every patch, actually. How many revisions of these representations can I actually, in quotes, afford? Yeah. But compared to recurrent neural networks, it primarily had advantages because now you're not actually bound to sequentially compute representations, say, word for word.
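
To illustrate the contrast described here, below is a minimal sketch (not from the episode) in plain NumPy: an RNN-style pass that must walk the sequence one token at a time, versus a single self-attention step that revises every token's representation at once. All names (rnn_step, attention_step, W_q, W_k, W_v) and dimensions are illustrative assumptions, not anything specified by the speakers.

```python
# Illustrative sketch: sequential RNN updates vs. one parallel
# self-attention "revision" of all token representations.
import numpy as np

def rnn_step(tokens, d):
    """Process tokens one at a time: each state depends on the previous one."""
    W_h = np.random.randn(d, d) * 0.1
    W_x = np.random.randn(d, d) * 0.1
    h = np.zeros(d)
    states = []
    for x in tokens:                      # strictly sequential loop
        h = np.tanh(W_h @ h + W_x @ x)
        states.append(h)
    return np.stack(states)

def attention_step(tokens, d):
    """Revise all token representations in one parallel self-attention pass."""
    W_q, W_k, W_v = (np.random.randn(d, d) * 0.1 for _ in range(3))
    Q, K, V = tokens @ W_q, tokens @ W_k, tokens @ W_v
    scores = Q @ K.T / np.sqrt(d)         # every token attends to every token
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return tokens + weights @ V           # one parallel revision of all tokens

tokens = np.random.randn(5, 16)           # 5 tokens, 16-dim embeddings
print(rnn_step(tokens, 16).shape)         # (5, 16), built one step at a time
print(attention_step(tokens, 16).shape)   # (5, 16), updated in a single pass
```

Stacking several attention_step-style layers corresponds to the "how many revisions can I afford" question: each layer is one more parallel revision of every representation, rather than one more step along the sequence.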


