Is There a Transformer in a Sequence?
The step change that has happened since 2019 in generative models is not just about scale; it's about the transformer. So, you know, presumably it's the semantic information being predicted that is the root cause of the shared representation. Nobody knows why transformers are so good, but they are. They're really, really, really good.