BI 159 Chris Summerfield: Natural General Intelligence

Brain Inspired

Is There a Transformer in a Sequence?

The step change that has happened since 2019 in generative models is not just about scale; it's about the transformer. So, you know, presumably it's the semantic information being predicted that is the root cause of the shared representation. Nobody knows why transformers are good, but they are so good. They're really, really, really good.
