
What Comes After ChatGPT? The Mother of ImageNet Predicts The Future

a16z Show


Transformers as set models, not sequences

Justin argues transformers natively model sets; positional embeddings inject sequence order when needed.

Clip starts at 56:41.
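The set-vs-sequence point can be sketched numerically: a self-attention layer with no positional information is permutation-equivariant (shuffling the input tokens just shuffles the output rows), and adding positional embeddings breaks that symmetry so order starts to matter. The toy layer below is a minimal illustration of this idea, not anything discussed in the episode; the names (self_attention, W_q, the dimensions) are invented for the sketch.

```python
import torch

torch.manual_seed(0)

# Toy single-head self-attention with no positional information.
d = 8
W_q, W_k, W_v = (torch.randn(d, d) for _ in range(3))

def self_attention(x):
    # x: (num_tokens, d), treated as an unordered set of token vectors
    q, k, v = x @ W_q, x @ W_k, x @ W_v
    attn = torch.softmax(q @ k.T / d**0.5, dim=-1)
    return attn @ v

x = torch.randn(5, d)        # five "tokens"
perm = torch.randperm(5)

out = self_attention(x)
out_shuffled = self_attention(x[perm])

# Permuting the inputs only permutes the outputs:
# attention alone sees a set, not a sequence.
print(torch.allclose(out[perm], out_shuffled, atol=1e-6))   # True

# With positional embeddings added, each vector is tied to a slot,
# so the same equivariance check fails and order becomes meaningful.
pos = torch.randn(5, d)
out_pos = self_attention(x + pos)
out_pos_shuffled = self_attention(x[perm] + pos)
print(torch.allclose(out_pos[perm], out_pos_shuffled, atol=1e-6))  # False (in general)
```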
