S4E2: Google DeepMind’s Dr. Claire Cui on The Next Frontier for Large Language Models

Theory and Practice

The Importance of the Transformer in Natural Language Processing

The transformer architecture really enabled self-supervised learning. Instead of having to label text, we could just try to predict missing words that we blanked out in the text. And with more parameters, you can fit all kinds of patterns inside. That's why self-supervised learning is important: you have an almost infinite amount of data, trillions of tokens, to churn through.
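
To make the idea concrete, here is a minimal Python sketch, not from the episode, of how masked-word training pairs can be built from unlabeled text. The function name `make_masked_example`, the 15% masking rate, and the toy whitespace tokenizer are illustrative assumptions in the spirit of BERT-style masking; the point is simply that the labels come from the text itself, with no human annotation.

```python
# Minimal sketch of building self-supervised (masked-word) training pairs
# from raw text. Assumptions: a toy whitespace tokenizer, a BERT-style
# ~15% masking rate, and a hypothetical [MASK] placeholder token.
import random

MASK = "[MASK]"

def make_masked_example(text, mask_prob=0.15, seed=0):
    """Turn one raw sentence into (inputs, labels) with no human labeling."""
    rng = random.Random(seed)
    tokens = text.split()                  # toy whitespace "tokenizer"
    inputs, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            inputs.append(MASK)            # the model sees the blank...
            labels.append(tok)             # ...and must predict the original word
        else:
            inputs.append(tok)
            labels.append(None)            # no prediction target at unmasked positions
    return inputs, labels

if __name__ == "__main__":
    inputs, labels = make_masked_example(
        "the transformer architecture enabled self supervised learning at scale"
    )
    print(inputs)   # sentence with some words blanked out
    print(labels)   # the blanked-out words the model must recover
```

Because the targets are generated from the text itself, any corpus of raw text becomes training data, which is what lets this objective scale to trillions of tokens.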
