Encoding and decoding semantic representations with Alexander Huth

The Language Neuroscience Podcast

How next-word prediction models learn language

Huth explains how language models are trained to predict the next word, how their word embeddings are learned during that training, and why this objective yields rich syntactic and semantic representations.
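To make the idea concrete, here is a minimal sketch of next-word prediction with a learned embedding layer. This is an illustration only, not the models discussed in the episode: the toy corpus, vocabulary, and layer sizes are assumptions, and the architecture is deliberately simplified to an embedding plus a linear output head trained with cross-entropy.

```python
# Minimal next-word prediction sketch (illustrative; corpus and sizes are assumptions).
import torch
import torch.nn as nn

# Toy corpus and vocabulary (hypothetical example data).
tokens = "the dog chased the cat and the cat chased the dog".split()
vocab = sorted(set(tokens))
stoi = {w: i for i, w in enumerate(vocab)}
ids = torch.tensor([stoi[w] for w in tokens])

# Inputs are each token; targets are the token that follows it.
x, y = ids[:-1], ids[1:]

class NextWordModel(nn.Module):
    def __init__(self, vocab_size, dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)  # learned word embeddings
        self.head = nn.Linear(dim, vocab_size)      # scores over the vocabulary

    def forward(self, idx):
        return self.head(self.embed(idx))           # logits for the next word

model = NextWordModel(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# Training loop: raise the probability of the word that actually comes next.
for step in range(200):
    logits = model(x)
    loss = loss_fn(logits, y)
    opt.zero_grad()
    loss.backward()
    opt.step()

# After training, each row of the embedding table is a vector shaped by how
# the word is used in context, the kind of representation described above.
print(model.embed(torch.tensor(stoi["dog"])).shape)  # torch.Size([16])
```

In a real language model the linear head is replaced by a deep network (e.g., a Transformer) and the corpus is vastly larger, but the training signal is the same: predict the next word, and useful representations fall out as a by-product.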
