
Encoding and decoding semantic representations with Alexander Huth

The Language Neuroscience Podcast


How next-word prediction models learn language

Huth explains how language models are trained to predict upcoming words, how word embeddings are learned in the process, and why this objective yields rich syntactic and semantic representations.
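For readers who want the idea in concrete form, here is a minimal sketch (not from the episode) of next-word prediction with learned embeddings. It assumes PyTorch; the toy corpus, `NextWordModel` class, GRU layer, and all sizes are illustrative placeholders, not the models Huth describes.

```python
# Minimal next-word prediction sketch with learned embeddings (PyTorch).
# Toy corpus and hyperparameters are placeholders for illustration only.
import torch
import torch.nn as nn

corpus = "the dog chased the cat the cat chased the mouse".split()
vocab = sorted(set(corpus))
stoi = {w: i for i, w in enumerate(vocab)}
ids = torch.tensor([stoi[w] for w in corpus])

class NextWordModel(nn.Module):
    def __init__(self, vocab_size, dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)   # learned word embeddings
        self.rnn = nn.GRU(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab_size)        # scores for the next word

    def forward(self, x):
        h, _ = self.rnn(self.embed(x))
        return self.out(h)

model = NextWordModel(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# Training objective: predict token t+1 from the tokens up to t.
inputs, targets = ids[:-1].unsqueeze(0), ids[1:].unsqueeze(0)
for step in range(200):
    logits = model(inputs)
    loss = loss_fn(logits.reshape(-1, len(vocab)), targets.reshape(-1))
    opt.zero_grad()
    loss.backward()
    opt.step()

# After training, model.embed.weight holds the learned embeddings, and the
# network's hidden states are the kind of internal representations that
# encoding and decoding studies relate to brain activity.
```

Nothing in the objective mentions syntax or meaning explicitly; the representations emerge because predicting the next word well requires tracking both.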

Segment begins at 01:04:30.
