Because Language - a podcast about linguistics, the science of language.

121: Learning from LLMs (with Adele Goldberg)

Jun 29, 2025
In this engaging discussion, Professor Adele Goldberg, a pioneering constructionist and Princeton psychology professor, delves into the intricacies of large language models (LLMs) and their parallels to human language. She explores the concept of constructions—form-meaning pairings—and how they inform grammar. The conversation reveals how LLMs absorb biases and learn context, while also highlighting frequency effects and the evolution of meaning over time. Prepare for a thought-provoking look at language that will change how you think about AI and communication!
AI Snips
INSIGHT

Language As Learned Form–Function Pairs

  • Constructions are learned pairings of form with function, ranging in size from morphemes to whole sentences.
  • Words and grammar are the same kind of learned knowledge, not separate modules.
ADVICE

Teach Constructions, Not Empty Trees

  • Avoid teaching grammar as separate phrase-structure rules detached from words; emphasize how meaning and verb preferences shape syntax.
  • Teach constructions and lexical tendencies rather than building sentences from abstract trees first.
ANECDOTE

The "Ate His Way Through" Example

  • Adele uses the "ate his way through" example to show that constructions let many different verbs appear in the same pattern.
  • Even normally transitive verbs like "devour" can drop their usual object when they appear in this construction.