Because Language - a podcast about linguistics, the science of language.

73: Consequences of Language (with Nick Enfield and Morten Christiansen)

Apr 3, 2023
INSIGHT

Language Builds Shared Common Ground

  • Language created a shared body of knowledge, called common ground, that participants continuously build together.
  • This intersubjectivity lets people coordinate actions and intentions beyond what they can immediately perceive.
INSIGHT

LLMs Show Grammar From Statistical Learning

  • Large language models (LLMs) produce grammatical output by learning distributional regularities from massive text exposure (a toy illustration follows below).
  • This empirical result challenges the claim that grammar must be innately specified in order to be acquired.
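
To make the idea of distributional learning concrete, here is a minimal sketch, not from the episode and nothing like a real LLM's transformer architecture: a bigram model in Python that picks up word-order regularities purely from co-occurrence counts in a toy corpus, then generates output that respects them. The corpus and function names are illustrative assumptions.

```python
import random
from collections import defaultdict

# Toy corpus (illustrative assumption, not data from the episode).
corpus = [
    "the dog chased the cat",
    "the cat chased the mouse",
    "the mouse saw the dog",
]

# Count how often each word follows each other word (a bigram table).
# "<s>" marks the start of a sentence.
bigrams = defaultdict(lambda: defaultdict(int))
for sentence in corpus:
    words = ["<s>"] + sentence.split()
    for prev, curr in zip(words, words[1:]):
        bigrams[prev][curr] += 1

def generate(max_words=8):
    """Sample words one at a time, weighted by observed bigram counts."""
    word, output = "<s>", []
    for _ in range(max_words):
        followers = bigrams[word]
        if not followers:
            break
        word = random.choices(list(followers), weights=list(followers.values()))[0]
        output.append(word)
    return " ".join(output)

print(generate())  # e.g. "the dog chased the cat chased the mouse"
```

Scaled up by many orders of magnitude in data and model capacity, this next-word-prediction objective is the same basic setup LLMs are trained on: no grammar rules are written in, yet the output follows the word-order patterns of the training text.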
INSIGHT

Data Volume ≠ Human Input Complexity

  • LLMs require enormous amounts of data, yet they lack many forms of input that children receive, such as embodied interaction and social feedback.
  • LLM success therefore doesn't straightforwardly show that children acquire language the same way, from raw text statistics alone.