
Because Language - a podcast about linguistics, the science of language. 73: Consequences of Language (with Nick Enfield and Morten Christiansen)
Apr 3, 2023
Language Builds Shared Common Ground
- Language created a shared body of knowledge, called common ground, that participants continuously build together.
- This intersubjectivity lets people coordinate actions and intentions beyond immediate perception.
LLMs Show Grammar From Statistical Learning
- Large language models (LLMs) produce grammatical output by learning distributional regularities from massive text exposure (see the toy sketch after this snip).
- This empirical result challenges the claim that grammar must be innately specified to be acquired.
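To make "learning distributional regularities" concrete, here is a minimal sketch at toy scale: a bigram model that counts which words follow which in a tiny corpus, then samples new strings from those counts. The corpus and all names are invented for illustration; real LLMs use neural networks over vastly more data, but the underlying principle of extracting structure from co-occurrence statistics is the same.

```python
import random
from collections import defaultdict

# Toy corpus; invented for illustration, not from the episode.
corpus = [
    "the dog chased the cat",
    "the cat saw the dog",
    "a dog chased a ball",
]

# Count how often each word follows each other word (with sentence
# boundary markers) -- the raw distributional regularities.
counts = defaultdict(lambda: defaultdict(int))
for sentence in corpus:
    words = ["<s>"] + sentence.split() + ["</s>"]
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1

def sample_sentence(max_len=10):
    """Generate a word string by following the learned transition counts."""
    word, out = "<s>", []
    for _ in range(max_len):
        followers = counts[word]
        word = random.choices(list(followers), weights=list(followers.values()))[0]
        if word == "</s>":
            break
        out.append(word)
    return " ".join(out)

print(sample_sentence())  # e.g. "the dog chased the cat"
```

Even this crude model never puts two determiners in a row, because that transition never occurs in its input; scaled up by many orders of magnitude, the same kind of statistics can yield fluent grammatical output without any innate grammar built in.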
Data Volume ≠ Human Input Complexity
- LLMs require enormous amounts of data, yet they lack many forms of input that children receive, such as embodied interaction and social feedback (see the back-of-the-envelope comparison below).
- LLM success therefore doesn't straightforwardly show that children learn language the same way, from raw text statistics alone.
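A rough back-of-the-envelope comparison of input scale, with ballpark figures assumed purely for illustration (they are not from the episode, and published estimates vary widely):

```python
# All numbers are order-of-magnitude assumptions for illustration only.
child_words_per_year = 10_000_000                  # ~1e7 words heard per year (rough)
child_words_by_age_10 = 10 * child_words_per_year  # ~1e8 words total
llm_training_tokens = 300_000_000_000              # ~3e11 tokens, GPT-3-scale

gap = llm_training_tokens / child_words_by_age_10
print(f"LLM text exposure is roughly {gap:,.0f}x a child's")  # ~3,000x
```

Even under generous assumptions about child input, the gap spans several orders of magnitude, while none of the LLM's input is embodied or socially contingent, which is exactly the contrast this snip is drawing.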

