The Concept of Utility in Language
This chapter explores the concept of utility in language and discusses the hypothesis that having words for numbers is what creates the concept of exact quantity. It also highlights the unique case of the Pirahã language, which lacks words for numbers.
In episode 107 of The Gradient Podcast, Daniel Bashir speaks to Professor Ted Gibson.
Ted is a Professor of Cognitive Science at MIT. He leads the TedLab, which investigates why languages look the way they do; the relationship between culture and cognition, including language; and how people learn, represent, and process language.
Have suggestions for future podcast guests (or other feedback)? Let us know here or reach us at editor@thegradient.pub
Subscribe to The Gradient Podcast: Apple Podcasts | Spotify | Pocket Casts | RSS
Follow The Gradient on Twitter
Outline:
* (00:00) Intro
* (02:13) Prof Gibson’s background
* (05:33) The computational linguistics community and NLP, engineering focus
* (10:48) Models of brains
* (12:03) Prof Gibson’s focus on behavioral work
* (12:53) How dependency distances impact language processing
* (14:03) Dependency distances and the origin of the problem
* (18:53) Dependency locality theory
* (21:38) The structures languages tend to use
* (24:58) Sentence parsing: structural integrations and memory costs
* (36:53) Reading strategies vs. ordinary language processing
* (40:23) Legalese
* (46:18) Cross-dependencies
* (50:11) Number as a cognitive technology
* (54:48) Experiments
* (1:03:53) Why counting is useful for Western societies
* (1:05:53) The Whorf hypothesis
* (1:13:05) Language as Communication
* (1:13:28) The noisy channel perspective on language processing
* (1:27:08) Fedorenko lab experiments—language for thought vs. communication and Chomsky’s claims
* (1:43:53) Thinking without language, inner voices, language processing vs. language as an aid for other mental processing
* (1:53:01) Dependency grammars and a critique of Chomsky’s grammar proposals, LLMs
* (2:08:48) LLM behavior and internal representations
* (2:12:53) Outro
Links:
* Re-imagining our theories of language
* Research — linguistic complexity and dependency locality theory
* Linguistic complexity: locality of syntactic dependencies (1998)
* The Dependency Locality Theory: A Distance-Based Theory of Linguistic Complexity (2000)
* Consequences of the Serial Nature of Linguistic Input for Sentential Complexity (2005)
* Large-scale evidence of dependency length minimization in 37 languages (2015)
* Dependency locality as an explanatory principle for word order (2020)
* A resource-rational model of human processing of recursive linguistic structure (2022)
* Research — language processing / communication and cross-linguistic universals
* Number as a cognitive technology: Evidence from Pirahã language and cognition (2008)
* The communicative function of ambiguity in language (2012)
* Color naming across languages reflects color use (2017)
* How Efficiency Shapes Human Language (2019)