

MLST #78 - Prof. NOAM CHOMSKY (Special Edition)
Jul 8, 2022
In this captivating discussion, Prof. Noam Chomsky, the father of modern linguistics and a towering intellectual, shares insights on the evolution of language and cognition. He critiques misconceptions about his work while exploring the boundaries between AI and human understanding. The conversation delves into the significance of probabilistic methods in neural networks and the innate aspects of language acquisition. Chomsky also reflects on the philosophical challenges surrounding determinism and free will, emphasizing the complexities of thought and communication.
AI Snips
The "Anything Goes" Theory
- Noam Chomsky compared LLMs to a flawed physics theory that can accommodate any law of nature, actual or conceivable.
- He argued such a theory explains nothing because it doesn't differentiate between possible and impossible phenomena.
Innate Empiricism
- Connectionism, like the 17th-century belief in a mechanical universe, is intuitive but ultimately wrong.
- Human cognition relies on mental constructions, not just empirical data, even from infancy.
MIT's Shift and AI's Regression
- Noam Chomsky highlighted the shift at MIT from engineering to science, where basic sciences became crucial for engineers.
- He lamented that AI has become primarily engineering-focused, losing its scientific roots.