The historical understanding of language areas in the brain was based on observations by neurologists like Broca and Wernicke. Broca observed a patient who could say only the word 'tan' and localized the articulation of speech to a region of the left frontal lobe. Wernicke observed patients who produced fluent but nonsensical speech and had difficulty understanding language, and identified a region in the posterior temporal lobe as important for language comprehension. These observations led to the belief that language is lateralized to the left hemisphere in most individuals.
However, recent research has challenged this traditional picture. Studies have shown that the region associated with articulation is actually located in the precentral gyrus, the part of the motor cortex near the representations of the mouth and larynx. The idea that language is solely lateralized to the left hemisphere has also been questioned, as there is evidence that language can be localized to the right hemisphere in some left-handed individuals. The organization of language in the brain remains an active area of research, and current evidence suggests it is more complex and nuanced than previously thought.
Handedness seems to have a genetic component and is correlated with language organization in the brain. Most right-handed individuals have language localized to the left hemisphere, while left-handed individuals show more variability: a majority still have left-hemisphere language dominance, but a greater proportion have language localized to both hemispheres or to the right side. The proximity of the motor cortex to language areas may contribute to this correlation between handedness and language organization.
The brain's language areas can adapt to bilingualism, but the exact mechanisms are still being studied. Bilingual individuals can use the same brain areas to generate both languages, but there may be some differences in how the languages are processed and represented. Research suggests that bilingualism can lead to enhanced cognitive control and increased connections between brain regions involved in language processing. Overall, bilingualism is a fascinating area of study in understanding the brain's flexibility and adaptability in language learning and usage.
Bilingual individuals use shared circuitry in the brain to process multiple languages, although the way signals are processed can vary among individuals.
The brain has specific areas, such as the visual system and Wernicke's area, that process and represent different linguistic elements like consonants and vowels, contributing to speech recognition and comprehension.
The auditory system decomposes sounds into their component frequencies, and the primary auditory cortex is organized tonotopically, mapping sounds from low to high frequencies. Specific areas of the cortex are tuned to different speech sounds and contribute to the processing of speech and language.
Paralyzed individuals, such as those with brainstem strokes or neurodegenerative conditions, can become locked in: unable to speak or move. Research has focused on using brain-machine interfaces to decode neural activity and translate it into speech, offering communication possibilities for locked-in individuals.
The podcast explores the concept of augmentation and superhuman capabilities of the brain. The speaker discusses how advancements in brain-machine interfaces and neurotechnologies have allowed for the potential enhancement of human abilities beyond normal limits. While the focus has mostly been on medical applications, there is a growing interest in exploring the possibilities of super memory, super communication, and superior athletic abilities. However, the ethical implications and societal impact of such advancements are still largely unexplored.
The podcast also delves into the relationship between facial expressions, language, and communication of emotions. The speaker highlights the importance of nonverbal expressions and visual cues in effective communication. The integration of brain-machine interfaces with the extraction of speech signals and facial expressions is discussed as a way to enhance communication for individuals who are locked in or have speech impairments. The idea of using avatars or computer-animated faces to convey speech and facial expressions is considered as a more complete and accessible form of communication, with potential benefits for people with disabilities and in virtual social interactions.
My guest is Eddie Chang, MD, a neurosurgeon and professor of neurological surgery at the University of California, San Francisco (UCSF) and the co-director of the Center for Neural Engineering & Prostheses. We discuss the brain mechanisms underlying speech, language learning and comprehension, communicating human emotion with words and hand gestures, bilingualism and language disorders, such as stuttering. Dr. Chang also explains his work developing and applying state-of-the-art technology to decode speech and using that information and artificial intelligence (AI) to successfully restore communication to patients who have suffered paralyzing injuries or "locked-in syndrome." We also discuss his work treating patients with epilepsy. Finally, we consider the future: how modern neuroscience is overturning medical textbooks, the impact of digital technology such as smartphones on language, and the future of natural and computer-assisted human communication.
For the full show notes, visit hubermanlab.com.
AG1: https://athleticgreens.com/huberman
LMNT: https://drinklmnt.com/hubermanlab
Waking Up: https://wakingup.com/huberman
Momentous: https://livemomentous.com/huberman
(00:00:00) Dr. Eddie Chang, Speech & Language
(00:03:16) Sponsor: LMNT
(00:07:19) Neuroplasticity, Learning of Speech & Environmental Sounds
(00:13:10) White Noise Machines, Infant Sleep & Sensitization
(00:17:26) Mapping Speech & Language in the Brain
(00:24:26) Emotion; Anxiety & Epilepsy
(00:30:19) Epilepsy, Medications & Neurosurgery
(00:33:01) Ketogenic Diet & Epilepsy
(00:34:04) Sponsor: AG1
(00:36:10) Absence Seizures, Nocturnal Seizures & Other Seizure Types
(00:41:08) Brain Areas for Speech & Language, Broca’s & Wernicke’s Areas, New Findings
(00:53:23) Lateralization of Speech/Language & Handedness, Strokes
(00:59:05) Bilingualism, Shared Language Circuits
(01:01:18) Speech vs. Language, Signal Transduction from Ear to Brain
(01:12:38) Shaping Breath: Larynx, Vocal Folds & Pharynx; Vocalizations
(01:17:37) Mapping Language in the Brain
(01:20:26) Plosives & Consonant Clusters; Learning Multiple Languages
(01:25:07) Motor Patterns of Speech & Language
(01:28:33) Reading & Writing; Dyslexia & Treatments
(01:34:47) Evolution of Language
(01:37:54) Stroke & Foreign Accent Syndrome
(01:40:31) Auditory Memory, Long-Term Motor Memory
(01:45:26) Paralysis, ALS, “Locked-In Syndrome” & Brain Computer Interface (BCI)
(02:02:14) Neuralink, BCI, Superhuman Skills & Augmentation
(02:10:21) Non-Verbal Communication, Facial Expressions, BCI & Avatars
(02:17:35) Stutter, Anxiety & Treatment
(02:22:55) Tools: Practices for Maintaining Calm Under Extreme Demands
(02:31:10) Zero-Cost Support, YouTube Feedback, Spotify & Apple Reviews, Sponsors, Momentous Supplements, Huberman Lab Premium, Neural Network Newsletter, Social Media
Title Card Photo Credit: Mike Blabac