Deep Future

New Senses for Humans – David Eagleman

Apr 5, 2024
David Eagleman, a Stanford neuroscientist and founder of Neosensory, explores sensory substitution and the brain's adaptability. He discusses how the brain rewires itself to interpret new sensory inputs, such as mapping sound to vibration patterns on the wrist. The conversation also covers tinnitus and how multisensory stimulation can ease its symptoms. Eagleman also reflects on the intersection of AI and neuroscience, pondering future medical breakthroughs and AI's role in enhancing human connection.
INSIGHT

Brain Doesn’t Care How Data Arrives

  • The brain treats patterned spikes the same regardless of sensory origin, so a wristband can effectively function like a cochlea.
  • After months of use, wearers report that they “hear” sounds via the wrist rather than interpreting discrete vibrations.
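The wristband-as-cochlea idea can be sketched in code: split incoming audio into frequency bands and let each band's energy drive one vibration motor. This is an illustrative sketch only; the function name, band edges, and motor count are assumptions, not Neosensory's actual implementation or API.

```python
import numpy as np

def audio_frame_to_motor_intensities(frame, sample_rate=16000, n_motors=4):
    """Map one audio frame to vibration-motor intensities.

    Hypothetical sketch of frequency-band mapping: sound is split into
    log-spaced bands, and each band's energy drives one motor on the
    wrist. All names and parameters here are illustrative.
    """
    # Windowed magnitude spectrum of the frame.
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    # Log-spaced band edges covering roughly 100 Hz to 8 kHz.
    edges = np.logspace(np.log10(100), np.log10(8000), n_motors + 1)
    intensities = np.empty(n_motors)
    for i in range(n_motors):
        band = (freqs >= edges[i]) & (freqs < edges[i + 1])
        intensities[i] = spectrum[band].sum()
    # Normalize to [0, 1] so values can drive motor duty cycles.
    total = intensities.sum()
    return intensities / total if total > 0 else intensities

# A 200 Hz tone mostly excites the lowest-frequency motor.
t = np.arange(1024) / 16000
out = audio_frame_to_motor_intensities(np.sin(2 * np.pi * 200 * t))
```

Run continuously over streaming audio frames, a mapping like this delivers a time-varying tactile pattern that the brain can learn to interpret, regardless of the channel the data arrives on.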
ANECDOTE

From Lab Vest To Neosensory Wristband

  • Eagleman built a vibrotactile vest that let deaf people perceive sound via skin patterns.
  • VCs funded a wristband spinout, Neosensory, to miniaturize that lab vest into a consumer device.
INSIGHT

Temporal Delivery Builds Intuition

  • Time-series inputs map naturally to tactile delivery and can create intuitive, gut-level sensing.
  • Delivering data over time lets the brain develop fast pattern-based instincts without conscious analysis.