
‘Neural dynamics of phoneme sequences reveal position-invariant code for content and order’ with Laura Gwilliams

The Language Neuroscience Podcast

CHAPTER

The Back-to-Back Regression Approach

The prediction model is really complicated. It's called back-to-back regression. So the parenthetical "s" is to kind of indicate that maybe what's actually happening here is sequencing into morphological constituents, something very close to my heart. What is the unit being connected to, and, like, later downstream or maybe higher downstream? Yeah, cool. So you have 21 people, they listen to two hours of stories, you go through phoneme by phoneme and code it by hand, I guess. And then you try to predict them using neural data from the MEG, and also using the spectrogram of the stimulus as a control condition.
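As a rough sketch of the idea behind back-to-back regression (following the general two-stage scheme of decode-then-disentangle), here is a minimal, simulated example. All names and numbers here are illustrative assumptions, not the study's actual pipeline: a first ridge regression decodes each stimulus feature from the "neural" data on one half of the trials, and a second regression maps the true features onto the decoded ones on the other half, so that the diagonal of the second regression's weight matrix estimates each feature's unique contribution even when the features are correlated.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: two correlated stimulus features; only the first
# actually drives the (hypothetical) neural response.
n, n_sensors = 2000, 20
f1 = rng.standard_normal(n)
f2 = 0.8 * f1 + 0.6 * rng.standard_normal(n)   # correlated with f1
X = np.column_stack([f1, f2])                  # features (n x 2)
W = rng.standard_normal(n_sensors)
Y = np.outer(f1, W) + 0.5 * rng.standard_normal((n, n_sensors))

def ridge(A, B, lam=1.0):
    """Solve min ||A @ H - B||^2 + lam * ||H||^2 for H."""
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ B)

half = n // 2
# Step 1 (decode): predict each feature from neural data, fit on half 1.
G = ridge(Y[:half], X[:half])
X_hat = Y[half:] @ G                           # decoded features, half 2
# Step 2 (disentangle): regress decoded features onto the true features.
# The diagonal estimates each feature's unique (position-invariant-style)
# contribution to the neural signal.
H = ridge(X[half:], X_hat)
causal = np.diag(H)
print(causal)  # first entry should dominate; second should be near zero
```

In this toy setup the first diagonal entry comes out large and the second near zero, correctly attributing the neural signal to `f1` alone despite its strong correlation with `f2`; a single ordinary regression would spread credit across both features.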
