‘Neural dynamics of phoneme sequences reveal position-invariant code for content and order’ with Laura Gwilliams

The Language Neuroscience Podcast

The Back-to-Back Regression Approach

The prediction model is really complicated. It's called back-to-back regression. So the parenthetical 'sub' is to kind of indicate that maybe what's actually happening here is sequencing into morphological constituents, something very close to my heart. What is the unit being connected to later downstream, or maybe higher downstream? Yeah, cool. So you have 21 people, they listen to two hours of stories, you go through phoneme by phoneme and code it by hand, I guess. And then you try to predict them using neural data from the MEG, and also using the spectrogram of the stimulus as a control condition.
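The transcript doesn't spell out the method, but back-to-back regression (King et al., 2020) has a simple two-stage structure: first decode each stimulus feature from the neural data on one half of the trials, then, on the other half, regress the true features on the decoded ones; the diagonal of the second regression's coefficient matrix estimates each feature's unique contribution even when features are correlated. Below is a minimal numpy sketch on simulated data. The sample sizes, the closed-form ridge solver, and the simulated "MEG" sensors are all illustrative assumptions, not the actual analysis pipeline from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: n trials, 3 stimulus features, 50 "MEG" sensors.
n, n_feat, n_sens = 2000, 3, 50
X = rng.normal(size=(n, n_feat))
X[:, 1] += 0.7 * X[:, 0]             # features 0 and 1 are correlated
W = rng.normal(size=(n_feat, n_sens))
W[2] = 0.0                           # feature 2 has no effect on the signal
Y = X @ W + rng.normal(scale=2.0, size=(n, n_sens))

def ridge(A, B, lam=1.0):
    """Closed-form ridge regression: argmin_w ||A w - B||^2 + lam ||w||^2."""
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ B)

half = n // 2

# Stage 1 (first half of trials): decode every feature from the sensors.
G = ridge(Y[:half], X[:half])        # shape (n_sens, n_feat)

# Stage 2 (second half): regress the true features on the decoded ones.
X_hat = Y[half:] @ G                 # decoded features on held-out trials
H = ridge(X_hat, X[half:])           # shape (n_feat, n_feat)

# Diagonal of H: each feature's unique, disentangled contribution.
causal = np.diag(H)
```

On this simulation, `causal` should come out clearly positive for the two features that actually drive the signal and near zero for the third, which is the point of the two-stage setup: correlated features (like overlapping phonetic and phonemic descriptors) don't get spuriously credited with decodability they merely inherit from a neighbor.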
