BI 163 Ellie Pavlick: The Mind of a Language Model

Do You Think It's Possible for a Text-Only Trained Large Language Model to Learn Backwards Relative to Humans?

I think it might be possible for a text-only trained large language model that's completely ungrounded to sort of learn that, and get the grounding later. In other words, to kind of learn backwards relative to humans. Yeah, I think it's possible.

Well, okay. So first of all, we almost named my daughter Cora, but it became Nora instead. Secondly, yeah, pointing at your belly button, right? It's an action. And I don't know if we need to jump right into this, but some of our avenues of research deal with grounding language in the world and how we learn as humans. So is this a fair statement to make, based on what you're thinking? Am I interpreting your thinking correctly?
