Do You Think It's Possible for a Text Only Trained Large Language Model to Learn Backwards Relative to Humans?
I think it might be possible for a text-only trained large language model that's completely ungrounded to sort of learn that and get the grounding later; in other words, to kind of learn backwards relative to humans.

Yeah, I think it's possible. Well, okay, so first of all, we almost named my daughter Cora, but it became Nora instead. Secondly, yeah, pointing at your belly button, right, it's an action. I don't know if we need to jump right into this, but some of our avenues of research deal with grounding language in the world and how we learn as humans. So is this a fair statement to make based on what you're thinking? Am I interpreting your thinking correctly?