How Do You Prepare Your Data to Train a Transformer Model?
COMET starts from off-the-shelf neural language models that have already read a lot of raw data, and then we compile the ATOMIC knowledge graph as if it's just an additional string of text. So given some context, we can then reason about whatever commonsense relationship inference types we want to look at. And I came to the conclusion in recent years that language is actually the best medium for reasoning. In fact, even mathematicians cannot do much of a proof without access to natural language. But when we think about how humans learn and how we argue with each other to share our reasoning about an issue, everything is through language.
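The data-preparation step described here, treating a knowledge graph as "just an additional string of text", can be sketched as follows. This is a minimal illustration, not the actual COMET pipeline: the `[GEN]` separator token and the example triples are assumptions for demonstration, loosely modeled on ATOMIC-style (head, relation, tail) entries.

```python
# Hypothetical sketch: flatten knowledge-graph triples into plain text
# strings so a pretrained language model can be fine-tuned on them as
# ordinary text, in the spirit of the COMET approach described above.

def triple_to_text(head, relation, tail):
    """Serialize one (head, relation, tail) triple into a training string.

    The "[GEN]" marker is an assumed separator between the input
    (head + relation) and the target inference (tail).
    """
    return f"{head} {relation} [GEN] {tail}"

# Illustrative ATOMIC-style triples (invented examples, not real entries).
triples = [
    ("PersonX goes to the store", "xIntent", "to buy groceries"),
    ("PersonX gives PersonY a gift", "oReact", "grateful"),
]

# Each flattened string becomes one line of training text for the model.
training_examples = [triple_to_text(h, r, t) for h, r, t in triples]
for line in training_examples:
    print(line)
```

Once the graph is serialized this way, fine-tuning needs no graph-specific machinery: the model simply continues language-model training on these strings, learning to generate the tail inference given a context and a relation type.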